Module: GoodData::UserFilterBuilder
- Defined in:
- lib/gooddata/models/user_filters/user_filter_builder.rb
Class Method Summary
- .collect_labels(data) ⇒ Object
  Groups the values by particular label.
- .collect_values(data) ⇒ Object
  Collects specific values and deduplicates if necessary.
- .create_attrs_cache(filters, options = {}) ⇒ Object
- .create_cache(data, key) ⇒ Object
- .create_expression(filter, labels_cache, lookups_cache, attr_cache, options = {}, login) ⇒ Object
  Creates MAQL expression(s) based on the filter definition.
- .create_filter(label, values) ⇒ Object
- .create_label_cache(result, options = {}) ⇒ Object
- .create_lookups_cache(small_labels) ⇒ Object
- .create_user_filter(expression, related) ⇒ Object
  Encapsulates the creation of a filter.
- .create_user_profile_mapping(filters, project_users, options = {}) ⇒ Object
- .execute_mufs(user_filters, options = {}) ⇒ Object
- .execute_variables(filters, var, options = {}) ⇒ Array
  Executes the update for variables.
- .get_filters(file, options = {}) ⇒ Boolean
  Main entry function.
- .get_missing_users(filters, options = {}) ⇒ Object
- .get_small_labels(labels_cache) ⇒ Object
  Walks over the provided labels and picks those that have fewer than a certain number of values. This balances speed when working with small datasets (such as users): the values are precached, while larger labels still work, even though that means many more requests.
- .maqlify_filters(filters, user_profile_mapping, options = {}) ⇒ Array
  Resolves and creates MAQL statements from filter definitions.
- .process_line(line, options = {}) ⇒ Object
  Processes a line from the source file.
- .read_data_with_header(file, memo, options) ⇒ Object
- .read_data_without_header(file, memo, options) ⇒ Object
- .read_file(file, options = {}) ⇒ Object
- .reduce_results(data) ⇒ Array
  Processes values in a map-reduce fashion so the result is as readable as possible and has minimal impact on the API.
- .resolve_user_filter(user = [], project = []) ⇒ Object
- .resolve_user_filters(user_filters, vals) ⇒ Object
  Gets user-defined filters and values from the project, regardless of whether they come from mandatory filters or variable filters, and tries to resolve what needs to be removed and what needs to be updated.
- .row_based?(options = {}) ⇒ Boolean
  Function that tells you if the file should be read line-wise.
- .verify_existing_users(filters, options = {}) ⇒ Object
Class Method Details
.collect_labels(data) ⇒ Object
Groups the values by particular label and passes each group to deduplication.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 126

def self.collect_labels(data)
  data.group_by { |x| [x[:label], x[:over], x[:to]] }
      .map { |l, v| { label: l[0], over: l[1], to: l[2], values: UserFilterBuilder.collect_values(v) } }
end
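The grouping and deduplication behavior can be illustrated with a standalone sketch in plain Ruby (no GoodData dependency; the library's `mapcat` is approximated with `flat_map`, and the method name is made up for illustration):

```ruby
# Sketch of collect_labels/collect_values: group filter rows by the
# [label, over, to] triple and deduplicate the values inside each group.
def collect_labels_sketch(data)
  data.group_by { |x| [x[:label], x[:over], x[:to]] }.map do |l, v|
    { label: l[0], over: l[1], to: l[2], values: v.flat_map { |e| e[:values] }.uniq }
  end
end

rows = [
  { label: 'label/city', over: nil, to: nil, values: %w[NYC SF] },
  { label: 'label/city', over: nil, to: nil, values: %w[SF LA] }
]
collect_labels_sketch(rows)
# => [{ label: 'label/city', over: nil, to: nil, values: ['NYC', 'SF', 'LA'] }]
```

Two rows for the same label collapse into one filter definition whose values are the union of both rows.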
.collect_values(data) ⇒ Object
Collects specific values and deduplicates if necessary
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 131

def self.collect_values(data)
  data.mapcat do |e|
    e[:values]
  end.uniq
end
.create_attrs_cache(filters, options = {}) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 193

def self.create_attrs_cache(filters, options = {})
  project = options[:project]
  labels = filters.flat_map do |f|
    f[:filters]
  end
  over_cache = labels.reduce({}) do |a, e|
    a[e[:over]] = e[:over]
    a
  end
  to_cache = labels.reduce({}) do |a, e|
    a[e[:to]] = e[:to]
    a
  end
  cache = over_cache.merge(to_cache)
  attr_cache = {}
  cache.each_pair do |k, v|
    begin
      attr_cache[k] = project.attributes(v)
    rescue
      nil
    end
  end
  attr_cache
end
.create_cache(data, key) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 137

def self.create_cache(data, key)
  data.reduce({}) do |a, e|
    a[e.send(key)] = e
    a
  end
end
.create_expression(filter, labels_cache, lookups_cache, attr_cache, options = {}, login) ⇒ Object
Creates MAQL expression(s) based on the filter definition. Takes the filter definition, looks up any necessary values, and produces API-executable MAQL.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 233

def self.create_expression(filter, labels_cache, lookups_cache, attr_cache, options = {}, login)
  values = filter[:values]
  # Do not create MUF for label when all its values is NULL (https://jira.intgdc.com/browse/TMA-1361)
  non_null_values = values.select { |value| !value.nil? && value.downcase != 'null' }
  return ['TRUE', []] if non_null_values.empty?

  label = labels_cache[filter[:label]]
  if label.nil?
    message = "Unable to apply filter values: #{values} since the project: #{options[:project].pid} " \
              "doesn't have label: #{filter[:label]} for login: #{login}"
    if options[:ignore_missing_values]
      GoodData.logger.warn(message)
      return ['TRUE', []]
    else
      fail message
    end
  end

  errors = []
  element_uris_by_values = Hash[values.map do |v|
    if lookups_cache.key?(label.uri)
      [v, lookups_cache[label.uri][v]]
    else
      [v, label.find_value_uri(v)]
    end
  end]
  missing_value_errors = element_uris_by_values.select { |_, v| v.nil? }.map do |k, _|
    {
      type: :error,
      label: label.title,
      value: k,
      reason: 'Can not find the value of the attribute referenced in the MUF'
    }
  end
  errors += missing_value_errors unless options[:ignore_missing_values]
  element_uris = element_uris_by_values.values.compact
  # happens when data is not yet loaded in the project
  no_values = element_uris.empty?
  expression = if no_values && options[:restrict_if_missing_all_values]
                 # create a filter that is always false to ensure the user can not see any data
                 # as the proper MUF can not be constructed yet
                 case options[:type]
                 when :muf
                   '1 <> 1'
                 when :variable
                   nil
                 end
               elsif no_values
                 # create a filter that is always true to ensure the user can see all data
                 'TRUE'
               elsif filter[:over] && filter[:to]
                 over = attr_cache[filter[:over]]
                 to = attr_cache[filter[:to]]
                 "([#{label.attribute_uri}] IN (#{element_uris.sort.map { |e| '[' + e + ']' }.join(', ')})) " \
                 "OVER [#{over && over.uri}] TO [#{to && to.uri}]"
               else
                 "[#{label.attribute_uri}] IN (#{element_uris.sort.map { |e| '[' + e + ']' }.join(', ')})"
               end
  [expression, errors]
end
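The core of the happy path is assembling a MAQL `IN` clause from attribute element URIs. A minimal sketch of that string-building step (the helper name and sample URIs are invented for illustration):

```ruby
# Build a MAQL "IN" clause from an attribute URI and its element URIs.
# Element URIs are sorted so the resulting expression is deterministic,
# which makes diffing against existing filters reliable.
def maql_in_expression(attribute_uri, element_uris)
  "[#{attribute_uri}] IN (#{element_uris.sort.map { |e| '[' + e + ']' }.join(', ')})"
end

maql_in_expression('/gdc/md/p/obj/1',
                   ['/gdc/md/p/obj/1/elements?id=2', '/gdc/md/p/obj/1/elements?id=1'])
# => "[/gdc/md/p/obj/1] IN ([/gdc/md/p/obj/1/elements?id=1], [/gdc/md/p/obj/1/elements?id=2])"
```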
.create_filter(label, values) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 105

def self.create_filter(label, values)
  {
    :label => label[:label],
    :values => values,
    :over => label[:over],
    :to => label[:to]
  }
end
.create_label_cache(result, options = {}) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 170

def self.create_label_cache(result, options = {})
  project = options[:project]
  result.reduce({}) do |a, e|
    e[:filters].map do |filter|
      a[filter[:label]] = project.labels(filter[:label]) unless a.key?(filter[:label])
    end
    a
  end
end
.create_lookups_cache(small_labels) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 181

def self.create_lookups_cache(small_labels)
  small_labels.reduce({}) do |a, e|
    # The validElements API allows maximum paging with 10000 items
    lookup = e.values(:limit => 10_000).reduce({}) do |a1, e1|
      a1[e1[:value]] = e1[:uri]
      a1
    end
    a[e.uri] = lookup
    a
  end
end
.create_user_filter(expression, related) ⇒ Object
Encapsulates the creation of a filter.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 297

def self.create_user_filter(expression, related)
  {
    related: related,
    level: :user,
    expression: expression,
    type: :filter
  }
end
.create_user_profile_mapping(filters, project_users, options = {}) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 306

def self.create_user_profile_mapping(filters, project_users, options = {})
  domain = options[:domain]
  found_list = {}
  missing_list = []

  # Get the list of user logins from filters
  login_list = filters.flat_map do |filter|
    filter[:login]
  end

  # Then find user logins in the users_brick_input
  users_brick_input = options[:users_brick_input]
  if users_brick_input&.any?
    users_brick_input.map do |user|
      login_list << user.with_indifferent_access['login']
    end
  end

  login_list.uniq.flat_map do |login|
    user = project_users.find { |u| u.login == login }
    if user
      found_list[login] = user.profile_url
    else
      missing_list << login
    end
  end

  # rubocop:disable Metrics/BlockNesting
  unless missing_list.empty? || domain.nil?
    if missing_list.size < 100
      missing_list.each do |login|
        user = domain.find_user_by_login(login)
        found_list[login] = user.links['self'] if user
      end
    else
      if @all_domain_users[domain.name].nil?
        @mutex.lock
        if @all_domain_users[domain.name].nil?
          domain_users = domain.users
          @all_domain_users[domain.name] = domain_users
          GoodData.logger.info("action=lcm_get_domain_users domain=#{domain.name} number_users=#{domain_users.size} number_missing_users=#{missing_list.size} use_cache=false")
        else
          domain_users = @all_domain_users[domain.name]
          GoodData.logger.info("action=lcm_get_domain_users domain=#{domain.name} number_users=#{domain_users.size} number_missing_users=#{missing_list.size} use_cache=true")
        end
        @mutex.unlock
      else
        domain_users = @all_domain_users[domain.name]
        GoodData.logger.info("action=lcm_get_domain_users domain=#{domain.name} number_users=#{domain_users.size} number_missing_users=#{missing_list.size} use_cache=true")
      end
      missing_list.each do |login|
        user = domain_users.find { |u| u.login == login }
        found_list[login] = user.links['self'] if user
      end
    end
  end
  # rubocop:enable Metrics/BlockNesting
  found_list
end
.execute_mufs(user_filters, options = {}) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 475

def self.execute_mufs(user_filters, options = {})
  client = options[:client]
  project = options[:project]
  ignore_missing_values = options[:ignore_missing_values]
  users_must_exist = options[:users_must_exist] == false ? false : true
  dry_run = options[:dry_run]
  project_log_formatter = GoodData::ProjectLogFormatter.new(project)
  project_users = project.users

  filters = normalize_filters(user_filters)
  user_profile_mapping = create_user_profile_mapping(filters, project_users, options)
  user_filters, errors = maqlify_filters(filters, user_profile_mapping, options.merge(users_must_exist: users_must_exist, type: :muf))
  if !ignore_missing_values && !errors.empty?
    errors = errors.map do |e|
      e.merge(pid: project.pid)
    end
    fail GoodData::FilterMaqlizationError, errors
  end
  filters = user_filters.map { |data| client.create(MandatoryUserFilter, data, project: project) }
  to_create, to_delete = resolve_user_filters(filters, project.data_permissions)
  to_delete = sanitize_filters_to_delete(to_delete, options[:users_brick_input], user_profile_mapping) unless options[:no_sanitize]

  if options[:do_not_touch_filters_that_are_not_mentioned]
    GoodData.logger.warn("Data permissions computed: #{to_create.count} to create")
  else
    GoodData.logger.warn("Data permissions computed: #{to_create.count} to create and #{to_delete.count} to delete")
  end

  if dry_run
    GoodData.logger.warn('Option "dry_run" specified. No user filters will be altered!')
    create_results = to_create.map { |x| { status: 'dry_run', user: x.first, type: 'create' } }
    delete_results = to_delete.map { |x| { status: 'dry_run', user: x.first, type: 'delete' } }
    return { created: {}, deleted: {}, results: create_results + delete_results }
  end

  if to_create.empty?
    create_results = []
  else
    create_results = to_create.each_slice(100).flat_map do |batch|
      batch.pmapcat do |login, group|
        group.each(&:save)
        res = client.get("/gdc/md/#{project.pid}/userfilters?users=#{login}")
        items = res['userFilters']['items'].empty? ? [] : res['userFilters']['items'].first['userFilters']
        payload = {
          'userFilters' => {
            'items' => [
              {
                'user' => login,
                'userFilters' => items.concat(group.map(&:uri))
              }
            ]
          }
        }
        res = client.post("/gdc/md/#{project.pid}/userfilters", payload)
        # turn the errors from hashes into an array of hashes
        update_result = res['userFiltersUpdateResult'].flat_map do |k, v|
          v.map { |r| { status: k.to_sym, user: r, type: :create } }
        end
        update_result.map do |result|
          result[:status] == :failed ? result.merge(GoodData::Helpers.symbolize_keys(result[:user])) : result
        end
      end
    end
    project_log_formatter.log_user_filter_results(create_results, to_create)
    create_errors = create_results.select { |r| r[:status] == :failed }
    fail "Creating MUFs resulted in errors: #{create_errors}" if create_errors.any?
  end

  if to_delete.empty?
    delete_results = []
  elsif !options[:do_not_touch_filters_that_are_not_mentioned]
    delete_results = to_delete.each_slice(100).flat_map do |batch|
      batch.flat_map do |login, group|
        results = []
        if login
          res = client.get("/gdc/md/#{project.pid}/userfilters?users=#{login}")
          items = res['userFilters']['items'].empty? ? [] : res['userFilters']['items'].first['userFilters']
          payload = {
            'userFilters' => {
              'items' => [
                {
                  'user' => login,
                  'userFilters' => items - group.map(&:uri)
                }
              ]
            }
          }
          res = client.post("/gdc/md/#{project.pid}/userfilters", payload)
          results.concat(res['userFiltersUpdateResult']
            .flat_map { |k, v| v.map { |r| { status: k.to_sym, user: r, type: :delete } } }
            .map { |result| result[:status] == :failed ? result.merge(GoodData::Helpers.symbolize_keys(result[:user])) : result })
        end
        group.peach(&:delete)
        results
      end
    end
    project_log_formatter.log_user_filter_results(delete_results, to_delete)
    delete_errors = delete_results.select { |r| r[:status] == :failed } if delete_results
    fail "Deleting MUFs resulted in errors: #{delete_errors}" if delete_errors&.any?
  end

  { created: to_create, deleted: to_delete, results: create_results + (delete_results || []) }
end
.execute_variables(filters, var, options = {}) ⇒ Array
Executes the update for variables. It resolves what is new and what needs to be updated.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 456

def self.execute_variables(filters, var, options = {})
  client = options[:client]
  project = options[:project]
  dry_run = options[:dry_run]
  to_create, to_delete = execute(filters, var.user_values, VariableUserFilter, options.merge(type: :variable))
  return [to_create, to_delete] if dry_run

  # TODO: get values that are about to be deleted and created and update them.
  # This will make sure there is no downtime in filter existence
  unless options[:do_not_touch_filters_that_are_not_mentioned]
    to_delete.each { |_, group| group.each(&:delete) }
  end
  data = to_create.values.flatten.map(&:to_hash).map { |var_val| var_val.merge(prompt: var.uri) }
  data.each_slice(200) do |slice|
    client.post("/gdc/md/#{project.obj_id}/variables/user", :variables => slice)
  end
  [to_create, to_delete]
end
.get_filters(file, options = {}) ⇒ Boolean
Main entry function. Gets values and processes them into filters that are suitable for other functions to process. Values can be read from a file or provided inline as an array; the results are then preprocessed. It is possible to provide multiple values for an attribute, and the function tries to deduplicate values that are not unique. It allows setting OVER/TO filters and building filters from multiple columns. Many aspects of the configuration are modifiable, so you have to preprocess the data as little as possible; ideally you should be able to use data that came directly from the source system and is intended for use in other parts of ETL.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 39

def self.get_filters(file, options = {})
  values = get_values(file, options)
  reduce_results(values)
end
.get_missing_users(filters, options = {}) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 144

def self.get_missing_users(filters, options = {})
  users_cache = options[:users_cache]
  filters.reject { |u| users_cache.key?(u[:login]) }
end
.get_small_labels(labels_cache) ⇒ Object
Walks over the provided labels and picks those that have fewer than a certain number of values. This balances speed when working with small datasets (such as users): the values are precached, while larger labels still work, even though that means many more requests.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 224

def self.get_small_labels(labels_cache)
  labels_cache.values.select { |label| label&.values_count&.< 100_000 }
end
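The cutoff logic is a simple filter over the cached labels. A standalone sketch, using plain hashes in place of the library's label objects (names and sample data are invented for illustration):

```ruby
# Sketch of the "small label" cutoff: only labels with fewer than 100,000
# values are considered cheap enough to precache in full via validElements.
SMALL_LABEL_LIMIT = 100_000

def small_labels_sketch(labels_cache)
  labels_cache.values.select { |label| label[:values_count] && label[:values_count] < SMALL_LABEL_LIMIT }
end

cache = {
  'label/city'  => { uri: 'label/city',  values_count: 500 },
  'label/email' => { uri: 'label/email', values_count: 2_000_000 }
}
small_labels_sketch(cache).map { |l| l[:uri] }
# => ["label/city"]
```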
.maqlify_filters(filters, user_profile_mapping, options = {}) ⇒ Array
Resolves and creates MAQL statements from filter definitions. This method does not perform any modifications via the API; it only collects the information needed to do so. It gathers all input from the user and the current state of the project, compares them, and returns a suggestion of what should be deleted and what should be created. If there are discrepancies in the data (missing values, nonexistent users), it finishes and collects all the errors at once.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 376

def self.maqlify_filters(filters, user_profile_mapping, options = {})
  fail_early = options[:fail_early] == false ? false : true
  users_cache = options[:users_cache]
  labels_cache = create_label_cache(filters, options)
  small_labels = get_small_labels(labels_cache)
  lookups_cache = create_lookups_cache(small_labels)
  attrs_cache = create_attrs_cache(filters, options)

  create_filter_proc = proc do |login, f|
    expression, errors = create_expression(f, labels_cache, lookups_cache, attrs_cache, options, login)
    safe_login = login.downcase
    profiles_uri = if options[:type] == :muf
                     user_profile_mapping[safe_login].nil? ? ('/gdc/account/profile/' + safe_login) : user_profile_mapping[safe_login]
                   elsif options[:type] == :variable
                     (users_cache[login] && users_cache[login].uri)
                   else
                     fail 'Unsupported type in maqlify_filters.'
                   end
    if profiles_uri && expression && expression != 'TRUE'
      [create_user_filter(expression, profiles_uri)] + errors
    else
      [] + errors
    end
  end

  # if fail early, process until the first error
  results = if fail_early
              x = filters.inject([true, []]) do |(enough, a), e|
                login = e[:login]
                if enough
                  y = e[:filters].pmapcat { |f| create_filter_proc.call(login, f) }
                  [!y.any? { |r| r[:type] == :error }, a.concat(y)]
                else
                  [false, a]
                end
              end
              x.last
            else
              filters.flat_map do |filter|
                login = filter[:login]
                filter[:filters].pmapcat { |f| create_filter_proc.call(login, f) }
              end
            end
  results.group_by { |i| i[:type] }.values_at(:filter, :error).map { |i| i || [] }
end
.process_line(line, options = {}) ⇒ Object
Processes a line from the source file. A line is processed in one of two modes. The first mode is column-based: specific columns are picked, specified either by index or by name, and multiple values are provided via several rows for the same user.
The second mode is row-based: there are no headers and the number of columns can vary. Each row specifies multiple values for one user, and the file is assumed to provide values for just one label.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 93

def self.process_line(line, options = {})
  index = options[:user_column] || 0
  login = line[index]
  results = options[:labels].mapcat do |label|
    column = label[:column] || Range.new(1, -1)
    values = column.is_a?(Range) ? line.slice(column) : [line[column]]
    [create_filter(label, values.compact)]
  end
  [login, results]
end
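The row-based mode can be sketched without the GoodData dependency. This simplified replica (the method name is invented, and the over/to handling is dropped) shows how everything after the login column becomes values for the single configured label:

```ruby
# Simplified replica of process_line: pick the login from the user column,
# then slice the remaining cells into values for each configured label.
def process_line_sketch(line, options)
  index = options[:user_column] || 0
  login = line[index]
  results = options[:labels].flat_map do |label|
    column = label[:column] || Range.new(1, -1)
    values = column.is_a?(Range) ? line.slice(column) : [line[column]]
    [{ label: label[:label], values: values.compact }]
  end
  [login, results]
end

# Row-based: one label, variable number of values per row
login, filters = process_line_sketch(%w[john@example.com NYC SF], labels: [{ label: 'label/city' }])
# login   => "john@example.com"
# filters => [{ label: 'label/city', values: ['NYC', 'SF'] }]
```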
.read_data_with_header(file, memo, options) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 71

def self.read_data_with_header(file, memo, options)
  CSV.foreach(file, headers: true, return_headers: false) do |row|
    key, data = process_line(row, options)
    memo[key] = [] unless memo.key?(key)
    memo[key].concat(data)
  end
  memo
end
.read_data_without_header(file, memo, options) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 63

def self.read_data_without_header(file, memo, options)
  CSV.foreach(file, headers: false, return_headers: false) do |row|
    key, data = process_line(row, options)
    memo[key] = [] unless memo.key?(key)
    memo[key].concat(data)
  end
end
.read_file(file, options = {}) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 53

def self.read_file(file, options = {})
  memo = {}
  if row_based?(options)
    read_data_without_header(file, memo, options)
  else
    read_data_with_header(file, memo, options)
  end
  memo
end
.reduce_results(data) ⇒ Array
Processes values in a map-reduce fashion so the result is as readable as possible and has minimal impact on the API.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 119

def self.reduce_results(data)
  data.map { |k, v| { login: k, filters: UserFilterBuilder.collect_labels(v) } }
end
.resolve_user_filter(user = [], project = []) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 422

def self.resolve_user_filter(user = [], project = [])
  user ||= []
  project ||= []
  to_create = user - project
  to_delete = project - user
  { :create => to_create, :delete => to_delete }
end
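The method is plain set difference on the two filter lists: anything the user definition has but the project lacks gets created, and anything the project has but the definition lacks gets deleted. A standalone sketch (method name invented for illustration):

```ruby
# Set-difference resolution: compare desired filters against existing ones.
def resolve_user_filter_sketch(user, project)
  user ||= []
  project ||= []
  { :create => user - project, :delete => project - user }
end

resolve_user_filter_sketch(%w[f1 f2], %w[f2 f3])
# => { :create => ["f1"], :delete => ["f3"] }
```

Note that `f2`, present on both sides, is neither created nor deleted, which is what keeps unchanged filters untouched during an update.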
.resolve_user_filters(user_filters, vals) ⇒ Object
Gets user-defined filters and values from the project, regardless of whether they come from mandatory filters or variable filters, and tries to resolve what needs to be removed and what needs to be updated.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 433

def self.resolve_user_filters(user_filters, vals)
  project_vals_lookup = vals.group_by(&:related_uri)
  user_vals_lookup = user_filters.group_by(&:related_uri)
  a = vals.map(&:related_uri)
  b = user_filters.map(&:related_uri)
  users_to_try = (a + b).uniq
  results = users_to_try.map do |user|
    resolve_user_filter(user_vals_lookup[user], project_vals_lookup[user])
  end
  to_create = results.map { |x| x[:create] }.flatten.group_by(&:related_uri)
  to_delete = results.map { |x| x[:delete] }.flatten.group_by(&:related_uri)
  [to_create, to_delete]
end
.row_based?(options = {}) ⇒ Boolean
Function that tells you if the file should be read line-wise. This happens when only one label is defined and no columns are specified.
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 49

def self.row_based?(options = {})
  options[:labels].count == 1 && !options[:labels].first.key?(:column)
end
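The predicate can be demonstrated standalone (method name invented for illustration):

```ruby
# Row-based heuristic: exactly one label and no explicit :column means each
# CSV row is "login, value, value, ..." for that single label.
def row_based_sketch(options)
  options[:labels].count == 1 && !options[:labels].first.key?(:column)
end

row_based_sketch(labels: [{ label: 'label/city' }])                 # => true
row_based_sketch(labels: [{ label: 'label/city', column: 'city' }]) # => false
```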
.verify_existing_users(filters, options = {}) ⇒ Object
# File 'lib/gooddata/models/user_filters/user_filter_builder.rb', line 149

def self.verify_existing_users(filters, options = {})
  users_must_exist = options[:users_must_exist] == false ? false : true
  users_cache = options[:users_cache]
  domain = options[:domain]
  if users_must_exist
    missing_users = filters.reject do |u|
      next true if users_cache.key?(u[:login])
      domain_user = (domain && domain.find_user_by_login(u[:login]))
      users_cache[domain_user.login] = domain_user if domain_user
      next true if domain_user
      false
    end
    unless missing_users.empty?
      fail "#{missing_users.count} users are not part of the project and " \
           "variable cannot be resolved since :users_must_exist is set " \
           "to true (#{missing_users.join(', ')})"
    end
  end
end