Bulk Edit Users App 5 jobs [Morning Glory] 10/08/2022

IN PROGRESS


Overview

Per PERF-267, test Bulk Edit (PERF-271) with 10K records to understand the workflow behavior before and during a mod-data-export-worker task crash, if a crash occurs at all.

  • How long does it take to export 10K records?
  • What happens to the running job: will it resume and complete successfully when a new task is spun up?
  • Look for a memory trend and use it to decide how many concurrent jobs are needed to reach the tipping point.


Infrastructure

PTF environment

  • 10 m6i.2xlarge EC2 instances (changed from the m5.xlarge used in Lotus)
  • 2 db.r6.xlarge database instances: one reader and one writer
  • MSK
    • 4 m5.2xlarge brokers in 2 zones
    • auto.create.topics.enable = true
    • log.retention.minutes=120
    • 2 partitions per DI topic
  • okapi (3 running tasks)
    • 1024 CPU units, 1360 MB mem
  • mod-users (2 running tasks)
    • 128 CPU units, 896 MB mem
  • mod-data-export (1 running task)
    • 1024 CPU units, 896 MB mem
  • mod-data-export-spring (1 running task)
    • 256 CPU units, 1844 MB mem
  • mod-data-export-worker (1 running task)
    • 256 CPU units, 1844 MB mem
  • mod-notes (2 running tasks)
    • 128 CPU units, 896 MB mem
  • mod-agreements (2 running tasks)
    • 128 CPU units, 1382 MB mem


Software Versions

  • mod-data-export-worker: 1.5.0-SNAPSHOT.54
  • mod-data-export-spring: 1.4.2
  • mod-agreements: 5.2.1
  • mod-notes: 3.1.0
  • mod-users: 18.3.0

Test Runs

Scenario
Users App:

1. Navigate to the Bulk edit app
2. Select the Users App
3. Select "Users identifier" from the "Records identifier" dropdown
4. Upload a .csv file with user identifiers by dragging it onto the drag-and-drop area
5. Click the "Actions" menu => "Download matched records (CSV)"
6. Open the file downloaded to the local machine
7. Modify the users' status or patron group in the file => save changes
8. Click the "Actions" menu => select "Start bulk edit (CSV)"
9. Upload the modified file to the drag-and-drop zone => hit "Next" => hit "Commit changes"
10. Click "Actions" => select "Download changed records (CSV)"

Record identifier files location - bulk_edit_test_data.zip
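The per-job identifier files used below (five CSVs of 2,500 distinct user barcodes each) can be produced from a master barcode list with a short script. This is only a sketch of how such files could be generated; the master file name and the single-column barcode layout are assumptions, not part of the test harness.

```python
import csv

def split_barcodes(master_path, rows_per_file, n_files, prefix="2500-user"):
    """Split a master list of user barcodes into per-job CSV files.

    Produces files named like 2500-user1-barcodes.csv, matching the
    naming used in the test run. Raises if the master list is too short.
    """
    with open(master_path, newline="") as f:
        barcodes = [row[0] for row in csv.reader(f) if row]
    needed = rows_per_file * n_files
    if len(barcodes) < needed:
        raise ValueError(f"need {needed} barcodes, found {len(barcodes)}")
    names = []
    for i in range(n_files):
        name = f"{prefix}{i + 1}-barcodes.csv"
        chunk = barcodes[i * rows_per_file:(i + 1) * rows_per_file]
        with open(name, "w", newline="") as out:
            csv.writer(out).writerows([b] for b in chunk)
        names.append(name)
    return names
```

Keeping the barcode sets disjoint per file matters here, since the test relies on each job editing a distinct set of users.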

Test 1

  • Records number: 2500
  • Job count: 5
  • Results for finding records: 3/5 PASS, 2/3 stuck IN_PROGRESS
  • Results for updating records: 3/3 uploaded files were updated successfully
  • Comment: all files have distinct user barcodes, different for each file
  • Record identifier file: files with barcodes and different names: 2500-user1-barcodes.csv, 2500-user2-barcodes.csv, 2500-user3-barcodes.csv, 2500-user4-barcodes.csv, 2500-user5-barcodes.csv
  • Time to process (file upload time + edited file upload time + commit time): ±40 sec upload time, ±40 sec edited file upload time, ±1 min commit changes time

Results

Summary 

  • This is the initial test report for the Bulk Edit Users App functionality.
  • 5 parallel jobs can run simultaneously only if started with a ramp-up of at least 10 sec between them (for both the upload and the editing steps). Jobs started at shorter intervals remain in IN_PROGRESS status indefinitely.
  • Records file upload time: about 40 sec.
  • Edited records file upload time: about 40 sec.
  • Commit changes time: about 1 min.
  • Memory trend: memory usage is stable.
  • CPU utilization was very high while uploading records: up to 255% for mod-users and up to 111% for mod-data-export-worker.
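The 10-second ramp-up finding above can be encoded in a simple staggered launcher. This is a sketch only; `submit_job` is a hypothetical stand-in for whatever starts one bulk-edit job (e.g. one identifier file upload), not a FOLIO API.

```python
import time

RAMP_UP_SECONDS = 10  # minimum stagger observed to keep all 5 jobs completing

def staggered_offsets(n_jobs, ramp_up=RAMP_UP_SECONDS):
    """Start offsets in seconds from t0 for n_jobs staggered launches."""
    return [i * ramp_up for i in range(n_jobs)]

def launch_staggered(jobs, submit_job, ramp_up=RAMP_UP_SECONDS):
    """Submit each job in order, sleeping `ramp_up` seconds between submissions.

    `submit_job` is a hypothetical callable that kicks off one bulk-edit job.
    """
    for i, job in enumerate(jobs):
        if i:
            time.sleep(ramp_up)
        submit_job(job)
```

With 5 jobs and the observed minimum, submissions would land at 0, 10, 20, 30, and 40 seconds from the start.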


Memory usage


RDS CPU utilization

RDS CPU utilization did not exceed 31%

CPU utilization

CPU utilization was very high: up to 255% for mod-users and up to 111% for mod-data-export-worker.



Notable observations

  • There is no way to track the progress of the edited file upload, or even to tell whether the upload has started.
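Given the lack of a progress indicator, a polling watchdog can at least flag jobs that stay IN_PROGRESS too long, matching the stuck-job symptom seen in the test run. This is a sketch; `get_job_status` is a hypothetical accessor, not a real FOLIO endpoint.

```python
import time

def wait_for_job(get_job_status, job_id, timeout=600, interval=15,
                 clock=time.monotonic, sleep=time.sleep):
    """Poll a job until it leaves IN_PROGRESS or the timeout elapses.

    get_job_status(job_id) should return a status string. Returns the
    final status, or "STUCK_IN_PROGRESS" if the timeout is exceeded.
    """
    deadline = clock() + timeout
    while clock() < deadline:
        status = get_job_status(job_id)
        if status != "IN_PROGRESS":
            return status
        sleep(interval)
    return "STUCK_IN_PROGRESS"
```

The `clock` and `sleep` parameters are injectable so the watchdog can be exercised without waiting in real time.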