IN PROGRESS

...

Infrastructure

PTF environment

  • 10 m6i.2xlarge EC2 instances (changed; in the Lotus release it was m5.xlarge)
  • 2 db.r6.xlarge database instances, one reader and one writer
  • MSK
    • 4 m5.2xlarge brokers in 2 zones
    • auto.create.topics.enable = true
    • log.retention.minutes = 120
    • 2 partitions per DI topic
  • okapi (3 running tasks)
    • 1024 CPU units, 1360 MB memory
  • mod-users (2 running tasks)
    • 128 CPU units, 896 MB memory
  • mod-data-export (1 running task)
    • 1024 CPU units, 896 MB memory
  • mod-data-export-spring (1 running task)
    • 256 CPU units, 1844 MB memory
  • mod-data-export-worker (1 running task)
    • 256 CPU units, 1844 MB memory
  • mod-notes (2 running tasks)
    • 128 CPU units, 896 MB memory
  • mod-agreements (2 running tasks)
    • 128 CPU units, 1382 MB memory
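
The MSK settings listed above correspond to standard Apache Kafka broker properties. A minimal server.properties sketch with those values (using num.partitions as the default per-topic partition count for the DI topics is an assumption; MSK normally applies these through a cluster configuration rather than a file edit):

```properties
# Broker settings for the PTF MSK cluster (values from the list above)
auto.create.topics.enable=true
# Retain log segments for 2 hours
log.retention.minutes=120
# Default partition count for auto-created topics (assumed to cover the DI topics)
num.partitions=2
```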


Software Versions

  • mod-data-export-worker: 1.4.1
  • mod-data-export-spring: 1.4.1
  • mod-agreements: 5.2.0
  • mod-notes: 3.1.0
  • mod-users: 18.3.0

Test Runs

Scenario: Users App

...

| Test | Records number | Duration | Results for finding records | Comment | Record identifier file name | Time to process (file upload + edited file upload + commit) |
|------|----------------|----------|------------------------------|---------|------------------------------|-------------------------------------------------------------|
| 1 | 100 | multiple-time check | PASS | Always passes | 100_User_barcodes_ptf.csv (test data: bulk_edit_test_data.zip) | about 5 + 5 + 2 sec |
| 2 | 1000 | multiple-time check | PASS | Always passes | 1.000_User_barcodes_ptf.csv | about 15 + 15 + 5 sec |
| 3 | 2000 | multiple-time check | PASS | Always passes | 2.000_User_barcodes_ptf.csv | about 20 + 20 + 15 sec |
| 4 | 2500 | multiple-time check | PASS | Always passes | 2.500_User_barcodes_ptf.csv | about 30 + 30 sec |
| 5 | 2560 | multiple-time check | PASS | Always passes (the maximum record number) | 2.560_User_barcodes_ptf.csv | about 30 + 30 sec |
| 6 | 2590 | multiple-time check | PASS/FAIL | Sometimes passes, sometimes fails | 2.590_User_barcodes_ptf.csv | about 30 sec |
| 7 | 2600 | multiple-time check | FAIL | The identifier file can be uploaded, but the edited file upload is not available | 2.600_User_barcodes_ptf.csv | about 30 sec |
| 8 | 3000 | multiple-time check | FAIL | The identifier file can be uploaded, but the edited file upload is not available | 3.000_User_barcodes_ptf.csv | about 30 sec |
| 9 | 5000 | multiple-time check | FAIL | The identifier file can be uploaded, but the edited file upload is not available | 5.000_User_barcodes_ptf.csv | about 30 sec |
| 10 | 10000 | multiple-time check | FAIL | The identifier file can be uploaded, but the edited file upload is not available | 10.000_User_barcodes_ptf.csv | about 30 sec |
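
The identifier files above are plain CSV lists of user barcodes. A minimal Python sketch for generating such a file of a given size (the 10-digit zero-padded barcode format is an assumption for illustration; the real PTF files come from the bulk_edit_test_data set):

```python
import csv
import random


def make_barcode_csv(path, num_records, seed=42):
    """Write a CSV with one synthetic user barcode per line,
    mimicking files like 100_User_barcodes_ptf.csv (format assumed)."""
    rng = random.Random(seed)  # fixed seed for reproducible test files
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for _ in range(num_records):
            # 10-digit zero-padded numeric barcode; real barcodes may differ
            writer.writerow([f"{rng.randrange(10**10):010d}"])


make_barcode_csv("100_User_barcodes_ptf.csv", 100)
```

Zero-padded numeric barcodes like these are exactly the kind of value Excel reformats when the file is opened directly, as noted in the observations below.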

...

  • This is the initial test report for the Bulk Edit Users App functionality.
  • 10K records cannot be exported. The limit is about 2560 records with no failures (3 attempts); up to 2590 records succeeds only intermittently, and from 2600 records on, the job fails.
  • The record identifier file upload takes about 30 sec for both successful and failed runs.
  • The file with edited data also takes about 30 sec to upload.
  • The system is unstable and fails every time during the commit-changes procedure for more than 2000 users (the FOLIO account blocks).
    • Files with the whole data set can be downloaded for making changes. On Windows, opening the file in Excel automatically converts barcode values to Excel's suggested number format, so users can unintentionally change barcodes.
    • The start of the process is not indicated (no warning that the process has started is shown).
  • Memory trend: memory usage is stable.
  • CPU utilization for mod-users was very high, up to 135%, for the 3000-, 5000-, and 10000-record bulk edits, where the record identifier file upload failed.
  • 5 parallel jobs of 10K records each can run simultaneously only if started with a ramp-up of at least 10 sec (for both the upload and the editing processes). Jobs started with intervals shorter than 10 sec stay in IN_PROGRESS status forever.
  • A failover test was performed while uploading the file with 2000 records (the mod-data-export-worker task was stopped): the result was "Fail to upload file", and the job status becomes "In progress" and does not change.
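
The ramp-up requirement above can be enforced with a simple sequential submitter. A minimal sketch, where `start_job` is a placeholder for whatever call submits a bulk-edit job (the 10 s default comes from the observation above; the function name and signature are illustrative, not part of any FOLIO API):

```python
import time


def start_jobs_with_ramp_up(start_job, num_jobs, ramp_up_seconds=10.0):
    """Submit jobs one at a time, waiting ramp_up_seconds between
    submissions so parallel bulk-edit jobs do not get stuck IN_PROGRESS.
    Returns the monotonic timestamp at which each job was started."""
    started_at = []
    for job_index in range(num_jobs):
        if job_index > 0:
            # Space submissions out by the ramp-up interval
            time.sleep(ramp_up_seconds)
        started_at.append(time.monotonic())
        start_job(job_index)
    return started_at
```

For example, `start_jobs_with_ramp_up(submit, 5)` would space five 10K-record jobs 10 seconds apart, matching the minimum interval observed to keep the jobs from hanging.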

...