Overview
- This document contains the results of testing Data Import for MARC Bibliographic records with an update job on the Quesnelia [ECS] release in the qcon environment.
- PERF-846
Summary
- Data Import tests finished successfully in the qcon environment using the PTF - Updates Success - 2 profile and a file with 25K records.
- Comparing with previous testing results for the Poppy and Quesnelia releases:
- Data Import processed all jobs, including the test on 3 tenants concurrently, without errors on the Quesnelia release.
- Data Import durations stayed in the same average time range for the Quesnelia release, and it works stably and without errors.
- During testing, we noticed that mod-permission did not have any spikes and used 12% CPU on the Quesnelia release. On the Poppy release we had an error.
Test Runs
| Test № | Scenario | Test Conditions | Results |
|---|---|---|---|
| 1 | DI MARC Bib Create | 5K, 10K, 25K, 50K, 100K consecutively (with 5 min pause) | Completed |
|  | CICO | 8 users |  |
| 2 | DI MARC Bib Update | 5K, 10K, 25K, 50K, 100K consecutively (with 5 min pause) |  |
|  | CICO | 8 users |  |
Test Results
This table contains durations for Data Import.
| Profile | MARC File | DI Duration Quesnelia (hh:mm:ss) | CI Average, sec (Quesnelia, 8 users) | CO Average, sec (Quesnelia, 8 users) |
|---|---|---|---|---|
| DI MARC Bib Create (PTF - Create 2) | 5K.mrc | 0:03:21 |  |  |
|  | 10K.mrc | 0:06:51 |  |  |
|  | 25K.mrc | 0:12:41 |  |  |
|  | 50K.mrc | 0:23:19 |  |  |
|  | 100K.mrc | 0:51:24 |  |  |
| DI MARC Bib Update (PTF - Updates Success - 2) | 5K.mrc | 0:04:12 |  |  |
|  | 10K.mrc | 0:08:15 |  |  |
|  | 25K.mrc | 0:20:38 |  |  |
|  | 50K.mrc | 0:43:06 |  |  |
|  | 100K.mrc | 1:29:09 |  |  |
Comparison
This table contains a durations comparison between the Poppy and Quesnelia releases.
Resource utilization for Test №1
Service CPU Utilization
Here we can see that mod-data-import spiked to 150% CPU.
Service Memory Utilization
Here we can see that all modules show a stable trend.
DB CPU Utilization
DB CPU was 92%.
DB Connections
Max number of DB connections was 1690.
DB load
Top SQL queries

| # | TOP 5 SQL statements |
|---|---|
| 1 |  |
| 2 |  |
| 3 |  |
| 4 |  |
| 5 |  |
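The report does not state how the top statements were collected. Below is a minimal sketch of one way to gather such a list, assuming the pg_stat_statements extension is enabled on the RDS writer instance; the DSN, database name, and credentials are placeholders, not values from this test environment.

```python
# Hypothetical sketch: pull the top 5 SQL statements by total execution time
# from pg_stat_statements (assumes the extension is enabled on the writer instance).
import psycopg2

QUERY = """
    SELECT calls,
           round(total_exec_time::numeric, 2) AS total_ms,
           round(mean_exec_time::numeric, 2)  AS mean_ms,
           left(query, 120)                   AS statement
    FROM pg_stat_statements
    ORDER BY total_exec_time DESC
    LIMIT 5;
"""

def top_statements(dsn: str):
    """Return the five most expensive statements captured during the test run."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(QUERY)
        return cur.fetchall()

if __name__ == "__main__":
    # Placeholder DSN -- replace with the qcon writer endpoint and credentials.
    for row in top_statements("host=<writer-endpoint> dbname=<db> user=<user> password=<password>"):
        print(row)
```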
Resource utilization for Test №2
Service CPU Utilization
Here we can see that mod-data-import spiked to 130% CPU.
Service Memory Utilization
Here we can see that all modules show a stable trend.
DB CPU Utilization
DB CPU was 92%.
DB Connections
Max number of DB connections was 1685.
DB load
Top SQL queries

| # | TOP 5 SQL statements |
|---|---|
| 1 |  |
| 2 |  |
| 3 |  |
| 4 |  |
| 5 |  |
Appendix
Infrastructure
PTF - environment Quesnelia (qcon)
- 11 m6i.2xlarge EC2 instances located in US East (N. Virginia), us-east-1
- 1 instance of db.r6.xlarge database instance: Writer instance
- OpenSearch
  - domain: fse
  - Number of nodes: 9
  - Version: OpenSearch_2_7_R20240502
- MSK - tenant
  - 4 kafka.m5.2xlarge brokers in 2 zones
  - Apache Kafka version 2.8.0
  - EBS storage volume per broker 300 GiB
  - auto.create.topics.enable=true
  - log.retention.minutes=480
  - default.replication.factor=3
  - Kafka consolidated topics enabled
Methodology/Approach
DI test scenarios (DI MARC Bib Create and Update) were started from the UI on the Quesnelia (qcon) env with the file splitting feature enabled on an ECS environment.
Test runs:
- Test 1: Manually tested 5K, 10K, 25K, 50K, and 100K record files consecutively (with 5 min pause); DI (DI MARC Bib Create) was started on the College tenant (cs00000int_0001) only, with CICO running with 8 users in the background.
- Test 2: Manually tested 5K, 10K, 25K, 50K, and 100K record files consecutively (with 5 min pause); DI (DI MARC Bib Update) was started on the College tenant (cs00000int_0001) only, with CICO running with 8 users in the background.
At the time of the test run, Grafana was not available. As a result, response times for Check-In/Check-Out were parsed manually from the .jtl files, using the start and finish dates of the Data Import tests. These results were visualized in JMeter using a Listener (Response Times Over Time), as sketched below.
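A minimal sketch of that manual parsing step, assuming the default JMeter CSV schema for the .jtl files; the transaction labels ("Check-In", "Check-Out"), the file name, and the time window are assumptions and should be adjusted to the actual CICO script labels and DI job timestamps.

```python
# Sketch: compute average Check-In/Check-Out response times from a JMeter .jtl
# (CSV) file, limited to samples inside the Data Import run window.
import csv
from datetime import datetime, timezone

def average_response_times(jtl_path, start, end, labels=("Check-In", "Check-Out")):
    """Return {label: average elapsed ms} for samples between start and end."""
    totals = {label: [0, 0] for label in labels}   # label -> [sum_ms, count]
    start_ms = start.timestamp() * 1000
    end_ms = end.timestamp() * 1000
    with open(jtl_path, newline="") as fh:
        for row in csv.DictReader(fh):
            ts = float(row["timeStamp"])           # epoch milliseconds
            if not (start_ms <= ts <= end_ms):
                continue
            for label in labels:
                if label in row["label"]:
                    totals[label][0] += int(row["elapsed"])
                    totals[label][1] += 1
    return {label: (s / n if n else None) for label, (s, n) in totals.items()}

if __name__ == "__main__":
    # Hypothetical window -- replace with the actual DI job start/finish times.
    window_start = datetime(2024, 5, 1, 10, 0, tzinfo=timezone.utc)
    window_end = datetime(2024, 5, 1, 11, 30, tzinfo=timezone.utc)
    print(average_response_times("cico_results.jtl", window_start, window_end))
```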