...
| Profile | MARC File | DI Duration, Quesnelia (hh:mm:ss) | CI Average, sec (8 users) | CO Average, sec (8 users) |
| --- | --- | --- | --- | --- |
| DI MARC Bib Create (PTF - Create 2) | 5K.mrc | 0:03:21 | 0.831 | 1.357 |
| DI MARC Bib Create (PTF - Create 2) | 10K.mrc | 0:06:51 | 0.845 | 1.410 |
| DI MARC Bib Create (PTF - Create 2) | 25K.mrc | 0:12:41 | 0.719 | 1.333 |
| DI MARC Bib Create (PTF - Create 2) | 50K.mrc | 0:23:19 | 0.691 | 1.327 |
| DI MARC Bib Create (PTF - Create 2) | 100K.mrc | 0:51:24 | 0.664 | 1.335 |
| DI MARC Bib Create (PTF - Updates Success - 2) | 5K.mrc | 0:04:12 | 0.764 | 1.458 |
| DI MARC Bib Create (PTF - Updates Success - 2) | 10K.mrc | 0:08:15 | 0.779 | 1.377 |
| DI MARC Bib Create (PTF - Updates Success - 2) | 25K.mrc | 0:20:38 | 0.755 | 1.401 |
| DI MARC Bib Create (PTF - Updates Success - 2) | 50K.mrc | 0:43:06 | 0.750 | 1.444 |
| DI MARC Bib Create (PTF - Updates Success - 2) | 100K.mrc | 1:29:09 | 0.730 | 1.458 |
Check-in/Check-out without DI
| Scenario | Load level | Request | 95th perc, sec (Quesnelia) | Average, sec (Quesnelia) |
| --- | --- | --- | --- | --- |
| Circulation Check-in/Check-out (without Data Import) | 8 users | Check-in | 0.635 | 0.493 |
| Circulation Check-in/Check-out (without Data Import) | 8 users | Check-out | 1.243 | 1.078 |
Comparison
This table compares DI durations (with CICO running in the background) and the corresponding Check-In/Check-Out response times between the Poppy and Quesnelia releases.
| Profile | MARC File | DI Duration with CI/CO, Poppy (hh:mm:ss) | DI Duration with CI/CO, Quesnelia (hh:mm:ss) | DI Delta Poppy/Quesnelia (hh:mm:ss) | CI Average, sec (Poppy) | CO Average, sec (Poppy) | CI Average, sec (Quesnelia) | CO Average, sec (Quesnelia) | Delta CI, % (Poppy/Quesnelia) | Delta CO, % (Poppy/Quesnelia) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| DI MARC Bib Create (PTF - Create 2) | 5K.mrc | 00:02:53 | 00:03:21 | +00:00:28 | 0.901 | 1.375 | 0.831 | 1.357 | | |
| DI MARC Bib Create (PTF - Create 2) | 10K.mrc | 00:04:32 | 00:06:51 | +00:02:19 | 0.902 | 1.47 | 0.845 | 1.410 | | |
| DI MARC Bib Create (PTF - Create 2) | 25K.mrc | 00:11:14 | 00:12:41 | +00:01:27 | 1 | 1.571 | 0.719 | 1.333 | | |
| DI MARC Bib Create (PTF - Create 2) | 50K.mrc | 00:21:55 | 00:23:19 | +00:01:24 | 0.981 | 1.46 | 0.691 | 1.327 | | |
| DI MARC Bib Create (PTF - Create 2) | 100K.mrc | 00:47:02 | 00:51:24 | +00:04:22 | 1.018 | 1.491 | 0.664 | 1.335 | | |
| DI MARC Bib Create (PTF - Updates Success - 2) | 5K.mrc | 00:03:19 | 00:04:12 | +00:00:53 | 0.755 | 1.169 | 0.764 | 1.458 | | |
| DI MARC Bib Create (PTF - Updates Success - 2) | 10K.mrc | 00:06:20 | 00:08:15 | +00:01:55 | 0.75 | 1.307 | 0.779 | 1.377 | | |
| DI MARC Bib Create (PTF - Updates Success - 2) | 25K.mrc | 00:14:04 | 00:20:38 | +00:06:34 | 0.822 | 1.403 | 0.755 | 1.401 | | |
| DI MARC Bib Create (PTF - Updates Success - 2) | 50K.mrc | 00:29:59 | 00:43:06 | +00:13:07 | 0.893 | 1.424 | 0.750 | 1.444 | | |
| DI MARC Bib Create (PTF - Updates Success - 2) | 100K.mrc | 01:03:03 | 01:29:09 | +00:26:06 | 0.908 | 1.51 | 0.730 | 1.458 | | |
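For reference, the duration deltas in the table are plain differences between the two release run times. A minimal sketch of the arithmetic is shown below; the values are hard-coded from the 100K.mrc Create row, and the percentage formula (relative to Poppy) is an assumption, not something stated in the report.

```python
# Minimal sketch of how the comparison deltas can be derived.
# Sample values come from the 100K.mrc "DI MARC Bib Create" row above.
from datetime import timedelta

poppy_duration = timedelta(hours=0, minutes=47, seconds=2)       # 00:47:02
quesnelia_duration = timedelta(hours=0, minutes=51, seconds=24)  # 00:51:24
print("DI delta:", quesnelia_duration - poppy_duration)          # 0:04:22

poppy_ci, quesnelia_ci = 1.018, 0.664                            # CI averages, sec
# Assumed definition: relative change from Poppy to Quesnelia.
print(f"CI delta: {(quesnelia_ci - poppy_ci) / poppy_ci * 100:+.1f}%")  # about -34.8%
```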
Detailed CICO response time comparison without DI
| Scenario | Load level | Request | 95th perc, sec (Poppy) | Average, sec (Poppy) | 95th perc, sec (Quesnelia) | Average, sec (Quesnelia) |
| --- | --- | --- | --- | --- | --- | --- |
| Circulation Check-in/Check-out (without Data Import) | 8 users | Check-in | 0.489 | 0.431 | 0.635 | 0.493 |
| Circulation Check-in/Check-out (without Data Import) | 8 users | Check-out | 0.969 | 0.828 | 1.243 | 1.078 |
Detailed CICO response time comparison for CICO with DI in Poppy
| Request* | Pure CICO (avg, sec) | CICO + 100K MARC BIB Create (avg, sec) | CICO + 100K MARC BIB Update (avg, sec) |
| --- | --- | --- | --- |
| Check-Out Controller | 0.828 | 1.491 | 1.51 |
| Check-In Controller | 0.431 | 1.018 | 0.908 |
| POST_circulation/check-out-by-barcode (Submit_barcode_checkout) | 0.266 | 0.647 | 0.718 |
| POST_circulation/check-in-by-barcode (Submit_barcode_checkin) | 0.187 | 0.57 | 0.477 |
| GET_circulation/loans (Submit_barcode_checkout) | 0.128 | 0.233 | 0.215 |
| GET_inventory/items (Submit_barcode_checkin) | 0.048 | 0.126 | 0.118 |
| GET_inventory/items (Submit_barcode_checkout) | 0.046 | 0.125 | 0.117 |
| GET_note-links (Submit_barcode_checkout) | 0.046 | 0.024 | 0.024 |
| GET_users (Submit_patron_barcode) | 0.037 | 0.041 | 0.037 |
| GET_circulation/loans (Submit_patron_barcode) | 0.028 | 0.03 | 0.049 |
| GET_automated-patron-blocks (Submit_patron_barcode) | 0.024 | 0.026 | 0.024 |
| GET_users (Get_check_in_page) | 0.023 | 0.054 | 0.051 |
*Top-10 requests were taken for analysis.
...
- Test 1: The 5K, 10K, 25K, 50K, and 100K record files were run manually and consecutively (with a 5-minute pause between files); DI (DI MARC Bib Create) was started on the College tenant (cs00000int_0001) only, with CICO running in the background at 8 users.
- Test 2: The 5K, 10K, 25K, 50K, and 100K record files were run manually and consecutively (with a 5-minute pause between files); DI (DI MARC Bib Update) was started on the College tenant (cs00000int_0001) only, with CICO running in the background at 8 users (the sequential run order for both tests is sketched below).
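A minimal sketch of that sequential run order follows. The start_data_import() helper is hypothetical (the imports in these tests were started manually through the Data Import app); it is shown only to make the pacing explicit.

```python
# Sketch of the sequential run order used in Test 1 / Test 2 (not the actual harness).
# start_data_import() is a hypothetical placeholder for submitting one MARC file
# against the College tenant (cs00000int_0001) and waiting for the job to finish;
# CICO load (8 users) keeps running in the background the whole time.
import time

MARC_FILES = ["5K.mrc", "10K.mrc", "25K.mrc", "50K.mrc", "100K.mrc"]
PAUSE_SECONDS = 5 * 60  # 5-minute pause between files


def start_data_import(file_name: str, tenant: str = "cs00000int_0001") -> None:
    """Hypothetical helper: submit the file and block until the DI job completes."""
    raise NotImplementedError


for i, marc_file in enumerate(MARC_FILES):
    start_data_import(marc_file)          # run one import to completion
    if i < len(MARC_FILES) - 1:
        time.sleep(PAUSE_SECONDS)         # pause before the next file
```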
At the time of the test run, Grafana was not available. As a result, Check-In/Check-Out response times were parsed manually from the .jtl files, using the start and finish times of the data import tests. The results were visualized in JMeter using the Response Times Over Time listener.
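A minimal sketch of that manual parsing step is shown below, assuming the default CSV .jtl layout (timeStamp in epoch milliseconds, elapsed in milliseconds, one label per sampler); the file name and the DI start/finish timestamps are placeholders.

```python
# Slice Check-In/Check-Out samples from a JMeter CSV .jtl file by the DI window
# and summarize average and 95th-percentile response times per request label.
import csv
from datetime import datetime
from statistics import mean, quantiles

JTL_FILE = "cico_results.jtl"                   # placeholder path
DI_START = datetime(2024, 4, 1, 10, 0, 0)       # placeholder: DI job start
DI_FINISH = datetime(2024, 4, 1, 10, 51, 24)    # placeholder: DI job finish

samples: dict[str, list[float]] = {}            # label -> response times, sec
with open(JTL_FILE, newline="") as f:
    for row in csv.DictReader(f):
        ts = datetime.fromtimestamp(int(row["timeStamp"]) / 1000)
        if DI_START <= ts <= DI_FINISH:
            samples.setdefault(row["label"], []).append(int(row["elapsed"]) / 1000)

for label, times in sorted(samples.items()):
    p95 = quantiles(times, n=100)[94]           # 95th percentile
    print(f"{label}: avg={mean(times):.3f} s, 95th={p95:.3f} s, n={len(times)}")
```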