Check-in-check-out Test Report (Kiwi)

Overview

This report covers a series of check-in/check-out test runs against the Kiwi release.

Back End:

  • mod-circulation-22.1.0
  • mod-circulation-storage-13.1.0
  • mod-inventory-18.0.0
  • mod-inventory-storage-22.0.0
  • mod-authtoken-2.9.0
  • mod-pubsub-2.4.0
  • mod-patron-blocks-1.4.0
  • mod-feesfines-17.0.0
  • okapi-4.9.0

Front End:

  • folio_circulation-6.0.0
  • Item Check-in (folio_checkin-6.0.0)
  • Item Check-out (folio_checkout-7.0.0)

Infrastructure:

  • 71 back-end modules deployed in 141 ECS tasks
  • 3 okapi ECS services
  • 6 m5.xlarge EC2 instances
  • 2 db.r6g.xlarge AWS RDS instances
  • INFO logging level

High Level Summary

  • Kiwi performs much better than Juniper GA: response times are under 900 ms for check-ins and under 1,500 ms for check-outs, with little variation between 1 and 20 users
  • Database performance is better, and the database uses much less CPU compared to Juniper GA
  • The worst-performing APIs are still POST /checkin-by-barcode and POST /checkout-by-barcode, whose response times remain around 500 ms. GET /circulation/loans also takes more than 200 ms, while GET /inventory/item (by barcode) now takes less than 100 ms
  • The longevity test shows response times worsening over time, probably due to growing DB CPU utilization. CIRCSTORE-304 could potentially address this
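For reference, the two workflows exercised here reduce to a pair of mod-circulation calls. Below is a minimal sketch of the request bodies and Okapi headers involved; the barcodes, tenant, and service-point UUID are illustrative placeholders, not values from the test environment:

```python
import json
from datetime import datetime, timezone

# Hypothetical service-point UUID for illustration only.
SERVICE_POINT_ID = "c4c90014-c8c9-4ade-8f24-b5e313319f4b"

def checkin_payload(item_barcode: str) -> dict:
    """Body for the check-in-by-barcode POST."""
    return {
        "itemBarcode": item_barcode,
        "servicePointId": SERVICE_POINT_ID,
        "checkInDate": datetime.now(timezone.utc).isoformat(),
    }

def checkout_payload(item_barcode: str, user_barcode: str) -> dict:
    """Body for the check-out-by-barcode POST."""
    return {
        "itemBarcode": item_barcode,
        "userBarcode": user_barcode,
        "servicePointId": SERVICE_POINT_ID,
    }

# Every request carries the tenant and token headers Okapi expects.
headers = {
    "X-Okapi-Tenant": "diku",
    "X-Okapi-Token": "<token from mod-authtoken>",
    "Content-Type": "application/json",
}

print(json.dumps(checkout_payload("1234567890", "patron-001"), indent=2))
```

In the tests, each virtual user loops over these two requests against the deployed Okapi cluster.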

Test Runs

| Test | Virtual Users | Duration |
|------|---------------|----------|
| 1.   | 1             | 30 mins  |
| 2.   | 5             | 30 mins  |
| 3.   | 8             | 30 mins  |
| 4.   | 20            | 30 mins  |
| 5.   | 1 (repeat)    | 30 mins  |

Results

Response Times (average of all tests listed above, in seconds)

| Load     | Average Check-in | Average Check-out | 50th %tile Check-in | 50th %tile Check-out | 75th %tile Check-in | 75th %tile Check-out | 95th %tile Check-in | 95th %tile Check-out |
|----------|------------------|-------------------|---------------------|----------------------|---------------------|----------------------|---------------------|----------------------|
| 1 user   | 0.838            | 1.582             | 0.767               | 1.464                | 0.915               | 1.716                | 0.915               | 1.716                |
| 5 users  | 0.73             | 1.376             | 0.676               | 1.272                | 0.769               | 1.448                | 1.052               | 2.061                |
| 8 users  | 0.758            | 1.392             | 0.674               | 1.228                | 0.839               | 1.44                 | 1.202               | 2.343                |
| 20 users | 0.899            | 1.506             | 0.731               | 1.303                | 0.931               | 1.535                | 1.531               | 2.504                |
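For context, the percentile figures reported here are order statistics over the per-request response times collected during each run. A sketch of the computation using Python's standard statistics module, on made-up sample data rather than the actual measurements:

```python
import statistics

# Illustrative response times in seconds (not actual test data).
samples = [0.68, 0.71, 0.73, 0.75, 0.76, 0.79, 0.84, 0.91, 1.05, 1.53]

average = statistics.mean(samples)

# quantiles(n=100) returns the 1st..99th percentile cut points.
pct = statistics.quantiles(samples, n=100)
p50, p75, p95 = pct[49], pct[74], pct[94]

print(f"avg={average:.3f}s p50={p50:.3f}s p75={p75:.3f}s p95={p95:.3f}s")
```

The gap between the average and the 95th percentile in the tables reflects a long tail of slow requests, which a mean alone would hide.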

Response times are consistently good: sub-second for check-ins and around 1.5 seconds for check-outs, even at the 75th percentile. There is little variation between 1 and 20 users.

Comparisons to Last Release

The following tables compare Juniper HF3 test results against Kiwi General release (deployed at Bugfest).

Response Time Comparison

In general there is no performance regression. The response times for Kiwi and Juniper are very close to each other at 1-8 user loads, except in the 95th percentile group and at the 20-user load, where Kiwi clearly outperforms Juniper. In the tables below, the Delta columns express the difference between the Juniper and Kiwi releases as a percentage; any delta within +/-5% is within the margin of error. It is also noteworthy that Kiwi seems to invoke GET /automated-patron-blocks three times instead of once (UICHKOUT-755). This call averages 25 ms under all loads, so if two of the three calls are not needed (why would the UI call it thrice?), Kiwi's check-out average response times could improve by another 50 ms.

Note: JP = Juniper build, KW = Kiwi build


Average (seconds)

| Load     | Check-in JP | Check-in KW | Delta  | Check-out JP | Check-out KW | Delta  |
|----------|-------------|-------------|--------|--------------|--------------|--------|
| 1 user   | 0.944       | 0.838       | 11.23% | 1.579        | 1.582        | -0.19% |
| 5 users  | 0.811       | 0.73        | 9.99%  | 1.359        | 1.376        | -1.25% |
| 8 users  | 0.889       | 0.758       | 14.74% | 1.425        | 1.392        | 2.32%  |
| 20 users | 1.386       | 0.899       | 35.14% | 2.21         | 1.506        | 31.86% |

50th percentile (seconds)

| Load     | Check-in JP | Check-in KW | Delta  | Check-out JP | Check-out KW | Delta  |
|----------|-------------|-------------|--------|--------------|--------------|--------|
| 1 user   | 0.835       | 0.767       | 8.14%  | 1.411        | 1.464        | -3.76% |
| 5 users  | 0.750       | 0.676       | 9.87%  | 1.23         | 1.272        | -3.41% |
| 8 users  | 0.785       | 0.674       | 14.14% | 1.262        | 1.228        | 2.69%  |
| 20 users | 1.172       | 0.731       | 37.63% | 1.887        | 1.303        | 30.95% |
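The Delta values are the relative improvement over the Juniper baseline, delta = (JP - KW) / JP x 100, so a positive delta means Kiwi is faster. A quick check against the 1-user averages:

```python
def delta_pct(jp: float, kw: float) -> float:
    """Relative change versus the Juniper baseline, in percent.
    Positive means the Kiwi build responded faster."""
    return (jp - kw) / jp * 100

# Reproduce the 1-user average deltas from the table above.
print(round(delta_pct(0.944, 0.838), 2))  # check-in: 11.23
print(round(delta_pct(1.579, 1.582), 2))  # check-out: -0.19
```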