[FOLIO-1245] create automated performance testing environment Created: 15/May/18  Updated: 12/Nov/18  Resolved: 09/Jul/18

Status: Closed
Project: FOLIO
Components: None
Affects versions: None
Fix versions: None

Type: Umbrella Priority: P3
Reporter: Jakub Skoczen Assignee: Wayne Schneider
Resolution: Done Votes: 0
Labels: ci, performance, sprint38, sprint39, sprint40, sprint41, sprint42, sprint44, sprint45, sprint48
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original estimate: Not Specified

Attachments: PNG File stages.png    
Issue links:
Blocks
blocks FOLIO-1120 run performance tests daily against f... Open
blocks FOLIO-1121 upgrade backend modules daily of foli... Open
blocks FOLIO-1122 generate performance test output Open
blocks MODINVSTOR-60 Create JMeter scripts for performance... Closed
is blocked by FOLIO-1265 automation around loading inventory r... Open
is blocked by FOLIO-1285 create new repository for aut perf te... Closed
Relates
relates to FOLIO-948 use okapi deployment persistence for ... Open
relates to FOLIO-1255 Load set of circulation records into ... Open
relates to FOLIO-1256 Load set of user records into perform... Open
relates to UXPROD-750 Performance tests - part 2 Closed
Sprint:
Development Team: Prokopovych

 Description   

A living document describing the approach has been created during a session in Durham and is located here:

https://docs.google.com/document/d/1MiF4N3ot3ewT8j9_jer6_br5ekOpXMrMEJR93WWPALA/edit



 Comments   
Comment by Jakub Skoczen [ 17/May/18 ]

Guys, we have discussed that Hongwei Ji and Varun Javalkar from EBSCO will be helping out with setting up the performance test environment, performance test coverage with JMeter, and automation of the performance test suite.

The work-in-progress document linked in the description captures the rough approach and the initial requirements and goals for the environment and the test suite. Let's extend this document with more details and organize the work into smaller, more tangible deliverables. Hongwei Ji, Mark Veksler: if you'd like to take a first stab at scoping the deliverables before the actual work commences, Wayne Schneider and I will be happy to review them and provide comments. I know you guys have some initial ideas already, so any graphical or textual descriptions are welcome – either put them directly here, or in the FOLIO Wiki or Gdoc/Drive and link them here. Thanks.

Comment by Hongwei Ji [ 17/May/18 ]

Hi Jakub Skoczen and Wayne Schneider, I see in FOLIO-1066 that Wayne Schneider already has a way to create a performance testing environment. Is it automated, or can it be automated? More importantly, is it AWS-independent? We do have a way to create a FOLIO environment, but it is coupled with AWS-native technologies like ECS and ALB, so it might not be a neutral candidate for benchmarking FOLIO performance. Either way, we are happy to work together to automate creation of the performance environment, come up with standard test data, and contribute JMeter test scripts. More importantly, we should/will make all of this available to the community so they have more confidence in FOLIO and can benefit from it as well.

Comment by Jakub Skoczen [ 29/May/18 ]

Hongwei Ji Wayne Schneider Guys, did you get a chance to talk about bootstrapping the env?

Comment by Hongwei Ji [ 04/Jun/18 ]

Hi Jakub Skoczen and all, this is what I have so far. Thanks for the help from Wayne Schneider and John Malconian.
It is a Jenkins pipeline with the following stages:

  • Check out
  • Create environment - calls an AWS CloudFormation template to create three EC2 instances: one m5.large instance for OKAPI, and two m5.xlarge instances for the modules and the database. The DB instance has a bit more disk space allocated.
  • Check environment - waits for all instances to be fully ready (separated from the Create environment stage for convenience)
  • Bootstrap DB - pulls and starts the PostgreSQL Docker image on the DB EC2 instance
  • Bootstrap OKAPI - pulls the OKAPI Docker image used by the snapshot-stable site from Docker Hub and starts it on the OKAPI EC2 instance
  • Bootstrap Modules - pulls and starts the Docker images of all modules used by the snapshot-stable site, then registers, discovers, and enables the modules for the supertenant (creating a test tenant is skipped for now). Communication within FOLIO goes through private IPs; access to OKAPI goes through the public IP.
  • Populate data - pulls EBSCO sample data from a public S3 bucket to populate the database
  • Run JMeter tests - runs a sample JMeter test developed by Varun Javalkar (more will be added soon)
  • Tear down environment - deletes the CloudFormation stack

Here is a screenshot of a sample build.

There is still work to do. Let me know if you have any suggestions or comments for now. Thanks.
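For illustration, the stages above could be sketched as a Declarative Jenkinsfile along the following lines. This is a minimal sketch, not the actual pipeline: the stack name, template file, helper script names, S3 bucket, and JMeter test plan path are all assumptions.

```groovy
// Hypothetical sketch of the pipeline stages described above.
pipeline {
    agent any
    environment {
        STACK_NAME = 'folio-perf-test'  // assumed stack name
    }
    stages {
        stage('Check out') {
            steps { checkout scm }
        }
        stage('Create environment') {
            steps {
                // CloudFormation template (assumed filename) creates the three EC2 instances
                sh 'aws cloudformation create-stack --stack-name $STACK_NAME --template-body file://perf-env.yml'
            }
        }
        stage('Check environment') {
            steps {
                // Block until the stack (and its instances) are fully ready
                sh 'aws cloudformation wait stack-create-complete --stack-name $STACK_NAME'
            }
        }
        stage('Bootstrap DB')      { steps { sh './scripts/bootstrap-db.sh' } }      // starts PostgreSQL container
        stage('Bootstrap OKAPI')   { steps { sh './scripts/bootstrap-okapi.sh' } }   // starts OKAPI container
        stage('Bootstrap Modules') { steps { sh './scripts/bootstrap-modules.sh' } } // starts + registers modules
        stage('Populate data') {
            steps { sh 'aws s3 sync s3://example-sample-data ./data && ./scripts/load-data.sh' }
        }
        stage('Run JMeter tests') {
            steps { sh 'jmeter -n -t tests/sample.jmx -l results.jtl' }  // non-GUI run
        }
    }
    post {
        always {
            // Tear down the environment whether or not the tests passed
            sh 'aws cloudformation delete-stack --stack-name $STACK_NAME'
        }
    }
}
```

Putting the teardown in a `post { always { ... } }` block (rather than a final stage) keeps orphaned EC2 instances from accumulating when a run fails partway through.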
Comment by Hongwei Ji [ 06/Jun/18 ]

BTW, I stood up a temp Jenkins to show the pipeline. http://jenkins.int.aws.folio.org:9200/

Comment by John Malconian [ 06/Jul/18 ]

After some modifications to FOLIO Jenkins and the configuration in the repository, I'm able to get a successful run in FOLIO Jenkins.

https://jenkins-aws.indexdata.com/job/Automation/job/folio-perf-test/

I've merged branch FOLIO-1245 into master.

Outstanding:

  • It appears that the Performance graphs are not populated. Not sure why this is.
  • How do we want to run and use this job? Daily? How will it be reviewed?
Comment by Hongwei Ji [ 06/Jul/18 ]

John Malconian, I believe the Performance graphs need at least two successful builds to show a trend. As for job frequency, I have been running it only once a day in our environment.

Comment by John Malconian [ 09/Jul/18 ]

Thanks, Hongwei Ji. You are correct about the graphs. I've scheduled the job to be run daily.
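For reference, a daily schedule in a Declarative Jenkinsfile is typically expressed as a cron trigger; the exact time window below is an assumption, not the actual configuration of this job.

```groovy
// Hypothetical trigger block: run once a day.
// "H" lets Jenkins hash the start time to spread load; H(0-6) picks an hour between 00:00 and 06:59.
triggers {
    cron('H H(0-6) * * *')
}
```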

Comment by Jakub Skoczen [ 25/Jul/18 ]

What is left here to make sure we can use the environment to track performance regressions and address them? We need a clear upgrade process; is it implemented?

Comment by Hongwei Ji [ 25/Jul/18 ]

Jakub Skoczen, it pulls the latest code from snapshot-stable automatically every time. Can you elaborate on the "clear upgrade process"?
