mod-search: Test Reindexing of Instances (Poppy)

Overview

  • The purpose of this document is to compare the results of reindexing instances on the Orchid and Poppy releases.

Recommendations & Jiras

Test Summary

Compared with the PERF-430 results, reindexing on the Poppy release is faster than on Orchid: 9 h 47 min against 11 h 10 min.

Incorrect reindexing behaviour was observed during tests #3 and #4: reindexing finished unexpectedly fast (about 1 hour). Jira ticket RANCHER-1032 was created.

After the problem was fixed, two additional tests were carried out. The reindexing results of tests #5 and #6 on the Poppy release are consistent: 9 hours 46 min and 9 hours 38 min respectively.

It was observed that after 2 hours of instance reindexing all instances are indexed and searchable from the UI. The rest of the time is spent indexing ocp2_contributor_fs09000000.

Average memory usage for mod-search increased from 36% on Orchid to 47% on Poppy, while for mod-inventory-storage it decreased almost twofold: from 50% on Orchid to 26% on Poppy.

Average CPU utilization decreased significantly on Poppy: for mod-search from 75% on Orchid to 20% on Poppy, and for mod-inventory-storage from 50% on Orchid to 15% on Poppy.

Maximum indexing latency increased from 19 ms on Orchid to 60 ms on Poppy.
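
For context, a full instance reindex in FOLIO is triggered through mod-search's reindex API. Below is a minimal sketch of such a call, assuming the standard /search/index/inventory/reindex endpoint; the Okapi URL, token, and request body are placeholders and are not taken from these test runs.

    import requests

    OKAPI_URL = "https://okapi.example.org"  # placeholder Okapi gateway URL
    HEADERS = {
        "X-Okapi-Tenant": "fs09000000",      # tenant of this environment (see index names below)
        "X-Okapi-Token": "<token>",          # placeholder auth token
        "Content-Type": "application/json",
    }

    # Recreate the instance index and stream all inventory records into it.
    resp = requests.post(
        f"{OKAPI_URL}/search/index/inventory/reindex",
        json={"recreateIndex": True, "resourceName": "instance"},
        headers=HEADERS,
    )
    resp.raise_for_status()
    print(resp.json())  # the response describes the created reindex job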

Test Runs / Results

Test #1
  Test Conditions: Reindexing on Orchid release
  Duration: 11 hr 10 min
  Notes:
    • mod-search task count = 8
    • OpenSearch instance scaled up to r6g.4xlarge.search

Test #2 (2023-10-01 10:18 - 20:05 UTC)
  Test Conditions: Reindexing on Poppy release
  Duration: 9 hours 47 min
  Notes:
    • mod-search task count = 8
    • mod-inventory-storage task count = 2
    • mod-okapi task count = 3
    • OpenSearch instance scaled up to r6g.4xlarge.search
    • without configuration of the number_of_replicas and refresh_interval values of ES/OpenSearch (see the settings sketch after this table)

Test #3 (2023-10-03 10:33 - 11:33 UTC)
  Duration: 1 hour
  Notes: Failed

Test #4 (2023-10-03 12:08 - 13:05 UTC)
  Duration: 57 min
  Notes: Failed

Test #5 (2023-10-09 17:41 - 03:27 UTC)
  Duration: 9 hours 46 min

Test #6 (2023-10-10 06:05 - 15:43 UTC)
  Duration: 9 hours 38 min
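
The last note for test #2 refers to index settings that are commonly tuned for bulk reindexing. A minimal sketch of adjusting them through the standard OpenSearch index settings API is shown below; the host and the values are illustrative assumptions, not the configuration used in these tests.

    import requests

    OS_URL = "https://opensearch.example.org:9200"  # placeholder cluster endpoint
    INDEX = "ocp2_instance_fs09000000"

    def set_bulk_friendly_settings():
        # Drop replicas and disable periodic refreshes while documents are streamed in.
        requests.put(
            f"{OS_URL}/{INDEX}/_settings",
            json={"index": {"number_of_replicas": 0, "refresh_interval": "-1"}},
        ).raise_for_status()

    def restore_settings():
        # Restore replication (rep = 2 in the tables below) and the default 1s refresh.
        requests.put(
            f"{OS_URL}/{INDEX}/_settings",
            json={"index": {"number_of_replicas": 2, "refresh_interval": "1s"}},
        ).raise_for_status()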

Indexing size
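
The per-index figures in the following tables correspond to OpenSearch _cat/indices output. A minimal sketch of collecting the same columns, assuming direct access to the cluster (the host is a placeholder):

    import requests

    OS_URL = "https://opensearch.example.org:9200"  # placeholder cluster endpoint
    COLUMNS = "health,status,index,uuid,pri,rep,docs.count,docs.deleted,store.size,pri.store.size"

    # Request the same columns as in the tables below, sorted by index name.
    resp = requests.get(
        f"{OS_URL}/_cat/indices",
        params={"v": "true", "h": COLUMNS, "s": "index"},
    )
    resp.raise_for_status()
    print(resp.text)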

Test #2

health  status  index                             uuid                    pri  rep  docs.count  docs.deleted  store.size  pri.store.size
green   open    ocp2_authority_fs09000000         XE9S-Me_StyzkP3XQqPqBA  4    0    556239      0             233.1mb     233.1mb
green   open    ocp2_contributor_fs09000000       N59vfY3YTsOopikzeqU5NA  4    2    4629419     1300484       14.6gb      4.8gb
green   open    ocp2_instance_fs09000000          vCgkqlfuRYyWExbd8o0BoQ  4    2    10733729    2712          122.9gb     40.9gb
green   open    ocp2_instance_subject_fs09000000  l-07qruwRRKpKR2ec8Gylg  4    2    4071905     1176299       19.7gb      6.5gb

Test #3 (first indexing after a new OS cluster was created)

health  status  index                             uuid                    pri  rep  docs.count  docs.deleted  store.size  pri.store.size
green   open    .opensearch-observability         kPAJ8TqaR06AQZFYekHeyA  1    2    0           0             624b        208b
green   open    ocp2_instance_fs09000000          61c7lHERQGab0nWBS6tU-w  4    2    10733729    0             124.5gb     41.3gb
green   open    ocp2_instance_subject_fs09000000  T9QYzSdMS-GFcO4skPIQiw  4    2    0           0             2.4kb       832b
green   open    ocp2_contributor_fs09000000       qPUffL5HQtWNpyScGOA6cA  4    2    0           0             2.4kb       832b
green   open    .kibana_1                         9Q4bvyKCRpiiwcPNcLFs9g  1    2    1           0             15.5kb      5.1kb

Test #4

health  status  index                             uuid                    pri  rep  docs.count  docs.deleted  store.size  pri.store.size
green   open    .opensearch-observability         kPAJ8TqaR06AQZFYekHeyA  1    2    0           0             624b        208b
green   open    ocp2_instance_fs09000000          pj6TE-z6SkeewcFbfr98yg  4    2    10733729    0             124.7gb     42gb
green   open    ocp2_contributor_fs09000000       lVNwo3siRVKvTGg0_YVivQ  4    2    0           0             2.4kb       832b
green   open    ocp2_instance_subject_fs09000000  LwlykCF9SrWiu4UV1Spmtw  4    2    0           0             2.4kb       832b
green   open    .kibana_1                         9Q4bvyKCRpiiwcPNcLFs9g  1    2    1           0             15.5kb      5.1kb

Test #5

health  status  index                             uuid                    pri  rep  docs.count  docs.deleted  store.size  pri.store.size
green   open    ocp2_instance_fs09000000          RJFH1FFRTLC2f8tAZuTulg  4    2    10733729    0             123.5gb     40.9gb
green   open    ocp2_authority_fs09000000         9QX5G1CkSMWTIaD-U2Re2g  4    0    556239      0             238.6mb     238.6mb
green   open    ocp2_instance_subject_fs09000000  8VxNRdHjRYOC-fB-G6B4ow  4    2    4076687     736676        18.1gb      6gb
green   open    ocp2_contributor_fs09000000       6pmK601zR3eFwZ02PR0SCg  4    2    4636652     708574        12.6gb      3.8gb

Test #6

health  status  index                             uuid                    pri  rep  docs.count  docs.deleted  store.size  pri.store.size
green   open    ocp2_instance_fs09000000          l-8gxUPaRHu9Rb77krV2Pw  4    2    10733729    0             124gb       41.2gb
green   open    ocp2_authority_fs09000000         9QX5G1CkSMWTIaD-U2Re2g  4    0    556239      0             238.6mb     238.6mb
green   open    ocp2_contributor_fs09000000       oxx-7teWRieeiHmj2mZPqw  4    2    4634992     732218        12.8gb      3.8gb
green   open    ocp2_instance_subject_fs09000000  OaxNET2SRO2nG1oPeTy61w  4    2    4076647     472411        18.6gb      6.5gb

Memory Usage

Poppy

Test #2

mod-search: max. 74% during the first hour, 64% during the second hour, and an average of 47% for the remaining 7 hours.

mod-inventory-storage: avg. 26%

Test #3,4

Test #5

mod-search: max. 68% during the first hour, 60% during the second hour, and an average of 47% for the remaining 7 hours.

mod-inventory-storage: avg. 26%

Test #6

mod-search: max. 68% during the first hour, 60% during the second hour, and an average of 45% for the remaining 7 hours.

mod-inventory-storage: avg. 26%

Orchid

mod-search: avg. 36%

mod-inventory-storage: avg. 50%


Memory consumption

Test #    Module                 Initial spike  avg. usage
2. Poppy  mod-search             74%            47%
2. Poppy  mod-inventory-storage  26%            26%
5. Poppy  mod-search             68%            47%
5. Poppy  mod-inventory-storage  24%            22%
6. Poppy  mod-search             60%            45%
6. Poppy  mod-inventory-storage  24%            21%
Orchid    mod-search             47%            36%
Orchid    mod-inventory-storage  50%            50%

CPU Utilization 

Poppy

Test #2

After 2 hours of reindexing, CPU utilization decreased to a minimum: 1-2% for mod-inventory-storage and less than 1% for mod-search.

Test #3,4

Test #5

Test #6


CPU utilization

Test #    Module                 Initial spike  avg. usage
2. Poppy  mod-search             32%            20% - first 2 hours. Close to 1% after 2 hours of indexing
2. Poppy  mod-inventory-storage  49%            15% - first 2 hours. Close to 1% after 2 hours of indexing
5. Poppy  mod-search             30%            20% - first 2 hours. Close to 1% after 2 hours of indexing
5. Poppy  mod-inventory-storage  50%            15% - first 2 hours. Close to 1% after 2 hours of indexing
6. Poppy  mod-search             30%            20% - first 2 hours. Close to 1% after 2 hours of indexing
6. Poppy  mod-inventory-storage  54%            15% - first 2 hours. Close to 1% after 2 hours of indexing
Orchid    mod-search             200%           75% - first 2 hours. Close to 1% after 2 hours of indexing
Orchid    mod-inventory-storage  100%           50% - first 2 hours. Close to 1% after 2 hours of indexing

Orchid

RDS CPU Utilization 

Poppy

Test #2

Utilization reached a maximum of 63% and averaged 22% during the second hour; after 2 hours of indexing, DB utilization decreased to the pre-test level of 6%.

Test #3,4