Status: Done
Details

Assignee: Serhii_Nosko
Reporter: Serhii_Nosko
Priority: P2
Story Points: 3
Sprint: None
Development Team: Folijet
Fix versions: None
Release: Lotus R1 2022
Created December 10, 2021 at 7:02 AM
Updated March 3, 2022 at 6:44 PM
Resolved January 17, 2022 at 9:09 AM
Kafka Cache is used in SRM in DataImportJournalKafkaHandler and should be replaced with another solution for handling duplicate events, as described in the diagram below.
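A minimal sketch of the constraint-based deduplication this ticket describes, assuming the events_processed table has a unique key on (handler_id, event_id). The EventDeduplicator class name, the column names, and the plain-JDBC style are illustrative only; the actual schema comes from the referenced story, and the module itself runs on Vert.x.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.logging.Logger;

public class EventDeduplicator {

  private static final Logger LOGGER =
      Logger.getLogger(EventDeduplicator.class.getName());

  // PostgreSQL SQLSTATE for unique_violation
  private static final String UNIQUE_VIOLATION = "23505";

  private final Connection connection;

  public EventDeduplicator(Connection connection) {
    this.connection = connection;
  }

  /**
   * Records the event as processed by the given handler. A second insert
   * of the same (handler_id, event_id) pair violates the unique constraint,
   * which is how a duplicate delivery is detected.
   *
   * @return true if the event is new, false if it is a duplicate
   */
  public boolean markProcessed(String handlerId, String eventId) throws SQLException {
    String sql = "INSERT INTO events_processed (handler_id, event_id) VALUES (?, ?)";
    try (PreparedStatement statement = connection.prepareStatement(sql)) {
      statement.setString(1, handlerId);
      statement.setString(2, eventId);
      statement.executeUpdate();
      return true;
    } catch (SQLException e) {
      if (UNIQUE_VIOLATION.equals(e.getSQLState())) {
        // Duplicate event: log it and let the caller skip processing.
        LOGGER.info(() -> "Duplicate event received, skipping: eventId=" + eventId);
        return false;
      }
      throw e;
    }
  }
}
```

With this shape, the Kafka handler calls markProcessed(...) first and skips its journal logic when the method returns false, replacing the in-memory Kafka Cache with a check the database enforces.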
Steps to be done:
1. Use the previously added events_processed and events_handlers tables from story:
2. Populate the events_handlers table with the value DataImportJournalKafkaHandler after DataImportJournalVerticle starts (a startup sketch follows this list).
3. Modify DataImportJournalKafkaHandler to catch the constraint-violation exception and log the duplicate event, as in the EventDeduplicator sketch above.
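A minimal sketch of step 2, registering the handler name at verticle startup. It assumes events_handlers holds one row per handler with a single handler_id column; that column name, the registerHandler helper, and the plain-JDBC style are assumptions, since the real schema is defined by the story referenced above.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class HandlerRegistration {

  private static final String HANDLER_ID = "DataImportJournalKafkaHandler";

  /**
   * Inserts the handler row once DataImportJournalVerticle is up.
   * ON CONFLICT DO NOTHING (PostgreSQL) keeps module restarts idempotent.
   */
  public static void registerHandler(Connection connection) throws SQLException {
    String sql = "INSERT INTO events_handlers (handler_id) VALUES (?) ON CONFLICT DO NOTHING";
    try (PreparedStatement statement = connection.prepareStatement(sql)) {
      statement.setString(1, HANDLER_ID);
      statement.executeUpdate();
    }
  }
}
```

In the actual verticle this call would go through the module's asynchronous PostgreSQL client (or be wrapped in executeBlocking) rather than blocking JDBC; the blocking form is used here only to keep the sketch self-contained.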