...
- A JobDefinition (uuid, profile) defining the job type (insert/update) for the import is created in mod-srm
- The MARC file and the JobDefinition ID are uploaded from the web client to mod-data-import (the file is held in memory and can be persisted; a potential OOM risk)
- MARC records are packed into batches and published to the Kafka topic DI_RAW_RECORDS_CHUNK_READ
- mod-srm reads batches from the topic, validates them, and publishes a DI_MARC_FOR_UPDATE_RECEIVED event; because the JobProfile contains an action for MARC_Bib update, the original records are not saved in mod-srs
- mod-srs reads messages from DI_MARC_FOR_UPDATE_RECEIVED and tries to find an existing entity according to the match criteria
- If found: the result is published to DI_SRS_MARC_BIB_RECORD_MATCHED
- If not found: a DI_SRS_MARC_BIB_RECORD_NOT_MATCHED event is published
- mod-srs receives the match result, updates the MARC_Bib record according to the profile, and publishes the result to DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING
- mod-inventory reads the message from DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING, updates the corresponding Instance, and, if there are no more actions in the profile, publishes DI_COMPLETE
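The event chain above can be sketched as a small in-memory simulation. This is an illustration only, not actual FOLIO code: the topic names come from the steps above, but the in-memory "bus", the handler functions, and the `instance_hrid` match key are assumptions standing in for the real match criteria and Kafka infrastructure.

```python
def publish(bus, topic, event):
    """Append an event to a topic; `bus` is a dict standing in for Kafka."""
    bus.setdefault(topic, []).append(event)

def handle_update_received(bus, event, srs_store):
    # mod-srs: try to match an existing record by the match criteria
    # (a hypothetical 'instance_hrid' key is used here for illustration).
    if event["instance_hrid"] in srs_store:
        publish(bus, "DI_SRS_MARC_BIB_RECORD_MATCHED", event)
    else:
        publish(bus, "DI_SRS_MARC_BIB_RECORD_NOT_MATCHED", event)

def handle_matched(bus, event, srs_store):
    # mod-srs: update the MARC_Bib record per the profile action,
    # then hand off to inventory post-processing.
    srs_store[event["instance_hrid"]] = event["marc"]
    publish(bus, "DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING", event)

def handle_post_processing(bus, event, instances):
    # mod-inventory: update the corresponding Instance; with no further
    # actions in the profile, the record's processing is complete.
    instances[event["instance_hrid"]] = event["marc"]
    publish(bus, "DI_COMPLETE", event)

def run_flow(records, srs_store, instances):
    """Drive each record through the update topic chain and return the bus."""
    bus = {}
    for rec in records:
        publish(bus, "DI_MARC_FOR_UPDATE_RECEIVED", rec)
    for ev in bus.get("DI_MARC_FOR_UPDATE_RECEIVED", []):
        handle_update_received(bus, ev, srs_store)
    for ev in bus.get("DI_SRS_MARC_BIB_RECORD_MATCHED", []):
        handle_matched(bus, ev, srs_store)
    for ev in bus.get("DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING", []):
        handle_post_processing(bus, ev, instances)
    return bus
```

Running two records through the flow, one that matches an existing SRS record and one that does not, ends with one event in DI_COMPLETE and one in DI_SRS_MARC_BIB_RECORD_NOT_MATCHED, mirroring the two branches above.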