
  1. A JobDefinition (UUID plus a job profile that defines the job type: insert/update) for the import is created in mod-srm.
  2. The MARC file and the JobDefinition ID are uploaded from the web client to mod-data-import (the file is held in memory and can be persisted; a possible OOM risk).
  3. The MARC records are packed into batches and published to the Kafka topic DI_RAW_RECORDS_CHUNK_READ.
  4. mod-srm reads the batches from the topic, validates them, and publishes a DI_MARC_FOR_UPDATE_RECEIVED event. Since the JobProfile contains an action for a MARC_Bib update, the original records are not saved in mod-srs.
  5. mod-srs reads the message from DI_MARC_FOR_UPDATE_RECEIVED and tries to find an entity according to the match criteria.
  6. If found: publishes the result to DI_SRS_MARC_BIB_RECORD_MATCHED
    1. mod-srs receives the match result, updates the MARC_Bib record according to the profile, and publishes the result to DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING
    2. mod-inventory reads the message from DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING, updates the corresponding Instance, and, if there are no more actions in the profile, publishes to DI_COMPLETE
  7. If not found: a DI_SRS_MARC_BIB_RECORD_NOT_MATCHED event is published
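The event chain above can be sketched as a small in-memory simulation. The DI_* topic names come from the steps; everything else (the `EventBus` class, the synchronous dispatch, the `match_key` lookup standing in for the profile's match criteria) is an illustrative assumption, not actual FOLIO code, which runs these hops asynchronously over Kafka.

```python
# Toy model of the MARC_Bib update flow: topics are real DI_* Kafka topic
# names from the text; the wiring and match logic are assumptions.
from collections import defaultdict

class EventBus:
    """Stand-in for Kafka: synchronously dispatches events to subscribers."""
    def __init__(self):
        self.handlers = defaultdict(list)
        self.log = []  # every published topic, in order, for inspection

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, payload):
        self.log.append(topic)
        for handler in self.handlers[topic]:
            handler(payload)

def wire_modules(bus, srs_storage):
    # mod-srm: validates the raw chunk and emits DI_MARC_FOR_UPDATE_RECEIVED
    # (originals are not saved in mod-srs, since the profile is an update action)
    bus.subscribe("DI_RAW_RECORDS_CHUNK_READ",
                  lambda rec: bus.publish("DI_MARC_FOR_UPDATE_RECEIVED", rec))

    # mod-srs: tries to find an entity according to the match criteria
    def match(rec):
        if rec["match_key"] in srs_storage:
            bus.publish("DI_SRS_MARC_BIB_RECORD_MATCHED", rec)
        else:
            bus.publish("DI_SRS_MARC_BIB_RECORD_NOT_MATCHED", rec)
    bus.subscribe("DI_MARC_FOR_UPDATE_RECEIVED", match)

    # mod-srs: updates the MARC_Bib record per the profile, hands off
    bus.subscribe("DI_SRS_MARC_BIB_RECORD_MATCHED",
                  lambda rec: bus.publish(
                      "DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING", rec))

    # mod-inventory: updates the Instance; no more profile actions -> DI_COMPLETE
    bus.subscribe("DI_SRS_MARC_BIB_RECORD_MODIFIED_READY_FOR_POST_PROCESSING",
                  lambda rec: bus.publish("DI_COMPLETE", rec))

bus = EventBus()
wire_modules(bus, srs_storage={"in00000000001"})
bus.publish("DI_RAW_RECORDS_CHUNK_READ", {"match_key": "in00000000001"})
print(bus.log[-1])  # DI_COMPLETE (the matched branch, steps 6.1-6.2)
```

Replaying the same record against an empty `srs_storage` exercises the not-matched branch: the chain stops at DI_SRS_MARC_BIB_RECORD_NOT_MATCHED instead of reaching DI_COMPLETE.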

Diagram

Source

(PlantUML diagram)