Flow description
- A JobDefinition (UUID, job profile) for the import is created in mod-srm.
- The MARC file plus the JobDefinition ID are uploaded from the web client to mod-data-import. The file is held in memory (it could be persisted instead; with large files an OOM is possible).
- MARC records are packed into batches and published to a Kafka topic.
- mod-srm reads the batches from the topic, validates them, and passes them on to mod-srs via another Kafka topic. The job starts when the first chunk is received.
- mod-srs stores the records in its PostgreSQL database and returns the result via a Kafka topic (broken records are also stored, as 'error record' entries).
- mod-srm reads the job profile and builds a JSON payload (parsed MARC record, profile, mapping parameters) for processing, then exports it to a Kafka topic, one message per MARC record (see the producer sketch after this list).
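
To make the export step concrete, here is a minimal sketch of the "one message per MARC record" hand-off using the plain Kafka producer API. The topic name, payload shape, and bootstrap address are illustrative assumptions; the real mod-srm uses its own event types, per-tenant topic naming, and a Vert.x Kafka client.

```java
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PayloadProducerSketch {

    // Assumed topic name, for illustration only.
    private static final String TOPIC = "DI_RAW_RECORDS_CHUNK_PARSED";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Stand-ins for records already parsed and stored by mod-srs.
        List<String> parsedMarcRecords = List.of("{\"leader\":\"...\"}", "{\"leader\":\"...\"}");
        String jobExecutionId = "11111111-2222-3333-4444-555555555555";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (String marc : parsedMarcRecords) {
                // One message per MARC record: the payload bundles the parsed
                // record with the profile snapshot and mapping parameters.
                String payload = String.format(
                    "{\"jobExecutionId\":\"%s\",\"record\":%s,"
                        + "\"profileSnapshot\":{},\"mappingParams\":{}}",
                    jobExecutionId, marc);
                producer.send(new ProducerRecord<>(TOPIC, jobExecutionId, payload));
            }
            producer.flush();
        }
    }
}
```

Keying every message by the job execution ID keeps all records of one job on the same partition, so they are consumed in order per job.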
FOR CREATE
- mod-inventory reads the message and creates an Instance, which it stores (via Okapi HTTP) in mod-inventory-storage, then exports a message ('INVENTORY_INSTANCE_CREATED').
- mod-srs reads the message and updates the corresponding record with the instance ID, then publishes a new message with the updated payload ('DI_SRS_MARC_BIB_INSTANCE_HRID_SET').
- mod-inventory updates the Instance with the updated fields (open question: exported as a separate message, or handled on the same thread?).
- mod-inventory reads the message and creates Holdings, storing them (via Okapi HTTP) in mod-inventory-storage, then exports a message.
- mod-inventory reads the message and creates Items, storing them (via Okapi HTTP) in mod-inventory-storage, then exports a message.
- mod-inventory reads the message and exports a 'DI_COMPLETED' event (see the sketch after this list for the shape of one such hop).
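
The CREATE chain above is a series of consume-then-produce hops. Below is a sketch of the shape of one hop (the instance-ID/HRID update in mod-srs), again with plain kafka-clients and an assumed bootstrap address and group ID; the real module runs on Vert.x and enriches the payload before re-publishing.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventChainHopSketch {

    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "di-hop-sketch");
        consumerProps.put("auto.offset.reset", "earliest");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {

            consumer.subscribe(List.of("INVENTORY_INSTANCE_CREATED"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> rec : records) {
                    // The real module would update the SRS record with the new
                    // instance id/HRID here; the sketch just forwards the payload.
                    producer.send(new ProducerRecord<>(
                        "DI_SRS_MARC_BIB_INSTANCE_HRID_SET", rec.key(), rec.value()));
                }
            }
        }
    }
}
```

Each hop only needs to know which event type it consumes and which one it emits, so the modules stay decoupled from each other.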
FOR UPDATE
- mod-inventory reads the message and tries to match it against existing records according to the profile's match rules, exporting matched records to a Kafka topic one by one (see the match-dispatch sketch after this list).
- mod-inventory-storage reads the messages, stores the matched entity in its database, and exports the result to a Kafka topic.
- mod-inventory reads the result and tries to match again (the match/store steps above repeat as a cycle).
- If nothing can be matched, the result is exported to a Kafka topic.
- mod-srm reads the topic, updates the progress, and updates the job state.
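
Below is a minimal sketch of the match-or-fail branching in the UPDATE flow. The match rule (comparing an HRID) and the MATCHED/NOT_MATCHED event names are illustrative assumptions standing in for the profile's match criteria and the real data-import event types.

```java
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MatchDispatchSketch {

    /** Hypothetical match rule: compares one field of the incoming record
     *  against an existing entity, standing in for profile match criteria. */
    static boolean matches(Map<String, String> incoming, Map<String, String> existing) {
        return incoming.getOrDefault("hrid", "").equals(existing.get("hrid"));
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        Map<String, String> incoming = Map.of("hrid", "in00000001");
        Map<String, String> existing = Map.of("hrid", "in00000001", "id", "uuid-1");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Matched records continue through the update chain; non-matches
            // end the chain so that mod-srm can record the outcome.
            String topic = matches(incoming, existing)
                ? "DI_INVENTORY_INSTANCE_MATCHED"       // assumed event name
                : "DI_INVENTORY_INSTANCE_NOT_MATCHED";  // assumed event name
            producer.send(new ProducerRecord<>(topic, existing.get("id"),
                "{\"entityType\":\"INSTANCE\"}"));
            producer.flush();
        }
    }
}
```

Either way a terminal event eventually reaches mod-srm, which is how the job progress counter stays consistent even when records fail to match.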
...