Flow description

  1. A JobDefinition (UUID plus profile) for the import is created in mod-srm.
  2. The MARC file and the JobDefinition ID are uploaded from the web client to mod-data-import. The file is held in memory (it can be persisted; a possible OOM risk for large files).
  3. MARC records are packed into batches and published to a Kafka queue.
  4. mod-srm reads the batches from the queue, validates them, and passes them to mod-srs via a Kafka queue. The job starts when the first chunk is received.
  5. mod-srs stores the records in its PostgreSQL database and returns the result via a Kafka queue (broken records are also stored, marked as 'error record').
  6. mod-srm reads the profile and creates a JSON payload (containing the parsed MARC record, the profile, and mapping parameters) for processing, then exports it to a Kafka queue (one message per MARC entry).
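Step 3 above can be sketched as a plain batching helper. This is a minimal illustration, not mod-data-import's actual code: the class name, method name, and batch size are assumptions; in the real module each resulting batch would then be published as one Kafka message.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of step 3: packing raw MARC records into
// fixed-size batches, one batch per Kafka message.
public class RecordBatcher {
    static final int BATCH_SIZE = 50; // assumed chunk size; configurable in practice

    // Splits the full list of raw MARC records into batches of at most BATCH_SIZE.
    public static List<List<String>> toBatches(List<String> rawRecords) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < rawRecords.size(); i += BATCH_SIZE) {
            batches.add(new ArrayList<>(
                rawRecords.subList(i, Math.min(i + BATCH_SIZE, rawRecords.size()))));
        }
        return batches;
    }
}
```

Batching keeps individual Kafka messages small while amortizing per-message overhead across many MARC records.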

FOR CREATE

  1. mod-inventory reads the message and creates an Instance, storing it (via Okapi HTTP) in mod-inventory-storage, then exports an 'INVENTORY_INSTANCE_CREATED' message.
  2. mod-srs reads the message and updates the corresponding record with the instance id, then creates a new message with the updated payload ('DI_SRS_MARC_BIB_INSTANCE_HRID_SET').
  3. mod-inventory updates the Instance with the updated fields (question for Kateryna Senchenko: separate export or same thread?).
  4. mod-inventory reads the message and creates Holdings, storing them (via Okapi HTTP) in mod-inventory-storage, then exports a message.
  5. mod-inventory reads the message and creates Items, storing them (via Okapi HTTP) in mod-inventory-storage, then exports a message.
  6. mod-inventory reads the message and exports 'DI_COMPLETED'.
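The create path above is a chain of consumers dispatching on event type. The sketch below only encodes which module consumes which event, per the numbered steps; it is illustrative, not the modules' real handler code, and only the quoted event names come from the text (the Holdings/Items events are not named in the source, so they are omitted).

```java
// Hedged sketch of the create-path event routing described above.
public class CreateFlow {
    // Returns the module that consumes a given event type on the create path.
    // Mapping follows the numbered steps; events not named in the text return "unknown".
    static String consumerOf(String eventType) {
        switch (eventType) {
            case "INVENTORY_INSTANCE_CREATED":
                return "mod-srs";       // sets the instance id on the SRS record
            case "DI_SRS_MARC_BIB_INSTANCE_HRID_SET":
                return "mod-inventory"; // updates the Instance with the new fields
            case "DI_COMPLETED":
                return "none (terminal event)";
            default:
                return "unknown";
        }
    }
}
```

Routing by event type keeps each module a stateless Kafka consumer: progress of the import lives entirely in the message payloads, not in cross-module calls.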

...