
Data Import


Flow description

  1. A MARC file is uploaded from the web client to mod-data-import
  2. mod-data-import packs the MARC records into batches and publishes them to a Kafka queue
  3. ?? here starts the job ?? (open question: the import job appears to be started at this point)
  4. mod-srm reads the batches from the queue, validates them, and passes them to mod-srs via a Kafka queue
  5. mod-srs stores the records in its PostgreSQL database and returns the result via a Kafka queue
  6. mod-srm reads the job profile, creates a payload for processing, and exports it to a Kafka queue (one message per MARC entry)
  7. mod-inventory reads each message and tries to match it according to the profile rules, exporting matched records to a Kafka queue one by one
  8. mod-inventory-storage reads the messages, stores each matched entity in its database, and exports the result to the queue
  9. mod-inventory reads the result message and tries the next match (steps 7–9 repeat as a cycle)
  10. If nothing can be matched, the result is exported to a Kafka queue
  11. mod-srm reads the queue, updates the progress, and updates the job state
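Step 2 above (packing MARC records into batches for Kafka) can be sketched roughly as follows. The batch size, envelope field names, and record shape here are illustrative assumptions, not the actual mod-data-import configuration or message schema:

```python
import json

# Illustrative batch size; the real chunk size is a mod-data-import
# configuration value, not necessarily 50.
BATCH_SIZE = 50

def pack_into_batches(marc_records, batch_size=BATCH_SIZE):
    """Split parsed MARC records into batches, one Kafka message body each."""
    batches = []
    for start in range(0, len(marc_records), batch_size):
        chunk = marc_records[start:start + batch_size]
        # Each batch is serialized into a single JSON envelope
        # (field names "recordsMetadata"/"initialRecords" are assumed).
        batches.append(json.dumps({
            "recordsMetadata": {"total": len(chunk)},
            "initialRecords": chunk,
        }))
    return batches

# Example: 120 records split into batches of 50 -> 3 messages (50, 50, 20).
records = [{"sourceRecord": f"marc-{i}"} for i in range(120)]
batches = pack_into_batches(records)
print(len(batches))  # 3
```

Publishing each serialized batch as one Kafka message keeps per-message overhead low while bounding message size, which is why the records are chunked before they reach mod-srm.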

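The match cycle in steps 7–10 amounts to trying the match profile's rules one by one and exporting a non-match result when none applies. A minimal sketch, assuming a simple (name, predicate) shape for rules and hypothetical result names:

```python
def run_match_cycle(record, match_rules):
    """Apply match profile rules in order (the steps 7-9 cycle).

    Each rule is a (name, predicate) pair; the first predicate that
    accepts the record counts as a match and would be handed on to
    mod-inventory-storage. If no rule accepts the record, a
    non-matched result is exported to the queue (step 10).
    """
    for name, predicate in match_rules:
        if predicate(record):
            return {"status": "MATCHED", "rule": name}
    return {"status": "NON_MATCHED"}

# Hypothetical rules: match by instance HRID, then by presence of an ISBN.
rules = [
    ("by-instance-hrid", lambda r: r.get("hrid") == "in0001"),
    ("by-isbn", lambda r: "isbn" in r),
]
print(run_match_cycle({"isbn": "9781234567890"}, rules))
print(run_match_cycle({"title": "untitled"}, rules))
```

In the real flow each iteration of this cycle is a round-trip over Kafka between mod-inventory and mod-inventory-storage rather than an in-process loop, but the control flow is the same: match, store, re-match, and fall through to a non-match result.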