Approach
Create a Java application that performs the update work:
1. Create a data-import job profile for MARC-to-MARC matching by the 999ff$s subfield and updating the bibliographic record. The Job/Match/Action/Mapping profiles will be hidden
- Job Profile: Release Upgrade - Migrate MARC bibliographic records
- Action Profile: Release Upgrade - Migrate MARC bibliographic records
- Match Profile: Release Upgrade - Migrate MARC bibliographic records
- Mapping Profile: Release Upgrade - Migrate MARC bibliographic records
2. Load bibliographic records in pages of 50k (should be configurable) from SRS (GET /source-storage/records?recordType=MARC_BIB&state=ACTUAL&offset=<P>&limit=<N>)
3. Verify that the createdDate & updatedDate of the records are older than the time of the script launch (if new records are detected, update totalRecords)
4. Prepare an .mrc file or JSON payload (README)
5. Initialize data-import job
6. Wait until the data-import job finishes (GET /change-manager/jobExecutions/<id> and check the job status)
7. Load the next page and repeat steps 3-6 until there are no bibliographic records left.
8. Delete the job profile created in step 1
9. Logging should indicate which batch (N) is currently in progress.
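The batch loop in steps 2-7 can be sketched as below. This is a minimal, illustrative skeleton, not the actual application: the HTTP calls to /source-storage/records and the data-import endpoints are stubbed behind a function so the control flow is self-contained, and the class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntUnaryOperator;

// Hypothetical sketch of the batch loop described in steps 2-7.
// The SRS fetch is injected as a function so the flow can run standalone.
public class MarcMigrationSketch {

    static final int PAGE_SIZE = 50_000; // step 2: page size, should be configurable

    /** How many data-import batches are needed for the given total (ceiling division). */
    static int batchCount(int totalRecords, int pageSize) {
        return (totalRecords + pageSize - 1) / pageSize;
    }

    /**
     * Drives the loop: fetch a page, run a data-import job for it, then move to
     * the next offset. fetchPageSize stands in for
     * GET /source-storage/records?recordType=MARC_BIB&state=ACTUAL&offset=..&limit=..
     * and returns how many records the page actually contained.
     */
    static List<Integer> runBatches(int totalRecords, int pageSize,
                                    IntUnaryOperator fetchPageSize) {
        List<Integer> processed = new ArrayList<>();
        for (int offset = 0; offset < totalRecords; offset += pageSize) {
            int fetched = fetchPageSize.applyAsInt(offset);
            if (fetched == 0) break; // step 7: stop when no records are left
            // step 9: log which batch is in progress
            System.out.printf("Processing batch %d (offset %d, %d records)%n",
                    processed.size() + 1, offset, fetched);
            processed.add(fetched);
            // The real app would build the payload (step 4), start the
            // data-import job (step 5), and poll until it finishes (step 6).
        }
        return processed;
    }

    public static void main(String[] args) {
        int total = 120_000;
        // Stub: each page returns min(pageSize, remaining records).
        List<Integer> batches = runBatches(total, PAGE_SIZE,
                offset -> Math.min(PAGE_SIZE, total - offset));
        System.out.println("Batches run: " + batches.size()); // prints "Batches run: 3"
    }
}
```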
Validate that a user can view Job status
- Job status is shown on the Data Import Logs UI
- For each record update, the user can view the SRS/MOD-Inventory-Storage output
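The status check in step 6 amounts to polling GET /change-manager/jobExecutions/<id> until a terminal status is reached. A minimal sketch follows; the status fetch is injected as a supplier so the loop runs standalone, and the exact status names here are assumptions for illustration.

```java
import java.util.Iterator;
import java.util.List;
import java.util.Set;
import java.util.function.Supplier;

// Hypothetical polling loop for step 6. statusFetch stands in for an HTTP
// call to GET /change-manager/jobExecutions/<id> that returns the job status.
public class JobStatusPoller {

    // Assumed terminal statuses; verify against the actual module before use.
    static final Set<String> TERMINAL = Set.of("COMMITTED", "ERROR", "CANCELLED");

    /** Polls until a terminal status is returned, up to maxPolls attempts. */
    static String awaitCompletion(Supplier<String> statusFetch, int maxPolls) {
        for (int i = 0; i < maxPolls; i++) {
            String status = statusFetch.get();
            if (TERMINAL.contains(status)) {
                return status;
            }
            // The real app would Thread.sleep(...) between polls.
        }
        throw new IllegalStateException("Job did not finish within " + maxPolls + " polls");
    }

    public static void main(String[] args) {
        // Stub: the job reports two in-progress statuses, then completes.
        Iterator<String> statuses =
                List.of("IN_PROGRESS", "IN_PROGRESS", "COMMITTED").iterator();
        System.out.println(awaitCompletion(statuses::next, 10)); // prints "COMMITTED"
    }
}
```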
Documentation
- Instructions must be provided to Hosting providers/System administrators for using the standalone application
- Must consider that some libraries have already upgraded to Morning Glory weeks/months before this implementation.
- Include a note that this should be run off-hours
- Release notes for Morning Glory and Nolana should be updated to include a link to the instructions
Testing - MORE details to discuss
- Need a story for PTF
- Need environment(s) to test:
  - Upgrade Lotus > Morning Glory
  - Upgrade Morning Glory > Nolana