Details
Reporter: Magda Zacharska
Potential Workaround: Export records, update using MARCedit and re-import
PO Rank: 1
Front End Estimate: XXL < 30 days
Front End Estimator: Vadym Shchekotilin
Front-End Confidence factor: 70%
Back End Estimate: XXL < 30 days
Back End Estimator: Viachaslau Khandramai
Back-End Confidence factor: 90%
Release: Sunflower (R1 2025)
SA support
TestRail: Cases
TestRail: Runs
Created February 4, 2023 at 2:49 AM
Updated March 25, 2025 at 2:02 PM
Current situation or problem:
Multiple MARC bib records can currently be edited only through an ETL process: exporting the records, modifying them outside FOLIO using third-party software, and then re-importing them. This flow does not support updating administrative data or previewing changes before they are committed.
In scope:
The feature will expand functionality delivered in and will support other MARC fields with the exception of 00x fields.
Update MARC fields for Instances with source set to MARC.
Administrative data and MARC fields can be updated in one bulk edit job.
Identify records that will be updated based on a list of records identifiers (HRID, UUID) or a query.
Support all but 00x fields.
Users should be able to:
add new field with multiple subfields
remove field
modify existing value of the subfield
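The three actions above can be sketched against a simplified MARC-like record structure. This is illustrative only, not the FOLIO bulk-edit implementation; the dict-based record shape and function names are assumptions made for the example.

```python
# Minimal sketch of the three update actions on a simplified MARC-like
# record: a list of {tag, indicators, subfields} dicts. Illustrative
# only -- not the FOLIO data model or API.

def add_field(record, tag, indicators, subfields):
    """Add a new field with one or more subfields."""
    record["fields"].append(
        {"tag": tag, "indicators": indicators, "subfields": dict(subfields)}
    )

def remove_field(record, tag):
    """Remove all fields with the given tag."""
    record["fields"] = [f for f in record["fields"] if f["tag"] != tag]

def modify_subfield(record, tag, code, new_value):
    """Replace the value of a subfield in every matching field."""
    for field in record["fields"]:
        if field["tag"] == tag and code in field["subfields"]:
            field["subfields"][code] = new_value

record = {"fields": [{"tag": "245", "indicators": ["1", "0"],
                      "subfields": {"a": "Old title"}}]}
add_field(record, "500", [" ", " "], [("a", "General note")])
modify_subfield(record, "245", "a", "New title")
remove_field(record, "500")
```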
Users will receive feedback about encountered errors so that they can be addressed:
Errors when identifying records to be edited - before the bulk edit job starts
Errors encountered during saving the data - after the bulk edit jobs ends
Users should be able to preview the changes in MARC format before they are committed.
After the changes are committed, the user can download the records in MARC format.
Logs of completed bulk edit jobs are available for a defined period of time (30 days by default) and provide the following information:
list of records affected by the bulk edit (file with identifiers or query results)
file with modified records before the changes are committed in .csv and .mrc formats
file with modified records after the changes are committed in .csv and .mrc formats
files with errors and warnings encountered while matching records or committing changes
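The retention rule above can be expressed as a small check. The function name and signature are hypothetical, assuming only the default 30-day retention period stated in this document.

```python
from datetime import datetime, timedelta

# Hedged sketch of the log retention rule (30 days by default).
# Names are illustrative, not part of any FOLIO API.

def log_expired(completed_at: datetime, now: datetime,
                retention_days: int = 30) -> bool:
    """Return True when a completed job's logs have passed retention."""
    return now - completed_at > timedelta(days=retention_days)

print(log_expired(datetime(2025, 1, 1), datetime(2025, 2, 15)))  # 45 days: past retention
```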
The expected maximum volume is 100,000 instances updated in one bulk edit job.
The functionality supports ECS and non-ECS environments. In an ECS environment, shared instances can be edited from the central and member tenants according to the user's affiliation and permissions. Local instances can be edited only on a member tenant.
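The ECS editing rule above can be encoded as a predicate. All names here are hypothetical stand-ins for illustration, not FOLIO identifiers.

```python
# Illustrative encoding of the ECS editing rule: shared instances are
# editable from central or member tenants (subject to affiliation and
# permissions); local instances only on a member tenant.

def can_edit(instance_shared: bool, tenant_is_central: bool,
             user_affiliated: bool, user_has_permission: bool) -> bool:
    if not (user_affiliated and user_has_permission):
        return False
    if instance_shared:
        return True               # central or member tenant
    return not tenant_is_central  # local instance: member tenant only
```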
Out of scope: Records with source other than MARC.
MARC Instances mockups: as implemented in
Administrative data mockups: as implemented in and
Mockups:
Use case: https://folio-org.atlassian.net/wiki/display/BULKEDIT/Bulk+Edit+Use+Cases
Proposed solution:
The story should be implemented as a pre-requisite for the basic implementation.
The main task here is to expand the list of actions and allow editing of fields other than 5xx and 9xx, which has already been done in .
Join the ContentUpdate and MarcContentUpdate flows: start the MARC and FOLIO flows independently and synchronize them on completion at the applying-changes and committing-changes stages.
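The proposed join can be sketched with two concurrently running flows synchronized before commit. The flow functions are placeholders, assumed for illustration; they do not reflect the actual mod-bulk-operations internals.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of the proposed flow join: run the FOLIO (administrative data)
# and MARC content-update flows independently, then synchronize before
# the apply/commit stages. Flow bodies are placeholders.

def folio_content_update(ids):
    return {"flow": "folio", "prepared": len(ids)}

def marc_content_update(ids):
    return {"flow": "marc", "prepared": len(ids)}

def run_bulk_edit(ids):
    with ThreadPoolExecutor(max_workers=2) as pool:
        folio = pool.submit(folio_content_update, ids)
        marc = pool.submit(marc_content_update, ids)
        # Sync point: both flows must finish before changes are applied.
        results = [folio.result(), marc.result()]
    # ...the apply and commit stages would run here on the merged results.
    return results
```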
The following actions appear to be in place:
add new field with multiple subfields
remove field
modify existing value of the subfield
modify existing indicators
It is necessary to confirm with the front-end team the requirements for marc-update-request content changes.
NFRs:
Performance: Ensure that bulk edits handle large volumes of SRS Bib records efficiently, maintaining response times below <tbd> seconds per record. – MMZ: should be similar to DI, which processes 50k records in ~35 min.
Data Integrity: Implement error-handling and logging to prevent data corruption, ensuring accuracy in batch updates.
Security: Restrict bulk edit access based on permissions, safeguarding sensitive bibliographic data.
Reliability: Provide 99.9% availability for bulk edit functionalities, minimizing downtime.
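The performance target above can be sanity-checked with back-of-envelope arithmetic: 50k records in ~35 minutes works out to roughly 24 records per second, or about 42 ms per record.

```python
# Back-of-envelope throughput check for the stated DI baseline:
# 50,000 records in ~35 minutes.
records = 50_000
minutes = 35
per_second = records / (minutes * 60)   # ~23.8 records/second
ms_per_record = 1000 / per_second       # ~42 ms per record
print(round(per_second, 1), round(ms_per_record, 1))
```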
Notes:
It would be nice to have points 2-4 from already implemented before starting work on this feature.