Data Import Implementers Topic Tracker

This page tracks progress on Data Import issues such as bugs, new features, and topics for discussion. Topics or questions posted in Slack will be added here as well.


Topic Status legend:

Open - To be discussed

Blocked - The group is waiting for further information or some specific action to be completed before progress can continue

In progress - Discussion is ongoing and work is potentially in progress

Closed - Discussion resolved and required actions completed. Closed topics are found on our Archived Data Import Topics page.

How to contribute to other people's discussion topics:

  1. Do not add detail to closed or already-discussed topics, as your comments may be overlooked. Instead, add your details as a new topic and reference the previous topic.

  2. To contribute to an existing topic, add a new paragraph to the description column.

  3. @mention yourself at the beginning of the paragraph

How to indicate you are also interested in a topic:

  1. @mention yourself in the "Interested parties" column and add your institution name

How are topics archived:

When a topic's status is set to Closed by its owner, the topic must also be moved to the Data Import Topic Tracker Archive.

  1. Copy the topic and paste it at the top of the Archived topics page that is nested under this page

  2. Delete the topic from this page

Data Import Issues by Status

Data Import Issues by Type

 

List of Data Import Jira Issues


Topics
Status can be sorted to see Open, In Progress, Closed, or Blocked topics.

 

Status

Topic

Description/use case

Date Added

Provided By (Name/Institution)

Interested Parties

Has Been Discussed (Link to agenda/minutes)

Jira Link

Action Required


1

OPEN

Jobs run immediately after canceled jobs take excess time

Overview: Jobs started immediately after canceling a job get stuck and don't progress

Steps to Reproduce:

  1. Log into Orchid bugfest

  2. Use data import to begin a file of 5000 records with a moderately complex import profile

  3. Cancel the data import job when it hits around 2%

  4. Immediately go to Inventory and trigger an Inventory Single record import by selecting OCLC and using record # 1234567

  5. Then do a few more Inventory single record imports

  6. Watch the log to confirm that the stopped job actually stops and the single record imports are happening quickly

Expected Results:

The job cancels and stops processing. Data import jobs started after the cancellation act normally.

Actual Results:

Single record imports started after canceling a job in this way are slow. One single record import after a cancellation took 17 minutes.

Additional Information:

In Nolana, canceled jobs created a large number of error messages that seemed to affect performance. Have the logs been checked to be sure this isn't still happening? Is something else causing this behavior?

 

Attached file and profile "cornell ebz" can be used to replicate.

BE notes (possible solution):

  1. Create a cache in mod-source-record-manager of jobs that are IN_PROGRESS. When a job is marked as CANCELLED, COMPLETED, or ERROR, update the cache and send a Kafka event (probably we only need the event for the CANCELLED status).

  2. mod-source-record-storage, mod-inventory, mod-orders, and mod-invoice should also have the same cache as mod-source-record-manager. Subscribe these modules to the status update event. When a Job_Cancelled event is received, update the cache and set the status of that job to CANCELLED.

  3. In the four modules mentioned above, when an event for record processing is received, check the JobExecution status in the cache before processing. If it is not cancelled, proceed with processing. If there is no entry for the given JobExecution, call a mod-source-record-manager endpoint to retrieve the info for the cache (alternatively, call mod-source-record-manager to get the cache data only on module startup).
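The caching approach above could be sketched roughly as follows. This is a minimal Python sketch for illustration only; the actual modules are Java-based, and the class, method, and event-field names here are invented, with a plain function standing in for the Kafka consumer and the mod-source-record-manager fallback call.

```python
CANCELLED = "CANCELLED"

class JobExecutionCache:
    """Per-module cache of JobExecution statuses, updated from status-change events."""

    def __init__(self, fetch_status):
        # fetch_status is a stand-in for the fallback call to mod-source-record-manager
        self._statuses = {}
        self._fetch_status = fetch_status

    def on_status_event(self, event):
        # Step 2: update the local cache when a status event (e.g. Job_Cancelled) arrives
        self._statuses[event["jobExecutionId"]] = event["status"]

    def should_process(self, job_execution_id):
        # Step 3: before processing a record, check the cached JobExecution status;
        # on a cache miss, ask the source of truth and remember the answer
        status = self._statuses.get(job_execution_id)
        if status is None:
            status = self._fetch_status(job_execution_id)
            self._statuses[job_execution_id] = status
        return status != CANCELLED
```

In the real design the status events would arrive on a Kafka topic consumed by mod-source-record-storage, mod-inventory, mod-orders, and mod-invoice, so each module can stop work on a cancelled job without an extra HTTP call per record.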

 

2023-03-29

@Jenn Colt 

All

2024-2-21 Data Import Subgroup meeting

UXPROD-4704: Stop processing the job after it was canceled by user (Open)

Converted from bug to new feature.

Ryan T. to get information on how this affects slicing

2

Blocked 

The number of created invoices is displayed when all invoices have errors with invoice lines

Overview:

The file has 18 invoices and 1104 invoice lines.

Preconditions:

  1. Admin user is logged in.

Steps to Reproduce:

  1. Duplicate the "Default – GOBI monograph invoice" profile.

  2. Update the following fields in the copied profile:

    • Name: any unique name

    • Incoming record type: EDIFACT Invoice

    • FOLIO record type: Invoice

    • Description: clean-up the field

    • Details block:
      1) Batch group: any option from the dropdown list
      2) Vendor name: use Organization lookup to find and select GOBI Library Solutions (which will also automatically fill in the Accounting code)
      3) Payment method: any option from the dropdown list.
      1) Batch group: any option from the dropdown list
      2) Vendor name: use Organization lookup to find and select GOBI Library Solutions (which will also automatically fill in the Accounting code)
      3)Payment method: any option from the dropdown list.

       3. Create a new "Action Profile" with a unique valid name and the following properties:

    • Action: Create

    • FOLIO record type: Invoice

    • Link the field mapping profile from step#2.

       4. Create a new "Job profile" with a unique valid name and the following properties:

    • Accepted data type: EDIFACT

    • Link an action profile from step#3.

       5. Upload a valid EDIFACT file using the Job profile from the previous step.

       6. Wait until the file is uploaded.

       7. Check log UI and summary to see the record with the upload result.

       8. Pay attention to the 'Invoice' column in the 'Created' row.

Expected Results: '0' is displayed in the 'Invoice' cell of the 'Created' row in the 'Summary' table in the logs.
Actual Results: '18' is displayed in the 'Invoice' cell of the 'Created' row in the 'Summary' table in the logs.

NOTE: Recreated on Poppy Bugfest:

 

2023-11-23

Tetiana Paranich

 

Dec 13 2023 Data Import Subgroup meeting

MODSOURMAN-1094: The number of created invoices is displayed when all invoices have errors with invoice lines (Blocked)

 

3

Open

Investigate deleting old versions of records from SRS, SPIKE

When SRS records are updated, the previous version is marked as old (and the newest version is marked as actual), but the older versions are not deleted. Over time, many, many previous versions of records will build up in SRS and potentially affect performance.

If we wanted to remove the old records, how complicated would that be, and what might we need to take into consideration?

KS: there is also a lot of "trash" data saved to SRS as a result of failed or stopped imports (records linked to a snapshot/jobExecution that is Cancelled, or records that don't have 999 ff i UUIDs) - consider ways to clean up that data as well.

  • Developers

    • If the UI import log related to the previous version of a record has not yet been deleted, what would happen if an SRS record related to the log were deleted? Would it break the log? Can we prevent SRS records from being deleted if they are still connected to an import log in the UI?

    • Should we plan to keep the current and most recent previous? (would that be helpful for when we implement the rollback feature?)

    • Would there be any issues related to the various UUIDs assigned during imports?

    • Would there be any issues related to

      • quickMARC updates?

      • LDP data extracts?

      • Data export

      • OAI-PMH

    • OK for it to apply to all SRS records? all 3 MARC types, EDIFACT invoices?

    • How often to run the cleanup? Make it variable in the MOD or UI settings?

    • How much effort would this be? T-shirt sizes for UI and BE

  • SMEs

    • Would this be helpful? Do we have any way to measure or estimate the impact on SRS/Import performance before implementing?

    • Any requirements?

    • Any questions or concerns?

Results of this spike

  • Wiki page with design

  • All required Jira stories

  • T-shirt sizes for UI and MOD

  • Decide if this is a separate feature or just a couple of stories

 

2022-08-16

 

 

 

MODSOURMAN-857: SPIKE: investigate deleting old versions of records from SRS (Open)

 

4

Open

Field is shown after being removed via data import when field mapping profile has rule allowing updates for this field

Field is shown after being removed via data import (when field mapping profile has rule allowing updates for this field)

Preconditions:

  • The "830" MARC field with "$a" value "Cambridge tracts in mathematics" of imported "Instance" record must be linked to the "100" field of imported "MARC Authority" record ("Cambridge tracts in mathematics and mathematical physics").

  • Authorized user with the following permissions:
      Data import: Can upload files, import, and view logs
      Inventory: All permissions
      MARC Authority: View MARC authority record
      quickMARC: Can Link/unlink authority records to bib records
      quickMARC: View, edit MARC bibliographic record
      UI: Data export module is enabled

  • User is on "Inventory" pane with search results for "Instance" record which was linked with "MARC Authority" (see Precondition, e.g.: "The algebraic theory of modular systems / by F.S. Macaulay.").

Steps to Reproduce:

  1. Check the checkbox that is displayed next to the "Instance" record which is linked to "MARC Authority" records on the search result pane.
    For example: "The algebraic theory of modular systems / by F.S. Macaulay.".

  2. Click on the "Actions" button on the second pane and select "Export Instances (MARC)" option.

  3. Go to "Data export" app.

  4. Download exported ".mrc" file by clicking on the "File name" column value.

  5. Open downloaded ".mrc" file via "MarcEdit" (or any similar app).

  6. Delete the linked field (see preconditions): "830" field

  7. Save the edited ".mrc" file.

  8. Go to the "Data import" app >> Click on the "or select files" button >> Select the updated ".mrc" file (see previous step) >> click the "Open" button

  9. Click on the "Update MARC Bib by matching 999 ff $s subfield value (830 - update all)" job profile created in the preconditions >> Click on "Actions" in the third pane that appears >> Select the "Run" option >> Click the "Run" button in the modal that appears.

  10. Find updated record in "Inventory" app

  11. Click on the "Actions" button and select "Edit MARC bibliographic record" option from the expanded menu.

Expected Results: Deleted "830" field (see step 6) is not shown.

Actual Results: Deleted "830" field (see step 6) is shown and has divided boxes (see attached screencast).

 

2023-03-09

 

 

 

MODSOURCE-691: Field is shown after being removed via data import (when field mapping profile has rule allowing updates for this field) (Open)

 

5

Blocked

Match on 035$a with qualifier fails

When updating an SRS record using a match on the 035$a with a qualifier on the incoming MARC record, the match fails.

Steps to Reproduce:

  1. Log into bugfest-poppy.

  2. Open Data Import and look at Job 10638.

    1. Job Profile = CAH Update SRS MARC on 035$a match w/ qualifier

      1. This profile matches MARC to MARC on the incoming 035$a with a qualifier exactly matching the 035$a of the existing record : (MiFhGG)galncbln000092 -> galncbln000092.

      2. For matches, the record is updated overriding the protection on the 856. 

    2. Sample test records = 51356_test_records.mrc

Expected Results: The job matches the incoming records to the SRS records associated with in10783235 and in10783236 and updates these records with the new 856 in the incoming MARC record.

Actual Results: The incoming records are not matched, the log for SRS MARC says 'No action', and the SRS record is not updated.  

Additional Information: I tested the Field Mapping Profile without the 856 protection and it still failed (Job 10641).  

When reviewing logs on an internal system, the error messages given note that a match is not found.

Additional testing was done on changing the 'Match criterion' of the existing record and no value was found to make the Match profile successful.  

Original testing done in an Orchid environment.
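The expected comparison described in this report can be illustrated as: strip the parenthesized qualifier from the incoming 035$a, then compare the result exactly with the existing value. This is a hypothetical Python sketch, not the actual FOLIO matching code; the function names are invented.

```python
import re

def strip_035_qualifier(value):
    """Hypothetical illustration: remove a leading parenthesized qualifier,
    e.g. '(MiFhGG)galncbln000092' -> 'galncbln000092'."""
    return re.sub(r"^\([^)]*\)\s*", "", value).strip()

def qualifier_match(incoming_035a, existing_035a):
    # Expected behavior from the report: the incoming value with its
    # qualifier stripped should exactly match the existing 035$a.
    return strip_035_qualifier(incoming_035a) == strip_035_qualifier(existing_035a)
```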

 

2023-11-15

@Yael Hod 

@Corrie Hutchinson (Unlicensed) 

 

2024-1-31 Data Import Subgroup meeting

MODDICORE-386: Match on 035$a with a qualifier fails (Blocked)

Ryan to review Jira with Folijet leads to understand current design and identify requirement gaps

Partial matching, e.g. begins with, ends with, is required but it does not function as it should. Only exact matching seems to work.

6

Open

Subfield can't be removed when updating Marc bib upon import

Subfield cannot be removed when updating "MARC Bib" upon import when the field mapping profile has rules allowing updates of several subfields in all fields (including the subfield which is being added)

Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be Update MARC Bib

• "Field mapping profile" should have following rules specified:

Update specific fields only

Field: *
In.1: *
In.2: *
Subfield: 1.

Field: *
In.1: *
In.2: *
Subfield: 2.

No Field protection overrides

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile (it has subfields "$1", "$2" in two fields)

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions (For preconditions, use MARC_bib_Black_panther_five_nonRepeatable_linkable_fields.mrc file attached to this Jira, and import using the default Create MARC Bib and Instance job profile)

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. In 130 field: delete "$2" subfield and update "$1" subfield value (e.g., change "$1" value to "new")

  4. In 240 field: delete "$2" subfield and add one more "$1" subfield (e.g., add "$1onemore")

  5. Save updated file

  6. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)" job profile created in Preconditions

  7. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$2" removed from both edited fields. The following "$1" subfield values are shown in the fields:

  •  "130" field: "$1new"

  • "240" field: "$1original $1onemore"

Actual Results: "$2" subfields are not removed from edited fields. "$1" subfields updated/added as expected

 

2023-06-01

 

 

 

MODDICORE-353: Subfield cannot be removed when updating "MARC Bib" upon import (field mapping profile allows update of several subfields in all fields) (Open)

 

7

Open

Fields duplicated when adding one subfield when updating Marc bib upon import

Fields duplicated when adding one subfield when updating "MARC Bib" upon import when the field mapping profile has rules allowing updates of several subfields in all fields (including the subfield which is being added)

Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be: Update MARC Bibs

• "Field mapping profile" should have following rules specified:

Update specific fields:

Field: *
In.1: *
In.2: *
Subfield: 1.

Field: *
In.1: *
In.2: *
Subfield: 2.

No overrides to MARC field protection

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions 

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. Add "$1" subfield values to fields with no "$1" (e.g.:

    1. add "$1555555" to "035" field

    2. add "$1test" to "264" field

  4. Save updated file

  5. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)" job profile created in Preconditions

  6. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$1" added to edited "035", "264" fields. Fields were not duplicated 

Actual Results: "035", "264" fields were duplicated. First copy of each remains unchanged (no added subfield), while second copies contain added "$1" subfield

 

2023-06-01

 

 

 

MODDICORE-352: Fields duplicated when adding one subfield when updating "MARC Bib" upon import (field mapping profile allows update of several subfields in all fields) (Open)

 

8

Open

Fields duplicated when adding several subfields when updating marc bib upon import

Fields duplicated when adding several subfields when updating "MARC Bib" upon import when the field mapping profile has rules allowing updates of the corresponding subfields in the corresponding fields

Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (240, 600)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be: Update MARC Bibs

• "Field mapping profile" should have following rules specified:

Update specific MARC fields

Field: 240
In.1: *
In.2: *
Subfield: 1.

Field: 240
In.1: *
In.2: *
Subfield: 2.

Field: 600
In.1: *
In.2: *
Subfield: 1.

Field: 600
In.1: *
In.2: *
Subfield: 2.

No overrides to the existing field protections

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions 

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. Add "$1" and "$2" subfields to "240" field (e.g., add "$1test$2testing")

  4. Add "$1" and "$2" subfields to one of the "600" fields (e.g., add "$1test$2testing" to the third "600" field)

  5. Save updated file

  6. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (240, 600)" job profile created in Preconditions

  7. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$1", "$2" added to edited "240", "600" fields. Fields were not duplicated 

Actual Results: "240", "600" fields were duplicated. First copy of each remains unchanged (no added subfields), while second copies contain added "$1", "$2" subfields

 

2023-06-01

 

 

 

MODDICORE-351: Fields duplicated when adding several subfields when updating "MARC Bib" upon import (field mapping profile allows update of these fields/subfields) (Open)

 

9

Open

Duplicate field is added when updating $0 in linked marc bib field upon data import if field mapping profile allows $0 update

Duplicate field is added when updating "$0" in linked "MARC bib" field upon data import if field mapping profile specifically allows "$0" update

2023-02-15

 

 

 

MODSOURCE-594: Duplicate field is added when updating "$0" in linked "MARC bib" field upon data import if field mapping profile specifically allows "$0" update (Open)

 

10

Open

Incorrect behavior of "Delete Files" button

Note: Does not always reproduce

"Delete files" request deletes the file, but the deletion does not always show in the UI.
Delete request returns "Cannot delete uploadDefinition 09ef7415-34e4-44cd-9af3-31953df9f200 - linked files are already being processed".

2022-06-02

 

 

 

MODDATAIMP-691: Incorrect behaviour of "Delete files" button (Open)

 

11

Open

Asynchronous migration is not completed

The asynchronous migration script was run, but the migration has not completed; the migration job is still IN_PROGRESS.

2023-06-04

 

 

 

MODSOURCE-665: SPIKE: Investigate cause for asynchronous migration not completing (Open)

 

12

Open

Review and fix Marc updates for individual fields

Currently (as of Orchid), the Data Import MARC Updates for specific fields do not handle repeatable fields properly. The logic needs updating, and UI may need updating to indicate how incoming repeatable MARC fields should be handled vis-a-vis the same repeatable field(s) in the existing SRS MARC Bib. This is similar to how the field protection logic needed updating to handle repeatable vs non-repeatable fields properly.

2023-02-20

 

 

 

UXPROD-4080: Review and fix MARC Updates for individual fields (Draft)

 

13

BLOCKED

Partial matching doesn't work

Partial matching, e.g. begins with, ends with, is required but it does not function as it should. Only exact matching seems to work.

2021-01-25

@Yael Hod (Stanford)
@Corrie Hutchinson (Unlicensed) Chicago

 

2024-1-31 Data Import Subgroup meeting

MODDICORE-386: Match on 035$a with a qualifier fails (Blocked)

Review Jira with Folijet leads to understand current design and identify requirement gaps.
14

in progress

Add new subfields to Electronic access (856)

New subfields in the MARC 856 field need to be represented in Inventory data. The same elements should appear in the electronic access block in Instance, Holdings, and Item records. https://www.loc.gov/marc/bibliographic/bd856.html

2023-09-14

 

 

2024-7-10 Data Import Subgroup meeting

 

UXPROD-4467: Electronic Access Block--New Elements (Instance, Holdings, Items) (Open)

 

15

In Progress

Ability to change the link to a profile rather than just remove it

Current situation: We are only able to link or unlink profiles (field mapping to action, action to a job, match to a job, etc).

New Feature: We want to be able to change the link rather than just unlink

Expected behavior: There is another option that allows the user to change the link to a different profile.

Use case: The wrong profile was used and the new one needs to be added. Rather than unlinking everything, it'd be easier to just update the link to the correct one.

2024-02-27

@Jennifer Eustis

 

2024-2-28 Data Import Subgroup meeting

2024-7-10 Data Import Subgroup meeting

https://folio-org.atlassian.net/browse/UXPROD-4935

Ryan will create a ticket. This might involve rethinking the profiles page setup.

16

Open

Unable to pull vendor account number from POL when importing EDIFACT invoices

In our previous system the vendor account number lived at the PO/Invoice level. Now it is on the POL/invoice line. I have not found a way when loading EDIFACT invoice files to draw this directly from the POL or to retrieve it from the vendor file. This means for each invoice we must put in all of the vendor account numbers manually, which adds up and is prone to error. If there is a way that data import could pull this value from the POL it would save so much time in our processing.

2024-04-12

@Kimberly Pamplin

 

2024-7-10 Data Import Subgroup meeting

 

Need more information from @Kimberly Pamplin

17

In Progress

Additional values needed for Electronic access fields or 856 subfields

Issue: Right now, only a few subfields from the 856 are mapped. We would like to expand that to include the non-public note (856$x), access status (856$7), and terms governing access (856$n).

 

@Jennifer Eustis

All

2024-7-10 Data Import Subgroup meeting

https://folio-org.atlassian.net/browse/UIIN-2579

Ryan will also look into mapping indicators.
Need to account for all Inventory record types
Need to account for bulk edit, data import/export, ???
18

In Progress

Ensure consistency of UI for blank indicators between Bulk Edit, Data Export, quickMarc, and Data Import

Issue: Data Import displays blank indicators as a space, quickMARC displays them as a slash, and so on. To avoid confusion, blank indicators and the mapping of MARC fields and subfields should be handled consistently across Bulk Edit, Data Import, and Data Export.

2024-02

@Jennifer Eustis

All

2024-7-10 Data Import Subgroup meeting

 

Ryan is bringing this topic to Magda and Christine to discuss.
19

IN PROGRESS

Reporting: Have the ability to download a list of instance, holdings, or item record identifiers that were successfully imported

Issue: There isn't a way to retrieve a list of identifiers through the Data Import log.

2024-07-11

@Jennifer Eustis

All

 

 

Ryan will look into making rows 54-56 one epic with smaller stories.
20

IN PROGRESS

Reporting: Have the ability to save a list of successfully imported records to a list in the Lists App

New Functionality. In addition to downloading a list, it would be great to be able to save the imported identifiers to a list in the Lists App

2024-07-11

@Jennifer Eustis

All

 

 

 

21

IN PROGRESS

Reporting: Have the ability to download a list of errors from an import

Issue: The only way to see errors is to navigate the log and click on the title to see the JSON. Having an export like the one in Bulk Edit would be helpful.

2024-07-11

@Jennifer Eustis

All

 

https://folio-org.atlassian.net/browse/UIDATIMP-914

 

22

OPEN

Ability to view application log

When DI was in the planning phase, there was a request to be able to view the application log. Examples were provided from other systems. This is still needed. This was shown as "server logs" in the original wireframes. See

2024-08-15

Lab Session

ALL

 

 

 

23

OPEN

Ability to update instance and marc srs in same job

Users need to be able to update the administrative data and also override protected fields to update the SRS bib record. Tested in a lab session on 10-17-2024; this didn't work in Poppy (Chicago test environment).

2024-10-17

Lab session

all

 

 

 

24

OPEN

Add date and start/stop running date and times to the summary log page

Right now to see these times, you have to click out of the summary log view and back to the brief log view. Having this information displayed also on the summary page is helpful and needed.

2024-10-17

Lab session

ALL

 

 

 

25

OPEN

Data Import log does not provide reason for No Action status.

The reason was previously provided as an error, even though No action means updates or creates were not performed because of the profile logic (multiple matches, a single match with no create or update action provided, no matches, etc.). The reason for No action should be provided in the log because it could be any of a number of scenarios.

2024-10-23

@Christie Thomas University of Chicago

 

 

 

 

26

OPEN

Update instance, holdings, and item in reverse order.

Right now the instance, holdings, and item must be updated in that order. It is also not possible to update an item independently and then, in the same job, match and update the instance and holdings. When updating all three records as part of a shelf-ready workflow, integrations (FOLIO app and external) require that the barcode be added to the item before the holdings record is updated. We need to be able to match an instance, holdings, and item (in that order) and then update the item, holdings, and instance, or the item, instance, and holdings, in either of those orders. Error message from import in Poppy: io.vertx.core.json.DecodeException: Failed to decode:Cannot deserialize value of type `java.util.LinkedHashMap<java.lang.Object,java.lang.Object>` from Array value (token `JsonToken.START_ARRAY`) at [Source: (String)"[{"id":"babefda2-17c3-4ff2-a677-f469c1b7bb59","_version":3,"hrid":"13642832","holdingsTypeId":"0c422f92-0f4d-4d32-8cbe-390ebc33a3e5","formerIds":[],"instanceId":"1980ec39-2d53-42d9-839b-d4d080850c76","permanentLocationId":"fad8517a-aae4-5b69-855e-01843e6e4d88","effectiveLocationId":"fad8517a-aae4-5b69-855e-01843e6e4d88","electronicAccess":[],"callNumberTypeId":"95467209-6d7b-468b-94df-0f5d7ad2747d","callNumber":"PL2260.52.B536A5 2019","notes":[],"holdingsStatements":[],"holdingsStatementsForInde"[truncated 371 chars]; line: 1, column: 1]""
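The DecodeException quoted above indicates that the consumer received a JSON array of holdings where it expected a single JSON object (Jackson's `JsonToken.START_ARRAY`). A minimal Python stand-in for the mismatch (the actual code is Vert.x/Jackson in Java, and the one-element unwrap shown is only a hypothetical workaround, not the FOLIO fix):

```python
import json

# Abbreviated payload shaped like the one in the error message above
payload = '[{"id": "babefda2-17c3-4ff2-a677-f469c1b7bb59", "hrid": "13642832"}]'

decoded = json.loads(payload)
# A strict decode into a single holdings object fails because the
# payload is actually an array of holdings objects.
assert isinstance(decoded, list)

# Hypothetical tolerant consumer: unwrap a one-element array before use
holdings = decoded[0] if isinstance(decoded, list) else decoded
assert holdings["hrid"] == "13642832"
```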

2024-10-31

@Christie Thomas University of Chicago

 

 

 

 

27

OPEN

Update the SRS with override field protections enabled and update the instance status and cataloged date in a single job.

It is not possible to pair an SRS update with an instance update in the same job. We need to update the SRS MARC record and the instance record in a single job with a single match (match an instance or an SRS MARC record and specify both an SRS MARC bibliographic update profile and an instance update profile, or create an action that is linked to multiple field mapping profiles).

2024-10-31

@Christie Thomas

University of Chicago

 

 

 

 

28

 

Delete holdings and items in batch via data import

When marking an instance for deletion we should be able to also delete all holdings and items attached to the instance or delete holdings and items targeted by identifier.

2024-11-21

@Christie Thomas University of Chicago