
This page is meant to track progress on issues such as bugs, new features, or topics to be discussed for Data Import. Topics or questions posted in Slack will be added here as well.

Topic Status legend:

  • Open (blue) - To be discussed

  • Blocked - The group is waiting for further information or some specific action to be completed before progress can continue

  • In progress (yellow) - Discussion is ongoing and work is potentially in progress

  • Closed (green) - Discussion resolved and required actions completed

Priority Status legend:

  • Critical (red) - High priority

  • High (yellow) - Medium priority

  • Low - Low priority

Closed topics are found on our Archived Data Import Topics page.

How to contribute to other people's discussion topics:

  1. Do not add detail to closed or discussed topics, as your comments may be overlooked. In this situation, it is best to add your details as a new topic and reference the previous topic.

  2. To contribute to an existing topic, add a new paragraph to the description column.

  3. @mention yourself at the beginning of the paragraph.

How to indicate you are also interested in a topic:

  1. @mention yourself in the "Interested parties" column and add your institution name

How are topics archived:

When a topic's status is set to Closed by its owner, the topic must also be moved to the "Data Import Topic Tracker Archive" page:

  1. Copy the topic and paste it at the top of the Archived topics page that is nested under this page

  2. Delete the topic from this page

Data Import Issues by Status

Jira pie chart by status (System JIRA): statusCategory IN ("undefined", "In Progress", "To Do") AND labels IN ("data-import")

Data Import Issues by Type

Jira pie chart by issue type (System Jira): labels = data-import

List of Data Import Jira Issues

Jira issue list (System Jira; columns: key, summary, type, created, priority, status, votes; maximum 10 issues): labels = data-import AND statusCategory in (2, 4)

Topics
Status can be sorted to see Open, In Progress, Closed or Blocked

MARC-MARC matches and MARC-Inventory matches have differing use cases. Pairing a MARC-MARC match with a more specific MARC-Instance or MARC–Holdings or MARC–Item match allows for identifying a specific record to be updated, or confirms that a new record is needed.

Columns: Status | Topic | Description/use case | Date Added | Provided By (Name/Institution) | Interested Parties | Has Been Discussed (Link to agenda/minutes) | Jira Link | Action Required

1

Status: Closed

2023-06-13

Corrie Hutchinson (Unlicensed)

All

2024-1-17 Data Import Subgroup meeting

2024-1-10 Data Import Subgroup meeting

MARC-MARC Matching Enhancements


We want to ensure that MARC-MARC matching works properly for repeatable and non-repeatable fields, especially 0XX/9XX fields, and that they can pair well with Inventory submatches.

In scope:

  • MARC-MARC matches that result in multiple possible hits can be narrowed to single records with MARC-Inventory or static value submatches

  • Review any existing matching bugs and plan to resolve as part of this feature

Out of scope:

  • After a MARC-MARC or MARC-Instance match, a user can include both Instance and MARC Bib actions afterwards (need examples from users)

  • Confirm that MARC matches are working properly using indicator wildcards (asterisks) versus blanks

  • Currently ISBN matches do not translate between the 10-digit and 13-digit forms so that they can be matched against each other. Should that be included in this feature, or handled as a separate feature in the future? See MODSOURMAN-269

  • Should we add a bug for not being able to have an override action for field protections under a MARC-Instance match?

  • What else?

Use case(s):

  • SMEs: Please add examples
     * MARC-MARC match on OCLC number and then submatch by Instance status
     * MARC-MARC match on 001 and then submatch for holdings by permanent location

  • Need a use case that results in multiple SRS hits that then need to be narrowed down by Inventory match
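On the ISBN point above: translating between the two forms is mechanical (prefix 978 and recompute the check digit), so incoming and existing values can be normalized before comparison. A minimal sketch of that normalization; the function names are illustrative and not part of any FOLIO module:

```python
def isbn10_to_isbn13(isbn10: str) -> str:
    """Convert a 10-digit ISBN to its 13-digit (978-prefixed) form."""
    digits = isbn10.replace("-", "").strip()
    if len(digits) != 10:
        raise ValueError("not an ISBN-10")
    core = "978" + digits[:9]          # drop the old ISBN-10 check digit
    # EAN-13 check digit: alternating weights of 1 and 3
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(core))
    check = (10 - total % 10) % 10
    return core + str(check)

def isbns_match(a: str, b: str) -> bool:
    """Compare two ISBNs regardless of 10- vs 13-digit form."""
    def norm(s: str) -> str:
        s = s.replace("-", "").strip()
        return isbn10_to_isbn13(s) if len(s) == 10 else s
    return norm(a) == norm(b)
```

With this kind of normalization, `0-306-40615-2` and `9780306406157` would be treated as the same identifier.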

2020-05-13

All

All

2024-1-17 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUXPROD-2742

  •  More Use Cases and scenarios
2

Status: Closed

Data Import removes duplicate 856s in SRS

Overview: When updating an SRS record via Data Import, some MARC fields are duplicated while others are de-duped, without notification or guidelines.


Steps to Reproduce:

  1. Log into any environment on Orchid or Nolana.

  2. Identify any record in inventory and export the MARC record.

  3. Add two exact duplicate 856 fields to the record.

    1. Exact duplicates including indicators and subfields.  Copy & paste.

  4. Import using a simple overlay job profile matching on a subfield of the 999 to update the SRS.  Do not use MARC modifications.

Expected Results: The SRS record contains duplicate 856 fields.
Actual Results: The SRS record contains only one of the duplicate 856 fields.

Additional Information: We know that Data Import does not de-dupe the 903 field, for example, during an update, but it does de-dupe the 856 field. Data Import jobs that create new SRS records include the duplicate 856 fields. This raises several questions:

  • Why does the system de-dupe during an update without explicit instructions from the user?  

  • Which fields does the system de-dupe and when?

  • Is this a bug?  Or by design?  

From testing, there appears to be no difference between de-duping of the 856 when field protections are applied or not.
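The behaviour described above is consistent with de-duplication that is keyed on exact field content but applied only to certain tags. A hypothetical sketch of that rule, assuming simple (tag, indicators, subfields) tuples rather than the actual SRS data model; which tags are actually de-duped is one of the open questions:

```python
DEDUPED_TAGS = {"856"}  # assumption: 856 observed de-duped, 903 observed kept

def dedupe_fields(fields, deduped_tags=DEDUPED_TAGS):
    """Drop exact duplicates (same tag, indicators, subfields) of de-duped tags."""
    seen = set()
    kept = []
    for tag, indicators, subfields in fields:
        key = (tag, indicators, tuple(subfields))
        if tag in deduped_tags and key in seen:
            continue  # exact duplicate of a de-duped tag: silently dropped
        seen.add(key)
        kept.append((tag, indicators, subfields))
    return kept
```

With two identical 856s and two identical 903s as input, only one 856 survives while both 903s remain, matching the reported update behaviour.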

3

Status: Open

Jobs run immediately after canceled jobs take excess time

Overview: Jobs started immediately after canceling a job get stuck and don't progress


Steps to Reproduce:

  1. Log into Orchid bugfest

  2. Use data import to begin a file of 5000 records with a moderately complex import profile

  3. Cancel the data import job when it hits around 2%

  4. Immediately go to Inventory and trigger an Inventory Single record import by selecting OCLC and using record # 1234567

  5. Then do a few more Inventory single record imports

  6. Watch the log to confirm that the stopped job actually stops and the single record imports are happening quickly

Expected Results:

The job cancels and stops processing. Data import jobs started after the cancellation act normally.

Actual Results:

Single record imports started after canceling a job like this are slow. One single record import after a cancellation took 17 minutes.

Additional Information:

In Nolana, canceled jobs created a large number of error messages that seemed to affect performance. Have the logs been checked to be sure this isn't still happening? Is something else causing this behavior?

Attached file and profile "cornell ebz" can be used to replicate.

BE notes (possible solution):

  1. Create a cache in mod-source-record-manager with jobs that are IN_PROGRESS. When a job is marked as CANCELLED, COMPLETED, or ERROR, update the cache and send a Kafka event (probably only the event for CANCELLED status is needed).

  2. mod-source-record-storage, mod-inventory, mod-orders, and mod-invoice should have the same cache as mod-source-record-manager. Subscribe these modules to the status update event. When a Job_Cancelled event is received, update the cache and set the status of that job to CANCELLED.

  3. In the four modules mentioned above, when an event for record processing is received, check the JobExecution status in the cache before processing. If it is not cancelled, proceed with processing. If there is no record for the given JobExecution, call a mod-source-record-manager endpoint to retrieve info for the cache (alternatively, call mod-source-record-manager to get the cache data only on module startup).
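The steps above amount to a small per-module status cache. A sketch of that idea; the module and endpoint roles follow the note, but the class and method names are illustrative assumptions, not the actual implementation:

```python
class JobStatusCache:
    """Local cache of JobExecution statuses, as proposed in the BE notes."""

    def __init__(self, fetch_status):
        # fetch_status: fallback lookup for unknown jobs, standing in for a
        # call to a mod-source-record-manager endpoint on a cache miss
        self._statuses = {}
        self._fetch_status = fetch_status

    def on_status_event(self, job_id, status):
        """Handle a Kafka job-status event (e.g. CANCELLED)."""
        self._statuses[job_id] = status

    def should_process(self, job_id):
        """Check before processing a record: skip records of cancelled jobs."""
        status = self._statuses.get(job_id)
        if status is None:
            status = self._fetch_status(job_id)  # cache miss: ask SRM
            self._statuses[job_id] = status
        return status != "CANCELLED"
```

Once a CANCELLED event arrives, every consumer can drop that job's remaining records cheaply instead of processing them.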

2023-03-29

Jenn Colt 

All

2024-2-21 Data Import Subgroup meeting

Jira: MODSOURMAN-970

Jira: MODDATAIMPUXPROD-879

  •  Need to define what deduplication means
  •  Clarify expectations
  •  Deduping in the UI vs SRS
  •  Deduping the incoming record

2

Status: Blocked

The number of created invoices is displayed when all invoices have errors with invoice lines

Overview:

Jira: MODSOURMAN-970

The file has 18 invoices and 1104 invoice lines.


Preconditions:

  1. Admin user is logged in.

Steps to Reproduce:

  1. Duplicate the "Default – GOBI monograph invoice" profile.

  2. Update the following fields in the copied profile:

    • Name: any unique name

    • Incoming record type: EDIFACT Invoice

    • FOLIO record type: Invoice

    • Description: clean-up the field

    • Details block:
      1) Batch group: any option from the dropdown list
      2) Vendor name: use Organization lookup to find and select GOBI Library Solutions (which will also automatically fill in the Accounting code)
      3) Payment method: any option from the dropdown list.

       3. Create a new "Action Profile" with a unique valid name and the following properties:

    • Action: Create

    • FOLIO record type: Invoice

    • Link the field mapping profile from step#2.

       4. Create a new "Job profile" with a unique valid name and the following properties:

    • Accepted data type: EDIFACT

    • Link an action profile from step#3.

       5. Upload a valid EDIFACT file using Job profile from the previous step.

       6. Wait till the file is uploaded.

       7. Check log UI and summary to see the record with the upload result.

       8. Pay attention to the 'Invoice' column in the 'Created' row.

Expected Results: '0' is displayed in the 'Invoice' column of the 'Created' row in the 'Summary' table in the logs.
Actual Results: '18' is displayed in the 'Invoice' column of the 'Created' row in the 'Summary' table in the logs.

NOTE: Recreated on Poppy Bugfest.

2023-11-23

Tetiana Paranich

Dec 13 2023 Data Import Subgroup meeting

Jira: UXPRODMODSOURMAN-4704

Converted from bug to new feature.

Ryan T. to get information on how this affects slicing

4

1094

3

2023-09-14

Jira: UIDATIMP-1521

10

Status: Closed

Data Import log displays ambiguous information for successful matches on anything but 999i/instance uuid

If your match profile matches incoming MARC 020 on instance ISBN, and the incoming MARC record matches an instance, the reported status in the log for the SRS record will be Updated if the MARC record also contains a 999-ff-i matching the instance's UUID, but Created if the incoming MARC record does not contain a 999-ff-i. In both cases the status reported for the instance in the log is Updated.


To a librarian wishing to overlay existing FOLIO records with matching incoming MARC record, it is a source of great confusion that, when there is a match and an underlying SRS record, the status of the SRS record is sometimes reported as Updated and sometimes as Created – even if you are matching on the same field, overlaying the same record. This may cause enough uncertainty about whether Data Import works as expected, overlaying the right records when it should, to make the librarian reluctant about using Data Import.

It seems that either

  • there is something wrong in how FOLIO creates/updates existing SRS records when matching on anything but the UUID in 999i
    or

  • the information about SRS update/creation displayed in the log is incorrect or designed in a way that is more confusing than useful to end users (librarians)

It would be great, if as a first step, someone with insight into how Data Import works could review the current behaviour to assess whether librarians can "safely" use Data Import with match profiles even though the log shows this ambiguous result.

Steps to Reproduce:

To test, you need:

  • a FOLIO instance with an underlying SRS record

  • three .mrc files: one with the UUID of the instance in 999i, one without a 999i field but with the ISBN of the instance in 020a, and one with a 999i field that contains a value which is not an instance UUID.

Use the following job profile:
https://bugfest-kiwi.folio.ebsco.com/settings/data-import/job-profiles/view/5d105836-e127-48f4-a942-dec42ef265e6

Try the following three:

  1. Using a job profile that matches on identifier ISBN, import a MARC record that contains a 999i field with a FOLIO identifier.

  2. Using a job profile that matches on identifier ISBN, import a MARC record that does not contain a 999i field with a FOLIO identifier.

  3. Using a job profile that matches on identifier ISBN, import a MARC record that contains a 999i field with the value "catsarecute"

Expected Results:

  1. Using a job profile that matches on identifier ISBN, import a MARC record that contains a 999i field with a FOLIO identifier.
    The incoming 020 of the MARC record successfully matches on the ISBN of the instance. In the log, instance and SRS have status Updated.

  2. Using a job profile that matches on identifier ISBN, import a MARC record that does not contain a 999i field with a FOLIO identifier.
    The incoming 020 of the MARC record successfully matches on the ISBN of the instance. In the log, instance and SRS have status Updated.

  3. Using a job profile that matches on identifier ISBN, import a MARC record that contains a 999i field with the value "catsarecute".
    The incoming 020 of the MARC record successfully matches on the ISBN of the instance. In the log, instance and SRS have status Updated.

2022-05-30

Jira: MODSOURMAN-848

11

Status: Open

Field is shown after being removed via data import when field mapping profile has rule allowing updates for this field

Field is shown after being removed via data import (when field mapping profile has rule allowing updates for this field)


Preconditions:

  • The "830" MARC field with "$a" value "Cambridge tracts in mathematics" of imported "Instance" record must be linked to the "100" field of imported "MARC Authority" record ("Cambridge tracts in mathematics and mathematical physics").

  • Authorized user with the following permissions:
      Data import: Can upload files, import, and view logs
      Inventory: All permissions
      MARC Authority: View MARC authority record
      quickMARC: Can Link/unlink authority records to bib records
      quickMARC: View, edit MARC bibliographic record
      UI: Data export module is enabled

  • User is on "Inventory" pane with search results for "Instance" record which was linked with "MARC Authority" (see Precondition, e.g.: "The algebraic theory of modular systems / by F.S. Macaulay.").

Steps to Reproduce:

  1. Check the checkbox that is displayed next to the "Instance" record which is linked to "MARC Authority" records on the search result pane.
    For example: "The algebraic theory of modular systems / by F.S. Macaulay.".

  2. Click on the "Actions" button on the second pane and select "Export Instances (MARC)" option.

  3. Go to "Data export" app.

  4. Download exported ".mrc" file by clicking on the "File name" column value.

  5. Open downloaded ".mrc" file via "MarcEdit" (or any similar app).

  6. Delete the linked field (see preconditions): "830" field

  7. Save the edited ".mrc" file.

  8. Go to the "Data import" app >> Click on the "or select files" button >> Select the updated ".mrc" file (see previous step) >> click the "Open" button

  9. Click on the "Update MARC Bib by matching 999 ff $s subfield value (830 - update all)" job profile created in the preconditions >> Click on the "Actions" in the appeared third pane >> Select "Run" option >> Click on the "Run" button in the appeared modal.

  10. Find updated record in "Inventory" app

  11. Click on the "Actions" button and select "Edit MARC bibliographic record" option from the expanded menu.

Expected Results: Deleted "830" field (see step 6) is not shown.

Actual Results: Deleted "830" field (see step 6) is shown and has divided boxes (see attached screencast).

2023-03-09

Jira: MODSOURCE-691

12

Status: Blocked

match on 035$a with qualifier fails

When updating an SRS record using a match on the 035$a with a qualifier on the incoming MARC record, the match fails.


Steps to Reproduce:

  1. Log into bugfest-poppy.

  2. Open Data Import and look at Job 10638.

    1. Job Profile = CAH Update SRS MARC on 035$a match w/ qualifier

      1. This profile matches MARC to MARC on the incoming 035$a with a qualifier exactly matching the 035$a of the existing record: (MiFhGG)galncbln000092 -> galncbln000092.

      2. For matches, the record is updated overriding the protection on the 856. 

    2. Sample test records = 51356_test_records.mrc

Expected Results: The job matches the incoming records to the SRS records associated with in10783235 and in10783236 and updates these records with the new 856 in the incoming MARC record.

Actual Results: The incoming records are not matched, the log for SRS MARC says 'No action', and the SRS record is not updated.  

Additional Information: I tested the Field Mapping Profile without the 856 protection and it still failed (Job 10641).  

When reviewing logs on an internal system, the error messages given note that a match is not found.

Additional testing was done on changing the 'Match criterion' of the existing record and no value was found to make the Match profile successful.  

Original testing done in an Orchid environment.

2023-11-15

Yael Hod 

Corrie Hutchinson (Unlicensed) 

2024-1-31 Data Import Subgroup meeting

Jira: MODDICORE-386

Ryan to review Jira with Folijet leads to understand current design and identify requirement gaps

Partial matching (e.g., begins with, ends with) is required, but it does not function as it should. Only exact matching seems to work.
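For reference, the intended qualifier behaviour in this topic amounts to stripping the parenthetical institution prefix from the incoming 035 $a before comparing it with the existing record's value. An illustrative sketch of that interpretation; this is not the actual matcher code, and the function names are assumptions:

```python
import re

def strip_parenthetical_qualifier(value: str) -> str:
    """Remove a leading "(...)" prefix, e.g. "(MiFhGG)" in an 035 $a."""
    return re.sub(r"^\([^)]*\)", "", value).strip()

def values_match(incoming: str, existing: str) -> bool:
    """Compare an incoming qualified value against an existing bare value."""
    return strip_parenthetical_qualifier(incoming) == existing
```

Under this reading, incoming "(MiFhGG)galncbln000092" should match an existing "galncbln000092", which is exactly the case the job above fails on.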

13

Status: Closed

data import sorts protected fields out of order after update

Field protection moves the protected field to the front of its Nxx block. For instance, if the protected field is a 541, the protected 541 becomes the first of all 5xx tags (see screenshot attached).


Steps to Reproduce:

  1. Log into folio-snapshot as diku_admin.

  2. Go to settings > Data import > field protections and add the 541 tag as a protected field. 

  3. In settings > data import, create a new update instance mapping and action profile. The profile can be empty, or the status / statistical code / cataloged date can be changed.

    1. Create a new match profile on the 999ff$i to instance uuid.

    2. Create a new job using the new match profile and instance uuid.

  4. Use the default import profile to create a new record in the system that has a 541 tag. 

  5. Go to inventory and locate the newly created record. Select the record and then use the Action menu to export the instance.

  6. Open the newly created record and edit the 541 in QuickMARC so that you will know which was the existing 541. Save the record.

  7. Use data import to overlay the FOLIO record. There should be two 541s in the record. The existing 541 (protected) will be the first 5xx field in the record and the newly imported 541 will be in its existing place in the record.

Expected Results: Both 541 fields will be next to each other, filed according to their previous position. It can be assumed that the protected 541 will be the first 541 in the new record where both 541 fields are present.

Actual Results: The protected 541 is filed at the beginning of all 5xx fields in the record and the new 541 is in place as it was in the incoming file.

2023-05-08

Jira: MODDICORE-358

14

Status: Open

Subfield can't be removed when updating Marc bib upon import

Subfield cannot be removed when updating "MARC Bib" upon import when the field mapping profile has rules allowing update of several subfields in all fields (including the subfield which is being added)


Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be Update MARC Bib

• "Field mapping profile" should have following rules specified:

Update specific fields only

Field: *
In.1: *
In.2: *
Subfield: 1.

Field: *
In.1: *
In.2: *
Subfield: 2.

No Field protection overrides

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile (it has subfields "$1", "$2" in two fields)
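The match profile above selects the value of 999 ff $s from the incoming record. A minimal sketch of that selection, assuming simple (tag, ind1, ind2, subfields) tuples rather than the actual SRS data model; the function name is illustrative:

```python
def match_value_999ffs(fields):
    """Return the $s value of the 999 field with both indicators 'f', if any."""
    for tag, ind1, ind2, subfields in fields:
        if tag == "999" and ind1 == "f" and ind2 == "f":
            for code, value in subfields:
                if code == "s":
                    return value
    return None
```

Data Import then uses this value to locate the existing SRS record that the field mapping rules are applied to.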

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions (For preconditions, use MARC_bib_Black_panther_five_nonRepeatable_linkable_fields.mrc file attached to this Jira, and import using the default Create MARC Bib and Instance job profile)

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. In 130 field: delete "$2" subfield and update "$1" subfield value (e.g., change "$1" value to "new")

  4. In 240 field: delete "$2" subfield and add one more "$1" subfield (e.g., add "$1onemore")

  5. Save updated file

  6. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)" job profile created in Preconditions

  7. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$2" removed from both edited fields. The following "$1" subfield values are shown in the fields:

  •  "130" field: "$1new"

  • "240" field: "$1original $1onemore"

Actual Results: "$2" subfields are not removed from edited fields. "$1" subfields updated/added as expected

2023-06-01

Jira: MODDICORE-353

15

Status: Open

Fields duplicated when adding one subfield when updating Marc bib upon import

Fields duplicated when adding one subfield when updating "MARC Bib" upon import when the field mapping profile has rules allowing update of several subfields in all fields (including the subfield which is being added)


Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be: Update MARC Bibs

• "Field mapping profile" should have following rules specified:

Update specific fields:

Field: *
In.1: *
In.2: *
Subfield: 1.

Field: *
In.1: *
In.2: *
Subfield: 2.

No overrides to MARC field protection

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions 

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. Add "$1" subfield values to fields with no "$1", e.g.:

    1. add "$1555555" to "035" field

    2. add "$1test" to "264" field

  4. Save updated file

  5. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)" job profile created in Preconditions

  6. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$1" added to edited "035", "264" fields. Fields were not duplicated 

Actual Results: "035", "264" fields were duplicated. First copy of each remains unchanged (no added subfield), while second copies contain added "$1" subfield

2023-06-01

Jira: MODDICORE-352

16

Status: Open

Fields duplicated when adding several subfields when updating marc bib upon import

Fields duplicated when adding several subfields when updating "MARC Bib" upon import when the field mapping profile has rules allowing update of corresponding subfields in corresponding fields


Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (240, 600)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be: Update MARC Bibs

• "Field mapping profile" should have following rules specified:

Update specific MARC fields

Field: 240
In.1: *
In.2: *
Subfield: 1.

Field: 240
In.1: *
In.2: *
Subfield: 2.

Field: 600
In.1: *
In.2: *
Subfield: 1.

Field: 600
In.1: *
In.2: *
Subfield: 2.

No overrides to the existing field protections

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions 

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. Add "$1" and "$2" subfields to "240" field (e.g., add "$1test$2testing")

  4. Add "$1" and "$2" subfields to one of the "600" fields (e.g., add "$1test$2testing" to the third "600" field)

  5. Save updated file

  6. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (240, 600)" job profile created in Preconditions

  7. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$1", "$2" added to edited "240", "600" fields. Fields were not duplicated 

Actual Results: "240", "600" fields were duplicated. First copy of each remains unchanged (no added subfields), while second copies contain added "$1", "$2" subfields

2023-06-01

Jira: MODDICORE-351

17

Status: Closed

Job summary: error column does not display errors

When there is any error related to an instance/authority/orders/invoice, the error column does not display it.

2024-01-25

Jira: UIDATIMP-1590

18

Status: Closed

Field mapping profiles: state of the final form fields is not set

When switching between FOLIO record types, fields with the same name do not reset their state (value, dirty, etc.), although the field values are equal to the initial values.

Current workaround: start over/refresh page

2021-06-30

Jira: UIDATIMP-947

19

Status: Closed

Incorrect quantity is displayed in the cell of no action and error rows at the individual import job's log

'1' is displayed in the Instance cell of the 'No action' and 'Error' rows in the 'Summary' table of the individual import job's log.

2024-01-19

Jira: MODSOURMAN-1117

20

Status: Closed

The status of srs marc is created after match+modify action

Expected Results: The status of SRS MARC is 'Updated' in the Import log after uploading MARC file for update.

2023-03-07

Jira: MODDATAIMP-1008

21

Status: Closed

PMSystem displayed as source in quickmarc view when record was created by non matches action of job profile

"PMSystem" displayed as source (instead of User's last and first name) in "Edit MARC authority record" view when record was created by "Non-matches" action of job profile.

2023-03-07

Jira: MODSOURCE-608

22

Status: Open

Duplicate field is added when updating $0 in linked marc bib field upon data import if field mapping profile allows $0 update

Duplicate field is added when updating "$0" in linked "MARC bib" field upon data import if field mapping profile specifically allows "$0" update

2023-02-15

Jira: MODSOURCE-594

23

Status: Closed

Single record overlay creates duplicate oclc #/035

When "Overlay source bibliographic record" is employed for the first time, duplicate 035 fields are created.

2023-04-12

Status: Closed

Adding MARC modifications to imports with update actions creates broken records

Overview: 


Steps to Reproduce:

  1. Log into Morning Glory bugfest

  2. Go to inventory and select import using the "OCLC with MARC modifications" option and import OCLC number 31934425

  3. The record should import successfully and the title will have received an obvious MARC modification

  4. Attempt to overlay the same record with the same OCLC number.

Expected Results:

Overlaying the record works. The MARC is modified as described in the profile and the instance and SRS are updated.

Actual Results:

The instance is not updated. A modified SRS record is created but still has the original OCLC 001 and 003. QuickMARC will not work on the record.

Additional Information:
URL:  Update profile: https://bugfest-mg.int.aws.folio.org/settings/data-import/job-profiles/view/1c351fa4-d578-434f-a02a-7ff46af16f06?query=single&sort=name

An example of an instance with this issue: https://bugfest-mg.int.aws.folio.org/inventory/view/53e28701-dccc-49be-a01d-9adaa15f4cb6?query=neuromancer&sort=title&xidtype=0dd718cf-a09a-4f1c-be6a-0cf0de58b424

Job profiles

BE Notes:

  • First Modify action saves the record in SRS, subsequent Match on MARC Bib in SRS returns multiple match error, because it finds not only the original record, but also the one that was just saved.

  • During modify action post-processing, an attempt is made to change the instance HRID because the incoming record does not contain the actual instance HRID in the 001 field, which causes an error on instance update.

Note for QAs:
When this bug is fixed, create a new TestRail for modification, followed by match, followed by update action, as outlined in the repro steps above. Then unlink this Jira from TestRail C350914

2022-08-04

Jenn Colt 

All

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-710

5

Status
colourGreen
titleCLOSED

Adding MARC modifications to single record overlay doesn't respect field protections

Overview: When a MARC modification action is added to the end of the single record overlay job, protected fields from the existing MARC SRS are removed rather than from the incoming file.

Expand

Steps to Reproduce:

  1. Log into some FOLIO snapshot environment 

  2. Go to Settings/Data Import

  3. Duplicate the field mappings for Inventory Single Record - Default for the update marc bib and update instance

  4. Create a new field mapping for a marc modification where the fields 029, 506, 856, 583 are deleted

  5. Create action profiles linked to the field mappings you just created

  6. Duplicate the inventory single record match for no srs and existing srs

  7. In field protections, make sure that you add 583 with subfield 5 and data MU in addition to 856 with subfield 5 and data MU.

  8. Create a job with a match to existing marc srs. On matches update marc srs. On no matches, another match to no srs and then under that update instance. Then at the level of the 1st match, add the modify marc action (see screenshot)

  9. Bring in a record and add a 583 with a $5 MU and a 856 $5 MU

  10. Overlay

Expected Results: The existing and protected fields should remain in the record.

Actual Results: The protected fields have been removed.

Additional Information:
URL: The job profile on orchid bugfest is https://bugfest-orchid.int.aws.folio.org/settings/data-import/job-profiles/view/b590fc78-f069-42a1-bfdd-3988c7d6be00?query=fc%20test&sort=name. 

2023-08-23

Jennifer Eustis 

All

2024-2-21 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-897

Functionality audit being prepared in DI lab

6

Status
colourGreen
titleCLOSED

Single record overlay creates duplicate OCLC # / 035

Overview: When "Overlay source bibliographic record" is employed for the first time, duplicate 035 fields are created.

Expand

Steps to Reproduce:

  1. Log into bugfest-Orchid 

  2. Open Inventory and search for an instance (random records were chosen and the behavior was duplicated each time) 

  3. From Actions, choose 'Overlay source bibliographic record':

    1. External target = OCLC WorldCat

    2. Profile = Inventory single record - default update instance (Default)

    3. Enter the OCLC# of the instance currently being viewed

  4. Click 'Import'

Expected Results: The bibliographic record and instance are updated with the latest OCLC version.
Actual Results: The bibliographic record and instance are updated but with duplicate 035 fields for those OCLC#s with a prefix.  
Additional Information: Testing shows that this happens on the initial overlay, but subsequent overlays do not continue to 'add' duplicates.  

Records tested in bugfest-orchid: in523951 and in2486915 (screenshots of before overlay and after are attached) 

Duplicate data causes issues with integrations and other functions that rely on the OCLC# as a match point.  
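The deduplication this fix requires can be sketched as follows — a minimal, hypothetical normalization of 035 values, assuming OCLC numbers may carry an "(OCoLC)" qualifier and an ocm/ocn/on prefix (this is illustrative only, not the actual mod-di-core code):

```python
import re

def dedupe_035(values):
    """Collapse 035 values that refer to the same OCLC number, e.g.
    '(OCoLC)ocm31934425' and '(OCoLC)31934425'. Illustrative only."""
    seen, kept = set(), []
    for v in values:
        key = re.sub(r"^\(OCoLC\)", "", v)       # drop the qualifier
        key = re.sub(r"^(ocm|ocn|on)", "", key)  # drop an ocm/ocn/on prefix
        key = key.lstrip("0")                    # ignore leading zeros
        if key not in seen:
            seen.add(key)
            kept.append(v)
    return kept

print(dedupe_035(["(OCoLC)ocm31934425", "(OCoLC)31934425"]))
# ['(OCoLC)ocm31934425']
```

With this kind of normalization, the prefixed and unprefixed forms of the same OCLC number would no longer produce two 035 fields after overlay.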

2023-04-12

Corrie Hutchinson (Unlicensed) 

All

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICORE-339

7

Status
titleBlocked
 

The number of created invoices is displayed when all invoices have errors with invoice lines

Overview:

Expand

The file has 18 invoices and 1104 invoice lines.

Preconditions:

  1. Admin user is logged in.

Steps to Reproduce:

  1. Duplicate the "Default – GOBI monograph invoice" profile.

  2. Update the following fields in the copied profile:

  • Name: any unique name

  • Incoming record type: EDIFACT Invoice

  • FOLIO record type: Invoice

  • Description: clean-up the field

  • Details block:
    1) Batch group: any option from the dropdown list
    2) Vendor name: use Organization lookup to find and select GOBI Library Solutions (which will also automatically fill in the Accounting code)
    3) Payment method: any option from the dropdown list.

       3. Create a new "Action profile" with a unique valid name and the following properties:

  • Action: Create

  • FOLIO record type: Invoice

  • Link the field mapping profile from step#2.

       4. Create a new "Job profile" with a unique valid name and the following properties:

  • Accepted data type: EDIFACT

  • Link an action profile from step#3.

       5. Upload a valid EDIFACT file using Job profile from the previous step.

       6. Wait till the file is uploaded.

       7. Check log UI and summary to see the record with the upload result.

       8. Pay attention to the 'Invoice' column in the 'Created' row.

Expected Results: '0' is displayed in the 'Invoice' column of the 'Created' row in the 'Summary' table in the logs.
Actual Results: '18' is displayed in the 'Invoice' column of the 'Created' row in the 'Summary' table in the logs.

NOTE: Recreated on Poppy Bugfest:

2023-11-23

Tetiana Paranich

Dec 13 2023 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURMAN-1094

8

Status
colourBlue
titleOpen

Investigate deleting old versions of records from SRS, SPIKE

When SRS records are updated, the previous version is marked as old (and the newest version is marked as actual), but the older versions are not deleted. Over time, many, many previous versions of records will build up in SRS and potentially affect performance.

Expand

If we wanted to remove the old records, how complicated would that be, and what might we need to take into consideration?

KS: there is also a lot of "trash" data saved to SRS as a result of failed or stopped imports (records linked to a snapshot/jobExecution that is Cancelled, or records that don't have 999 ff $i UUIDs) - consider ways to clean up that data as well.

  • Developers

    • If a UI import log related to the previous version of a record has not yet been deleted, what would happen if an SRS record related to that log were deleted? Would it break the log? Can we prevent SRS records from being deleted if they are still connected to an import log in the UI?

    • Should we plan to keep the current and most recent previous? (would that be helpful for when we implement the rollback feature?)

    • Would there be any issues related to the various UUIDs assigned during imports?

    • Would there be any issues related to

      • quickMARC updates?

      • LDP data extracts?

      • Data export

      • OAI-PMH

    • OK for it to apply to all SRS records? all 3 MARC types, EDIFACT invoices?

    • How often to run the cleanup? Make it variable in the MOD or UI settings?

    • How much effort would this be? T-shirt sizes for UI and BE

  • SMEs

    • Would this be helpful? Do we have any way to measure or estimate the impact on SRS/Import performance before implementing?

    • Any requirements?

    • Any questions or concerns?

Results of this spike

  • Wiki page with design

  • All required Jira stories

  • T-shirt sizes for UI and MOD

  • Decide if this is a separate feature or just a couple of stories
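The retention question above ("keep the current and most recent previous?") can be pictured as a simple selection rule. This is an illustrative model only, assuming each stored version carries a state (ACTUAL/OLD) and an update timestamp — it is not the mod-source-record-storage data model:

```python
from datetime import datetime

def select_versions_to_delete(versions, keep_previous=1):
    """Return versions eligible for cleanup: the ACTUAL version is always
    kept, as are the `keep_previous` most recent OLD versions.
    Hypothetical sketch of the retention policy under discussion."""
    old = sorted((v for v in versions if v["state"] == "OLD"),
                 key=lambda v: v["updated"], reverse=True)
    return old[keep_previous:]

versions = [
    {"id": "v3", "state": "ACTUAL", "updated": datetime(2022, 8, 1)},
    {"id": "v2", "state": "OLD", "updated": datetime(2022, 7, 1)},
    {"id": "v1", "state": "OLD", "updated": datetime(2022, 6, 1)},
]
print([v["id"] for v in select_versions_to_delete(versions)])  # ['v1']
```

With `keep_previous=1`, only versions older than the most recent OLD copy are candidates for deletion, which would preserve enough history for a future rollback feature.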

2022-08-16

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURMAN-857

9

Status
colourGreen
titleCLOSED

Not able to use the system generated match profiles

I am attempting to create a new job profile for single record import to get rid of the junk fields. I was able to do this a while back on our test tenant and today I was finally able to get around to creating it on our production tenant. 

Expand

However, when I went to add the Match profile (Inventory Single Record - Default match for existing SRS record) which is a system generated profile, it did not appear as one of the options. When I look at the Action profiles, I am able to choose system generated ones, so the problem is only with the Match profiles. We are on Orchid-SP-5. I was also able to recreate the issue on Snapshot. (I don't have Bugfest access so I did not try that.)

Steps to Reproduce:

  1. Log into FOLIO Snapshot

  2. Go to Settings–>Data Import–>Job Profiles–>Actions–>New Job Profile

  3. Under Overview, click on the plus sign

  4. Choose Match

Expected Results: A list of all Match Profiles (both system provided and locally created) is shown

Actual Results: Only locally created Match Profiles are shown (if there are none, you get "no results found")

Additional Information:
When I did my original creation of the job profiles in May, our test tenant would have been on Orchid-SP-3 or 4, so this seems to be something that was introduced in one of the patches since then.


28

Status
colourGreen
titleCLOSED

Status descending sort on Data Import view all page not working

In Honeysuckle Bugfest, on the Data Import View all, the status sort ascending works, but not descending

2020-12-09

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUIDATIMP-798

29

Status
colourGreen
titleCLOSED

MARC holdings update log has additional empty row

MARC Holdings update log has additional empty row. If two "MARC Holdings" records are updated by one job, then 2 additional empty rows will be displayed.

2023-05-01

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURMAN-983

30

Status
colourGreen
titleCLOSED

Data Import field mapping profile is saved with data deleted from the system

The user can save a mapping profile with data that has been deleted from the system.

2022-08-19

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICONV-290

31

Status
colourGreen
titleclosed

Alert modal with error message is displayed on page after entering '##*' characters and clicking on search button

see steps in JIRA

2022-06-10

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICONV-250

32

Status
colourGreen
titleclosed

DI Log: Title missing but status reads updated

see JIRA

2023-10-24

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURMAN-1069

33

Status
colourBlue
titleOpen

Asynchronous migration is not completed

The asynchronous migration script was run but migration has not been completed, the migration job is still IN_PROGRESS.

2023-06-04

Status
colourGreen
titleCLOSED

Invoice level adjustments do not work

When loading an EDIFACT invoice using a field mapping profile with invoice-level adjustments, the adjustments produce an error.

2021-03-29

Kimberly Pamplin 

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICORE-125

25

Status
colourGreen
titleCLOSED

Invoice line level adjustments don't work

When loading an EDIFACT invoice using a field mapping profile with invoice line-level adjustments, the adjustments produce an error.

2021-03-29

Kimberly Pamplin 

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICORE-124

26

Status
colourGreen
titleCLOSED

Data import incorrectly maps Resource type for no display constant generated

See steps in JIRA

2023-04-19

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-804

27

Status
colourBlue
titleOpen

Incorrect behavior of "Delete Files" button

Note: Does not always reproduce

The "Delete files" request deletes the file, but the deletion does not always show in the UI.
The delete request returns "Cannot delete uploadDefinition 09ef7415-34e4-44cd-9af3-31953df9f200 - linked files are already being processed".

2022-06-02

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-691

Open

Field is shown after being removed via data import when field mapping profile has rule allowing updates for this field

Field is shown after being removed via data import (when field mapping profile has rule allowing updates for this field)

Expand

Preconditions:

  • The "830" MARC field with "$a" value "Cambridge tracts in mathematics" of imported "Instance" record must be linked to the "100" field of imported "MARC Authority" record ("Cambridge tracts in mathematics and mathematical physics").

  • Authorized user with the following permissions:
      Data import: Can upload files, import, and view logs
      Inventory: All permissions
      MARC Authority: View MARC authority record
      quickMARC: Can Link/unlink authority records to bib records
      quickMARC: View, edit MARC bibliographic record
      UI: Data export module is enabled

  • User is on "Inventory" pane with search results for "Instance" record which was linked with "MARC Authority" (see Precondition, e.g.: "The algebraic theory of modular systems / by F.S. Macaulay.").

Steps to Reproduce:

  1. Check the checkbox that is displayed next to the "Instance" record which is linked to "MARC Authority" records on the search result pane.
    For example: "The algebraic theory of modular systems / by F.S. Macaulay.".

  2. Click on the "Actions" button on the second pane and select "Export Instances (MARC)" option.

  3. Go to "Data export" app.

  4. Download exported ".mrc" file by clicking on the "File name" column value.

  5. Open downloaded ".mrc" file via "MarcEdit" (or any similar app).

  6. Delete the linked field (see preconditions): "830" field

  7. Save the edited ".mrc" file.

  8. Go to the "Data import" app >> Click on the "or select files" button >> Select the updated ".mrc" file (see previous step) >> click the "Open" button

  9. Click on the "Update MARC Bib by matching 999 ff $s subfield value (830 - update all)" job profile created in the preconditions >> Click on "Actions" in the appeared third pane >> Select the "Run" option >> Click on the "Run" button in the appeared modal.

  10. Find updated record in "Inventory" app

  11. Click on the "Actions" button and select "Edit MARC bibliographic record" option from the expanded menu.

Expected Results: Deleted "830" field (see step 6) is not shown.

Actual Results: Deleted "830" field (see step 6) is shown and has divided boxes (see attached screencast).

2023-03-09

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURCE-665

34

2023-02-20

Status
colourGreen
titleCLOSED

RRT - Invoices don't display fund codes

Institution specific - MI State Univ./ Library of Michigan

2024-01-04

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODINVOSTO-173

35

Status
colourBlue
titleOpen

Review and fix MARC updates for individual fields

Currently (as of Orchid), the Data Import MARC Updates for specific fields do not handle repeatable fields properly. The logic needs updating, and UI may need updating to indicate how incoming repeatable MARC fields should be handled vis-a-vis the same repeatable field(s) in the existing SRS MARC Bib. This is similar to how the field protection logic needed updating to handle repeatable vs non-repeatable fields properly.
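One way to picture the repeatable-field problem: when several occurrences of the same tag arrive, the update should act on the whole group of occurrences, not pair them up one by one. A hypothetical sketch of "replace all occurrences" semantics (not the actual mod-di-core logic; fields are simplified to (tag, value) pairs):

```python
def update_marc_field(existing, incoming, tag, repeatable=True):
    """Replace every occurrence of `tag` in the existing record with the
    occurrences from the incoming record. For a non-repeatable tag only
    the first incoming occurrence is kept. Illustrative sketch only."""
    kept = [f for f in existing if f[0] != tag]
    replacements = [f for f in incoming if f[0] == tag]
    if not repeatable:
        replacements = replacements[:1]
    return kept + replacements

existing = [("245", "Old title"), ("856", "http://old-1"), ("856", "http://old-2")]
incoming = [("856", "http://new-1")]
print(update_marc_field(existing, incoming, "856"))
# [('245', 'Old title'), ('856', 'http://new-1')]
```

Treating the occurrence group as the unit of update mirrors how the field protection logic had to distinguish repeatable from non-repeatable fields.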

Blocked

match on 035$a with qualifier fails

When updating an SRS record using a match on the 035$a with a qualifier on the incoming MARC record, the match fails.

Expand

Steps to Reproduce:

  1. Log into bugfest-poppy.

  2. Open Data Import and look at Job 10638.

    1. Job Profile = CAH Update SRS MARC on 035$a match w/ qualifier

      1. This profile matches MARC to MARC on the incoming 035$a with a qualifier exactly matching the 035$a of the existing record : (MiFhGG)galncbln000092 -> galncbln000092.

      2. For matches, the record is updated overriding the protection on the 856. 

    2. Sample test records = 51356_test_records.mrc

Expected Results: The job matches the incoming records to the SRS records associated with in10783235 and in10783236 and updates these records with the new 856 in the incoming MARC record.

Actual Results: The incoming records are not matched, the log for SRS MARC says 'No action', and the SRS record is not updated.  

Additional Information: I tested the Field Mapping Profile without the 856 protection and it still failed (Job 10641).  

When reviewing logs on an internal system, the error messages given note that a match is not found.

Additional testing was done on changing the 'Match criterion' of the existing record and no value was found to make the Match profile successful.  

Original testing done in an Orchid environment.
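The qualifier behavior described above — "(MiFhGG)galncbln000092 -> galncbln000092" — amounts to stripping a parenthesized prefix from the incoming 035 $a before comparing it with the existing record's 035 $a. A minimal sketch of that intended match semantics (illustrative only, not the FOLIO matcher code):

```python
import re

def strip_035_qualifier(value):
    """Remove a leading parenthesized qualifier such as '(MiFhGG)' from an
    035 $a value, leaving the bare identifier used for matching.
    Hypothetical sketch of the qualifier option in a Match profile."""
    return re.sub(r"^\([^)]*\)", "", value).strip()

print(strip_035_qualifier("(MiFhGG)galncbln000092"))  # galncbln000092
```

If the matcher applied this normalization to the incoming value, the exact-match comparison against the existing 035 $a would succeed as the Match profile intends.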

2023-11-15

Yael Hod 

Corrie Hutchinson (Unlicensed) 

2024-1-31 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUXPROD-4080

36


Ryan to review Jira with Folijet leads to understand current design and identify requirement gaps


Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-984

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURMAN-1106

38

Status
titleBLOCKED

Partial matching doesn't work

Partial matching, e.g. begins with, ends with, is required but it does not function as it should. Only exact matching seems to work.

2021-01-25

Yael Hod (Stanford)
Corrie Hutchinson (Unlicensed) Chicago

2024-1-31 Data Import Subgroup meeting

Status
colourGreen
titleclosed

Job profile with POL/VRN match cascade does not finish properly

Some Inventory records (Instances/Holdings/Items) get created when orders are opened, depending on the Inventory setting in the POL. During testing of POL/VRN matching, I noticed that some holdings being updated by importing MARC Bibs were having their source changed from FOLIO to MARC. We need to ensure this DOES NOT happen.

2022-05-25

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODINV-709

37

Status
colourGreen
titleCLOSED

Poppy data import log does not include links to records for SRS updates.

For SRS updates with no instance update, the changes to the import logs have resulted in no record links in the log. Previously, the log indicated that the instance was updated and included a link to the instance. Now, the log indicates that the SRS record is updated, but there is no link; presumably because there is no SRS record view per se in FOLIO. Can we revisit the decision not to display the instance update status with a link? Or add a link to the instance to the 'SRS updated' status? PPT from Data Import subgroup work:

Widget Connector
urlhttps://docs.google.com/presentation/d/13nOybznTtLSrKsycvMxzIFnPxEuFRuV5gpXXZBdF_3A/edit#slide=id.p2

From this spreadsheet it appears that the instance should also be updated and provide the record link:

Lref gdrive file
urlhttps://docs.google.com/spreadsheets/d/1QVbRt0b-icn-KxzZiuF60tGZv80m1hnUGqOuwEXRo7U/edit#gid=1406822162
.

Open

Subfield can't be removed when updating Marc bib upon import

Subfield cannot be removed when updating "MARC Bib" upon import when the field mapping profile has rules allowing update of several subfields in all fields (including the subfield which is being removed)

Expand

Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be Update MARC Bib

• "Field mapping profile" should have following rules specified:

Update specific fields only

Field: *
In.1: *
In.2: *
Subfield: 1.

Field: *
In.1: *
In.2: *
Subfield: 2.

No Field protection overrides

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile (it has subfields "$1", "$2" in two fields)

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions (For preconditions, use MARC_bib_Black_panther_five_nonRepeatable_linkable_fields.mrc file attached to this Jira, and import using the default Create MARC Bib and Instance job profile)

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. In 130 field: delete "$2" subfield and update "$1" subfield value (e.g., change "$1" value to "new")

  4. In 240 field: delete "$2" subfield and add one more "$1" subfield (e.g., add "$1onemore")

  5. Save updated file

  6. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)" job profile created in Preconditions

  7. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$2" removed from both edited fields. Following "$1"subfield values are shown in fields:

  •  "130" field: "$1new"

  • "240" field: "$1original $1onemore"

Actual Results: "$2" subfields are not removed from edited fields. "$1" subfields updated/added as expected

2023-06-01

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICORE-386

  •  Review Jira with Folijet leads to understand current design and identify requirement gaps.
39

353

7

2024-7-10 Data Import Subgroup meeting

Status
colourGreen
titleCLOSED

RRT, 5C match bug

Problem of match that didn't match

2023-09-28

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODINV-882

40

Status
colourGreen
titleCLOSED

Duplicate records in incoming file causes problems after overlay process with no error reported

When overlaying instance records, if the incoming file has duplicate records and therefore multiple incoming matches for one match in FOLIO, the record that was overlaid in FOLIO cannot be opened using quickMARC. Note: In this scenario the incoming file has the duplicate records and therefore duplicate match points. FOLIO does not have duplicates.

2021-12-15

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURCE-530

41

Status
colourYellow
titlein progress

Add new subfields to Electronic access (856)

New subfields in the MARC 856 field need to be represented in Inventory data. The same elements should appear in the electronic access block in Instance, Holdings, and Item records. https://www.loc.gov/marc/bibliographic/bd856.html

2023-09-14

Open

Fields duplicated when adding one subfield when updating Marc bib upon import

Fields duplicated when adding one subfield when updating "MARC Bib" upon import when the field mapping profile has rules allowing update of several subfields in all fields (including the subfield which is being added)

Expand

Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be: Update MARC Bibs

• "Field mapping profile" should have following rules specified:

Update specific fields:

Field: *
In.1: *
In.2: *
Subfield: 1.

Field: *
In.1: *
In.2: *
Subfield: 2.

No overrides to MARC field protection

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions 

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. Add "$1" subfield values to fields with no "$1" (e.g.:

    1. add "$1555555" to "035" field

    2. add "$1test" to "264" field

  4. Save updated file

  5. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (subfields 1, 2)" job profile created in Preconditions

  6. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$1" added to edited "035", "264" fields. Fields were not duplicated 

Actual Results: "035", "264" fields were duplicated. First copy of each remains unchanged (no added subfield), while second copies contain added "$1" subfield

2023-06-01

2024-01-31

2024-2-28 Data Import Subgroup meeting

2024-7-10 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-1026

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICONV-388

  •  Developers will look into this
43

Status
colourGreen
titleClosed

There needs to be a warning or error to stop the job when a job contains no action profiles.

If a job doesn't have any actions, nothing happens and there is a risk that the records are corrupted.

2021-02-08

Sara Colglazier

Jenn Colt

Christie Thomas

2024-2-28 Data Import Subgroup meeting

2024-7-10 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICONV-373

  •  Ryan will ask what the behavior is for when actions are missing and you try to run a job or if you can edit a job with no actions.

Behavior now in FOLIO snapshot

44

Status
colourGreen
titleCLOSED

When importing EDIFACT files, invoices lines aren't in order

When importing an Edifact file, the invoices lines aren't in order.

2024-02-06

Corrie Hutchinson, Jennifer Eustis

Kimberly Pamplin

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUXPROD-4467

42

Status
colourGreen
titleCLOSED

Enforce an order to deletion for Data Import profiles

There needs to be an enforced order of deletion for Data Import profiles to prevent this, or there should be a confirmation window that lets you delete associated match/action profiles along with the job profile if they aren't in use in another job profile

Expand

From 2024-01-31 chat: Technically, not true Lynne. I accidentally deleted a job profile without unlinking stuff first (which I was not stopped from doing). I was then not allowed to delete the associated profiles because they needed to be unlinked first, which I could not do because I could no longer get to the job profile to unlink them.

MODDICORE-352

8

Status
colourBlue
titleOpen

Fields duplicated when adding several subfields when updating marc bib upon import

Fields duplicated when adding several subfields when updating "MARC Bib" upon import when  field mapping profile has rules allowing update of corresponding subfields in corresponding fields

Expand

Preconditions:

• The job profile should be created for "Data import" app: "Update MARC Bib records by matching 999 ff $s subfield value (240, 600)"

• "Match profile" should have following specified:

Field: 999
Ind 1: f
Ind 2: f
Subfield: s

  • Action profile should be: Update MARC Bibs

• "Field mapping profile" should have following rules specified:

Update specific MARC fields

Field: 240
In.1: *
In.2: *
Subfield: 1.

Field: 240
In.1: *
In.2: *
Subfield: 2.

Field: 600
In.1: *
In.2: *
Subfield: 1.

Field: 600
In.1: *
In.2: *
Subfield: 2.

No overrides to the existing field protections

• Attached "MARC Bib" record should be imported to the system via "Default - Create instance and SRS MARC Bib" job profile

Steps to Reproduce:

  1. Export previously imported "MARC Bib" record from Preconditions 

  2. Open exported .mrc file in "MarcEdit" or similar editor

  3. Add "$1" and "$2" subfields to "240" field (e.g., add "$1test$2testing")

  4. Add "$1" and "$2" subfields to one of the "600" fields (e.g., add "$1test$2testing" to the third "600" field)

  5. Save updated file

  6. Import updated file using "Update MARC Bib records by matching 999 ff $s subfield value (240, 600)" job profile created in Preconditions

  7. Open updated "MARC Bib" record for edit ("Actions" → "Edit MARC bibliographic record") 

Expected Results: "$1", "$2" added to edited "240", "600" fields. Fields were not duplicated 

Actual Results: "240", "600" fields were duplicated. First copy of each remains unchanged (no added subfields), while second copies contain added "$1", "$2" subfields

2023-06-01

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICORE-351

9

Status
colourBlue
titleOpen

Duplicate field is added when updating $0 in linked marc bib field upon data import if field mapping profile allows $0 update

Duplicate field is added when updating "$0" in linked "MARC bib" field upon data import if field mapping profile specifically allows "$0" update

2023-02-15

Jira Legacy
serverSystem Jira

serverId01505d01-b853-3c2e-90f1-ee9b165564fc

keyMODSOURCE-594

10

Overview: The Poppy release introduced functionality to create multiple holdings and items from a single MARC bib, using data from 9xx fields in the MARC record (see UXPROD-2741: Import of MARC Bibs to create/update multiple holdings and items: BE work, CLOSED).

However, if the holdings or item field mapping profile contains a conditional mapping (e.g. Permanent holdings location = 945$a; else “LOCCODE”), only the first specified 9xx field will be used to create a single holdings/item.

Steps to Reproduce:

  1. Log into Poppy Bugfest as a user with admin permissions.

  2. Create a job profile using the steps here: https://foliotest.testrail.io//index.php?/cases/view/388505

    1. On the holding field mapping profile, update the Permanent location mapping to be 945$h; else "1 hour earlier (1he)"

Case 1:

  1. Use the .mrc file attached to the Testrails and import using the job profile created in Step 2.

Expected Results: Multiple holdings and items are created.

Actual Results: Only a single holdings/item representing the first 945 field is created.

Case 2:

  1. Modify holdings mapping profile to contain Permanent location mapping 945$h

  2. Import attached NoLocationMarcField.mrc file

Expected Results: Job finished with status “Completed with errors“, error log for holdings says that permanent location should be not null

Actual Results: Job finished with status “Completed with errors“, but the error log for holdings shows a raw Throwable stack trace instead of a meaningful message

Additional Information: Removing the ; else “LOCCODE” produced expected results. See my tests by looking at the logs for 12177 (successful creation of multiple holdings/items) and 12180 (only created single holdings/item). The MARC records used for these examples and the job profiles are listed within the log.
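The expected conditional-mapping behavior can be sketched as follows. This is an illustrative model only (field tags, the "1he" default, and the function name are hypothetical): a mapping like Permanent location = 945$h; else "default" should be evaluated per 945 field, producing one holdings/item per field, with the fallback used only when $h is absent.

```python
# Hypothetical sketch: evaluate a conditional location mapping once per
# repeated 945 field, so each 945 yields its own holdings record.
def map_holdings_locations(marc_945_fields, default_location="1he"):
    """Return one permanent-location value per 945 field."""
    locations = []
    for field in marc_945_fields:
        # use 945$h when present; otherwise fall back to the default
        locations.append(field.get("h") or default_location)
    return locations

fields = [{"h": "main"}, {"h": "annex"}, {}]  # three 945 fields in one bib
locs = map_holdings_locations(fields)         # expect three locations
```

The bug report above describes the conditional (`; else`) form short-circuiting after the first 945 field, so only a single holdings/item is created.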

Molly Driscoll

Status
colourGreen
titleCLOSED

Data import job profile will not create multiple holdings/items when conditional mapping is used in field mapping profiles

Incorrect behavior of "Delete Files" button

Note: Does not always reproduce

The "Delete files" request deletes the file, but the deletion does not always show in the UI.
The delete request returns "Cannot delete uploadDefinition 09ef7415-34e4-44cd-9af3-31953df9f200 - linked files are already being processed".

2022-06-02

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-691

11

Status
colourBlue
titleOpen

Asynchronous migration is not completed

The asynchronous migration script was run, but the migration has not completed; the migration job is still IN_PROGRESS.

2023-06-04

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODSOURCE-665

12

Status
colourBlue
titleOpen

Review and fix MARC updates for individual fields

Currently (as of Orchid), the Data Import MARC updates for specific fields do not handle repeatable fields properly. The logic needs updating, and the UI may need updating to indicate how incoming repeatable MARC fields should be handled vis-a-vis the same repeatable field(s) in the existing SRS MARC Bib. This is similar to how the field protection logic needed updating to handle repeatable vs. non-repeatable fields properly.

2023-02-20

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDICORE-394

  •  Request to be added to Poppy CSP 2

UXPROD-4080

13

2024-02-27

Status
colourGreen
titleCLOSED

Item creation using Data Import is missing data

Overview: When importing a MARC bibliographic record with 9xx fields designating order, holdings, and item record data, the enumeration and copy number fields of the item record fail to populate as instructed.

BLOCKED

Partial matching doesn't work

Partial matching (e.g. begins with, ends with) is required, but it does not function as it should; only exact matching seems to work.

2021-01-25

Yael Hod (Stanford)
Corrie Hutchinson (Unlicensed), Christie Thomas (Chicago)

2024-1-31 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-1009

MODDICORE-386

  •  Review Jira with Folijet leads to understand current design and identify requirement gaps.
14

2024-2-28 Data Import Subgroup meeting

Status
colourYellow
titleIn Progress

Ability to change the link to a profile rather than just remove it

Current situation: We are only able to link or unlink profiles (field mapping to action, action to a job, match to a job, etc).

New Feature: We want to be able to change the link rather than just unlink

Expected behavior: There is another option that allows the user to change the link to a different profile.

Use case: The wrong profile was used and the new one needs to be added. Rather than unlinking everything, it'd be easier to just update the link to the correct one.

2024-02-27

Jennifer Eustis

Status
colourYellow
titleIn Progress

Add new subfields to Electronic access (856)

New subfields in the MARC 856 field need to be represented in Inventory data. The same elements should appear in the electronic access block in Instance, Holdings, and Item records. https://www.loc.gov/marc/bibliographic/bd856.html

2023-09-14

2024-7-10 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUXPROD-4935

  •  Ryan will create a ticket. This might involve rethinking the profiles page setup.
4467

15

Overview: When creating orders with an order format of electronic, an instance, and a holdings record from a MARC bibliographic record using Data Import, the quantity in both the ‘Cost details' and ‘Location’ sections of the POL is ‘0', despite the field mapping profile instructing it to be '1'.

A nearly identical profile for order format equal to print does not display this behavior.

Status
colourGreen
titleCLOSED

Quantity = 0 in POL for order format equal to electronic


Corrie Hutchinson (Unlicensed), Christie Thomas

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-1010

Status
colourGreen
titleCLOSED

DI Jobs stall when matching on a holdings and/or item nested under an instance

2024-03-01

Christie Thomas

2024-2-28 Jennifer Eustis

2024-2-28 Data Import Subgroup meeting

2024-7-10 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyMODDATAIMP-1012

  •  CSP 3 Poppy
16

Status
colourBlue
titleOpen

Unable to pull vendor account number from POL when importing EDIFACT invoices

In our previous system the vendor account number lived at the PO/Invoice level. Now it is on the POL/invoice line. I have not found a way when loading EDIFACT invoice files to draw this directly from the POL or to retrieve it from the vendor file. This means for each invoice we must put in all of the vendor account numbers manually, which adds up and is prone to error. If there is a way that data import could pull this value from the POL it would save so much time in our processing.

2024-04-12

Kimberly Pamplin

2024-7-10 Data Import Subgroup meeting

5117

Status
colourGreen
titleCLOSED

Import profile with Instance match to POL and Vendor Reference Number not working

Overview: Vendor records received containing the POL and Vendor Reference Number (VRN) are not matching to the source = FOLIO Instance records that the GOBI API created through the Orders app.

Current workaround: No workaround

Steps to Reproduce:

  1. Log into Orchid bug-fest 

  2. Create brief pending order records using POL numbers and Vendor Reference numbers from MARC file

  3. Use Import job profile: GOBI API eBooks - Full cataloging

  4. Upload records and run job

  5. Vendor Reference Numbers and POL numbers must be unique. If running the import job multiple times, you must change these data points in the MARC records.

Expected Results: Match on the PO Line and then the Vendor Order Reference number before updating the Instance record with the full cataloging record, and the holdings/item with the correct holdings type and permanent location.

Actual Results: Records are all discarded

2023-08-03

Lynne Fors

All

Status
colourYellow
titleIn Progress

Additional values needed for Electronic access fields or 856 subfields

Issue: Right now, only a few subfields from the 856 are mapped. We would like to expand that ability to include the non public note (856$x), access status (856$7) and terms governing access (856$n).

Jennifer Eustis

All

2024-7-10 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUIIN-876

Fix is in Quesnelia

Formerly UIDATIMP-1506

18

Status
colourYellow
titleIn Progress

Ensure consistency of UI for blank indicators between Bulk Edit, Data Export, quickMarc, and Data Import

Issue: Data Import displays blank indicators with a space, quickMarc uses a slash, and so on. To avoid confusion, blank indicators, and the way MARC fields and subfields are mapped, should be handled consistently across Bulk Edit, Data Import, and Data Export.

2024-02

Jennifer Eustis

All

2024-7-10 Data Import Subgroup meeting

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUIIN-2579

  •  Ryan will also look into mapping indicators.
  •  Need to account for all Inventory record types
  •  Need to account for bulk edit, data import/export, ???
  •  Ryan is bringing this topic to Magda and Christine to discuss.
19

Status
colourYellow
titleIN PROGRESS

Reporting: Have the ability to download a list of instance, holdings, or item record identifiers that were successfully imported

Issue: There isn't a way to retrieve a list of identifiers through the Data Import log.

2024-07-11

Jennifer Eustis

All

  •  Ryan will look into making rows 54-56 into one epic with smaller stories.
20

Status
colourYellow
titleIN PROGRESS

Reporting: Have the ability to save a list of successfully imported records to a list in the Lists App

New Functionality. In addition to downloading a list, it would be great to be able to save the imported identifiers to a list in the Lists App

2024-07-11

Jennifer Eustis

All

21

Status
colourYellow
titleIN PROGRESS

Reporting: Have the ability to download a list of errors from an import

Issue: The only way to see errors is to navigate the log and click on a title to see the JSON. Having an export like in Bulk Edit would be helpful.

2024-07-11

Jennifer Eustis

All

Jira Legacy
serverSystem Jira
serverId01505d01-b853-3c2e-90f1-ee9b165564fc
keyUIDATIMP-914


22

Status
colourBlue
titleOPEN

Ability to view application log

When DI was in the planning phase, there was a request to be able to view the application log. Examples were provided from other systems. This is still needed. This was shown as "server logs" in the original wireframes. See

Lref gdrive file
urlhttps://drive.google.com/drive/folders/171ZNr7-HQ9rFRiYNBGn2Skg8OAxM-DxA

2024-08-15

Lab Session

ALL

23

Status
colourBlue
titleOPEN

Ability to update instance and marc srs in same job

Users need to be able to update the administrative data and also override protected fields to update the SRS bib record. Tested in the 2024-10-17 lab session; this didn’t work in Poppy (Chicago test environment).

2024-10-17

Lab session

all

24

Status
colourBlue
titleOPEN

Add date and start/stop running date and times to the summary log page

Right now to see these times, you have to click out of the summary log view and back to the brief log view. Having this information displayed also on the summary page is helpful and needed.

2024-10-17

Lab session

ALL



25

Status
colourBlue
titleOPEN

Data Import log does not provide reason for No Action status.

The reason was previously provided as an error, even though No Action means updates or creates were not performed because of the profile logic (multiple matches, a single match with no create or update action provided, no matches, etc.). The reason for No Action should be provided in the log, because it could be any of a number of scenarios.

2024-10-23

Christie Thomas (University of Chicago)


26

Status
colourBlue
titleOPEN

Update instance, holdings, and item in reverse order.

Right now the instance, holdings, and item must be updated in that order. It is also not possible to update an item independently and then, in the same job, match and update the instance and holdings. When updating all three records as part of a shelf-ready workflow, integrations (FOLIO app and external) require that the barcode be added to the item before the holdings record is updated. We need to be able to match an instance, holdings, and item (in that order) and then update the item, holdings, and instance, or the item, instance, and holdings, in either of those orders.

Error message from import in Poppy: io.vertx.core.json.DecodeException: Failed to decode: Cannot deserialize value of type `java.util.LinkedHashMap<java.lang.Object,java.lang.Object>` from Array value (token `JsonToken.START_ARRAY`) at [Source: (String)"[{"id":"babefda2-17c3-4ff2-a677-f469c1b7bb59","_version":3,"hrid":"13642832","holdingsTypeId":"0c422f92-0f4d-4d32-8cbe-390ebc33a3e5","formerIds":[],"instanceId":"1980ec39-2d53-42d9-839b-d4d080850c76","permanentLocationId":"fad8517a-aae4-5b69-855e-01843e6e4d88","effectiveLocationId":"fad8517a-aae4-5b69-855e-01843e6e4d88","electronicAccess":[],"callNumberTypeId":"95467209-6d7b-468b-94df-0f5d7ad2747d","callNumber":"PL2260.52.B536A5 2019","notes":[],"holdingsStatements":[],"holdingsStatementsForInde"[truncated 371 chars]; line: 1, column: 1]""
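The DecodeException quoted above indicates a shape mismatch: code expecting a single JSON object (deserialized to a map) received a JSON array of holdings objects. A minimal Python illustration of the same mismatch (the payload values are taken from the error text; nothing here is FOLIO code):

```python
# Illustrative only: a JSON array decodes to a list, not a dict, so code
# written for a single record must unwrap the array first.
import json

payload = '[{"id": "babefda2-17c3-4ff2-a677-f469c1b7bb59", "hrid": "13642832"}]'
decoded = json.loads(payload)

# decoded is a list (JSON array), not a dict (JSON object / LinkedHashMap);
# treating it as a dict is the analogue of the DecodeException above.
first_record = decoded[0] if isinstance(decoded, list) else decoded
```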

2024-10-31

Christie Thomas (University of Chicago)

27

Status
colourBlue
titleOPEN

Update the SRS with override field protections enabled and update the instance status and cataloged date in a single job.

It is not possible to pair an SRS update with an instance update in the same job. We need to update the SRS MARC record and the instance record in a single job with a single match (match an instance or an SRS MARC record and specify both an SRS MARC bibliographic update profile and an instance update profile, or create an action that is linked to multiple field mapping profiles).

2024-10-31

Christie Thomas (University of Chicago)

28

Delete holdings and items in batch via data import

When marking an instance for deletion we should be able to also delete all holdings and items attached to the instance or delete holdings and items targeted by identifier.

2024-11-21

Christie Thomas (University of Chicago)