Data Import Implementers Topic Tracker
This page tracks progress on Data Import issues such as bugs, new features, and topics to be discussed. Topics or questions posted in Slack will be added here as well.
Topic Status legend:
Open - To be discussed
Blocked - The group is waiting for further information or some specific action to be completed before progress can continue
In progress - Discussion is ongoing and work is potentially in progress
Closed - Discussion resolved and required actions completed. Closed topics are found on our Archived Data Import Topics page.
How to contribute to other people's discussion topics:
Do not add detail to closed or already-discussed topics, as your comments may be overlooked; it is better to add your details as a new topic and reference the previous one.
To contribute to an existing topic, add a new paragraph to the description column.
@mention yourself at the beginning of the paragraph.
How to indicate you are also interested in a topic:
@mention yourself in the "Interested parties" column and add your institution name
How are topics archived:
When a topic's status is set to Closed by its "Owner", the topic must also be moved to the Data Import Topic Tracker Archive:
Copy the topic and paste it at the top of the Archived topics page that is nested under this page
Delete the topic from this page
Data Import Issues by Status
Data Import Issues by Type
List of Data Import Jira Issues
Topics
The Status column can be sorted to view Open, In progress, Closed, or Blocked topics.
# | Status | Topic | Description/use case | Date Added | Provided By (Name/Institution) | Interested Parties | Has Been Discussed (Link to agenda/minutes) | Jira Link | Action Required
---|---|---|---|---|---|---|---|---|---
1 | Open | Jobs run immediately after canceled jobs take excess time | Overview: jobs started immediately after canceling a job get stuck and do not progress. Expected results: the canceled job stops processing, and data import jobs started after the cancellation behave normally. Actual results: single-record imports started after canceling a job like this are slow; one single-record import after a cancellation took 17 minutes. Additional information: in Nolana, canceled jobs created a large number of error messages that seemed to affect performance. Have the logs been checked to confirm this is no longer happening, or is something else causing this behavior? The attached file and the "cornell ebz" profile can be used to replicate. BE notes (possible solution): | 2023-03-29 | @Jenn Colt | All | | https://folio-org.atlassian.net/browse/MODSOURMAN-970; UXPROD-4704: Stop processing the job after it was canceled by user (Open) | Converted from bug to new feature. Ryan T. to get information on how this affects slicing.
2 | Blocked | The number of created invoices is displayed when all invoices have errors with invoice lines | Overview: the file has 18 invoices and 1104 invoice lines. Steps to reproduce: 3. Create a new "Action profile" with a unique valid name and the specified properties. 4. Create a new "Job profile" with a unique valid name and the specified properties. 5. Upload a valid EDIFACT file using the job profile from the previous step. 6. Wait until the file is uploaded. 7. Check the log UI and summary to see the record with the upload result. 8. Note the 'Invoice' column in the 'Created' row. Expected results: '0' is displayed in the 'Invoice' cell of the 'Created' row in the 'Summary' table in the logs. NOTE: recreated on Poppy Bugfest. | 2023-11-23 | Tetiana Paranich | | | |
3 | Open | Investigate deleting old versions of records from SRS, SPIKE | When SRS records are updated, the previous version is marked as old (and the newest version is marked as actual), but the older versions are never deleted. Over time, many previous versions of records will build up in SRS and potentially affect performance. If we wanted to remove the old records, how complicated would that be, and what would we need to take into consideration? KS: there is also a lot of "trash" data saved to SRS as a result of failed or stopped imports (records linked to a snapshot/jobExecution that is cancelled, or records that don't have 999 ff $i UUIDs); consider ways to clean up that data as well. (A rough sketch of the records such a cleanup would have to consider appears below the table.) Results of this spike: | 2022-08-16 | | | | |
4 | Open | Field is shown after being removed via data import when field mapping profile has rule allowing updates for this field | A field is still shown after being removed via data import when the field mapping profile has a rule allowing updates for this field. Expected results: the deleted "830" field (see step 6) is not shown. Actual results: the deleted "830" field (see step 6) is shown and has divided boxes (see attached screencast). | 2023-03-09 | | | | |
5 | Blocked | Match on 035$a with qualifier fails | When updating an SRS record using a match on the 035$a with a qualifier on the incoming MARC record, the match fails. Expected results: the job matches the incoming records to the SRS records associated with in10783235 and in10783236 and updates those records with the new 856 in the incoming MARC record. Actual results: the incoming records are not matched, the log for SRS MARC says 'No action', and the SRS record is not updated. Additional information: the field mapping profile was also tested without the 856 protection and it still failed (Job 10641). When reviewing logs on an internal system, the error messages note that a match is not found. Additional testing was done on changing the 'Match criterion' of the existing record, and no value was found that makes the match profile succeed. Original testing was done in an Orchid environment. (An illustration of the expected qualifier semantics appears below the table.) | 2023-11-15 | @Yael Hod, @Corrie Hutchinson (Unlicensed) | | | | Ryan to review the Jira with Folijet leads to understand the current design and identify requirement gaps. Partial matching (e.g. begins with, ends with) is required but does not function as it should; only exact matching seems to work.
6 | Open | Subfield can't be removed when updating MARC Bib upon import | A subfield cannot be removed when updating a "MARC Bib" record upon import when the field mapping profile has rules allowing update of several subfields in all fields (including the subfield which is being added). | 2023-06-01 | | | | |
7 | Open | Fields duplicated when adding one subfield when updating MARC Bib upon import | Fields are duplicated when adding one subfield while updating a "MARC Bib" record upon import, when the field mapping profile has rules allowing update of several subfields in all fields (including the subfield which is being added). | 2023-06-01 | | | | |
8 | Open | Fields duplicated when adding several subfields when updating MARC Bib upon import | Fields are duplicated when adding several subfields while updating a "MARC Bib" record upon import, when the field mapping profile has rules allowing update of the corresponding subfields in the corresponding fields. | 2023-06-01 | | | | |
9 | Open | Duplicate field is added when updating $0 in linked MARC bib field upon data import if field mapping profile allows $0 update | A duplicate field is added when updating "$0" in a linked "MARC bib" field upon data import if the field mapping profile specifically allows "$0" updates. | 2023-02-15 | | | | |
10 | Open | Incorrect behavior of "Delete files" button | Note: does not always reproduce. The "Delete files" request deletes the file, but the deletion does not always show in the UI. | 2022-06-02 | | | | |
11 | Open | Asynchronous migration is not completed | The asynchronous migration script was run but the migration has not completed; the migration job is still IN_PROGRESS. | 2023-06-04 | | | | MODSOURCE-665: SPIKE: Investigate cause for asynchronous migration not completing (Open) |
12 | Open | Review and fix MARC updates for individual fields | Currently (as of Orchid), the Data Import MARC updates for specific fields do not handle repeatable fields properly. The logic needs updating, and the UI may need updating to indicate how incoming repeatable MARC fields should be handled vis-a-vis the same repeatable field(s) in the existing SRS MARC Bib. This is similar to how the field protection logic needed updating to handle repeatable vs. non-repeatable fields properly. (A sketch of one possible group-replacement semantics appears below the table.) | 2023-02-20 | | | | UXPROD-4080: Review and fix MARC Updates for individual fields (Draft) |
13 | Blocked | Partial matching doesn't work | Partial matching (e.g. begins with, ends with) is required but does not function as it should; only exact matching seems to work. | 2021-01-25 | @Yael Hod (Stanford) | | | | Review the Jira with Folijet leads to understand the current design and identify requirement gaps.
14 | In progress | Add new subfields to Electronic access (856) | New subfields in the MARC 856 field need to be represented in Inventory data. The same elements should appear in the electronic access block in Instance, Holdings, and Item records. https://www.loc.gov/marc/bibliographic/bd856.html (See the 856 mapping sketch below the table.) | 2023-09-14 | | | | UXPROD-4467: Electronic Access Block--New Elements (Instance, Holdings, Items) (Open) |
15 | In progress | Ability to change the link to a profile rather than just remove it | Current situation: we are only able to link or unlink profiles (field mapping to action, action to job, match to job, etc.). New feature: we want to be able to change the link rather than just unlink it. Expected behavior: another option allows the user to change the link to a different profile. Use case: the wrong profile was used and a new one needs to be added; rather than unlinking everything, it would be easier to update the link to the correct profile. | 2024-02-27 | @Jennifer Eustis | | | | Ryan will create a ticket. This might involve rethinking the profiles page setup.
16 | Open | Unable to pull vendor account number from POL when importing EDIFACT invoices | In our previous system the vendor account number lived at the PO/invoice level; in FOLIO it is on the POL/invoice line. I have not found a way, when loading EDIFACT invoice files, to draw this value directly from the POL or to retrieve it from the vendor record. This means that for each invoice we must enter all of the vendor account numbers manually, which adds up and is prone to error. If data import could pull this value from the POL, it would save a great deal of time in our processing. | 2024-04-12 | @Kimberly Pamplin | | | | Need more information from @Kimberly Pamplin.
17 | In progress | Additional values needed for Electronic access fields or 856 subfields | Issue: right now, only a few subfields from the 856 are mapped. We would like to expand that ability to include the non-public note (856$x), access status (856$7), and terms governing access (856$n). (See the 856 mapping sketch below the table.) | | @Jennifer Eustis | All | | | Ryan will also look into mapping indicators. Need to account for all Inventory record types. Need to account for bulk edit, data import/export, ???
18 | In progress | Ensure consistency of UI for blank indicators between Bulk Edit, Data Export, quickMARC, and Data Import | Issue: Data Import displays blank indicators with a space, quickMARC uses a slash, and so on. To avoid confusion, blank indicators, and the way MARC fields and subfields are mapped in Bulk Edit, Data Import, and Data Export, should be handled consistently. (See the indicator-normalization sketch below the table.) | 2024-02 | @Jennifer Eustis | All | | | Ryan is bringing this topic to Magda and Christine to discuss.
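Illustrative sketches for selected topics

Topic 3 describes how updated SRS records leave earlier generations behind, plus "trash" records from failed or cancelled imports. The minimal Python sketch below only illustrates which records such a cleanup would have to consider; the field names (`state`, `snapshotId`, `has_999_ff_i`, etc.) are simplified assumptions for this example and are not the actual mod-source-record-storage schema.

```python
from typing import Dict, List, Set

def cleanup_candidates(records: List[Dict], cancelled_snapshots: Set[str]) -> List[Dict]:
    """Return records a cleanup job could consider deleting (illustrative only)."""
    candidates = []
    for rec in records:
        # 1. Superseded generations: every version no longer marked ACTUAL.
        if rec["state"] == "OLD":
            candidates.append(rec)
        # 2. "Trash" from failed/stopped imports: records tied to a cancelled
        #    job/snapshot, or records that never received a 999 ff $i UUID.
        elif rec["snapshotId"] in cancelled_snapshots or not rec["has_999_ff_i"]:
            candidates.append(rec)
    return candidates

# Tiny usage example with made-up records.
records = [
    {"id": "r1", "generation": 0, "state": "OLD", "snapshotId": "s1", "has_999_ff_i": True},
    {"id": "r2", "generation": 1, "state": "ACTUAL", "snapshotId": "s2", "has_999_ff_i": True},
    {"id": "r3", "generation": 0, "state": "ACTUAL", "snapshotId": "s3", "has_999_ff_i": False},
]
print([r["id"] for r in cleanup_candidates(records, cancelled_snapshots={"s3"})])
# -> ['r1', 'r3']
```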
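Topics 5 and 13 concern match profiles with qualifiers and partial comparisons. The sketch below shows the expected semantics of applying a qualifier to the incoming 035 $a before comparing it with the existing record's values; it is an illustration of the intended behavior, not the Folijet matching engine, and the parameter names are invented for this example.

```python
def matches_with_qualifier(incoming_035a, existing_035a_values,
                           qualifier_type=None, qualifier_value=None,
                           compare="exactly"):
    """Illustrative matcher: apply a qualifier to the incoming 035 $a,
    then compare against the candidate record's 035 $a values."""
    value = incoming_035a.strip()

    # Qualifier: only attempt the match when the incoming value satisfies it.
    if qualifier_type == "begins_with" and not value.startswith(qualifier_value):
        return False
    if qualifier_type == "ends_with" and not value.endswith(qualifier_value):
        return False
    if qualifier_type == "contains" and qualifier_value not in value:
        return False

    for candidate in existing_035a_values:
        candidate = candidate.strip()
        if compare == "exactly" and value == candidate:
            return True
        if compare == "begins_with" and candidate.startswith(value):
            return True
    return False

# Expected behavior: a qualifier should narrow which incoming records are
# matched, not cause an otherwise identical 035 $a to report "no match".
print(matches_with_qualifier("(OCoLC)1234567", ["(OCoLC)1234567"],
                             qualifier_type="begins_with",
                             qualifier_value="(OCoLC)"))  # -> True
```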
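Topic 12 asks for clearer handling of repeatable fields during MARC field updates. The sketch below illustrates one possible semantics, analogous to field protections: treat all occurrences of a repeatable tag as a group, keep the protected occurrences, and replace the rest with the incoming group. This is an assumed design for discussion, not the shipped Folijet behavior.

```python
def update_field_group(existing_fields, incoming_fields, tag, protected=()):
    """Illustrative update of one repeatable tag (e.g. '650'): keep protected
    occurrences, drop the other existing occurrences, then append every
    incoming occurrence of the tag."""
    kept = [f for f in existing_fields
            if f["tag"] != tag or f["value"] in protected]
    incoming = [f for f in incoming_fields if f["tag"] == tag]
    return kept + incoming

existing = [{"tag": "650", "value": "Old subject A"},
            {"tag": "650", "value": "Locally added subject"},
            {"tag": "245", "value": "Title"}]
incoming = [{"tag": "650", "value": "New subject B"},
            {"tag": "650", "value": "New subject C"}]
# "Old subject A" is replaced; the protected 650 and the 245 are untouched.
print(update_field_group(existing, incoming, "650",
                         protected=("Locally added subject",)))
```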
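Topics 14 and 17 ask for additional 856 subfields in the Inventory electronic access block. The sketch below extends the familiar $u/$y/$z/$3 mapping with the requested subfields; the new property names (`nonPublicNote`, `accessStatus`, `termsGoverningAccess`) are invented for illustration and would need to be defined in the Inventory schema.

```python
def map_856_to_electronic_access(subfields, indicator2=" "):
    """Sketch of mapping a MARC 856 field to an Inventory electronic-access
    entry. The first four properties mirror the existing mapping; the last
    three are the additions requested in topics 14/17 (hypothetical names)."""
    relationship_by_ind2 = {"0": "Resource",
                            "1": "Version of resource",
                            "2": "Related resource",
                            "8": "No display constant generated"}
    return {
        "uri": subfields.get("u"),
        "linkText": subfields.get("y"),
        "materialsSpecification": subfields.get("3"),
        "publicNote": subfields.get("z"),
        # Requested additions (property names are assumptions):
        "nonPublicNote": subfields.get("x"),
        "accessStatus": subfields.get("7"),
        "termsGoverningAccess": subfields.get("n"),
        "relationship": relationship_by_ind2.get(indicator2, "No information provided"),
    }

print(map_856_to_electronic_access(
    {"u": "https://example.org/resource", "y": "Full text",
     "x": "Vendor ticket 123", "7": "0"}, indicator2="0"))
```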
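Topic 18 is about displaying blank indicators consistently across apps. The sketch below normalizes the various stored forms of a blank indicator to a single display convention; the backslash chosen here is only an example of picking one convention, not a decision made by the group.

```python
def display_indicator(raw, convention="backslash"):
    """Normalize a stored MARC indicator for display. A blank indicator may
    arrive as ' ', '', None, '#', or a backslash depending on the app; the
    display convention used here is an example, not an agreed standard."""
    blank_forms = {None, "", " ", "\\", "#"}
    if raw in blank_forms:
        return "\\" if convention == "backslash" else " "
    return raw

for raw in [" ", "\\", "0", "1", None]:
    print(repr(raw), "->", repr(display_indicator(raw)))
```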