The default data import app view opens to a list of logs from previously executed data import jobs.
The columns on the display provide basic information about each job: its status, a summary of the results, and hotlinks to further details. The job summary is available as of the Morning Glory release.
To view a log, click the file name; the log opens in the job summary view.
The summary section appears at the top of the screen. The column names are the record types, and the rows show the status of each record import: created, updated, discarded, or error. This gives you a quick way to see how the job ran.
Below the summary section, each record action that was taken is listed in its own row.
Clicking a record title displays different information depending on the outcome:
- If the record import generated an error or a discard, clicking the title opens a new pane showing the error reported by Data Import.
- If the record import was successful, clicking the title shows an import log with the underlying JSON data for the record that was created or updated.
Logs on the Data Import Landing page and Logs on the Data Import View All page
Note that Inventory Single Record Import jobs are suppressed from the Data Import landing page; use the View all page to see their logs.
Searching and Filtering Import Logs
Data Import has a dedicated interface for viewing import logs. To find past data import logs that are not shown on the front page, go to Actions > View all.
Search options include:
Filter options include:
Interpreting technical errors
| Error | Possible meanings |
|---|---|
| java.lang.IndexOutOfBoundsException: Index 0 out of bounds for length 0 | Problem with the job profile |
Deleting Import Logs via the UI
To delete data import logs, a user needs the "Data import: Can delete import logs" permission.
Common reasons for deleting an import log include cleaning up logs from test imports and removing logs for failed jobs that have since been rerun.
It's important to know that once a log is deleted, it cannot be retrieved.
The log is marked for deletion but not purged instantly; the purge process is managed by your hosting. The default behavior is to purge data import logs every 24 hours, but the value can be set to a different interval as part of configuring mod-source-record-manager.
Deleting Import Logs via API
To delete Data Import logs, use the following endpoint:
| Method | URL | ContentType | Description |
|---|---|---|---|
| DELETE | /change-manager/jobExecutions | application/json | Marks the specified DI log IDs for deletion |
The request body should contain the list of log IDs to delete (see example below):
```json
{
  "ids": [
    "c4908351-f963-4575-9aa5-38e56f83e94e",
    "9ba1fde2-114e-4e63-937f-2e9c4c431787"
  ]
}
```
Example of the request:
```shell
curl --location --request DELETE 'https://okapi-bugfest-mg.int.aws.folio.org/change-manager/jobExecutions' \
  --header 'x-okapi-tenant: fs09000000' \
  --header 'x-okapi-token: eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJmb2xpbyIsInR5cGUiOiJsZWdhY3ktYWNjZXNzIiwidXNlcl9pZCI6IjllYjY3MzAxLTZmNmUtNDY4Zi05YjFhLTYxMzRkYzM5YTY4NCIsImlhdCI6MTY1OTUxMzA3NywidGVuYW50IjoiZnMwOTAwMDAwMCJ9.NJ4o4CQ0TgI6StbWsmvp8iOlHCb_xLLH1H8wse24chM' \
  --header 'content-type: application/json' \
  --data-raw '{ "ids": ["c4908351-f963-4575-9aa5-38e56f83e94a", "9ba1fde2-114e-4e63-937f-2e9c4c431787"] }'
```
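The same request can also be issued from a script. Below is a minimal Python sketch using only the standard library; the Okapi URL, tenant, and token are placeholders, and the request is only constructed and inspected here, not actually sent.

```python
import json
import urllib.request

def build_delete_logs_request(okapi_url: str, tenant: str, token: str, log_ids: list) -> urllib.request.Request:
    """Build a DELETE /change-manager/jobExecutions request that marks the given DI log IDs for deletion."""
    body = json.dumps({"ids": log_ids}).encode("utf-8")
    return urllib.request.Request(
        url=f"{okapi_url}/change-manager/jobExecutions",
        data=body,
        method="DELETE",
        headers={
            "x-okapi-tenant": tenant,
            "x-okapi-token": token,
            "content-type": "application/json",
        },
    )

# Placeholder values; send with urllib.request.urlopen(req) against a real Okapi instance.
req = build_delete_logs_request(
    "https://okapi.example.org",
    "diku",
    "<okapi-token>",
    ["c4908351-f963-4575-9aa5-38e56f83e94e"],
)
print(req.get_method())    # DELETE
print(req.get_full_url())  # https://okapi.example.org/change-manager/jobExecutions
```

This keeps the endpoint, headers, and body shape identical to the curl example while letting you supply the IDs programmatically.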
**Note:** The user performing this operation must have the "change-manager.jobexecutions.delete" permission assigned.
A successful response returns status 200 with information about the logs marked for deletion:
```json
{
  "jobExecutionDetails": [
    { "jobExecutionId": "c4908351-f963-4575-9aa5-38e56f83e94e", "isDeleted": true },
    { "jobExecutionId": "9ba1fde2-114e-4e63-937f-2e9c4c431787", "isDeleted": true }
  ]
}
```
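A client can confirm the outcome by checking the isDeleted flag for each entry in the response. A small sketch, parsing the example response body shown above:

```python
import json

# Example response body from the DELETE call (taken from the sample above).
response_body = """
{ "jobExecutionDetails": [
    { "jobExecutionId": "c4908351-f963-4575-9aa5-38e56f83e94e", "isDeleted": true },
    { "jobExecutionId": "9ba1fde2-114e-4e63-937f-2e9c4c431787", "isDeleted": true } ] }
"""

details = json.loads(response_body)["jobExecutionDetails"]
# Collect any IDs the server did not mark for deletion.
failed = [d["jobExecutionId"] for d in details if not d["isDeleted"]]
print(f"marked for deletion: {len(details) - len(failed)}, failed: {len(failed)}")
# prints: marked for deletion: 2, failed: 0
```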
Please note that DI logs are not deleted from the database instantaneously; instead, they are marked as "DELETED". Hard deletion of DI logs happens periodically, every 24 hours by default, but the frequency can be changed by setting "periodic.job.execution.permanent.delete.interval.ms" for mod-source-record-manager to a different value in milliseconds.
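Since the property value is expressed in milliseconds, it helps to double-check the arithmetic when changing the interval. The 12-hour value below is just an illustrative alternative, not a recommended setting:

```python
# Convert an interval in hours to the milliseconds expected by
# periodic.job.execution.permanent.delete.interval.ms
def hours_to_ms(hours: int) -> int:
    return hours * 60 * 60 * 1000

print(hours_to_ms(24))  # 86400000 (the 24-hour default)
print(hours_to_ms(12))  # 43200000 (an example 12-hour interval)
```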