Scheduling export of inventory records through an API

Description

In the existing implementation, data export can only be triggered manually. For exports that recur on a regular basis (such as an incremental export of all records added or modified since the last export), the application needs to provide an API so that the export can be triggered by an external custom export script.

This feature covers the backend work to support a scenario in which the library has an export job that needs to run on a regular basis against data identified in a consistent way. Such jobs are mostly run when the exported data is needed for integration with external services, and the file generated by the export might need to be FTP-ed to a specific location.
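For the delivery step, a minimal sketch of pushing a generated export file to an FTP location might look like the following. The host, credentials, directory, and file name are placeholders only; none of them are defined by this feature.

{code:python}
from ftplib import FTP
from pathlib import Path

# Placeholder connection details -- the actual target is institution-specific.
FTP_HOST = "ftp.example.edu"
FTP_USER = "folio-export"
FTP_PASSWORD = "change-me"
REMOTE_DIR = "/incoming/marc"


def push_export_file(local_file: Path) -> None:
    """Upload one generated export file to the configured FTP location."""
    with FTP(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASSWORD)
        ftp.cwd(REMOTE_DIR)
        with local_file.open("rb") as fh:
            ftp.storbinary(f"STOR {local_file.name}", fh)


if __name__ == "__main__":
    push_export_file(Path("export-2020-03-18.mrc"))  # illustrative file name
{code}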

The user should be able to:

  1. schedule when the job needs to run (quarterly, monthly, weekly, daily, at a specified time)

  2. determine if this is a recurring export job

  3. have the files generated by the export stored in the standard location

  4. associate the job with a mapping profile that will determine the required data manipulation

  5. identify the data to be exported either by a CQL query that can take system parameters (for example, the date of the last execution) or by providing a list of UUIDs if static data needs to be exported (see the sketch after this list)
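The job definition format is not specified in this feature. As an illustrative sketch only, an external script might build the two selectors described above, an incremental CQL query parameterized with the last execution date and a static UUID list, roughly like this. The query shape and field names are assumptions, not a defined API.

{code:python}
from datetime import datetime, timezone


def incremental_cql(last_run: datetime) -> str:
    """CQL selecting records added or modified since the last execution (illustrative query shape)."""
    since = last_run.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return f'(metadata.updatedDate>="{since}")'


def static_uuid_selector(uuids: list[str]) -> dict:
    """Selector for a fixed set of records, e.g. loaded from a file of UUIDs."""
    return {"uuids": uuids}  # hypothetical payload shape, not a defined API field


if __name__ == "__main__":
    last_run = datetime(2020, 3, 1, tzinfo=timezone.utc)
    print(incremental_cql(last_run))
    print(static_uuid_selector(["0085f8ed-80ba-435b-8734-d3262aa4fc07"]))  # example UUID
{code}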

Additional information:
Updated workaround has been attached.

Priority

Development Team

Firebird

Assignee

Solution Architect

Parent Field Value

None

Parent Status

None

Attachments

3

Checklist

TestRail: Results

Activity

Magda Zacharska May 23, 2023 at 7:39 PM

Adding LC2 label based on prioritization during the LCAP MM meeting on May 17, 2023. The priority may change depending on the ease of workaround implementation.

Magda Zacharska October 27, 2020 at 8:55 PM

Since this is a backend-only feature, the scheduled jobs will be part of data-export/job-execution, I assume.
There is a feature for cancelling, pausing, and resuming jobs () that would cover cancelling and pausing (skipping) the job. Deleting the jobs will need to be handled by purging the running jobs for now.

Kruthi Vuppala October 27, 2020 at 7:29 PM

Under the scope of this feature,

  • Should there be the ability to view the scheduled jobs?

  • The ability to cancel/delete the scheduled job overall?

  • The ability to skip a single cycle, e.g., on a monthly job, skip just a single month?

Monica Arnold October 22, 2020 at 7:13 PM

Attached is a slightly modified version of the steps needed for automating data export. Note that this document is specific to Goldenrod. Some API endpoints are changing in Honeysuckle.

Magda Zacharska October 7, 2020 at 10:12 PM

Attaching 's notes related to automating data export.

Details

Reporter

Potential Workaround

*Start the export:*
1. POST a file definition to /data-export/fileDefinition (this is the equivalent of uploading UUIDs or a CQL query).
2. Start the export via /data-export/export.

*Retrieve the files generated by the export:*
1. Get the completed jobExecutionId and fileId from /data-export/jobExecution, querying by the completed timestamp.
2. Download the file via /data-export/jobExecutions/{jobExecutionId}/download/{fileId}.
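A minimal sketch of scripting this workaround against Okapi, using the endpoints listed above. The tenant, token, JSON payload fields, query syntax, and response shapes are assumptions, not confirmed API contracts.

{code:python}
import requests

OKAPI_URL = "https://folio.example.edu"  # placeholder Okapi gateway
HEADERS = {
    "x-okapi-tenant": "diku",            # placeholder tenant
    "x-okapi-token": "<token>",          # placeholder auth token
    "Content-Type": "application/json",
}

# 1. POST a file definition (the equivalent of uploading UUIDs or CQL).
file_def = requests.post(
    f"{OKAPI_URL}/data-export/fileDefinition",
    headers=HEADERS,
    json={"fileName": "incremental-export.csv"},  # assumed payload shape
).json()

# 2. Start the export, referencing the file definition.
requests.post(
    f"{OKAPI_URL}/data-export/export",
    headers=HEADERS,
    json={"fileDefinitionId": file_def["id"]},    # assumed payload shape
)

# 3. Find the completed job execution (the workaround queries by the completed timestamp).
executions = requests.get(
    f"{OKAPI_URL}/data-export/jobExecution",
    headers=HEADERS,
    params={"query": 'status=="COMPLETED"'},      # assumed query syntax
).json()
job = executions["jobExecutions"][0]              # assumed response shape
job_id = job["id"]
file_id = job["exportedFiles"][0]["fileId"]       # assumed response shape

# 4. Download the generated file.
download = requests.get(
    f"{OKAPI_URL}/data-export/jobExecutions/{job_id}/download/{file_id}",
    headers=HEADERS,
)
with open("export-result.mrc", "wb") as out:
    out.write(download.content)  # assumes the response body is the file itself
{code}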

PO Rank

90

PO Ranking Note

Needed for automated exports

Front End Estimate

XXL < 30 days

Front End Estimator

Front-End Confidence factor

Low

Back End Estimate

XXXL: 30-45 days

Back End Estimator

Back-End Confidence factor

20%

Release

Umbrellaleaf (R3 2025)

Rank: 5Colleges (Full Jul 2021)

R1

Rank: Cornell (Full Sum 2021)

R4

Rank: GBV (MVP Sum 2020)

R1

Rank: hbz (TBD)

R1

Rank: Grand Valley (Full Sum 2021)

R2

Rank: TAMU (MVP Jan 2021)

R1

Rank: Chicago (MVP Sum 2020)

R1

Rank: MO State (MVP June 2020)

R1

Rank: U of AL (MVP Oct 2020)

R3

Rank: Lehigh (MVP Summer 2020)

R1

TestRail: Cases

TestRail: Runs


Created March 18, 2020 at 2:27 AM
Updated March 25, 2025 at 7:17 PM