Market intelligence | Stats
This page collects current stats on ILS data-import limits and performance when records are imported through the user interface.
ILS | Type | Single-tenant vs. multi-tenant | Data import action (imported record type) | Maximum number of records per import job | Maximum file size per import job | Duration (work day / off hours) | Notes/additional factors |
---|---|---|---|---|---|---|---|
Alma | cloud | multi-tenant | MARC, match by system ID | | | 30K per hour (work day) / 120K per hour (off hours) | Ex Libris Alma documentation |
Sierra | on-prem | single-tenant | MARC authority, create new | 1M records (single file) | | 5-6 hours during the work day | Anecdotal |
Sierra | cloud | single-tenant | MARC bibs & items, create new | 30K records (single file) | | 40 minutes | Anecdotal |
Sierra | cloud | multi-tenant | | | | 10,000 bib records in 10-20 minutes with Scheduler; 10,000 bib records in 20-30 minutes with Data Exchange | Numbers above assume medium system load. Sierra can prioritize database writes per process, including setting thresholds; in other words, record-load processes can be prioritized against other system processes that write to the database. |
Sirsi Unicorn | on-prem | single-tenant | MARC bibs & items, create new, match on 001 (test occurred in 2010) | 43K+ records (single file) | | 12 minutes (work day) | View partial log from test load of this file |
Polaris | on-prem | | Create; update; create & update | 1,000 | | any time of day | "Importing large quantities of records may slow response times for other processes in Polaris. Therefore, you may want to start the import process when the library is closed and when it will not interfere with scheduled system backups." Source: Polaris documentation |
Voyager | on-prem | | | 10,000 records (or less) recommended per import | | | "For optimum importing performance, import 10,000 records (or less) at one time. If your record file is larger than 10,000 records, it should be broken into smaller sets of records (using the -b and -e parameters) and then imported one after the other." Source: Voyager documentation. WebAdmin also uses Bulk Import, so the same general rules apply; however, because the file is uploaded through the browser, keep imports in the 1,000-5,000 record range. |
Koha | | | | | | | |
Aleph | | | | | | | |
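Several vendors above (Voyager explicitly, Polaris implicitly via its 1,000-record limit) recommend splitting a large record file into smaller batches before import. As a minimal illustration only — not a vendor-provided tool — the sketch below splits raw MARC21 data into batches of at most N records, assuming well-formed MARC21 in which every record ends with the record terminator byte `0x1D`. The function name `split_marc` and the batch size are this sketch's own choices.

```python
def split_marc(data: bytes, max_records: int = 10_000) -> list[bytes]:
    """Split raw MARC21 data into chunks of at most max_records records.

    Assumption: well-formed MARC21, where each record ends with the
    record terminator byte 0x1D. This sketch does not parse leaders
    or validate record structure.
    """
    # Reattach the terminator to each record; drop the trailing empty split.
    records = [r + b"\x1d" for r in data.split(b"\x1d") if r]
    # Regroup into batches of at most max_records records each.
    return [
        b"".join(records[i : i + max_records])
        for i in range(0, len(records), max_records)
    ]


# Hypothetical usage: three fake "records", batches of at most two.
chunks = split_marc(b"rec1\x1drec2\x1drec3\x1d", max_records=2)
print(len(chunks))               # 2 batches
print(chunks[0].count(b"\x1d"))  # first batch holds 2 records
```

Each resulting chunk is itself a valid concatenation of complete records, so the batches can be imported one after the other as the Voyager documentation suggests.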