Firebird - Definition of Done


  1. Implementation of requirements is completed, and acceptance criteria have been met. 

  2. UI changes (if any) meet UX guidelines.

  3. RCA Group value is added for bugs. 

  4. PRs are reviewed and approved by at least two engineers. 

  5. Critical code smells and vulnerabilities found by SonarQube and lint errors are fixed before merging the code.

  6. Unit tests are written and pass (80% of the module's code coverage is the minimum, and 100% is preferred for critical code). Exceptions may be made for legacy components with coverage below 80%.
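The 80% threshold can be enforced in the build rather than checked by hand. A minimal sketch, assuming a Maven build with the JaCoCo plugin (plugin version and counter choice are illustrative):

```xml
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.11</version>
  <executions>
    <execution>
      <id>check-coverage</id>
      <goals>
        <goal>check</goal>
      </goals>
      <configuration>
        <rules>
          <rule>
            <element>BUNDLE</element>
            <limits>
              <limit>
                <counter>INSTRUCTION</counter>
                <value>COVEREDRATIO</value>
                <!-- 80% is the bottom limit from this checklist -->
                <minimum>0.80</minimum>
              </limit>
            </limits>
          </rule>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With a rule like this, `mvn verify` fails when coverage drops below the limit, so any legacy-component exception has to be granted explicitly rather than slipping through.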

  7. Existing integration/Karate tests are maintained/adjusted and pass. If a new Karate test is needed, a ticket for it is created, linked, and planned for the same flower release.

  8. Code is merged into the master/target branch with all conflicts resolved and is available on the reference environment (or a specific environment, in the case of a critical service patch).

  9. The interface version, module implementation version, and UI version (package.json) are updated (see guidance). The fixVersion field in JIRA is updated to match the version that ships the feature/bugfix.
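For the UI part, the version lives in package.json, so the bump is a one-line change. A hypothetical example (the module name and version numbers are illustrative, not a real module):

```json
{
  "name": "@folio/example",
  "version": "4.1.0",
  "description": "Example UI module; version bumped from 4.0.2 for a new feature"
}
```

The same version string is then what the JIRA fixVersion field should point at.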

  10. Sample and Reference data are updated to match the feature and schema changes. 

  11. For major API version changes, a note is posted in the #development Slack channel. If a 3rd-party API is used, its latest API documentation is validated to confirm there are no discrepancies.

  12. For breaking changes, the version is bumped in the affected modules, and the affected modules' README files are updated. API changes are backwards compatible (where possible).

  13. Data migration scripts are completed for schema changes. The migration pipeline has run successfully.

  14. Data migration from one major/minor version of the module to another has been verified and takes at most 10 minutes for a data volume the size of Bugfest.

  15. NFR related items: 

    1. If a potential performance impact on functionality or migration is identified, a ticket for the PTF team is created, linked, and planned for the same flower release.

    2. The module is implemented following stateless principles, and this is confirmed during PR review.

    3. Implementation is done using R/W split (if applicable). 

    4. Tenant data is isolated from other tenants at the data-schema level.

    5. Secrets are not stored in source repositories. 

    6. No sensitive information is stored in logs (logins, passwords, API keys). 

    7. Application logs are collected in a unified form and location (see here and here). 

    8. UI: Internationalization (I18n) is taken care of in application code (see details). 

    9. UI: The accessibility checklist is considered, and new functionality is covered with accessibility tests.
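Items 5-6 above can be supported with a simple redaction step before messages reach the logger. A minimal sketch (the class name, key list, and mask format are assumptions, not a project convention):

```java
import java.util.regex.Pattern;

public class LogRedactor {

    // Hypothetical list of sensitive keys; extend to match your modules.
    private static final Pattern SENSITIVE =
        Pattern.compile("(?i)(password|passwd|api_?key|token|secret)=\\S+");

    // Replaces the value of any sensitive key=value pair with a mask,
    // keeping the key name so the log line stays readable.
    public static String redact(String line) {
        return SENSITIVE.matcher(line).replaceAll("$1=***");
    }

    public static void main(String[] args) {
        System.out.println(redact("login ok user=jdoe password=hunter2"));
    }
}
```

Running such a filter centrally (e.g. as a logging framework converter) is safer than trusting every call site to remember the rule.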
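For item 8, in Stripes-based UI modules the translated strings typically live in a translations file (e.g. translations/ui-example/en.json) and are referenced by key from the code via react-intl; the module name and keys below are hypothetical:

```json
{
  "ui-example.settings.title": "Example settings",
  "ui-example.button.save": "Save",
  "ui-example.error.notFound": "Record not found"
}
```

Hard-coded English strings inside components are exactly what this checklist item is meant to catch.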


  1. Impacted areas are identified and a regression test set is designed.

  2. Positive, negative, and edge test cases are documented/updated in TestRail according to the testing process.

  3. Test cases for automation are identified (wherever applicable). 

  4. Testing in the Chrome browser against the ticket's acceptance criteria is performed:

    1. with a non-admin user

    2. with a Bugfest-like data set (request the Kitfox team to set up the environment via a ticket in the Rancher Jira project)

    3. on a multi-tenant environment (Performance/Sprint testing)

    4. with R/W split enabled

    5. with High Availability activated 

  5. Regression testing for affected areas is completed. 

  6. All known P1 and P2 bugs, and where possible P3 bugs, related to the ticket are fixed. No unprioritized bugs are left.


  1. Breaking API changes are documented in the Release Notes.

  2. Data migrations that could take significant time are brought to the attention of the hosting provider.

    1. If data migration of a module from one version to another takes more than 10 minutes, it has been discussed with the hosting providers (at the Folio release / Weekly release meeting hosted by Oleksii P.) to check whether that is acceptable, and it is documented in the Release Notes with:

      1. An estimate of how long the data migration can take (a matrix of data volume to time)

      2. Is there any visible impact on the UI if the data is not migrated? 

      3. Can the customer live without this migrated data till it’s migrated? 

      4. Can the data migration be done without downtime?

      5. What expectations need to be set with customers during this data migration process? 

      6. What is the impact if the tenant is migrated from one release to another with this migration being done after the upgrade? 

  3. Deployment information is provided in the launchDescriptor section of the module descriptor.
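A sketch of the launchDescriptor section in a module descriptor, assuming Docker-based deployment (the module id, image name, and values are illustrative, not a real module):

```json
{
  "id": "mod-example-1.0.0",
  "launchDescriptor": {
    "dockerImage": "folioorg/mod-example:1.0.0",
    "dockerPull": true,
    "env": [
      { "name": "DB_HOST", "value": "postgres" },
      { "name": "DB_PORT", "value": "5432" },
      { "name": "JAVA_OPTIONS", "value": "-XX:MaxRAMPercentage=66.0" }
    ]
  }
}
```

Keeping environment variables here (rather than only in internal runbooks) is what lets hosting providers deploy the module without guessing at its configuration.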

  4. Release notes and the README file are created/updated and include details on infrastructure and configuration changes (DB_HOST, KAFKA_PORT, ELASTICSEARCH_URL, number of Kafka topic partitions, etc.), if any.

  5. Any manual steps that could not be automated and are supposed to be performed by hosting providers or librarians themselves are documented in the Release Notes.


  1. The work delivered is tested and accepted by the PO (or another stakeholder, if applicable) on snapshot.

Please note that all checklist items marked with [M] are mandatory.


User Story/Bug

System Demo/Review


[M] Implementation of requirements is completed and acceptance criteria have been met.

[M] Unit/API tests are written and are passing (for API tests, a screenshot of the test run is added). At least 80% code coverage is expected (for new code), and 100% is preferred for critical code.


[M] Performance tests are completed (if applicable)

[M] Check that lists show at least 11 items, which confirms the item limit was explicitly set and the default limit of 10 items is not applied

[M] The accessibility checklist is considered.

[M] UI: Check that required permissions are set for new endpoints (if applicable)

new - Functionality is checked with a non-admin user (appropriate permissions are set for a regular user to ensure the verified functionality is accessible)

new - Functionality is checked on a multi-tenant environment (perf, sprint testing, bugfest)

[M] A pull request is created, reviewed, and approved by 2 other developers (if the repo belongs to another team, at least 1 approval from the owning team is required)


[M] Required updates are implemented based on pull request comments.


[M] Code smells, security vulnerabilities, and lint errors reported by SonarQube and other tools in the CI pipeline are fixed before merging code to master


[M] Existing API tests (backend modules) and Integration tests (UI modules) are maintained/implemented/improved and pass


[M] Any configuration and/or build scripts are updated and tested


[M] Data migration scripts are implemented for schema changes

[M] Dev verification is performed by a developer on a Vagrant box/Rancher/bugfest environment, and any issues found are resolved


[M] Code is merged with all conflicts resolved and is available on and (update links)

[M] QA is performed by a QA engineer/developer on the Integration (snapshot) environment, and any issues found are resolved


[M] Updates are tested and accepted by PO 


[M] No open critical bugs on any user stories


[M] The DoD of each user story included in the demo is met


[M] All updates are demoed from the shared environment

If, for technical reasons, we have to demo from local machines, this must be explicitly disclosed to the community and the reasons clarified.


[M] Releases are created following the Release procedures and Regular FOLIO releases.


[M] Installation and deployment scripts are updated


[M] All critical bugs reported by QA, manual testing, UAT, PO etc. are fixed


[M] User documentation is updated (deployment documentation, scripts/packaging, etc.), if applicable