Existing Module Technical Evaluations
- 1 Overview
- 2 Background and Motivation
- 3 Key Differences from New Module Evaluation
- 4 Process Approach
  - 4.1 Key Principles
  - 4.2 Process Steps
  - 4.3 Tools and Resources
- 5 Evaluation Criteria
- 6 Goals and Expected Outcomes
  - 6.1 Primary Goals
  - 6.2 Success Metrics
- 7 The Pilot Program
  - 7.1 Timeline
  - 7.2 Selected Pilot Modules
  - 7.3 Selection Criteria
- 8 Future
- 9 Related Pages and Resources
Overview
The Existing Module Evaluation is an initiative by the FOLIO Technical Council (TC) to extend its technical review process beyond new modules to modules that are already part of official FOLIO releases. This closes a gap: many modules were created before technical standards were formally defined, or have never been reviewed against the current standards.
Background and Motivation
The FOLIO Technical Council has had a well-established process for evaluating new modules through their Technical Council Review (TCR) process since its adoption. However, many existing modules in FOLIO:
- Predate the creation of evaluation criteria and standards
- Were created before values and best practices were formally defined
- Have never undergone formal technical review
- May have accumulated technical debt over time
- May not conform to current technical standards
In fall 2023, during process improvements, the TC formally acknowledged that "existing FOLIO modules ideally conform to the same Values and Criteria" as new modules, and committed to "work with development teams to align the current reality of the code with the Values and Criteria over time, as practicable."
Key Differences from New Module Evaluation
The existing module evaluation process differs from new module evaluation in several important ways:
| Aspect | New Module Evaluation | Existing Module Evaluation |
|---|---|---|
| Initiation | Teams contact TC | TC contacts teams |
| Participation | Required for inclusion | Best effort |
| Consequences | Acceptance/rejection | No removal from releases, at this time |
| Focus | Gate-keeping | Improvement and support |
| Reporting | Standardized process | Flexible (Jira, wiki, team preference) |
| Goal | Ensure standards compliance | Identify tech debt and work collaboratively |
Process Approach
Key Principles
- Courteous outreach: Use "soft language" when approaching teams
- Avoid tech-debt terminology: Focus on standards and improvement
- Make it optional: Emphasize that participation is voluntary
- Professional courtesy: Expect teams to address issues discovered, but without enforcement mechanisms
- Collect feedback: Include an explicit step to gather team feedback after the evaluation
- Living process: The process will evolve based on pilot experiences
Process Steps
The evaluation process follows these general steps:
1. Module Selection: TC identifies candidate modules using selection criteria
2. Reviewer Assignment: TC co-chairs assign reviewers for each module
3. Team Outreach: Reviewer contacts the responsible team to explain the process and request participation
4. Evaluation: Reviewer conducts the technical evaluation using the criteria
5. Results Discussion: Reviewer and team discuss findings
6. Issue Documentation: Issues are documented in the team's preferred format (Jira, wiki, etc.)
7. Feedback Collection: Team provides feedback on the evaluation process
8. Follow-up: TC tracks progress on addressing identified issues (with professional courtesy, no enforcement)
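The steps above can be sketched as a small state machine. This is only an illustration of the workflow's shape; the stage names and fields are assumptions for the sketch, not part of the TC's documented process.

```python
from dataclasses import dataclass
from enum import Enum

class Stage(Enum):
    """One value per step of the evaluation process described above."""
    MODULE_SELECTED = 1
    REVIEWER_ASSIGNED = 2
    TEAM_CONTACTED = 3
    EVALUATED = 4
    RESULTS_DISCUSSED = 5
    ISSUES_DOCUMENTED = 6
    FEEDBACK_COLLECTED = 7
    FOLLOW_UP = 8

@dataclass
class Evaluation:
    module: str
    team: str
    reviewer: str = ""
    stage: Stage = Stage.MODULE_SELECTED

    def advance(self) -> Stage:
        """Move to the next step; follow-up is the terminal stage."""
        if self.stage is not Stage.FOLLOW_UP:
            self.stage = Stage(self.stage.value + 1)
        return self.stage
```

Because participation is voluntary, a real tracker would also need a way to record that a team declined, which this sketch omits.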
Tools and Resources
- Evaluation criteria document: available in the tech-council GitHub repository
- Module evaluator tool: tc-module-eval, developed by Olamide Kolawole. At the time of writing it only evaluates licensing, for JavaScript and Java projects.
- Tracking spreadsheet: created by Maccabee Levine, listing module ages, teams, and release information
Evaluation Criteria
The existing module evaluation uses similar (but potentially adjusted) criteria as the new module evaluation, including:
- Use of officially approved technologies
- Code quality and maintainability
- Security practices
- Testing coverage
- Documentation
- License compliance
- Environment variable management (added November 2025)
Some criteria may be adjusted or grandfathered for existing modules depending on their age and circumstances. The full criteria are documented in the tech-council GitHub repository.
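One way to track which criteria apply to a given module is a simple checklist with a per-criterion grandfathering flag. The criterion identifiers and which ones are marked as grandfatherable are assumptions for illustration, not the TC's official list:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Criterion:
    name: str
    grandfathered_ok: bool  # assumed: may be waived for pre-standards modules

# Names below paraphrase the criteria listed above; flags are hypothetical.
CRITERIA = [
    Criterion("approved-technologies", False),
    Criterion("code-quality", False),
    Criterion("security-practices", False),
    Criterion("testing-coverage", True),
    Criterion("documentation", True),
    Criterion("license-compliance", False),
    Criterion("env-var-management", True),
]

def applicable(criteria: list, predates_standards: bool) -> list:
    """Drop criteria that may be grandfathered for pre-standards modules."""
    if not predates_standards:
        return list(criteria)
    return [c for c in criteria if not c.grandfathered_ok]
```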
Goals and Expected Outcomes
Primary Goals
- Establish a repeatable process for evaluating existing modules
- Identify technical debt in current FOLIO modules
- Work collaboratively with teams to improve code quality
- Ensure all modules meet current technical standards over time
- Provide positive feedback and support to development teams
Success Metrics
- Teams feel supported rather than criticized
- Issues identified are actionable and get addressed
- The process is sustainable and not overly burdensome
- The community accepts and values the evaluation process
- Code quality across FOLIO improves over time
The Pilot Program
Timeline
- Initial discussions: February 2024
- Process development: June-July 2025
- Pilot launch: July-August 2025
- Status: ongoing as of January 2026
Selected Pilot Modules
Two modules were selected for the initial pilot:
- mod-ebsconet (backend, Spitfire team): backend module for integration with EBSCOnet external systems
- ui-export-manager (frontend, Firebird team): frontend module for mod-data-export

These modules were selected because they:
- Are newer, but predate the formal evaluation process
- Represent both frontend and backend code
- Are likely to have some issues, but also achievable improvements
Selection Criteria
When choosing modules for evaluation, the TC considers:
- Age: modules from around rows 220-270 of the release spreadsheet (neither too old nor too new)
- Type balance: both frontend and backend modules
- Quality metrics: using tools like SonarCloud and Semgrep to identify modules with room for improvement
- Likelihood of success: preferably modules where discovered problems can realistically be resolved
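The selection heuristics above could be sketched as a filter over the tracking spreadsheet. The field names, the row band, and the pairing logic are assumptions based on the criteria listed here, not a documented TC algorithm:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    kind: str  # "frontend" or "backend"
    row: int   # position in the release-age tracking spreadsheet

def shortlist(candidates: list, lo: int = 220, hi: int = 270) -> list:
    """Keep modules in the target age band, then pair one backend with one frontend."""
    in_band = [c for c in candidates if lo <= c.row <= hi]
    frontends = [c for c in in_band if c.kind == "frontend"]
    backends = [c for c in in_band if c.kind == "backend"]
    picked = []
    for back, front in zip(backends, frontends):
        picked += [back, front]
    return picked
```

Applied to the pilot data, a band of rows 220-270 would keep one backend and one frontend module and drop anything outside the band.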
Future
The pilot is designed to:
- Test and refine the evaluation process
- Identify what works and what needs adjustment
- Eventually expand to more existing modules
- Consider evaluating shared libraries, plugins, and other code beyond modules
- Potentially include self-evaluation options once the process matures
- Run evaluations once per flower release cycle
Feedback on the pilot can be added to the Existing Module Evaluation Pilot Feedback page.
Related Pages and Resources
Key Meeting Notes
- 2025-07-30 Evaluate Existing Modules - July pilot program kickoff
- 2025-06-25 Evaluating Existing Modules - Foundational discussion establishing the approach
- 2024-02-28 Criteria for evaluating existing modules - Early criteria discussions
Related Processes
- New Module Technical Evaluations - The original new module evaluation process
External Resources
- GitHub tech-council repository - Process documentation and evaluation templates
- TCR Jira Board - Tracking of existing module evaluations