Gap Analysis/Feature Ranking 2019 Proposal
(Created by Dracine, Kristin & Holly)
Overview of Gap Analysis 2018
The first Gap Analysis was conducted in the summer of 2018 and included what were then called the “early implementers”: those institutions planning to implement FOLIO in 2019. The primary purpose of the Gap Analysis was for each participant to identify features/functions needed by their institution that appeared not to be planned for FOLIO. A secondary purpose was for each participant to indicate which features/functions were needed in order for their institution to “go-live.” A spreadsheet containing all planned FOLIO features, sorted by functional area, was provided with a column for each institution to record its ranking. Any “missing” features were to be added to the bottom of the spreadsheet. It was noted that gaps would be found later and priorities would change, but this was a good starting point.
Gap Analysis/Feature Ranking 2019 Goals
Provide a project check that FOLIO development is on target, along with the ability to adjust development to meet partners’ implementation timelines.
Provide an avenue for implementers to work together collaboratively to identify and rank missing features and expand details on needed functionality.
Identify gaps and areas of concern in a proactive manner.
Increase transparency for partner institutions regarding the state of FOLIO development and instill confidence in its ability to meet partners’ needs.
Gap Analysis/Feature Ranking 2019 Approach
A major flaw in Gap Analysis/Feature Ranking 2018 was that institutions were forced to identify gaps and rank features/functions based on brief descriptions alone. Now that much of FOLIO has been developed, we can look at many of the FOLIO features/functions themselves rather than rely on brief descriptions of them.
We recommend that an institution start with the Feature Ranking component of this exercise, which will provide background on what remains to be done for each functional area (what we call an Epic):
A spreadsheet will be provided that lists all features that are neither completed nor currently being worked on (there is no need to rank/re-rank completed features or features in progress for the next release; it is too late for that). Previous rankings will be provided if they exist for the institution. The spreadsheet will be broken into worksheets by Epic (functional area) to make it easier to manage.
The spreadsheet will also include a worksheet that lists all completed features and features being worked on in the current release, should the institution decide to rank/re-rank these features for its own internal use. Previous rankings will be provided if they exist for the institution. This worksheet will be sorted by Epic (functional area) to make it easier to manage.
Each institution will be able to view the rankings by other institutions. This was extremely helpful during Gap Analysis 2018.
Rankings will be uploaded to the feature JIRA issues so we have them available for reference and updating.
After completing the Feature Ranking, the institution should move on to the actual Gap Analysis exercise.
For the Gap Analysis we recommend an institution take the following approach:
1. Set up a team for a particular functional area.
2. The functional area team will review FOLIO as it currently exists for their functional area (for Gap Analysis 2019, this will be the Aster Release, aka Q4 2018).
3. The functional area team will then review the features planned for that functional area by:
Reviewing a provided spreadsheet that contains all features not in the release of FOLIO reviewed in step 2 above. For features that are in progress, links to additional information such as user stories, mock-ups, etc., will be provided if available. For features being completed in anticipation of the Bellis Release (aka Q1 2019) in April, the URLs of the test site will be provided.
Asking the Product Owner questions via the JIRA issue Comment feature. For functional areas that are less developed and/or more complicated, we could schedule sessions where the Product Owner presents the current version and future plans and answers questions. SIG time could be used for this, with other institution experts attending.
4. After reviewing the plans for the functional area, the functional area team may identify “gaps,” which should be reported on the spreadsheet as soon as they are identified. The types of “gaps” that may be found include:
Missing features needed for the institution to go live that are not currently addressed by any feature defined in JIRA.
Refinements of existing features where the functional area team feels the need to add more details to the JIRA issue to ensure a particular go-live need is documented.
Desired features, which are not go-live requirements. These features do not need to be documented as part of this process and should be brought to the SIG as part of the normal development process.
5. Holly Mistlebauer will contact the appropriate Product Owner about the missing or refined features. The Product Owner will then review what the institution has identified and discuss the requirement with the institution. The result is usually one of the following:
The Product Owner demonstrates that the feature is already in FOLIO or planned for FOLIO, and the institution concurs.
The Product Owner shows that the feature was discussed by the SIG previously and explains why the SIG decided against it, and the institution concurs.
The Product Owner brings the feature to the SIG for discussion, after which the Product Owner will create a JIRA issue for ranking by all institutions.
Results will be presented to the Product Council, who will advocate for resources and prioritization changes as needed.
Gap Analysis/Feature Ranking 2019 Time Frame
Feature Ranking: Institutions will start on March 1 (or shortly thereafter) and finish by April 30.
Gap Analysis:
Institutions will start steps 1-4 above on March 1 (or shortly thereafter) and finish by April 30.
Product Owners will complete step 5 above by May 30.
Initial results will be presented to the Product Council in early May.
Final results will be presented to the Product Council in early June, facilitating resource advocacy and prioritization as needed.