Date

2023-01-31

Recording

https://recordings.openlibraryfoundation.org/folio/implementation-group/2023-01-31T11:00/

Attendees

Buddy Pennington; Ian Walls; Jeremy Seiferth; Kara Hart; Karen Newbery; Michelle Futornick; Molly Driscoll; Peter Murray; Bob Scheier; Tara Barnett; Tod Olson

Recommended Resources and Pre-Reads

Agenda

  • Housekeeping
  • UPGRADES
    • Discussion Questions
      • From Julie: How do you deal with the release notes? What are the general steps you take when you upgrade to a new version of FOLIO? Any lessons learnt to share?
      • How do you organize your testing? I've seen some amazing checklists and spreadsheets. (Tara)
      • How much time do you allocate for testing? How many people are involved in testing? (Tara)
  • Closing
    • Any actions or follow through?

Notes

2:30 Housekeeping
  • Tara asks about what was up with the onboarding page ideas. Nobody seems to know.
  • Reminder that we're talking about spine label printing two weeks from now.
5:00 Release Notes
  • How do you deal with the release notes? What are the general steps you take when you upgrade to a new version of FOLIO? Any lessons learnt to share?
  • Ian explains his strategy: reading the release notes, working out what each item covers, and determining how it impacts their partner libraries. He also looks for new features--things libraries have been waiting for--and checks for fixes to bugs that have been reported. Ian tries to boil this down to half a dozen points and provides that summary to partner libraries before they get access to their test system.
  • (7:12) Tara notes that she and Ian both work for vendors. She asks whether libraries expect their partners to be reading these notes--are they actually for vendors, or for libraries directly?
  • Tod notes that this varies by library. Chicago does much of this themselves because of their history and experience. The process at Chicago is very similar to Ian's outline--look at the release notes, figure out the effects of any changes, and farm them out to subject matter experts. Buddy says that they are similar--they organize changes/features around the apps, then farm those out to the departments that use them.
  • (9:13) Buddy asks: Morning Glory has a release digest, but there doesn't seem to be one for Nolana. He is curious how those are created, as they are very helpful. Molly says Oleksii made this--we could ping him and see if there's one forthcoming.
  • (11:10) Michelle is on a small team led by Khalilah that is focused on improving release notes. Michelle can take back any comments. The purpose of this team is to look at how the release notes could be improved/easier to use.
  • (12:45) Tara notes that everyone--vendor or library--seems to start their process with the release notes, and there is a process of translation for communication with others. Ian says "anything that's a list of Jiras" needs to be analyzed before he passes it on to a library.
13:45 Testing Process
  • Tara asks what people actually do when they are testing. What did you look at? What were you looking for?
  • (14:15) Ian says that he checks that pages load correctly, there are no obvious permissions errors, nothing is broken, and any customizations/changes have persisted. He does not do a bugfest-level intensive test. Partner libraries get access to the test environment for two weeks, and they are encouraged to do daily operations in the system.
  • (15:30) Tod says that Chicago tries to identify all the workflows that are implicated. This differs between hotfixes and flower releases. This is the main way Chicago organizes their testing. Also: do new workflows work the way they expect? For hotfixes, testing is very targeted. Different areas of the library keep their own lists. (18:15) Ian notes that ByWater partners are smaller--at these institutions, the scale is small enough that testing doesn't need to be divided up.
  • (19:00) Tara asks if folks look at data in their testing. Bob says that they do--they followed an example from 5 Colleges combined with a sample checklist from Sierra, modified for their needs in FOLIO. Holy Cross found some data discrepancies in an upgrade.
  • (20:00) At Stanford, they are still migrating--some of their data checks are likely only appropriate for migration. Michelle asks if it is necessary to check the data when upgrading once you are live. (20:45) Bob says that it was valuable to check. Tod says that there can be updates to the reference data too, so there can be surprises there; some processes would lead Chicago to check data as well. (22:30) Bob notes that they also checked reference data--testing caught some minor things that did not survive the upgrade process.
  • (24:00) Tara asks if anyone uses reporting to check the data. Tod says this is a technique used for sanity checking (a minimal sketch of this kind of check follows this list).
  • Holy Cross Essentials Testing checklist
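A minimal sketch of the reporting-based sanity check mentioned above: compare record counts between a pre-upgrade snapshot and the upgraded test system. This is one illustrative way to do it, not anyone's documented procedure; the table names follow LDP-style reporting conventions and the hosts are hypothetical, so adapt both to your own reporting database.

    import psycopg2  # PostgreSQL driver; LDP reporting databases are PostgreSQL

    # Tables to spot-check; these names are assumptions based on LDP conventions.
    TABLES = ["inventory_instances", "inventory_holdings", "inventory_items"]

    def record_counts(dsn):
        """Return a {table: row_count} snapshot from a reporting database."""
        counts = {}
        with psycopg2.connect(dsn) as conn:
            with conn.cursor() as cur:
                for table in TABLES:
                    # Safe to interpolate: table names come from the fixed list above.
                    cur.execute(f"SELECT count(*) FROM {table}")
                    counts[table] = cur.fetchone()[0]
        return counts

    # Hypothetical hosts: one snapshot taken before the upgrade, one after.
    before = record_counts("dbname=ldp host=pre-upgrade.example.edu")
    after = record_counts("dbname=ldp host=post-upgrade.example.edu")

    # Flag any table whose count changed unexpectedly during the upgrade.
    for table in TABLES:
        if before[table] != after[table]:
            print(f"{table}: {before[table]} -> {after[table]}  <-- investigate")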
24:45 Time Needed For Testing
  • Tara asks how long testing takes. Ian makes a Total Recall reference that goes straight over Tara's head; the answer is two weeks. Tod says that workloads can vary, so Chicago proposes a schedule (also two weeks) and asks others to buy in. Bob says they take a week or two, depending on workloads. (Holy Cross is small.)
27:30 Sharing Procedures
  • Tara asks if there's value in seeing other peoples' checklists. Is this process generalizable?
    • Kara has gone through one upgrade so far, and they are building their own checklist, and thinks this has value. It's important to test all integrations--those are easy to forget.
30:00 Automatic Upgrades
  • Ian asks: to what degree are people comfortable with automatic upgrades for smaller changes? How much rigor is applied to a hotfix upgrade? If we were able to get to rolling upgrades (vs. flower releases), would that be OK? Karen (?) does not trust this process now. Chicago does not trust it now either. It sounds like this trust is a ways off.
  • (31:30) Michelle asks if folks install every hotfix. Tara says this probably depends on your hosting arrangement. Tod says it depends if the fix addresses a hot issue. Mostly yes.
  • (33:15) Tod notes that hotfixes sometimes have unintended side effects. There is some wariness about this.
34:20 Testing Discovery
  • Buddy (?) asks if people check their discovery/OPAC. Bob says yes.
  • Kara says that FOLIO is connected to EDS and VuFind, but their test environment is not connected to those. Therefore they cannot test that.
    • Some discussion of different hosting setups--it's important to remember that not everyone will have a test version of VuFind, reporting, external import tools, etc., and this depends a lot on what you're trying to test. For example, there is no test robot for storage and retrieval at Chicago. Several "creative" solutions were proposed.
38:00 Groundbreaking Conclusion
  • The overall rule seems to be: "if you can test it, you should test it."
38:15 Managing Upgrades
  • Tara asks, from a project management perspective, how folks actually green-light or red-light their testing procedure. In a large institution, how is a green light indicated?
  • Ian says that it's an implicit go-ahead unless someone says stop. They haven't actually had to hit the brakes yet. Tara says that she has, in at least one case.
40:00 Sharing Knowledge
  • Tara shows an idea for an implementers' guide to upgrade testing and asks for suggestions. Ian says there should be a reminder of the release structure. There is also a suggestion to link to the release page of the wiki; Bob says to add a copy/document rather than a link. Molly reminds us that you can follow spaces on the wiki, and Releases is a good one to follow--she likes to see what's coming and what's being updated, and finds this super helpful. Buddy says a reminder to do FOLIO testing but also post-release testing would be important, along with what to focus on (integrations, for example).
44:50 Reference Environments?
  • Michelle asks if anyone uses the reference environments to test new features in FOLIO. In general, folks use those environments for planning new workflows, not testing whether things work.
45:10 OCLC Records
  • Bob asks if we can do a session on what people are using for pushing OCLC records. Michelle seconds this. Connexion, Record Manager, others? Side questions: different authentication setups and so on.

Action Items

  •  Tara Barnett will finish the Implementers' Guide to Upgrade Testing
  •  Tara Barnett will add Bob's idea for a session on pushing OCLC records to the idea bank.