Developer Onboarding from SIG meeting on 2023-06-26: this recording provides detailed information about how FOLIO data are transformed and organized in the LDP1 and Metadb databases. It also gives a brief overview of how to navigate the FOLIO Analytics query repository.
Recording from Reporting SIG meeting on February 2: this recording includes a tutorial on how to assign yourself as a reviewer on GitHub and generally how to undertake a review; the tutorial starts around the timestamp 9:40. Use the normal FOLIO password to access the recording.
Recording from Review Board Training on February 27: this recording covers retrieving a script from GitHub, logging into the test database with DBeaver, running the query, and even adding a custom table to the LDP to be able to test comments added to public tables
Recording of an example review: an older recording walking through a review, including connecting to the database and running the query
Pull requests each need reviews from two different people
In GitHub, reviewers should first add themselves to a pull request as a reviewer before submitting a review
You can see if a PR has reviewers assigned by opening the PR and looking in the upper-right corner, where it says "Reviewers"
If a PR has fewer than two reviewers, it should have a "needs reviewer" label. Please add this label if you find a PR with 0 or 1 reviewers. Please remove this label if you find a PR with 2 reviewers.
When you find a PR that has fewer than two reviewers, you can add yourself by clicking on the gear icon in the Reviewers section and adding yourself. (If you are not listed as an option, ask Nassib to add you to the repository as a possible reviewer.)
Please add yourself to only a small number of reviews at a time - no more than you plan to complete in the next week.
To begin your review, look for the "Add your review" button at the top of the page. If you don't see it, make sure you have been added as a reviewer, and try refreshing the page.
Click "Add your review," then click "Review changes" on the page that opens.
A review includes a comment and a decision. Inside the comment box, paste in the checklist.
Update the checklist by removing items that are not relevant (a trimmed example appears after the items below).
Items in the top section ("All queries") apply to any query submitted
The "report queries" section is only necessary for queries submitted to the sql/report_queries or sql_metadb/report_queries folders.
The "derived tables" section is only necessary for queries submitted to the sql/derived_tables or sql_metadb/derived_tables folders.
Now work through each item in the checklist and make sure the submitted query satisfies the requirements. This should involve looking through the code and additional documentation submitted, executing the query against a test database, and reviewing the results of the query; a small example of this kind of spot-check appears after these steps.
To download the code, click the three dots at the top right of the file view on the review page and choose "View file". This shows a preview of the full file; from there, click "Raw" to go to the raw version of the file, which you can save from the browser.
See the Test environments section below for information about how to connect to a test database to execute the query.
For each required item in the checklist that is confirmed, change [ ] to [x].
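As a minimal sketch of the kind of spot-check described above, you can wrap the submitted query and look at row counts or duplicate identifiers. The table and column names here are hypothetical placeholders, not part of any real submission:

-- Hypothetical spot-check: replace the body of "submitted_query" with the SQL
-- from the pull request, then inspect row counts and identifier uniqueness.
WITH submitted_query AS (
    SELECT id, loan_date, return_date  -- placeholder columns
    FROM loans__t                      -- placeholder table
)
SELECT
    count(*) AS total_rows,
    count(DISTINCT id) AS distinct_ids
FROM submitted_query;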
Decisions:
If all items in the checklist have been completed correctly and you don't see any additional problems with the query, your decision should be to Approve the pull request.
If one or more required items are not completed correctly, your decision should be to Request changes.
If you have a question, you can submit a Comment without approving or requesting changes.
If you request changes, please be on the lookout for additional changes that the submitter makes to address your feedback. When the changes are complete, you should re-review the query.
Differences between LDP1 and Metadb queries to look out for
The documentation about Migrating from LDP to Metadb outlines the major differences between LDP1 and Metadb queries.
As a reviewer, the query features that you should especially be looking for are:
Data types: in Metadb, identifier fields are in UUID format, whereas in LDP1 queries those fields are typically in VARCHAR format. Other data type conversions you should look for are dates, date-times, numbers, and booleans. You can check the data types in the results of the query to make sure they are correct. Data type problems can also prevent a query from executing without error.
Additional joins: in Metadb, the original JSON records and the records with extracted columns are in separate tables. Metadb queries may therefore have more joins, including joins to tables that look almost like duplicates (e.g., loans and loans__t). This is not a problem, but it can make it harder to think through the query logic (see the sketch below, which illustrates both of these differences).
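As a rough illustration of both points, a Metadb-style query often joins a JSON table to its extracted-columns counterpart and returns UUID-typed identifiers. The table and column names below follow the loans/loans__t example above and are placeholders, not exact table names:

-- Illustrative sketch only: table and column names are placeholders.
-- "loans" stands for the table holding the original JSON records and
-- "loans__t" for the table with extracted columns.
SELECT
    l.id,                            -- uuid in Metadb; typically varchar in LDP1
    pg_typeof(l.id) AS id_type,      -- quick way to confirm a column's data type
    l.loan_date,
    j.jsonb ->> 'action' AS action   -- value pulled directly from the JSON record
FROM loans__t AS l
JOIN loans AS j ON j.id = l.id       -- extra join with no LDP1 counterpart
LIMIT 10;

In an LDP1 version of the same query, the identifier columns would typically be varchar, so comparisons against uuid values might need explicit casts (for example, id::uuid).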
Test environments
Our test environments for FOLIO Analytics queries are maintained by Index Data as a project called testbed.
LDP1: reviewers have access to an LDP1 database connected to FOLIO Snapshot, named ldp1_folio_snapshot as of January 19, 2023. Review the testbed site to see how to request access.
Metadb: reviewers have access to a Metadb database connected to a FOLIO instance hosted by Index Data, named folio_nolana as of January 19, 2023. Review the testbed site to see how to request access.
Once you have access, you can connect to the databases and execute queries; many reviewers use DBeaver for this. You should receive the connection information for these databases (e.g., host, database name, username, password) when your accounts are created.
After connecting to the databases, you can use them to test execution of queries; a quick way to confirm your connection is sketched below.
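For example, a short query like the following (run from a DBeaver SQL editor or any other client) can confirm which database and account you are connected to before you start testing a submitted query:

-- Confirm the database, login, and default schemas for the current connection.
SELECT
    current_database()    AS database,
    current_user          AS login,
    current_schemas(true) AS visible_schemas;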
With the LDP1 test environment, derived tables can be tested in the "local" schema. If you get an error running the query because there is already a table with the same name in the local schema, you can edit the query to rename the table. (You'll have to edit every place where the table name occurs, including all of the indexes; see the sketch after these items.)
With the Metadb test environment, derived tables can be tested in your personal schema, which should match your username.
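Here is a rough sketch of testing a derived table under a renamed table in the LDP1 "local" schema; the table, column, and source names are hypothetical placeholders:

-- Hypothetical sketch: the original derived table name is assumed to be taken,
-- so every occurrence of it in the submitted script (including index names)
-- is renamed with a suffix before running the script.
DROP TABLE IF EXISTS local.loan_renewal_counts_test;
CREATE TABLE local.loan_renewal_counts_test AS
SELECT
    id,            -- placeholder columns
    renewal_count
FROM public.circulation_loans;  -- placeholder source table
CREATE INDEX ON local.loan_renewal_counts_test (id);
-- In the Metadb test environment, you would instead create the table in your
-- personal schema (e.g., your_username.loan_renewal_counts_test).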
Adding test data to the reference environment
Sometimes a query requires data that aren't available in the test environments. You can add new data to the FOLIO instances connected to these databases in order to test the queries.
LDP1: Add data to FOLIO Snapshot by logging in with the normal FOLIO account (see the "Demo Sites" section on the main FOLIO wiki page) and using the FOLIO apps to create or modify records. The updates should show up in the connected LDP1 database roughly every hour (the exact refresh schedule may vary).
Metadb: Review the testbed site for information on how to request an account on the FOLIO instance hosted by Index Data. When you have access, you can create and modify records there. New data will show up in the connected Metadb database very quickly (a simple way to verify this is sketched below).
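After creating a test record in the FOLIO UI, you can confirm that it has reached the reporting database with a simple lookup. The table name and UUID below are placeholders; substitute the table your query depends on and the id of the record you just created:

-- Placeholder check: confirm a newly created FOLIO record is visible in the
-- reporting database. Replace the table name and UUID with real values.
SELECT *
FROM users__t                                       -- placeholder table name
WHERE id = '00000000-0000-0000-0000-000000000000';  -- placeholder record UUID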
Communicating with a PR submitter
If you need to communicate with the person who submitted the pull request, you can easily do this right in GitHub.
When you click on a pull request from the list, you should see any communications and actions on the PR since it was submitted.
At the bottom, you should see a text box to add a comment. To tag the PR submitter in your comment, you can type "@" followed by their username.
If you send feedback but don't hear back through GitHub after waiting for a day or two, you might choose to follow up with the submitter via Slack or email.