Magda 16:18
So let's go to our agenda again and move to the next part, which is the Morning Glory release notes. As you know, Morning Glory was released on September 16th, and I would like to walk you through some points on the release notes. First, the new app: there is only one new app in Morning Glory, which is Bulk Edit. A quick overview of what we support: in the in-app approach, inventory items' temporary and permanent locations and item statuses; in the CSV approach, user records. There are two sets of permissions required for this application, in-app edit and view, and CSV edit and view. Depending on those permissions, a user can access those approaches. There are known limitations. As our performance tests discovered, in Morning Glory you can only do a Bulk Edit of 2,500 user records at a time, and for items the limit is 10,000. This is a lot less than we were hoping for; unfortunately, this is what it is for Morning Glory. We are looking at architectural changes to support larger data sets in the future.
Magda 18:54
The last part here, module configuration, is for those who are hosting themselves; for everybody else, this is set by the hoster. Those are the memory requirements for the container that runs mod-data-export-worker, the module that Bulk Edit uses.
Erin 19:01
The performance work that you were talking about, is it tracked in a particular Jira?
Magda 19:09
Jiras and reports; I will link the list. There is a new performance testing Jira that we started in Morning Glory. Some of the work will spill over into Nolana. Starting next week, we will be working on bug fixes for the functionality that was already delivered, and on testing: performance testing, adding API tests, and adding automated tests for the UI, so most of the tests can be run automatically.
Regarding the known issues, there are two Jiras linked to Bulk Edit. One, which we discussed a couple of meetings ago, has to do with Bulk Edit results triggered by holdings identifiers. When the mapping between holdings and items is not one-to-one, for example when one holdings record has several items, it may happen that the list of identifiers is shorter than the list of matched items. This will need to be addressed in Nolana; we will be working on this functionality.
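[Illustration: a minimal TypeScript sketch of the identifier mismatch described above. The record shapes and sample data are hypothetical, not the actual Bulk Edit code; it only shows how matching items by holdings identifier can return more rows than the number of identifiers submitted.]

    // Hypothetical, simplified shapes; the real FOLIO item schema is richer.
    interface Item {
      id: string;
      holdingsRecordId: string;
    }

    // Two submitted holdings identifiers...
    const submittedHoldingsIds = ["holdings-A", "holdings-B"];

    // ...but holdings-A owns two items, so the mapping is one-to-many.
    const items: Item[] = [
      { id: "item-1", holdingsRecordId: "holdings-A" },
      { id: "item-2", holdingsRecordId: "holdings-A" },
      { id: "item-3", holdingsRecordId: "holdings-B" },
    ];

    // Matching by holdings identifier expands each identifier into all of its items.
    const matched = items.filter((i) => submittedHoldingsIds.includes(i.holdingsRecordId));

    console.log(submittedHoldingsIds.length); // 2 identifiers
    console.log(matched.length);              // 3 matched items shown in the preview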
The next issue is a little more serious. When you update datasets of 10,000 records or larger, there is a significant delay of up to several minutes before the preview screen, i.e. the "are you sure" form, is displayed. We are working on improvements.
Erin 22:59
Did you ever see it time out, or would it just always come back?
Magda 23:05
There are instances that we saw in our tests, and this is also linked to the Bulk Edit Jira, where it is timing out. We saw this, but it happened rarely, so I cannot tell you it never happened; yes, we saw it happen. If this is the case, you just need to start the Bulk Edit all over again. The data is not committed at this point; you have to wait for the preview and then hit the button to commit the changes. Any additional questions?
Erin 23:55
Jennifer has a question in the chat. She says, "Does it affect any other operations happening at that time?"
Magda 24:02
You mean an effect on other operations, if somebody works in Inventory, for example, making changes? Denise, is this your question?
Jennifer 24:13
So for example, what happens if you have a data import or a single record import, and then check-ins and checkouts, like everything happening at the same time? With data import we know it can disrupt other operations, like a checkout, so I was just wondering what happens here.
Magda 24:34
So the problems that are happening and the slowness that you see are caused by mod-data-export-worker, the module that is being used. Since it is separate from the Inventory module and the Data Import module, there should be no interference there. However, if somebody exports EDIFACT at that time, or the circulation log is being exported, or Bursar is being exported at the same time, it will have an effect. In our tests, we ran all three of those exports at the same time as Bulk Edit, so EDIFACT, Bursar, circulation log, and Bulk Edit. That is why we got down to 10,000 records for items, for example: if we increase that number, it will definitely affect other exports that may happen at the same time. Thank you. Any other questions? Then I will move to the next part of our meeting. Again, I navigated away from our agenda.

In the next part, I would like to ask you about the behavior of the holdings source. There is different behavior in Inventory and in Bulk Edit. What is happening in Inventory is that if the holdings source is not populated, the UI assumes the source of the instance that the holdings is associated with. In Bulk Edit, we don't do this, because I am afraid this is a dangerous approach. But I may be wrong, so I would like to hear your opinion. This is the Jira, and the screenshots will demonstrate the issue. This is an example from the snapshot environment: I have a holdings record, and it says source FOLIO on the Inventory holdings view page. However, when I do the Bulk Edit, and this is the landing page screen, we submitted identifiers and the preview is populated, and you can see that most of the records have an empty value for source. Nothing is there. When you look at the developer tools, which show what is coming back from the database, the source is null. So the question for this group is: are you comfortable with source being left blank here because it is blank in the database? Or would you rather we follow the Inventory behavior, so that if the instance source is FOLIO, then we assume that the holdings source is FOLIO?
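[Illustration: a minimal TypeScript sketch of the two display behaviors Magda contrasts. The record shapes are simplified and hypothetical (real FOLIO holdings reference a source record by ID); this is not the actual Inventory or Bulk Edit implementation, only the fallback logic described above.]

    interface Instance {
      id: string;
      source: string; // e.g. "FOLIO" or "MARC"
    }

    interface Holdings {
      id: string;
      instanceId: string;
      source?: string | null; // may be null in the database
    }

    // Inventory-style display: when the holdings source is not populated,
    // show the source of the instance the holdings belongs to.
    function inventoryDisplayedSource(holdings: Holdings, instance: Instance): string {
      return holdings.source ?? instance.source;
    }

    // Bulk-Edit-style display (current behavior): show exactly what is stored,
    // so a null source renders as a blank cell in the preview.
    function bulkEditDisplayedSource(holdings: Holdings): string {
      return holdings.source ?? "";
    }

    const instance: Instance = { id: "inst-1", source: "FOLIO" };
    const holdings: Holdings = { id: "hold-1", instanceId: "inst-1", source: null };

    console.log(inventoryDisplayedSource(holdings, instance)); // "FOLIO"
    console.log(bulkEditDisplayedSource(holdings));            // "" (blank)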
Unknown Speaker 28:33
So when you're saying that if the holdings record doesn't have a source, it assumes it's the same as the instance record: when you say it "assumes", is that implying that it just displays it in the UI, or does it have a value in the underlying record?
Magda 28:50
That's correct. This is what is happening here in the Inventory UI: the UI populates this field with data that is actually coming from the instance.
Unknown Speaker 29:05
That seems weird to me, but I don't know how the metadata folks feel about that. This is not the only place in FOLIO where the UI shows you something that isn't actually in the underlying data. Go ahead, Jennifer.
Jennifer 29:23
That seems really strange. I know we've had problems with that in the past. So it depends: for example, if you have MARC holdings rather than FOLIO holdings, you can edit them in quickMARC. So I think having the source is important, to know which editor we can open the record in. That was my understanding. And I know for us, in our instance, at least as far as I know, the source should not be null anywhere, so yeah.
Unknown Speaker 29:59
Right. If we are seeing this as null in places other than the reference data, the reference environment.
Transcribed by https://otter.ai