Handling array structures in PostgreSQL JSON objects
(FOLIO-921)
| Status: | Open |
| Project: | FOLIO |
| Components: | None |
| Affects versions: | None |
| Fix versions: | None |
| Parent: | Handling array structures in PostgreSQL JSON objects |
| Type: | Sub-task | Priority: | P3 |
| Reporter: | Niels Erik Nielsen | Assignee: | Unassigned |
| Resolution: | Unresolved | Votes: | 0 |
| Labels: | platform-backlog |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original estimate: | Not Specified |
| Sprint: | |
| Development Team: | Core: Platform |
| Description |
Sometimes repeated fields are "complex" objects with one property being a foreign key reference; about one third of the fields in the "Instance" records fall into that category. The problem is how to enforce integrity constraints on them. I believe we have enforced this for non-repeating fields by duplicating the value into a proper column of the PostgreSQL record, so it can be enforced by standard relational constraints. That approach does not work well for repeated fields, however. We could of course entrust the clients (UIs, load scripts and other back-end modules) to enforce it, but that is not DRY and not as safe as storage-module-level or, safer still, database-level constraints. |
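A minimal sketch of the non-repeating case described above, using hypothetical names (`instance`, `instance_type`, `instanceTypeId` are illustrative, not FOLIO's actual schema): the foreign-key value is duplicated out of the JSON into a regular column, where a standard `REFERENCES` constraint can enforce it. On PostgreSQL 12+ a generated column keeps the duplicate in sync automatically; on older versions a trigger would maintain it instead.

```sql
-- Hypothetical schema sketch; names do not reflect the real FOLIO tables.
CREATE TABLE instance_type (
  id uuid PRIMARY KEY
);

CREATE TABLE instance (
  id    uuid  PRIMARY KEY,
  jsonb jsonb NOT NULL,
  -- Duplicate the single-valued FK out of the JSON so the database
  -- can enforce referential integrity with an ordinary constraint.
  instance_type_id uuid
    GENERATED ALWAYS AS ((jsonb ->> 'instanceTypeId')::uuid) STORED
    REFERENCES instance_type (id)
);
```

This works because each record carries at most one `instanceTypeId`; a JSON array of references has no single column to duplicate into, which is the problem this issue raises.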
| Comments |
| Comment by Julian Ladisch [ 07/Nov/17 ] |
If the array has no limit (or a high limit) on the number of values, we need to create an additional table that holds the duplicated values. If there is a reasonable limit, we can create that number of columns (author1, author2, author3, ...) in the original table. |
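The first option above (a side table for unbounded arrays) could be sketched as follows, again with hypothetical names (a `contributors` JSON array whose elements carry a `contributorId` referencing a `contributor` table). A trigger re-derives the side-table rows from the JSON array on every write, so the foreign key is enforced at the database level:

```sql
-- Hypothetical sketch: one side-table row per array element,
-- so a standard foreign key can cover the repeated references.
CREATE TABLE instance_contributor (
  instance_id    uuid REFERENCES instance (id) ON DELETE CASCADE,
  contributor_id uuid REFERENCES contributor (id),
  PRIMARY KEY (instance_id, contributor_id)
);

-- Rebuild the duplicated rows from the JSON array on insert/update.
CREATE OR REPLACE FUNCTION sync_instance_contributors() RETURNS trigger AS $$
BEGIN
  DELETE FROM instance_contributor WHERE instance_id = NEW.id;
  INSERT INTO instance_contributor (instance_id, contributor_id)
    SELECT NEW.id, (elem ->> 'contributorId')::uuid
    FROM jsonb_array_elements(NEW.jsonb -> 'contributors') AS elem;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER instance_contributors_sync
  AFTER INSERT OR UPDATE ON instance
  FOR EACH ROW EXECUTE PROCEDURE sync_instance_contributors();
```

The fixed-column alternative (author1, author2, ...) needs no trigger, since each column can carry its own `REFERENCES` clause, but it caps the array length and leaves mostly-NULL columns when few values are used.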