Reporting: Analytics and Audit Data Logging for External Reporting
(UXPROD-330)
|
|
| Status: | Closed |
| Project: | UX Product |
| Components: | None |
| Affects versions: | None |
| Fix versions: | None | Parent: | Reporting: Analytics and Audit Data Logging for External Reporting |
| Type: | Story | Priority: | P3 |
| Reporter: | VBar | Assignee: | Hongwei Ji |
| Resolution: | Done | Votes: | 0 |
| Labels: | analytics, kafka, reporting |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original estimate: | Not Specified |
| Issue links: |
|
| Epic Link: | Reporting: Analytics and Audit Data Logging for External Reporting |
| Back End Estimate: | Medium < 5 days |
| Development Team: | EBSCO - FSE |
| Description |
|
Create an Okapi post-filter handler to capture all transaction data from Okapi for the purpose of building a Data Lake. This is considered an interim solution until a decision is made on the Asynchronous Event Service (AES). The filter will capture all transactions and pass them to the message queue.
|
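The flow described above (capture each Okapi transaction, then forward it to the message queue) can be sketched roughly as follows. This is a minimal illustration, not the actual mod-aes implementation: the record fields, the `folio.audit` topic name, and the producer interface are all assumptions made for the example, and a stub producer stands in for a real Kafka client.

```python
import json
import time


def build_audit_record(method, path, status, headers, body):
    """Assemble one transaction record for the data lake.

    Field names here are illustrative, not the real mod-aes schema.
    """
    return {
        "timestamp": int(time.time() * 1000),  # epoch milliseconds
        "method": method,
        "path": path,
        "status": status,
        "headers": dict(headers),
        "body": body,
    }


def publish(producer, topic, record):
    """Serialize the record as JSON and hand it to a Kafka-style producer.

    `producer` only needs a send(topic, value) method, so a stub
    (or a real Kafka producer client) can be dropped in.
    """
    producer.send(topic, json.dumps(record).encode("utf-8"))


class StubProducer:
    """In-memory stand-in for a Kafka producer, used here for illustration."""

    def __init__(self):
        self.sent = []

    def send(self, topic, value):
        self.sent.append((topic, value))


if __name__ == "__main__":
    producer = StubProducer()
    record = build_audit_record(
        "GET", "/instances/123", 200, {"x-okapi-tenant": "diku"}, None
    )
    publish(producer, "folio.audit", record)
```

In a deployment, the post-filter would run inside Okapi's request pipeline and the stub would be replaced by a real Kafka producer pointed at the cluster; the payload-building step stays the same either way.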
| Comments |
| Comment by Hongwei Ji [ 15/Oct/18 ] |
|
We did similar work during the data capture POC, so I cleaned up the old code, removed the irrelevant pieces (RabbitMQ, MongoDB, and PostgreSQL), and checked the result into a new repo named mod-aes. Tested against a Kafka environment and confirmed that FOLIO traffic is captured as messages in Kafka. Marking this ticket as complete. |