Is there a technical solution to import an Excel file with 60k+ rows into AEM?
I need a technical solution for importing an Excel file with 60k+ rows and multiple price columns into AEM (Adobe Experience Manager). The records should be filterable, searchable, and paginated, and the results should later be exportable as CSV. Content Fragments are not a viable option, because we would still have to parse all the records just to populate the available filters (some filters depend on the values of others).
The solution should be a migration of this functionality: https://billing.christianacare.org/transparency/chargemaster?combine=knee&ipid=1&ipid2=2&hsp=Wilmington+Hospital
Reading the file on each request/search is not a viable solution. The file we received is over 7 MB, and parsing it, mapping the rows into something Java can work with, filtering them, and returning the result on every search would be far too slow.
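To make that concrete, this is roughly what the per-request approach would look like with Apache POI (the stream would come from the uploaded DAM asset; the class name and the assumption that column 0 holds the description are purely illustrative). Loading and walking 60k+ rows like this on every search is exactly what we want to avoid:

```java
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class NaiveChargemasterSearch {

    // Walks the entire workbook on every call, which is the cost we want to avoid.
    public List<String> search(InputStream xlsxStream, String term) throws Exception {
        List<String> matches = new ArrayList<>();
        try (Workbook workbook = WorkbookFactory.create(xlsxStream)) {
            Sheet sheet = workbook.getSheetAt(0);
            for (Row row : sheet) {
                Cell cell = row.getCell(0); // column 0 assumed to hold the description (illustrative)
                String description = cell == null ? "" : cell.toString();
                if (description.toLowerCase().contains(term.toLowerCase())) {
                    matches.add(description);
                }
            }
        }
        return matches;
    }
}
```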
Displaying the data is not the issue, nor is the frontend the bottleneck. We need a data structure that can be filtered, sorted, and paginated in AEM itself, using Java.
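One direction we are considering (but not committed to) is importing each Excel row once as its own node and letting the repository handle filtering, sorting, and paging. A minimal sketch, where the node path, node type, and property names are assumptions rather than an existing structure:

```java
import javax.jcr.Session;
import javax.jcr.query.Query;
import javax.jcr.query.QueryManager;
import javax.jcr.query.QueryResult;

public class ChargemasterQuery {

    // Returns one page of rows for a given hospital, sorted by description.
    public QueryResult findPage(Session session, String hospital, int pageSize, int page) throws Exception {
        QueryManager qm = session.getWorkspace().getQueryManager();
        String sql2 = "SELECT * FROM [nt:unstructured] AS row "
                + "WHERE ISDESCENDANTNODE(row, '/content/chargemaster/data') " // assumed import location
                + "AND row.[hospital] = $hospital "                            // assumed property name
                + "ORDER BY row.[description]";
        Query query = qm.createQuery(sql2, Query.JCR_SQL2);
        query.bindValue("hospital", session.getValueFactory().createValue(hospital));
        query.setLimit(pageSize);                // page size
        query.setOffset((long) page * pageSize); // zero-based page index
        return query.execute();
    }
}
```

At 60k+ nodes we assume this would also need a custom Oak index on the filtered and sorted properties to perform acceptably, which is part of what we are trying to validate.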
These files are updated roughly every six months. The data from the file will be consumed by multiple instances of the same component as well as by different components, and we do not want to have to update dozens of components every time the file changes.
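Our working idea for that is to hide the data behind a single service that every component consumes, so only that one place changes if the file format or the storage changes. Something along these lines (the interface is ours, not an existing API):

```java
import java.util.List;
import java.util.Map;

/**
 * Hypothetical facade all components would consume. Only this service and its
 * implementation would need to change when the file or its storage changes.
 */
public interface ChargemasterService {

    /** One page of rows matching the active filters (column name -> value). */
    List<Map<String, String>> find(Map<String, String> filters, String sortBy, int pageSize, int page);

    /** Distinct values for a column, given the other active filters (for dependent filters). */
    List<String> filterValues(String column, Map<String, String> activeFilters);

    /** The filtered result set rendered as CSV, for the export requirement. */
    String exportCsv(Map<String, String> filters);
}
```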
Also, regardless of how the data is stored, publication needs to be taken into account: the file is uploaded on the Author instance, but the data must also be available on Publish.
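Our assumption is that if the parsed data ends up in the repository on Author, we would have to activate it ourselves after each import so Publish sees the same content, e.g. via the Replicator API (the path and the component are placeholders):

```java
import javax.jcr.Session;

import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

import com.day.cq.replication.ReplicationActionType;
import com.day.cq.replication.ReplicationException;
import com.day.cq.replication.Replicator;

@Component(service = ChargemasterPublisher.class)
public class ChargemasterPublisher {

    @Reference
    private Replicator replicator;

    // Activate the root of the imported data tree after each import. We assume the
    // subtree is carried along; if not, we would iterate over the row nodes.
    public void publishImportedData(Session session) throws ReplicationException {
        replicator.replicate(session, ReplicationActionType.ACTIVATE, "/content/chargemaster/data");
    }
}
```

Is this the right direction, or is there a better-suited way to store and query this kind of tabular data in AEM?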