Speed up big queries
Hi all,

I am looking for the best approach to running the workflow below several times per day. There are two big tables (connected via FDA!). The AUDIT table holds the IDs and modification timestamps of records in the CORE table. Say the workflow runs every 30 minutes: each run needs to fetch the rows from CORE that were modified after the workflow's last execution.
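To make the delta fetch concrete, here is a minimal sketch in JS of building the AUDIT query from a persisted last-run timestamp. The function and column names (`buildAuditDeltaSql`, `MODIFIED_TS`) are my assumptions, not anything from the actual tables; in a real connection you would pass the timestamp as a bind variable rather than concatenating it:

```javascript
// Build the AUDIT delta query for one incremental run.
// lastRunIso is the timestamp persisted by the previous execution
// (e.g. stored in the option mentioned below). Column name MODIFIED_TS
// is assumed for illustration.
function buildAuditDeltaSql(lastRunIso) {
  return (
    "SELECT ID FROM AUDIT " +
    "WHERE MODIFIED_TS > TO_TIMESTAMP('" + lastRunIso +
    "', 'YYYY-MM-DD\"T\"HH24:MI:SS')"
  );
}

// Example: fetch everything modified since the last 30-minute run
const deltaSql = buildAuditDeltaSql("2024-01-01T10:30:00");
```

After each successful run, the stored timestamp would be advanced to the run's start time so the next execution picks up from there.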


  • How can I pass the date of the last execution into the AUDIT query (for example, via an option that would be created to store that date)?
  • What about passing the IDs from the AUDIT results into the CORE query? This can certainly be done in a JS file, but is it possible in the GUI? Oracle is the backend, so we can probably expect an error if we pass more than 1000 values in an IN clause (... WHERE ID IN (1000 different values) ...).
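On the second point: Oracle raises ORA-01795 ("maximum number of expressions in a list is 1000") beyond 1000 literals, and a common workaround is to split the ID list into chunks and OR the IN lists together (or run one query per chunk). A sketch in JS, assuming plain numeric IDs; the helper names are made up for illustration:

```javascript
// Split ids into sub-arrays of at most `size` elements
// (Oracle allows at most 1000 literals per IN list).
function chunk(ids, size) {
  const out = [];
  for (let i = 0; i < ids.length; i += size) {
    out.push(ids.slice(i, i + size));
  }
  return out;
}

// Build: ID IN (...) OR ID IN (...) OR ...
// so the whole predicate stays under the per-list limit.
function buildInClause(ids, size = 1000) {
  return chunk(ids, size)
    .map((c) => "ID IN (" + c.join(",") + ")")
    .join(" OR ");
}

// Example: 2500 IDs become three OR-ed IN lists
// const where = buildInClause(allIds);
```

A tidier alternative, if the tool allows it, is to avoid shipping the IDs at all and let Oracle join the two tables (e.g. `WHERE ID IN (SELECT ID FROM AUDIT WHERE ...)`), which sidesteps the 1000-item limit entirely.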

Is there a better way to optimize fetching data from these tables via FDA in general?

Thanks in advance.



Accepted Solutions (1)
