Currently I am facing issues with ACS import jobs whenever more than 1000 files are loaded in a single run.
My import workflow has one Transfer File activity to pick files from an external SFTP, a Load File activity to match the data types, an Enrichment to redirect values, and a Split and an Update Data activity.
From my observation, if a single file with 2 million records comes in an hour there is no issue, but if many files (around 1000) arrive in a single run, the job hangs, keeps running for 1-2 days, and then fails.
Since ACS has no scripting option to pick only the first 50 files from the SFTP, we just schedule the workflow every hour to pick up files, but when the number of files per run increases, the workflow hangs.
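Since the limiting can't be done inside ACS, one workaround is to do it on the SFTP source host itself: stage incoming files in a holding directory and move only a bounded batch into the directory the Transfer File activity polls. This is a minimal sketch under that assumption; the directory names, the batch size of 50, and running it from the source host are all hypothetical choices, not anything ACS provides.

```python
import os
import shutil

def batch_move(pending_dir, pickup_dir, max_files=50):
    """Move at most max_files files from a holding directory into the
    directory that the ACS Transfer File activity polls, so each hourly
    run sees a bounded batch instead of the full backlog."""
    names = sorted(os.listdir(pending_dir))
    moved = 0
    for name in names:
        src = os.path.join(pending_dir, name)
        if not os.path.isfile(src):
            continue  # skip subdirectories
        shutil.move(src, os.path.join(pickup_dir, name))
        moved += 1
        if moved >= max_files:
            break
    return moved
```

Scheduled via cron on the source host just before each hourly ACS run, this keeps the per-run file count constant regardless of how large the backlog grows.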
Can someone provide a better solution, and confirm the maximum number of files per run of an import workflow for which the Load, Enrichment, Split, and Update activities will work properly?
I don't think that workflows are designed to handle that much. I don't know if there's a particular limit but when several files are picked, activities are run as many times as the number of files, which can lead to a huge number of working tables if there are many files.
I'd recommend checking with support whether there is a workaround. I don't know of any besides working on your end to reduce the number of files: as you said, it's better to have one huge file than 1000 small ones.
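Reducing the file count on your end could be as simple as concatenating the small delimited files into one large file before (or after) the SFTP drop. A minimal sketch, assuming all files share the same header row and encoding; the function name and parameters are illustrative, not part of any ACS API:

```python
import glob
import os

def merge_files(input_dir, output_path, pattern="*.csv"):
    """Concatenate many small delimited files into one large file,
    keeping only the first file's header row. Assumes every input
    file has an identical header and UTF-8 encoding."""
    paths = sorted(glob.glob(os.path.join(input_dir, pattern)))
    with open(output_path, "w", encoding="utf-8") as out:
        for i, path in enumerate(paths):
            with open(path, encoding="utf-8") as src:
                header = src.readline()
                if i == 0:
                    out.write(header)  # write the header once
                for line in src:
                    out.write(line)
    return len(paths)
```

With this, the workflow's activities run once over one working table instead of once per file, which is exactly the multiplication effect described above.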
In our project we want to process multiple files at a time. We will have at most 30 files in a single run, but we are not able to get multiple files processed: the workflow executes just one file even though all the files are downloaded by the Transfer File activity.