Hi,
We are configuring AEP to perform the ingestion of several file types. Due to security restrictions, we place all the files into the root directory of our SFTP server.
Is there any way to configure the flow or the SFTP source connection to specify a file name pattern?
We need to configure several flows, but all of the SFTP sources must point to the same SFTP server.
In the source connection parameters there is:
path | string
Is it possible to include wildcards in it, like "/myfile*.csv"?
BR
The Adobe Experience Platform dataflow will ingest every file present in the directory whose timestamp is greater than the last time the dataflow ran. Currently there is no ability to set up a restriction rule based on file name or type.
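To make that pickup rule concrete, here is a minimal Python sketch of the behaviour. This is illustrative only; it mimics the timestamp check described above, it is not AEP's actual implementation, and the directory path is a placeholder:

```python
# Minimal sketch of the pickup rule: every run collects any file whose
# modification time is newer than the previous run. Illustrative only;
# this mimics the described AEP behaviour, it is not AEP code.
import os
import time

def files_to_ingest(directory: str, last_run_epoch: float) -> list[str]:
    """Return all files modified since the last dataflow run,
    regardless of their names -- there is no name/pattern filter."""
    picked = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) > last_run_epoch:
            picked.append(path)
    return picked

# Example: everything dropped in the last hour gets picked up,
# whether it is myfileA.csv, myfileB.csv, or anything else.
print(files_to_ingest("/sftp-root", time.time() - 3600))
```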
Hello @AlfMep ,
As far as I know, there is no functionality to filter files by name directly when reading files from the same folder in a single data flow from an SFTP source.
For more details, you can check the documentation here: https://experienceleague.adobe.com/en/docs/experience-platform/sources/ui-tutorials/dataflow/cloud-s...
Scheduling Flows Based on File Drop Time
If files follow different patterns and are dropped at different times (e.g., file type A at 1:00 AM and file type B at 2:00 AM), you can schedule a separate dataflow to run shortly after each drop, so that each run only picks up the files that have landed since the previous run (see the sketch below).
While this isn't ideal, it can work as a temporary workaround; it is not a future-proof solution if you are looking for something long term.
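As a hedged sketch of that workaround, the scheduleParams of each dataflow could be offset from the drop times. The field names below follow the Flow Service API documentation as I recall them, and the drop times and 15-minute offsets are assumptions, so verify against the current docs before use:

```python
# Hedged sketch of the scheduling workaround: give each dataflow a
# scheduleParams block whose startTime sits just after that file type's
# drop time, so each run only sees files newer than the last run.
# Field names are assumed from the Flow Service API docs; verify them.
from datetime import datetime, timezone

def start_epoch(hour: int, minute: int) -> int:
    """Epoch seconds for today's HH:MM UTC (assumed drop times)."""
    now = datetime.now(timezone.utc)
    return int(now.replace(hour=hour, minute=minute, second=0,
                           microsecond=0).timestamp())

# File type A is dropped at 01:00; run its flow at 01:15 daily.
flow_a_schedule = {"startTime": start_epoch(1, 15),
                   "frequency": "day", "interval": 1}

# File type B is dropped at 02:00; run its flow at 02:15 daily.
flow_b_schedule = {"startTime": start_epoch(2, 15),
                   "frequency": "day", "interval": 1}
```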
Best Practice: Use Separate Subfolders for File Types
A better approach would be to work with your internal team to organize files into separate subdirectories for each file type, e.g., files of type A in /typeA and files of type B in /typeB. This is a more robust and scalable solution for managing your ingestion flows effectively.
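If you can get a sorting step added upstream, a minimal Python sketch of it could look like this. The root path, patterns, and target folders are all assumptions for illustration:

```python
# Hedged sketch of an upstream sorting step: before ingestion, move
# each drop into a per-type subdirectory so each AEP dataflow can
# point at its own folder. Paths and patterns are placeholders.
import glob
import os
import shutil

SORT_RULES = {
    "typeA*.csv": "/sftp-root/typeA",   # files of type A
    "typeB*.csv": "/sftp-root/typeB",   # files of type B
}

for pattern, target_dir in SORT_RULES.items():
    os.makedirs(target_dir, exist_ok=True)
    for src in glob.glob(os.path.join("/sftp-root", pattern)):
        shutil.move(src, os.path.join(target_dir, os.path.basename(src)))
```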
Kind regards,
Parvesh
Hi,
Does that mean the flow always collects all the files in the source folder, without checking the file name?
BR
Alf
Hello @AlfMep ,
There is no straightforward solution on the AEP source side with only one level of folder. However, a solution I have designed for such organizational restrictions is to have the upstream system compress and archive each flow's file(s) into an archive with a single static file name, placed in the common folder.
Supported Compression formats: https://experienceleague.adobe.com/en/docs/experience-platform/sources/ui-tutorials/dataflow/cloud-s...
Working through the details above, and assuming tar/gzip for the solution:
#1 File Generation Logic
---------------------
Source1 --> generate a static compressed file called source1.tar.gz containing all of that source's relevant file(s), whatever their naming convention
Source2 --> generate a static compressed file called source2.tar.gz containing all of that source's relevant file(s), whatever their naming convention
Source3 --> generate a static compressed file called source3.tar.gz containing all of that source's relevant file(s), whatever their naming convention
#2 Update each corresponding source flow to point at its static archive file.
#3 Transfer these files to the SFTP server, and that's it (see the sketch below).
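As a hedged sketch of steps #1 and #3, assuming Python with the paramiko library upstream; the host, credentials, file patterns, and paths are all placeholders:

```python
# Hedged sketch of steps #1 and #3: bundle each source's files into a
# single static-named tar.gz and upload it to the shared SFTP folder.
# Host, credentials, patterns, and paths are placeholders.
import glob
import os
import tarfile
import paramiko

def archive(pattern: str, archive_name: str) -> str:
    """Pack all files matching `pattern` into one tar.gz archive."""
    with tarfile.open(archive_name, "w:gz") as tar:
        for path in glob.glob(pattern):
            tar.add(path, arcname=os.path.basename(path))
    return archive_name

def upload(local_path: str, remote_path: str) -> None:
    """Push the archive to the common SFTP folder."""
    transport = paramiko.Transport(("sftp.example.com", 22))
    transport.connect(username="user", password="secret")
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        sftp.put(local_path, remote_path)
    finally:
        sftp.close()
        transport.close()

# Source1's files (any naming convention) become one static archive.
upload(archive("source1_*.csv", "source1.tar.gz"), "/source1.tar.gz")
```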
This solution works seamlessly and also has benefits in terms of transfer rates.
Let me know if you need more help and do mark this thread accordingly.
~cheers,
Naresh Nakirikanti.