
SOLVED

Configure Connection or Flow to retrieve files via SFTP by pattern name


Level 1

Hi,

 

We are configuring AEP to ingest several file types. We place all the files into the root directory of our SFTP server due to security restrictions.

Is there any way to configure the flow or the SFTP source connection to specify the file name pattern?

 

We need to configure several flows, but all of the SFTP sources must point to the same SFTP server.

 

https://developer.adobe.com/experience-platform-apis/references/flow-service/#operation/createSource...

For the "path" parameter (type: string), is it possible to include wildcards, like "/myfile*.csv"?
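
For context, here is roughly where that parameter sits in a createSourceConnection call (a sketch only; the headers follow the standard Platform API pattern, and the connection spec ID is a placeholder, not the real SFTP one):

import requests

FLOW_SERVICE = "https://platform.adobe.io/data/foundation/flowservice"

headers = {
    "Authorization": "Bearer {ACCESS_TOKEN}",
    "x-api-key": "{API_KEY}",
    "x-gw-ims-org-id": "{ORG_ID}",
    "x-sandbox-name": "{SANDBOX_NAME}",
    "Content-Type": "application/json",
}

payload = {
    "name": "SFTP source connection",
    "baseConnectionId": "{BASE_CONNECTION_ID}",
    "data": {"format": "delimited"},
    "params": {
        # The docs describe "path" as a literal directory or file path;
        # a glob such as "/myfile*.csv" is exactly what we are asking about.
        "path": "/myfile*.csv",
    },
    "connectionSpec": {"id": "{SFTP_CONNECTION_SPEC_ID}", "version": "1.0"},
}

resp = requests.post(f"{FLOW_SERVICE}/sourceConnections", headers=headers, json=payload)
print(resp.status_code, resp.json())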

 

BR



4 Replies


Community Advisor

Hello @AlfMep , 

 

As far as I know, there is no functionality to filter files by name directly when reading files from the same folder in a single data flow from an SFTP source.

For more details, you can check the documentation here: https://experienceleague.adobe.com/en/docs/experience-platform/sources/ui-tutorials/dataflow/cloud-s...

Suggested Workarounds:

  1. Scheduling Flows Based on File Drop Time
    If files follow different patterns and are dropped at different times (e.g., file type A at 1:00 AM and file type B at 2:00 AM):

    • Configure Flow A to run at 1:05 AM to process file type A.
    • Configure Flow B to run at 2:05 AM to process file type B.
    • To avoid Flow A picking up file type B, ensure that some required fields are unique to the file type. This way, the ingestion fails gracefully if Flow A encounters an unexpected file format.

    While this isn’t an ideal solution, it can work as a temporary workaround (see the sketch after this list). It is not future-proof, though, so it is not the best choice if you are looking for a long-term solution.

  2. Best Practice: Use Separate Subfolders for File Types
    A better approach would be to work with your internal team to organize files into separate subdirectories for each file type.

    • For example, place files of type A in /typeA and files of type B in /typeB.
    • Then configure each data flow to read from its respective directory (also shown in the sketch after this list).
    • This will ensure accurate and timely ingestion without unnecessary complexity.

This is a more robust and scalable solution for managing your ingestion flows effectively.
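
For illustration, here is a rough sketch of both workarounds via the Flow Service API (Python; all IDs, the daily frequency, and the placeholder values are assumptions to adapt to your sandbox and connection specs):

import requests

FLOW_SERVICE = "https://platform.adobe.io/data/foundation/flowservice"
HEADERS = {
    "Authorization": "Bearer {ACCESS_TOKEN}",
    "x-api-key": "{API_KEY}",
    "x-gw-ims-org-id": "{ORG_ID}",
    "x-sandbox-name": "{SANDBOX_NAME}",
    "Content-Type": "application/json",
}

def create_source(name, path):
    # Workaround 2: one source connection per subdirectory (/typeA, /typeB).
    payload = {
        "name": name,
        "baseConnectionId": "{BASE_CONNECTION_ID}",
        "data": {"format": "delimited"},
        "params": {"path": path},
        "connectionSpec": {"id": "{SFTP_CONNECTION_SPEC_ID}", "version": "1.0"},
    }
    r = requests.post(f"{FLOW_SERVICE}/sourceConnections", headers=HEADERS, json=payload)
    return r.json()["id"]

def create_flow(name, source_id, first_run_epoch):
    # Workaround 1: stagger startTime so Flow A first runs at 1:05 AM
    # and Flow B at 2:05 AM, each repeating daily.
    payload = {
        "name": name,
        "flowSpec": {"id": "{FLOW_SPEC_ID}", "version": "1.0"},
        "sourceConnectionIds": [source_id],
        "targetConnectionIds": ["{TARGET_CONNECTION_ID}"],
        "scheduleParams": {
            "startTime": str(first_run_epoch),  # epoch seconds of the first run
            "frequency": "day",
        },
    }
    r = requests.post(f"{FLOW_SERVICE}/flows", headers=HEADERS, json=payload)
    return r.json()["id"]

source_a = create_source("Type A source", "/typeA")
source_b = create_source("Type B source", "/typeB")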

 

Kind regards,
Parvesh

 


Level 1

Hi,

Does that mean the flow always collects all the files existing in the source folder, without checking the file name?

 

BR

Alf


Correct answer by
Employee

The Adobe Experience Platform dataflow will ingest all files present in a directory whose timestamp is greater than the last time the dataflow ran. Currently it does not have the ability to set up a restriction rule based on file name or type to decide which data to ingest.
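
In other words, the pick-up rule behaves roughly like this (an illustrative sketch only, not AEP's actual implementation):

import os

def files_to_ingest(directory, last_run_epoch):
    picked = []
    for name in os.listdir(directory):
        full = os.path.join(directory, name)
        # Every file modified since the previous run is picked up;
        # the file name/pattern is never consulted.
        if os.path.isfile(full) and os.path.getmtime(full) > last_run_epoch:
            picked.append(full)
    return picked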


Community Advisor

Hello @AlfMep ,

 

There is no straightforward solution on the AEP source side with only one folder level. However, a solution I have designed for such organizational restrictions is to have the upstream system compress and archive the file(s) for each flow into an archive with a single static file name, placed in the common folder.

 

Supported Compression formats: https://experienceleague.adobe.com/en/docs/experience-platform/sources/ui-tutorials/dataflow/cloud-s...

 

Working through the details above, using tar/gzip (tar.gz) for the solution:

 

#1 File Generation Logic

---------------------

Source1 --> generate a static compressed file called source1.tar.gz containing all relevant file(s) for that source, whatever their naming convention

Source2 --> generate a static compressed file called source2.tar.gz containing all relevant file(s) for that source, whatever their naming convention

Source3 --> generate a static compressed file called source3.tar.gz containing all relevant file(s) for that source, whatever their naming convention

 

#2 Update the corresponding source flow to point at the static archive file.

 

#3 Transfer these archives to the SFTP server, and that's it (see the sketch below).
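
For illustration, a minimal sketch of steps #1 and #3 in Python (tarfile plus the paramiko library; the host, credentials, and paths are placeholders):

import glob
import os
import tarfile
import paramiko

# Step #1: pack every file for this source into source1.tar.gz,
# whatever the individual file names are.
with tarfile.open("source1.tar.gz", "w:gz") as archive:
    for path in glob.glob("outgoing/source1/*"):
        archive.add(path, arcname=os.path.basename(path))

# Step #3: drop the archive into the common SFTP root directory.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("sftp.example.com", username="user", password="secret")
sftp = client.open_sftp()
sftp.put("source1.tar.gz", "/source1.tar.gz")
sftp.close()
client.close()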

 

This solution works seamlessly and also brings benefits in terms of transfer rates.

 

Let me know if you need more help and do mark this thread accordingly.

 

~cheers,

Naresh Nakirikanti.