Hi,
I have an import workflow that loads a file from SFTP every day. I want to add an automated step to the workflow that moves each successfully loaded file into an 'Archive' folder I created on the SFTP server for old files. Please let me know if that is possible, and if so, how I can do it.
Hi,
If you're using the File collector activity, there's a setting to choose the folder where files are archived upon ingest. If you're not using the File collector, you can manage files with execCommand() ('move' on Windows, 'mv' on Linux), with extra flexibility such as zipping archives and purging them based on different properties.
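For reference, here is a minimal sketch of what such an execCommand() call could run on a Linux server. The directory names, file name, and 30-day retention window are assumptions for illustration, not your actual setup.

```shell
# Hypothetical paths: adjust to your own storage and archive directories.
SRC_DIR="import/inbox"
ARCHIVE_DIR="import/archive"
mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"

# Simulate the day's inbound file (in practice, the import step puts it here).
printf 'id,value\n1,42\n' > "$SRC_DIR/daily_load.csv"

# Move the ingested file into the archive, stamping it with today's date
# so repeated loads of the same filename don't overwrite each other.
mv "$SRC_DIR/daily_load.csv" "$ARCHIVE_DIR/daily_load_$(date +%Y%m%d).csv"

# Optional: compress archived files and purge anything older than 30 days.
gzip -f "$ARCHIVE_DIR"/*.csv
find "$ARCHIVE_DIR" -name '*.csv.gz' -mtime +30 -delete
```

The date stamp and the find-based purge are the kind of "extra flexibility" mentioned above; tune the retention window to your own policy.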
Thanks,
-Jon
Hi Jon,
It is so great to see your response. I am no longer with Accenture, so I sometimes get stuck trying to figure things out on my own when I can't find help within my new firm. Thank you so much for taking the time to respond.
I am not sure how to use or configure execCommand(), so I tried the File transfer activity instead: I set my archive location under the 'File historization settings' and reran the workflow, but it still didn't transfer the file to the archive folder. Screenshot attached.
I also tried the File collector, and it keeps giving me an error saying the source directory doesn't exist. Since the file is on SFTP, how do I configure the external account in the File collector so it knows where to pick up the file?
I heard it through the grapevine; congrats on the new gig. If you're pulling the file down from an external SFTP, you have to precede the collector with a File transfer activity (or execCommand() again, for more features), then set the storage directory to the same path for both.
So I did as you said, and I am not sure why it keeps saying there was no file and ends the workflow. In the File collector I am giving the exact filename.csv for it to look for, and even then it is not finding the file. I even have the storage directory set exactly the same for both, so I'm not sure what I'm doing wrong. Also, is there simple code or syntax you can share if I wanted to use execCommand()?
In the File collector's initialization script, can you add logInfo(execCommand('ls /shared/dir'))? It will list the files in the directory.
These kinds of issues come down to either invalid paths or insufficient permissions.
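To illustrate the two failure modes, here is a sketch of the same diagnostic as plain shell commands (the directory name is a placeholder; substitute your actual storage directory):

```shell
SHARED_DIR="var/files/import"   # placeholder: use your actual storage dir
mkdir -p "$SHARED_DIR"          # in practice this directory already exists

# What logInfo(execCommand('ls ...')) would surface in the workflow log:
ls -l "$SHARED_DIR"

# Check path and permissions separately, since both produce "file not found"
# style symptoms in the workflow.
if [ ! -d "$SHARED_DIR" ]; then
  echo "invalid path: $SHARED_DIR"
elif [ ! -r "$SHARED_DIR" ]; then
  echo "no read permission on: $SHARED_DIR"
else
  echo "path OK"
fi
```

If the listing is empty or errors out, the path is wrong; if the listing works but the workflow user can't read the directory, it's a permissions problem.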
Tried that, got this error.
Also, do you think the file permissions for the CSV file have to be 777, like the folder permissions, where it needs read, write, and delete access? I don't think that applies to the CSV files since they keep changing, but I still wanted to check with you.
You have to change the path there to your shared directory. 777 isn't needed here, since you're not writing and have only one user.
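To make the permissions point concrete, a quick sketch (file and directory names are assumptions): read access for the workflow's user is all that ingesting requires, so something like 644 is sufficient; 777 would also grant write and execute to every user, which isn't needed.

```shell
# Hypothetical drop directory and daily file.
mkdir -p drop && touch drop/daily_load.csv

# 644: owner can read/write, everyone else read-only. Enough for a step
# that only reads the file; 777 is unnecessary.
chmod 644 drop/daily_load.csv

# Confirm the file is readable, which is what the ingest step actually needs.
test -r drop/daily_load.csv && echo "readable"
```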