Hi All,
We have a schema with a huge number of records, and the size of the data is greater than 3 GB.
We are trying to export this data using Jobs to our local machine, but every time we export the file we get the following error:
12/20/2016 5:46:24 AM XSV-350122 An error occurred and the process has been stopped.
Also, the file does get downloaded, but only up to 2 GB.
Is there a way to export the complete data through Jobs or a workflow script approach?
Please provide your suggestions or solutions.
Note: through SFTP we were able to download the 3 GB file, but this can take a lot of time for both download and upload.
Thanks
Hello,
Have you tried segmenting the data to export 2 or 3 smaller files, to see whether this issue is caused by the size of the final file? I'll check if there is a known limit or an easy workaround.
Florent.
fleltuerto wrote...
Hello,
Have you tried segmenting the data to export 2 or 3 smaller files, to see whether this issue is caused by the size of the final file? I'll check if there is a known limit or an easy workaround.
Florent.
Hi Florent,
I have tried segmenting the data, but as there is no split option in Jobs, I'm unable to achieve the expected result. We have filtering conditions where we can filter the data, but we cannot segment it (this is as per my understanding; please correct me if I'm wrong).
I have tried to set up an external SFTP account between two instances, but was not successful.
It would be helpful if you could suggest any other approach that lets us export the data directly to the local machine.
Thanks
OK, indeed the generic export function is not designed to handle big files like this. Maybe you can try using targeting workflows like this instead:
As there is no easy way to download files directly to the local machine from workflow activities, you can export the file on the Campaign server so that it can be accessed over HTTP, or use a file transfer activity to make it available on an FTP server.
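For illustration, here is a minimal sketch of a JavaScript activity you could place right after the Data extraction (file) activity; it copies the produced file (exposed in vars.filename) into a web-accessible directory on the Campaign server. The destination path and the example URL are assumptions, so adapt them to your installation:

// Minimal sketch, to run in a JS activity placed right after a
// "Data extraction (file)" activity; that activity exposes the
// path of the file it produced in vars.filename.
var src = vars.filename;
// Assumed destination: a directory actually served by your web server
var dst = "/usr/local/neolane/nl6/var/myInstance/export/extract.csv";

// execCommand runs a shell command on the Campaign application server
// and returns [exitCode, output]; the second argument avoids throwing
// on a non-zero exit code
var res = execCommand("cp " + src + " " + dst, true);
if (res[0] != 0)
  logError("Copy failed: " + res[1]);
else
  logInfo("File should now be reachable over HTTP, e.g. https://myserver/export/extract.csv");

The FTP variant is even simpler: keep the extraction as is and add a File transfer activity after it, pointed at an external FTP/SFTP account, so the file lands on a server you can reach.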
Hope this helps,
Florent.
So far, I have tried to export the file directly through Data extraction, but, maybe due to firewall restrictions, it is not getting exported.
I also tried to connect through PuTTY, and that didn't work either.
Finally, the last thing I did was use SFTP software (Xftp) to transfer files between the servers, and that worked.
But the concern here is that even though it copies the file to the destination, it requires client-side network connectivity, as it downloads a temporary file while transferring the data.
If there is any other approach apart from the above, please let me know, as it might help reduce a lot of effort.
Thanks
What kind of file format are you using, i.e. CSV? I have implemented something similar by breaking the file into smaller 500 MB files, using a JS script that writes them in zipped format to a network file system, which works quite well. If this is something you want, I can help you with that.
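For what it's worth, a rough sketch of that kind of script is below. The schema (nms:recipient), the selected fields, the output directory, and the gzip call are all assumptions to adapt; it pages through the table by primary key, rolls to a new CSV file every ~500 MB, and compresses each finished chunk:

// Rough sketch: stream a large table to CSV in ~500 MB chunks and
// gzip each finished chunk. Adapt schema, fields, and paths.
var MAX_BYTES = 500 * 1024 * 1024;   // target chunk size (assumed)
var dir = "/mnt/share/export/";      // assumed network file system mount
var chunk = 0, written = 0, lastId = 0;
var f = null;

function rollFile() {
  if (f != null) {
    f.close();
    execCommand("gzip " + dir + "chunk_" + chunk + ".csv", true); // compress the finished chunk
    chunk++;
  }
  f = new File(dir + "chunk_" + chunk + ".csv");
  f.open("w");
  written = 0;
}

rollFile();
for (;;) {
  // Page through the table in 50,000-row batches, ordered by primary key
  var res = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select" lineCount="50000">
      <select><node expr="@id"/><node expr="@email"/></select>
      <where><condition expr={"@id > " + lastId}/></where>
      <orderBy><node expr="@id"/></orderBy>
    </queryDef>).ExecuteQuery();
  if (res.recipient.length() == 0)
    break;
  for each (var row in res.recipient) {
    var line = row.@id + ";" + row.@email;
    f.writeln(line);
    written += line.length + 1;
    lastId = parseInt(row.@id);
    if (written >= MAX_BYTES)
      rollFile();          // start a new chunk once this one is big enough
  }
}
f.close();
execCommand("gzip " + dir + "chunk_" + chunk + ".csv", true);

Writing in fixed-size chunks also means a failed transfer only costs you one chunk rather than the whole 3 GB file.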
Hi Amit,
Yes, I'm trying to export in CSV format. However, I feel the approach I'm taking appears to be bad practice: working on a local file of a huge size (3 GB) is not advisable, as this may cause loss of data during any import activities.
Thank you for your help and effort, fleltuerto and Amit_Kumar.