SOLVED

How to export a huge file through generic imports and exports in Jobs?


Level 2

Hi All,

We have a schema with a huge number of records, and the data size is greater than 3 GB.

We are trying to export this data through Jobs to our local machine, but every time we export the file we get the following error:

12/20/2016 5:46:24 AM    XSV-350122 An error occurred and the process has been stopped.

Also, the file does get downloaded, but only up to 2 GB.

Is there a way to export the complete data through Jobs or through a workflow script approach?

Please provide your suggestions or solutions.

Note: Through SFTP we were able to download the 3 GB file, but this can take a long time for both download and upload.

Thanks


6 Replies


Level 10

Hello,

Have you tried segmenting the data to export two or three smaller files, to see if this issue could be caused by the size of the final file? I'll check if there is a known limit or an easy workaround.

Florent.


Level 2

fleltuerto wrote...

Hello,

Have you tried segmenting the data to export two or three smaller files, to see if this issue could be caused by the size of the final file? I'll check if there is a known limit or an easy workaround.

Florent.

 

Hi Florent,

I have tried segmenting the data, but as there is no split option in Jobs, I'm unable to achieve the expected result. We have filtering conditions where we can filter the data, but we cannot segment it (this is as per my understanding, please correct me if I'm wrong).

I have tried to set up an external SFTP account between two instances, but was not successful.

It would be helpful if you could suggest any other approach that lets us export the data directly to a local machine.

Thanks


Correct answer by
Level 10

Ok, indeed the generic export function is not designed to handle big files like this. You could try using a targeting workflow like this instead:

  1. Use a Query activity to get the data you need.
  2. Optionally, if you want to segment the data so that the final files are smaller, add a Split activity and create as many segments as you need to divide the data coming from the Query.
  3. Add a Data extraction (file) activity at the end of each outbound transition coming from the Split activity. In this activity, you can specify the name of the file, the data to extract, and so on.

As there is no easy way to download files directly to the local machine from workflow activities, you can export them to the Campaign server so that they can be accessed over HTTP, or use a File transfer activity to make them available on an FTP server.
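If you prefer a script-based approach, a JavaScript activity in the workflow can also write the query results to a server-side file itself. Here is a minimal sketch, assuming the standard Campaign JSAPI (xtk.queryDef and the server-side File class); the nms:recipient schema, the selected fields, and the output path are placeholders you would adapt:

  // Query the records to export (placeholder schema and fields)
  var query = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select">
      <select>
        <node expr="@email"/>
        <node expr="@lastName"/>
      </select>
    </queryDef>);
  var result = query.ExecuteQuery();

  // Write a CSV file on the Campaign server (placeholder path)
  var file = new File("/usr/local/neolane/nl6/var/export/recipients.csv");
  file.open("w");
  file.writeln("email;lastName");
  for each (var row in result.recipient)
    file.writeln(row.@email + ";" + row.@lastName);
  file.close();

Note that ExecuteQuery loads the whole result set into memory, so for a 3 GB table you would still want to segment or page the query rather than run it in one pass.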

Hope this helps,

Florent.


Level 2

So far I have tried to export the file directly through Data extraction, but, possibly due to firewall restrictions, it is not getting exported.

I also tried connecting with PuTTY, and that didn't work either.

Finally, I used an SFTP client (Xftp) to transfer files between the servers, and that worked.

But the concern here is that even though it copies the file to the destination, it requires client-side network connectivity, as it downloads a temporary file while transferring the data.

 

If there is any other approach apart from the above, please let me know, as it might help reduce a lot of effort.

Thanks!


Level 10

What file format are you using, e.g. CSV? I have implemented something similar by breaking the file into smaller 500 MB files, using a JS script to write them in zipped format to a network file system, which works quite well. If this is something you want, I can help you with that.
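For illustration, here is a rough sketch of that splitting logic, assuming the Campaign JSAPI File class and execCommand() for server-side compression; the paths, schema, fields, and the 500 MB threshold are assumptions to adapt, not a tested implementation:

  var MAX_BYTES = 500 * 1024 * 1024;   // target size per part
  var part = 0, written = 0, file = null;

  // Close the current part, compress it, and open the next one
  function openNextPart() {
    if (file) {
      file.close();
      execCommand("gzip /var/export/part" + part + ".csv");  // assumed gzip on the server
    }
    part++;
    file = new File("/var/export/part" + part + ".csv");
    file.open("w");
    file.writeln("email;lastName");    // repeat the CSV header in every part
    written = "email;lastName".length + 1;
  }

  openNextPart();
  var query = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select">
      <select><node expr="@email"/><node expr="@lastName"/></select>
    </queryDef>);
  for each (var row in query.ExecuteQuery().recipient) {
    var line = row.@email + ";" + row.@lastName;
    if (written + line.length > MAX_BYTES)
      openNextPart();                  // start a new ~500 MB part
    file.writeln(line);
    written += line.length + 1;
  }
  file.close();
  execCommand("gzip /var/export/part" + part + ".csv");

Writing the parts to a network file system mounted on the Campaign server, as described above, avoids the 2 GB single-file issue entirely.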


Level 2

Hi Amit,

Yes, I'm trying to export in CSV format. However, I feel the approach I'm taking is bad practice. Working with a local file of huge size (3 GB) is not advisable, as it may cause loss of data during any importing activities.

Thank you for your help and effort, fleltuerto and Amit_Kumar!