Ok, indeed the generic export function is not designed to handle big files like this. You could try using a targeting workflow like this instead:
Use a Query activity to get the data you need.
Optionally, if you want to segment the data so that the final files are smaller, add a Split activity and create as many segments as you need to divide the data coming from the Query.
Add a Data extraction (file) activity at the end of each outbound transition coming from the Split activity. In this activity, you can specify the name of the file, the data you need to extract, etc.
As there is no easy way to download files directly to the local machine from workflow activities, you can export them to the Campaign server so that they can be accessed over HTTP, or use a file transfer activity to make them available on an FTP server.
Yes, I'm trying to export in CSV format. However, I feel the approach I'm taking is bad practice. Working on a local file of huge size (3 GB) is not advisable; it may cause loss of data during any importing activities.
What kind of file format are you using, i.e. CSV? I have implemented something similar by breaking a file into smaller 500 MB files, using a JS script to write the files in zipped format to a network file system, which works quite well. If this is something you want, I can help you with that.
Have you tried segmenting the data to export 2 or 3 smaller files, to see if this issue could be caused by the size of the final file? I'll check if there is a known limit or an easy workaround.
I have tried segmenting the data, but as there is no Split option in Jobs, I'm unable to achieve the expected result. We have filtering conditions where we can filter data, but we cannot segment the data (this is as per my understanding, please correct me if I'm wrong).
I have tried to set up an external SFTP account between two instances but was not successful.
It would be helpful if you could suggest any other approach where we can export the data directly to a local machine.