We are using Data Warehouse exports.
The files we receive are quite big: on average about 1 GB per hour.
Our database does not support the ZIP format (the same is true for Athena, Redshift, Snowflake, etc.), so we first have to convert the files to GZIP.
Since unzipping large files consumes a lot of time and resources, we would like to submit an idea:
- add GZIP as a compression format for the exports
- automatically split large files into smaller GZIP chunks

This is the same way Data Feeds currently exports the data.
Implementing this idea would save significant cost: once we receive .gz exports, we could retire the post-processing jobs whose only purpose is to unzip and re-gzip the data.
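For context, the workaround we run today looks roughly like the following sketch (file names and paths are illustrative, not our actual job). It streams each ZIP member straight into a standalone .gz file so a 1 GB file never has to sit fully in memory, yet it still burns compute on every hourly export:

```python
import gzip
import shutil
import zipfile
from pathlib import Path

def zip_to_gzip(zip_path: str, out_dir: str) -> list[str]:
    """Re-compress every member of a ZIP export as a standalone .gz file.

    This is the "unzip and re-gzip" post-processing step that native
    GZIP exports would make unnecessary.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with zipfile.ZipFile(zip_path) as zf:
        for member in zf.namelist():
            if member.endswith("/"):  # skip directory entries
                continue
            gz_path = out / (Path(member).name + ".gz")
            # Stream decompressed member bytes directly into the gzip
            # writer instead of extracting to disk first.
            with zf.open(member) as src, gzip.open(gz_path, "wb") as dst:
                shutil.copyfileobj(src, dst)
            written.append(str(gz_path))
    return written
```

If the exports arrived as GZIP (and pre-split into chunks), this whole step, and the temporary storage and schedule slack it needs, would simply disappear.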