Hi,
we are using Data Warehouse requests and everything works fine.
Unfortunately, we spend a lot of resources unzipping the exports.
Is there a plan to change the compression type, or to add gzip compression, in the near future?
Like in Data Feeds, where larger files are automatically split into smaller ones?
ZIP is not a format supported by Snowflake, Athena, etc.
Instead of unzip -> gzip -> load, we would like to load the exports directly.
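For context, this is a minimal sketch of the unzip -> gzip step we have to run today before loading; the function name and paths are illustrative, not part of any Adobe tooling. It streams each ZIP member into a .gz file so the 1 GB+ exports are never held fully in memory:

```python
import gzip
import shutil
import zipfile
from pathlib import Path

def zip_to_gzip(zip_path: str, out_dir: str) -> list:
    """Re-compress every member of a Data Warehouse .zip export
    as .gz, streaming so large files stay out of memory."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with zipfile.ZipFile(zip_path) as zf:
        for member in zf.infolist():
            if member.is_dir():
                continue
            target = out / (Path(member.filename).name + ".gz")
            # Stream-copy: decompress from the ZIP, recompress as gzip.
            with zf.open(member) as src, gzip.open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)
            written.append(target)
    return written
```

Native gzip output from Data Warehouse would make this whole step unnecessary, since the .gz files could be loaded by Snowflake or Athena as-is.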
Regards,
Piotr
Hi Piotr,
Out of curiosity, what is the average file size (.zip) and export frequency?
P.S.: I would advise to submit it as an idea to get votes for this feature.
Hi Adrey,
the average size is about 500MB, but 30% of the exports exceed 1GB, up to 1.5GB in a single file for each of our report suites.
The exports are scheduled on an hourly basis. Currently we use segments to divide them into smaller ones, but it is still a lot of overhead:
- managing those exports (e.g. adding/removing columns)
- unzipping and re-gzipping
OK, I'll submit this as an idea.
Thanks!