
SOLVED

Data Warehouse - requests gzip compression

piotrspring
Level 3

Hi,

 

we are using Data Warehouse requests and everything works fine.

Unfortunately, we spend a lot of resources unzipping the exports 😞

 

Is there a plan to change the compression type or add gzip compression in the near future?

Like in Data Feeds, where bigger files are automatically split into smaller ones?

 

ZIP is not a format supported by Snowflake, Athena, etc.

Instead of unzip -> gzip -> load, we would like to load the exports directly.
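
For reference, the current workaround is roughly the following (a minimal Python sketch, not production code; paths and file names are placeholders):

import gzip
import shutil
import zipfile
from pathlib import Path

def rezip_export(zip_path: Path, out_dir: Path) -> list[Path]:
    """Re-compress each member of a Data Warehouse .zip export as .gz
    so Snowflake/Athena can load the files directly."""
    out_dir.mkdir(parents=True, exist_ok=True)
    gz_files = []
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            if name.endswith("/"):  # skip directory entries
                continue
            gz_path = out_dir / (Path(name).name + ".gz")
            # Stream member -> gzip so ~1 GB files never sit fully in memory
            with zf.open(name) as src, gzip.open(gz_path, "wb") as dst:
                shutil.copyfileobj(src, dst)
            gz_files.append(gz_path)
    return gz_files

# Hypothetical usage: rezip_export(Path("dw_export.zip"), Path("staging/"))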

 

Regards,

Piotr

 


3 Replies
Andrey_Osadchuk
Correct answer by
Community Advisor

Hi Piotr,

 

Out of curiosity, what is the average file size (.zip) and export frequency?

 

P.S.: I would advise submitting it as an idea to get votes for this feature.


piotrspring
Level 3

Hi Andrey,

 

the average size is about 500 MB, but 30% of the exports are beyond 1 GB, up to 1.5 GB in a single file for each of our report suites.

The exports are scheduled on an hourly basis. Currently we divide them into smaller ones using segments, but it's still a lot of overhead:

- managing those exports (like adding/removing columns)

- unzipping and gzipping

 

Ok, I'll submit this as an idea.

 

Thx!  


Andrey_Osadchuk
Community Advisor
500+ MB on an hourly basis: is it referring to Data Warehouse or Data Feeds?
piotrspring
Level 3
500+ MB for Data Warehouse; for all report suites it's about 2-3 GB hourly.