April 1, 2020

datawarehouse gzip compression

  • 3 replies
  • 3583 views

Hi,

 

We are using the data warehouse exports.

The files we receive are quite big: on average about 1 GB per hour.

 

Our database doesn't support the ZIP format (the same is true for Athena, Redshift, Snowflake, etc.), so we first need to convert the files to GZIP.

 

Since unzipping large files consumes a lot of time and resources, we would like to submit an idea:

- add GZIP as a compression format for the exports

- split large files automatically into smaller gzip chunks (as Data Feeds already does)

 

It's actually the same way Data Feeds currently exports the data.
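Until the export supports gzip natively, the conversion step described above can be done downstream. A minimal sketch in Python, streaming each ZIP member in chunks so a ~1 GB export is never held fully uncompressed in memory (the function name and chunk size are illustrative, not part of any Adobe API):

```python
import gzip
import io
import zipfile


def zip_to_gzip(zip_bytes: bytes, chunk_size: int = 64 * 1024) -> list[tuple[str, bytes]]:
    """Re-compress each member of a ZIP export as a standalone .gz file.

    Reads every member in fixed-size chunks and writes the chunks straight
    into a GzipFile, so memory use stays bounded by chunk_size.
    """
    results = []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for name in archive.namelist():
            buf = io.BytesIO()
            with archive.open(name) as src, gzip.GzipFile(fileobj=buf, mode="wb") as dst:
                while chunk := src.read(chunk_size):
                    dst.write(chunk)
            results.append((name + ".gz", buf.getvalue()))
    return results
```

The resulting `.gz` files can be loaded directly by Athena, Redshift, or Snowflake, which all accept gzip-compressed input.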

 

Cheers,

Piotr  

3 replies

April 21, 2020

Implementing this idea would save a lot of cost. If we received gzip exports, we could retire the post-processing jobs that do a "stupid" unzip-and-re-gzip.

October 31, 2020

Please implement this. We are well past the days when files were manually extracted and put into Excel for analysis.

September 23, 2025

We have the same problem. Please implement gzip, which is widely used. In contrast, ZIP is archaic and not widely supported in the modern data landscape.