piotrspring
Community profile · Level 3 · 8 badges
Joined the community 30-03-2020 7:03:08 AM
Top badges earned by piotrspring: Give Back 3, Give Back, Ignite 1, Shape 1, Boost 5
Re: Data Warehouse - requests gzip compression
piotrspring - Adobe Analytics
500+ MB for Data Warehouse, for all reporting suites it's about 2-3 GB hourly

Views: 4.3K · Likes: 0 · Replies: 0
Re: Data Warehouse - requests gzip compression
piotrspring - Adobe Analytics
Submitted as an idea: https://experienceleaguecommunities.adobe.com/t5/adobe-analytics-ideas/datawarehouse-gzip-compression/idi-p/357547

Views: 4.3K · Likes: 0 · Replies: 0
datawarehouse gzip compression
piotrspring - Adobe Analytics
Hi, we are using Data Warehouse exports. The files we receive are quite big: on average 1 GB on an hourly basis. Our DB doesn't support the ZIP format (the same goes for Athena, Redshift, Snowflake, etc.), so we first need to convert the files to GZIP. Since unzipping large files consumes a lot of time and resources, we would like to submit an idea:
- add the gzip compression format to the exports
- split large files automatically (like in Data Feeds) into smaller chunks of gzip files.
It's actually the same w...

Views: 2.8K · Likes: 13 · Replies: 2
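
For illustration, the workaround implied by this idea (repack the ZIP export into smaller gzip chunks before loading) might look like the following minimal sketch. It is not Adobe's implementation; the chunk size, paths, and output naming are assumptions.

```python
# Minimal sketch of the workaround implied above: repack a ZIP Data Warehouse
# export into smaller gzip chunks so a downstream loader can ingest them
# directly. Chunk size, paths, and naming are assumptions.
import gzip
import zipfile

CHUNK_LINES = 1_000_000  # assumed chunk size; tune to the target loader

def repack_zip_to_gzip_chunks(zip_path: str, out_prefix: str) -> None:
    """Stream every file inside the ZIP export into gzip chunks of CHUNK_LINES lines.
    Header handling across chunks is omitted for brevity."""
    with zipfile.ZipFile(zip_path) as archive:
        for name in archive.namelist():
            with archive.open(name) as src:
                part, lines, out = 0, 0, None
                for line in src:  # streams line by line, no full extraction to disk
                    if out is None:
                        out = gzip.open(f"{out_prefix}.part{part:04d}.gz", "wb")
                    out.write(line)
                    lines += 1
                    if lines >= CHUNK_LINES:
                        out.close()
                        out, part, lines = None, part + 1, 0
                if out is not None:
                    out.close()

# Hypothetical usage:
# repack_zip_to_gzip_chunks("dw_export_2020-03-30_07.zip", "dw_export_2020-03-30_07")
```
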
Re: Data Warehouse - requests gzip compression
piotrspring - Adobe Analytics
Hi Adrey, the average size is about 500 MB, but 30% of the exports are beyond 1 GB, up to 1.5 GB in a single file for each of our reporting suites. The exports are scheduled on an hourly basis. Currently we divide them into smaller ones with segments, but it's still a lot of overhead:
- managing those exports (like adding/removing columns)
- unzipping and gzipping
OK, I'll submit this as an idea. Thanks!

Views: 4.3K · Likes: 0 · Replies: 0
Data Warehouse - requests gzip compression
piotrspring - Adobe Analytics
Hi, we are using Data Warehouse requests and everything works fine. Unfortunately, we spend a lot of resources unzipping the exports 😞 Is there a plan to change the compression type or add gzip compression in the near future? Like in Data Feeds, where bigger files are automatically split into smaller ones? ZIP is not a format supported by Snowflake, Athena, etc. Instead of unzip -> gzip -> load, we would like to load the exports directly. Regards, Piotr

Views: 6.2K · Likes: 9 · Replies: 5
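
For illustration of the "load directly" step requested here, assuming gzip exports were available: Snowflake (one of the targets mentioned) decompresses gzip server-side during COPY, whereas ZIP is not a supported compression codec. The sketch below is an assumption-laden example; the account, credentials, stage, table, and file naming are hypothetical.

```python
# Minimal sketch of loading gzip exports directly into Snowflake, skipping
# the local unzip/re-compress pass the post describes. Account, credentials,
# stage, table, and file naming are hypothetical.
import snowflake.connector  # pip install snowflake-connector-python

COPY_SQL = """
COPY INTO analytics.dw_export                   -- hypothetical target table
FROM @analytics.dw_stage                        -- hypothetical stage holding the exports
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 COMPRESSION = GZIP)
PATTERN = '.*dw_export_.*[.]gz'                 -- hypothetical file naming
"""

def load_exports() -> None:
    conn = snowflake.connector.connect(
        account="my_account",
        user="loader",
        password="***",
        warehouse="LOAD_WH",
    )
    try:
        conn.cursor().execute(COPY_SQL)
    finally:
        conn.close()
```
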