We recently realized we're well over our licensed storage amount. In an effort to reduce the amount of storage we're using, we've been trying to identify a way to view the overall size of each dataset. Of course there's a way to view how much data has been ingested over the past 7 or 30 days, but has anyone figured out how to query or view the total size of each dataset?
If not, would this be helpful to you as a feature request to Adobe?
Hi @coreyac1, I agree that if this metric doesn't exist, it would be a good one to have.
But for your deletion use case, I would nominate the datasets with the highest record counts, on the assumption that more records generally means a larger dataset (with a few exceptions, such as datasets with unusually wide schemas).
This workaround might help you.
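To make that workaround concrete, here's a minimal sketch of ranking datasets by record count to pick deletion candidates. The dataset names and counts below are made up; in practice you would pull the real record counts from the dataset list in the AEP UI:

```python
# Rank datasets by record count to nominate deletion candidates.
# Assumption: record count is a rough proxy for storage size
# (datasets with very wide schemas are the exception).

def nominate_for_deletion(datasets, top_n=3):
    """Return the top_n datasets with the highest record counts."""
    return sorted(datasets, key=lambda d: d["records"], reverse=True)[:top_n]

# Hypothetical example data
datasets = [
    {"name": "web_events", "records": 120_000_000},
    {"name": "crm_profiles", "records": 4_500_000},
    {"name": "email_sends", "records": 38_000_000},
    {"name": "test_ingest", "records": 900_000},
]

for d in nominate_for_deletion(datasets):
    print(d["name"], d["records"])
```

It's a blunt heuristic, but it gives you a defensible starting list to review before deleting anything.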
We have a similar situation in which we don't fully understand how the actual consumed storage is calculated. The sum of all dataset sizes is well below the reported consumption, which suggests that a lot of other things consume storage as well.
Unfortunately, this is not really transparent, and it's almost impossible to trace the actual sources of the storage consumption.
If anyone has a resource to read on this topic, I would be very thankful.
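For what it's worth, the gap can at least be quantified: sum the per-dataset sizes you can see and subtract that from the reported total. A tiny sketch (all figures hypothetical, in GB):

```python
# Quantify how much of the reported storage the visible datasets explain.
# All numbers here are hypothetical.

def storage_gap(reported_total_gb, dataset_sizes_gb):
    """Return (explained, unexplained) storage in GB."""
    explained = sum(dataset_sizes_gb)
    return explained, reported_total_gb - explained

explained, unexplained = storage_gap(10_000, [1_200, 850, 400, 150])
print(f"datasets explain {explained} GB; {unexplained} GB unaccounted for")
```

Tracking that unexplained number over time at least tells you whether the opaque portion is growing, even if you can't yet attribute it.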
When it comes to storage, I know you're looking at datasets as your main culprit, but I suggest you also look at segmentation, i.e. whether it's streaming, batch, or edge. Also, look at all the events being triggered; there could be something going on there that impacts your storage.
Adobe Support can give you the number, but there's no way to see it in the UI.