
SOLVED

Exceeding guardrails in data ingestion in AEP - what does AEP do then?


Level 8

What does AEP do if we send, for example, more than 20GB of batch data in one hour? How is it queued? Or what happens if we exceed the number of requests?

I went through these guardrails, but the customer has additional pre-sales questions - https://experienceleague.adobe.com/en/docs/experience-platform/ingestion/guardrails


1 Accepted Solution


Correct answer by
Level 2

If a guardrail is stated as a hard limit, the platform will not allow you to exceed it. If not, the platform will continue to accept data, but processing will be delayed and/or performance will degrade. If you exceed the guardrails by an extremely high volume, ingestion will also return errors and time out.

 

If streaming data continuously exceeds the guardrails and the platform spends a long time evaluating streaming segments for each ingestion, streaming segment evaluation may also be stopped temporarily.

 

--ssj
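To make the hard-limit versus soft-limit behaviour described above concrete, here is a minimal client-side sketch of how a caller might classify the possible outcomes of a batch submission. The endpoint URL, status codes, and timeout value are illustrative assumptions, not documented AEP behaviour:

```python
import requests

# Illustrative placeholder values: the real Batch Ingestion endpoint, auth
# headers, and error payloads must be taken from the official AEP docs.
BATCH_URL = "https://platform.adobe.io/data/foundation/import/batches"

def submit_batch(payload: dict, headers: dict) -> str:
    """Classify the outcome: accepted, hard rejection (do not retry), or retry later."""
    try:
        resp = requests.post(BATCH_URL, json=payload, headers=headers, timeout=60)
    except requests.exceptions.Timeout:
        # Extreme overage can surface as an ingestion timeout -> retry later.
        return "retry_later"
    if resp.status_code in (400, 403, 413):
        # Hard guardrail: the platform refuses the request outright.
        return "rejected"
    if resp.status_code in (429, 503):
        # Soft guardrail under heavy load: back off and retry.
        return "retry_later"
    resp.raise_for_status()
    return "accepted"
```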


3 Replies


Community Advisor

Hi @Michael_Soprano, while it is not explicitly documented, it's possible that AEP internally queues excess data for later processing once system load decreases; this would need to be confirmed with the product team. Otherwise:

When you exceed the 20GB hourly batch ingestion limit in AEP, the platform will either return an error indicating that the ingestion limit has been exceeded or throttle your requests, slowing down ingestion or temporarily blocking further uploads.

You'll encounter similar behavior if you exceed the limit on concurrent requests or the overall request rate.
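As a rough illustration of the "error or throttle" behaviour described above, a batch upload client would typically retry with exponential backoff when it gets throttled. This is a generic sketch under the assumption that the platform signals throttling with HTTP 429/5xx and may send a Retry-After header; the exact AEP contract should be confirmed in the documentation:

```python
import time
import requests

def upload_with_backoff(url: str, data: bytes, headers: dict,
                        max_retries: int = 5) -> requests.Response:
    """Retry a file upload while the platform throttles (429) or is overloaded (5xx)."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.put(url, data=data, headers=headers)
        if resp.ok:
            return resp
        if resp.status_code == 429 or resp.status_code >= 500:
            # Prefer a server-provided Retry-After value when present.
            time.sleep(float(resp.headers.get("Retry-After", delay)))
            delay *= 2
            continue
        resp.raise_for_status()  # other 4xx: hard failure, do not retry
    raise RuntimeError(f"Upload still throttled after {max_retries} attempts")
```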



Community Advisor

@Michael_Soprano 
One thing to note is that the 20GB-per-hour limit applies to the Batch Ingestion API; ingestion from a batch source can handle up to 200GB per hour.
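If staying under the 20GB-per-hour Batch Ingestion API guardrail is a concern, one client-side option is to pace uploads so the hourly budget is never exceeded. A minimal sketch, assuming a simple fixed hourly window (the exact accounting window AEP uses is not specified in the guardrails doc):

```python
import time

HOURLY_LIMIT_BYTES = 20 * 1024 ** 3  # 20 GB Batch Ingestion API guardrail

def pace_uploads(chunks, send_fn, limit: int = HOURLY_LIMIT_BYTES) -> None:
    """Send byte chunks, pausing until a new hourly window once the budget is spent."""
    window_start = time.monotonic()
    sent = 0
    for chunk in chunks:
        if sent + len(chunk) > limit:
            remaining = 3600 - (time.monotonic() - window_start)
            if remaining > 0:
                time.sleep(remaining)  # wait out the rest of the current hour
            window_start = time.monotonic()
            sent = 0
        send_fn(chunk)  # e.g. the upload helper sketched earlier
        sent += len(chunk)
```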