
SOLVED

Cache POST call in dispatcher/fastly in AEM cloud


Can anyone guide me on how to cache a POST call in the Dispatcher in AEM as a Cloud Service?

1 Accepted Solution


Correct answer by
Employee Advisor

There are three rules baked into the Dispatcher regarding caching:

  1. Only the responses of GET requests are cached.
  2. The requested file must have an extension.
  3. The status code of the response must be 200.

Beyond that, there are further options you can configure, for example regarding paths, extensions, etc.
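As an illustration, a Dispatcher farm's `/cache` section lets you allow or deny caching by path pattern; the docroot and globs below are examples, not values from this thread:

```
/cache {
  /docroot "/mnt/var/www/html"
  /rules {
    # Cache everything by default...
    /0000 { /glob "*" /type "allow" }
    # ...except responses under an example private path
    /0001 { /glob "/private/*" /type "deny" }
  }
}
```

Even with an `allow` rule matching a request, the three built-in rules above still apply, so a POST will not be cached.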

 

The HTTP RFCs also state that only GET responses should be cached, because POST requests are designed to change server state; caching their results and returning them for a different POST request would therefore return inconsistent data.

 

Of course, you can build your own code into AEM to cache POST requests (or parts of them), but by default neither AEM nor the Dispatcher supports that. And in the case of AEM as a Cloud Service, the CDN does not support it either.
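If you do cache POST results in your own application code, the usual approach is to key the cache on a digest of the request body. The sketch below shows only that caching pattern in plain Java; the class and method names are illustrative and not part of any AEM or Dispatcher API:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal sketch: cache expensive POST results in application code,
// keyed by a SHA-256 digest of the request body.
public class PostResponseCache {
    private final ConcurrentHashMap<String, String> cache = new ConcurrentHashMap<>();

    // Derive a stable cache key from the POST body.
    static String keyFor(String body) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(body.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(digest);
    }

    // Return the cached response for this body, computing it on a miss.
    public String getOrCompute(String body, Function<String, String> compute)
            throws Exception {
        return cache.computeIfAbsent(keyFor(body), k -> compute.apply(body));
    }

    public static void main(String[] args) throws Exception {
        PostResponseCache c = new PostResponseCache();
        String first = c.getOrCompute("{\"q\":\"aem\"}", b -> "result-for-" + b);
        // Same body: the compute function is not invoked again.
        String second = c.getOrCompute("{\"q\":\"aem\"}", b -> "MISS-SHOULD-NOT-RUN");
        System.out.println(first.equals(second));
    }
}
```

Note that such a cache must be invalidated by your own code whenever the underlying content changes, since neither the Dispatcher's flush mechanism nor the Cloud Service CDN knows about it.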
