Hi Community,
I am ingesting data from external sources into an AEP dataset. Each table has 4-5 calculated fields. When the data volume is large, around 1 lakh (100,000) records per file with calculated fields, will it impact system performance? I am creating calculated fields to convert datetime into UTC format, and some fields use the split function. Do we have any best practices for using calculated fields in data mapping?
Thanks in advance
@SantoshRa5 Short answer - yes. Calculated fields are super handy, but using them in your dataflow mappings, especially when you're working with large files, can impact ingestion performance depending on how complex the logic is. That said, AEP is designed to handle such workloads.
A few best practices:
If you're just doing things like converting to UTC or splitting a timestamp, you're likely fine; just be mindful of stacking 4-5 calculated fields per record on very large files.
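For reference, both transformations you mentioned map onto single built-in functions from the Data Prep mapping functions reference, so the expressions can stay simple. A rough sketch (the source field names orderDate and fullName are placeholders, not anything from your actual schema):

    zone_date_to_utc(orderDate)    - converts a datetime in any time zone to UTC
    split(fullName, " ")           - splits the input string into an array on the given separator

One-function expressions like these are evaluated per record during ingestion, so they scale far better than deeply nested logic; if you find yourself chaining many functions per field, consider pre-computing those values upstream before ingestion.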
@SantoshRa5 Just checking in — were you able to resolve your issue?
We’d love to hear how things worked out. If the suggestions above helped, marking a response as correct can guide others with similar questions. And if you found another solution, feel free to share it — your insights could really benefit the community. Thanks again for being part of the conversation!