Can calculated columns impact system performance when loading ~100,000 (1 lakh) records?
Hi Community,
I am ingesting data from external sources into an AEP dataset. Each table has 4-5 calculated fields. When the data volume gets large, around 100,000 (1 lakh) records per file, will these calculated fields impact system performance? I am using calculated fields to convert datetime values into UTC format, and some fields use the split function. Are there any best practices for using calculated fields in data mapping?
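For context, here is a minimal Python sketch of what my calculated fields are doing (the field names `event_time` and `full_name` are made-up examples, not my actual schema):

```python
from datetime import datetime, timezone

# Hypothetical input record; field names are illustrative only.
record = {"event_time": "2024-03-15 10:30:00+05:30", "full_name": "John Doe"}

# Calculated field 1: convert a timezone-aware datetime string to UTC.
dt = datetime.fromisoformat(record["event_time"])
record["event_time_utc"] = dt.astimezone(timezone.utc).isoformat()

# Calculated field 2: split one field into parts, similar to split().
first, last = record["full_name"].split(" ", 1)
record["first_name"], record["last_name"] = first, last

print(record["event_time_utc"])  # 2024-03-15T05:00:00+00:00
print(record["first_name"], record["last_name"])  # John Doe
```

My actual mappings do essentially this per record, so I am wondering whether 4-5 such per-record transformations become a bottleneck at this volume, or whether I should pre-compute them upstream before ingestion.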
Thanks in advance.