Hello,
We're writing 4,000,000 records at a time to a custom broadlog table (on-prem instance, Oracle 19c back end, AC build 9349). The table is partitioned on tscreated and is quite large (163,740,279 records, 256 GB). Stats are gathered each morning by a cron job on the database server, and old partitions are dropped as the data ages out.
As I watch the job load the 4 million records in Toad's Session Browser, the behavior is this: a few hundred thousand records are written, then up_UpdateTableStats is called; a few hundred thousand more are written, and up_UpdateTableStats is called again. These repeated stats calls really slow down the load.
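For what it's worth, here's the quick check I've been using to see how often the procedure fires (assuming you have SELECT access on the V$ views):

SELECT sql_id, executions, ROUND(elapsed_time / 1e6) AS elapsed_seconds
  FROM v$sql
 WHERE UPPER(sql_text) LIKE '%UP_UPDATETABLESTATS%'
   AND UPPER(sql_text) NOT LIKE '%V$SQL%';  -- exclude this query itself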
So, the question is: how can we turn off the automated up_UpdateTableStats call during insert/update?
Thanks,
Jim
I'm not sure this is linked (you're seeing a few hundred thousand records between calls, and the default is ten thousand), but can you try changing the "Batch size" parameter in the writer activity?
As I said, the default value is 10k, but it's worth a try to see whether it has an effect on the frequency of the up_UpdateTableStats calls.
Cedric
We'll give it a go. The documentation doesn't give me much hope, though.
https://experienceleague.adobe.com/docs/experience-cloud-kcs/kbarticles/KA-15164.html?lang=en
Thanks, @CedricRey
Jim
Hi @CedricRey ,
We have a solution. We did two things, so I'm not sure which (or both) was the actual fix.
1. Set the Batch Size to a value greater than the number of records to be inserted.
2. Modify the up_UpdateTableStats procedure to ignore the pain-point table:
CREATE OR REPLACE PROCEDURE DADOBESYS.up_UpdateTableStats(sSchema varchar2, sTable varchar2, iPercent INTEGER) IS
BEGIN
  IF UPPER(sTable) IN ('TMPBROADCAST', 'TMPBROADCASTPAPER') THEN
    -- Delivery temp tables: set fixed stats rather than gathering them.
    DBMS_STATS.SET_TABLE_STATS(USER, UPPER(sTable), numrows => 2000000, numblks => 40000);
  ELSIF UPPER(sTable) IN ('HIDDEN') THEN
    -- 'HIDDEN' is the (redacted) custom broadlog table: skip stats entirely
    -- during the load and let the morning cron job gather them.
    dbms_output.put_line('Skipping stats for HIDDEN table');
  ELSE
    -- Everything else keeps the stock behavior.
    DBMS_STATS.GATHER_TABLE_STATS(ownname          => sSchema,
                                  tabname          => sTable,
                                  estimate_percent => iPercent,
                                  cascade          => TRUE,
                                  degree           => 24);
  END IF;
END;
/
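As a sanity check (assuming the broadlog table lives in the same DADOBESYS schema as the procedure, and with the real table name substituted for HIDDEN), you can confirm that LAST_ANALYZED no longer advances during a load and only moves after the morning cron run:

-- Table-level stats only; partition-level rows are filtered out.
SELECT table_name, num_rows, last_analyzed
  FROM all_tab_statistics
 WHERE owner = 'DADOBESYS'
   AND table_name = 'HIDDEN'
   AND object_type = 'TABLE';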
Thanks for your help!
Jim