However, frequently gathering statistics on even small tables can become time-consuming. Recently, I have been working on PeopleSoft Time and Labor, which makes heavy use of temporary records. In a single execution of TL_TIMEADMIN, several tables associated with temporary records are truncated, repopulated and analyzed many times. I have discussed elsewhere the problem of excessive use of TRUNCATE giving rise to Local Write Wait.
Oracle also recognised this problem, and in version 9 of the database introduced Optimizer Dynamic Sampling, where the database samples the data to generate statistics at statement parse time.
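As an illustration (my own example; the table and column names here are made up), sampling can also be requested for an individual statement with the DYNAMIC_SAMPLING hint, where the number is the sampling level for the named table alias:

    -- Hypothetical example: ask the optimizer to sample tmp at level 4
    -- when parsing this statement, rather than relying on stored statistics.
    SELECT /*+ DYNAMIC_SAMPLING(tmp 4) */ emplid, punch_dttm
    FROM   ps_tl_rptd_time_tmp tmp
    WHERE  process_instance = :1;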
I am still testing, but on Oracle 10gR2 (version 10.2.0.3) I have obtained improvements in the performance and stability of T&L Application Engine (AE) processes by:
- Deleting optimizer statistics on tables associated with temporary records in order to force the optimizer to sample at parse time.
- Locking optimizer statistics to prevent the %UpdateStats macro from putting them back. Tables with locked statistics are also omitted by GATHER_SCHEMA_STATS and GATHER_DATABASE_STATS (unless the force option is specified), and hence also by the delivered maintenance-window job that refreshes stale statistics.
- Implementing an alternative DDL model that uses a PL/SQL packaged function to suppress the error raised when attempting to collect statistics on a table whose statistics are locked (see %UpdateStats() -v- Optimizer Dynamic Sampling, and the sketch after this list). This also addresses the mix-up in the DDL models.
- The final piece of the puzzle has been to set OPTIMIZER_DYNAMIC_SAMPLING to 4 at instance level. I have certainly had problems with this parameter at its default value of 2. The Oracle documentation describes the relevant sampling levels as follows:
- Level 2: Apply dynamic sampling to all unanalyzed tables.
- Level 3: As Level 2, plus all tables for which standard selectivity estimation used a guess for some predicate that is a potential dynamic sampling predicate.
- Level 4: As Level 3, plus all tables that have single-table predicates that reference 2 or more columns.
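To illustrate the third step, here is a minimal sketch of such a wrapper (my own illustration, not the exact package from the linked posting; the names wrapper and ps_stats are assumptions). DBMS_STATS raises ORA-20005 ("object statistics are locked") when asked to gather statistics on a locked table, and the wrapper simply swallows that error so %UpdateStats completes quietly:

    CREATE OR REPLACE PACKAGE wrapper AS
      PROCEDURE ps_stats(p_ownname VARCHAR2, p_tabname VARCHAR2);
    END wrapper;
    /
    CREATE OR REPLACE PACKAGE BODY wrapper AS
      PROCEDURE ps_stats(p_ownname VARCHAR2, p_tabname VARCHAR2) IS
        -- ORA-20005: object statistics are locked
        table_stats_locked EXCEPTION;
        PRAGMA EXCEPTION_INIT(table_stats_locked, -20005);
      BEGIN
        sys.dbms_stats.gather_table_stats
          (ownname          => p_ownname
          ,tabname          => p_tabname
          ,estimate_percent => dbms_stats.auto_sample_size
          ,cascade          => TRUE);
      EXCEPTION
        WHEN table_stats_locked THEN
          NULL;  -- statistics are locked: leave this table to dynamic sampling
      END ps_stats;
    END wrapper;
    /

The DDL model for %UpdateStats then calls wrapper.ps_stats instead of calling DBMS_STATS directly. The instance-wide setting in the final step is simply ALTER SYSTEM SET optimizer_dynamic_sampling = 4 SCOPE=BOTH; (SCOPE=BOTH assumes the instance uses an spfile).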
I started off looking for tables that had recently been analysed.
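A simple way to do that (my sketch; it assumes SELECT privilege on DBA_TABLES) is to query the LAST_ANALYZED column:

    -- Tables analysed in the last hour, most recent first
    SELECT owner, table_name, num_rows, last_analyzed
    FROM   dba_tables
    WHERE  last_analyzed > SYSDATE - 1/24
    ORDER BY last_analyzed DESC;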
The following script identifies all instances of the tables associated with temporary records, and then deletes and locks their statistics. I started by restricting it to a list of specific tables, but I think it would be perfectly reasonable to take this approach with all temporary records.
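The full script can be downloaded from my website (see the link at the end of this posting); what follows is a simplified sketch of the approach. Temporary records have RECTYPE 7 in PSRECDEFN, and each instance of a temporary table shares the base table's name as a prefix (note that the LIKE predicate below is deliberately loose and could match other tables with the same prefix):

    -- Delete and lock statistics on every instance of every temporary record
    BEGIN
      FOR t IN (
        SELECT u.table_name
        FROM   psrecdefn r
        ,      user_tables u
        WHERE  r.rectype = 7  -- temporary records
        AND    u.table_name LIKE
                 DECODE(r.sqltablename, ' ', 'PS_'||r.recname, r.sqltablename)||'%'
      ) LOOP
        dbms_stats.delete_table_stats(ownname => user, tabname => t.table_name);
        dbms_stats.lock_table_stats  (ownname => user, tabname => t.table_name);
      END LOOP;
    END;
    /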
Updated 11.2.2009: I have updated my advice on the use of Optimiser Dynamic Sampling (see %UpdateStats -v- Optimizer Dynamic Sampling). I still consider this to be a useful feature, but I have found scenarios where Oracle has not chosen as good a plan as it did with explicitly gathered statistics. Therefore, I still suggest locking statistics on temporary working storage records, but where batch programs have been coded to explicitly update statistics, dbms_stats should be called with the force option to override the lock. I have updated my DDL model wrapper script accordingly.
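For example (a sketch; the table name is hypothetical), a batch process that must still gather statistics explicitly can override the lock with the force parameter, which was added to GATHER_TABLE_STATS in Oracle 10g:

    BEGIN
      -- force => TRUE gathers statistics even though they are locked
      dbms_stats.gather_table_stats
        (ownname => user
        ,tabname => 'PS_EXAMPLE_TMP'  -- hypothetical table name
        ,force   => TRUE);
    END;
    /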
The scripts in this posting can be downloaded from my website.
2 comments:
This is very interesting and a more straightforward approach. We have a PL/SQL package that generates a set of triggers, a sequence, and a debug table to capture data in autonomous transactions. We then move the data from the debug table back to the temporary table to generate Oracle statistics. From there we can import and export statistics to other databases as needed. This is more time-intensive, and your idea would save quite a bit of time.
David, do you recommend the 4 steps above as the best solution for 11.1.0.7 also? We have just implemented PeopleSoft Financials 8.9 and have started having issues related to temporary table statistics (fs_bp, fs_jgen, fs_gledit), and I'm going to implement your suggestions if they apply to 11.1.0.7 also.