Tuning The Untunable: When Indexes and Optimizer Don’t Help

June 8, 2014

In 2004, I gave a tuning presentation in the Boston area at NOUG.


If you haven’t seen it, do take a look. It actually went viral and I received emails from around the world.

The PDF can be found here:

Oracle Performance Tuning

Recently, I gave a tuning presentation at the Manitoba Oracle User’s Group
http://mboug.wordpress.com/

This MBOUG presentation is a follow-up. It covered a number of very mysterious tuning scenarios that I’ve encountered since NOUG, and the techniques I used to overcome them. In these cases, tuning the optimizer and adding indexes usually did not help at all.

The PDF can be found here: Read the rest of this entry »


HD Tune – Fabulous Windows Hard Drive Testing Software!

January 1, 2013

Recently I tested a number of my older Windows machines with HD Tune. It’s fabulous performance testing software!

HD Tune

HD Tune tests the slowest part of any computer: the hard drive.

If you have seen my database tuning presentation, you will know that it always amazes me that so many developers don’t realize the hard drive is the slowest part of any system, and instead look to remove lines of code that run in memory. Sometimes code that runs only once! Duh!

What was really interesting was that the hard drives of three of my old machines ran at average speeds of 20, 50, and 70 megabytes per second. But one machine was running at less than 10 megabytes per second! It confirmed what I knew; that computer was really slow.

HD Tune is fast, easy, and intuitive. No manual reading required, and it installs in one motion. It only works on Windows, though.

The other really good testing software I’ve used is PassMark, which I reviewed previously. Although it looks only at the hard drive, HD Tune seems to approximate overall system performance pretty closely.

Now, should I just get some solid state drives? Or just buy some new computers? Hmm.

Do check it out!


Oracle Applications R12: More Schema Analysis

December 18, 2010

Some more analysis of the schemas and objects in the Oracle Applications R12 sample database, VIS.

Lots of objects, triggers, and packages.

And over 57 million lines of source code in the APPS schema alone!

Read the rest of this entry »


Oracle Applications R12 Schema Analysis

December 17, 2010

Continuing my series on analysis, here is some of the initial analysis I did after installing Oracle Applications R12.  Much of the analysis involves queries to the Oracle data dictionary under the covers.  This analysis is on the sample “database”, VIS, which is actually composed of many schemas.

I also did a fair bit of analysis from a DBA standpoint: tablespaces, logs, processes, DBMS_JOBS, etc.  And other analysis on the Unix and middle-tier side, plus tuning.  I’ll try to keep it all organized in different posts.

When I start a new position, these are the types of queries I might run on the schemas to get a feel for the system.  Just how was it built? Does it use primary keys, triggers, or external code to enforce data integrity?  What are the most important tables?  What are the potential performance issues?  And so on.

Many of these queries you won’t find in GUIs like SQL Developer or Toad.  You have to write them yourself.  It pays big dividends to know the data dictionary!

(Sorry for the odd font. WordPress has not made changing the font for selected sets of text easy or obvious. If anyone knows the secret to changing fonts in WordPress, à la WordPad, please let me know.)

------------------------------------------------
------------------------------------------------

How many tables are there?
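As a sketch of the kind of data dictionary query involved (this particular query is my own illustration, and the privileges and counts will vary by install), something like this answers the question per schema:

```sql
-- Count tables per schema (requires SELECT privilege on DBA_TABLES;
-- use ALL_TABLES or USER_TABLES if you lack DBA views).
SELECT owner, COUNT(*) AS table_count
FROM   dba_tables
GROUP  BY owner
ORDER  BY table_count DESC;
```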

Read the rest of this entry »


Some Really Good PC Performance Testing Software!

November 24, 2010

After my rant the other week about video software that didn’t work, I have been very pleasantly surprised tonight by a number of benchmarking programs, so I felt some praise was in order.

I’ve noticed that one of my desktops is considerably faster than the other one, even though it has much less RAM.  So I wanted to stress-test the machines and get some real numbers.
Read the rest of this entry »


System Performance: Cobol Versus Database

October 7, 2010

More Architecture Mistakes To Avoid, Database Design Mistakes To Avoid, Tuning

When I started working with Oracle, I helped to replace a Cobol system. For a while, the company systems ran half on Cobol, and half on Oracle.  The Oracle 7 system was pretty slow then.  After the cold backups had run overnight, it was not unusual in the morning to see a query against a 700K row table take about 30 seconds.

I was asked to write a C program that interfaced only with the Cobol flat files, and read Cobol transaction records.  Each transaction was stored in multiple rows, and given repeating groups of records, could vary in size from a few rows to dozens of rows.  The files contained about 700K to 800K rows altogether.  The program grabbed information out of the multiple lines, tallied the numbers up, and output one line of summary. Lots of IF statements, and loops.  About 800 lines of code.

I expected this program to take at least 30 seconds, given that database queries to similar size tables took that long, and that there was so much logic and so many lines of code.

In fact, the program ran in about 5 seconds!  I was really surprised.

But when you think about it, it should be obvious that Read the rest of this entry »


MUCK – Massively Unified Code-Key, Generic Three Table Data Model

September 21, 2010

Another Database Design Mistake To Avoid.   Actually, one of the worst DB design mistakes you can make.

One of the worst things you can do to a database system is to make a generic data model of three tables for the whole system. A variation is to use three tables to combine multiple lookup tables into one.

Instead of standard tables and columns, three tables are used for everything.

One table each for:
entity/table
attribute/column/field
values of the data
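To illustrate (the table and column names here are my own, just for the sketch), the generic three-table model looks something like this:

```sql
-- The generic "three tables for everything" anti-pattern.
CREATE TABLE entities (
  entity_id    NUMBER PRIMARY KEY,
  entity_name  VARCHAR2(30)      -- e.g. 'CUSTOMER', 'ORDER'
);

CREATE TABLE attributes (
  attribute_id    NUMBER PRIMARY KEY,
  entity_id       NUMBER REFERENCES entities (entity_id),
  attribute_name  VARCHAR2(30)   -- e.g. 'LAST_NAME', 'ORDER_DATE'
);

CREATE TABLE attribute_values (
  entity_id     NUMBER,
  row_id        NUMBER,          -- which "row" of the virtual table
  attribute_id  NUMBER REFERENCES attributes (attribute_id),
  value_text    VARCHAR2(4000)   -- everything stored as a string
);
```

Reassembling one logical row then takes a join (or pivot) per attribute, and every value loses its native datatype and constraints, which is where the performance and integrity problems come from.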
Read the rest of this entry »


Combining Identifying Data, Transaction Data, and LOBs in a Single Table

September 19, 2010

Another Database Design Mistake to Avoid

In another post, I mentioned that as a rule of thumb, if data has a 1:1 relationship, all the fields should be in the same table.

Of course, as with many rules, you can go overboard.  Blindly following rules is the mark of a beginner.  Search for:
The Dreyfus Model of Skills Acquisition

Here is a caveat.
Read the rest of this entry »


Referential Integrity and Performance

September 18, 2010

There are three ways to implement referential integrity in a database system, and each method results in different database performance.

1) Primary and foreign keys:
Fastest.  Most structured.  Declarative.  No coding required.

2) Triggers:
In between.

3) Application code:
Slowest.  Least structured.  Procedural.  The most coding required.  If the code is found in applications outside of Oracle, network traffic gets involved and it is even slower.
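As a minimal sketch of the first (and fastest) approach, the declarative keys look like this (table names are my own example, not from any particular system):

```sql
-- Declarative referential integrity: no code, the database enforces it.
CREATE TABLE departments (
  dept_id    NUMBER PRIMARY KEY,
  dept_name  VARCHAR2(30)
);

CREATE TABLE employees (
  emp_id    NUMBER PRIMARY KEY,
  emp_name  VARCHAR2(30),
  dept_id   NUMBER NOT NULL REFERENCES departments (dept_id)
);

-- An insert with a dept_id that does not exist in DEPARTMENTS is
-- rejected by the database itself (ORA-02291), with no trigger or
-- application code involved.
```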
Read the rest of this entry »


Stamping Out Cartesian Products

September 15, 2010

Over the years, I kept finding Cartesian products (CP) in queries and other objects like views.

These have caused a number of serious performance issues.  A single CP in a query can slow down the whole server.  The other problem is the result sets, which come back many multiples too big.

No gizmo tool will help you with a CP.  No query analyzer will tell you that you have a problem, or how to fix it.  Sorry, mouse lovers.

Cartesian products are one problem area that only observation, knowledge, and good SQL skills will overcome.

When I have pointed out the Cartesian products to the rest of the crew, many were in denial.  The really odd thing was that the more “experienced” the database person was, say, over 10 or 15 years of experience, the less obvious it was to them.  And the more they insisted on playing with statistics or indexes, which of course did nothing to deal with the real issue.
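A Cartesian product usually sneaks in as a missing join predicate. A minimal sketch, with hypothetical table names:

```sql
-- Cartesian product: no join condition between the two tables.
-- With 1,000 orders and 1,000 customers this returns 1,000,000 rows.
SELECT o.order_id, c.cust_name
FROM   orders o, customers c;

-- Corrected: the join predicate restores a sane result set.
SELECT o.order_id, c.cust_name
FROM   orders o, customers c
WHERE  o.cust_id = c.cust_id;
```

The missing WHERE clause is easy to spot in a two-table query, but in a view or a query joining many tables, one absent predicate hides easily, and no amount of statistics or indexing will fix it.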
Read the rest of this entry »