Mainframe Blog

SQL Performance Revisited

Phil Grainger
3 minute read

Earlier this year, we presented a SQL performance webinar which discussed management of sub-optimal SQL, access path regressions, new applications, new CPU peaks, and more. We have since received these questions:

  • “How do I measure how my performance has improved?”
  • “How do I know what that improvement has saved my business?”

Both questions are relatively simple to answer, if you have made some preparations; otherwise the answers are almost impossible to find.

How do I measure improved performance?

The first step in measuring performance improvements is to have proof/documentation about performance before you started improving it. Everyone should have a performance database of some sort. It doesn’t need to be the most complex database on the planet, but it does need to allow you to compare performance at two distinct points in time.
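As a sketch of what "simple" can mean here, a performance database might be nothing more than a table of accounting snapshots keyed by program, statement, and collection time. All names and numbers below are illustrative assumptions, not a BMC or IBM schema:

```sql
-- Illustrative only: table and column names are assumptions.
CREATE TABLE PERF_HISTORY
( PROGRAM_NAME  CHAR(8)       NOT NULL
, STMT_NO       INTEGER       NOT NULL   -- or QUERYNO, once standardized
, COLLECTED_AT  TIMESTAMP     NOT NULL
, CPU_SECONDS   DECIMAL(15,6) NOT NULL
, GETPAGES      BIGINT        NOT NULL
);

-- Compare CPU for the same statements between two points in time
SELECT base.PROGRAM_NAME
     , base.STMT_NO
     , base.CPU_SECONDS AS CPU_BEFORE
     , curr.CPU_SECONDS AS CPU_AFTER
  FROM PERF_HISTORY base
  JOIN PERF_HISTORY curr
    ON curr.PROGRAM_NAME = base.PROGRAM_NAME
   AND curr.STMT_NO      = base.STMT_NO
 WHERE base.COLLECTED_AT = '2021-01-01 00:00:00'
   AND curr.COLLECTED_AT = '2021-06-01 00:00:00';
```

The point is not the schema itself but the join: as long as each row carries a stable key and a timestamp, "before versus after" becomes a single query.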

Db2 can make it tricky to compare SQL statements directly. Although program names typically stay the same, every change to application code can change the statement number that Db2 uses to track each individual statement. If the statement number changes, it becomes difficult to compare the performance of the new statement to an earlier version. The long-term solution is to tag every SQL statement with a static number using the QUERYNO clause. If the clause is omitted, the number associated with the SQL statement is the statement number assigned during precompilation. This cannot be implemented overnight, but once done, you can track any SQL statement for performance changes regardless of what application changes have been made – its QUERYNO will be consistent.

Now it's possible to track the performance impacts of major changes (such as a new application version or even a new release of Db2) as well as minor changes (tuning individual SQL statements). One customer told me that after they made static identification of SQL statements part of their standards, they were easily able to analyze performance changes right down to the SQL statement level.
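For example, the QUERYNO clause is written at the end of the statement. The table, columns, and number below are purely illustrative; only the clause itself is the real Db2 syntax:

```sql
-- QUERYNO pins this statement's number to 1001, so accounting and
-- EXPLAIN records keep matching it even after the program is edited
-- and the precompiler would otherwise assign a new statement number.
SELECT ORDER_ID, ORDER_TOTAL
  FROM ORDERS
 WHERE CUSTOMER_ID = ?
 QUERYNO 1001;
```

A common convention is to allocate QUERYNO ranges per program or module so that a number alone identifies where a statement lives.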

Without this level of granularity, it is still possible to track performance improvements at the application/program level. So, if you don't have static statement numbers assigned to all SQL, you can simply compare the performance of named programs between two points in time.

Also, if you suspect you are seeing a performance degradation after moving to a new version of Db2, one of the first things IBM will ask you for is documentation that shows the prior performance as well as your reported poor performance statistics. If you have a complete performance database, then you have those answers readily available. But, if you can’t prove what the performance was before you migrated, IBM is unlikely to be able to help you.

How much did my business save by the performance improvement?

For this second question you may need to involve the colleagues who handle the license payments to IBM (if you are paying MLC charges based on the rolling four-hour average). These may be your capacity planners, your systems programmers, or some other group. They are the people who know which components make up the CPU usage, when during the month your peak CPU consumption happens, and the value of saving certain percentages at certain points in time. They should be able to advise on what financial saving you could expect from your savings in CPU consumption.

Unfortunately, saving CPU at random points during the month may not translate into any financial savings at all, so it is important to understand where tuning activity could make a difference. General performance can be improved by tuning ANY SQL statement, but tuning to save license charges needs to be focused on the SQL that is consuming the most chargeable CPU – the SQL that is executing during your four-hour average peaks.

To learn how BMC can help you improve SQL performance, please visit our website and be sure to look out for future Db2 webinars. Pencil in December 6 on your calendars for “Intelligent Automation is the Key to Your Digital Transformation” – more information to follow.


These postings are my own and do not necessarily represent BMC's position, strategies, or opinion.


About the author

Phil Grainger

Phil has over 30 years' experience of Db2, starting in 1987 with Db2 Version 1.2. Since then he has worked with every version, up to and including Db2 12.

From his beginnings as a Db2 DBA for one of the largest UK users of Db2 at that time, through his time at PLATINUM technology and his almost 10 years as Senior Principal Product Manager at CA, to his current position with BMC Software, Phil has always been a keen supporter of user groups and is a regular speaker at both vendor-sponsored and independent events. His work with IDUG includes being a past member of the European IDUG Planning Committee, an inductee into the IDUG Volunteer Hall of Fame, and now Board Liaison for BMC Software.

Phil was honoured by IBM as an Analytics Champion from 2009 to 2017.

Phil is now Lead Product Manager at BMC Software, working in support of their Db2 tools portfolio.

In addition, Phil is a regular contributor to the IDUG-sponsored Db2-L discussion list.