Improving query performance

Sometimes a database will start to run slowly, and you will need to trace the problem. There are many reasons for poor database performance, particularly over a network. They include:

  • Opening a form based on a large table or query, without filtering it first
  • Linking to multiple images in a network directory
  • Using a single copy of a database front end on a network drive, and opening your database via that networked front end

In short: anything that causes you to pull large amounts of data to your desktop.

If you haven’t made these design errors, it’s time to look at your queries. You may have one or more queries that form the basis for a group of forms and reports; if those queries are running slowly, redesigning them can make a big difference to the database’s responsiveness.

Checking your queries

When this happened to me recently I built a function that opened all of the queries in the database, timed how long each took to run, and wrote the results to the Immediate window. I ran the function four times, copied the output to Excel, built a pivot table to average the results by query, and sorted the times in descending order. It quickly became apparent that one group of queries was slower than the rest, which gave me a couple of targets to concentrate on. This code requires a reference to the Microsoft DAO 3.6 Object Library (Access 2003 or earlier) or the Microsoft Office x.0 Access database engine Object Library (where x is 12 for Office 2007 and 14 for Office 2010).

Function TimeQueries()
    Dim qdf As DAO.QueryDef
    Dim StartTime As Single
    Dim EndTime As Single

    'Suppress confirmation prompts while the queries run
    DoCmd.SetWarnings False

    For Each qdf In CurrentDb.QueryDefs
        'Skip the hidden queries Access creates internally (their names contain "~")
        If InStr(1, qdf.Name, "~") = 0 Then
            StartTime = Timer
            DoCmd.OpenQuery qdf.Name
            EndTime = Timer
            'Close the datasheet again so open queries don't pile up
            DoCmd.Close acQuery, qdf.Name
            'Write the query name and elapsed time (in seconds) to the Immediate window
            Debug.Print qdf.Name & "; " & (EndTime - StartTime)
        End If
    Next qdf

    DoCmd.SetWarnings True
End Function

Optimisations

I use subqueries fairly extensively, and discovered that two queries in particular had subqueries in their criteria. In both cases those subqueries used grouping and/or aggregate functions such as Max and First.
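
To show the general shape of the problem (the table and field names here are invented for illustration, not taken from my database), a criteria subquery of this kind looks something like the query below, and Access can end up re-evaluating the subquery row by row as it filters the outer query:

SELECT t.CustomerID, t.InvoiceDate, t.Amount
FROM tblInvoices AS t
WHERE t.InvoiceDate =
      (SELECT Max(i.InvoiceDate)
       FROM tblInvoices AS i
       WHERE i.CustomerID = t.CustomerID);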

First fix: write the subquery values out to a table

The first query returned all records from an import table that matched specific criteria, and took around 3 seconds to open. The import is a monthly process that takes a couple of minutes to complete, so I decided to write the query’s results out to a table as part of that process. When the ‘slow’ queries were based on this table instead of the base query, execution time dropped to 0.15 seconds: a 95% saving. This query was the main culprit; after the redesign, the database was back to the responsiveness it had before the slowdown.
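
As a rough sketch of that approach (qryImportFiltered and tblImportFiltered are made-up names standing in for my real objects), the import routine can finish with a make-table query, so the expensive criteria are evaluated once a month rather than every time a form or report opens:

SELECT q.*
INTO tblImportFiltered
FROM qryImportFiltered AS q;

The everyday queries, forms and reports are then pointed at tblImportFiltered instead of the slow base query.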

Second fix: join to a second query

Fortunately the slowest query doesn’t get used very often: it took around 150 seconds to run. Removing the criteria made the query open almost immediately, so I turned the criteria into a second query and joined it to the first on the primary key. Re-timing showed that the query now opened in 0.15 seconds, a thousand-fold improvement over the starting query.
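
Sketched with invented names again, the idea is to save the criteria as a query in their own right (here a hypothetical qryLastInvoice):

SELECT CustomerID, Max(InvoiceDate) AS LastInvoice
FROM tblInvoices
GROUP BY CustomerID;

and then, instead of putting a subquery in the criteria, join the main query to it on the key fields:

SELECT t.*
FROM tblInvoices AS t
INNER JOIN qryLastInvoice AS q
ON (t.CustomerID = q.CustomerID)
AND (t.InvoiceDate = q.LastInvoice);

That lets the database engine treat the restriction as a join rather than re-running the criteria for every row.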

The bottom line

There are many ways to build queries but you may find yourself using one or two favourite techniques most of the time. If things slow down, you can get a significant performance boost by redesigning a couple of key queries.

About sydneygeek

I was a consultant in Excel and Access for about 14 years before taking a full time role as an analyst with an energy company. Tech interests include Excel, Access, VBA, HTML / CSS and Oracle. I am based in Sydney, Australia and am a moderator on MrExcel, where I go by the name SydneyGeek.