Database performance analysis - Part I

By Ernst Joss on 13 April 2015

Usually one does not need to worry about the databases behind a Sitecore website, especially when the usual SQL Server guidelines are followed during setup. But there may be situations where you suspect that the database could be the limiting factor for your website. In these cases it helps to know a little more about the state of the databases you are using.

Things to consider first

At first glance it is always helpful to check whether your database setup follows Sitecore's installation guidelines. Other easy-to-check areas are memory consumption and availability, as well as the growth and shrink settings of the individual databases. These should be configured so that files do not grow in an endless series of small increments or, even worse, alternate between growing and shrinking. For memory, always configure your SQL Server to leave some memory free for the operating system and (if present) any other programs running on the server.
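As a rough illustration, the following T-SQL sets these options. The memory value and the database and file names are placeholders and need to be adapted to your environment.

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Leave headroom for the OS and other processes (example: 12 GB on a 16 GB server)
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;

-- Grow data and log files in fixed, reasonably sized steps instead of many tiny increments
-- (database and logical file names below are placeholders)
ALTER DATABASE [Sitecore_Web]
    MODIFY FILE (NAME = N'Sitecore_Web_Data', FILEGROWTH = 256MB);
ALTER DATABASE [Sitecore_Web]
    MODIFY FILE (NAME = N'Sitecore_Web_Log', FILEGROWTH = 128MB);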

More advanced analysis

What if all of that looks fine, but you still think you need to improve something?

In this case it is time to take a closer look at what is happening inside your databases. There are different ways to do this, all of which can be found somewhere in the wide field of internet blogs and best-practice guides. Let's talk about some approaches we have used in the past. All of them share the same goal: identify the slow parts and find a better-performing way to achieve the same result.

Identify "high costs" queries

In order to change something, you first need an idea of where to start, i.e. a query or procedure to begin with and then dig deeper into. But what is a "high cost" query? Is it the one that takes a rather long time (e.g. 2 seconds), or the one that takes just 200 milliseconds but is executed far more often? A good description of the different kinds of "cost", and of what the cost numbers actually mean, can be found on sqlmag.com.
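As a quick way to weigh duration against frequency yourself, you can query the cumulative statistics SQL Server keeps per cached statement. This is only a rough sketch, assuming the plan cache has been warm long enough to reflect normal traffic; the column choices and the TOP 20 cut-off are arbitrary.

SELECT TOP 20
    qs.execution_count,
    qs.total_elapsed_time / 1000 AS total_elapsed_ms,                    -- roughly: duration x frequency
    qs.total_elapsed_time / qs.execution_count / 1000 AS avg_elapsed_ms,
    SUBSTRING(st.[text], qs.statement_start_offset / 2 + 1,
        (CASE WHEN qs.statement_end_offset = -1
              THEN DATALENGTH(st.[text])
              ELSE qs.statement_end_offset END
         - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;

Ordering by total_elapsed_time surfaces both kinds of expensive statements: the slow ones and the cheap ones that run very often.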

The same article also shows one possible way to ask SQL Server itself for "costly" queries. The following statement lists the cached queries and calculates a kind of cost for each. As an additional benefit, you get the execution plan of each query, which can be analyzed in more detail if desired.

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED ;
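-- READ UNCOMMITTED so this diagnostic query neither blocks nor gets blocked by the regular workload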

WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan'),
core AS (
    SELECT
        eqp.query_plan AS [QueryPlan],
        ecp.plan_handle [PlanHandle],
        q.[Text] AS [Statement],
        n.value('(@StatementOptmLevel)[1]', 'VARCHAR(25)') AS OptimizationLevel ,
        ISNULL(CAST(n.value('(@StatementSubTreeCost)[1]', 'VARCHAR(128)') as float),0) AS SubTreeCost ,
        ecp.usecounts [UseCounts],
        ecp.size_in_bytes [SizeInBytes]
    FROM
        sys.dm_exec_cached_plans AS ecp
        CROSS APPLY sys.dm_exec_query_plan(ecp.plan_handle) AS eqp
        CROSS APPLY sys.dm_exec_sql_text(ecp.plan_handle) AS q
        CROSS APPLY eqp.query_plan.nodes ('/ShowPlanXML/BatchSequence/Batch/Statements/StmtSimple') AS qn ( n )
)

SELECT TOP 100
    QueryPlan,
    PlanHandle,
    [Statement],
    OptimizationLevel,
    SubTreeCost,
    UseCounts,
    SubTreeCost * UseCounts [GrossCost],
    SizeInBytes
FROM
    core
ORDER BY
    GrossCost DESC
    --SubTreeCost DESC

Executing this query is subject to some limitations, all of which are described on sqlmag.com.

Another option to extract the queries executed by your Sitecore site is SQL Server Profiler, which lets you capture the queries currently being executed. Since in most cases this only gives you a short window of inspection, it is quite hard to identify the most critical statements this way. For a first analysis of your SQL Server's response times it might still be good enough.
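If you prefer a server-side capture that can run longer and with less overhead than a Profiler trace, an Extended Events session is one option. The sketch below is only an illustration; the session name, the file target and the 200 ms duration threshold are assumptions to adapt to your situation (duration is measured in microseconds).

CREATE EVENT SESSION [CaptureSlowBatches] ON SERVER
ADD EVENT sqlserver.rpc_completed (
    ACTION (sqlserver.sql_text, sqlserver.database_name)
    WHERE duration > 200000),
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.sql_text, sqlserver.database_name)
    WHERE duration > 200000)
ADD TARGET package0.event_file (SET filename = N'CaptureSlowBatches.xel');

ALTER EVENT SESSION [CaptureSlowBatches] ON SERVER STATE = START;
-- ... let it run while the site serves normal traffic, then stop the session and inspect the .xel file
ALTER EVENT SESSION [CaptureSlowBatches] ON SERVER STATE = STOP;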

The last possibility I would like to mention is analyzing the whole application logic running on your web server, including all the queries sent to and the answers received from your SQL Server. One very good tool for this is New Relic, which has to be installed on your web server in order to start measuring all the work happening in your application. As New Relic can analyze your application over quite a large timeframe, you will be able to generate statistics about how often certain kinds of SQL queries are executed. Digging deep enough into the gathered data, you will be able to identify a bunch of queries worth a closer analysis.

Categories: Tools, Troubleshooting. Tags: SQL, diagnostics.
