Goal Review Q1

Categories: News, Professional
Comments: 1 Comment
Published on: March 31, 2010

As we close out the first quarter of 2010, it is time to review the progress made on the goals I set at the beginning of the year.  So far, I think I am progressing pretty well, and there has been measurable progress.  Below I break down each of the goals that were established in that blog post back in January.

Start a Blog

I think this one is pretty self-evident at this point.  The blog is started and is active, and I have been able to produce good articles throughout these three months.  The blog is syndicated at SQLServerCentral.com, and people at least look at the articles from time to time.  Some even leave comments.

Implement a BI Solution

The solution I planned for this has made progress, though I need to spend more time on the project.  I have been able to reduce the number of errors encountered while processing the package; some were logic errors and some were actual runtime errors.  As a result, the package now performs better and provides more accurate results.  I have a few more phases to implement, including an archival piece, a reporting piece, and more data retrieval.

1 Technical Blog Post per Week

As far as averages go, I am well ahead of this goal.  Going strictly by the calendar, without rolling excess articles forward or back, I have been able to output at least one technical article per week.  This works well with the first goal, since the main premise of my blog is to write about SQL Server from a professional perspective.

Attend 2010 PASS Summit

I have not yet attended the Summit, and can’t do so until November.  However, I have registered for the Summit and have every intention of attending this year.

Present Once a Quarter (or 4 times)

In the first quarter I presented twice at the local PASS Users Group.  The first was in January and the second was in March.

Write 2 Articles

I am halfway home with this goal.  The first article, titled Log Growing Pains, was published by SQLServerCentral in March.  I am working on topics for the next article, and should have at least one more submitted and published by the end of the year — hopefully more than that.

Run Two Marathons

I am doing really badly at this one so far.  I still have plenty of time to correct course and get on track for the year, and I will get back on track.

As for the less measurable goals, I think I am doing fine there as well.  I was assigned as VP to our PASS Users Group, I am on a subcommittee for Summit 2010, and I am actively involved with the community at SQLServerCentral.com.

Index Info

Categories: News, Professional, Scripts
Comments: 2 Comments
Published on: March 31, 2010

I recently came across an interesting script to help show index info and usage.  As I began to dissect the script, I noticed that the script was intended for SQL 2005 and beyond but was using SQL 2000 compatibility views.  I decided I would take it upon myself to update the script to use the 2005 objects and not the compatibility objects.

The design of the script is to give a basic view of index usage and then try to evaluate the cost of that index.  If the updates to an index far outweigh the queries that use it, then the index may not be useful; in that case, the cost of maintaining the index probably outweighs its benefit.

The original query employed dbo.sysobjects and dbo.sysindexes.  The reason for using dbo.sysindexes was merely to calculate the size of the index; the use of dbo.sysobjects seemed like an oversight.  The sticky point I ran into was calculating the size of the index.  The problem was finding a method that would execute the entire query and still perform as well as the original.  The first attempt yielded a query that was absolutely abysmal.

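The original listing did not survive in this archive.  As a minimal sketch of what this first attempt could have looked like — the column list, the join conditions, and the 'DETAILED' mode are my own assumptions, not the original code — the size comes from sys.dm_db_index_physical_stats used as a subquery (derived table):

```sql
-- Sketch of the slow first attempt: index size is pulled from
-- sys.dm_db_index_physical_stats inside a subquery.
-- 'DETAILED' mode reads every page of every index, which is slow.
SELECT  OBJECT_NAME(i.object_id) AS TableName,
        i.name AS IndexName,
        ISNULL(u.user_seeks + u.user_scans + u.user_lookups, 0) AS UserReads,
        ISNULL(u.user_updates, 0) AS UserUpdates,
        ps.SizeMB
FROM    sys.indexes i
        INNER JOIN sys.objects o
            ON  o.object_id = i.object_id
            AND o.is_ms_shipped = 0
        LEFT JOIN sys.dm_db_index_usage_stats u
            ON  u.object_id = i.object_id
            AND u.index_id  = i.index_id
            AND u.database_id = DB_ID()
        LEFT JOIN ( SELECT object_id, index_id,
                           SUM(page_count) * 8 / 1024.0 AS SizeMB
                    FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'DETAILED')
                    GROUP BY object_id, index_id ) ps
            ON  ps.object_id = i.object_id
            AND ps.index_id  = i.index_id;
```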
To calculate the size while using the SQL 2005 objects, I used the function sys.dm_db_index_physical_stats, employed directly as a subquery in the query above.  This version takes in excess of seven minutes to return results, so I moved on to find a better method.

The next attempt was to move the subquery into a CTE.

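Again, the original listing is not preserved here; a sketch of the CTE version under the same assumptions (illustrative column names, 'DETAILED' mode) would simply lift the size calculation into a WITH clause:

```sql
-- Same query, with the expensive size calculation moved into a CTE.
WITH IndexSize AS
(
    SELECT object_id, index_id,
           SUM(page_count) * 8 / 1024.0 AS SizeMB
    FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'DETAILED')
    GROUP BY object_id, index_id
)
SELECT  OBJECT_NAME(i.object_id) AS TableName,
        i.name AS IndexName,
        ISNULL(u.user_seeks + u.user_scans + u.user_lookups, 0) AS UserReads,
        ISNULL(u.user_updates, 0) AS UserUpdates,
        ps.SizeMB
FROM    sys.indexes i
        LEFT JOIN sys.dm_db_index_usage_stats u
            ON  u.object_id = i.object_id
            AND u.index_id  = i.index_id
            AND u.database_id = DB_ID()
        LEFT JOIN IndexSize ps
            ON  ps.object_id = i.object_id
            AND ps.index_id  = i.index_id
WHERE   OBJECTPROPERTY(i.object_id, 'IsMsShipped') = 0;
```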
This version returns in about thirty seconds — still not acceptable, but substantially better.  The original query returned in roughly three seconds.  It is also important to note that both queries are equally accurate; since both return the same result set, the only focus is on performance.  The delay in this query revolves entirely around the use of the function.  Armed with the knowledge that the real delay is in that function, there needs to be an alternative method to pull back the information.  I decided to use a temp table to stage the data, which let me test the efficiency of the query directly.

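A sketch of the temp-table version, under the same assumptions as the earlier sketches (the temp table name and columns are illustrative):

```sql
-- Stage the expensive size calculation in a temp table first,
-- so its cost can be paid (and measured) separately from the report query.
IF OBJECT_ID('tempdb..#IndexSize') IS NOT NULL
    DROP TABLE #IndexSize;

SELECT object_id, index_id,
       SUM(page_count) * 8 / 1024.0 AS SizeMB
INTO   #IndexSize
FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'DETAILED')
GROUP BY object_id, index_id;

-- The reporting query now joins to the staged data instead of the function.
SELECT  OBJECT_NAME(i.object_id) AS TableName,
        i.name AS IndexName,
        ISNULL(u.user_seeks + u.user_scans + u.user_lookups, 0) AS UserReads,
        ISNULL(u.user_updates, 0) AS UserUpdates,
        ps.SizeMB
FROM    sys.indexes i
        LEFT JOIN sys.dm_db_index_usage_stats u
            ON  u.object_id = i.object_id
            AND u.index_id  = i.index_id
            AND u.database_id = DB_ID()
        LEFT JOIN #IndexSize ps
            ON  ps.object_id = i.object_id
            AND ps.index_id  = i.index_id
WHERE   OBJECTPROPERTY(i.object_id, 'IsMsShipped') = 0;
```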
The population of the temp table consumes 97% of the query cost; the prior version of the query also showed a 97% cost associated directly with the function.  If I populate the temp table prior to executing the remainder of the script, I can compare the performance of this script against the original.  When compared that way, this version runs in about one second.

The next step from here would be to employ a method that populates a table in an “Admin” database.  By using a pre-populated table, I can maintain good performance while gaining the information concerning index usage.
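That idea might be sketched as follows — the database name, table name, and target database are all hypothetical, and a scheduled job would run the population step off-hours:

```sql
-- Hypothetical persisted snapshot table in an "Admin" database.
-- Names (AdminDB, dbo.IndexSizeSnapshot, MyUserDB) are assumptions.
USE AdminDB;

IF OBJECT_ID('dbo.IndexSizeSnapshot') IS NULL
    CREATE TABLE dbo.IndexSizeSnapshot
    (
        capture_date datetime       NOT NULL DEFAULT (GETDATE()),
        database_id  int            NOT NULL,
        object_id    int            NOT NULL,
        index_id     int            NOT NULL,
        SizeMB       decimal(12, 2) NOT NULL
    );

-- Capture the sizes for the target database; the reporting query
-- then joins to this table instead of calling the function.
DECLARE @db_id int;
SET @db_id = DB_ID('MyUserDB');

INSERT INTO dbo.IndexSizeSnapshot (database_id, object_id, index_id, SizeMB)
SELECT @db_id, object_id, index_id,
       SUM(page_count) * 8 / 1024.0
FROM   sys.dm_db_index_physical_stats(@db_id, NULL, NULL, NULL, 'DETAILED')
GROUP BY object_id, index_id;
```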
