Goal Review Q1

Categories: News, Professional
Comments: 1 Comment
Published on: March 31, 2010

As we close out the first quarter of 2010, it is time to review progress on the goals I set at the beginning of the year.  So far, I think I am progressing pretty well, and there has been measurable progress.  Below I will break down each of the goals established in that blog post back in January.

Start a Blog

I think this one is pretty self-evident at this point.  The blog is started and is active, and I have been able to produce good articles throughout these three months.  The blog is syndicated at SQLServerCentral.com, and people at least look at the articles from time to time.  Some even leave comments.

Implement a BI Solution

The solution I planned has made progress, though I need to spend more time on the project.  I have been able to reduce the number of errors encountered while processing the package; some were logic errors and some were execution errors.  As a result, the package performs better and provides more accurate results.  I have a few more phases to implement, including an archival piece, a reporting piece, and more data retrieval.

1 Technical Blog Post per Week

As far as averages go, I am well ahead of this goal.  Going strictly by the calendar, without carrying excess articles forward or back, I have been able to output at least one technical article per week.  This works well with the first goal, since the main premise of my blog is to write about SQL Server from a professional perspective.

Attend 2010 PASS Summit

I have not yet attended the summit, and can’t do that until November.  However, I have registered for the Summit and have every intention of attending this year.

Present Once a Quarter (or 4 times)

In the first quarter I presented twice at the local PASS Users Group.  The first was in January and the second was in March.

Write 2 Articles

I am half-way home with this goal.  The first article was published by SQLServerCentral in March.  It was titled Log Growing Pains.  I am working on topics for the next article.  I should have at least one more article submitted and published by the end of the year and hope to have more than that.

Run Two Marathons

I am doing really poorly at this one so far.  I still have plenty of time to correct course and get on track for the year, and I will get back on track.

As for the less measurable goals, I think I am doing fine there as well.  I was appointed VP of our PASS Users Group, I am on a subcommittee for Summit 2010, and I am actively involved with the community at SQLServerCentral.com.

Index Info

Categories: News, Professional, Scripts
Comments: 2 Comments
Published on: March 31, 2010

I recently came across an interesting script to help show index info and usage.  As I began to dissect the script, I noticed that the script was intended for SQL 2005 and beyond but was using SQL 2000 compatibility views.  I decided I would take it upon myself to update the script to use the 2005 objects and not the compatibility objects.

The design of the script is to give a basic view of the usage of an index and then evaluate the cost of that index.  If updates to the index far outweigh the queries that use it, then the index may not be useful and its cost probably outweighs its benefit.

The original query employed dbo.sysobjects and dbo.sysindexes.  The use of dbo.sysindexes was merely to calculate the size of the index; the use of dbo.sysobjects seemed like an oversight.  The sticky point I ran into was calculating the size of the index: the problem was finding a method that executed the entire query and performed as well as the original.  The first attempt yielded a query that was absolutely abysmal.

[codesyntax lang="tsql"]


The method I used to calculate the size with the SQL 2005 objects was the function sys.dm_db_index_physical_stats.  I employed it directly as a subquery in the above posted query.  This version takes in excess of seven minutes to return results, so I moved on to find a better method.
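The per-row approach described might be sketched like this (a sketch, not the original script; the column choices are assumptions).  Because the function is invoked for every index row, it is easy to see why it was slow:

```tsql
-- Sketch of the slow, inline-function version: usage comes from
-- sys.dm_db_index_usage_stats, and the size is computed by calling
-- sys.dm_db_index_physical_stats once per index via CROSS APPLY.
SELECT  OBJECT_NAME(us.object_id) AS TableName,
        i.name                    AS IndexName,
        us.user_seeks + us.user_scans + us.user_lookups AS Reads,
        us.user_updates           AS Writes,
        SUM(ps.page_count) * 8    AS SizeKB   -- pages are 8 KB each
FROM    sys.dm_db_index_usage_stats us
        JOIN sys.indexes i
          ON i.object_id = us.object_id
         AND i.index_id  = us.index_id
        CROSS APPLY sys.dm_db_index_physical_stats(
            DB_ID(), us.object_id, us.index_id, NULL, 'LIMITED') ps
WHERE   us.database_id = DB_ID()
GROUP BY us.object_id, i.name,
         us.user_seeks, us.user_scans, us.user_lookups, us.user_updates;
```

The LIMITED mode still populates page_count, so no deeper scan level is needed just for a size estimate.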

The next attempt was to move the subquery into a CTE.

[codesyntax lang="tsql"]


This version returns in about thirty seconds.  Still not acceptable, but substantially better; the original query returned in roughly three seconds.  It is also important to note that both queries return the same result set, so the only focus is on performance.  The delay in this query revolves entirely around the use of the function.  Armed with the knowledge that the real delay is in that function, there needed to be an alternative method to pull back that information.  I decided to use a temp table to stage the data, so I could directly test the efficiency of the query.

[codesyntax lang="tsql"]


The population of the temp table consumes 97% of the query cost; the prior version also showed a cost of 97% associated directly with the function.  If I populate the temp table prior to executing the remainder of the script, then I can compare the performance of this script against the original.  When comparing, this version runs in about one second.
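The staging approach can be sketched as follows (assumed table and column names, not the original script): the function runs once to fill the temp table, and the usage query then joins to it.

```tsql
-- Stage the expensive function output once for the whole database.
SELECT  ps.object_id,
        ps.index_id,
        SUM(ps.page_count) * 8 AS SizeKB
INTO    #IndexSize
FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
GROUP BY ps.object_id, ps.index_id;

-- The usage query now does a cheap join instead of a per-row function call.
SELECT  OBJECT_NAME(us.object_id) AS TableName,
        i.name                    AS IndexName,
        us.user_seeks + us.user_scans + us.user_lookups AS Reads,
        us.user_updates           AS Writes,
        sz.SizeKB
FROM    sys.dm_db_index_usage_stats us
        JOIN sys.indexes i
          ON i.object_id = us.object_id
         AND i.index_id  = us.index_id
        JOIN #IndexSize sz
          ON sz.object_id = us.object_id
         AND sz.index_id  = us.index_id
WHERE   us.database_id = DB_ID();

DROP TABLE #IndexSize;
```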

The next step from here would be to employ a method that populates a table in an "Admin" database.  By using a pre-populated table I can maintain good performance while gathering the information concerning index usage.

Stepping Stone Cert

Comments: 6 Comments
Published on: March 30, 2010

A few of us have been deliberating on the need for a bridge certification between the MCITP and MCM certifications from Microsoft.  The discussion was brought about by an editorial by Steve Jones at SQLServerCentral.  As it has progressed, I believe some good ideas are emerging on how to create such a bridge.  I want to share some of that info here, along with some background on the topic.

Why is there a need?

Microsoft has two levels of certification that are of interest to many database professionals: the MCITP and the MCM.  The MCITP is a series of computer-based exams designed to test general skills related to one of the database roles and the exam topic in particular.  This certification is relatively accessible and has minimal relative cost (you pay for each exam and study materials).

The MCM is a higher achievement with considerably more cost.  The MCM is conducted via a three-week "boot-camp" training session on the Microsoft campus.  The cost is $18,500 for the three weeks alone; beyond that are all associated travel fees (lodging, food, transportation).  To be accepted into the program there is an application process and a non-refundable fee of $125.  There are exams and labs that must be passed in order to receive the certification, along with a time requirement of five years in the profession and being an MCITP.  You do get to receive training at the hands of MVPs such as Kimberly Tripp and Paul Randal, which could be quite an experience on its own.  As you can see, this certification is designed for senior-level database professionals and has some high barriers to entry.

These two levels of certification leave a chasm between them; there is a lot of middle ground that is not covered.  Many can't afford the MCM, and many won't do the MCITP due to perception.  If there were something between the two to bridge the gap, perception would change in the community as well as in business.  The certification in the middle needs to build on the MCITP and serve as a building block for the MCM, should one desire to pursue it.  However, it should also be recognized as standing on its own (which the MCITP does not) as a quality certification.

Barriers to Entry

Barriers to entry are requirements that must be met by an individual prior to being accepted.  The term comes from business and economics, where it denotes the difficulties a company may have when starting up in an industry.  Here, barriers to entry are good: they help filter candidates and create a perception of quality.

I believe the MCM has some high barriers for a few good reasons.  These reasons are:

  1. Prevent people from cheating the system.  Certifications have received a black eye in general due to people cheating on exams and through the use of widely available brain-dumps.
  2. Be a credible certification.  Since Microsoft controls the environment of this certification, it would be difficult to cheat your way to being an MCM.  The difficulty could also garner more respect for the person who achieves it and thus instill more confidence in the certification system.
  3. Make the MCM prestigious.  With so much cost and so few people attaining the certification, it could become a desirable certification.
  4. Generate revenue.  In addition to the reasons above, the program produces revenue for Microsoft as well.

Bridging the Gap

How does a company that sponsors a certification bridge a gap such as the one between Microsoft's MCM and MCITP?  That is the dilemma that has several people talking, particularly in the discussion forum for the editorial mentioned above.  A foundation of ideas is forming into a consensus on how to handle this middle ground.  That does not mean all of the questions have been answered, nor that these ideas will be adopted by Microsoft or even PASS.  Here is a review of the foundation being proposed.

  1. MCITP Certified
  2. Lab / Project
  3. Review Board
  4. Presentation
  5. Exams
  6. Time in Service

This bridge would be akin to the journeyman in the trades, with the MCITP being more like an apprenticeship.  You have hopefully learned a bit from the exams and studying required to gain that certification.  The next step is to journey out as an apprentice and establish yourself in the industry.  Somebody signs off that you are able to perform the duties to which you ascribe; a journeyman can do the work without supervision and has the endorsement of his peers.  This is a formalized approval.  Does one need to be certified to be able to do the job?  No.  This is a path for those who desire to be certified.  Will this path lend more credence to the person doing the job?  Potentially.  Since this certification does not exist in a bubble, there will be some gaming of the system, but the amount of gaming and cheating should be considerably less.

MCITP Certified

Being certified at the base level should be a prerequisite.  The same requirement holds for the MCM, and it shows some level of competence and intent toward certification and the DBMS.

Lab / Project

Conducting a hands-on exercise is essential to testing the ability of the candidate.  Being able to perform a given task under pressure is what we do as professionals, and this will help divulge the mettle of the candidate as well as demonstrate the skill set.  How this exercise would be administered is yet to be fleshed out.  There needs to be a large pool of exercises to help prevent brain-dumping of content or cheating of the system.


Presentation

Senior-level professionals in the community need to be able to communicate effectively.  The presentation can be on a topic of the candidate's choosing, and it could be scored by the attendees to provide additional feedback for the Review Board.  A benefit of requiring this is to help build a presenter pool for the various events in the community.  It also provides additional material for reviewing how the candidate handled the task and the candidate's knowledge base.

Review Board

A panel of peers needs to review each and every applicant.  This serves multiple purposes.  The primary purpose is to reduce the likelihood that somebody has gamed the system.  The board members should be certified and well-respected individuals in the SQL community; a nice example would be to have MVPs sign up for this.  A secondary benefit is that this review could serve as the application for the MCM.  The review will entail, at minimum, a review of a presentation the candidate has done, the lab/project, and exam scores, and potentially an interview directly with the candidate.

Time in Service

This is the most controversial requirement for the certification.  If one is pursuing the MCM, then one must wait five years anyway; if not, the candidate may balk at the need to wait two years.  I feel that time in service gives more credence to the certification.  From the Boy Scouts up through employment, there are several time-in-service stipulations one must meet in order to attain a goal.  I don't think two years as a database professional is a hard requirement (especially for those who have already been working in the arena for several years).  This time in service can serve as a proving period, and I think exceptions can be built in for this requirement.


Exams

This is pretty straightforward.  There are exams required for the MCITP, so there need to be exams for this middle cert.  The exams need to entail greater difficulty and possibly target a more specific skill set.  This is an area for greater discussion.


These requirements are designed to build the community and bring a higher level of credence to the profession.  In no way should they be considered an effort to esteem ourselves as the elite.  There are plenty of details still to be discussed, but this is a worthwhile cause to champion and to try to get PASS and Microsoft on board with.  Currently we call this certification the MCJ (Journeyman).  I think a renaming of the MCITP is also in order to help delineate the path from beginner to master; MCITA would suffice for me.  Maybe the MCJ should be more along the lines of MCITJ, but MCJ works in coordination with the MCM as far as naming conventions go.

FizzBuzz Revisited

Categories: News, Professional, Scripts
Comments: No Comments
Published on: March 30, 2010

I had bigger plans for this series than what has transpired, which is a testament to how rapidly events can change your plans. The first thing that changed my plans was the advent of a competition on ask.sqlservercentral.com centered around the FizzBuzz question.  Then came the T-SQL Tuesday concerning IO; my submission for that meme involved some FizzBuzz examples to help demonstrate the point.  The series I had planned was left in limbo.  This article is an attempt to resurrect it and wrap it up.

The IO article helps fill a gap left by the first article.  In that article I discussed the creation of a numbers table, but did not explore the cost of creating that numbers table in order to solve the FizzBuzz question.  If you already have the numbers table, then the cost is minimized and spread across all solutions that use it.  However, if you do not have the table, there is significant IO cost in creating it, and that cost increases with the size of the numbers table.  As demonstrated by the IO article, the cost for a large numbers table was the eventual filling of my hard drive; that is certainly not something you would want to do on a production system.

So what is an alternative method that avoids such an IO cost?  My final solution was the result of several trials and tweaks.  Some were just for fun, and some were to test performance gains.  I had thought of walking through the evolution of those trials, but decided against it and will go straight to the final two.

Version 1

[codesyntax lang="tsql"]


This solution was labeled as one of my "for fun" tweaks, but there was a good reason for this version: what if I only wanted to know whether a specific value was a multiple of 3 or 5 or both?  That gave me the @WhatNum idea, so the query can check a single value; if no number is supplied, it queries the entire range.  In other versions of this same query, I use the abs() function in the where clause on the @Limit variable.  This is another optional tweak, but I feel it should be present (though it is not demonstrated in the above script).  If I run this script for a specific value, the query performs faster than for the entire range (as expected).  If I want the entire range returned, the query still performs quite rapidly and is on par with the solutions provided in the editorial comments and the ask answers (ask link provided; the editorial link is referenced in the first article in the series).
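The @WhatNum idea can be sketched roughly like this (a sketch, not the original script; the tally-CTE construction is an assumption, though the @Limit and @WhatNum variables come from the post, and the VALUES row constructor requires SQL 2008):

```tsql
-- An in-memory tally (no disk IO) drives the FizzBuzz output;
-- @WhatNum optionally restricts the result to a single value.
DECLARE @Limit   INT,
        @WhatNum INT;
SET @Limit   = 100;
SET @WhatNum = NULL;  -- NULL returns the entire range

WITH E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),
                                     (1),(1),(1),(1),(1)) t(N)),  -- 10 rows
     E2(N) AS (SELECT 1 FROM E1 a CROSS JOIN E1 b),               -- 100 rows
     Tally(N) AS (SELECT TOP (@Limit)
                         ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
                  FROM E2 a CROSS JOIN E2 b)                      -- 10,000 rows
SELECT  N,
        CASE WHEN N % 15 = 0 THEN 'FizzBuzz'
             WHEN N % 3  = 0 THEN 'Fizz'
             WHEN N % 5  = 0 THEN 'Buzz'
             ELSE CAST(N AS VARCHAR(12))
        END AS FizzBuzz
FROM    Tally
WHERE   @WhatNum IS NULL OR N = @WhatNum;
```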

Version 2

[codesyntax lang="tsql"]


The major speaking point of this version is that the final CTE performs the select and the FizzBuzz assignment.  In many solutions this happens outside of the CTE.  This method performs slightly faster in some cases and for the most part is on par with the other solutions.
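The pattern described, with the FizzBuzz assignment inside the final CTE rather than in the outer select, might look something like this (a sketch under assumptions; the post's actual query may differ, and a recursive CTE is only one of several ways to generate the numbers):

```tsql
DECLARE @Limit INT;
SET @Limit = 100;

WITH Nums(N) AS (
        SELECT 1
        UNION ALL
        SELECT N + 1 FROM Nums WHERE N < @Limit),
     FB AS (
        -- the FizzBuzz assignment lives in the final CTE
        SELECT N,
               CASE WHEN N % 15 = 0 THEN 'FizzBuzz'
                    WHEN N % 3  = 0 THEN 'Fizz'
                    WHEN N % 5  = 0 THEN 'Buzz'
                    ELSE CAST(N AS VARCHAR(12))
               END AS FizzBuzz
        FROM Nums)
SELECT N, FizzBuzz
FROM   FB
OPTION (MAXRECURSION 0);  -- lift the default 100-level recursion cap
```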


The biggest selling point of both of these solutions is the speed and IO cost.  Neither of them uses disk IO, which speeds both queries up substantially.

The best thing about this exercise has been the ingenuity the community has shown.  There are some amazing responses: some used pivot tables, others the cascading CTE method, and yet another used a view (which may actually be the fastest).  I learned a great many things from some of the solutions.  Go check out those solutions and discussions; they are worth it.


Comments: 3 Comments
Published on: March 25, 2010

There seems to be quite a flurry of talk these days about certification. There is evidence of it in the forums and editorials at SSC, as well as in several personal blogs of various SQL Professionals throughout the community. I even see it happening with the quarterly goals at my employer (yes we have quarterly goals).

I have, myself, taken several certification exams. I found them worthwhile for my personal and career growth, though I haven't taken an exam in probably ten years. I have always found that studying by myself, with materials I could find online or at a bookstore, was the best way to go. Studying in this fashion, I could drive the pace and spend as little or as much money as necessary to pass the exam. Through this practice I have earned quite a few letters after my name, from Cisco to Novell to Microsoft to Citrix to A+ and N+ (it all started with the A+ certification due to employment requirements). Each of the certs was job related and useful when working in a one-man-shop setting.

Well, it is time again to commit to getting certified. I doubt I will pursue the MCM any time soon. However, I will be pursuing the MCITP certification as soon as possible. I will probably pursue some other vendor specific certifications due to work related goals and prior personal goals I never fulfilled. As I progress, I will likely blog about my successes and experiences. However, don’t expect to find any information specific to my exams – other than general feelings upon completion.

Security Audit

Comments: 8 Comments
Published on: March 19, 2010

Of late I have seen a lot of questions on how to audit the logins and users on each SQL Server.  I had the same questions when I went through this exercise some time ago.  My first step was to peruse the internet and see what I could find to get me started, and I found that quite helpful.  There were a lot of different scripts that were beneficial; like most, though, I found one that I preferred above the rest.  That script can be found here.

Why do I like this script?  I like the format.  It also generates a nice output that can be passed along to auditors.  The output is saved into an html format and seems more presentable to me.  Besides those facets, it meets the base requirements – I can find what roles and users have what permissions in each database on a SQL Server Instance.

The script didn’t quite suit all of my needs.  I think that is frequently the case.  The trick is being able to take the script and make necessary adjustments to suit whatever needs you may encounter.  The changes that I made to this script were in favor of progressing toward an automated inventory solution that I could run from a central location.  The script as it stood required manual intervention.  Granted, I have not yet completed my inventory solution, I have modified the script to work well with 2000 and 2005 and output the results to a properly consumable html file.  Since 2000 and 2005 behave differently in certain regards, I had to add some logic for the script to also behave differently if depending on the version of SQL Server it was run against.  This was necessary since I have SQL 2000 – SQl 2008 in my environment.

Scripts of Change

So, starting from the top: I decided to use several more variables and create a bunch of temp tables.  The variables help in the decision making, and the temp tables help with data storage for processing as the script runs.  Thus we have this block of code at the top in place of the old variable block from the original script.

That is the prep setup, so we can now begin the true work of the script.  As I said, some decision logic was added to the script: I needed a way to determine the SQL Server version and, based on that version, execute a different script.  And now we have the decision block.
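A sketch of the kind of decision block described (the branch bodies here are placeholders, not the original logic):

```tsql
-- Derive the major version number: '8' = SQL 2000, '9' = SQL 2005,
-- '10' = SQL 2008.  SERVERPROPERTY('ProductVersion') is available
-- on all of these versions.
DECLARE @Version INT;
SET @Version = CAST(PARSENAME(
        CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(32)), 4) AS INT);

IF @Version < 9
BEGIN
    -- SQL 2000 branch: only the old system tables exist
    PRINT 'Use sysusers / sysmembers / syspermissions';
END
ELSE
BEGIN
    -- SQL 2005+ branch: prefer the catalog views over the
    -- deprecated compatibility objects
    PRINT 'Use sys.database_principals / sys.database_role_members / sys.database_permissions';
END
```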

Basically, I am checking the version and determining whether I should use the SQL 2000 objects or the SQL 2005 objects, since the 2000 objects are scheduled for deprecation.  Also, since xp_cmdshell is disabled by default in SQL 2005, I am prepping to enable it just for the final piece of this script.  Due to the nature of xp_cmdshell, it is advisable that you understand the security risk involved and revert it to disabled (if you enabled it to run this script).  There are other methods for doing this, I am sure, but I chose this one since I got consistent results and have not had time to revisit it.
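The enable-then-revert pattern for xp_cmdshell on SQL 2005 and later uses the documented sp_configure options; a minimal sketch:

```tsql
-- Temporarily enable xp_cmdshell (requires the 'show advanced options'
-- setting to be visible first).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- ... run the step that needs xp_cmdshell here ...

-- Revert both settings once finished, per the security caution above.
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;
EXEC sp_configure 'show advanced options', 0;
RECONFIGURE;
```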

After that decision tree, I have changed the main body of the script to also use a decision tree in building the dynamic sql.  That tree is built like the following snippet.

I think you can see at this point some of the differences and why I chose to do it this way.  The final section of code change comes at the end of the script.  This is where the html file is finally built, and then saved out to the file-system.

In this section, I am enabling xp_cmdshell if necessary.  I am also performing one more necessary trick: using xp_cmdshell to flush bad DNS records and ping a remote host.  I will be saving the file to a central repository, and I found some bad DNS records on my servers while doing this process; adding this step saved me quite a bit of frustration in the long haul.  After that, I use xp_cmdshell to bcp the results out to a file.
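A sketch of what that final step might look like (the server name, file path, table, and column names are all placeholders, not the original script):

```tsql
-- Flush any stale DNS entries, then verify the repository host answers.
EXEC master..xp_cmdshell 'ipconfig /flushdns';
EXEC master..xp_cmdshell 'ping -n 1 CentralRepository';

-- bcp the staged html lines out to the central share.  A global temp
-- table (##) is used because bcp connects on its own session and
-- cannot see a local #temp table.
DECLARE @cmd VARCHAR(1000);
SET @cmd = 'bcp "SELECT HtmlLine FROM tempdb..##AuditResults ORDER BY LineID" '
         + 'queryout "\\CentralRepository\Audit\SecurityAudit.html" -c -T -S '
         + @@SERVERNAME;
EXEC master..xp_cmdshell @cmd;
```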

This took some work to get all of the single quotes (' ') lined up correctly and working properly with BCP.  It was somewhat satisfying when it finally came together.

Now, remember I said you should reset xp_cmdshell back to disabled once completed?  Well, I built that into the script as a part of the cleanup.  I perform this action right before dropping all of those tables that I created.


I effectively took a well-working script and made it suit my needs and wants a little better. The initial code was just over 300 lines, and I nearly doubled that with this script.  Is it worth the extra effort?  Yes!  Though it took some time and effort to make these modifications, I was able to finish auditing the servers well ahead of the pace of doing it by hand.

Furthermore, I can still use this script and continue to reap the benefits of having taken the time to modify it.  Can the script be improved?  Sure it can.  I have a few things in line for it currently.  The biggest piece of it will be modifying it to be run from the inventory package I am still trying to finish in my spare time.

You can download the script in its entirety here. You may also read more about security and permissions from these articles here.

Edit: Fixed some WP formatting issues.


Categories: Book Reviews
Comments: No Comments
Published on: March 12, 2010

I am now 3/4 done with the pure torture of finishing this series.  Could it get any more sappy, whiny, and boring?

I am trying really hard to find anything of real value to pull from these books.  Teenage (yawn) romance.  I hope the last book gets better.  I have even been skipping big chunks and still am not missing a beat with the story.  There is no real climax in this book; you might call the battle in the forest the climax, but it was so poorly described.  Many pages are wasted describing and re-describing non-essential items.  Why do we need to repeat, repeat, repeat, and repeat some more the same descriptions and the same plot?

The book could have been immensely better if not for the gaping holes in the story (unrelated to skipping the repetitive descriptive filler).  I also thought it was the wrong move to allow the youngling to be killed by Jane.

Oh well, just another book that was not worth the time it took to read it.

Rating: D

Another Presentation Down

Categories: News, Professional
Comments: 2 Comments
Published on: March 11, 2010

This evening I had the opportunity to once again give a presentation at our local SQL Users Group (SSSOLV).  I went into the presentation hoping to encourage participation and group discussion.

I think the presentation started off a little shaky for a couple of reasons.  I had sent out an email requesting additional info to help prep for the presentation; however, I sent it late, which limited the feedback.  I also didn't review the PASS monthly slide deck and sort of shot from the hip going through that material.

Despite that, the presentation went well.  I had several people give good feedback.  They liked the presentation!!  Participation really went well and it felt more like a conversation than a presentation – that is a good thing.

Probably the best thing to come from the presentation was the resounding support in favor of trying to get a SQLSaturday down here.  With that info, it looks like I will be starting the process to host a SQLSaturday and organize it from our side.

Another benefit derived from this presentation is the growing warm-up happening within the group.  Each presentation I have given has improved in group participation, which I am sure comes from both sides.  Going back to some of my early blogs about participating in local user groups, I want to reiterate how good it feels to participate.  I am glad I have had the opportunities to present.

Log Growing Pains Article

Categories: News, Professional
Comments: No Comments
Published on: March 10, 2010

I just had an article published at SQLServerCentral on troubleshooting the cause of log growths.  At the conclusion of that article I referenced my blog, with the expectation of having had time to create a post about using Event Notifications to track log growth and determine the cause.  I have not completed that yet.

Sincerest Apologies.  Please check back from time to time for that article.

Also, as a sidebar: since writing the article, I have thought of a few things that would probably have made the scripts a little better.  I am sure some of those items will be hashed out in the forum associated with the article.

The article is here.

New Moon Review

Categories: Book Reviews
Comments: No Comments
Published on: March 10, 2010

I am doing this only as an effort at a more than fair shake at the series of books.

I have been critical of this teenage romance series, and with good reason.  The series is authored by Stephenie Meyer and has created quite a following among teenage and adult female audiences.

The book is quite sappy in its romance, with the confused Bella trying to figure out her teenage love.  She swings from Edward to Jacob and back again.  To say she is ditzy would be an understatement.

I have found, though, that the book can be enjoyable if read the way my mother-in-law read it (and she liked the book): speed read through it.  The way to do that here is to skip the repetitive paragraphs and pages; you can easily pick up the story later without having missed a beat.  Becoming an extremely casual reader of this book improves its likability.  Another benefit is that the book finishes much more quickly.  The story becomes bearable, and there is actually a climax in the book (predictable as it may be).

Now, it is off to finish the rest of the books.

Overall grade = C-
