Always Waiting, Waiting, Waiting

Categories: News, Professional
Comments: 3 Comments
Published on: December 23, 2013

What’s all the Wait about…

I have been meaning to publish this post for a long, long time.  I have no idea what I have been waiting on.  As a DBA, that isn't necessarily a good thing.  As DBAs, we generally like to know what is causing the delay, what is doing the waiting, and why.

It's even a bit of a coincidence, because the topic today would have also worked very well for this month's TSQL Tuesday.  Robert Davis invited all to participate by writing about waits in SQL Server in some fashion or another.  You can read a bit about that from his roundup, with all of the necessary links, here.

Today, I only hope to be able to do a minor justice to the topic.


Since DBAs really do not like to be caught off-guard, it is very common practice to monitor the waits on the server(s) under their domain.  If the waits are not monitored, then the DBA should at least know how to check them and determine what may be causing the delays (and/or procrastination) in SQL Server.
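For those checking by hand, a quick sketch of the sort of query involved is a look at the sys.dm_os_wait_stats DMV.  The short exclusion list below is illustrative only – a production version would filter many more benign wait types.

```sql
-- A minimal sketch: top waits by total wait time since the last stats clear.
-- The exclusion list here is illustrative, not exhaustive.
SELECT TOP (10)
    wait_type,
    wait_time_ms / 1000.0 AS wait_time_sec,
    waiting_tasks_count,
    wait_time_ms * 100.0 / SUM(wait_time_ms) OVER () AS pct_of_total
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'WAITFOR',
                        N'BROKER_TO_FLUSH', N'REQUEST_FOR_DEADLOCK_SEARCH')
ORDER BY wait_time_ms DESC;
```

That gets you the raw numbers; a tool like the one discussed below adds the trending and history on top.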

I want to share a tool that has impressed me for several years.  The tool should be pretty popular by now.  Not only do I want to share a bit about it, but I will also show how to become a bit more efficient with it – by having the tool work for you before you have to turn to it and start hunting.

Let’s just say this is a small gift from me to you for this Holiday Season.

What is it?

I was introduced to this tool 6+ years ago.  I was happy with it then and started to use it where I could at my employer.  After moving on, I have consistently recommended it.  That said, I like the tool for its very precise design around monitoring and inspecting waits on the server.  That tool, of course, is Ignite for SQL Server by Confio.  I will be writing about Ignite 8 and not as much about Ignite Central.

Before getting too far into it, I want to say that, like many worthwhile tools, Ignite gets better with each release.  For me, that speaks to the company and their willingness to listen to their constituents.  Take the feedback – make the tool better.  Know what you do, know what you do well, and continue to improve it.  I think Confio does a fine job at that.


What you see now is a quick screenshot with a stacked bar chart showing some information that Ignite might present to you.  In this case, I have a monthly trend report for a specific server showing the top x waits and how each of those waits stacks up in the grand scheme of things.

Now, at a glance, this is great information.  It is enough to get you started.  You can see a trend, or maybe the absence of one.  You can identify at a glance which waits are reportedly problematic on your server.  From here you can even drill in and get more information by clicking a section of one of the stacked bars to see what is related to that wait type on that day.

That is all great.  It's even better when you are in the middle of troubleshooting (you just have to remember to use the tool).

But what if you are off-site and can't get to the server housing the reports?  What if you are a consultant and don't necessarily need or want to log in to the client server each day just to check this information?  The simple solution is to have the report emailed, right?

Well, with Ignite, that is a possibility too.  Confio has created several canned reports that are (rare species here) useful out of the box.  To make it easier for all of us, a link has been created in the application on the Home Page.  It is really easy to get to the reporting module and see all of the reports that can be viewed.

With that, we are starting to get somewhere.  If you click the Reports link on the home page, you will be presented with two list boxes from which you can pick some reports.


I can run any of those reports from that prompt.  That's good news, but it is not quite our final destination.  We want these reports to run auto-magically and be emailed to us.  If you look around a bit more on the Reports screen, you will find a “Report Schedules” button.  Once the new page loads, you will find a Create Schedule button.  Clicking it gives you the opportunity to create a schedule that automatically emails a report, or group of reports, to a group of people or just to yourself – your choice.  Following the prompts is very straightforward and worth the five minutes or so it takes to create the schedule.

Here's a bit of a caveat: you must execute and save the report before you can add it to a schedule.  Once you have done these few simple steps, you can access the reports from your favorite tablet or mobile device.  Better yet, should you see something out of line, you can take action on it (call somebody and have them fix it – or, for those taking vacation, remote in and fix it yourself 😉 ).

This was a short and sweet introduction to just one feature of a really good tool.  As a DBA, I like to automate what I can.  I also like to monitor what I can.  Reporting is where automation and monitoring meet – and it buys you free time.  If I can implement a solution with minimal time that provides the information I need, I am usually in favor of it.  DBAs need reports on how SQL Server is performing.  Without them, you are just waiting to fight fires rather than being proactive.  So I hope this simple gift of automated reporting from a great SQL tool can give you more time in the future to be a better DBA.

A Trio of Tools

Comments: 3 Comments
Published on: November 15, 2011

I have talked about tools for SQL Server a few times in the past.  You can read some of what I wrote here and here.

Since writing those last articles, I have come across more tools here and there.  Over the past few weeks, I came across three that stood out and I wanted to give them a quick shout out.

SSMS Tools Pack:  I have already written about this tool.  It was recently updated, and the functionality has been improved since I last wrote about it.  Not only did the functionality improve, but the feature set is better now too!  Go give it a try.

SSIS Reporting Pack:  This tool is available on CodePlex.  The same information could be queried from TSQL, but this gives you an interface (SSRS reports) for browsing it.  The same kind of reporting pack would be very useful for SSRS itself.  I know people ask from time to time for this kind of information in both products, so the usefulness of these tools should be pretty high.

SQL Treeo:  Straight up, this tool was created to add customization to the tree view that you get in default SSMS.  Some (many) find the inability to customize this tree inadequate.  This tool gives you the ability to create custom folders for the various objects, which lets you sort the tree in SSMS a bit differently.  It can also make many database professionals a bit more productive.

Check the tools out.  Test them and see if you like them.  Let the creator of each of these tools know what you think about their product.  Other than writing about the tools, I personally have no affiliation with any of them.  But I do think they are good tools and are certainly worth the effort of testing for yourself.

SSRS Export En Masse

Comments: 5 Comments
Published on: August 8, 2011

Have you ever found yourself in need of exporting all of the reports from a Report Server?  I have needed to extract all RDLs from a Report Server on more than one occasion.  Reporting Services 2008 would rather have you do that in a monotonous, repetitive style.  That doesn't work very well for me – and especially not if you have a ton of reports to export.

I combed the internet to see if there was a quick-to-implement way to do this – and found some recommendations.  Most of these required the installation of another utility or the purchase of other software.  I found one site, however, that was extremely useful for what I wanted.  Brett Stateham used some queries to display the XML for these reports that is stored in the Catalog table of the ReportServer database.  You can read about that here.

Reading his blog post on the topic, I got the distinct idea that I could use a query to extract the XML and save it to an RDL file.  To verify this, I copied the XML (as returned by the queries on Brett's page) to a notepad file and saved the file with an RDL extension.  Then I added that file back into a solution in Visual Studio and ran the report – sweet.  It worked.

Now that I knew it could be done as simply as he displayed, it was time to take it a step further.  I needed to create something that would help me export numerous reports to their own individual files.  Hmmm, what could I use?  Having done something similar in the past for a single XML file in SSIS, I decided I would just use SSIS.  Thinking it through, I figured this should be pretty simple in appearance (a file destination object, an execute sql task, a data source, a file source, a foreach loop – you get the idea).

As I started to work through the solution, I found that I was overthinking it a bit and that the solution could be easier.  That made me wonder if a script task would be helpful here.  I decided to research whether there was a quick way to write the XML to a file via a script task.  Guess what – there is.  Using VB.Net, there are simple methods inside a script task to write to the filesystem.

So, without further ado, let’s take a look at the package.

First, let’s create a few variables for use throughout the package.

  • objReport – This is of type Object and will be used to store the results of our TSQL statement from an Execute SQL Task.
  • ReportExportPath – Type is String; this will be the destination file path.  The last character of this variable should be a “\”.
  • ReportName – Type is String; for use by the ForEach Loop.  This variable receives the report name on each iteration through objReport and is later used by the Script Task to name the output RDL file.
  • ReportSourcePath – Type is String.  This variable helps extract only the reports from a specific location.  In many cases, reports are stored in different folder paths and you may not need all folders’ contents.
  • ReportXML – Type is String.  Similar in function to ReportName.
  • DBName – Type is String.  This value will be used to override the Initial Catalog of the connection string.
  • ServerName – Type is String.  This value will be used to override the Server Name of the connection string.

Next, we need a connection object to serve as the data source for this package.  I created an ADO.Net connection object and named it ReportSourceDB.  Once created, use expressions from the Properties tab to overwrite the Initial Catalog and Server Name values with the variables already mentioned.
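The property expressions on the connection manager end up being nothing more than the variables themselves – something along these lines (property names as they appear on an ADO.Net connection manager):

```
InitialCatalog : @[User::DBName]
ServerName     : @[User::ServerName]
```

With those in place, pointing the package at a different server or database is just a matter of changing the two variables.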

Now that we have that stuff out of the way, we can begin work on creating the data flow.

Displayed above is the entire data flow for this package.  I adapted, albeit ever so slightly, the script we discussed at the beginning to the following for use in this package.
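As a sketch of the kind of query involved (illustrative only, not the exact script), assuming the standard ReportServer Catalog schema, where Type = 2 marks a report and the Content image column holds the RDL:

```sql
-- Illustrative sketch, not the exact script from the package.
-- Catalog.Content stores the RDL as an image column; converting it through
-- varbinary(max) and then xml recovers the report definition.
-- @ReportSourcePath is mapped to the ReportSourcePath package variable.
WITH ReportItems AS
(
    SELECT c.[Name] AS ReportName,
           CONVERT(xml, CONVERT(varbinary(max), c.Content)) AS ReportDef
    FROM dbo.[Catalog] c
    WHERE c.[Type] = 2                          -- 2 = report
      AND c.[Path] LIKE @ReportSourcePath + '%' -- limit to one folder path
)
SELECT ReportName,
       CONVERT(nvarchar(max), ReportDef) AS ReportXML
FROM ReportItems;
```

The result set hands one row per report (name plus its definition as text) to the rest of the package.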

This script is in the ExecuteSQL task.  The task is configured to retrieve the full result set and store it in objReport.  A parameter is specified and passed to the above query in the first CTE.  The following images should help clarify the configuration a bit.

General Tab:

Parameter Mapping Tab:

Result Set Tab:

So far so good.  This is now getting all of the data that we need from the ReportServer database.  We can now pass this information to the ForEach Loop container and write the XML out to RDL files.  The configuration of the ForEach Loop is not too complicated and looks like this:

Collection Tab:

Variable Mappings Tab:

Inside of this ForEach Loop container we have the script task that was displayed.  This was probably the trickiest (yet extremely simple) part of the whole exercise for me.  I have used script tasks in the past for various tasks and was worried this might be a bit more complicated.  So let's start with a couple of pictures.

There is no need to use a ReadWrite variable in this task.  This task simply reads the variable and then writes the contents of that variable out to a file.  Note that the script language is set to VB 2008.  The default (for me) was C# – and I changed that.  Once the variables are specified for ReadOnlyVariables, click the Edit Script button at the bottom of the screen.

In the new window that opens, highlight everything and overwrite it all with the following.
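As a minimal sketch (not the exact original), assuming the ReportExportPath, ReportName, and ReportXML variables are listed as ReadOnlyVariables, the body of the script looks something like this:

```vb
' Minimal sketch of the Script Task's Main method - not the exact original.
Public Sub Main()

    ' Build the destination file name from the export path and report name,
    ' then write the report definition (XML) out as an .rdl file.
    My.Computer.FileSystem.WriteAllText( _
        Dts.Variables("User::ReportExportPath").Value.ToString() & _
        Dts.Variables("User::ReportName").Value.ToString() & ".rdl", _
        Dts.Variables("User::ReportXML").Value.ToString(), _
        False)

    Dts.TaskResult = ScriptResults.Success

End Sub
```

The final False parameter tells WriteAllText not to append, so each iteration produces a fresh file.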

The important line here is the one containing My.Computer.FileSystem inside Public Sub Main().  Note how I am using the variables we have created to build the file name (the first parameter inside WriteAllText()) and to write the contents of the file from the ReportXML variable (the second parameter inside WriteAllText()).

From here, it is simply a matter of testing the package.  Run the package and check the directory you specified in the ReportExportPath variable.  This little project will save me an enormous amount of time in exporting reports to files from the database.  Yay – no more repetitive download tasks done through Report Manager.

Enjoy, and put it to good use.

Tools note

Categories: News, Professional, SSC
Comments: No Comments
Published on: March 24, 2011

I was asked a few months ago to allow some advertising on my blog.  I hadn’t given the idea a whole lot of thought but was reminded again today about the request.  For now, I think I will just give the requester a mention and add the tool he wants advertised to the growing list of tools that I have been compiling.

The tool looks like it has some merit and seems worth the trial.  The tool is called DBDefence.  You can find more about it here.  The purpose of this tool is to help encrypt your databases from the outside and prevent people from prying too far into what they shouldn't.  I haven't yet tested the tool – but it seems quite interesting.

And since I have a captive audience already, I will add one more tool to the mix.  Like DBDefence, this other tool is security focused.  In SQL Server there is a bug that allows the savvy to view the unencrypted passwords of the SQL accounts that have logged in since the last service start.  One of the passwords that falls into this category is that of the sa account.  Sentrigo has a tool that will clear that password from memory so it can no longer be viewed in clear text.  The name of the tool is Passwordizer.  You can check it out from here.

Database Tools Follow-up

Categories: News, Professional, SSC
Comments: No Comments
Published on: March 18, 2011

After posting a database tools list yesterday, I found that there were things I had forgotten and a couple of things I hadn't even considered.  In thinking about this, I found myself trying to find a script that I knew I had seen and wanted to put on the list.

Trying to find the script got me a bit out of sorts.  I knew I could probably find it on my share at work with my scripts (which reminds me that I really need to get a Dropbox or put them all on a thumbdrive to keep with me).  But I wasn't at work and wanted to locate this cool script that I knew was recommended by another prominent person (more about that later).  So I pulled out my google-fu.

I tried searching the web for the blog that had the script, using several combinations of the keywords I knew would help me find it.  In doing this search, I came across a few blogs that had created lists of tools that database professionals should check out.  Here is a short list of the blogs I found.

All of these lists are great compilations.  I also found one more list, but found that it was just a copy of Aaron's list.  This was extremely disappointing.  Just reference the list or come up with a list of your own.  A SQL Server MVP should never have to copy content from another MVP.  It's not even that the list is the same – it was a copy of Aaron's entire post.  I personally hold MVPs to a higher standard and find it disappointing and aggravating when someone esteemed as a major contributor to the community copies content.

None of these lists had the item for which I was looking.  The item I wanted was a script that Brent Ozar (blog) had given high compliments a few months back.  It was, he said, “a script that's like my Blitz server takeover script, but even more advanced.  Love it!”  Too bad I couldn't remember the person, the script, or the wording he used when I was trying to find it.  Better yet, the page with the reference popped up several times in my searches.  Well, needless to say, I pinged the community and Brent on twitter, and Brent came through.  Here are a few more items to add to the list of tools you should have or try.

  1. 60 Minute Blitz Script (Brent Ozar) – check it out.  It will be worth your time.  If you looked at the other lists you will also see that this script is there and highly recommended.
  2. Diagnostic Information Queries (Glenn Berry) – this was the script I was trying to find.  This is the script that Brent has given high praise – give it a look.
  3. sp_whoisactive (Adam Machanic) – Newest version as of this post is 10.76 and is listed as a beta version.
  4. SSMS Tools Pack (Mladen Prajdic)

So there you have it.  Between my two lists and these other great lists – you should have a great toolbox.

Database Tools

Categories: News, Professional, SSC
Comments: 5 Comments
Published on: March 17, 2011

Recently I have posted a couple of articles about various database or SQL tools, either for maintenance or for benchmarking.  Those posts work well in conjunction with a post from several months back that you can read here.  These posts made me think a bit harder about the tools that I use to do the job.

In thinking about those tools, I realized that I did not have a list of what I have used or currently use.  Thus, I went through an exercise of compiling my list of tools.  Some of these tools are currently in use; some are tools that I have used in the past and was impressed by.  And someday in the future I will finish compiling the list of all of the tools I have encountered over the past year or two that I have intended to test.

Without further ado:

  1. SSMS Addins – This is available on CodePlex.  One of the features that intrigues me is the ability to script out the data from a table.
  2. OpenDBDiff – This one seems to have promise.  Similar in function to Visual Studio 2010 or RedGate SQL compare tools, this one is free and compares the schema and objects.  If you are on a tight budget and need to be able to compare two databases, this just might be the tool to try.
  3. SQL Monitor – A tool to monitor various things in SQL Server like jobs and executing queries.  Kind of low-level, but I figured I would test this app out and see if it had some merit.
  4. SQL Nexus – This is a tool to help evaluate performance issues with SQL Server.  You can evaluate wait stats along with PSSDiag files.
  5. SQL Powershell Extensions – I recently learned of this tool on Codeplex.  This is a high priority item for me to download and test.  This tool helps to create “intuitive functions around the SMO objects.”
  6. PowerShellPack – Download from Microsoft to enhance the powershell experience.
  7. Data Dictionary – This software is on my list to evaluate.  It is mostly out of curiosity because I have something in place to create data dictionaries already.  This tool allows you to update the extended properties from a GUI.
  8. US Census Data – I think this one is intriguing as a sample data set.
  9. SQL Schema Source Control – This is an SVN plugin
  10. ScriptDB4SVn – Another SVN Plugin to get your database projects into source control.
  11. SQL Source Control (RedGate) – Do you detect a theme going on now?  This is a commercial product to integrate into SVN or TFS.  It integrates into SSMS and has received many great reviews.  I have seen it in use and it is a good product.
  12. SQL Diagnostic Manager (Idera) – I used this tool a lot a few years back.  The tool has gotten better since.  I need to get another license for it and try it again.
  13. Confio Ignite – I was a part of a focus group testing this tool.  I was highly impressed by the tool.  Ignite allows you to gather waitstats and other diagnostic information to monitor the health of the server.  I would highly recommend this tool.
  14. TOAD (Quest Software) – I used this tool a few years ago and liked it.  This tool is useful for determining quickly the alternatives to writing your query in a few different ways and to view the performance impact of those changes.
  15. DBA Bundle and Developer Bundle (RedGate) – Alternatively, you could look for the Toolbelt by RedGate.  The bundles are chock-full of high-value tools to do the job.
  16. SQL Scripts Manager – This is a collection of Scripts from various contributors that has been made available for free by our friends at RedGate.
  17. Dr. DMV – Glenn Alan Berry has some awesome scripts for use on your 2005 and 2008 servers.  These scripts make great use of the DMVs in SQL Server.
  18. DBA Dashboard – This is a set of reports put together to help you identify resource usage and the source of that resource consumption.
  19. SQLPing3 – Security type tool to help you discover SQL Servers on the network.
  20. Discovery Wizard for SQL Server (Quest Software) – A tool to help discover SQL Instances on the network.
  21. SQLCentric – By Robert Pearl, this tool is a web based monitoring and alerting tool for your SQL Servers.
  22. Power Architect – I used this tool largely for helping to document some data models.  This is a reasonably priced tool and it works quite well.
  23. SQLIO – This one is from our friends at Microsoft and I think the name explains it.
  24. SQLIOSim – Another tool from Microsoft whose name, I think, explains it.
  25. IOMeter – Another IO tool
  26. GeekBench – This tool will quickly measure processor and memory and provide some benchmarks.
  27. Plan Explorer (SQLSentry) – I find this tool extremely useful.  The execution plans are much easier to read in this tool than in SSMS.  I use both to compare and contrast and am able to more quickly ascertain the pain points of a query.  The readability of Plan Explorer is great and the additional features really help augment your abilities to query tune based on Execution Plans.

There you have it!  That is quite the list.  I don't have all of these installed, but I would recommend trying some of them out and getting them in your toolbox.  A good recommendation would be to install them into a virtual machine while you are testing the software.  One of my goals for the year is to actually get all of my preferred tools installed into a VM that I can port with me.  That way, I would have them readily available with little to no downtime spent installing them.

If you have any tools that you think are great and should be in the DB Professional toolbox, please leave a comment or drop me a note.
