PASS Subcommittee

Categories: News, Professional, SSC
Comments: No Comments
Published on: March 31, 2011

In February I posted about the need for volunteers for SUMMIT 2011, and I submitted my application to volunteer.  This past week I got word back concerning that application.  I am pretty happy to have the opportunity once again to serve on a subcommittee for the upcoming SUMMIT.

What does this mean?  Well, I am really looking forward to participating and helping with the event in any way I can.  It also means that I will not be submitting any abstracts this year.  I have a goal in the back of my mind to someday submit an abstract for Summit, as well as to speak at more events where feasible.

The restriction on submitting an abstract comes from the subcommittee on which I will be serving.  I foresee this as being an exciting assignment as well as an opportunity to get to know many people a little better.

Here’s looking forward to an Exceptional Summit in 2011.


Categories: News, Professional, Scripts, SSC
Comments: 3 Comments
Published on: March 30, 2011

Do you use NULLIF?  For me, this function has been seldom used.  Because of that, I have been dabbling with it a bit lately to make sure I have a better understanding of it.

What is it?

This is a function that compares two expressions.  If the two are equal, the result is a NULL of the same datatype as the first expression.  If they differ, the result is the value of the first expression.

Seems pretty simple.

In Action

If you look online at MSDN, you can see a couple of examples and a more complete description of what this function is and does.  The MSDN article can be found here.  I wanted something a little different and decided to visualize it for my own learning.  So here is what I did.
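A script along these lines does the trick (the firstval and secondval column names come from the description below; the table name and value ranges are illustrative):

[codesyntax lang="tsql"]
-- Build a random sampling of value pairs to compare
DECLARE @NullIfDemo TABLE (firstval INT, secondval INT);

DECLARE @i INT = 0;
WHILE @i < 100
BEGIN
    INSERT INTO @NullIfDemo (firstval, secondval)
    VALUES (ABS(CHECKSUM(NEWID())) % 10, ABS(CHECKSUM(NEWID())) % 10);
    SET @i += 1;
END

-- NULLIF returns NULL when the two values are equal,
-- otherwise it returns firstval
SELECT  firstval,
        secondval,
        NULLIF(firstval, secondval) AS FirstValNullIfMatch
FROM    @NullIfDemo;
[/codesyntax]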



This gives me a nice random sampling of values to compare.  I simply compare the firstval column to the secondval column; both are populated with random numbers.  As part of the result set, I label the comparison column with something descriptive.  I return all of the columns so I can see the values alongside the result of the comparison.  This visualization helps in understanding what the code is doing: now I know that if I see a NULL value, the two columns are equal.

Do You Solution?

Comments: No Comments
Published on: March 29, 2011

Do you ever find yourself working on different scripts at once?  Maybe you are working on something that tests certain functionality in SQL Server, you have some presentation scripts you are working on, and possibly three or four scripts related to a project at work.  I find myself in a situation like this from time to time – multiple scripts open checking this or that.

How do you organize your scripts?  Do you leave them all open in SSMS?  Do you create a folder structure on your disk somewhere and set yourself a reminder task?  I have left them all open in SSMS on occasion because I wanted to come back to them and work on them and it can help remind me where I am in the thought process and what needs to be done next.

Did you know that there is a feature in SSMS to help you with organizing yourself?  It’s not there just to help you get organized – but it can also help with efficiency and recoverability of files.  I’ll explain the recoverability a bit later.

SSMS has a feature called “Solution Explorer.”  Much like Visual Studio or BIDS (which is Visual Studio), you have the ability to create a solution file to manage your projects.  This solution can then be stored on disk and updated as you progress through the solution or projects.

If you want to display this feature, click the View menu from the toolbar and then select “Solution Explorer.”  This will open a new pane in Management Studio from which you can add new or existing projects as well as rename the solution.  To the right is an example of one of my solutions.  In this image, you can see that I have two projects associated with this particular solution.  In these projects, I have loaded numerous script files as I work on them.  Some of these scripts test the results of stored procs, while others may be stored procs, and yet others may be tables.  What the script performs is immaterial to Solution Explorer.

Should I decide that I need multiple files open from the project, I can highlight them all and open them all at once.  Nothing earth shattering, but now I have an easy way to see the files associated with each task or project as I work on the task/project.

The project files are stored in XML format and are saved to disk in a directory of your liking.  An unfortunate side effect of this storage is how your scripts are sorted within the project.  There is, however, a way to fix it; I found the fix at this blog.  It is nice to have scripts sorted to your liking, so I recommend giving that blog a read.  I would also recommend using this feature in SSMS – if you aren’t already doing something similar (for instance in Visual Studio, SVN, or the like).

I like having the ability to see all of the scripts quickly that I may need on the same screen I am working in.  I can bounce quickly between the scripts as needed and I don’t necessarily have to keep them open indefinitely while working out the task at hand.

I mentioned earlier that it also helps with the recoverability of files.  Let me explain a bit.  Have you ever been in the middle of working on a script when the dreaded forced shutdown occurs due to updates?  Just in case you did not get your files saved, SSMS has an autorecover feature as well as an autosave feature.  Autorecovered and autosaved files are commonly saved in a folder named Solution1 within your user profile directory, and they can be brought back so you can continue working on them.

Should you need to find these files, here is a common path you can use to retrieve them.

%userprofile%\Documents\SQL Server Management Studio\Backup Files\Solution1

Do you want to learn more about this feature?  Check out the Microsoft documentation on the topic.  You can view that here.

Something that I really like about this feature is the ability to create a script either directly in the project, add an existing script, or take a script that you started working on outside of the project and move it into the project from within SSMS.  It took me a bit of looking to find how to add a script from SSMS directly to the project (when not initiated from within the project).  It is really easy to do once you find it.  Here is how to do it.

With the focus in SSMS active on the script you wish to move, click on the File menu.  From the File menu you will have an option to move the script that you are currently working on.  From there, select the appropriate project – it is that simple.

*Note: In the image displayed, the query I wanted to move was SQLQuery1.sql.

ShowPlan XML

Categories: News, Professional, SSC
Comments: No Comments
Published on: March 28, 2011

I just ran into something that I hadn’t noticed in SQL Server.  It struck me as somewhat interesting but is really not too big of a deal.

While demonstrating that the XML plan can be generated when you dump data into a temp table, I happened onto this little ditty.  First, let’s take a look at the simple demo script that I created for the temp table dilemma.
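Something simple along these lines (the source table and temp table names are illustrative):

[codesyntax lang="tsql"]
-- Dump some rows into a temp table
SELECT name, object_id
INTO #ObjectList
FROM sys.objects;

SELECT * FROM #ObjectList;

DROP TABLE #ObjectList;
[/codesyntax]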



It was thought, by the questioner, that inserting into a temp table just as I am demonstrating would cause an error to be thrown when trying to show the XML plan.  I can execute this query and it runs just fine.  Now, if I add SHOWPLAN_XML to it, as in the next script, it will still run fine.
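Along these lines, with the showplan command in its own batch (the temp table load itself is illustrative):

[codesyntax lang="tsql"]
SET SHOWPLAN_XML ON;
GO

-- The temp table demo, batched separately
-- from the showplan command
SELECT name, object_id
INTO #ObjectList
FROM sys.objects;
GO

SET SHOWPLAN_XML OFF;
GO
[/codesyntax]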



Pretty straightforward – and note that I have batched the showplan separately from the code I am testing, as required by the showplan command.  The curious part comes when looking at the results.  Well, not entirely the results – but more accurately the column name of the results.

I looked at the column name for this output and thought: “wait, I know this is a SQL 2008 instance.”  I decided to verify the version and results, and so reran the query below to test.
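Something like the following will do (the exact query used is illustrative):

[codesyntax lang="tsql"]
-- Confirm the version of the instance
SELECT @@VERSION AS VersionInfo,
       SERVERPROPERTY('ProductVersion') AS ProductVersion;
[/codesyntax]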



This query produces the following results.

This seems pretty conclusive to me that I am running SQL 2008 but the showplan still outputs as 2005 XML Showplan.  I also checked this on 2008 R2 and get the same results.  Microsoft has decided to continue to use the 2005 Showplan schema and did not update that name.  Would it be nice to have an updated label?  Certainly it would.  The flip-side is that the XML Schema does not seem to have changed, so it is merely aesthetic in nature.

If you would like, you can take a look at the schema here.  I only checked the last updated date on the current, SQL 2008, and SQL 2005 sp2 schemas and saw that the date was the same for all of them.

Tools note

Categories: News, Professional, SSC
Comments: No Comments
Published on: March 24, 2011

I was asked a few months ago to allow some advertising on my blog.  I hadn’t given the idea a whole lot of thought but was reminded again today about the request.  For now, I think I will just give the requester a mention and add the tool he wants advertised to the growing list of tools that I have been compiling.

The tool looks like it has some merit and seems worth a trial.  The tool is called DBDefence, and you can find more about it here.  Its purpose is to encrypt your databases from the outside and prevent people from prying too far into what they shouldn’t.  I haven’t yet tested the tool – but it seems quite interesting.

And since I have a captive audience already, I will add one more tool to the mix.  Like DBDefence, this other tool is security focused.  In SQL Server there is a bug that allows the savvy to view the unencrypted password of the SQL accounts that have logged in since last service start.  One of the passwords that falls into this category is that of the sa account.  Sentrigo has a tool that will clear that password from memory so it can no longer be viewed in clear text.  The name of the tool is Passwordizer.  You can check it out from here.

SQL Server Startup

From time to time I see a question asking how to determine the last time a SQL Server was restarted, or the last time a database was used.  For the first question there is a pretty common answer that works for SQL 2000 and up.  To answer both questions, though, I am going to show you three quick, similar scripts that can be used to find this information.  The catch is that these queries will not work on SQL 2000; on the flip side, they should give you an idea of how to find the same information there.
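The first script is along these lines:

[codesyntax lang="tsql"]
-- tempdb is recreated at every service start (SQL 2005+)
SELECT create_date AS SQLServerStartTime
FROM sys.databases
WHERE name = 'tempdb';
[/codesyntax]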




A common answer to the question of when SQL Server was last started is to check the create_date of the tempdb database.  Since tempdb is recreated every time SQL Server starts, it is a pretty accurate timestamp of the server start.  This query was written for SQL 2005 and above, but you can still find the create date of tempdb in SQL 2000 by using a different query.
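The second script leans on the DMVs; a sketch consistent with the description (column aliases are my own):

[codesyntax lang="tsql"]
;WITH DBActivity AS (
    -- Most recent read and write activity per database
    SELECT  database_id,
            MAX(COALESCE(last_user_seek, last_user_scan, last_user_lookup)) AS LastRead,
            MAX(last_user_update) AS LastWrite
    FROM    sys.dm_db_index_usage_stats
    GROUP BY database_id
)
SELECT  r.start_time AS SQLServerStartTime,
        DB_NAME(a.database_id) AS DatabaseName,
        a.LastRead,
        a.LastWrite
FROM    sys.dm_exec_requests r
        CROSS JOIN DBActivity a
WHERE   r.session_id = 1;   -- first process id, akin to sysprocesses spid 1
[/codesyntax]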




In this query we access one of the Dynamic Management Views available in SQL 2005 and up.  In this case, we look at the first process id, similar to what one may have done with sysprocesses in SQL 2000.  Now is probably a good time to note something happening inside the CTE.  Notice the COALESCE being used?  I am taking the MAX of the fields associated with read activity in a database.  This information is pulled from a DMV called sys.dm_db_index_usage_stats.  This query gets me the most recent read and write activity for each database (notice the GROUP BY).

Default Trace
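A sketch of this approach (matching the description below):

[codesyntax lang="tsql"]
DECLARE @TracePath NVARCHAR(260);

-- Dump the file path of the default trace into a variable
SELECT @TracePath = [path]
FROM sys.traces
WHERE is_default = 1;

-- The server startup event has EventSequence = 1 and IsSystem = 1
SELECT StartTime
FROM sys.fn_trace_gettable(@TracePath, DEFAULT)
WHERE EventSequence = 1
  AND IsSystem = 1;
[/codesyntax]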



The last one makes use of the default trace in SQL 2005, 2008, and 2008 R2.  The default trace records the time of certain events that occur in the database, including when the database is starting up.  To find the server startup time, I check the EventSequence and IsSystem fields for a value of 1.  As you can see, I also dump the file path of the default trace into a variable and use that in the function call to get the data.

When comparing the performance of these three options, they came out pretty much even – each took its turn being faster in elapsed time.  However, the default trace method did return a more expensive execution plan every single time.  With how fast these queries run, I’m not sure that is conclusive, nor that it would be a significant concern.  These queries are designed for the occasional run by the DBA rather than to be executed millions of times a day.  You decide which is best for you.  Given the use of a function to read the default trace, I was mildly surprised that the query performed as well as it did.

Another thing to note is that each of these methods will return a slightly different timestamp.  For instance, the tempdb timestamp gives me 3:21:28 and the sys.dm_exec_requests produces a timestamp of 3:21:36 and the tracefile shows a timestamp of 3:21:24.  I don’t have any hard foundation for why that is – just speculation that seems to make sense.  I will just leave it at that though.

In addition to these methods there is also the option of checking the system event logs to determine the SQL Service startup time.  This information should help you when investigating your databases and server and would be something handy to keep in your toolbox.

SQL Resource

Comments: No Comments
Published on: March 22, 2011

By now, I am sure that you are well aware of the Resource Database.  Starting with SQL Server 2005, a new system database was made available – the Resource Database.  In SQL 2005 you could move the Resource Database – but in 2008 and beyond you cannot.  If you plan your install accordingly, your databases should install where you would like them (i.e. the program files directory that is the default installation directory can be changed during setup).

That is all well and good, but what is in this database and how can we take a look into it?  I thought about this as I was reading a blog by Thomas LaRock where he discusses system tables and system views.  I started thinking about the underpinnings of these views and whether we could see them in the Resource Database.  Taking a peek into the Resource Database is rather trivial.  I will show you how to copy that database and attach the copy as an alternate database.

The first step in taking a peek is to locate the directory where the data and log file for the hidden database are stored.  You can find the path by referring back to the SQL 2005 and 2008 links earlier in this post.  In my case, the path (I am using SQL 2008 SP2) for me is displayed in the next image.

Take the two highlighted files, copy them and place them into a new directory.  I copied and renamed the files – placing the files into my “Data” directory as follows.

As you can see, the database is rather small – and you could probably do this procedure multiple times ;).

Once you have the files copied to the new locations, you are now ready to attach the database.  I am just using the GUI to attach the database – for illustrative purposes.  While attaching, I think there is an interesting thing that should stand out.

Now look closely at the image just above – pay attention to the current file path.  This is the screen you will see after you browse to your new directory location and choose to add the mdf file for the Resource database.  You will see that this file path does not match either of the file paths that I have already shown.  Take a closer look:

You can see that the file path references a path for e:\sql10_katmai_t\sql…  I find that pretty interesting.  I don’t know the reason for that path persisting like that.  The file paths shown need to be changed to the appropriate file path where we placed the copied files – as follows.

There is one more thing that I think you should change.  This really is a matter of preference and is good housekeeping.  Change the name of the database in the Attach As field.  I appended _test to the end of the database name.  This should make it obvious that this database is NOT the system database.

With these few steps, you now have the resource database attached as a User Database.  With that, you now can take a look at the inner makings of some of the system views.  Take the following queries for example:
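For example (using the _test name from the attach step):

[codesyntax lang="tsql"]
-- The backwards-compatibility view in master...
SELECT * FROM master.sys.sysdatabases;

-- ...and the same view in the attached copy of the Resource database
SELECT * FROM mssqlsystemresource_test.sys.sysdatabases;
[/codesyntax]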



Both of these queries return exactly the same thing, and both are provided only as a means of backwards compatibility.  You can query master.sys.sysdatabases just like mssqlsystemresource_test.sys.sysdatabases.  One thing you can do, though, is script out that view and see the source of the data.  Well… with the resource database attached as described, you can now script it out.  When you do, make sure you script the correct version of it.  Just like any user database, you will see a category of system tables and one of system views; the objects under those categories cannot be scripted.  The same objects, however, are also represented in this database as “user” objects – which can be scripted.

After you script out the object sys.sysdatabases, you will see that the source of the data is the following:
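An abridged sketch of what you will see (the actual view definition returns many more columns):

[codesyntax lang="tsql"]
CREATE VIEW sys.sysdatabases AS
SELECT  d.name,
        dbid     = d.database_id,
        crdate   = d.create_date,
        filename = f.physical_name
FROM    sys.databases d
        LEFT JOIN sys.master_files$ f
            ON f.database_id = d.database_id
           AND f.file_id = 1
[/codesyntax]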



There you have it – this view is predicated on the newer view, sys.databases.  As has been said over and over again, it exists only as a backwards compatibility view until Microsoft decides to remove it.  Also note that the other part of the foundation for the sysdatabases view is the sys.master_files$ view.  You can now see this in the system views collection inside the attached copy of the Resource database.  Unfortunately you can’t script it, but at least you can see the columns it would return.  You also can’t query that view directly – you have to reach it through the system views.

Dedupe Data CTE

Comments: 2 Comments
Published on: March 21, 2011

As a DBA working with SQL Server, some data issues arise that need to be fixed.  One such data issue is the duplication of data.  Duplicate data can happen for a number of reasons.  One such reason is the absence of constraints such as Primary Keys.

Luckily, there are ways for the Data Professional to clean up such a quagmire.  First, let’s set the stage with a really simple example.
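Something along these lines (the table and column names are my own illustration):

[codesyntax lang="tsql"]
-- No primary key or unique constraint, so dupes are possible
CREATE TABLE dbo.DupesTest (
    Col1 INT,
    Col2 INT
);

-- Populate with random values in a narrow range
-- so duplicates are likely
INSERT INTO dbo.DupesTest (Col1, Col2)
SELECT TOP 1000
       ABS(CHECKSUM(NEWID())) % 50,
       ABS(CHECKSUM(NEWID())) % 50
FROM sys.all_columns;
[/codesyntax]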

Here, we are simply creating a table and populating it with two columns of randomized data.  This should give us a significant amount of duplicate data and a good enough test bed to demonstrate how to quickly fix the issue.  Due to the random nature of the inserts, we will get a significant range of duplicates for some values and maybe no duplicates at all for others.

Let’s check our work and see how well we did with the duplicate creation.

Each time I have run this, I have come up with a good test bed, and it should work relatively well for you as well.  Now we can work on eliminating those duplicates.  To better illustrate that we have dupes, and how they are distributed, let’s have a look at this query.
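A sketch of the distribution check (using the illustrative dbo.DupesTest table with columns Col1 and Col2):

[codesyntax lang="tsql"]
SELECT  Col1,
        Col2,
        COUNT(*) AS DupeCount
FROM    dbo.DupesTest
GROUP BY Col1, Col2
HAVING  COUNT(*) > 1
ORDER BY DupeCount DESC;
[/codesyntax]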

This query will yield results like what we see in the image to the right.

Similarly, I can run a query contained within a CTE and take advantage of the ROW_NUMBER() function to help find where I may have dupes.
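For example (again using the illustrative dbo.DupesTest table):

[codesyntax lang="tsql"]
;WITH DupeFinder AS (
    SELECT  Col1,
            Col2,
            -- Number each row within its group of identical values
            ROW_NUMBER() OVER (PARTITION BY Col1, Col2
                               ORDER BY Col1) AS RowNum
    FROM    dbo.DupesTest
)
SELECT *
FROM   DupeFinder
WHERE  RowNum > 1;   -- anything past the first row is a dupe
[/codesyntax]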

Now that we know for certain that we have a large number of dupes, let’s finally work on eliminating them.  We can use a query just like the last one, altered a bit so we can delete the dupes.  Under normal circumstances, I would make absolutely certain that the tables affected by these deletes were backed up.  You can accomplish that either by copying the data into a new table or by running a database backup.
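The delete is a small variation on the same CTE (illustrative dbo.DupesTest table again); deleting through the CTE removes the underlying rows:

[codesyntax lang="tsql"]
;WITH DupeFinder AS (
    SELECT  Col1,
            Col2,
            ROW_NUMBER() OVER (PARTITION BY Col1, Col2
                               ORDER BY Col1) AS RowNum
    FROM    dbo.DupesTest
)
-- Keep the first row of each group, delete the rest
DELETE FROM DupeFinder
WHERE RowNum > 1;
[/codesyntax]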

Really easy script.  This will delete all source table records that are dupes.  You can validate that by rerunning either of the first two queries used to demonstrate that duplicate data existed.  This is illustrated in the following image.

Since SQL 2005, removing bad duplicate data has become substantially easier.  This example should help to demonstrate that as well as provide a method to create a repeatable test while learning the technique.

Database Tools Follow-up

Categories: News, Professional, SSC
Comments: No Comments
Published on: March 18, 2011

After posting a database tools list yesterday, I found that there were things I had forgotten and a couple of things that I hadn’t even considered.  Thinking about this, I found myself trying to find a script that I knew I had seen and that I wanted to put on the list.

Trying to find the script got me a bit out of sorts.  I knew I could probably find it on my share at work with my scripts (which reminds me that I really need to get a dropbox or put them all on a thumbdrive to keep with me).  But, I wasn’t at work and wanted to locate this cool script that I knew was recommended by another prominent person (more about that later).  So I pulled out my googlefu.

I tried searching the web for this blog that had the script and tried several combinations of the key words I knew would help me find it.  In doing this search, I came across a few blogs that had created a list of tools that database professionals should check out.  Here is a short list of the blogs I found.

All of these lists are great compilations.  I also found one more list, but it was just a copy of Aaron’s list.  This was extremely disappointing.  Just reference the list or come up with one of your own.  A SQL Server MVP should never have to copy content from another MVP.  It’s not even that the list is the same – it was a copy of Aaron’s entire post.  I personally hold MVPs to a higher standard and find it disappointing and aggravating when one copies content while being esteemed as a person who has contributed much to the community.

None of these lists had the item for which I was looking.  The item I wanted was a script that Brent Ozar (blog) had given high compliments a few months back – in his words, “a script that’s like my Blitz server takeover script, but even more advanced.  Love it!”  Too bad I couldn’t remember the person, script, or wording he used when I was trying to find it.  Better yet, the page with the reference popped up several times in my searches.  Needless to say, I pinged the community and Brent on Twitter, and Brent came through.  Here are a few more items to add to the list of tools you should have or try.

  1. 60 Minute Blitz Script (Brent Ozar) – check it out.  It will be worth your time.  If you looked at the other lists you will also see that this script is there and highly recommended.
  2. Diagnostic Information Queries (Glenn Berry) – this was the script I was trying to find.  This is the script that Brent has given high praise – give it a look.
  3. sp_whoisactive (Adam Machanic) – The newest version as of this post is 10.76 and is listed as a beta version.
  4. SSMS ToolsPack (Mladen Prajdic)

So there you have it.  Between my two lists and these other great lists – you should have a great toolbox.

Database Tools

Categories: News, Professional, SSC
Comments: 5 Comments
Published on: March 17, 2011

Recently I have posted a couple of articles about various database or SQL tools either for maintenance or for benchmarking.  Those posts work well in conjunction with a post several months back that you can read here.  These posts made me think a bit harder about the tools that I use to do the job.

In thinking about those tools, I realized that I did not have a list of what I have used or currently use.  Thus, I went through an exercise of compiling my list of tools.  Some of these are currently in use; others I have used in the past and was impressed by.  Someday I will finish compiling the list of all the tools I have encountered over the past year or two and intended to test.

Without further ado:

  1. SSMS Addins – This is available on codeplex.  One of the features that intrigues me is the ability to script out the data from the table.
  2. OpenDBDiff – This one seems to have promise.  Similar in function to Visual Studio 2010 or RedGate SQL compare tools, this one is free and compares the schema and objects.  If you are on a tight budget and need to be able to compare two databases, this just might be the tool to try.
  3. SQL Monitor – A tool to monitor various things in SQL Server like jobs and executing queries.  Kind of low-level, but I figured I would test this app out and see if it had some merit.
  4. SQL nexus – This is a tool to help evaluate performance issues with SQL Server.  You can evaluate wait stats along with PSSDiag files.
  5. SQL Powershell Extensions – I recently learned of this tool on Codeplex.  This is a high priority item for me to download and test.  This tool helps to create “intuitive functions around the SMO objects.”
  6. PowerShellPack – Download from Microsoft to enhance the powershell experience.
  7. Data Dictionary – This software is on my list to evaluate.  It is mostly out of curiosity because I have something in place to create data dictionaries already.  This tool allows you to update the extended properties from a GUI.
  8. US Census Data – I think this one is intriguing as a sample data set.
  9. SQL Schema Source Control – This is an SVN plugin.
  10. ScriptDB4SVn – Another SVN Plugin to get your database projects into source control.
  11. SQL Source Control (RedGate) – Do you detect a theme going on now?  This is a commercial product to integrate into SVN or TFS.  It integrates into SSMS and has received many great reviews.  I have seen it in use and it is a good product.
  12. SQL Diagnostic Manager (Idera) – I used this tool a lot a few years back.  The tool has gotten better since.  I need to get another license for it and try it again.
  13. Confio Ignite – I was a part of a focus group testing this tool.  I was highly impressed by the tool.  Ignite allows you to gather waitstats and other diagnostic information to monitor the health of the server.  I would highly recommend this tool.
  14. TOAD (Quest Software) – I used this tool a few years ago and liked it.  This tool is useful for determining quickly the alternatives to writing your query in a few different ways and to view the performance impact of those changes.
  15. DBA Bundle and Developer Bundle (RedGate) – Alternatively, you could look for the Toolbelt by RedGate.  The bundles are chock full of high-value tools to do the job.
  16. SQL Scripts Manager – This is a collection of Scripts from various contributors that has been made available for free by our friends at RedGate.
  17. Dr. DMV – Glenn Alan Berry has some awesome scripts for use on your 2005 and 2008 servers.  These scripts utilize greatly the DMVs in SQL Server.
  18. DBA Dashboard – This is a set of reports put together to help you identify resource usage and the source of that resource consumption.
  19. SQLPing3 – Security type tool to help you discover SQL Servers on the network.
  20. Discovery Wizard for SQL Server (Quest Software) – A tool to help discover SQL Instances on the network.
  21. SQLCentric – By Robert Pearl, this tool is a web based monitoring and alerting tool for your SQL Servers.
  22. Power Architect – I used this tool largely for helping to document some data models.  This is a reasonably priced tool and it works quite well.
  23. SQLIO – This one is from our friends at Microsoft and I think the name explains it.
  24. SQLIOSim – Another tool from Microsoft whose name explains it.
  25. IOMeter – Another IO tool
  26. GeekBench – This tool will quickly measure processor and memory and provide some benchmarks.
  27. Plan Explorer (SQLSentry) – I find this tool extremely useful.  The execution plans are much easier to read in this tool than in SSMS.  I use both to compare and contrast and am able to more quickly ascertain the pain points of a query.  The readability of Plan Explorer is great and the additional features really help augment your abilities to query tune based on Execution Plans.

There you have it!  That is quite the list.  I don’t have all of these installed, but I would recommend trying some of them out and getting them in your toolbox.  A good recommendation is to install them into a virtual machine while you are testing the software.  One of my goals for the year is to get all of my preferred tools installed into a VM that I can take with me.  By doing so, I would have them readily available with little to no downtime spent installing them.

If you have any tools that you think are great and should be in the DB Professional toolbox, please leave a comment or drop me a note.
