Data Head

Categories: News, Professional, SSC
Comments: No Comments
Published on: September 27, 2011

Well, it’s official.  I am a Data Head.  It doesn’t change my level of geekiness much.  But I may have to rethink one of the answers given during that interview.

The Data Head profiles are being done by a training company called Data Education.  The company is another of Adam Machanic’s creations and is based out of Boston.

The people at Data Education contacted me a month or so ago and asked me a few questions.  I thought it was a cool idea and answered the questions and provided them with a profile picture.

There are two other Data Head profiles to this point.

Both of these people are MVPs.  The only Microsoft title I can associate with my name comes via my certifications (MCITP and MCDBA, for instance).  That brings up one small update to that profile – I have the MCITP too, not just the MCDBA (which, as you know, is expired).

I put more thought into the question on geekiest thing ever done.  If I could do it, I would totally do this.  Well, at least come up with a way (maybe a remote built into the gloves) to make it appear like I had something beyond the TSQL Force.

Swing by Data Education and check things out – at the very least check out the Data Heads.

In The SQL News

Categories: News, Professional, SSC
Comments: No Comments
Published on: September 23, 2011

There is a lot of stuff going on out there these days.  I am looking forward to a few things.  And since I am looking forward to them, I figure others out there are probably interested too!


SQL Server MVP Deep Dives 2:  The first book was awesome.  They have ramped it up again and are bringing us more excellent content.  I think this is a must-buy.  Go check it out from Manning Publications (the link).  Digital copies are already available for MEAP participants.

Jonathan Kehayias (blog|twitter): I don’t have a lot of info on this.  I know that the book is coming out soon.  I know that Jonathan does some really good stuff.  I know the person who is tech reviewing the book.  And I know that it is being published by Simple-Talk.  Check SQLServerCentral and Simple-Talk to find the book in the next few weeks to months.


SQL Inspire NYC:  Bobby Tables (aka Robert Pearl) is one of the event organizers and a friend.  The event is November 12, 2011, and registration is coming to a close very soon.  Currently there is even a push to get more people registered.  Check out Bobby Tables’ blog on the topic.

SQLSat 103: Another friend is organizing this SQLSat event down in paradise.  Roy Ernest is extremely excited for this event and wants to put on a good show.  I am working on getting my passport renewed.  This event is February 25, 2012.

SQLSat LV: Another person I know really, really well is working with a bunch of friends to try to put this thing together.  We have the date reserved as March 10, 2012.  Now I need to pull the rest of the info together.


Performance Stories: This contest is currently on-going and about to end.  I submitted an article.  Swing by and vote for my entry.

There are other things happening, like Summit 2011, Regional Mentors being named, and a few other books being written (by people like Stacia Misner, Denny Cherry, Jason Strate, and Andy Leonard).  I figure you probably already know about the new mentors and Summit by this time.  As for the books, I know even less about them than the ones I plugged.  But I can say that these books look to be very interesting as well.

September Performance Contest

Categories: News, Professional, SSC
Comments: No Comments
Published on: September 21, 2011

This month SafePeak is sponsoring a contest centered around improving performance in SQL Server.

The host of the contest is my friend Robert Pearl.  You can read his announcement of the contest here.  In that link you can also read his submission.  After reading Robert’s post, you can then go to the SafePeak website and read all of the entries there.

When you go there, make sure you vote for my article.  Yeah, yeah – shameless plug, I know.  I have submitted an article for the contest.  You can read that article from the contest or directly from my blog.

I am a little biased, but I think my entry is top notch!!  There is not a lot of time remaining on the contest – so go vote please.

Stored Procedures – Common Security Practice

Comments: No Comments
Published on: September 20, 2011

In SQL Server, a good practice is to access the data via calls through stored procedures.  Have a look at the document available in that link.

To further this practice, one may create a database role, then add users to that role.  Permissions to execute the stored procedures would then be granted to the role.  A role is simple enough to create.  You can do that with the following code.

[codesyntax lang=”tsql”]
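A minimal sketch of creating such a role (the role name TestRole is a placeholder of my own):

```tsql
-- Create a database role to which execute permissions will be granted
CREATE ROLE TestRole AUTHORIZATION dbo;
```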


After creating this role, simply add users to it.  The next part of the process is to ensure that you have granted the appropriate permissions to the role.  There are two methods to do that: 1) a blanket execute on all procedures, or 2) picking and choosing the procs to which you wish to grant permission.

Before we get to adding permissions, let’s create a little proc to test with.  I will reuse something from a past article to keep it simple.  In that article, I already did the setup for the table – you can get it from here.  The stored procedure is as follows.

[codesyntax lang=”tsql”]
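A sketch of such a test proc – the proc and table names are my placeholders, with the columns taken from the result set shown later in this post:

```tsql
-- Simple test proc selecting from the table created in the earlier article
CREATE PROCEDURE dbo.ColorSel
AS
BEGIN
    SET NOCOUNT ON;

    SELECT ColorID, ColorPlate, ColorType
    FROM dbo.Colors;
END
GO
```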


Let’s also make sure that the user has been added to the role.

[codesyntax lang=”tsql”]
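Adding the user to the role can be done with sp_addrolemember (testu is the user shown later in this post; the role name is my placeholder):

```tsql
-- Add the test user to the database role
EXEC sp_addrolemember @rolename = N'TestRole', @membername = N'testu';
```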


As for the exercise of creating that database user and the associated login, I will leave that to you.

In order to test, we need to connect to the SQL Server as that user.  Once connected, run the following to verify that your session is connected as desired.

[codesyntax lang=”tsql”]
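One way to check is to query the database user for the session (the column alias matches the output described below):

```tsql
-- Confirm which database user this session maps to
SELECT USER_NAME() AS LoggedInUser;
```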


On my connection, running that query will show that the LoggedInUser is testu.  Now, having confirmed that I am connected as the appropriate user, I will try to execute that test proc we created.

[codesyntax lang=”tsql”]


At this point, the expected results should be similar to this error message.

[codesyntax lang=”tsql”]
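Something along these lines (the object and database names will reflect your own environment):

```
Msg 229, Level 14, State 5, Procedure ColorSel, Line 1
The EXECUTE permission was denied on the object 'ColorSel', database 'TestDB', schema 'dbo'.
```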


Now I will switch over to the previous connection, where I have administrative permissions, and grant execute permissions following the first method – the blanket grant.

[codesyntax lang=”tsql”]
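The blanket grant is a single statement; granting EXECUTE at the database level covers every stored procedure (and scalar function) in the database (the role name is my placeholder):

```tsql
-- Method 1: blanket grant - TestRole may now execute everything in the database
GRANT EXECUTE TO TestRole;
```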


And now, flip back to the user connection to test our permissions.  Try running that proc again, and your results should be similar to these.

ColorID ColorPlate ColorType
1 Red 1
2 Blue 2

That is good, but what else can this user now do?  The user can execute all user-created stored procedures.  Do you necessarily want this?  What if your business requirements specify that certain user groups be able to execute only certain procs?

In that case, we need to grant execute permissions at a more granular level.  A big problem with this method pops up right from the beginning: what if there are thousands of stored procedures?  What if you need to grant execute permissions on hundreds of stored procedures for each role?

We have two avenues for these types of situations.  One is to separate the various stored procedures by schema and then grant execute on the schema.  The other is via naming convention.

In the event you have a suitable naming convention to help mass assign permissions, here is a little script to help.

[codesyntax lang=”tsql”]
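A sketch of such a helper, assuming a naming convention where Reporting Services procs are prefixed with rpt (the prefix and role name are placeholders of my own).  It generates the GRANT statements rather than executing them:

```tsql
-- Build GRANT statements for every proc matching the naming convention
SELECT 'GRANT EXECUTE ON ' + QUOTENAME(SCHEMA_NAME(p.schema_id))
        + '.' + QUOTENAME(p.name) + ' TO TestRole;' AS GrantStmt
FROM sys.procedures AS p
WHERE p.name LIKE 'rpt%'
ORDER BY p.name;
```

Copy the generated statements out of the result set, review them, and then run the ones you actually want.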


It is very simplistic, I know.  I also left an example of such a naming scheme.  In this example, the naming convention implies that the procedure is a Reporting Services stored procedure.  I can query for all of the Reporting Services procs in the database and then assign permissions to all of them much faster.

This method does not immediately grant permissions to the entire result set.  It allows you to review the results first.

Are there more elaborate examples out there?  Certainly.  Find a method that suits you.  I would be very careful about using the first method, though – it just might be too much to grant to the user.

Performance Tuning – Over the Top Biz Logic

Comments: 1 Comment
Published on: September 19, 2011

This month, Robert Pearl is helping to host a Performance Story contest.  The contest is being sponsored by SafePeak, and you can see more about it – here.

The timing of this contest actually fits well with something I have already been working on.  The only delay has been related to getting everything scrubbed and obfuscated.  There is a lot that could be included with this particular experience; I plan on touching on two of the key areas.


I have been working a lot lately on building reports in SSRS.  These reports had been working great from IE hitting the SSRS server directly in one environment.  When we migrated the reports to the production environment, we started seeing some serious timeout issues.

There was one report in particular that was giving more grief than the others.  I had already gone through everything I could find with regard to SSRS in trying to speed these reports up to acceptable times.

Having gone through what could be done in SSRS, I next turned my attention to the main stored procedure creating the dataset for the report.

Stored Procedure

Although the report was not even making it to this particular stored procedure, I wanted to take a look at it and make any improvements there that I could.  It made sense to eliminate it as a possible source of the problem.

I opened up the stored procedure and found a fair amount of business logic.  I came to a section of the proc containing code that, at first sight, made my head spin a bit.  I thought for sure I was seeing double because of all of the logic.

Once I finally got my head wrapped around it, I noticed a lot of similarities and wanted to start from that particular section with my tuning.  Here is an obfuscated version of what I saw.

[codesyntax lang=”tsql”]


I want you to look closely at both sides of that OR condition in the where clause before proceeding to look at the image of the execution plan that this query generates.

Again, my head started spinning when I first looked at the execution plan.  This time it was the size of the overall plan causing it.  Looking through it, though, I quickly saw what I thought was the main point of slowness with this query (yes, it was slow).  If you open the image of that execution plan and compare the red section to the yellow section, you will see that it is duplicated.  These sections also correlate to the two sides of the OR condition already mentioned.

When browsing this execution plan in SSMS, I was also lured to these sections by a bad estimate on the query cost (which you don’t see in this image).  All of the index spools and index scans/seeks had a cost of 89% attached to them (within the red and yellow regions).  Seeing that both sections were identical, I was certain that the OR condition could be optimized.


Since the queries were soooo similar, I knew there had to be a way to combine both sides and make it less costly.  After working on it a bit, I came up with the following query.

[codesyntax lang=”tsql”]


Now take a close look at the first part of the where clause.

[codesyntax lang=”tsql”]

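The production code is obfuscated here, but the general pattern is easy to illustrate.  When the two branches of an OR differ in only one predicate, the common predicates can be factored out (the Orders table, its columns, and the parameter are purely hypothetical):

```tsql
DECLARE @Region VARCHAR(10) = 'WEST';

-- Before: both branches repeat the same predicates, differing only on Status
SELECT o.OrderID, o.Amount
FROM dbo.Orders AS o
WHERE (o.Status = 'A' AND o.Region = @Region AND o.Amount > 0)
   OR (o.Status = 'P' AND o.Region = @Region AND o.Amount > 0);

-- After: the common predicates appear once; only the differing one is branched
SELECT o.OrderID, o.Amount
FROM dbo.Orders AS o
WHERE o.Region = @Region
  AND o.Amount > 0
  AND o.Status IN ('A', 'P');
```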

That little change represents the only difference between the two sides of the OR condition in the prior query.  The result set is exactly the same.  Now we get a new execution plan that looks like this.

In this execution plan, you can see that there is now only one section matching the red and yellow sections from the previous query.  This is a very good thing.  The plan looks simpler than the last one.  What if we execute the two queries in a single batch for comparison?


The improvements we see are substantial.  Look at the difference in cost comparison between the two!!  The old query consumes 97% of the cost.  To further illustrate the improvements made by this little change in logic, let’s take a look at the time.

Looking at the time it takes to run these queries should also be a strong indicator of the performance gains made by simplifying that where clause.  Let’s also take a look at a couple of screenshots that help show some of the other improvements made by simplifying that OR condition.

Notice that the old plan has an index scan, a sort operation, and 61 parallel operations.  The new plan has a key lookup.  Now look at the glaring difference between the two in estimated rows.  That is significant.

That is not all that was done to improve the performance of this query.  Other things, like index tuning and more work on the rest of the query, also yielded some improvement in overall execution.  But none was as significant as this.


The other significant improvement in the performance of this query came from something outside of SQL Server.  A key part of this exercise has not yet been mentioned: the reports were all taking 30-60 seconds just to load, before any parameters had been selected.  Then they would take another 30 seconds or so to bring up the next parameter.

We ran a trace between the two servers to see what we could find.  After some more troubleshooting, we decided to try disabling TCP Chimney offload (on Windows Server 2008 and later this can be done with `netsh int tcp set global chimney=disabled`).  This change had a significant impact on loading the reports from SSRS.  From within IE, the reports now loaded immediately and the prompt flow became very responsive.


Taking a little extra time when writing out the logic for your queries can have a profound impact on their responsiveness.  In this case, simplifying the code while retaining the logic had a tremendous effect on overall performance.

In addition to what can be done in SQL Server, sometimes one needs to look outside of SQL, to the network or disk layer, for further tuning.  Working with the appropriate teams to accomplish those goals can go miles toward a positive experience for co-workers and the business users of the application.

Column Level Permissions

Categories: News, Professional, Scripts, SSC
Comments: 2 Comments
Published on: September 19, 2011

Did you know that you can grant permissions down to the column level in SQL Server?   Well, if you didn’t know that – you do now.

It is actually rather simple to grant permissions at the column level.  This can be demonstrated by the following script.

[codesyntax lang=”tsql”]
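A sketch of the syntax (the table, column, and user names are placeholders of my own):

```tsql
-- Grant SELECT on only two columns of the table to a single user
GRANT SELECT (ColorID, ColorPlate) ON dbo.Colors TO testu;
```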


If you want to check out more on that syntax, read here.

And then…

Why is it important to know that you can do this?  Well, it is quite possible you have some of these permissions already in place.  It is possible you may have inherited something like this.  Just maybe there is a business requirement requiring that certain users or groups only have access to certain data within certain columns.

That brings up a new problem: how do you find out which columns have specific permissions applied for certain users?  Well, that is actually pretty straightforward.  We can query the system views and determine column-level permissions.

[codesyntax lang=”tsql”]
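A simple version of that query might look like this (the alias names match the columns described below):

```tsql
-- List column-level permissions: grantee, table, column, and the permission
SELECT dp.name              AS UserName,
       o.name               AS TableName,
       c.name               AS ColumnName,
       perm.permission_name AS PermissionName,
       perm.state_desc      AS PermissionState
FROM sys.database_permissions AS perm
    INNER JOIN sys.objects AS o
        ON perm.major_id = o.object_id          -- major_id maps to object_id
    INNER JOIN sys.columns AS c
        ON c.object_id = o.object_id
       AND c.column_id = perm.minor_id          -- minor_id maps to column_id
    INNER JOIN sys.database_principals AS dp
        ON perm.grantee_principal_id = dp.principal_id
WHERE perm.class = 1        -- object or column permissions
  AND perm.minor_id > 0;    -- minor_id > 0 means the grant is column-level
```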


The previous query is a really simple version of how to find this information.  As you can see, I am simply returning the UserName, TableName, and ColumnName, along with the permission in effect on that column.

You should also be able to see that the mapping between these system views is pretty straightforward as well: major_id maps to object_id, and minor_id maps to column_id.


This query can be of good use in determining the permissions in place on columns in tables within your database.  Furthermore, you can use it simply to satisfy your curiosity as you check what has been put into effect in the databases you manage.

There are more complex methods to determine these permissions.  And where there are more complex methods, I am sure there are also some easier ones.  Let me know what you do to query these permissions.


Categories: Book Reviews, News, SSC
Comments: No Comments
Published on: September 16, 2011

It has been a looooong time since I wrote a little book review.  It’s nice to take time out with a little fantasy to ease and please the brain.

Just because I haven’t written a review in a long time doesn’t mean that I stopped reading books.  I actually have a queue of finished books that could each have a review written.  For now, I’ll just review the latest book completed and probably never get around to the others – though they would be worthwhile.

Beyonders is the name of the book I recently completed.  This is a fantasy novel, the first in a three-book series.  For now, I am somewhat stuck waiting for book two.  Knowing that the book is part 1 of 3, you can assume that it ends in a cliffhanger.  That is somewhat frustrating – I was into the book and really wanted to reach the finale much sooner than the anticipated two years for the remaining tomes.

In Beyonders, two earth children get “warped” to an alternate dimension.  They are transported to a place that is void of heroes.  In this world, magic reigns supreme.  These two children must solve an enigma and become heroes in order to return to earth.

These kids meet interesting people along the way, make several friends, and make even more enemies.  There is plenty of peril along the way.  The kids must solve the puzzle by gaining pieces of a word along their quest.  This word can defeat the wizard and restore peace to the world.  By the end of the first book, the kids have retrieved all the syllables of this word.  Shortly after the word is completed, the book ends, and you must wait until the next book to continue the quest.

The plot had plenty of twists and turns to keep you interested.  It also portrayed these youths quite well as longing to return to their families and friends.  Neither of them knew the other prior to being sucked into this new world.  Both show a lot of concern for the other as the book progresses and their friendship forms.

I would recommend reading this – especially if you have teenagers into these types of stories.

What Happens in Vegas…

Categories: News, Professional, SSC
Comments: No Comments
Published on: September 14, 2011

…Need not Stay in Vegas

Coming to Las Vegas in 2012 is an event many have been waiting for.  Those who have been waiting for this event are the SQL professional type.  And believe me, it will be good.

A few friends and I have begun work on putting together a SQL Saturday Event in Las Vegas.  We have a date in mind and a venue.


We have arrangements with the fine people at MStaff Solutions to use their facilities at LakeSide Center.  Here (to the left) is a great shot of the area that is provided by MStaff (which can also be found on their site).

We will be able to use a few classrooms as well as one of their dining rooms.   To the right is an image of one such banquet room that they sent to me.

The people at MStaff are excited to be helping put this event together with us.  They really want for this to be a top notch event – as do we.

Be ready to mark your calendars.  We have put in a date reservation with SQLSaturday for March 10, 2012.  This date is still subject to change, but at this point I feel confident that it will not.

There seems to be a fair amount of interest in having a SQL Saturday in Las Vegas.  We now hope that this interest turns into good attendance.  We will be marketing the event over the next few months to try to drive up interest and registrations.

Interested in Presenting: Start prepping your presentations!

Interested in Attending: Be prepared to register and come “get your learn on!!”

Oh, and please do not leave what you learn at this event in Las Vegas.  Some things are not meant to stay in Vegas.

T-SQL Tuesday #22 – Data Presentation

Comments: No Comments
Published on: September 13, 2011

Tuesday is upon us.  It also happens to be the second Tuesday of the month, and you know what that means: it is time for TSQL Tuesday.  This month my friend Robert Pearl is hosting.  Read his invitation here.

As the invitation suggests, the topic is wide open.  There are many ways to present data, and some of those methods are better than others.  There are a few things of note with Data Presentation that will impact how much better yours will be.

The items I will discuss that will help your data be better presented are: Performance, Accuracy, Display, and Business Requirements.  I will use a few scripts to progress through each of these topics.  Let’s start with Performance.

Performance
Why is performance included in this discussion?  How does performance relate to data presentation?  Well, have you ever had an end-user complain that a report was utterly useless because it was too slow?  If the report is too slow, it won’t matter to some whether the data is accurate.  They needed the data yesterday, and it simply took too long.

Here is an example of a query that could be optimized a bit more (admittedly, this query does not perform soooo slowly that a user would give up on it – but by the end you will see that it could perform better).

The requirements for this script are simple.  Provide the number of days in a month in a given date range.  If the starting date provided is not the first of the month, then we must only provide the number of days from that date to the end of the month.

As you can see, this script uses a looping mechanism to provide this information.  The loop inserts one record at a time into a table variable.  Examining the execution plan and the execution time of this query, one sees that the insert is the most expensive part.  One also finds that the query takes a fair amount of time to run – despite its simplicity.  On my machine, it takes about 1.3 seconds to execute.
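A reconstruction of the looping approach described above (the parameter, column, and variable names are my own):

```tsql
DECLARE @StartDate DATE = '20110115',
        @EndDate   DATE = '20111231';

DECLARE @Results TABLE (MonthStart DATE, DaysInMonth INT);
DECLARE @Current DATE = @StartDate;

WHILE @Current <= @EndDate
BEGIN
    -- one row per iteration: this single-row insert is the expensive part
    INSERT INTO @Results (MonthStart, DaysInMonth)
    SELECT @Current,
           -- days from the current position to the end of that month
           DATEDIFF(DAY, @Current,
                    DATEADD(MONTH, 1, DATEADD(DAY, 1 - DAY(@Current), @Current)));

    -- jump to the first day of the next month
    SET @Current = DATEADD(MONTH, 1, DATEADD(DAY, 1 - DAY(@Current), @Current));
END

SELECT MonthStart, DaysInMonth FROM @Results;
```

For a start date of January 15, the first row covers only the 17 remaining days of January, satisfying the partial-month requirement.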

Certainly, if this were a more complicated query, one would see that this type of query could cause some delays in data rendering and subsequently cause grief for the end-user and you.

There is one more issue with the provided query: it doesn’t meet all the requirements.  I neglected to mention that it needs to accept dates in day/month/year format.  Yes, it is nitpicking, but it was a requirement, and an invalid date is far too easy to submit with this query.

Accuracy
There should be no question on how accuracy affects data presentation.  No matter how pretty the rest of the data/report may look, wrong data will render the report useless in the eyes of many business users.

Again, this example is not extreme – but it does create sufficient concern that the query should not be used unless fixed.

Though this query works faster than the first, it is not entirely accurate: it only supports a maximum of 12 months.  It is also overly complex; there is value in keeping this particular query simpler.

Display
This topic could be deemed largely a matter of preference.  That said, there is great value in how you display the data to the end user.  Displaying the data involves such things as useful column headings and meaningful data.  For instance, naming a column “Month” but displaying a year value in it is not very useful.  This also overlaps with the prior topic of accurate data.

Personally, when displaying the month, I like to see month names.  In my experience, the name of the month is more easily recognizable than the month number for many users.
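For example, DATENAME converts the month into its readable name where MONTH only gives the number:

```tsql
-- Month number versus month name for the same date
SELECT MONTH('20110913')           AS MonthNumber,   -- 9
       DATENAME(MONTH, '20110913') AS [MonthName];   -- September (with an English language setting)
```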

In this case, the column heading matches the data and works.  However, a little change could make this query and its output more useful to an end user quickly scanning the results.

To this point, each query has progressively improved in performance.  This last query (both versions) also accurately displays the data.

The display of data is closely coupled with the next topic – Business Requirements.

Business Requirements

When looking at the results of the last two queries, which one is more accurate?  Well, that entirely depends on what the business has specified as acceptable output for this query.  In this case, both could be correct or both could be wrong.

It is essential to have the requirements written down and understood by all parties, so as to avoid any misconception or misunderstanding of what the query should display.  Matching your output to the specified business requirements will help provide an overall impression of accuracy and usefulness.  It will also help achieve faster sign-off from the business.


Data Presentation is not just about the look of a report.  It involves performance, accuracy, display, and the business requirements.  By giving proper attention to each of these facets, the data presented will be both remarkable and acceptable to the end-user.

Hustle and Bustle to Kick off September

Comments: No Comments
Published on: September 7, 2011

September is kicking off with a boom for me.  In one week, it seems I will be here, there, and everywhere.  Between SQL, PASS, family, and the user group, this is shaping up to be quite the week.

Here’s a quick rundown of this week’s agenda.

Cross Country

My oldest son is running varsity cross country.  This week there are two meets: one in-state and the other in Idaho.  We are really looking forward to them.  He is enjoying the sport, the team, the atmosphere, and the success he is having.  The first meet is Wednesday and the second is this weekend.

Soccer
We also have a son starting soccer practice this week.  His first match is this weekend.  He has played before and really enjoys playing soccer.  Mostly, I think he likes kicking the ball hard.

Softball
I sponsor a co-ed softball team.  We just wrapped up the summer season finishing second in the season ending tournament and first in the regular season.  Fall season starts Wednesday and this should be a lot of fun.

Volleyball
My wife captains a co-ed volleyball team.  The league was created as a fundraiser for the high school volleyball team.  She has three matches on Wednesday.  This keeps her pretty busy while I am busy with the kids.  The league is about to wrap up – so the schedule gets a little less congested after this.

24 Hours of PASS
24 Hours of PASS starts Wednesday.  I am scheduled to moderate a session for Allan Hirt that begins at 19:00 GMT.  I invite you to attend.  It should be a good session and well worth your time.  Now, where is that record button?  Here is more info on that particular session.

You can find more about the rest of 24HOP here.  The schedule lineup is pretty good.

Las Vegas SQL Users Group
Thursday evening (yes, 24HOP runs Thursday too) we will be having our regularly scheduled monthly meeting of the Las Vegas SQL Users Group.  You can read more about that here.  This meeting is open to all who wish to get more learning out of the day than what is available through 24 Hours of PASS.

SQLSaturday 94

SQL Saturday 94 is being held this Saturday, September 10.  You can read more in what I have already blogged about it – here.  As you can see from the entries above, I already have conflicts on this day.  I am hoping the timing works out so I can make all three.

