December 2011 LV UG Meeting

Categories: News, Professional, SSC
Comments: No Comments
Published on: November 30, 2011

The Holidays are upon us.  It is time to take a break from the hustle and bustle and come out to participate in the local User Group for SQL Server.

This month, Jason Brimhall will be presenting a new topic:


As a DBA in the modern era, you may be required from time to time to do something outside of your comfort zone.  One of those things may be to become quickly acquainted with SSRS.  Better still, you may be asked to do things you had not considered possible in a standard report.  In this session, you will learn how to implement a framework to help provide a common ground for your reports.  This session will delve into fun topics such as dynamic grouping and dynamic sorting.  We are not talking about the interactive sorting that your accountant may use.  Attendees will also be introduced to a few quick methods of exporting reports from the report server – this is from a DBA perspective after all!

Jason’s Bio:

Jason Brimhall has more than 10 years of experience and has worked with SQL Server from 6.5 through SQL 2008 R2. He has experience in performance tuning, high-transaction environments, and large environments. He is currently a DB Architect and an MCDBA. He is the VP of the Las Vegas User Group (SSSOLV).

LiveMeeting Information:

Attendee URL:
Meeting ID:  FHKZS7

New Meeting Segment

We are going to try a new idea with the group at the meetings.  Each meeting, bring some of your ugly code.  We will look it over and help each other write better code.


The meeting location has changed.  We will no longer be meeting at The Learning Center.  The new meeting location is M Staff Solutions & Training, 2620 Regatta Drive, Suite 102, Las Vegas, NV 89128.



SSRS Export part 2 (Export Data Source)

Categories: News, Professional, SSC
Comments: No Comments
Published on: November 29, 2011

Back in August, I published a post on exporting SSRS report files en masse.  That article (which can be read here) detailed an SSIS package I created to export the files.  As it is published, it is only good for exporting the actual report files and not the data sources.

I knew of this shortcoming for some time and updated my SSIS package shortly after, with the expectation of writing an update to that article.  Well, time went by a little too quickly.  It has now been almost four months, and I am finally getting to that update.  I am also working out a TSQL-only solution to do the same thing, which I hope to publish shortly after.

So, in keeping with the same format as the previous article, let’s start by detailing the variable that has been added.

FileExportExtension – As the variable name should imply, this is the extension of the XML file that is to be created.  RDL would be for a report file, and RDS would be for a data source (as examples).

Then, inside the script task, we will find the next change to be made.  The new variable we created will need to be added to the read-only variables list as shown.

So far so good.  The changes are simple and straightforward.

The next change is to the script itself.  Click the Edit Script button and replace Main with the following.

Looking at this code, you will see once again that variable that we added popping up.

One key to this working effectively is the use of the ReportSourcePath variable.  An appropriate path must be specified, one that contains data sources in the Catalog table.  An example would be /Data Sources/.  Some environments may have a subfolder beneath the data sources.  Just make sure the path specified leads to the data sources you want to export.
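As a rough illustration of what that path drives (a hedged sketch – the table and column names come from the ReportServer catalog, and Type = 5 marks a data source item there; this is not the exact query from the package):

```tsql
-- Hypothetical sketch: list data source items found under ReportSourcePath
-- in the ReportServer database.  Type = 5 denotes a data source in dbo.Catalog.
DECLARE @ReportSourcePath VARCHAR(255);
SET @ReportSourcePath = '/Data Sources/';

SELECT  ItemID, [Name], [Path]
FROM    dbo.Catalog
WHERE   [Path] LIKE @ReportSourcePath + '%'
        AND [Type] = 5;
```

If this returns no rows, the path does not lead to any data sources and the export loop will have nothing to do.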

I had also considered altering the “Execute SQL Task” that starts the flow of this package.  The script there could be altered so that another variable designates the report part type.

The change would add another variable into the query’s WHERE clause: change the Type predicate from an IN list to an equality comparison, and add a variable designating the different types listed in the CASE statement – it becomes pretty straightforward.  This change would allow more flexibility.  I will post an update later with the changes I have made to the package to permit that.  For now, I felt it was more of a bonus addition that I didn’t need quite yet.  (Necessity drives functionality, right?)
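The shape of that change might look like this (a hypothetical sketch against the ReportServer Catalog table; the type codes and the variable name are assumptions, not the package’s actual query):

```tsql
-- Before (hypothetical): the task pulled a fixed set of item types.
-- SELECT ItemID, [Path], [Name] FROM dbo.Catalog WHERE [Type] IN (2, 5);

-- After: a variable drives an equality predicate, so the caller picks the type.
DECLARE @ReportPartType INT;
SET @ReportPartType = 5;   -- e.g. 2 = report (RDL), 5 = data source (RDS)

SELECT  ItemID, [Path], [Name],
        CASE [Type] WHEN 2 THEN 'Report'
                    WHEN 5 THEN 'Data Source'
        END AS ItemType
FROM    dbo.Catalog
WHERE   [Type] = @ReportPartType;
```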

If you make these suggested changes, you will have more flexibility in exporting the various files related to reporting.  If you have played with Report Manager, you will know that there is no way to export an RDS file.  Now you have a means to export the XML into a usable file that can be imported to another server – if you need it.

Check back in the future for that update to do this using TSQL as well as for the update to provide more flexibility to the package.



SQL Deep Dives 2 on the Kindle

Categories: News, Professional, SSC
Comments: No Comments
Published on: November 23, 2011

Since Deep Dives 2 came out, I had been putting off getting my copy of the e-book until I wanted the book for travel purposes.  I decided I really needed to have it loaded on the Kindle and quickly ran into a few roadblocks.

The first roadblock was easy to overcome.  It required an email to Manning to get the beta link for the e-books.  If you purchased through Manning, I’d recommend checking the beta site for any e-book purchases.

The second roadblock was determining which file to use on the Kindle.  The e-book formats are .mobi, .pdf, and another that escapes me right now.  I didn’t see one for Kindle.  In my journeys, though, I learned that .mobi is essentially the same format as the .azw format used by Amazon for the Kindle.  That is very good to know.

The next roadblock was how to get the file onto the Kindle.  Getting it into the Kindle app on my PC was pretty easy: find the \Documents\My Kindle Content folder in your user profile directory, then copy the .mobi file to that directory.  Despite that, syncing did not put the book onto the rest of my Kindle apps.  I soon found two ways of getting that done.

The first method for getting those .mobi files onto the Kindle was to plug the Kindle into a USB port and copy the .mobi file onto it.  It still didn’t sync to the rest of my devices, but on the flip side, it was pretty fast.

The second method for getting those .mobi files onto the Kindles is to email them to your Kindle email account.  Amazon will eventually make those files available for you, which lets you sync all of your Kindle devices with the same files.  The drawback is that it is considerably slower.

The same process can be done for any of those SQL books you want to carry around with you.  Now the book is updated on several devices for me (laptop, phones, Kindle), and I can reference it much faster than by lugging the book around everywhere I go.

Try it and enjoy!

Throne of Fire

Categories: Book Reviews, News, SSC
Comments: No Comments
Published on: November 22, 2011

Finally, I have completed another book.  I took the opportunity while traveling to catch up on some reading.  Better yet, I was able to do this while testing out my Kindle.

The book I just finished is “Throne of Fire” by Rick Riordan.  This is the second book in the Kane Chronicles series and is very similar to the popular series about Percy Jackson (by the same author).

Sadly, some of my dislikes about the Percy Jackson books (grammar and spelling mistakes throughout) are present in this series as well.  I can get past some of that because the story is good.

The two Kanes (Carter and Sadie) embark in this book on a quest to awaken the sleeping, crazy Zeus.  They meet new friends and face new challenges.  There are bumps and twists throughout the book.  The one thing that kinda bugs me is that the climax is at the end of the book – creating a cliffhanger.  Now I am stuck waiting for the next book to be published – arghhh.

I liked the story.  I thought it was entertaining.  I would certainly let my children read the book.  It is a nice adventure and a good escape from the daily stresses.

Check it out sometime.

SQL Family – an Update

Categories: News, Professional, SSC
Comments: 3 Comments
Published on: November 21, 2011

At the beginning of the month, we had a Meme Monday on the topic of SQL Family.  I had a few things to say about the SQL community back then.  And now, I want to give a bit of an update on the topic that supports what I have already said.

The Story

Late last Wednesday I learned from my wife (I was at the time on the other side of the continent) that our two year old daughter needed to have “emergency” surgery on her nose.  My wife was understandably concerned.  I was a bit more freaked out than she was – and yes we were both really worried.

My daughter was having problems breathing and her nose and cheek were swollen.  My wife took her to the doctor suspecting that it may have been broken by a head-on collision with her older brother.  The doctors at the clinic referred her to specialists saying they felt surgery was necessary.  There was a white sliver poking through skin internally in the nostril.

By the next day, when my wife had gone to the specialist (this visit was Wednesday and was the one that got us a bit more concerned), that sliver had gotten larger.  Add that my daughter was getting frequent nosebleeds, and you may just have the picture now.  The specialist told my wife that they needed to operate Thursday morning and fix it.  They would have to slice this protrusion off and sew the nose.

The doctor tried to pull the white sliver from the nose and nothing moved.  This was kind of weird to me since they had called the sliver “cartilage.”  It also made the panic go up a bit more.

Thursday morning, I commented on twitter that my daughter was having the surgery.  Many thanks again to all of you that replied both publicly and privately.  This is what I mean about community.  I was trying to work but also trying to be with my daughter in spirit.


After the surgery I got a text message from my sister-in-law about almonds.  I was confused by the text and decided to call back.  It happens that she was with my wife at the hospital, and the almonds reference was in regards to the nose surgery.  It turns out that my daughter had sneezed while eating some almonds.  Some pieces (large and small) had traversed the opening between nose and throat at the back of the mouth.  Those pieces became lodged in her nose.  One was too large to completely pass.  Some of the almond skin had cut into her skin and did have to be surgically removed.  In short, without surgery, none of it could have been removed.

That is quite the relief!  It is also something we can look back on and laugh about now – embarrassing as it may seem.

Again, thanks to all who expressed interest and concern.  It is very much appreciated.

Table Space – CS Part Deux

Categories: News, Professional, Scripts, SSC
Comments: No Comments
Published on: November 21, 2011

Another script, another day.  And as promised, I am providing an update to the Table Space script that followed the sp_MStablespace script.  Not a lot more to be said about this one, since much was said already.

I updated the script for those CS collations.  I also updated the % of DB column so it is driven by the data file usage information.

[codesyntax lang=”tsql”]


Phew, I finally took care of some of those somedays that have been nagging me.  Sure, there has been a someday that has evolved due to that – but that is a good thing.

It helps that I also need these scripts to be CS.  Add to that the fact that I need to use them more frequently, and it was a perfect opportunity to do a little housecleaning.
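For a rough idea of the % of DB calculation described above, here is a minimal sketch (a hypothetical query using SQL 2005+ DMVs – not the full script; it treats the total size of the data files as the denominator):

```tsql
-- Minimal sketch: per-table reserved space as a percentage of total data-file size.
-- Catalog view names are lower-case to stay safe under case-sensitive collations.
SELECT  s.name + '.' + t.name                  AS TableName,
        SUM(ps.reserved_page_count) * 8        AS ReservedKB,
        CAST(SUM(ps.reserved_page_count) * 8.0
             / (SELECT SUM(size) * 8           -- size is in 8 KB pages
                FROM sys.database_files
                WHERE type = 0)                -- 0 = data files only
             * 100 AS DECIMAL(5, 2))           AS PercentOfDB
FROM    sys.dm_db_partition_stats AS ps
        JOIN sys.tables  AS t ON t.object_id = ps.object_id
        JOIN sys.schemas AS s ON s.schema_id = t.schema_id
GROUP BY s.name, t.name
ORDER BY ReservedKB DESC;
```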

Table Space revised Again

Categories: News, Professional, Scripts, SSC
Comments: 1 Comment
Published on: November 17, 2011

Since I am in the Collation Sensitive mood, I am finally getting around to updating this script.  This is the Table Space script that I have had out there for a while.  In the last release of this script, a request (by Remi) was made to update it so it will work with CS.  In addition to that, a request was made to add a few columns.  I have done both.

The CS request was not too big of a deal – just took a minute to actually sit down and do it.  Then it was a matter of setting a test database to CS and confirming that the script continued to work.  A friend did the same legwork (thx Remi) and posted his update in a thread I had been planning on getting back to with the update.  Now it will just get a link to this, and then there can be a circular reference.

The second part of the request was for a change in calculations and possibly additional columns.  I just added columns and someday hope to get back to this script and parameterize the whole thing so that a variable set of columns can be returned – based on user input.  Oh the glory of those someday goals.
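The parameterization idea mentioned above would likely lean on dynamic SQL.  As a hypothetical sketch (the variable names are mine, and a real version should validate the column list against a whitelist, since concatenated dynamic SQL is an injection risk):

```tsql
-- Hypothetical sketch: letting the caller choose which columns come back.
DECLARE @ColumnList NVARCHAR(200);
DECLARE @Sql NVARCHAR(MAX);

SET @ColumnList = N'name, create_date';   -- caller-supplied; validate before use
SET @Sql = N'SELECT ' + @ColumnList + N' FROM sys.tables ORDER BY name;';

EXEC sys.sp_executesql @Sql;
```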

So, here is the updated Table Size script.

[codesyntax lang=”tsql”]


If you recall, I did two versions of the table size script.  One followed the path of sp_spaceused and the other followed sp_MStablespace.  This script is the one that follows the sp_spaceused version.  I will post an update for the sp_MStablespace version shortly.

Table Hierarchy goes CS

Categories: News, Professional, Scripts, SSC
Comments: No Comments
Published on: November 16, 2011

At the urging of a friend, this script is being updated for those dealing with case sensitivity.  In the first few rounds, I neglected case sensitivity and never tested for it.  It makes sense to have this script updated for that if anybody out there is using it.

The updates are simple enough; it is just frustrating to run into an error caused by CS and then waste time troubleshooting it.  Believe me, it has happened to me recently – and I don’t much like it.
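As a tiny illustration of the kind of error in question (assuming a database with a case-sensitive collation, where identifier case must match the catalog exactly):

```tsql
-- Under a case-sensitive collation, identifier case must match exactly.
SELECT name FROM sys.objects;    -- works
SELECT name FROM SYS.OBJECTS;    -- fails with "Invalid object name 'SYS.OBJECTS'"
```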

Without further ado, here is the updated script:

[codesyntax lang=”tsql”]


Still on the todo list is to make this bad boy run faster in the event of circular references.  If you find something else with it that you think could use adjusting, let me know.

A Trio of Tools

Comments: 3 Comments
Published on: November 15, 2011

I have talked about tools for SQL Server a few times in the past.  You can read some of what I wrote here and here.

Since writing those last articles, I have come across more tools here and there.  Over the past few weeks, I came across three that stood out and I wanted to give them a quick shout out.

SSMS Tools Pack:  I have already written about this tool.  It was recently updated, and the functionality has improved since I last wrote about it.  Not only did the functionality improve, but the feature set is better now too!  Go give it a try.

SSIS Reporting Pack:  This tool is available on CodePlex.  This is one of those things that could be queried with TSQL, but this gives you an interface (SSRS reports) to browse the information.  The same kind of reporting pack would be very useful for SSRS itself.  I know people ask from time to time for this kind of information in both products, so the usefulness of these tools would be pretty high.
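To give a flavor of the TSQL alternative (a hedged sketch, assuming packages log to the default SSIS log table, dbo.sysssislog, via the SQL Server log provider – your logging setup may differ):

```tsql
-- Hypothetical sketch: recent errors and warnings from the default SSIS log table.
SELECT  source, event, message, starttime, endtime
FROM    dbo.sysssislog
WHERE   event IN ('OnError', 'OnWarning')
ORDER BY starttime DESC;
```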

SQL Treeo: Straight up, this tool was created to add customization to the tree view you get in default SSMS.  Some (many) find the lack of customization in that tree inadequate.  This tool allows you to create custom folders for the various objects, which lends itself to sorting the tree in SSMS a bit differently.  It can also make many database professionals a bit more productive.

Check the tools out.  Test them and see if you like them.  Let the creator of each of these tools know what you think about their product.  Other than writing about the tools, I personally have no affiliation with any of them.  But I do think they are good tools and are certainly worth the effort of testing for yourself.

TSQL Challenge 63 – Update

Categories: News, Professional, Scripts, SSC
Comments: No Comments
Published on: November 14, 2011

If you recall, I like Sudoku.  I even posted a script for solving it via TSQL.  I went so far as to enter my script into a TSQL Challenge.  That all started way back in August.  Today, I have an update!!

I was notified this morning that I have earned a new badge.  Cool – what’s the badge?  I clicked the link, and it took me to this badge.
Huh?  I’m a winner of the SQL Sudoku Challenge?  Awesome!

Looking it over, I am winner #3.  This means I could have done better with my solution.   And looking at the other solution stats, it appears I will need to find time to see what the others did to make their solutions go sooooo fast.  I have some learning to do – woohoo.

So, now that means I need to post my solution.

[codesyntax lang=”tsql”]



Sadly, that is not the most recent version of the script that I had.  I had intended to submit this version, which is slightly faster.

[codesyntax lang=”tsql”]


Still, I am certain that (without having looked at the other winning solutions) this is not on par with the best solutions.  And I have a lot to learn.
