A Trio of Functions

Categories: News, Professional, Scripts, SSC
Comments: 2 Comments
Published on: January 17, 2012

I found myself perusing an execution plan the other day.  I know, big surprise there.  This execution plan showed me some interesting things I had never really paid much attention to in the past.  When I started paying attention to these things, I found myself jumping down a rabbit hole.

It all started with a bit of curiosity to see if I could make an “admin” script perform a bit better.  The execution plans started showing some table valued functions that I knew I hadn’t included in the query.  Subsequently, I found myself wondering – what is that?

The items that made me curious were all table valued functions – three different ones in this particular plan.  I started looking hither and thither to find these functions.  It didn’t take long to figure out that I could find them in the mssqlsystemresource database.  So I proceeded to make a copy of the database and attach that copy for further learning opportunities.

The three functions are:

SYSSESSIONS

FNGETSQL

SYSCONNECTIONS

Knowing the query and based on these names, I began looking in the appropriate DMOs to see what I could find.  Here are the scripts for each of those DMOs.

[codesyntax lang=”tsql”]

[/codesyntax]
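If you just want to see the text of those wrapper DMOs without attaching a copy of the resource database, a quick sketch (I believe OBJECT_DEFINITION works here; sp_helptext returns the same text):

```sql
-- Pull the definitions of the DMOs that wrap SYSSESSIONS, FNGETSQL,
-- and SYSCONNECTIONS.  Each definition contains the OPENROWSET(TABLE ...)
-- call discussed below.
SELECT OBJECT_DEFINITION(OBJECT_ID('sys.dm_exec_sessions'))    AS SessionsDefinition;
SELECT OBJECT_DEFINITION(OBJECT_ID('sys.dm_exec_sql_text'))    AS GetSqlDefinition;
SELECT OBJECT_DEFINITION(OBJECT_ID('sys.dm_exec_connections')) AS ConnectionsDefinition;

-- The sp_helptext route gives the same result:
EXEC sp_helptext 'sys.dm_exec_sessions';
```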

Cool.  I can now see the internals of each of the DMOs – sort of.  You see, there is an OPENROWSET call in each of these objects.  Each call uses an undocumented feature called TABLE.  This is an internal command used by the engine and you won’t find much on it (mostly people asking what it is and Microsoft saying they won’t tell).

Here is the fun part.  If you try to run that code outside of querying the DMO, you will receive error messages.  If you try to create a new view utilizing that OPENROWSET call, it will fail.  It is reserved for internal usage.  With that said, just continue to use the DMO and you will be fine.  Personally, I was curious to find out how it worked, so I spent a bit of time trying.

So there you have it.  If you are curious about the internal makings of these DMOs, you can script them from the resource database.  Alternatively, you could also run sp_helptext.  I like to check these things from the resource database.  It feels more like an adventure.  Have fun with it and see what you will learn.

Missing Indexes

Comments: 11 Comments
Published on: January 12, 2012

SQL Server has means built into it to track possible missing indexes.  This used to be found through the use of the Index Tuning Wizard.  The process has improved over time (you can sort of see that from my April Fools post).

As luck would have it, I was recently asked to help fix a query that was somewhat related to the whole process.  You see, since SQL Server 2005, there are DMOs that help to track metadata related to column and index usage.  And if a query without a good matching index is repeated often enough, the engine may decide that a new index is needed.  This potential index information is recorded and becomes visible via the DMOs.

The query that I was asked to help fix was a dynamic query within a cursor that read information from the DMOs in order to generate some missing index information.  That particular query was failing for a couple of reasons, but by the same token it gave me an idea to modify and adapt the query into something more in line with what I might use.  After all, the queries that I used were in need of updating and this gets me started in that direction.

First, a little on why the query was failing.  A common problem with dynamic queries is the placement of quotes and having enough quotes in all required locations.  When you start nesting more levels into a dynamic query, the more confusing the quotes can get.  When running into something like this, I like to print the statement that I am trying to build dynamically.  If it doesn’t look right, then adjust the quotes until it looks right.

The second reason it was failing was a simple oversight.  Whether building dynamic queries or just using variables in your code, make sure you use properly sized variables.  In this case, the dynamic query variable was sized far too small for the statement being built into it.  The use of a print statement also helps to catch these types of errors rather quickly.
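Both failure modes are easy to demonstrate in a few lines.  This is a contrived sketch (database and query are made up), not the query in question:

```sql
DECLARE @DBName sysname = N'AdventureWorks';

-- Failure mode 1: an undersized variable.  SET silently truncates the
-- assignment, and the statement breaks mid-keyword when executed.
DECLARE @BadSQL NVARCHAR(50);
SET @BadSQL = N'SELECT name, index_id FROM [' + @DBName
            + N'].sys.indexes WHERE index_id > 0;';
PRINT @BadSQL;   -- the truncation is obvious in the printed output

-- Properly sized, with the nested quotes doubled where needed.
DECLARE @SQL NVARCHAR(MAX);
SET @SQL = N'SELECT name FROM [' + @DBName
         + N'].sys.objects WHERE type = ''U'';';
PRINT @SQL;      -- inspect the built statement before running it
-- EXEC sp_executesql @SQL;
```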

There were also a few things that would cause me to not use the original query in any environment.  The first problem is that the script contains a column which is the create statement for each proposed index.  In this create statement, all indexes were given the same name.  That would be a bit of a problem.

The next issue is my concern with the creation of indexes without ensuring that each index will provide greater benefit than cost.  Put another way: creating these indexes just because the script spewed them out is no better than creating all of the indexes proposed by the Database Engine Tuning Advisor.  For this, I added a cautionary statement next to every index create statement.

So with these tweaks, as well as other less significant tweaks, here is the query.

[codesyntax lang=”tsql”]

[/codesyntax]
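In spirit, the query hangs off the three missing-index DMOs.  A stripped-down sketch (the scoring formula and index-name suffix here are my own illustration, not the actual script):

```sql
-- Missing index suggestions with a rough score and a unique name per
-- suggestion (note the caution flag - do not create these blindly).
SELECT  DB_NAME(mid.database_id)                                   AS DatabaseName,
        OBJECT_NAME(mid.object_id, mid.database_id)                AS TableName,
        migs.avg_user_impact * (migs.user_seeks + migs.user_scans) AS Score,
        'CREATE INDEX IX_' + OBJECT_NAME(mid.object_id, mid.database_id)
          + '_' + CONVERT(VARCHAR(10), mig.index_group_handle)     -- unique suffix
          + ' ON ' + mid.statement
          + ' (' + ISNULL(mid.equality_columns, '')
          + CASE WHEN mid.equality_columns IS NOT NULL
                  AND mid.inequality_columns IS NOT NULL THEN ', ' ELSE '' END
          + ISNULL(mid.inequality_columns, '') + ')'
          + ISNULL(' INCLUDE (' + mid.included_columns + ')', '')
          + '  -- CAUTION: evaluate cost vs. benefit first!'       AS CreateStatement
FROM    sys.dm_db_missing_index_groups AS mig
        INNER JOIN sys.dm_db_missing_index_group_stats AS migs
            ON migs.group_handle = mig.index_group_handle
        INNER JOIN sys.dm_db_missing_index_details AS mid
            ON mig.index_handle = mid.index_handle
ORDER BY Score DESC;
```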

As I post this message, as I tend to do, I am looking for ways to improve upon the query and make it better.  This script should only be used with caution.  It is meant to provide insight into potential missing indexes in each database.  A score is assigned to each potential index, and it is with the highest-scoring indexes that I typically begin analysis to improve performance.  I typically start performance tuning from a query and its execution plan, but there are times when an alternative starting point is necessary.  This script is a tool for those times.  Please heed the warning that these indexes should be created with extreme caution.

Database Data and Log Size Info

Categories: News, Professional, Scripts, SSC
Comments: 6 Comments
Published on: December 5, 2011

Have you ever wanted or needed to get the data and log file size for all of the databases in your server?

I have seen scripts that would get this information in the past.  All of the scripts seemed to use either a cursor or the proc from Microsoft called sp_MSforeachdb.  Having seen some of these scripts, I wondered if there might be a different way of doing it (that was also different than you see in my TableSpace Scripts).

After putting some thought into it, I decided on a plan of attack and went to work on building a query that would do the trick.  But before I continue, I will advise that running a script like this on large databases may take a while to execute.

Keeping these things in mind (potential slowness when run on large db and wanting to try something different), I came up with this method (tested on Instances with small databases as well as Instances with 2TB Databases).

[codesyntax lang=”tsql”]

[/codesyntax]

You will see that I did not entirely eliminate the looping mechanism.  Truth be told, it is so much faster on servers with Large Databases.

Also take note of the DMV that is in use in this query.  I am taking advantage of the performance monitor stats that are exposed through the DMV sys.dm_os_performance_counters.  One caveat to this method is that this DMV shows us the size of the resource database as well.  I think it is fine to report that information back – but it won’t change much over time.  It is for that purpose that I use the Left Join in the query.

The other trick that I utilized is to Pivot that performance monitor data.  I think this works better than writing a bunch of sub-queries to generate the same sort of data layout.

You have probably also noted that I have chosen 250GB as the tipping point in this query.  There is no particular reason for that size – just a large enough database size to make the point.  For some, the tipping point may be a much smaller database size.  Feel free to change this value to suit your needs.

Some other caveats: the perfmon cntr_value data is presented as an Int.  For more accurate math operations, I chose to cast many of these values to a Float.

I also ran into a case where the perfmon data was reporting 0 as the Data File Size for one of my databases.  I had to throw a case statement in for the Data File Used Percent in order to avoid a Divide by 0 error.

I also wrapped the DataUsedMB in an ISNULL.  This was due to the model and mssqlsystemresource databases not having data in the dm_db_physical_stats function.  I could have left those as NULL, but wanted to show something for them both.
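Those pieces fit together along these lines.  This is a simplified sketch of the Pivot portion only (counter names are the real perfmon names; NULLIF stands in here for the CASE guard against divide-by-zero):

```sql
-- Pivot the database size counters out of sys.dm_os_performance_counters.
-- cntr_value arrives as an integer number of KB, so cast to FLOAT for math.
SELECT  instance_name AS DatabaseName,
        [Data File(s) Size (KB)] / 1024.0 AS DataFileSizeMB,
        [Log File(s) Size (KB)]  / 1024.0 AS LogFileSizeMB,
        [Log File(s) Used Size (KB)]
            / NULLIF([Log File(s) Size (KB)], 0) * 100 AS LogUsedPercent
FROM   (SELECT instance_name,
               RTRIM(counter_name) AS counter_name,   -- counter_name is padded nchar
               CAST(cntr_value AS FLOAT) AS cntr_value
        FROM   sys.dm_os_performance_counters
        WHERE  object_name LIKE '%:Databases%'
          AND  counter_name IN ('Data File(s) Size (KB)',
                                'Log File(s) Size (KB)',
                                'Log File(s) Used Size (KB)')) AS src
PIVOT  (MAX(cntr_value)
        FOR counter_name IN ([Data File(s) Size (KB)],
                             [Log File(s) Size (KB)],
                             [Log File(s) Used Size (KB)])) AS pvt;
```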

Check it out and Enjoy!

Table Hierarchy goes CS

Categories: News, Professional, Scripts, SSC
Comments: 1 Comment
Published on: November 16, 2011

At the urging of a friend, this script is being updated for those that are dealing with Case Sensitivity.  The first few rounds, I neglected Case Sensitivity and never tested for that.  It makes sense to have this script updated for that if anybody out there is using it.

The updates are simple enough, it is just frustrating if you run into an error caused by CS and then you waste time troubleshooting it.  Believe me, it has happened to me recently – and I don’t much like it.
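For reference, the sorts of changes involved are small.  A sketch of the typical case-sensitivity gotchas (not the actual diff to the script):

```sql
-- Under a CS collation, identifiers and string comparisons must match case
-- exactly.  Two common repairs:

-- 1. Reference catalog objects and columns with their exact case:
SELECT o.object_id, o.name
FROM   sys.objects AS o      -- 'sys.Objects' would fail on a CS database
WHERE  o.type = 'U';         -- lowercase 'u' would miss rows under CS

-- 2. Normalize collation explicitly when comparing names across databases:
-- ... ON parent.name = child.name COLLATE DATABASE_DEFAULT
```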

Without further ado, here is the updated script:

[codesyntax lang=”tsql”]

[/codesyntax]

Still on the todo list is to make this bad boy run faster in the event of circular references.  If you find something else with it that you think could use adjusting, let me know.

TSQL Challenge 63 – Update

Categories: News, Professional, Scripts, SSC
Comments: No Comments
Published on: November 14, 2011

If you recall, I like Sudoku.  I even posted a script for solving it via TSQL.  I went so far as to enter my script into a TSQL Challenge.  That all started way back in August.  Today, I have an update!!

I was notified this morning from BeyondRelational.com that I have earned a new badge.  Cool, what’s the badge?  I clicked the link and it took me to this badge.
Huh?  I’m a winner of the SQL Sudoku Challenge?  Awesome!

Looking it over, I am winner #3.  This means I could have done better with my solution.   And looking at the other solution stats, it appears I will need to find time to see what the others did to make their solutions go sooooo fast.  I have some learning to do – woohoo.

So, now that means I need to post my solution.

[codesyntax lang=”tsql”]

[/codesyntax]

Sadly, that is not the most recent version of the script that I had.  I had intended to submit this version, which is slightly faster still.

[codesyntax lang=”tsql”]

[/codesyntax]

Still, I am certain that (without having looked at the other winning solutions) this is not on par with the best solutions.  And I have a lot to learn.

Column Level Permissions

Categories: News, Professional, Scripts, SSC
Comments: No Comments
Published on: September 19, 2011

Did you know that you can grant permissions down to the column level in SQL Server?   Well, if you didn’t know that – you do now.

It is actually rather simple to grant permissions at the column level.  This can be demonstrated by the following script.

[codesyntax lang=”tsql”]

[/codesyntax]
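The grant itself is one line.  A minimal, self-contained sketch (table, user, and column names are made up for the demo):

```sql
-- A throwaway table and a login-less user to demonstrate against.
CREATE TABLE dbo.SomeTable (Col1 INT, Col2 INT, Col3 VARCHAR(20));
CREATE USER TestUser WITHOUT LOGIN;

-- Grant SELECT on only two of the three columns:
GRANT SELECT (Col1, Col2) ON dbo.SomeTable TO TestUser;

-- TestUser can now read Col1 and Col2, but not Col3.
EXECUTE AS USER = 'TestUser';
SELECT Col1, Col2 FROM dbo.SomeTable;   -- succeeds
-- SELECT Col3 FROM dbo.SomeTable;      -- would fail: permission denied
REVERT;
```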

If you want to check out more on that syntax, read here.

And then…

Why is it important to know that you can do this?  Well, it is quite possible you have some of these permissions already in place.  It is possible you may have inherited something like this.  Just maybe there is a business requirement requiring that certain users or groups only have access to certain data within certain columns.

That brings up a new problem then.  How do you find out which columns have specific permissions applied for certain users?  Well, that is actually pretty straightforward.  We can query the system views and determine column level permissions.

[codesyntax lang=”tsql”]

[/codesyntax]

The previous query is a really simple version of how to find this information.  As you can see, I am simply returning the UserName, TableName and ColumnName along with the permission in effect on that column.

You should also be able to see that the mapping between these system views is pretty straightforward as well.  Major_id maps to object_id and minor_id maps to column_id.
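Put concretely, a simple version of such a query might look like this sketch:

```sql
-- Column-level permissions: major_id maps to object_id, minor_id to column_id.
SELECT  dp.name              AS UserName,
        o.name               AS TableName,
        c.name               AS ColumnName,
        perm.permission_name,
        perm.state_desc      -- GRANT vs DENY
FROM    sys.database_permissions AS perm
        INNER JOIN sys.objects AS o
            ON perm.major_id = o.object_id
        INNER JOIN sys.columns AS c
            ON  perm.major_id = c.object_id
            AND perm.minor_id = c.column_id
        INNER JOIN sys.database_principals AS dp
            ON perm.grantee_principal_id = dp.principal_id
WHERE   perm.minor_id > 0;   -- minor_id = 0 would be an object-level grant
```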

Conclusion

This query can be of good use to determine permissions in place for columns in tables within your database.  Furthermore, you can even use this query to simply test your curiosity as you check to determine what has been put into effect in the databases you manage.

There are more complex methods to determine these permissions.  With there being more complex methods, I am sure there are also some easier methods.  Let me know what you do to query these permissions.

Send DBMail

Categories: News, Professional, SSC
Comments: No Comments
Published on: August 15, 2011

With SQL Server 2005, Microsoft improved the methods available for DBAs to send email from SQL Server.  The new method is called Database Mail.  If you want to send emails programmatically, you can now use sp_send_dbmail.  You can read all about that stored procedure here.

What I am really looking to share is more about one of the variables that has been introduced with sp_send_dbmail.  This parameter is @query.  As the online documentation states, you can put a query between single quotes and set the @query parameter equal to that query.  That is very useful.

Why am I bringing this up?  Doing something like this can be very useful for DBAs looking to create cost-effective monitoring solutions that require emailing result sets to themselves.  I ran across one scenario recently where a DBA was looking for help doing this very thing.  In this case, the query was quite simple.  He just wanted to get a list of databases with the size of those databases to be emailed.

Here is a quick and dirty of one method to do such a thing.

[codesyntax lang=”tsql”]

[/codesyntax]
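The quick and dirty version amounts to a single call.  A sketch along those lines (the profile name and recipient are placeholders – any existing Database Mail profile will do):

```sql
-- Email the output of sp_databases; a stored procedure call can be passed
-- straight into @query.
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'DBA Mail Profile',      -- placeholder profile
     @recipients   = 'dba@example.com',       -- placeholder address
     @subject      = 'Database Sizes',
     @query        = N'EXEC sp_databases;',
     @attach_query_result_as_file = 0;        -- 0 = results in the message body
```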

As I said, this is a real quick and dirty example of how to send an email with query results.  The results of the query in the @query parameter (in this case) will be in the body of the email.  A slightly modified version of that first solution is as follows.

[codesyntax lang=”tsql”]

[/codesyntax]

This is only slightly modified because I took the guts of sp_databases and dumped them into this query, the modification being that the remarks column was removed.  Why do this?  Well, to demonstrate two different methods of getting the same data from the @query parameter.  We can either pass a stored procedure to the parameter, or we can build an entire SQL statement and pass that to the parameter.

This is just a simple little tool that can be used by DBAs.  Enjoy!

TSQL Tuesday 21 – FAIL FAIL FAIL

Comments: No Comments
Published on: August 10, 2011

It is TSQL Tuesda… err Wedn… err Tuesday for August 2011.  This month the party is a day later and bringing us oodles of failure.

Adam Machanic is hosting this month.  He thought it would be a good idea to poke and prod us while getting us to torture ourselves.  It seems to be a month long theme (as he alluded to in his announcement of the topic).

See, the topic this month requires us to share our failures.  More specifically we are to share our ugly code that we know better than to do.  Crap Code is a failure.  At least we have come to recognize that it was crap code and that it was indeed a failure.  The question now is this:  What have we done about what we learned?

I put a fair amount of thought into this topic.  I could come up with several possible experiences for this topic.  I kept running into a wall though.  That wall was recalling detailed facts and examples of those experiences.  Many of them were documented and left behind with former employers.  I no longer have the “dirt” on those dirty little secrets.

Then it hit me!  I had posted some blog entries about some of my experiences and was certain that something would be applicable.  You know what?  It’s a good thing I have this blog going.  Not only does it help me to learn many things and write better, it serves as a repository for introspection and recollection.  So, I will rely on a couple of posts from the early days of this blog to help write my entry for TSQL Tuesday this month.  And, I hope that what I share and what I learned from my experience will prove helpful to somebody else.

The FAILURE / Crap Code

If you want, you can read the entire first part here then skip to the next heading.  Or you can continue reading as I recount that article.

I had been asked to provide a report that would provide data according to pay periods.  I had no calendar table yet I needed to be able to compare dates passed to the report and match them to this pay period calendar.  Despite requests to the business (the pay periods for this report did not line up with the normal business pay periods) to get a list of those pay periods in order to create a Calendar table, I got nothing.  I did however receive generic instruction as to the general time-lines for these pay periods.  Based on this, I decided to get tricky and try to meet some basic guidelines that I created for myself.

  1. The payperiods need to be accessible to the query
  2. No Physical Lookup Table
  3. Do not hard-code the data in the code.

Based on that, I came up with a pretty neat little solution.

[codesyntax lang=”tsql”]

[/codesyntax]

This worked well – for awhile.  Performance degraded over time, then it didn’t seem so cool.
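The gist of that neat little trick, sketched from the description above (the anchor date and 14-day period length are made up for illustration):

```sql
-- Generate pay periods on the fly with a recursive CTE - no physical
-- lookup table, nothing hard-coded in the report query itself.
WITH PayPeriods AS
(
    SELECT CAST('2009-01-04' AS DATE) AS PeriodStart,
           CAST('2009-01-17' AS DATE) AS PeriodEnd
    UNION ALL
    SELECT DATEADD(DAY, 14, PeriodStart),
           DATEADD(DAY, 14, PeriodEnd)
    FROM   PayPeriods
    WHERE  PeriodStart < '2015-12-31'
)
SELECT PeriodStart, PeriodEnd
FROM   PayPeriods
WHERE  '2012-01-17' BETWEEN PeriodStart AND PeriodEnd
OPTION (MAXRECURSION 0);   -- the recursion is the part that degrades over time
```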

Egg On Face

The update to that first article can be read in full here.  In short, that method really did not work out as well as I had hoped.  In the end, I had to create a table that would need to be maintained – at some point.  I populated the data in the table with 5 years worth of pay periods.  I hope somebody reads the documentation and takes care of that!

With the implementation of the table, I updated the CTE and the code being run by the report.  Performance of this updated version outperformed the first version – even when it was running fast.  I probably could have gotten away with doing an Itzik style numbers CTE in lieu of that recursive CTE – as I look back on it now.  Again, that is another level of learning and could potentially be an improvement on that first round of crap code I offered up for that report.

Conclusion

I learned from that experience.  First, it was a pretty neat little trick.  Secondly, I really should have tested it more.  Thirdly, I can likely still learn from it and improve on it because I am getting better at writing faster code and testing my proposed solutions (and I keep learning new tips/tricks).  Obviously something like this has not deterred me.  Quite the opposite really.  Because of a little failure like this, I work harder to get better at what I do.

FK Hierarchy v 2.1

Categories: News, Professional, SSC
Comments: 1 Comment
Published on: August 9, 2011

Last month I published an update to my Foreign Key Hierarchy script.  Today, I am providing a new update for that script.  A friend (Rémi Grégoire) helped out with some mods to this script.  The change for this month is nothing too intrusive.  The script is now updated for those databases that are Case Sensitive.

[codesyntax lang=”tsql”]

[/codesyntax]

This update should make it more usable for anyone who may be using it or is interested in using it.  Thanks to Rémi for taking the time to propose this update.

Activity Monitor and Profiler

Comments: 2 Comments
Published on: July 28, 2011

Today I came across a link to a neat little script for SQL 2005 / 2008 to help derive the head blocker in a blocking chain.  That script can be found here.  I was looking at the script and thought it looked pretty cool and also wondered why it might look a little familiar.  Not too big of a concern since many admin type scripts can look familiar.

Then I noticed that the title contained the words Activity Monitor.  I wondered to myself, was this taken from activity monitor?  So, I decided to play with Profiler for a bit to see if I could generate the same query.  The process really isn’t too difficult.  Here is what I did.

  1. Open Activity Monitor.  Simple enough to do.  Right click the instance (in SSMS) you wish to monitor, select Activity Monitor from the context menu.
  2. Open Profiler.  Again from within SSMS, click the tools menu and then select SQL Server Profiler.
  3. Start a New trace.  When starting the trace, ensure that you are connecting to the same instance on which you started Activity Monitor.  Select the TSQL-Replay template (other TSQL templates will probably work; this is the one that I used).  Go to the events selection tab and click on the Column Filters button.  Select “LoginName” and then expand “Like” on the right hand side.  In the new text box, type the login you are using to authenticate to the SQL Instance in question (domain\user or SQLLogin).
  4. Click run and browse through the results for a bit.  It shouldn’t take too long for that query to pop up.
  5. Once you find the query, make sure you stop the trace.

What is the point in this?  Well, as explained in that other blog, sometimes it is just better to run a query than to run Activity Monitor.  Furthermore, you can take this method and trap many of the queries that Microsoft built into that tool.  You can either use them for learning or troubleshooting.  This is just one more tool to have in your arsenal in order to become better at your job.
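For comparison, this is not the Activity Monitor query itself, but a simple head-blocker check in the same vein:

```sql
-- Head blockers: sessions that block others but are not blocked themselves.
SELECT  r.blocking_session_id AS HeadBlocker,
        COUNT(*)              AS BlockedSessions
FROM    sys.dm_exec_requests AS r
WHERE   r.blocking_session_id <> 0
  AND   r.blocking_session_id NOT IN
        (SELECT session_id
         FROM   sys.dm_exec_requests
         WHERE  blocking_session_id <> 0)
GROUP BY r.blocking_session_id;
```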