Get the Source of PREEMPTIVE_OS_PIPEOPS Waits

A client reached out to me for help about a wait_type they were seeing. That’s not terribly uncommon, but this particular wait_type was fairly obscure.

Every day, they could see the delta on this wait_type climb higher and higher. The problem was not in their ability to capture the deltas for the biggest wait_types. The problem was in figuring out the root cause of their biggest wait_type.

What was the wait_type? Well, the obscure wait_type was called PREEMPTIVE_OS_PIPEOPS. What causes this wait? As it turns out, it is a generic wait caused by SQL pipe-related activities such as xp_cmdshell.

Knowing this much, however, does not get us to the root cause of this particular problem for this client. How do we get there? This is another case for Extended Events (XEvents).

Tracking Pipe Ops

Knowing which wait_type needs attention is most of the battle in figuring out what precisely is causing the specific wait. The second piece of the puzzle is to understand whether the wait is internal or external. This is essential because a different event is triggered depending on which it is. There will be more on that in a future article.

Since I happen to know already that the PREEMPTIVE_OS_PIPEOPS wait is an external wait, I also know that I will want to use the sqlos.wait_info_external event. The downside to this method is that conventional wisdom teaches us that we must use the integer value for the wait in order to monitor for it. To find that integer value, one can query the sys.dm_xe_map_values DMV. Since the values of many of the waits change from CU to CU, one should validate this value on every server instead of blindly reusing the same value everywhere.
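A hedged example of that lookup looks like this (the wait_types map holds the key/value pairs we need):

-- Find the integer map_key for the PREEMPTIVE_OS_PIPEOPS wait on the current build
SELECT mv.name
     , mv.map_key
     , mv.map_value
FROM sys.dm_xe_map_values AS mv
WHERE mv.name = N'wait_types'
  AND mv.map_value = N'PREEMPTIVE_OS_PIPEOPS';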

Depending on which server I run that code on, I may receive different results.

As you can see, I have a small sample showing five different values for this specific wait. Using the wrong value on the wrong server could result in a bit of hair tugging since nothing may be trapped.

Due to this potential for change, writing a script to monitor for the session gets a little bit trickier and a lot more dynamic – for now.
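A minimal sketch of that dynamic approach follows. The session name (TrackPipeOps), the file target path, and the chosen actions are illustrative choices, not requirements:

DECLARE @waitkey INT
      , @sql     NVARCHAR(MAX);

-- Look up the correct integer for this server/build
SELECT @waitkey = mv.map_key
FROM sys.dm_xe_map_values AS mv
WHERE mv.name = N'wait_types'
  AND mv.map_value = N'PREEMPTIVE_OS_PIPEOPS';

-- Build the session DDL with the integer baked into the predicate
SET @sql = N'
IF EXISTS (SELECT 1 FROM sys.server_event_sessions WHERE name = N''TrackPipeOps'')
    DROP EVENT SESSION TrackPipeOps ON SERVER;

CREATE EVENT SESSION TrackPipeOps ON SERVER
ADD EVENT sqlos.wait_info_external
(
    ACTION (sqlserver.session_id, sqlserver.client_hostname, sqlserver.sql_text)
    WHERE wait_type = ' + CAST(@waitkey AS NVARCHAR(12)) + N'
)
ADD TARGET package0.event_file (SET filename = N''C:\Database\XE\TrackPipeOps.xel'')
WITH (STARTUP_STATE = ON);

ALTER EVENT SESSION TrackPipeOps ON SERVER STATE = START;';

EXECUTE sp_executesql @sql;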

After executing that script, the session will be created with the appropriate wait_type and started. To verify it is working, I can execute a query that uses xp_cmdshell.
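For example, something as simple as this will generate the wait (assuming xp_cmdshell is enabled, or enabling it just for the test), and the captured events can then be read back from the file target used in the sketch above:

-- Enable xp_cmdshell if it is not already enabled (requires appropriate permissions)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
GO

-- Any xp_cmdshell call exercises the SQL pipes and should register PREEMPTIVE_OS_PIPEOPS
EXEC master..xp_cmdshell 'dir C:\';
GO

-- Read what the session trapped (path matches the illustrative target used above)
SELECT CAST(fx.event_data AS XML) AS event_data
FROM sys.fn_xe_file_target_read_file(N'C:\Database\XE\TrackPipeOps*.xel', NULL, NULL, NULL) AS fx;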

With that information in hand, we now have a tool that will help us identify the root cause of this wait climbing to the top of the waits list. Once I am able to identify the root cause, I can start applying an appropriate fix (or do nothing if the situation warrants).

But what about this map_key that keeps on changing with updates to SQL Server? Surely there is a better method than figuring out the integer value. As luck would have it, conventional wisdom is wrong in this case. There is a much easier method of creating this event session. So much easier in fact that I don’t need to worry about the map_key value. All I need to know is the wait_type I wish to monitor. So let’s look at a better version of the script.
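Here is a sketch of that friendlier version, simply comparing the wait_type to its name in the predicate (the session name and target path remain illustrative):

IF EXISTS (SELECT 1 FROM sys.server_event_sessions WHERE name = N'TrackPipeOps')
    DROP EVENT SESSION TrackPipeOps ON SERVER;
GO
CREATE EVENT SESSION TrackPipeOps ON SERVER
ADD EVENT sqlos.wait_info_external
(
    ACTION (sqlserver.session_id, sqlserver.client_hostname, sqlserver.sql_text)
    WHERE wait_type = N'PREEMPTIVE_OS_PIPEOPS'   -- the name, no map_key lookup required
)
ADD TARGET package0.event_file (SET filename = N'C:\Database\XE\TrackPipeOps.xel')
WITH (STARTUP_STATE = ON);
GO
ALTER EVENT SESSION TrackPipeOps ON SERVER STATE = START;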

This will yield the same results as the first script, but without the dynamic SQL and with a friendlier script for the DBA to understand. This method is actually employed in the system_health default session, if you ever decide to peruse that session.

Wrapping it Up

Extended Events is a powerful tool with plenty of ease of use and flexibility. This flexibility allows the DBA to capably monitor the server for any issue, be it small or large. This article demonstrated how to use Extended Events to monitor for a specific wait_type, and the same principles can be applied to any of the waits you may need to investigate. If you are in need of a little tune-up for your XE skills, I recommend reading a bit on Extended Events to get up to date. For other basics related articles, feel free to read here.

 

Note to Self

TSQL Tuesday

The second Tuesday of the month is upon us once again. That means it is time for another group blog party called TSQLTuesday. This party that was started by Adam Machanic has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

Sands of Time

This month, Mohammad Darab (b | t) invites us to do the 40 yr. old self reflection. You know, that age that is almost as dreadful to reach as the righteous age of 50? From there, age only gets better – or at least I have been told that. I have done a couple of articles similar to this in past TSQLTuesdays. Similar, yet just different enough. Check out my article on passion, or this one on influence.

The problem with this particular challenge is the request to write a note to your 20 yr. old self. Well, maybe not an extreme problem, other than the time travel part. As luck would have it, I recently discussed the topic with my grandmother. There was a lot of wisdom in what she taught me as we discussed it, and I must admit I was already in agreement before even broaching the topic with her.

Passing Notes

Here is basically what my note to myself would say: “Self, you are doing good. Keep it up!”

That is, if I even chose to send myself a note. You see, I would not want to change how things have happened. Life experience is good – both the good and the bad. All of it is necessary to grow and develop. Just like Mater’s dents, each experience is valuable and shouldn’t be changed.

Now, what if I were to create a note of advice for other more junior level professionals that I might be mentoring? That is a different story. My note to them would be simple – “Listen to me.” Maybe not so literally “me”, and more precisely advice to listen to anybody who is giving their time to present, mentor, blog, or write (articles, books, etc). They are imparting their experience and giving of their time, and you should respect that and take advantage of it.

Not only should you pay respect to those giving of their time, but you should also spend a little time giving back to the community. Why? Well, I covered that pretty well in this article here. I will say it again: the more you give of yourself to help others learn your trade, the more you will learn and grow, both professionally and personally. It is an investment well worth making.

 

Wrapping it Up

I don’t believe in trying to alter who I have become with any sort of visit back to my past self. The changes could alter my path and keep me from ending up where I am now, a place reached through personal growth and experience. Trial and error is good for growth. Everybody needs to learn that for themselves and cannot be forced into it. Just learn to grow and be patient with anybody you may mentor.

Easy Open Event Log Files

One of the beauties of using a tool such as Extended Events (XEvents) is the versatility and ease of use. XEvents can be used for so many things to help an aspiring DBA or Developer do their job better.

There has long been a sticking point about allowing various people access to production servers. Part of the sticking point is the developer who believes that access to read and evaluate trace files is mandatory on prod (yes, I have heard many times of this being used effectively by developers to gain prod access). But is that prod access truly necessary? Without an adequate method of providing the developer access to the trace data, the DBA loses credibility with management and loses ground in the argument. How can a middle ground be met?

XEvents to the Rescue

With the use of XEvent tracing, a DBA is given a new realm of possibilities. This particular realm of possibilities becomes available when the XEvent trace (session) is created using a file target. Through the use of a file target, and the flexibility of XEvents, I have shown the beginnings of how a developer might be able to access the trace data easily.

That is merely the beginning though. There are a few more layers to making life easier for both the developer and DBA in regards to fetching event trace data. This article will show a few methods that will help make life just a bit easier.

Let’s start with the basics. After an XEvent session is created, it is important to know where the event file is being stored. If you don’t happen to know where it is, that is not a problem. The file source is easy enough to find and I have detailed it in this article.
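If you would rather not go digging, a query along these lines against the event session catalog views returns the configured file path for any session that uses an event_file target:

SELECT es.name   AS session_name
     , esf.value AS configured_file_path
FROM sys.server_event_sessions AS es
JOIN sys.server_event_session_targets AS est
    ON es.event_session_id = est.event_session_id
JOIN sys.server_event_session_fields AS esf
    ON est.event_session_id = esf.event_session_id
   AND est.target_id = esf.object_id
WHERE est.name = N'event_file'
  AND esf.name = N'filename';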

Method 1

As shown in the previous article, the first quick method to open an XEvent trace file is from within SSMS as shown here.

After clicking on “Merge Extended Event Files…”, a new window titled “Merge Extended Event Files” will open. From that new window, follow the steps shown here.

Method 2

Now, that is an extremely simple method to open and view a trace file for XEvents. That said, would you believe there are other equally easy methods? Let’s look at the next method that is very simple as well.

As was previously mentioned, you will need to know the file location first and then navigate to that location. So let’s do that. My common location is C:\Database\XE. Once I have navigated to the folder, I need to choose the file to be opened and follow the prompts as shown here.

A new window will pop up. By default, SSMS will be showing. If not, you can select SSMS and enable the option to always use SSMS for this file type by clicking the check box before clicking OK.

Method 3

If the prior two methods feel unreasonably easy, then there is this next method that will spice things up a bit. It does require a bit of coordination. Once you have identified the file to be opened, you can elect to drag and drop the file onto SSMS. Done properly, the file will open in SSMS and you can start evaluating the data.

Method 4

This is equally as simple as the previous method but requires just a skosh less coordination. This method involves a double-click. A double-click on the file will open it in SSMS.

Method 5

If you paid close attention to the screenshots in method 2, you will have noticed that there was a menu option called “Open” that I did not discuss. If you select open from the context menu at that point, then the xel file will open in SSMS just like any of the other methods mentioned thus far.

Wrapping it Up

Extended Events is a powerful tool with plenty of ease of use and flexibility. This flexibility allows the DBA to better service the needs of developers when they need access to the trace data. This article demonstrated another use for Extended Events. If you are in need of a little tune-up for your XE skills, I recommend reading a bit on Extended Events to get up to date. For other basics related articles, feel free to read here.

 

Implicit Conversion Insights with XE


In a recent article, I covered a little bit about implicit conversions and an error that may crop up from time to time. It is one thing to understand what an error means; it is an entirely different problem to understand what is causing the error in the first place.

Seldom does a DBA have the opportunity to get out in front of infrequent or random errors such as implicit conversions. More often than not, it is the privilege of the DBA to find out about the problem after the fact from a developer or, worse yet, an end-user.

What if there was an easy method to track when these errors occurred? Even better, it would be ideal to understand what query caused the error, as well as which types of queries are merely causing implicit conversion warnings. Better still, what if you had a way to understand which type of conversion problem is at hand?

Implicit Playground

 

To start this playground, let’s begin with a simple query based on the AdventureWorks database.
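As a minimal stand-in for that query (the original is not reproduced here), comparing the NVARCHAR NationalIDNumber column against an integer literal is enough to trigger the CONVERT_IMPLICIT warnings:

USE AdventureWorks2014;  -- adjust to your AdventureWorks version
GO
SELECT e.BusinessEntityID
     , e.NationalIDNumber
     , e.JobTitle
FROM HumanResources.Employee AS e
WHERE e.NationalIDNumber = 112457891;   -- int literal vs NVARCHAR column = implicit conversion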

This query will yield a plan similar to the following.

If I hover over the select operator, I will see a little window like the following.

I have a few items highlighted in this window. In the larger red box, I am outlining much of the type conversion warning. Inside of that, there is a blue box highlighting the “convert_implicit” warning. If I look further into the warning text, there is mention of query plan choice and “CardinalityEstimate” or “SeekPlan” for the different convert_implicit warnings. Another way of looking at it is as follows.

This seems a lot friendlier than that little context window. Here we have an easy-to-follow layout of what is happening. As I drill down into the warnings, I can see there are three convert_implicit warnings and they all trigger a “planaffectingconvert” type of warning. Looking closer, I can see there are two distinct types of convert issues:

  • One warning for the conversion from nvarchar to int in the HumanResources.Employee table for the NationalIDNumber field (green).
  • Two warnings stating that an index seek has been disabled due to the conversions (orange and purple).

Curious to see what all of the possible “ConvertIssue” values could be for these implicit conversions? That is easy enough with the following query.
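A quick peek at the plan_affecting_convert_type map gives the full list:

SELECT mv.map_key
     , mv.map_value
FROM sys.dm_xe_map_values AS mv
WHERE mv.name = N'plan_affecting_convert_type';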

Which shows us the following results.

That is a grand total of three. Having covered the cause of two of these already, all that is left is the type called “unknown”. I think we can safely presume that the cause of this one is something else that is undocumented and hence “unknown”.

So far, so good. We have seen how these warnings can crop up in execution plans. We have also seen how to query for the “ConvertIssue” types, or “plan_affecting_convert_type”, from the Extended Events (XE) maps DMV.

Right about now, the light bulb should be popping up for you. If we can query the XE metadata for the different convert issue types, then does that mean there is a way we can use XE to track these things and review them at our own convenience? Yes there is!

XE

Without further ado, let’s look at an XE session that can be useful in discovering more information regarding implicit conversions in the database environment.
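The sketch below approximates such a session. The session name, file path, and actions are illustrative, and the commented line marks where heavier, fire-hose style events could be added if needed:

IF EXISTS (SELECT 1 FROM sys.server_event_sessions WHERE name = N'ImplicitConversion')
    DROP EVENT SESSION ImplicitConversion ON SERVER;
GO
CREATE EVENT SESSION ImplicitConversion ON SERVER
ADD EVENT sqlserver.error_reported
(
    ACTION (sqlserver.sql_text, sqlserver.database_name, sqlserver.client_app_name, sqlserver.username)
    WHERE error_number = 257          -- the implicit conversion error
),
ADD EVENT sqlserver.plan_affecting_convert
(
    ACTION (sqlserver.sql_text, sqlserver.database_name, sqlserver.client_app_name, sqlserver.username)
)
--,ADD EVENT sqlserver.sql_statement_completed (...)   -- uncomment and configure for fire-hose style capture
ADD TARGET package0.event_file (SET filename = N'C:\Database\XE\ImplicitConversion.xel')
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS, STARTUP_STATE = ON);
GO
ALTER EVENT SESSION ImplicitConversion ON SERVER STATE = START;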

This session is pretty comprehensive, with the ability to become very much a fire-hose session if the commented-out events are uncommented prior to session creation. Running this on a busy server (for less than one minute) for a client with a third-party vendor application produces results that look like the following.

As we can see here, the session is trapping when I have any of those pesky implicit conversion errors (red box outline) as mentioned in the previous article, as well as the convert_implicit warnings (purple box outline) that were demonstrated in the execution plans in this article.

How am I trapping the implicit conversion error? That is done via the following piece of the event session shown above.
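That piece, excerpted from the sketch above, is the error_reported event with a predicate on the error number:

ADD EVENT sqlserver.error_reported
(
    ACTION (sqlserver.sql_text, sqlserver.database_name, sqlserver.client_app_name, sqlserver.username)
    WHERE error_number = 257
)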

Error number 257 is the message id for the implicit conversion error. When filtering on that ID for this event, I will receive only the implicit conversion errors that match that number.

As for the implicit conversion warnings, we have the following that accomplishes that requirement.
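Again excerpted from the sketch above:

ADD EVENT sqlserver.plan_affecting_convert
(
    ACTION (sqlserver.sql_text, sqlserver.database_name, sqlserver.client_app_name, sqlserver.username)
)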

The event, plan_affecting_convert, should seem eerily familiar at this point. We have seen terminology like that in the execution plans as well as in the query for the map values.

The combination of these events will provide great insight into the issues revolving around any implicit conversions you may be having in your database environment.

Wrapping it Up

Implicit conversions are a fail-safe for when bad design or code (or both) crops up in your environment. When this fail-safe is being used in your environment, it is helpful to gain further insights via Extended Events. This article demonstrated another use for Extended Events. If you are in need of a little tune-up for your XE skills, I recommend reading a bit on Extended Events to get up to date. For other basics related articles, feel free to read here.

 

Implicit Conversion Fail


Every now and again, you may run into an error about implicit conversions in your queries. Implicit conversions are not uncommon in the computing world and can be viewed as kind of a fail-safe for when we don’t quite follow decent practices when designing the database, when writing queries, or both.

Despite this little fail-safe, there are times when a nasty little error will pop up and cause a bit of consternation.

Implicit conversion from data type %ls to %ls is not allowed. Use the CONVERT function to run this query.

What Went Wrong

 

Unlike many other errors in SQL Server, this error message makes some sense. The major components of what is wrong are present, and you are given a decent idea of the conversion attempt that failed. When this particular error happens, you can bet that there are issues with some TSQL code somewhere. In addition, you can bet there is likely a problem with the database design as well. Yay! More work for your back burner.

First, this error comes with an error id of 257 and we can see the message text via the following query. This id is important for when we want to monitor for this problem in the future.
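A quick check against sys.messages confirms the text:

SELECT msg.message_id
     , msg.severity
     , msg.text
FROM sys.messages AS msg
WHERE msg.message_id = 257
  AND msg.language_id = 1033;   -- English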

Let’s see how we can recreate this problem.

Which will produce the following.

This is a prime example of a bad query producing an error. Obviously, I am using the wrong data type to try and query the temp table. The ImplicitID column is an integer and I am trying to query it using a date. The quick fix would be to query the appropriate date column if I must use a date in my query, or I can use an integer to query the ImplicitID column.

After this minor tweak, the query will work and I start to see results. Given the random nature of the data in this query, the results will vary from batch to batch.

Wrapping it Up

Implicit conversions are a fail-safe for when bad design or code (or both) crops up in your environment. Sometimes this fail-safe is inadequate and needs further assistance. Sometimes that means an explicit conversion, and sometimes that means rewriting the query to use the appropriate columns and data types. This was an introductory article into the world of implicit conversions. There will be a follow-up or two about implicit conversions and monitoring for them. In preparation for the upcoming articles, I recommend reading a bit on Extended Events to get up to date. For other basics related articles, feel free to read here.

 

Puzzles and Daily Trivia

TSQL Tuesday

The second Tuesday of the month comes to us a little early this month. That means it is time again for another group blog party called TSQLTuesday. This party that was started by Adam Machanic has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

The Why?

This month, Matthew McGiffen (b | t) invites us to a little puzzle party for our TSQL Tuesday blog party. Bring your favorite brain teaser, puzzle, question, or interesting and complex TSQL problem/solution.

This ties pretty nicely into the topic from last month (well at least for me). If you recall, last month Todd asked all of us to share some of our uses for databases in our personal lives. I submitted my article here, but forgot about one of my favorite uses for a database in my personal life – a daily trivia set about SQL Server.

So, why not elaborate on that database a bit today? Almost like a two for one. However, there is one quick little departure I want to make. My first puzzle solved with TSQL was written about many moons ago and can be found here.

TSQL Challenges

A long time ago, Jacob Sebastian ran regular challenges involving TSQL to get you to think about ways to solve problems using TSQL. The site is no longer available, but challenge #97 was about solving Sudoku puzzles. Here is my solution to that particular challenge. A little TSQL and a bit of the black arts and there is a nifty little solution. Even though I have this little trick in the bag, I still solve the Sudoku puzzles the hard way.

What a nice little stroll down memory lane there. That solution alone could satisfy the request for this TSQL Tuesday. Alas, we won’t stop there!

QOTD

Several years ago, I put together a little database to help with daily trivia questions. The database is a simple design, with a primary function of helping teach SQL Server facts and internals to those with varying levels of DBA experience as well as helpdesk members. That said, the topic of the questions doesn’t have to be SQL specific – that was just my primary use.

With just a few tables and a few procs, I have a database that I can use to create questions, track responses from participants, and email questions and answers to participants on a daily basis – automatically. If I run out of questions, I just add more to the table. Nothing super complex there.
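Purely as an illustration of the shape (these table and column names are invented, not the actual design), the core of such a database could be as simple as:

CREATE TABLE dbo.TriviaQuestion
(
      QuestionID    INT IDENTITY(1, 1) PRIMARY KEY
    , QuestionText  NVARCHAR(2000) NOT NULL
    , AnswerText    NVARCHAR(2000) NOT NULL
    , Category      NVARCHAR(128)  NOT NULL DEFAULT (N'SQL Server')
    , DateSent      DATE           NULL   -- NULL until the daily job emails it out
);

CREATE TABLE dbo.TriviaResponse
(
      ResponseID    INT IDENTITY(1, 1) PRIMARY KEY
    , QuestionID    INT            NOT NULL REFERENCES dbo.TriviaQuestion (QuestionID)
    , Participant   NVARCHAR(256)  NOT NULL
    , ResponseText  NVARCHAR(2000) NOT NULL
    , IsCorrect     BIT            NULL
    , ResponseDate  DATETIME2(0)   NOT NULL DEFAULT (SYSDATETIME())
);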

What this offers me is a mechanism to mentor multiple people without burning out, while also gauging their true interest level in improving their SQL skill set.

Wrapping it Up

One of my favorite database automations is to send trivia-style questions daily. This helps me to mentor and assess other DBA talent within an organization without being too aggressive. Not only can a database be used for this sort of automation, it can be used for numerous other automations as well. Beyond being highly useful for automation and training, there is also the possibility of using TSQL to solve puzzles like Sudoku.

 

Making Databases Personal

TSQL Tuesday

The second Tuesday of the month comes to us a little early this month. That means it is time again for another group blog party called TSQLTuesday. This party that was started by Adam Machanic has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

The Why?

This month, Todd Kleinhans (b | t) invites us to share something personal about ourselves. Well sort of. The invitation to each of us is to share a little about databases that we use in our personal lives or that we have created in our personal lives to help us in some fashion or another.

I think this is a really cool idea. Data and databases are very integrated into my life. The integration is enough that it would be difficult to un-blur the line where personal life ends and professional life begins when talking about data and databases.

Take a moment and reflect on the importance of data and databases within your life. At a minimum, you have taken the approach that data is your profession of choice and hopefully it is a career for you and not just a job.

If you have taken it down the career path, then you probably study your craft in your personal time to try and make yourself better at your trade-craft. Oftentimes, if you are like me, these studies lead to trial and error, experiments, and possibly rabbit holes. If data is just a J-O-B, it is quite likely none of this will apply to you. Those are the types that fall into a category Steve Jones wrote about recently, and I would urge you to change that sooner rather than later. (Side note: I read that article as if it applied to me directly, because I really do need to learn and practice more about my craft too!)

Personal Databases

Over the years, I have had sooo many personal databases that I have created for one reason or another. Some still need to be finished. Here are some examples of what I use a database for in my personal life.

Finances: A simple little database to track credits and debits (sort of like a checkbook, yes). More importantly, it tracked bills and due dates with dollar amounts for each bill. This is useful both for bill reminders and for spotting noteworthy trends in spending or increased costs. Now, many banking (and credit card) sites offer the same types of services – so long as the spending is done with their card.

Exercise: Back in the day, I tracked all of my activities in a database. I would track the duration, activity type, effort level, and various biometric measures after each activity. Once again, this is something that is now tracked by so many different apps and databases that it seems counterproductive to maintain my own system in favor of something that works more easily on a wearable device.

I have also previously built databases to inventory books and movies, track personal studies and link thoughts about those studies from day to day, or even track routine chores (calendar reminders don’t offer much for tracking details). The short of it is, a database can be created to manage so many things as a part of everyday personal life.

 

Wrapping it Up

If you can use a database to create automation routines to help sustain your work life, you can certainly do the same to sustain your personal life. Databases are an essential piece of both my personal and professional life. Not only do I use them to create automation (such as this or this) for things, but I stand databases up to test numerous things from corruption to recovery to performance tuning to help improve my personal abilities and craft in the realm of data.

 

Passion, Challenges, and SQL

Published on: February 12, 2019

TSQL Tuesday

The second Tuesday of the month comes to us a little early this month. That means it is time again for another group blog party called TSQLTuesday. This party that was started by Adam Machanic has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

The Why?

Long time friend Andy Leonard (b | t) invites us this month to do a little checkup on ourselves and talk about the “why” around what we do. This could be a very easy topic for some. Equally, this could be a very difficult topic for those same people at different times in their lives. Therein lies the problem: the topic is simple in nature but requires a firm reflection on self and on what you have been doing.

The problem for me is less about the “why” behind what I do, and more about how to stretch it out into something more than a few sentences. Think! Think! Think!

Challenges

One of my biggest reasons for doing what I do boils down to the challenges that I frequently get to encounter. There is a wild satisfaction in working on a very difficult and challenging task, product, tool, profession, skill, etc. This satisfaction often involves reward and a sense of accomplishment.

The challenge can be anything from how to effectively communicate with a difficult person, to a tough-to-find internals problem in SQL Server that could be causing a performance issue, to taking over a project and turning it back from the edge of failure and onto a track of success. Sometimes, the challenge may be as simple as converting a pathetic cursor into a set-based approach and gaining a 100x improvement in performance.

I really do enjoy some of the puzzles (challenges) that I get to work on routinely. This gives me an opportunity to improve my skillset as well as continue to learn. Being able to continually improve is a great motivation for me. The frequent challenges and continual opportunity to learn present a great chance to evolve one’s self and career. In a constantly changing world, being able to naturally and easily evolve your personal career is a bonus!

Passion

“Do what you love and you will never work a day in your life.” This is a common saying in the United States. Agree or disagree – there is some truth to it. Being able to do something one loves makes the really hard days a lot easier. Knowing, I may be able to solve a complex problem makes it easier to face the day.

I really enjoy the opportunity to face difficult challenges and resolve them. The passion to solve these puzzles with data doesn’t end there. I also really enjoy the opportunity to learn, which brings up two other challenges that help me learn: speaking and writing.

By putting myself out there regularly to speak and write, I am becoming a better technical person. I am becoming better equipped to solve many of the puzzles I face. Those are great benefits. That said, I don’t feel I could get out there and talk about something about which I wasn’t passionate. I have learned to become passionate about writing and speaking – though I still have plenty of room for improvement (just as I do in my quest to become a good DBA).

Wrapping it Up

I really do enjoy the challenges I get to face on a frequent basis in the world of data. This is the big “WHY” for me to continue my progress in this career.

Find something you are passionate about and strive to envelop your career with as many opportunities to do that thing. If that means accepting some less wanted tasks in order to do more of the thing you love, it could very well be worth it!

Automating like an Enterprise DBA

Published on: January 8, 2019

TSQL Tuesday

The second Tuesday of the month comes to us a little early this month. That means it is time again for another group blog party called TSQLTuesday. This party that was started by Adam Machanic has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

Automation

The theme as chosen by Garry Bargsley (b | t) is about automation. Specifically, Garry has provided two requirements about automation for this month. As always, there is leeway in a post that participates in TSQL Tuesday.

One of the things that should seem very commonplace to a data professional is the effort to become a lazy DBA. A lazy DBA is not a bad thing. It just means the DBA works hard to automate the repetitive mundane tasks that may be tedious and/or time consuming. Time can always be better spent somewhere else, right?

If you are lacking in any ideas for what can be automated, here are a few TSQL Tuesday roundups from when we have talked about automation previously (yes it is a hot topic – ALWAYS!).

  1. August 2010 – Beach Time – what do you automate to earn beach time?
  2. February 2011 – Automation in SQL Server – Give your best tricks for making your life easier through automation.
  3. January 2014 – Automation – How much of it is the same?
  4. September 2015 – The Enterprise – How does one manage an enterprise of databases?
  5. September 2017 – PowerShell Automation – Find something and automate it.

In the past, I have written about automation a few times. Some of my favorites are automated restores, automation in the cloud, and my poor man’s automated audit.

I automate many processes and have automated loads of tasks over the years. You see, automation means I can spend more time doing other tasks that require more time, more thought/concentration, more effort, and frankly more interest. So what have I automated recently that may be different from what I have previously written? This time, I have something that may seem utterly trivial but in the end it is rather tedious and time consuming to manually check over and over and over.

PowerShell

When I automate a task, I generally try to use the tool that seems the most appropriate for the task: Windows scheduler, SQL Agent, TSQL, SSIS, VB, C#, and now I am trying to add PoSh to that list. I don’t believe there is a one-size-fits-all automation tool. Sometimes, one has to be flexible enough to adapt other technologies into the tool-belt.

I have been working with a client to check their servers for SQL Server version, SSMS version, PoSH version and so on. All of this to try and get the appropriate updates installed on the server. Believe it or not, many of their servers were still running PoSH v2 and didn’t have any Service Packs installed for their database servers. OUCH!

Touching every single server (even if it is only 10 servers) is far too tedious and error prone. So, I spent a little time kludging something together with my neanderthal-level PoSh skills and found a way to retrieve various pieces of information from the servers and then store those data points in a database so I could report on the entire environment easily with TSQL. In addition, I could show change history and find approximately (at worst) when an update was installed.

Of all the things I scripted to start tracking, the one I want to share this time can also be used to audit security on each of the database servers. I use the following script to audit the local admins on each of the database servers in the enterprise. In order to trap each local admin on the server, I also recurse through domain groups to find every member of each group, so I catch everybody that may have access. Here is a version of the script that is similar to what I use now.

Could I improve the efficiency of this script? Most definitely, I believe there is room for improvement. Remember, I am very much a novice with my PoSh skills. Scripting issues aside, it works: it fetches a list of servers from a database, iterates through each of those servers to fetch the complete list of local admins on each one, and then writes the complete list of admins for each server back to my database so I can generate a history of changes or report on who has admin access on each server.

For anybody that has admin access to a database server, the permission path (nested group path) is recorded in hierarchical form, separated by the caret character (^). Using this script, I have been able to provide a report to domain admins to clean out access for various individuals where it was neither intended nor necessary.
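Purely as an illustration (the table and column names here are placeholders, not the actual design), the reporting side of that stored data could be queried along these lines to show the most recent capture for each server:

SELECT laa.ServerName
     , laa.AccountName
     , laa.PermissionPath      -- nested group path, e.g. Administrators^DOMAIN\SomeGroup^DOMAIN\SomeUser
     , laa.CaptureDate
FROM dbo.LocalAdminAudit AS laa
WHERE laa.CaptureDate = (SELECT MAX(prev.CaptureDate)
                           FROM dbo.LocalAdminAudit AS prev
                          WHERE prev.ServerName = laa.ServerName)
ORDER BY laa.ServerName
       , laa.AccountName;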

Wrapping it Up

Automation is an essential tool for every data professional. Wait, no, that’s not accurate. Automation is an essential tool in all facets of IT. Automation is a definitive method to work more efficiently and offload some of the mundane repetitive tasks that consume too much time.

Even if the task is not trivial but needs to be repeated and done so without error, the best tool is automation. Performing tasks over and over naturally leads to higher risk of error. The way to minimize that risk is to perform the task via some automation script or routine.

T-SQL Tuesday Participation Over the Years

Published on: December 19, 2018

Loads of TSQL Tuesdays


Years ago, the TSQL Tuesday party was started by Adam Machanic (b|t). The premise of the monthly event is to give people a topic each month to try and ease some of the difficulty with writing (figuring out what to write) and to get more people involved in the community. The party started in December 2009 and is now embarking on its 10th year. The party is running strong and will likely continue for many years to come.

I have personally participated in quite a large number of these events. I am far from having a perfect participation record like Rob Farley (b | t), but I do participate when I can. Every now and again I think about what I might have participated in or what I have written in the past for a TSQL Tuesday. This post will serve as a table of contents for all of those articles, to make them easier for me to find (as well as give me a chance to review and fix problems from really old posts). It provides a quick synopsis of my article (if one exists) along with the topic and links to the original invite and roundup for each monthly party.

T-SQL Tuesday #001: Date/Time Tricks (Dec. 2009)

Invite and roundup

My Article: N/A

T-SQL Tuesday #002: A Puzzling Situation (January 2010)

Invite and roundup

My Article: TSQL Tuesday – But I was late

This was the first time I participated in TSQL Tuesday. I was very new to the blogging concept – and it shows. The puzzling problem I encountered in the article was a fun little job that kept giving different results than running the same code from SSMS. Check it out!

T-SQL Tuesday #003: RELATIONSHIPS (February 2010)

Invitation and summary

My Article: Relationships

I covered all sorts of different relationships that affect SQL Server from work, to marital, to table relationships in a database. It all ties together in requiring an individual to constantly tune the different types of relationships.

T-SQL Tuesday #004: IO (March 2010)

Invitation and roundup.

My Article: IO IO IO

What if we could create a numbers table without IO?  What if we could perform several different kinds of queries without IO?  Itzik Ben-Gan proposed a cascading CTE solution that does this sort of thing. This article shows that method in use to help improve certain IO conditions.
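As a refresher, a minimal sketch of the cascading CTE idea (not the exact script from the article) looks like this:

-- Cross-joining tiny CTEs multiplies row counts without touching any table, so no IO is incurred
WITH L0 AS (SELECT 1 AS c UNION ALL SELECT 1)                        -- 2 rows
   , L1 AS (SELECT 1 AS c FROM L0 AS a CROSS JOIN L0 AS b)           -- 4 rows
   , L2 AS (SELECT 1 AS c FROM L1 AS a CROSS JOIN L1 AS b)           -- 16 rows
   , L3 AS (SELECT 1 AS c FROM L2 AS a CROSS JOIN L2 AS b)           -- 256 rows
   , L4 AS (SELECT 1 AS c FROM L3 AS a CROSS JOIN L3 AS b)           -- 65,536 rows
   , Nums AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n FROM L4)
SELECT n
FROM Nums
WHERE n <= 1000;   -- take only as many numbers as needed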

T-SQL Tuesday #005: Reporting (April 2010)

Invitation and roundup.

My Article: IP and Default Trace…T-SQL Tuesday #005

Having been introduced to a requirement to report on IP addresses of connections, I dive into a solution that will help show hostname, IP address, and a few more tidbits.

When it is necessary to provide reports on activity occurring on the server, it pays to do a little prep work.  Be Prepared.  It is not an easy task to be able to go back in time and report on data that isn’t captured.  The little prep work that one may need to do is well worth the effort in the end.

My 2nd Article: BLOB Report T-SQL Tuesday #005 – Reporting

Learning how to decrypt the individual object sizes broken down into type and filegroup, and then report on the data that was retrieved from the DMOs.

T-SQL Tuesday #006: “What About BLOB?” (May 2010)

Invitation and roundup.

My Article: T-SQL Tuesday #006: A Blobbing We Will Go

Did you know that in the DMVs for SQL 2005 and SQL 2008 there is more than one place you can find information about the size of your LOB data?  And it goes without saying that there is more than one way to find information about LOBs in your database.

T-SQL Tuesday #007 – Summertime in the SQL (June 2010)

Invitation and roundup.

My Article: Data Compression

I see database compression, as offered with SQL 2008, to be more like these file compression utilities than DriveSpace.  Data compression in SQL 2008 is not an all or none implementation.  You get to pick and choose what gets compressed.  That is a big time bonus for me.

T-SQL Tuesday #008: Gettin’ Schooled

Invitation and roundup.

My Article: Gettin’ Skewled

I am learning that learning is not just formalized education in a classroom or in specific settings.  There are things to be learned from all aspects of life, if only a little observation is used.

T-SQL Tuesday #009: Beach Time (August 2010)

Invitation and roundup.

I hosted this particular event.

My Article: R & R

I find it completely useless to go on vacation if I am going to be checking email or project statuses every 10 minutes.  There is no rest or relaxation in doing those things while I am supposed to be doing something else.  Vacation should be fun and enjoyable.  Thus, if I am to enjoy vacation, I need to do a few extra things in the office prior to leaving.

T-SQL Tuesday #010 – Indexes (September 2010)

Invitation and roundup.

My Article: TSQL Tuesday Indexes and Blobs

How does one find which columns are LOB columns in the database?  I knew I had some past blog posts about various aspects of LOBs, but I had never broken it down to find the columns in the LOB.  Even better, I wanted to know which of those BLOB columns were in what index.

T-SQL Tuesday #011 – Misconceptions in SQL Server (October 2010)

Invitation and roundup.

My Article: A Haunting TSQL Tuesday Tale

I chose the myth that truncate is unrecoverable. Check it out and see how a truncate is most certainly capable of being rolled back or recovered.

T-SQL Tuesday #012 – Why are DBA skills necessary? (November 2010)

Invitation and roundup.

My Article: T-SQL Tuesday #012 – Skills

As DBAs, we occasionally have the opportunity of attracting a new client, a new job, or a new database.  It seems that more often than not, some of the skills that should be requisite for owning a database are missing. (Nunchuku skills could come in handy from time to time too!)

T-SQL Tuesday #13 – What the Business Says Is Not What the Business Wants (December 2010)

Invitation and roundup.

My Article: T-SQL Tuesday #13 – Business Requirements

I think a common area that is easily overlooked when it comes to requirements and interpretation of requirements is report creation.  A common problem is that there are no defined or written requirements for the creation of a report.

T-SQL Tuesday #014 – RESOLUTIONS (January 2011)

Invitation and roundup.

My Article: TSQL Tuesday 14: Committed

This month was all about resolutions and goals. My list of goals was pretty good and entirely profession-based.

T-SQL Tuesday #015 – Automation in SQL Server (February 2011)

Invitation and roundup.

My Article: T-SQL Tuesday #15 DBA Automaton

I shared a script that will load a sample of data from every table in every database for every column and give you the length of the pertinent columns (I have excluded obvious columns such as numeric types and certain LOB types).

T-SQL Tuesday #016 – Aggregate Functions (March 2011)

Invitation and roundup.

My Article: T-SQL Tuesday #016: Aggregates and Statistics

Super geeky article on stats, quartiles and the like. For my data analysis and trending, I wanted to find a simple distribution across quartiles.

T-SQL Tuesday #017 – APPLY Knowledge (April 2011)

Invitation and roundup – unavailable on origin site.

My Article: T-SQL Tuesday #17 – APPLY Knowledge

In this article I take an unexpected twist in my application of the APPLY operator. In that twist, I show a quick and simple script to grab role members.

T-SQL Tuesday #018 – CTEs (May 2011)

Invitation and roundup.

My Article: T-SQL Tuesday #18 – CTEs

I shared a rather complex series of CTEs that are used in the same way a hierarchy and ancestry tree would be generated – except for Foreign Key relationships. That script was a load of fun.

T-SQL Tuesday #019 – Disasters and Recovery (June 2011)

Invitation and roundup.

My Article: T-SQL Tuesday #19 – Disasters & Recovery

Who needs a single disaster when you can enjoy multiple disasters in a single sitting? I share three different types of disasters. I am sure many have had experiences with all three and probably many more types of disasters.

T-SQL Tuesday #020 – T-SQL Best Practices (July 2011)

Invitation and roundup.

My Article: N/A

I missed the announcement or something like that.

T-SQL Tuesday #021 – A Day Late and Totally Full Of It. (Aug. 2011)

Invitation and roundup.

My Article: TSQL Tuesday 21 – FAIL FAIL FAIL

I don’t hide much as I share my feelings about crap code and then show my own crap code followed by a little bit of explanation on an improved version of the same code.

T-SQL Tuesday #022 – Data Presentation (September 2011)

Invitation and roundup

My Article: T-SQL Tuesday #22 – Data Presentation

I chose to touch on several aspects of data presentation – Performance, Accuracy, Display, and Business Requirements.

T-SQL Tuesday #023 – Joins (October 2011)

Invitation and roundup.

My Article: N/A

D’oh – missed another one in 2011.

T-SQL Tuesday #024 – Prox ‘n’ Funx (November 2011)

Invitation and roundup.

My Article: T-SQL Tuesday #024: Prox ‘n’ Funx

Big takeaway from this month was the value of a very good string splitter.

T-SQL Tuesday #025 – Invitation to Share Your Tricks (Dec. 2011)

Invitation and roundup.

My Article: T-SQL Tuesday #025 – Holiday Gifts

In the spirit of the Holidays, I wanted to share some tricks and tips.  They can be my gifts to you during the holidays.  And maybe they can help you give to somebody else.

T-SQL Tuesday #026 – Second Chances (January 2012)

Invitation and roundup. (Original site toasted).

My Article: TSQL Tuesday #26 or #23 – Identity Crisis

Having missed TSQLTuesday 23, and with the topic being about second chances, I chose to write about JOINs, sort of fulfilling the requirements for both TSQL Tuesdays in one shot.

T-SQL Tuesday #027 – Invitation to The Big Data Valentine’s Edition (February 2012)

Invitation and roundup.

My Article: N/A

After that second chance, I blew it the very next month. Luckily I have a good reason – Valentine’s Day!

T-SQL Tuesday #028 – Jack of All Trades or Master of None (March 2012)

Invitation and no roundup.

My Article: T-SQL Tuesday #028 – Jack of All Trades, Master of None?

This one brings up bad memories of when, as the DBA, I was also required to be the janitor.

 

More updates coming soon!

What is T-SQL Tuesday?

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements below. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Steve Jones via the tsqltuesday website – here.

How to Participate

  • Your post must be published between 00:00 GMT Tuesday and 00:00 GMT Wednesday.
  • Your post must contain the T-SQL Tuesday logo from above and the image should link back to this blog post.
  • Trackbacks should work. But, please do add a link to your post in the comments section below so everyone can see your work.
  • Tweet about your post using the hash tag #TSQL2sDay.