Collation Conflict with Extended Events

Have you ever run into an error like this?

Cannot resolve the collation conflict between “pick a collation” and “pick another collation” in the equal to operation.

This kind of error seems pretty straightforward, and it is safe to say that it generally happens in a query. When you know what the query is and you get this error, it is pretty easy to spot and fix the problem. At the very least, you can band-aid it well enough to get past the error with a little use of the COLLATE clause in your query.
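For example, when the two sides of a join carry different collations, forcing one side to a common collation gets you past the error. A minimal sketch (the table and column names here are made up for illustration):

```sql
-- Hypothetical tables whose join columns use different collations
SELECT s.AccountName
FROM dbo.StagingAccounts AS s
INNER JOIN dbo.Accounts AS a
    ON s.AccountName = a.AccountName COLLATE DATABASE_DEFAULT; -- band-aid for the conflict
```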

But what if the error you are seeing is popping up when you are trying to use Management Studio (SSMS)? The error is less than helpful and can look a little something like this.

And for a little context on how that error message popped up, here is a better image.

As you can see here, the error message popped up just trying to get into the Extended Events Sessions folder in SSMS. Just trying to expand the folder to see a list of the sessions on the server throws this error. So what is really happening?

Background

First let’s cover a little background on this problem. This server was poorly configured. There was considerable set-it-and-forget-it action with this server – meaning all defaults and a lot of bad choices. In an effort to update the instance to meet corporate standards, the client attempted to change the server collation to “Latin1_General_100_CI_AS_SC”.

Changing collations on a server is a rare occurrence in the grand scheme of things. If you do it at the right time, all will go well. Do it on a server that is poorly configured and there is a little more risk. In this case, the collation change had failed for the server. To confirm that the server collation change had failed, we can run a few tests. First though, let’s see what the query looks like that was causing the error from within SSMS.

Capturing that query is one time that Profiler can actually come in handy (not that I would try to rely on it too much), given that there is a collation issue with Extended Events (XE). We can now take this a step further and start to show that the server collation change failed. Let’s check the collations on each of those objects and then validate the server collation.
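A quick way to run those checks is something like the following:

```sql
-- Compare the server collation to the collation of the databases
SELECT SERVERPROPERTY('Collation')                AS ServerCollation,
       DATABASEPROPERTYEX('master', 'Collation')  AS MasterCollation;

-- And all databases at once
SELECT name, collation_name
FROM sys.databases
ORDER BY name;
```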

As we can see from the image, the collation for the master database was indeed changed (it is done before the user databases) but the change for the server collation failed. As it turns out, due to the sequence of events, if there is a failure in changing the collation of a user database, then the collation change for the server fails but the master database will indeed change. We can further confirm this from the output of the attempted collation change. Here is a snippet from a failed change (sadly it does not show the user database change failure, due to the extent of the output).

So, how do we go about fixing it?

The Fix

Well, since we know where the failure occurs, the fix becomes pretty apparent. It may be a lot of work – but it isn’t too bad. The fix is to detach all user databases, attempt the collation change again, and then re-attach all databases. To help with user database detach and re-attach, I would recommend using a script that can generate the necessary detach and attach statements for all databases. It will save a bit of time.
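A rough sketch of generating those statements follows. Build the attach commands from sys.master_files before detaching anything, and review the generated output before running any of it:

```sql
-- Build the re-attach statements first (file paths come from sys.master_files)
SELECT 'CREATE DATABASE ' + QUOTENAME(d.name) + ' ON '
       + STUFF((SELECT ', (FILENAME = ''' + mf.physical_name + ''')'
                FROM sys.master_files AS mf
                WHERE mf.database_id = d.database_id
                ORDER BY mf.file_id
                FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 2, '')
       + ' FOR ATTACH;' AS AttachCmd
FROM sys.databases AS d
WHERE d.database_id > 4;   -- user databases only

-- Then the detach statements
SELECT 'EXEC sp_detach_db @dbname = ' + QUOTENAME(d.name, '''') + ';' AS DetachCmd
FROM sys.databases AS d
WHERE d.database_id > 4;
```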

Once reattached, I can try to look at the XE Sessions from SSMS with much different results (shown here).

You see? It is as easy as that.

Final Thoughts

The default collation for SQL Server is a pretty bad idea. Sure, it works but so does SQL Server 7. When you have the opportunity to update to more current technologies, it is a good idea. Sometimes though, that upgrade can come with some pain. This article shows how to alleviate one such pain point by fixing problems related to collation conflicts and XE.

This is yet another tool in the ever popular and constantly growing library of Extended Events. Are you still stuck on Profiler? Try one of these articles to help remedy that problem (here and here)

The Extended Events library has just about something for everybody. Please take the time to explore it and become more familiar with this fabulous tool!

Window Shopping the Query Store with XEvents

Published on: March 11, 2020

TSQL Tuesday

The weather is starting to warm and Spring is upon us. It feels like a great time to take a stroll outside in the warming sun and maybe do a little window shopping on our way to the March (2020) blog party called TSQLTuesday.

This party, that was started by Adam Machanic, has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

This month, the blog party is all about couponing and super shopping at the Query Store. Tracy Boggiano (b | t) insists that everybody could save a grundle of money by couponing and shopping at this ultra sleek new store. Ok, maybe you can’t actually shop there but you can pick and choose your execution plans from this store just like you could pick and choose your favorite brand of spaghetti noodles from a store shelf. You can find the invite from Tracy – here.

I will not be delving into how to use the Query Store (QDS). Nor will I be exploring whether or not QDS is the right tool for you. What I am here to sell you on though is two methods to help you become better acquainted with the inner workings of QDS – via Extended Events (XE).

Some Basics

One of my favorite questions to ask during some of my presentations on XE is “What was the first version of SQL Server to have Query Store?” You can imagine the wide array of answers, but what is interesting is how often the correct answer is missed. I hear lots of answers for 2012, some answers for 2017, and somewhere in between for 2016. But never does the correct answer pop up.

Right now, I hope you are scratching your head at that last statement. You see, the question is somewhat of a trick question. The first version of SQL Server that has QDS is SQL Server 2014. However, the first version where you can actually use it is SQL Server 2016. This fun fact is visible when we start exploring SQL Server from the realm of XE. Let’s take a look.

The following is a screenshot showing some data that I have been tracking on XE across all versions of SQL Server ever since XE was incorporated into the product. Don’t just take my word for it – there is a query later that can help you see this for yourself on a SQL Server 2014 instance.

The result of the query is far too large for this article, so I just grabbed a snippet of the result and marked each version of SQL Server where various events can be seen. Some events were deprecated after SQL Server 2014 (in blue) as noted in the image, and some that existed in SQL Server 2014 still exist in SQL Server 2019 (in green).

Here is a query (as promised) that can be executed on SQL Server to capture the events related to QDS.
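A query along these lines – pulling the events supplied by the qds package out of the XE metadata – will do the trick:

```sql
-- Events supplied by the qds (Query Store) package
SELECT p.name AS package_name,
       o.name AS event_name,
       o.description
FROM sys.dm_xe_packages AS p
INNER JOIN sys.dm_xe_objects AS o
    ON o.package_guid = p.guid
WHERE p.name = 'qds'
  AND o.object_type = 'event'
ORDER BY o.name;
```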

If you are curious to see other sources showing that this feature was planned to be released in 2014, you can read it from Brent Ozar – here.

Looking Deeper

Referring back to the previous query, you may have noticed that I am filtering the events down to those that are provided by the “qds” package. The “qds” package is the provider for the Query Store (short name) or Query Data Store (long name). It is not just events that are provided by that package; there are maps as well. Interestingly enough, there is more to this little package than just the QDS. Take a look at these results from a SQL Server 2019 instance (similar results can be found when querying SQL Server 2017 as well).
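To see everything the package provides (events, maps, and anything else), a small variation on the previous query does the job:

```sql
-- Everything the qds package provides, grouped by object type
SELECT o.object_type,
       o.name,
       o.description
FROM sys.dm_xe_packages AS p
INNER JOIN sys.dm_xe_objects AS o
    ON o.package_guid = p.guid
WHERE p.name = 'qds'
ORDER BY o.object_type, o.name;
```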

See those events in the blue box? Those are all related to the Automatic Tuning feature. Yes, they are related to QDS, and that is why these events are provided by the same package as QDS.

Sandbox

Let’s go ahead and enable QDS so we can take this to the next level.
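Enabling it is a single ALTER DATABASE statement. The settings below are merely an illustrative set (the database name is the demo database used later in this article):

```sql
-- Enable Query Store on the demo database (settings are illustrative)
ALTER DATABASE [WhatsYourDBFlavorofTheDay]
SET QUERY_STORE = ON
    (OPERATION_MODE = READ_WRITE,
     DATA_FLUSH_INTERVAL_SECONDS = 900,
     INTERVAL_LENGTH_MINUTES = 30,
     MAX_STORAGE_SIZE_MB = 500,
     QUERY_CAPTURE_MODE = AUTO);
```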

Once we have QDS enabled (you are not restricted to the settings I chose to use – try something different if you want), let’s go ahead and also gather some additional information that will be helpful for creating some XE sessions.
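A query along these lines does the job. It assumes the internal Query Store tables all follow the plan_persist naming pattern:

```sql
USE [WhatsYourDBFlavorofTheDay];
GO
-- Internal Query Store tables (names follow the plan_persist pattern)
SELECT name, type_desc
FROM sys.all_objects
WHERE name LIKE 'plan_persist%'
ORDER BY name;
```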

That query will provide a list of tables that are related to QDS. It will also provide the proof of concept (basically) for creating an XE session. Knowing that the query will return just those objects in which I am interested, I am ready to go ahead and apply the same query pattern to my XE session.

But what is this XE session? Well, I am somebody who is curious and I want to know precisely what is being done when I do something like this.
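The “something” in question is, in essence, purging the Query Store data for the database:

```sql
-- Purge all Query Store data for the database
ALTER DATABASE [WhatsYourDBFlavorofTheDay] SET QUERY_STORE CLEAR;
```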

That statement has the same effect as clicking the button from the Database properties page for the Query Store.

Since I really want to know what is happening when the QDS is cleared, I have this little XE Session to show me exactly that.

Never mind the fact that I am creating a folder on disk despite the fact that I am using the ring_buffer in this session. I do that with almost all of my sessions just in case I want to add an event_file target at some future point. A couple of key points for this session: 1) I have scoped the results to be pertinent just to the WhatsYourDBFlavorofTheDay database; and 2) the results are further filtered to just be for any objects that match “plan_persist” with some wildcards. If I have that session running, and I clear the QDS cache, I will see results similar to the following.

When the cache is cleared for QDS, it is a very quick process. Looking at the results of this session, we can see why that is – the underlying tables are all TRUNCATED. This may be a bit of a heavy-handed approach given that the truncate does reseed the tables. Just be aware of that fact. Personally, I am OK with the TRUNCATE, and it is nice to know what is happening behind the scenes.

That is a bit of a fun look at a somewhat trivial piece of the QDS. Something more technical comes from trying to monitor the QDS for plan removal, cleanup processing, and similar tasks. How and when does some of that work/occur? Let’s get a more comprehensive XE session to peer into those kinds of internals.
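One way to build such a session, without typing out every ADD EVENT by hand, is to generate it dynamically from the qds package metadata. Here is a hedged sketch; the session name, target, and options are illustrative choices rather than the only way to do it:

```sql
-- Build a session containing every public event from the qds package
DECLARE @sql nvarchar(max);

SELECT @sql = N'CREATE EVENT SESSION [QDSInternals] ON SERVER' + CHAR(10)
       + STUFF((SELECT CHAR(10) + ',ADD EVENT qds.' + QUOTENAME(o.name)
                FROM sys.dm_xe_packages AS p
                INNER JOIN sys.dm_xe_objects AS o
                    ON o.package_guid = p.guid
                WHERE p.name = 'qds'
                  AND o.object_type = 'event'
                  AND (o.capabilities IS NULL OR o.capabilities & 1 = 0) -- skip private events
                ORDER BY o.name
                FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 2, '')
       + CHAR(10) + N'ADD TARGET package0.ring_buffer
WITH (MAX_MEMORY = 4096 KB, STARTUP_STATE = OFF);';

IF EXISTS (SELECT 1 FROM sys.server_event_sessions WHERE name = N'QDSInternals')
    DROP EVENT SESSION [QDSInternals] ON SERVER;

EXEC sys.sp_executesql @sql;

ALTER EVENT SESSION [QDSInternals] ON SERVER STATE = START;
```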

That is a big session script! Here is basically what triggers all of those events in this session to fire.

And with that session running and after waiting a period of time, here is a sample of what some of that session data may look like.

Your results could vary quite substantially. That said, this session could help you to better understand how cleanup operations are occurring within QDS.

Wrapping it Up

This article has just shared multiple tools to help you become more acquainted with the Query Store! This acquaintance is coming via an extremely powerful tool called Extended Events. Through the use of these two sessions and two additional scripts, this article demonstrates how to become more familiar with the internals for QDS.

Feel free to explore some of the other TSQL Tuesday posts I have written.

If you are in need of a little tune-up for your XE skills, I recommend reading a bit on Extended Events to get up to date. For some “back to basics” related articles, feel free to read here.

Top 3 Database Life Hacks

TSQL Tuesday

Woohoo, it is an illuminating time of the month this second Tuesday of February 2020 and that also means it is time for another group blog party called TSQLTuesday. This party that was started by Adam Machanic has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

This month, the blog party is all about hacking your database – err, hacking your database life. Jess Pomfret (b | t) wants us to each share some of the best tips we have that can help save us oodles of time or can simplify our job significantly. There has been a growing phenomenon of late commonly called life hacks. You can find the invite from Jess – here.

I have a plethora of hacks of all sorts. But since we are supposed to focus on “life” hacks, my plethora is reduced to a smaller plethora. Whittling down that smaller plethora, I have a few that I really must recommend. A couple of these have appeared at other times in some of my articles. Check out my top 3!

Solutions

This is one that I have been using for most of my career. I started using Solutions in SSMS to help organize my scripts and keep them handy for quick use. The amount of time saved there is beyond count at this point. I adopted this method later in my career to also use it when giving presentations.

By adding all of my scripts for a presentation to a solution, I have a quick and efficient way to organize my scripts for use throughout the presentation. In addition, I have the solution saved so I can quickly reload it at any time when preparing to give the presentation again. Open the solution – BAM! I am ready to go.

As you can see in the associated image, I don’t just do a single presentation per solution. I create one solution to house all of my regular presentations. An added benefit to this is that I will have it already loaded just in case there is a pertinent question that can be best explained through the use of a demo from a different presentation.

I explain in more detail how to use solutions from my previous post on the topic – here.

 

Mapping it Out

This feature is one of my favorite features still – almost four years later. Even better is that MANY data professionals are still surprised to learn that it exists. I wrote about the scrollbar map in this article and here is a quick synopsis of the feature.

In the following image, I have highlighted the scrollbar as it is configured in SSMS for all instances where I use SSMS. Notice how it looks somewhat different than a typical scrollbar! To help point it out, I have circled the scrollbar in red.

How do we enable that map? That is easy, right click the scrollbar itself (as shown in the following image) and then select “Scroll Bar Options…”

After clicking that menu option, the following window will open.

Circled in red is the set of configurations that will control the look and behavior of the scrollbar. Personally, I prefer map mode over bar mode. Map mode also comes with the option of a “source preview” that will show the code as I hover over the map (as shown in the first image). Again, using this tool, I can navigate the code so much faster and save soooo much time, it is a must have.

Clickable TSQL

One of my favorite time saving hacks is the ability to quickly find code that references a specific object. In addition to being able to quickly find the desired object references in code, I like to be able to display the referencing code quickly without having to go search for it in source control or in SSMS by name. This is where a handy dandy script that formats the code into a clickable result is helpful. Not only does it give me the ability to display the code – it displays it in the format in which it was committed to the database.

All of the credit for this code nugget goes to Adam Machanic (b | t). He took the time to develop this method and to map out the “?”-able characters appropriately.
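His full version maps out the problem characters far more thoroughly than this; a bare-bones sketch of the core idea – wrapping a module definition in an XML processing instruction so the grid result becomes a clickable, formatted document – looks something like the following. The search term is a placeholder, and any definition containing the sequence ?> would need extra handling:

```sql
DECLARE @SearchTerm sysname = N'SomeTableName';   -- placeholder object name

-- Find modules referencing the object and make each definition clickable
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id)        AS object_name,
       CONVERT(xml, N'<?query --' + NCHAR(13) + NCHAR(10)
                    + m.definition
                    + NCHAR(13) + NCHAR(10) + N'--?>') AS clickable_definition
FROM sys.sql_modules AS m
WHERE m.definition LIKE N'%' + @SearchTerm + N'%';
```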

When running this code, I get something like the following.

When clicking the link illustrated in the preceding image (big red arrow pointing to it), I get to see something like the following.

Note here that whatever was committed to the database, that is what is being displayed. Sure, I put a sarcastic comment in the green box about the developer’s notes (entirely because it is all XML and is poorly written for the SQL side imo), but there is also the important illustration in red showing the Create Procedure statement where the stored proc is actually being created. Furthermore, this is the exact format in which the proc was committed to the database. Now, I have just saved myself some time every time (potentially dozens of times a day) that I have to inquire of the database for a specific referenced object and all of the code that references that object.

Wrapping it Up

There you have it – three very powerful but simple hacks that can give you back dozens of minutes per day. These hacks can help you better navigate code and respond to some of those interrupt requests that require either some referencing code or a quick bit of troubleshooting for an issue that just cropped up. Check them out and try them. I am sure you will find they will save you oodles of time too!

Feel free to explore some of the other TSQL Tuesday posts I have written.

If you are in need of a little tune-up for your XE skills, I recommend reading a bit on Extended Events to get up to date. For some “back to basics” related articles, feel free to read here.

Trace Query in Extended Events

Tracing a query is a common task for a DBA. The methods employed are pretty varied. I seem to encounter a new method here and there throughout my journeys and projects.

One method is surprisingly easy and I don’t recall ever paying it any attention until recently. Upon discovering this method, I was appalled that there is still no equivalent method within Extended Events (XE). In this article, I am going to share that method and a viable equivalent for XE until an appropriate GUI integration is created for SSMS.

Query Tracing

First things first, how do we find this supremely easy tracing method? When do we want to use it? Let’s answer that second question first. This is a method we will want to use whenever we have a query that we have just been handed and we want/need to trace the query to figure out things such as resource utilization. To get to this method, we simply right click in the query pane and select “Trace Query in SQL Server Profiler” from the context menu. The following image illustrates that menu option.

After selecting that menu option to trace the query, the next step is amazingly simple too – just execute the query. That is fabulous – if you want to use a tool as inefficient and outdated as Profiler. How do we do this in XEvents? First, we need to capture a bit more detail from this Profiler style trace.

Let’s delve into the properties for that trace session we just started (from the previous image).

Once we have the properties open, the next step is to click the “Column Filters…” button as shown in the following image.

After clicking the “Column Filters…” button, a new window will open, revealing any filters that were defined for that query we wanted to trace. In this case, the SPID for the query window is transferred to the query trace. Thus, in theory, this Profiler trace will only capture data related to the SPID in question. With the filter in hand, and also noting the events being trapped from the properties window, we have adequate information to create an XEvent session to perform the same functionality.

We can easily set up a session in XE through the GUI using the Standard template shown here:

And then modify it to include the missing events as shown here:

Or, one could use the easy button and take advantage of a script. The script provides a much more robust option while also being far less repetitive than the GUI.
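Here is a hedged sketch of such a script. It borrows a few of the events from the Standard trace template, filters them to a single session id, and builds the session through dynamic SQL because event session predicates cannot reference variables directly (the session name is a placeholder):

```sql
DECLARE @SessionId int = 53;   -- the SPID of the query window to trace
DECLARE @sql nvarchar(max);
DECLARE @spid nvarchar(10) = CAST(@SessionId AS nvarchar(10));

SET @sql = N'
IF EXISTS (SELECT 1 FROM sys.server_event_sessions WHERE name = N''TraceQuerySpid'')
    DROP EVENT SESSION [TraceQuerySpid] ON SERVER;

CREATE EVENT SESSION [TraceQuerySpid] ON SERVER
ADD EVENT sqlserver.existing_connection
    (ACTION (sqlserver.client_app_name, sqlserver.session_id)
     WHERE (sqlserver.session_id = ' + @spid + N')),
ADD EVENT sqlserver.rpc_completed
    (ACTION (sqlserver.client_app_name, sqlserver.session_id)
     WHERE (sqlserver.session_id = ' + @spid + N')),
ADD EVENT sqlserver.sql_batch_starting
    (ACTION (sqlserver.client_app_name, sqlserver.session_id)
     WHERE (sqlserver.session_id = ' + @spid + N')),
ADD EVENT sqlserver.sql_batch_completed
    (ACTION (sqlserver.client_app_name, sqlserver.session_id)
     WHERE (sqlserver.session_id = ' + @spid + N'))
ADD TARGET package0.ring_buffer
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS, STARTUP_STATE = OFF);

ALTER EVENT SESSION [TraceQuerySpid] ON SERVER STATE = START;';

EXEC sys.sp_executesql @sql;
```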

This session is ready to roll (without a GUI access point obviously) simply by entering the SPID # into the @SessionId variable for the query window in question. Once the spid value is entered and the script is executed, we can easily watch the live data for the spid in question (if we wish).

So, what happens when I need to trace a different spid? That’s easy! I just change the value of the @SessionId variable to the spid in question and then run the script again. The script will drop and recreate the session with all of the appropriate filters in place and pointing to the correct SPID.

Final Thoughts

The ability to quickly and easily trace a query is important to database professionals. This script provides one useful alternative to trace a specific spid similar to the method of using the context menu to create the trace within SSMS and Profiler.

This is yet another tool in the ever popular and constantly growing library of Extended Events. Are you still stuck on Profiler? Try one of these articles to help remedy that problem (here and here)

The Extended Events library has just about something for everybody. Please take the time to explore it and become more familiar with this fabulous tool!

CRM Data Source Connection Error

Working through some security modernization recently with a client, I ran into a fun little nugget. What we were trying to accomplish was to make the service accounts being used more secure through the use of Managed Service Accounts.

Making the change was fairly easy and all of our services seemed to start up swimmingly. SQL Server was running perfectly fine. Even when we got to SSRS and applied the change to the service account via SSRS Configuration Manager, things went smoothly.

After ensuring the services all started came the task of validation. From a sysadmin perspective, we often figure the validation process means looking through the tools we know; sometimes we don’t know how to validate functionality via the application itself. For that, we more or less rely on the end-users. While waiting for the end-users to perform their validations, we resort to the tools known best to us – SSMS etc.

In the case of testing reports, one may attempt to use the Report Manager URL (or report server URL). When opening reports, you may see that everything is working correctly through that tool. So, you continue to wait for the end-user to validate. Eventually they say it all looks good, only to report back a week later that nothing is working – and now you are stuck trying to figure out what to do to resolve the problem.

SSRS Service Account Needs PrivUserGroup

This is particularly interesting given everything worked just fine before the service account change and given that the only thing that changed was the service account (via Reporting Services Configuration Manager). Nothing changed with the execution account (an execution account was not actually employed in this configuration at any point).

We also determined that nothing needed to be changed with the CRM’s “SRS Data Connector”. What is the next best thing? Well, time to take a look into the logs and try to find some clues there. Perusing the logs, I was able to find a couple of key indicators. Here are the two enlightening errors.

First error:

“Cannot create a connection to data source ‘CRM’.”

Better, more verbose exception that broke the case wide open:

“Immediate caller <SQL Server Reporting Services Service Account> has insufficient privilege to run report as user <SID>”

Googling for these errors revealed a lot of information about the errors in the event an Execution Account for SSRS was being used. That scenario didn’t apply, but it did give a hint at what to look at next. Comparing the previous service account’s permissions in Active Directory to the new service account’s permissions, we found the “PrivUserGroup” permission was missing. This was the exact permission mentioned in all of the other articles. Adding that permission immediately solved the issue for the end-users and their ability to run reports via the CRM application.

The Wrap

This article takes us to the edge with a couple of CRM related errors after changing the service account to a more secure Managed Service Account. Despite the CRM reports working properly within Report Manager (via SSRS), the reports would fail in CRM.

Comparing the permissions for the old service account versus the new service account shed a bright light on the problem indicating a missing permission. PrivUserGroup is essential to proper report functionality and should be added to either your execution account or your SSRS service account.

This has been another post in the back to basics series. Other topics in the series include (but are not limited to): Backups, backup history and user logins.

Coping with Self Doubt

TSQL Tuesday

Behold, the second Tuesday of January 2020 and that means it is time for another group blog party called TSQLTuesday. This party that was started by Adam Machanic has now been going for long enough that changes have happened (such as Steve Jones (b | t) managing it now). For a nice long read, you can find a nice roundup of all TSQLTuesdays over here.

This month, the blog party is almost like a masquerade ball. John Shaulis (b | t) has implored us to take hold of our inner impostor. Okay okay, he hasn’t asked us to be impostors, but rather he has requested we talk about Impostor Syndrome.

At first glance, and seeing the word “impostor”, my thoughts turned outward. I thought John wanted us to out all of those Sr. DBA candidates out there who are soooo confident but don’t know the difference between a PK and FK or even the difference between a delete and truncate (as examples).

In reality, John is asking about something entirely different. Looking up “Impostor Syndrome” I see several definitions similar to the one that John posted. Here are a couple of examples.

Impostor syndrome can be defined as a collection of feelings of inadequacy that persist despite evident success. ‘Impostors’ suffer from chronic self-doubt and a sense of intellectual fraudulence that override any feelings of success or external proof of their competence. – Harvard Business Review

And…

Impostor syndrome is a psychological pattern in which one doubts one’s accomplishments and has a persistent internalized fear of being exposed as a “fraud”. – Wikipedia

These definitions reveal something very pertinent to the topic. The syndrome consists of the presence of persistent, nagging, chronic self-doubt. Even in the face of success, the feelings of inadequacy can’t be escaped. Well, this certainly sheds some light on the topic and made me ponder even further. You see, I think the syndrome is a much larger problem than having the occasional feeling of being inadequate. Persistent chronic feelings of inadequacy and Impostor Syndrome (as defined) would make me concerned about the presence of depression as well. This can be a very serious issue and shouldn’t be taken lightly. Seek help!

Well, What if…

What about the fact that so many of us do have feelings of inadequacy? The occasional feeling of not being good enough is not entirely a bad thing. (Again, the trick is figuring out where the line falls between chronic, persistent, nagging feelings and the occasional feelings that are OK.)

There have been many times when I have personally had the feeling of inadequacy as a Data Professional. These thoughts often creep up whenever doing something new or different or even something uncommon (but practiced). I have even had these thoughts occasionally right before speaking. I have a close friend who has these thoughts just about every time before he speaks.

When these thoughts occasionally crop up, what is done next is the important thing. How does one cope? Does the coping mechanism work? Do you practice your speaking in front of a mirror for hundreds of hours? Maybe you have a few people proofread an article you are writing. Maybe you build a proof of concept and then ask a couple of friends to test it and try to break it to make sure it works.

Coping mechanisms are great methods to assist in the removal of self-doubt. Another good coping mechanism is to get some physical activity. Physical activity can help with the chronic self doubt and is also something therapists may recommend for depression (important because the symptoms for both Impostor Syndrome and Depression are so close).

Another component of Impostor Syndrome is the fear of being discovered as a fraud. Living in fear isn’t healthy. If there is the persistent unshakable feeling of fear about being discovered as a fraud, then it is time to find help and also start troubleshooting your personal health. Here is a good starting point to help troubleshoot your health in regards to this syndrome.

One of the best coping mechanisms for Impostor Syndrome is the use of learned behaviors. Some of the learned behaviors that are great for everybody are as follows:

  1. Learn to take your mistakes in stride. Mistakes are natural. Own the mistake. Mistakes can make us better at what we do. The sooner we own a mistake, the sooner we can learn how to not make that same mistake again.
  2. Learn to accept internal validation. Not everybody needs a participation trophy. Internal validation is just as good as external validation. Also be accepting of constructive criticism.
  3. Set smaller attainable goals directed at specific skills or behaviors you wish to improve upon or learn.
  4. Learn to rely on others. Be able to reach out and ask for help.
  5. Instead of being an information hoarder, learn to disseminate knowledge. In addition, it is perfectly acceptable to admit you don’t know something and trust that you can learn it quickly or find somebody that already knows it.

Wrapping it Up

It is not abnormal to have the occasional feeling of inadequacy. In IT, and particularly among those who are successful, thoughts of being a fraud can occur. The important thing to do is recognize the thoughts and their frequency. When the thoughts start to creep in, learn to replace them with a coping mechanism or an alternate behavior. This article provided some coping mechanisms and some behaviors to try and learn.

In the case of experiencing chronic inescapable self-doubt and living in constant fear of being found out as a fraud, that is an entirely serious issue and professional help may be required.

Feel free to explore some of the other TSQL Tuesday posts I have written.

If you are in need of a little tune-up for your XE skills, I recommend reading a bit on Extended Events to get up to date. For some “back to basics” related articles, feel free to read here.

PowerShell to Backup XE Session

Quite some time ago, I shared a few articles that peeled back the top layer of how to use PowerShell (PoSh) with Extended Events (XEvents). Among those articles, I showed how to retrieve the metadata, predicates and actions, and targets (to mention a few). Those are prime examples of articles showing some of the basics, which means there is plenty of room for some deeper dive articles involving both PoSh and XEvents. One topic that can help us bridge to the deeper end of the XEvents pool is how to generate scripts for our XEvent Sessions.

In this article, I will venture to show how to generate good backup scripts of our sessions using PoSh. That said, there are some caveats to using PoSh to generate these scripts and I will share those as well.

Wading to the Deeper End!

PoSh provides many methods and classes that allow us to more easily do various things. One of my favorite uses for PoSh is to automate tasks across the enterprise database server footprint. Nothing like creating a script that can effectively hit all of the servers one manages in one fell swoop to help minimize effort and time spent touching all of the servers. After all, a prime directive of the effective DBA is to be lazy.

So, when I look at creating a script in PoSh, I also look at how I can make the script work for a single server just the same as it would for hundreds of servers. That will be one of the “nice to have” add-ons that you will see in this script. But, how did we get there in the first place?

A critical component of using PoSh to generate the scripts for each of the XEvent Sessions on the server is the GetScript() method. GetScript() can be called on many different objects within PoSh. Another critical component is the set of XEvent SMO classes that were added as of SQL Server 2012. Take a mental note of that because it will come into play in a bit.

In addition to these critical components, I used some of the stuff that I mentioned in the previous articles as well as my article on XEvent management with PoSh. As I have said many times, these articles are building blocks and do rely upon many fundamentals divulged in previous articles.

Getting into the Script

Given this is a script that interacts with SQL Server, we need to ensure the SqlServer module is installed and loaded. If it is already installed, that is great. During my testing, I found that one of the servers I ran this on locally did not have it installed. With that discovery, I figured it is just easier to implement a check. If the module is not installed, then install it.

The next major component is how I fetch my list of servers. I prefer to have all of my enterprise SQL Servers listed within SQL Server. I keep a table with all of the servers and designate various attributes for the Servers (e.g. prod or dev, business unit, decommissioned etc).
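The exact shape of that table will vary from shop to shop; a minimal illustrative sketch (all names here are hypothetical) might look like this:

```sql
-- Hypothetical server inventory table used to drive multi-server scripts
CREATE TABLE dbo.ServerInventory
(
    ServerID         int IDENTITY(1, 1) PRIMARY KEY,
    ServerName       sysname      NOT NULL,
    InstanceName     sysname      NULL,
    Environment      varchar(10)  NOT NULL,   -- e.g. PROD, DEV
    BusinessUnit     varchar(50)  NULL,
    IsDecommissioned bit          NOT NULL DEFAULT (0)
);

-- The PoSh script would pull its server list with something like this
SELECT ServerName, InstanceName
FROM dbo.ServerInventory
WHERE IsDecommissioned = 0;
```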

Once I know which servers need to be queried, I need to make sure I set my PoSh location properly.

With this script, I also have the ability to script out a specific XEvent session or to script every XEvent session deployed on the server. When scripting all sessions on the server, I opted to put all of the sessions into a single output script file. Since this script can be run against multiple servers, I name each output according to the server name and the instance on that server, followed by the XEvent session name (or all_sessions in the event all sessions are being returned).

Since I prefer to use a custom event_file path, I also wanted my script to produce the commands needed to create that custom file path. I needed to validate that this only gets added to those sessions that have an event_file target.

There are a few more nuggets throughout that help bring this whole thing together. This is what the completed script looks like.

If you follow along on that script, you will see that I concatenate the sessions together and then force overwrite the output file. This ensures that I am not appending to an existing file and also ensures that I have the most current version of the XEvent session script. I configure this to output to the local server from where the script is executed.

Caveats

Using PoSh to generate scripts is extremely handy, especially when working across numerous servers. That said, there are certain problems of which one must be aware. First is that the XEvent SMO classes for PoSh were introduced as of SQL Server 2012. These objects are not backwards compatible.

The second issue I have may be less of a concern to you than to me, but it goes well with concerns I mentioned in the article I wrote about how to do this same task in TSQL. The script generated via PoSh adds the events in alphabetical order. I have also seen some cases where the predicate is not in the exact order in which it was originally created. Predicate order is essential! Event order may not be essential for anything other than source control or for the demanding OCD DBA. Your mileage may vary.

Final Thoughts

Bringing PoSh to the world of XEvents can open up your eyes to a world of better automation and DBA efficiency as you manage your enterprise. PoSh also can help us dive deeper into the world of XEvents as we will see in some upcoming articles.

Working with Extended Events will help you become a better DBA. Working with PoSh can also help you in many various tasks to become a better DBA. Combine the two and you just might have a super weapon.

Interested in exploring the vast world of Extended Events? Check these out! The library of articles is pretty large and continues to grow.

Unicode, Emojis and Databases Oh My!

Categories: News, Professional, Scripts, SSC
Published on: January 5, 2020

Over the past several (or pervious if you have been following along) articles, I have explored the use of special characters in certain database scenarios. Using these special characters brings a certain fun and flair to your humdrum naming schemes. In addition to the fun, there comes a little bit of extra peculiarity to ensure everything works as expected.

While figuring out some examples, I found myself searching for good reliable sources constantly. I also found myself trying to find various emojis that would work well to represent one idea or another. Sometimes, the effort paid off, and sometimes it seemed frivolous.

If only there was an easy, comprehensive source to search through all available Unicode characters for SQL Server. Then a thought occurred to me. Why not create my own database with these characters?

Laying the Groundwork

After determining that a database would work best locally to make it easier to research these characters, I quickly determined that I would need to figure out how to write a PowerShell script so I could scrape a list of characters from a website. That should be easy, right?

With a sample such as that, I just need to figure out how to pull the image for each character and the Unicode value of each. The next task is to figure out where I can find some decent PowerShell examples on the web that perform a similar task.

In addition to figuring out how to do the powershell end of this, there is the database side of things. What should the table design be for this kind of data? Which values and attributes should be stored? Am I overthinking this? Is this too much work?

Well, this is indeed far too much work. Especially given this awesome gem I found while trying to find the exact reference link I was looking for in one of the pervious articles. You see, as it turns out Solomon Rutzky had already done all (and I really do mean all) of the heavy lifting for this when he wrote a function that will do exactly what I was looking to do – here. The function that Solomon created will return all 188,657 code points that are usable in SQL Server.

Now, instead of building out a script to import the data from somewhere else, I just need to employ the use of this fabulous script and move on with all the oodles of time I was spared.

Put a bow on it

Playing around with emojis in a database is a fun endeavor. Not only is it fun to play with for personal growth, but it does have some business advantages. We live in an era where these emojis are popping up everywhere and are even being stored long term in many databases. I view it as a great opportunity to improve your skill set and better your career for the future.

Interested in learning about some deep technical information instead? Check these out!

Want to learn more about your indexes? Try this index maintenance article or this index size article.

This is the twelfth article in the 2019 “12 Days of Christmas” series. For the full list of articles, please visit this page.

*Previous has been intentionally mis-spelled throughout this article as “pervious” as an ode to a fat finger mistake in a recent article.

Creative XE Sessions – Funky Data

In the previous article, I explored the opportunity of using special characters (or emojis) within Extended Event Sessions. Creating a session with these characters in the name is not terribly complicated. However, it is a bit more of a pain on the data review side (as mentioned in the previous article).

Just because it gets to be a little more complicated is seldom a deterrent for somebody such as myself. In fact, many times it is more of a challenge. These challenges are fun. With this particular challenge, and as I promised, we are going to see how we can solve the problems posed by having emojis within the object names (which causes pseudo funky data to be trapped in the session data).

Get the Funk Out!

As was demonstrated in the last article, when we have special characters in the database names and that data is captured in an XE Session, we end up with some smelly funky cheese type data that is fairly unpleasant to our desires. We have to do something quickly with that funky data in order to make it useful.

Unlike funky cheese that may have turned, we are not going to discard this data as being unwanted or un-useful. We still have a use for it. We just need to transform it a little bit before it becomes a little more palatable.

Let’s go back to a query we used to parse the session data from that last article. We need to modify this query to make it more capable of handling this new data experience.

Knowing that we have certain patterns that need to be recognized and altered, we can take advantage of the replace function. The first two patterns to find are: “&#” and then “;“. The next step is a little less known. What do we replace those known patterns with? Well, as it turns out, and as is shown in the above query results window, we know that we are looking for two surrogate code points (High and Low) that represent the character in question. When we do a little TSQL trickery and use NCHAR with those surrogate values, we can render the character to display in the desired format. Let’s see that in query form (it will help that explanation quite a bit).
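A small sketch of that pattern follows. The literal below stands in for the funky value captured by the session, and the surrogate pair shown is for U+1F47B (the 👻 character):

```sql
-- REPLACE strips the &# and ; noise; NCHAR turns the surrogate pair back
-- into the character. On a database whose default collation is
-- supplementary-character aware, NCHAR(0x1F47B) can return it directly.
DECLARE @funky nvarchar(20) = N'&#xD83D;&#xDC7B;';

SELECT REPLACE(REPLACE(@funky, N'&#', N''), N';', N' ') AS cleaned_codepoints,
       NCHAR(0xD83D) + NCHAR(0xDC7B)                    AS rendered_character;
```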

The results of this query will look like the following.

Easy peasy, right? Well, sort of easy. We can easily render that funky data we are seeing in the session details to something more human friendly. However, in order to display that data in a human-friendly format, those NCHAR() expressions have to be executed. This means we have to do something a little more with the query – we have to convert part of the query into a dynamic SQL statement. Here is how we can do that!

In this new version of the query, I have a few things going on. First, I am pattern matching and doing a replacement for those aforementioned terms. Next, I am dumping the results into a temp table and removing some extra data (unnecessary stuff due to the pattern matching and replacement). From the temp table, I then build a dynamic query to pull it all together and use that NCHAR trick to put the two surrogate values together to visually represent the character we expect. In the end, I have results that look like this.

Now, from a TSQL query, I have a visual representation of the Database Name that matches what I should be seeing within the SSMS object tree. This makes more logical sense to the person reviewing the data and is far easier to correlate to the correct database.

What about the histogram target that was demonstrated in that other article? Let’s look at the changes for that one as well.

The results are now similar to those we fetched for the event_file target (similar in display format). Again, this makes it easier to digest for us humans.

A little more effort on the preparation end with the TSQL queries makes for an easier time going forward on the data review end of these XEvent Sessions.

Put a bow on it

Playing around with emojis in a database is a fun endeavor. Not only is it fun to play with for personal growth, but it does have some business advantages. We live in an era where these emojis are popping up everywhere and are even being stored long term in many databases. I view it as a great opportunity to improve your skill set and better your career for the future.

Interested in learning about some deep technical information instead? Check these out!

Want to learn more about your indexes? Try this index maintenance article or this index size article.

This is the eleventh article in the 2019 “12 Days of Christmas” series. For the full list of articles, please visit this page.

Creative Extended Event Sessions

In the previous article, I showed how to take advantage of collations to add more unique and interesting names to databases. In that article, I alluded to the better collation to use but failed to be very explicit about it.

As Solomon Rutzky pleads in many articles, the best default collation to use is not any collation that starts with SQL; rather, the collation one should use is Latin1_General_100_CI_AS_SC. I agree. Let’s not use the old default collations anymore and start using more modern collations – even if Microsoft does not change the default, you should change the defaults in your builds to a better, more modern collation!

Fun with Sessions

Let’s suppose I strongly suspect I am having an issue with the ghost cleanup process for a specific database called ‘👻’. I want to prove whether the suspicions are accurate or not so I resort to an old tried and true XE session to validate. Here is that session.
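A hedged sketch of such a session – the ghost_cleanup event, bucketed by database name into a histogram target – might look like this (the session name and options are illustrative):

```sql
-- Track ghost cleanup activity, bucketed by database name
CREATE EVENT SESSION [GhostCleanupTracking] ON SERVER
ADD EVENT sqlserver.ghost_cleanup
    (ACTION (sqlserver.database_name)
     WHERE (sqlserver.database_name = N'👻'))
ADD TARGET package0.histogram
    (SET filtering_event_name = N'sqlserver.ghost_cleanup',
         source_type = (1),                 -- 1 = bucket on an action
         source = N'sqlserver.database_name')
WITH (STARTUP_STATE = OFF);
GO
ALTER EVENT SESSION [GhostCleanupTracking] ON SERVER STATE = START;
```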

Now, let’s try to validate my suspicions.
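For the workload, a simple stand-in is to load a throwaway table in the 👻 database and then delete all of its rows, which leaves plenty of ghost records behind for the cleanup process (table name and row count are arbitrary):

```sql
USE [👻];
GO
-- Throwaway table; the DELETE leaves ghost records behind for cleanup
CREATE TABLE dbo.GhostBait
(
    id     int IDENTITY(1, 1) PRIMARY KEY,
    filler char(500) NOT NULL DEFAULT ('x')
);
GO
INSERT INTO dbo.GhostBait DEFAULT VALUES;
GO 100000
DELETE FROM dbo.GhostBait;   -- generates ghost records on the affected pages
```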

The best part is yet to come. With the event session running, I need to execute the preceding query. The query may take several minutes to complete – don’t worry yet. After the query completes, it is time to check the event session data to try and confirm the suspicions.

This is where things start to get a little interesting! If I try to open the session data to view it from the GUI, I will see something like the following.

Well, that is not very useful! Let’s try it from a TSQL query instead!
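A typical shredding query for the histogram target (using the session name from the sketch above) looks something like this:

```sql
-- Shred the histogram target for the ghost cleanup session
SELECT xed.slot_data.value('(value)[1]', 'nvarchar(256)') AS database_name,
       xed.slot_data.value('@count', 'bigint')            AS event_count
FROM sys.dm_xe_sessions AS s
INNER JOIN sys.dm_xe_session_targets AS t
    ON t.event_session_address = s.address
CROSS APPLY (SELECT CAST(t.target_data AS xml)) AS tab(target_xml)
CROSS APPLY tab.target_xml.nodes('HistogramTarget/Slot') AS xed(slot_data)
WHERE s.name = N'GhostCleanupTracking'
  AND t.target_name = N'histogram';
```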

Well, that seems to be a little better. At least I get some data back at this point. Sadly, the database name is poorly displayed. That said, the database name that is displayed is accurate – just not very useful. As it turns out, XE is lagging a bit behind in the ability to display characters from certain code points (seemingly any that require surrogate pairs or fall above a certain range). I have not yet tested the full limitation, just understand that there is a limitation.

What if I tried a different target such as an event_file target?

The event session will start successfully. In addition, I can confirm that the event file is created on disk.

Then I will rerun the experiment to test the ghost_cleanup process against the 💩 database. If I query the event session with TSQL, I will see something like this.
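A hedged sketch of reading that event_file target with TSQL looks like the following (the file path is a placeholder for wherever the target writes):

```sql
-- Read the event_file target and pull out the database name action
SELECT x.event_xml.value('(event/@timestamp)[1]', 'datetime2') AS event_time,
       x.event_xml.value('(event/action[@name="database_name"]/value)[1]',
                         'nvarchar(256)')                      AS database_name
FROM sys.fn_xe_file_target_read_file(
         N'C:\Database\XE\GhostCleanupTracking*.xel', NULL, NULL, NULL) AS f
CROSS APPLY (SELECT CAST(f.event_data AS xml)) AS x(event_xml);
```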

If I look at the event session from within the GUI, I will see something slightly more usable.

Peeking at the file data (instead of displaying it in tabular format) from tsql commands, I will see something like this.

Stay tuned to see how we can resolve this issue where the 💩 and 👻 are not being translated properly in these TSQL queries for these sessions.

Put a bow on it

A picture is worth a thousand words, right? Sometimes, a picture for an event session just may be able to say it better than 50-60 letters used to try and describe the session. Maybe you just want to try it out for fun. Either way, the use of these characters in an event session can be a little difficult when looking to review the data.

Interested in learning about some deep technical information instead? Check these out!

Want to learn more about your indexes? Try this index maintenance article or this index size article.

This is the tenth article in the 2019 “12 Days of Christmas” series. For the full list of articles, please visit this page.
