T-SQL Tuesday #108: New Horizons Beyond SQL Server

Published on: November 13, 2018

There comes a point in one’s career when a change is requisite. Big or small, there always seems to be a tipping point that mandates some sort of change. Maybe the change is an entirely new career field. Maybe the change is adapting to the ever-improving features of a specific software product. Maybe that change means learning a tangential technology.

This is precisely the goal Malathi Mahadevan (b | t) seems to have envisioned for the 108th installment of TSQL Tuesday.

If you are interested in reading the original invite, you can find that here.

“So the challenge for this T-SQL Tuesday is – pick one thing you want to learn that is not SQL Server. Write down ways and means to learn it and add it as another skill to your resume. If you are already learning it or know it – explain how you got there and how it has helped you. Your experience may help many others looking for guidance on this.”

Personally, I am not one to settle, so learning and improving are important. New technologies, changes in technologies, new features, tangential technologies – these are all ways to continue to learn and improve, most of the time. Sometimes, a new technology offers a good change of pace: an exit from something that is becoming too standard, and an entrance to something different, difficult, exciting, and expanding.

Through the year (2018), I created a few goals for myself around some of these new or different technologies:

  1. Become proficient at MySQL (maybe even certify who knows)
  2. Become proficient at PowerShell
  3. Work towards the TCM (I got too busy with 1 and 2 to devote any time to this one)

Proficient is sort of a vague term because it can have a different meaning to different people. For me, I will describe what I have been doing to become proficient in both PoSh and MySQL.

PowerShell

A shout out is absolutely necessary for Adam Machanic (twitter) for picking the right blog meme that has been able to survive so long in the SQLFamily. This party has helped many people figure out fresh topics as well as enabled them to continue to learn.

I have dabbled in PowerShell over the years – just enough to “be dangerous” as some may say. I wouldn’t call it proficient and probably not even dangerous really. I was able to muddle my way through being able to create some basic scripts to perform certain tasks.

This rudimentary ability just isn’t enough to put the skill on a resume (imho). It certainly wasn’t enough skill to manage a large group of servers and perform various enterprise-type tasks. My objective was to create a set of tools for myself that I could use with repeatable success and great ease.

I sat down with my first set of various tasks I wanted to be able to perform and worked on them as I set time aside to learn PoSh better – every day. It took several weeks and by no means am I on the same level as Rob Sewell (b | t) or Chrissy LeMaire (b | t) or Sean McCown (b | t). That said, I do feel I am far more capable in PoSh now than I was last year.

This is a skill that I plan to continue to hone. I am learning every time I pick it up and try to do something different. It is good that I am able to keep learning. Next year, I plan on being able to say I am more proficient than I am now. I am also hopeful that I will be good enough to properly contribute to the dbachecks project. I also hope to share some of the scripts I have created on my blog.

Here are some of my first dabbles with powershell that I have integrated into other blog posts. Yes, they are very rudimentary.

MySQL

Learning MySQL is probably not too big of a stretch to be honest. I certainly think it is far more similar to SQL Server in many regards than PoSh. After all, it is just another little DBMS and does happen to be rather popular.

I wanted to pick up MySQL so I could support clients that have it installed here, there, and everywhere in their environments. Many clients have more than one DBMS platform, and it is good to understand and be able to administer multiple platforms with a high level of competence. Unfortunately, MySQL comes with a fair number of gotchas. There are serious limitations depending on version and flavor. Some clients may be stuck on a rather old version of MariaDB (akin to SQL 2000). This means developing a broad set of scripts and skills quickly and on the fly.

I have a ways to go in my learning due to the varied flavors of MySQL, but I am getting there. I do feel pretty comfortable hopping in and troubleshooting performance issues and doing a quick health assessment at this point. I would call that proficient. Similar to what I said about PoSh, I plan on being able to say next year that I am more proficient. More opportunity to work with this platform will lend itself to better proficiency.

The Wrap

I believe in continuous integration / improvement when it comes to personal growth and development. It is necessary to keep your personal skills sharp as well as keep yourself marketable.

Oh, and if you are interested in some of my community contributions (which, according to Jens Vestergaard, is an awesome contribution), read this series I have published.

T-SQL Tuesday #104: Just Can’t Cut That Cord

We all have our favorite scripts, tools, or utilities. Those are the things that help make our jobs easier. Some of us may have an unhealthy relationship with some of those scripts (similar in nature to the relationship many have with their phone). Whether or not the need to cut that proverbial cord exists, today we are not discussing the health of that dependence. Suffice it to say, sometimes we simply need to upgrade our scripts. How else can we get better scripts or make our scripts better? By sharing them.

This is precisely the goal Bert Wagner (b | t) seems to have envisioned for the 104th installment of TSQL Tuesday.

If you are interested in reading the original invite, you can find that here.

“For this month’s T-SQL Tuesday, I want you to write about code you’ve written that you would hate to live without.

Maybe you built a maintenance script to free up disk space, wrote a query to gather system stats for monitoring, or coded some PowerShell to clean up string data.  Your work doesn’t need to be completely original either – maybe you’ve improved the code in some open source project to better solve the problem for your particular situation.”

There is a high probability that, through the sharing of your script, somebody out there can benefit from it. In addition, it is very likely that somebody will make a suggestion to help make your script better. Worst case (emphasis on worst case here), you have the script stored somewhere with half-decent instructions on what it does, easily accessible for you to use again and again. Just in case you forget you have it out there – you can google for it again and find it on your own blog ;).

Personally, I have been able to find and re-use some of my older scripts. Not only do I get to re-discover them, but I also get to re-imagine a new use or improvement for the script.

Brief Intermission

A shout out is absolutely necessary for Adam Machanic (twitter) for picking the right blog meme that has been able to survive so long in the SQLFamily. This party has helped many people figure out fresh topics as well as enabled them to continue to learn.

Easy Access

While pondering the topic for today, it occurred to me how frequently I already post a script on my blog. An easy out for this topic would have been to re-share one of those old scripts. For instance, I could easily redo a recent article about server access that has a couple of scripts demonstrated in it. Or I could go back a few years to my articles about foreign keys (here or here) and space use (here or here). Even more intriguing could be to re-envision some of my articles on Extended Events. But where would the fun be in that?

Rather than take the easy road and rehash something, I have something different. This one goes hand in hand with the numerous articles and scripts I have previously provided on auditing – yet it is different.

Not every shop can afford third-party software or even Enterprise Edition, so they have to come up with a different way to audit their database instances. One of the problems with a home-grown solution is ensuring the data is not stored locally on the server (there are lots of good reasons for that). Here is an example of what I did for one client. A developer there had found a back door that gave him SA access to the SQL Server instance, and he was changing things and trying to cover his tracks – even after being warned.

First the query

This query will be run from a job on a different server whose access is restricted to just a select few people. I rely on the default trace in this query. I am also relying on a little bit of sneaky behavior: if I run this from a separate server, prying eyes are unlikely to find that it is running, which makes it easier to catch them red-handed. In addition, if they discover (via some sort of trace and a lot of luck) that it is running, they have no access to the remote server to alter anything that was captured.

The query does go out to the default trace and pull back any changes to permissions or principals on the server in question. The captured data is then stored in a database that is also restricted to a select few people. Lastly, the captured data can be routinely queried, or automated reports can be created to send email notifications of changes encountered.

The second part of the trickery here is that I am using a linked server to perform the queries (with a slight change, I could also do this via PowerShell, which will be shown in a future article). The linked server query uses the OPENQUERY format and sends the default trace query to the remote server. Since I am running this from a job on an administrative server and pulling a limited data set, I am not overly concerned with the linked server setup here.
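As a rough sketch of the idea (using the PowerShell variant hinted at above rather than the exact linked-server query from the job), the collection could look something like this. The server names, database, and table are placeholders, and Invoke-Sqlcmd and Write-SqlTableData come from the SqlServer module.

# Pull permission/principal changes from the monitored instance's default trace
# and stash them on the restricted administrative instance.
$monitored = 'TARGETSQL01'   # instance being watched (placeholder)
$adminSql  = 'ADMINSQL01'    # restricted administrative instance (placeholder)

$pullQuery = @"
DECLARE @path NVARCHAR(260);
SELECT @path = [path] FROM sys.traces WHERE is_default = 1;

SELECT  te.name AS EventName, tg.LoginName, tg.TargetLoginName, tg.DatabaseName,
        tg.ObjectName, tg.HostName, tg.ApplicationName, tg.StartTime
FROM    sys.fn_trace_gettable(@path, DEFAULT) AS tg
        INNER JOIN sys.trace_events AS te ON tg.EventClass = te.trace_event_id
WHERE   te.name LIKE 'Audit Add%' OR te.name LIKE '%GDR%';  -- principal/permission changes
"@

$changes = Invoke-Sqlcmd -ServerInstance $monitored -Query $pullQuery

if ($changes) {
    # Bulk copy the rows into the audit table on the administrative instance.
    $changes | Write-SqlTableData -ServerInstance $adminSql -DatabaseName 'AdminDB' `
        -SchemaName 'dbo' -TableName 'AuditPrincipalChanges'
}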

Storing It

Once I query the data, I need to put it somewhere on my administrative server. The table setup for that is very straightforward.
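For illustration, here is one plausible shape for that table, created from the administrative server via Invoke-Sqlcmd. The column names simply mirror the default trace columns pulled in the sketch above and are an assumption, not the exact definition from the original job.

$tableDdl = @"
CREATE TABLE dbo.AuditPrincipalChanges
(
    EventName       NVARCHAR(128)  NULL,
    LoginName       NVARCHAR(256)  NULL,
    TargetLoginName NVARCHAR(256)  NULL,
    DatabaseName    NVARCHAR(128)  NULL,
    ObjectName      NVARCHAR(256)  NULL,
    HostName        NVARCHAR(128)  NULL,
    ApplicationName NVARCHAR(256)  NULL,
    StartTime       DATETIME       NULL
);
"@

# 'ADMINSQL01' and 'AdminDB' are the same placeholder names used in the collection sketch.
Invoke-Sqlcmd -ServerInstance 'ADMINSQL01' -Database 'AdminDB' -Query $tableDdl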

After creating this table, I am ready to store the data. All I need to do is throw the audit query into an agent job and schedule it to run on a regular schedule. For my purposes, I usually only run it once a day.

The Wrap

This has been my diatribe about service and giving back to the community. When done properly, there is a natural effect of enhancing one’s personal life in some proportion to the amount of effort given to the community.

Oh, and if you are interested in some of my community contributions (which, according to Jens Vestergaard, is an awesome contribution), read this series I have published.

Single User Mode – Back to Basics

In a recent article, I took you on a trip through how to hack (ethically) a SQL Server to regain sysadmin access. In that article, I made quick mention of restarting SQL Server into single-user mode. It only makes sense to show quickly how to get into single-user mode.

Before getting into that, I do hope that there is something you will be able to learn from this basics article. If you are curious, there are more basics articles on my blog – here.

Single-User

So, what exactly is this single-user mode thing? Single-user mode is basically the official back-door into SQL Server for various reasons such as:

  • Somebody deleted all of the logins that were in the sysadmin role.
  • The sa account is disabled or the password has been forgotten.
  • Somebody deleted any Windows groups that were members of the sysadmin role.
  • All members of the sysadmin role are no longer with the company.
  • You need to restore the master database
  • You want to keep SQL Server all to yourself because you are greedy!

These are some pretty solid reasons to need to be able to use the back door. But how exactly do we get to the back door?

Two Paths

As luck would have it, there are two ways to enable single-user mode. You can either get there by making some changes to the SQL Server service in Configuration Manager, or you can utilize a command prompt. I won’t cover the GUI path beyond the gentle reminder that you must remember to undo your change when using that method.

My preferred method is through the command line. Using my SQL Server 2017 as the experiment, I would navigate to the Binn directory for that instance. In this case, as shown in the next image.

Before getting too far ahead of myself, I am going to stop my SQL Server.

Notice, I also queried to find all of my services related to SQL before stopping the MSSQLServer service via the net stop mssqlserver command. We will come back to some net start and net stop commands later.
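The equivalent commands are simple enough to sketch here (MSSQLSERVER assumes a default instance):

Get-Service -Name *SQL* | Select-Object Name, Status   # see what is running first
net stop MSSQLSERVER                                    # stop the default instance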

With the service successfully stopped, I can now restart the service in single-user mode.

And then the validation that we are indeed starting in single-user mode…

But wait, did you notice that bit of trickery on the startup command?

This is a pro-tip for when you must use single-user mode. Inevitably, somebody will steal the single-user connection and you will be locked out of the session. By using an app name after the single-user switch, you are telling SQL Server to only accept connections from that specific application. Since most apps will not be using sqlcmd, you will have far less contention for that connection and you will be able to complete your task much more easily.

You could also pass something like this instead…

In this case, I would be limiting the connections to a query from SSMS (and not object explorer).
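Here is a sketch of both variants described above, run from the instance's Binn directory (the path below is an example for a default SQL Server 2017 instance; adjust it for your build and instance):

Set-Location 'C:\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\Binn'

# Single-user mode, limited to connections whose application name is sqlcmd:
.\sqlservr.exe -m"SQLCMD"

# ...or limited to an SSMS query window instead:
.\sqlservr.exe -m"Microsoft SQL Server Management Studio - Query"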

Now that I have a single-user connection, I can add a sysadmin or restore the master database or just sit on it and play devious. It all depends on what your objective for the single-user session happens to be.

More Command Line

Remember that reference to the NET commands? Well, it turns out we can also start SQL Server in single-user via net start. Let’s check it out.

The command is pretty simple:
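Roughly speaking, it is just the service name plus the /m switch, which passes the single-user startup option through to the service (the optional application name works the same way as shown earlier):

net start MSSQLSERVER /m"SQLCMD"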

The effect here is the same as navigating to the Binn directory and starting SQL Server with the sqlservr.exe executable. The big difference is considerably less typing and less verbose output of the service startup.

When using the net start method, you do need to know the service name of the SQL Server instance. To get that, I recommend the following PowerShell script.
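Something along these lines works; the display-name filter is just one easy way to narrow the list:

Get-Service |
    Where-Object { $_.DisplayName -like '*SQL*' } |
    Select-Object Name, DisplayName, Status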

This will produce results similar to the following.

From the results, I can pick the SQL Server service and then pass that to the net start command fairly easily.

The Wrap

Starting SQL Server in single-user mode should be a tool every data professional holds in the bag. This is an essential tool that can be used in multiple scenarios and ensure you are able to fully maintain and control your server. I have shown how to get to single-user mode via two command line methods and mentioned a GUI method. The nice thing about the command line methods is that you don’t have to remember to undo the startup switch like you do with the GUI method.

If you feel the need to read more about single-user mode, here is an article and another on the topic.

This has been another post in the back to basics series. Other topics in the series include (but are not limited to): Backups, backup history and user logins.

Syspolicy Phantom Health Records

SQL Server comes with a default SQL agent job installed (for most installations) to help manage the collection of system health data. I would dare say this job is ignored by most people and few probably even know it exists.

This topic is not new to me. You may recall a previous article I wrote entailing one type of failure and how to resolve that failure. That article can be found here.

I recently ran into a variant of the problem outlined in that previous article, and it requires just a bit of a different approach. The errors turn out to be similar on the surface but are really different upon closer inspection.

Phantom Health Records

If you are unfamiliar with the topic, I recommend reading the previous article. Then after reading that article, maybe brush up a little bit on the SQL Agent. The failures we will be looking at are within the SQL Agent and come from the job called: syspolicy_purge_history.

For this latest bout of failure, I will start basically where I left off in the last article. The job fails on a server, and I have access to other servers of the same SQL version where the job works.

Before diving any further into this problem, let’s look at what the error is:

A job step received an error at line 1 in a PowerShell script.
The corresponding line is ‘set-executionpolicy RemoteSigned -scope process -Force’.
Correct the script and reschedule the job. The error information returned by PowerShell is:
‘Security error. ‘. Process Exit Code -1. The step failed.

Having the error in hand, and knowing that the job works elsewhere, my next logical step (again based on experience from the last article with this job) is to script the job from another server and use it to replace the job on the server where it fails. In principle this is an AWESOME idea.

 

Sadly, that idea was met with initial failure. As it turns out, the error remained exactly the same. This is good and unfortunate at the same time. Good in that I was able to confirm that the job was correctly configured with the following script in the job:
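On the versions I have checked, the PowerShell step of syspolicy_purge_history looks roughly like the following; treat it as a from-memory reference rather than a guaranteed match for your build.

# The $(ESCAPE_*) pieces are SQL Agent token replacements, resolved by the Agent before the step runs.
if ('$(ESCAPE_SQUOTE(INST))' -eq 'MSSQLSERVER') { $a = '\DEFAULT' } else { $a = '' }
(Get-Item SQLSERVER:\SQLPolicy\$(ESCAPE_NONE(SRVR))$a).EraseSystemHealthPhantomRecords()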

Since the step fails from SQL Server, let’s see what else we can do to make it run. Let’s take that code and try it from the PowerShell ISE. So, for giggles, let’s cram that script into PowerShell and see what blows up!

Now isn’t that a charming result! This result should be somewhat expected, since the code I just threw into the ISE is not entirely PowerShell. If you look closer at the code, you will notice that it is using sqlcmd-like conventions (the SQL Agent token replacements) to execute a parameterized PowerShell script. Now, that makes perfect sense, right? So let’s clean it up to look like a standard PoSH script. We need to replace some parameters and then try again.
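With the agent tokens swapped for literal values (the server and instance names here are placeholders for illustration), the “standard PoSH” version looks something like this:

# Hard-coded stand-ins for the $(ESCAPE_*) agent tokens: default instance on MYSERVER.
$instance = 'MSSQLSERVER'
$server   = 'MYSERVER'

if ($instance -eq 'MSSQLSERVER') { $a = '\DEFAULT' } else { $a = '' }

# Without the SQLPS/SqlServer module loaded, the ISE has no SQLSERVER: provider,
# which is exactly the "drive" complaint discussed next.
(Get-Item SQLSERVER:\SQLPolicy\$server$a).EraseSystemHealthPhantomRecords()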

This will result in the following (resume reading after you scratch your head for a moment):

The key to this failure happens to be the SQLSERVER: portion of the path. PoSH thinks we are trying to reference a drive letter when we are just trying to access the SQLServer provider. Depending on your version of the server OS, SQL Server, and PoSH, you may need to do one of a couple of different things. For this particular client/issue, this is what I had to do to get the script to work.

If you read the previous article, you may notice this command looks very much like the command that was causing the problems detailed in that previous article. Yes, we have just completed a 180-degree return to where we started a few years back. Suffice it to say, this is expected to be a temporary fix until we are able to update the system to PoSH 5 and install the updated SqlServer module.

As is, this script alone is not quite enough to make the job succeed. To finish that off, I created a ps1 file to house the script. Then, from a new step (defined as a sqlcmd step type) in the syspolicy purge job, I execute that PowerShell script as follows:
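Whatever the exact step type, the net effect is to launch powershell.exe against the saved file, along these lines (the path is hypothetical):

powershell.exe -ExecutionPolicy RemoteSigned -NoProfile -File "C:\Scripts\syspolicy_purge_fix.ps1"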

Tada, the nuisance job failure alert is resolved and the system is functioning again.

Conclusion

I dare say the quickest resolution for this job is probably just to disable it. I have seen numerous servers with this job disabled entirely, for the simple reason that it fails frequently and just creates noise alerts when it fails. Too many fixes abound for this particular job and too few resolve the failures permanently.

I would generally err on the side of fixing the job. Worst case, you learn 1000 ways of what not to do to fix it. 😉

Given this job is tightly related to the system_health black box sessions (sp_server_diagnostics and system_health xe session), I recommend fixing the job. In addition, I also recommend reading the following series about XE and some of those black box recorder sessions – here.

Profiler for Extended Events: Quick Settings

Not long ago, I wrote a rather long article about a new-ish feature within SQL Server Management Studio (SSMS) that impacted Extended Events. You can read that book – here! The XEvents Profiler feature is one of those things that you may or may not use. If you consider using the feature, I do believe it is important that you research it a bit and try to learn the pros and cons first.

With that there is a little more about the feature that the aforementioned book did not cover. In fact, this information has pretty much gone ignored and mostly stays hidden under the covers.

Settings

As of SSMS 17.4 we have been given the ability to control XEvents Profiler just a tiny bit more. For what it is worth, we as Database Professionals love to be able to control our database environment. So this teeny tiny bit of new control ability is potentially a huge win, right?

If you are the controlling type, or maybe just the curious type, you will be pleased to know that under “Options” from the Tools menu in SSMS, Microsoft has tucked some new control options to help you configure XEvents Profiler – to a degree. If you open options, you will see this new node.

If you expand the “XEvent Profiler” node (circled in red), you will discover the “options” node. If you click on this “options” node and do a quick comparison (in SSMS 17.4 and SSMS 17.5), you will also find that you don’t need to expand the “XEvent Profiler” node at all because the options are listed in the right-hand pane for both nodes and they are exactly the same. So, choose one or the other and you will end up at the same place.

The options that you currently have are:

  • Stop Session on Viewer Closed
  • Toolbar commands stop and restart

You can either set these options to True or False. I recommend you play with them a bit to discover which you really prefer. That said, I do prefer to have “Stop Session on Viewer Closed” set to true. There is “profiler” in the name of the feature, after all. And if you have read the “book” I wrote about this feature, you would know that the filtering offered by the default sessions of this feature basically turns on the fire hose effect and can have a negative impact on your server. Are you sure you want a profiler-style fire hose running on your production server?

Conclusion

There surely will continue to be more development around this idea of an XE style profiler. More development generally means that the product will mature and get better over time. This article shows how there is more being added to the feature to try and give you better control over the tool. We love control so the addition of these options is actually a good thing. Is it enough to sway me away from using the already established, more mature, and high performing tools that have been there for several generations? Nope! I will continue to use TSQL and the GUI tools available for XE that predated the XEvent Profiler.

Some say this is a way of bridging the gap. In my opinion, that gap was already bridged with the GUI that has been available for several years. Some say that maybe this tool needs to integrate a way to shred XML faster. To that, I say there are methods already available for that such as Powershell, the live data viewer, the Target Data viewer, or even my tools I have provided in the 60 day series.

I would challenge those that are still unfamiliar with the XE GUI (out for nearly 6 years now) to go and read some of my articles or articles by Jonathan Kehayias about the power that is in XE as well as some of the power in the GUI.

An Identity Crisis: Is it Profiler or is it Extended Events?

I have been working on this article for far longer than one might think is necessary for something of this nature. Truth be told, I was reluctant because there are so many people that quickly jumped on the wagon to write about this feature as soon as it was released. With some coercing, here the article is finally!

What is this newfangled feature? Well, that is kind of the problem. The name has already changed in the second release – be it ever so slightly. The feature is XE Profiler or XEvents Profiler and in theory it is supposed to give us something new in management studio.

Fact or Fiction

This Extended Events Profiler is a brand spanking new feature (ok, so the bouncing baby is a few months old now and no longer a newborn). Despite it being new, it seems there are a handful of “facts” already published about it that may or may not be accurate. Due to that, I want to play a little game of fact or fiction with it in this article.

Here is the short list from which we will work:

  1. Sessions (XE Profiler Sessions) are Customizable
  2. The default standard session displays all Extended Events
  3. It works with instances of SQL 2012 or greater only
  4. XE Profiler Adds ability to quickly start/stop sessions
  5. XE Profiler provides a new live view of the Event Session
  6. XE Profiler provides a GUI for Extended Events
  7. Templates were not available in XE until the release of XE Profiler
  8. Similarly, XE Profiler also brought the Profiler templates with it as part of the feature release
  9. A DBA can now do something that was not previously possible to do (quickly start a session)

That is a rather wealthy list of “facts” to be checked for validity, if you ask me. Let’s go ahead and start diving into each of these. I will not be following the provided list order. Rather, I will be looking at the items that seem appropriate to validate in my own special order. Rest assured though, we will get through the list.

GUI

I don’t know how this is still a misconception about Extended Events, but it is certainly still out there. XE Profiler comes along and people suddenly think that this tool has provided the GUI that everybody has been clamoring for since Extended Events was released in 2008 (yes, almost 10 years ago).

I am not going to waste much time on this because it is a complete and total piece of fiction. The GUI has been readily and easily available since SQL Server 2012. In addition to that, there was a GUI that could have been installed for the older versions that was created by Jonathan Kehayias.

From SSMS 2014, here is a sample screenshot of the GUI wizard followed by a standard GUI screen.

As you can see, there is clearly a GUI prior to the release of SSMS 17.3.

Result: Fiction

Templates

This is probably the next easiest notion to prove or disprove. I have heard from more than a single source that templates are brand new with SSMS 17.3 (which is when XE Profiler was released).

I don’t really need to go very far or work very hard to prove this notion as a complete work of fiction. I have SSMS 2014, 2016, 17.0, 17.3, and 17.4 readily available to me. I am going to start with SSMS 2014.

If I  once again open the GUI for XEvents in SSMS 2014, I can easily see the available templates in that version.

It looks pretty obvious there that templates were introduced well before SSMS 17.3.

I can also see all of the templates available by browsing out to the file system.

And for good measure, here is one from a 2012 instance.

The same templates (minus possible changes within the template definitions) are available in SSMS 2012, 2014 and 2016. We will see those same templates available in SSMS 17 and above as well – in addition to a few that have been added.

I want to also add here that these are just the default templates. Just like in Profiler, you can create and save your own templates for future use.

Result: Fiction

Profiler Templates

We just saw how there are templates available prior to the release of SSMS 17.3. We can also see that the profiler templates are not available in those prior versions of SSMS. So does that also mean that the profiler templates were only released as a part of SSMS 17.3? We best be careful with that slippery slope of assumption. Let’s take a closer look at SSMS 17.0.

And just so we can confirm that this is indeed a version of SSMS that pre-dates 17.3…

As you can see there, this is RC3 of SSMS 17.0 so it is definitely a version that precedes 17.3

And similarly from a peek inside the folder on disk:

Seems to me we have ample evidence that the profiler templates were made available to the XE GUI well in advance of SSMS 17.3

Result: Fiction

Event Session Live View

One of the most widely used features of Profiler is the ability to watch the data as it streams through the session. According to many, this ability is missing in Extended Events, and the XE Profiler feature has now brought this ability to the forefront for XE.

Let’s start by taking a look at an XE Session in SSMS 2014 connected to a 2014 database instance to check the validity of this concern.

All I need to do is browse the nodes in SSMS down to the session I want to view. Right click the session and then select the option to “Watch Live Data”.

After clicking watch live data, the event session live stream will appear on the right and start streaming the data to your SSMS workspace as shown in the following image.

For this session demo, I selected a session that should be readily available to everybody – the system_health default session. That session comes installed and running for you already.

Don’t like the layout of the screen? Well, you can customize the view to your liking and it can be customized differently for each and every different running session on your server.

As it turns out, the ability to watch livestream data has been available to you for Extended Events since 2012 when the GUI was made available. If you are curious about learning more about the GUI and customizing the view, you can read various articles I have written in my 60 day series you can find here.

Result: Fiction

That brings us to the halfway point. To recap we are now sitting at a score of Fact 0 and Fiction 4. Let’s see if we can turn the tide in these last 5.

Quick Start a Session

To be quite frank here, this has been a long-time argument against using Extended Events: it just isn’t possible to start a session quickly and easily. I have two methods that I show in my training sessions to address that concern. Let’s take a look at the first. I will demonstrate it from SSMS 2014.

The first thing that needs to be done is browse the SSMS tree until a session is found that you want to start. Here is an ancient cheat code – I leave many sessions on the server in the stopped state until I need to use them. Pro-tip: create your XE sessions on all of your servers and leave them stopped until you think you need to use it.

For this example, I have selected my AG_LeaseTimeout session, which is in the stopped state (denoted by the little red “stop” icon next to the session name).

Then right click the session name and select “Start Session” from the context menu. To help clarify, I have circled the session name in green and the menu item in blue. This is an extremely easy method to start a session. Once started, if you really feel you must watch data stream onto the screen, you can also follow the steps shown for watching the live stream data in the previous section.

The second method to quick start a session is only a touch more difficult. The steps are just the same as shown in the “template” section. You select a template that matches your desires and give the session a name. Here are some broad strokes shown in SSMS 17.0 (same steps apply for SSMS 2014 minus the ability to select a profiler template):

Right click the “Session” node and select “New Session” from the context menu.

Select the “Standard” template from the dropdown box. This is a profiler equivalent (and also happens to be one of the two default sessions in “XE Profiler”), so it makes sense for a good comparison. Note that the window has a warning that the session name cannot be empty, so let’s give the session a name.

With that warning cleared, we can now move on to the next configuration. Notice the two red arrows in the preceding image. Enabling these two options will start the session immediately and also give you that profiler warm and fuzzy by allowing you to immediately start watching the session without needing to lift another finger.

From here, click the OK button ( I know I normally teach people that they should script it, but in this case we are going for quick and easy). After clicking “ok”, be prepared for something like the following.

After just a few seconds on a busy server, one should expect thousands of events to fire in a very profiler type manner. I want to reiterate here, that this was the same session template that feeds one of the two available sessions in XE Profiler.

Once you have determined that you have enough data, you can easily stop the session by right clicking the session name and then selecting the “Stop Session” menu item.

And that is all. It is very easy to start a session in two different manners without XE Profiler. One of the options even allows you to create the exact same session as XE profiler but without that feature even being made available on the version of SSMS being used. So if you really want that profiler type of session, you can use a template to quickly create it even if your SSMS version pre-dates 17.3 and even if you don’t have the profiler templates by default (remember you can simply add the profiler templates should you choose).

In the end, I have to call this one a piece of fiction as well. The XE Profiler does not add any new ability to quickly start a session that wasn’t already a viable option. That said, the XE Profiler does add a third alternative to start a session – so long as it is one of the two default sessions.

Result: Fiction

SQL 2012 Required

This is one of those areas that I wish were fiction. However, this has actually been a consistent theme ever since the GUI was released for XE. The GUI components only work on 2012 instances and later. Let’s take a look at a connection to a 2014 Server and a 2008R2 Server in the same SSMS window from SSMS 17.0.

We can see here that the standard Extended Events GUI is just not available for the 2008R2 instance. The 2014 instance is denoted with a Green box and arrow to help distinguish between that and the red marks for the 2008R2 instance.

Now, let’s take a look at it with SSMS 17.3. With a similar setup and similar notation, we see the following in SSMS 17.3 when connected to both a 2014 instance and a 2008R2 instance.

Once again, I will admit this is something I really wish were not the case. It would definitely be helpful if the XE GUI were made available for SQL Server instances running 2008 and 2008R2, especially given that the tool is an “SSMS feature”. That said, it really behooves everybody to get their SQL Servers updated to current technologies (2008 is nearly 10 years old, after all).

Result: Fact

The default “Standard” session displays all Extended Events

So far, we have been easing ourselves into this XE Profiler topic very gently. It is time to finally get into the deeper end and actually look at some very specific XE Profiler claims. When I first heard this one, I was admittedly flabbergasted. One session displays all Extended Events? Really? Let’s first establish some baselines about the default sessions for XE Profiler.

  1. There are only two default sessions for XE Profiler
  2. The session names are “Standard” and “TSQL”
  3. The default sessions are based on templates (discussed previously)
  4. The template for the “Standard” session is the xe_Profiler_Standard xml template file
  5. The template for the “TSQL” session is xe_Profiler_TSQL

If I peruse the template file located in the directory on disk (C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Templates\sql\xevent) for the standard session, I will be able to see that it contains the following events:

Now, that is not the entire template file. I shortened it to include only the events and none of the other details like session configuration or the actions attached to each event. As you can see, there is a grand total of seven events in this particular session.

If I run the following statement against a SQL Server 2014 instance, I will get a few more events than just seven.
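A count along the following lines makes the point; it is wrapped in Invoke-Sqlcmd only to keep these examples in PowerShell, and 'MYSERVER' is a placeholder instance name:

Invoke-Sqlcmd -ServerInstance 'MYSERVER' -Query @"
SELECT COUNT(*) AS available_events
FROM sys.dm_xe_objects AS o
WHERE o.object_type = 'event';
"@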

And the results:

And just for the sake of clarity, the results on a 2017 instance:

Obviously the math does not add up. There are multiple things wrong with the notion that a session from XE Profiler will contain all events. First, there is no sense at all in adding every event to a single event session. Second, the session is supposed to mimic a Profiler default template session, and if it is supposed to mimic a Profiler-style session, you could only use 180 events max anyway. Profiler only had 180 events, and XE is obviously way beyond that, with so much more feature support than Profiler could even muster back in its glory days.

In case you were wondering, here is how you access the XE Profiler. It is not under the Extended Events node; it is a brand new node.

As you can see, there really are only two options for XE Profiler and, if you recall, both of those options are derived from the templates that were made available in the traditional XE GUI. It is very easy to create a quick session via the same templates as these two options.

Whether using the XE Profiler or the previous method to create these sessions, there is just no way that these sessions could contain every event in Extended Events. It’s just a bunch of hot air with no foundation to the claim.

Result: Fiction

XE Profiler adds ability to start/stop session

Let’s start back with the last image from the previous section and then go from there. We need to access what is available to us via the XE Profiler node so we can test this theory. I will start by right clicking one of the sessions (right clicking the “XE Profiler” node itself will not yield anything very promising, fwiw).

As you can see here, we are given some pretty basic options to either “Stop Session” or to “Launch Session”. It also seems very obvious what they want you to do here because the option is bolded – “Launch Session”.

This menu is the same regardless of the session state. What I mean by that is even if you do not have the session created, this menu is the same as if you have the session created and running. None of the options dim and all are clickable on this menu. This holds true for SSMS 17.4 as well. Let’s take a look at a server where none of these “profiler” sessions are running.

 

I want to make it abundantly clear. This is the way it appears after a completely fresh install of SSMS 17.3. None of these sessions have ever run on this particular instance of SQL Server 2017 and this is even a completely fresh install of SQL Server 2017. That said, I have done nothing yet to break it – buahaha.

Knowing that no sessions have ever been “launched” for XE Profiler, it is clear that it is time to go ahead and “launch” a session. In this new feature “launch” is the new equivalent of “start”. Once you launch a session you will get the same net effect as what we saw earlier when I created a session from the profiler template and checked the box to watch live data and the box to start the session. When I click launch, I will see something like the following:

A new XE session was created under the Extended Events node and it is now called “QuickSessionStandard”. Once again, this is created from that profiler template I have mentioned a few times.

And then after a bit, I may want to try and stop the session.

You can see here that I was able to capture some quick data into the live stream view (which was a feature that was already there prior to 17.3). And after clicking the “stop session” menu item, I will see the following.

Note that the “QuickSessionStandard” event session now has a “Stop” icon instead of a “play” icon. So yay – I was able to start and stop a session quickly from XE Profiler. In addition, XE profiler launched a live stream view just like we saw earlier when creating the same session from the template in the XE GUI. Now for a pro-tip – start and stop an XE Session from the old GUI – “right click any event session and select start/stop from the context menu.”

Yes, it is that easy. In addition, only the available option for that session will be enabled (e.g. you cannot stop a session that is already stopped – or not running). In fact, you can even start/stop sessions from TSQL. If you try to stop a session that is not running via TSQL, you will get a warning message that it is already stopped. That is far more intuitive than XE Profiler, which gives you no indication that the session cannot be stopped if it is already stopped. Let’s take a look at the old way.

And after the session is started, I can also quickly stop it.

You can also see from each of those images, that I have the option to watch live data for the session. Another easy access feature from the old XE GUI.

While XE Profiler has added an additional means of starting and stopping sessions quickly, the fact of the matter is that this is not a new capability brought to the table by XE Profiler. All it did was add a different way of doing it. I will also add that this new method can actually be somewhat confusing. You can only determine (visually) whether the session is running or stopped by expanding the “Extended Events” node. Stopping a session from the XE Profiler node does not close the live stream viewer, so you could easily presume it is still running. The XE Profiler node gives you no insight into the actual state of the session.

And for giggles, this is how easy it is to start or stop a session via TSQL.
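For example (the instance and session names are placeholders; the ALTER EVENT SESSION statements are the part that matters, with Invoke-Sqlcmd used only to keep the examples in PowerShell):

# Start the session
Invoke-Sqlcmd -ServerInstance 'MYSERVER' -Query 'ALTER EVENT SESSION [QuickSessionStandard] ON SERVER STATE = START;'

# Stop the session
Invoke-Sqlcmd -ServerInstance 'MYSERVER' -Query 'ALTER EVENT SESSION [QuickSessionStandard] ON SERVER STATE = STOP;'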

If I run the statement to stop the session when it is already stopped, I will get the following:

Msg 25704, Level 16, State 1, Line 9
The event session has already been stopped.

I have previously covered this specific topic (start/stop sessions) in my 60 Day series which can be found here.

Result: Fiction

Customizable XE Profiler Sessions

Let’s take a gander at the previous section where I showed the context menu available in XE Profiler, and note the extent of that menu. The profiler itself does not offer any sort of customization. You have only two default sessions. You cannot change either session within that particular feature.

Since I cannot customize anything for these sessions via the XE Profiler, let’s use the XE GUI that is tried and true. I will effectively perform the following via the XE Session Properties window:

And after I am done, that session will look like the following:

And visually, from the GUI, it will look like this:

You can see that the aforementioned events are definitely gone from this session. From here, I can absolutely start that session from the XE GUI and it will run just fine. I can hit “watch live data” from the context menu and it will open up the previous view of the live data that was used (the exact same configuration of that window, because the XE GUI has always done that). If I double-check my session after it is started, I will be able to confirm that the altered session is indeed still intact, just the same as I customized it.

Let’s go ahead and stop that and then “launch” it from the XE Profiler now. After the “launch” I will re-script the session to verify that it is either the same or altered from what I had configured. In this case, since the session was already in place, I will find that the XE Profiler did not alter my session in any way and merely started the session. That is a win for XE Profiler in that it was able to determine that the session did indeed exist. However, I have to use the XE GUI and not XE Profiler in order to get a decent config on those default sessions.

Result: Fiction

Bonus time

XE Profiler only has SQL Profiler Events

This is actually an interesting statement. While the XE Profiler relies on templates based on SQL Profiler and the events in those templates are events that are available in SQL Profiler, they are not exclusive to SQL Profiler. These events can be used in Extended Events and have been a part of the XE Engine since inception. So, I would call this one fiction as well. I think an obvious litmus test for that is that the events are actually running through an XE Session so they must obviously be XE events as well.

Result: Fiction

Up to this point I have not addressed any changes in the Extended Events Profiler that were made for SSMS 17.4. Here is where it seems we compound a bit of confusion for the masses. Not only do we have this Extended Events tool that we have decided to call “Profiler” which has already had many people asking me about it because the name confused them (is it profiler or is it XE??), but it seems that the identity crisis for this feature is exacerbated slightly with a sudden name change.

Using yet another very clean install of SSMS on a different instance of SQL Server to ensure that none of the XE Profiler components had been used or any of the default sessions created and launched, I have the following:

Yes, the name change is very innocuous but it can lead to some confusion for various parties involved. If a set of step by step instructions says to do something and the learner cannot find that explicit node, they will become confused.

For anybody that has been working with XE for some time, they will know that XE is synonymous with XEvent and will quickly figure it out.

Conclusion

So, in wrapping up this very long article, I want to recap the score of the Fact v. Fiction items. The final tally is Fact 1 and Fiction 9 (9 original list items plus 1 bonus entry). This means that there is a whole lot of stuff going on around out there about this new feature that is just not accurate. While some may be able to derive some small use from the XE Profiler, it really does not add anything that you could not already do with either TSQL or the old XE GUI.

Some say this is a way of bridging the gap. In my opinion, that gap was already bridged with the GUI that has been available for several years. Some say that maybe this tool needs to integrate a way to shred XML faster. To that, I say there are methods already available for that such as Powershell, the live data viewer, the Target Data viewer, or even my tools I have provided in the 60 day series.

I would challenge those that are still unfamiliar with the XE GUI (out for nearly 6 years now) to go and read some of my articles or articles by Jonathan Kehayias about the power that is in XE as well as some of the power in the GUI.

Simplified Session Backups

Published on: October 12, 2015

PowerShell exposes scads of information and provides the professional with a formidable tool for the tool-belt. Over the past several articles, I have demonstrated how this tool can be used to uncover useful information and metadata.

The groundwork has been laid to be able to mine information about each of the core components for Extended Events Sessions. These core components are Targets, Events, Actions, and Predicates. Being able to query this metadata and understand what it represents is an essential skill to more effective Extended Event Session management.

Thus far, I have demonstrated how to perform metadata inquiries from both PowerShell and TSQL. I even showed how to map this metadata between script and the GUI screens. As I concluded the metadata discovery via TSQL segment, I demonstrated a very useful script, to generate backup scripts of deployed sessions, using all of the concepts discussed to that point. That script can be found here. In this article, I will share how to do the same thing via PowerShell.

Backstory

Before I go into depth on the script within PowerShell that can be used to backup a deployed session, I want to share a bit of a story.

Many years ago while still in High School, I had a really tough teacher for Calculus. He had a reputation going back to long before I took any of his classes of being a very hard teacher. I enjoyed this teacher quite a bit. My group of friends and I even enjoyed giving him a hard time because it seemed like he could take it as well as dish it out a little.

This teacher had a few philosophies that I didn’t really much care for at the time, but I can sure appreciate now. Truth be told, I gained an appreciation for his teaching philosophies within the first year out of High School. One of his philosophies was to make the High School classes at least on par with what could be expected in College. This made college significantly easier.

Another of his philosophies was to teach core concepts first then to build on those concepts. While teaching the core concepts, he always showed the hard way of doing things first and sometimes an equally hard method as the secondary approach. But there was always an easy way of doing things that would be eventually taught. Sometimes it is essential to a solid foundation to know how to solve a problem the hard way before learning how to do it the easy way.

Correlating that to SQL Server: many times the first approach to doing something is the more difficult approach. As the skills are learned, an easier solution may present itself, but there is (hopefully) a solid foundation there first to help support and enable the learning of the easier method.

Simplified Session Backups

In the aforementioned article, I shared a script that would recreate a deployed session via TSQL. That script was lengthy and required a few tricks to get things to work out just right (e.g. concatenations). Now I will show how to do the same thing in a sickeningly simple way.

The example that I am about to share requires that SQLPS be loaded and the “demosession” Session be loaded into an object. If a recap of how to do that is needed, I recommend reading this article prior to proceeding.

With the “demosession” Session loaded into an object, I also want to load the name of the session into a parameter to simplify things just a bit.
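As a quick refresher, the setup is along these lines (the machine name in the provider path is a placeholder, and this assumes the SQLPS module is already loaded):

# Browse to the XEvent sessions folder for the default instance and grab the session.
Set-Location SQLSERVER:\XEvent\MYSERVER\DEFAULT\Sessions
$sessionobject = Get-ChildItem | Where-Object { $_.Name -eq 'demosession' }
$sessionname   = $sessionobject.Name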

I can confirm the value of the $sessionname parameter if I choose. In this case, the value does return as “demosession” and is the correct value needed. With that, now I can run the following script to create a backup of my deployed session:
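The script is shaped something like this one-liner (the output folder is a placeholder and must already exist):

$sessionobject.ScriptCreate().GetScript() | Out-File "C:\Database\XE\$sessionname.sql"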

That is the entire script! Just one line! This script could have been even smaller, but I felt it necessary to dump the script out to a SQL file in order to have it available for future needs – should it be needed. One caveat here is that the directory path must exist or an error will be thrown. Creating that directory path is as simple as using the DOS commands of old (md or mkdir).

While the script produced is perfectly viable to recreate a deployed session, it is important to note that the Events listed in this script will be in the order of Package then Event sorted alphabetically. Recall from the prior article on Session backups that this is the default script order used by SSMS as well and is not necessarily the same order that the original Session was created.

How did I arrive at this method to create this script? This goes back to repeating some of the processes I have used throughout these articles on PoSH and Extended Events, starting with a quick exploration of the $sessionobject object.
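That exploration is the familiar Get-Member pattern:

# Lists the methods and properties exposed by the session object, including ScriptCreate.
$sessionobject | Get-Member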

This would lead me to seeing the following results:

(screenshot: session_script)

Seeing ScriptCreate as an option is a great indicator that I can do something with this Session to quickly recreate the session script. If I run that method on the session object, then I get the next clue.
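In other words, calling the method by itself:

# Returns a script object rather than the script text we are actually after.
$sessionobject.ScriptCreate()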

(screenshot: session_scriptnocommands)

I highlighted the interesting clue in this. Even though it says ScriptCreate, it apparently needs an additional call in order for it to do anything. With that in mind, a quick Get-Member on ScriptCreate() will reveal that there are methods that can be utilized. One of those methods happens to be GetScript(). GetScript seems pretty self-explanatory, so I would settle on that and voila, the script can be reproduced.

(screenshot: session_scriptgetscript)

There it is. I have shown two methods for creating backups of deployed sessions: the first being the more complex solution and then this extremely simple solution. Both are valid means of producing the backup scripts, and it now just boils down to a choice as to which to use.

 

This is the latest article in the 60 days of XE series. If you have missed any of the articles, there is a recap for the series. To recap or reread the articles in this series, please visit the table of contents.

Deployed Target Metadata with PoSH

Published on: October 9, 2015

Having just covered the methods to discover the metadata for a deployed session including the details about the session settings, events, actions, and predicates, I am nearly finished with the session metadata. All that remains is to dive into the metadata for Targets.

Through the articles thus far, I have shown some of the nuances in dealing with PowerShell to retrieve this metadata and how it differs a bit from performing the same inquiries via TSQL. PowerShell exposes Extended Events differently than what one would expect due to familiarity with Extended Events through the catalog views and the DMVs I have previously discussed.

Along with the nuances associated with accessing this metadata via PoSH, I have also shown how some of this metadata is different. The dive into the metadata about Targets will continue to reinforce and embrace these differences.

Regrouping

Picking up from where I left off, and following the same pattern as has already been established, it is time to dive into the metadata for the Targets that are deployed with a session using PowerShell. This means that I will need to ensure I have a session established that has the SQLPS module loaded. Then I need to ensure I browse back to the sessions folder.

Just in case, here is the working directory for the sessions out of the default instance.
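It looks something like this (the machine name is a placeholder; DEFAULT is the default instance):

Set-Location SQLSERVER:\XEvent\MYSERVER\DEFAULT\Sessions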

Also, from this point, I will go ahead and load the “demosession” session into an object again for demonstration purposes throughout the remainder of this article.

With that loaded, I just need to remember to reference the Events object (as was introduced in the prior article).

Targets

While the Actions, Predicates, and Events are all tied tightly together (Actions and Predicates have to be attached to an Event), Targets are somewhat independent. The Target is directly attached to the Session in the same way that an Event is tied directly to the session. Looking at the metadata for the Session loaded into an object, we will see a Targets object that is similar to the Events object. Let’s see that in action:
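A minimal way to see both collections side by side is with Get-Member:

$sessionobject | Get-Member -Name Events, Targets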

That script will produce the following:

(screenshot: session_psmetadata)

Reverting back to the patterns used so far through this series of posts, I will explore what is available to me for the Targets that may be deployed to the session of interest. The following query would be the first step:
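The first step is simply to dump the Targets collection:

# Default output shows the name, package, and description of each deployed target.
$sessionobject.Targets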

This query will result in the following:

(screenshot: ps_targets)

Following the established patterns and results, I can see some very common ground here. The results of this query are strikingly similar to the results seen when querying Actions and Events where I would also see the “name”, package and a description. This is a pretty basic demonstration into viewing the targets tied to a session. To get a better view of the Targets attached to a session, I need to dig a little bit deeper.
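Digging deeper follows the same Get-Member pattern used throughout this series:

$sessionobject.Targets | Get-Member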

This will produce the following useful information:

(screenshot: ps_targetsmethods)

From those results, I can see that I would immediately be interested in the TargetFields (recall the similar property from the Events) and the Properties property. The first of these that I want to explore is the Properties property.
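A sketch of that inquiry follows; the Format-Table call is the quasi-table trick referred to next, and the Name/Value property names on those objects are an assumption on my part.

$sessionobject.Targets |
    ForEach-Object { $_.Properties } |
    Format-Table Name, Value -AutoSize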

Note the trick I am using again to dump the results into a quasi-table format. And the results will look like this:

(screenshot: ps_targeteav)

Once again, the results should seem somewhat familiar. These properties underscore one of the nuances mentioned earlier: the EAV model through which PoSH exposes the Extended Event properties. With this kind of query, I can see what kind of targets have been deployed to the Session and get a little bit of information as to the type(s) of target(s) they are.

This is all really cool. The real meat of the metadata for Targets is in the “Set” options that have been configured. The “Set” options are the configurations that are exposed via the TargetFields property previously seen. With that said, it is time to dive into these configuration options.
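A minimal sketch for pulling those “Set” values out of TargetFields:

$sessionobject.Targets |
    ForEach-Object { $_.TargetFields } |
    Format-Table Name, Value -AutoSize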

This query returns this rather useful data:

[image: ps_targetsettings]

To make this more useful, I need to know which target has which setting. For a more usable query, I can go back to the table output. Instead of just using the data directly accessible from the previous query, I will use the Parent property to get the name of the Target.
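A sketch of that query, using a calculated property to pull the target name from the Parent:

    # Tie each configured field back to the name of the target that owns it
    $sessionobj.Targets.TargetFields |
        Select-Object @{Name = 'TargetName'; Expression = { $_.Parent.Name }}, Name, Value |
        Format-Table -AutoSize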

And now I will get something like this:

[image: ps_targetsettingswname]

This is good and useful information. Being able to tie the target type to the specific setting is crucial to better understanding how the metadata is interrelated. This is also something pretty easy to do via TSQL as I demonstrated in this article.

In this article I have covered the core concept of Targets. I showed how to access the Target metadata, which includes the “Set” operations for each of the Targets. This is the last of the core concepts for accessing this metadata via PowerShell. I do have one more item to discuss via PowerShell, and I will cover that in the next article.

Stay tuned as I continue to work through another trick that can be done via PowerShell in the next article. To recap the series, please visit the table of contents.

Deployed Action and Predicate Metadata with PoSH

Comments: No Comments
Published on: October 8, 2015

I have recently shown that PowerShell can be extremely powerful for investigating deployed Extended Event Sessions. Throughout the demos, I hope it has also become clear that PowerShell can be very easy to use.

Querying this metadata through TSQL is easy, but PowerShell simplifies it a step further. The simplicity of PowerShell is a different kind of easy than TSQL; for me, it comes from the smaller quantity of code. That said, I want to reiterate what I said previously – there is a bit of a learning curve with PowerShell. It exposes Extended Events differently than one might expect when coming from the catalog views and DMVs I have previously discussed.

I have demonstrated how to access session information as well as how to access Event metadata. That said, I left the discussion of Actions and Predicates for this article. As I expose this information, the simplicity of PowerShell will continue to be reinforced – along with the patterns that I hope have started to become apparent in the previous articles.

Regrouping

Picking up from where I left off, and following the same pattern as has already been established, it is time to dive into the metadata for the Predicates and Actions that are tied to an Event deployed with a session using PowerShell. This means that I will need to ensure I have a session established that has the SQLPS module loaded. Then I need to ensure I browse back to the sessions folder.

Just in case, here is the working directory for the sessions out of the default instance.

Also, from this point, I will go ahead and load the “demosession” session into an object again for demonstration purposes throughout the remainder of this article.

With that loaded, I just need to remember to reference the Events object (as was introduced in the prior article).

Predicates

Looking back at some of the metadata that was revealed for the Events object in the previous article, I have the following:

[image: ps_eventgm]

If I query the Events directly, I should be able to see that there is a Predicate returned. Unfortunately, the formatting of the output is not very friendly and much of the information is truncated. Seeing the Predicate listed as another item that can be queried directly is somewhat useful. In addition, there is a PredicateExpression property that could be useful.

In this following image, I show both the queries used for each (Predicate and PredicateExpression) as well as the corresponding output.
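Those two queries would be something along these lines, using the $sessionobj loaded earlier:

    # The Predicate property returns the predicate object itself
    $sessionobj.Events.Predicate

    # The PredicateExpression property returns the more familiar string form
    $sessionobj.Events.PredicateExpression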

[image: ps_predicates]

Notice how the PredicateExpression is a bit more familiar in the output? With the wonky formatting of the output for the predicate, one could also presume that this output would be the XML type of formatting for a pred_compare object. With a little extra effort, this can be confirmed.
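One way to confirm it is to inspect the type behind the property and then format the pieces side by side – a sketch such as this:

    # Reveal the type behind the Predicate property; a comparison predicate
    # surfaces as a pred_compare style expression object
    $sessionobj.Events.Predicate | Get-Member

    # A friendlier view of the event name, predicate object, and predicate expression
    $sessionobj.Events | Format-List Name, Predicate, PredicateExpression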

And with better formatting, here are those results:

[image: ps_predcompare]

These results reinforce the prior presumption. They also highlight how the Predicate is represented differently in PowerShell than in the sys.server_event_session_events view: the predicate column in the view is closer in value to the PredicateExpression property in PowerShell, while the predicate_xml column in the view more closely matches the Predicate property. Keeping these differences in mind will make working with PowerShell and Extended Events a tad easier.

Actions

Accessing the metadata for Actions via PowerShell is a little more straightforward in that there aren’t multiple objects as there are with Predicates. Unfortunately, with Actions it is right back to the EAV data model, which can cause a different level of complexity and possibly confusion.

Starting with a very basic query to the Actions:
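Something like this:

    # List the actions attached to the events of the session
    $sessionobj.Events.Actions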

And the results of that query:

[image: ps_basicActions]

Beyond the problem noted with the Events (the package name being listed twice), there is an additional problem with the initial output here: the Event name is missing. Granted, this is a problem with the Predicates examples too. This is one of those areas where extra effort has to be made when using PowerShell to explore Extended Events. Luckily, the data is there; all that is needed is to know how to get to it.

So how does one figure out the Event for which the Action (or Predicate) was added? I have that in this next example:
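A sketch of that example, again using a calculated property against the Parent:

    # Surface the owning event (the Parent) alongside each action name
    $sessionobj.Events.Actions |
        Select-Object @{Name = 'EventName'; Expression = { $_.Parent.Name }}, Name |
        Format-Table -AutoSize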

This query is simply requesting the Parent (Event) of the Action, and then the Action name to be returned in a table format. Here is what the output should look like:

[image: ps_actiontable]

Having the Event name is a critical piece, especially in a more complex Session such as the system_health default session. After all, an Action (yes, it must be repeated because it is a core concept) is only attached to an Event and has to be configured for each individual Event.

Circling back around to the EAV concept, one of the areas within Actions where this is most evident is when looking at the properties of the Actions object. Starting with a sample query such as this:
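A sketch of that sample query:

    # Dump the name/value pairs behind each action, forced into table output
    $sessionobj.Events.Actions.Properties | Format-Table -AutoSize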

Again, I am instructing the results to be returned in a table format. If I don’t do that, then everything is returned in a single column and it is far more difficult to pick out the needed bits of information. Forcing the results to a table format, I get the following:

[image: ps_actioneav]

Looking at these results, I see Name appear as a value under the Name column, with its associated value under the Value column. Then there is a lot of repeating. This means a little extra work will be needed to parse things further.

In this article I have covered two of the core concepts related to deployed sessions in Extended Events – Actions and Predicates. To be able to effectively use PowerShell, I have also covered a few of the nuances necessary in figuring out where essential metadata is exposed via this tool.

Stay tuned as I continue to work through some of the Target metadata in the next article. To recap the series, please visit the table of contents.

Exposing Deployed Event Metadata with PoSH

Comments: No Comments
Published on: October 7, 2015

In the last article I introduced a power tool that can be used to help manage Extended Events: PowerShell. In that article, I focused primarily on using PowerShell to discover the Extended Event Sessions deployed to the server as well as some of the settings associated with a session. This is the same thing as the Session metadata introduced here.

Now that some of the basics concerning how to access Extended Events via PowerShell have been covered, it is appropriate to start digging in a little deeper. This deeper dive will continue in the direction of metadata discovery and familiarity with this powerful tool.

More Power

Picking up from where I left off, and following the same pattern as has already been established, I want to now start diving into the metadata for the events that are tied to a deployed session using PowerShell. This means that I will need to ensure I have a session established that has the SQLPS module loaded. Then I need to ensure I browse back to the sessions folder.

Just in case, here is the working directory for the sessions out of the default instance.

Also, from this point, I will go ahead and load the “demosession” session into an object again for demonstration purposes throughout the remainder of this article.

In the previous article I noted that within the metadata for the session, there is a listing of some objects that can be further perused, which are related to the session itself. Those objects are Targets and Events (shown in the next image). I want to immediately jump into the Events object and see what needs to be done within PowerShell to access the metadata for the deployed Events.

[image: session_psmetadata]

Luckily, just as with TSQL queries into the metadata, there are patterns that can be established within PowerShell. In this case, to get to the Events within the deployed Session, I would need to “query” that object by appending it to the “query” used to list the contents of the Session object already declared (the $sessionobj instantiated previously). This is a lot easier to explain via the code.
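A minimal sketch of that code, using the $sessionobj created earlier:

    # Append the Events collection to the session object to list the deployed events
    $sessionobj.Events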

See the pattern? This is the same sort of pattern that was used when querying the various deployed Session settings in the previous article. With the Session loaded in an object, I can query properties or sub-objects by appending that “object” or “property” at the end of the $sessionobj object I created. Running that last snippet will produce the following output.

[image: ps_eventlist]

Immediately visible are a couple of interesting tidbits. Beyond returning the list of all of the events in the session, the query shows the package names, the predicates, and the descriptions of the events. This is similar to the sys.server_event_session_events catalog view, except that the view does not include the descriptions. Unfortunately, the formatting is a bit awkward, so extra measures would be needed to make the description useful – just the same as when querying via TSQL.

Additionally, there is a bit of redundancy with this object. There is a listing for the package name for each of the events. Now, take a look at the event names in these results. I cheated a little bit and highlighted the interesting parts. Each event will have the package name as part of the event name within the PowerShell results. Again, just a note because it is one of those things that could cause a bit of grief later when trying to build more complex queries via PowerShell.

Looking a little further, if I continue to follow the patterns established thus far, I can check the metadata for this “object” and learn a little bit more.
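One way to do that – a sketch, since any command that dumps every property of the deployed events will serve – is:

    # Dump all properties of each deployed event; the Actions and EventFields
    # collections show up alongside Name, Predicate, and the other simple properties
    $sessionobj.Events | Format-List -Property *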

And the results:

[image: ps_eventmetadata]

This should come as no surprise. Actions are tied to specific events so it makes sense that it is an object accessible via the events object. In addition, I can see how the “SET” operations are tied into the Event. The difference here being that the object is called “EventFields”. This should make sense since this metadata is exposed in TSQL via the sys.server_event_session_fields catalog view.

And if I invoke Get-Member on the Events object as follows:
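A quick sketch of that call, again using the $sessionobj from earlier:

    # Enumerate the members exposed by the deployed event objects
    $sessionobj.Events | Get-Member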

I will get a clearer picture of the metadata and what I can do with the Events.

[image: ps_eventgm]

In addition to the Actions and EventFields objects, I can see the Name and Predicate properties just the same as I would if I were to query the session events through the catalog view – sys.server_event_session_events.

With the connection to the event fields being so much more obvious through PowerShell, and building on the prior article on the event fields / “SET Operations”, I want to explore this metadata at this point.
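A sketch of that exploration, tying each field back to its owning event:

    # Expose the event fields ("SET" operations) along with the owning event name
    $sessionobj.Events.EventFields |
        Select-Object @{Name = 'EventName'; Expression = { $_.Parent.Name }}, Name, Value |
        Format-Table -AutoSize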

The results of this will be:

[image: ps_eventfieldmeta]

From these results, I can see that the “customizable” data point “collect_database_name” for this particular event has been enabled.

In this article I have shown how to begin the exploration into the metadata for deployed Session Events. Additionally, I demonstrated how to quickly get to the set operations tied to those events. I have not yet covered the core concepts of Actions and Predicates which were exposed through some of the queries demonstrated in this article. Tune in for the next article where I will cover both of those topics in greater detail.

With all of the information revealed through this series, it is easy to have missed some of it or to have felt a bit of information overload. If a recap is needed for either of those or any other reason, feel free to catch up here.
