Hidden GUI Gems

The GUI for Extended Events offers some power to help you get started working with Extended Events. As I have shown over the past few articles, that power comes at a bit of a cost and may in fact not be as powerful as just using a script. You can flip back through the previous articles via this link.

Today, I have a few gems that are available to you in the GUI. These gems should help solidify your understanding of the tools available through the GUI for working with Extended Events. So far I haven’t hidden my preference for a script over the GUI. That said, the GUI can be useful for a thing or two, and I will explain that in better detail further into the article.

Playing in the Mud

To show these gems, a little setup is required first. So I am going to start with the following sample session (started from the GUI):

[Image: debugwhat]

The setup thus far is rather simple. I have selected two optional settings – “start the event session immediately after session creation” and “track how events are related to one another.” The next thing to do within the GUI is to make my way through the events and select the ones I need. The events I want need to help me track information related to the SQL Server process stopping (exiting) and any info related to when a memory dump is created.

To try and find the appropriate events, I will go to the Events tab, click the drop down shown in the green box and select “Event names and descriptions” from the menu.

[Image: filtering_eventselect]

After making that selection, I can type a keyword in the text box under “Event library”. Typing in the term “dump” produces no results. This means either there are no events that meet my requirements or I need to try a different term, right?

Well, just to verify, I want to confirm what I am seeing in the GUI by using a script.
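Executing a script along these lines does the job. Treat it as a sketch of the sort of metadata query I use for the search – the lookup of the channel through the etw_channel map is the piece that matters most for what comes next:

SELECT xp.name AS package_name
	, xo.name AS event_name
	, xo.description
	, mv.map_value AS channel
FROM sys.dm_xe_objects xo
	INNER JOIN sys.dm_xe_packages xp
		ON xo.package_guid = xp.guid
	LEFT OUTER JOIN sys.dm_xe_object_columns oc
		ON oc.object_name = xo.name
			AND oc.object_package_guid = xo.package_guid
			AND oc.name = 'CHANNEL'
	LEFT OUTER JOIN sys.dm_xe_map_values mv
		ON mv.object_package_guid = oc.type_package_guid
			AND mv.name = oc.type_name
			AND mv.map_key = CAST(oc.column_value AS int)
WHERE xo.object_type = 'event'
	AND (xo.name LIKE '%dump%'
		OR xo.description LIKE '%dump%');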

This confirms my suspicion. The script returns several results containing the term “dump” when searching both the event names and the descriptions, just as I did with the GUI. There is an interesting development, however. All of the results show as belonging to the debug channel.

[Image: filtering_eventselect_script]

So why do these results not display in the GUI? Well, that is one of the hidden gems. The debug channel is not shown in the GUI results by default, which prevents any events from that channel from being displayed. This can be changed by checking the box shown in the following image:

[Image: channelfilter]

Consider the name of the channel for a moment. These events may be hidden from search results by default for good reason. Some of them can have a significant performance impact on the instance. That said, there is the occasional good reason to use events from this channel to troubleshoot specific issues on the server. You just have to dig a little harder to find them.

Having resolved that issue, I have settled on using the stack_trace and sql_exit_invoked events. So I select both events and then move on to the Data Storage tab to configure the target. The target I have chosen is the event_file target. Now that I have selected the events and configured a target, I can click the Script button. I recommend always using the Script button rather than committing the session directly from the GUI. Clicking the Script button here will yield the next gem.

[Image: Xe_debugwhat]

I have scripted the session I configured in the GUI. Overlaying the General page of the session with the produced script reveals the gem. While the track_causality setting does get scripted, the option to start the session immediately after creating it does not get scripted at all. This is an interesting problem. The impact is small, but it is one to bear in mind. If you need the session started, make sure to start it manually or script the start of the session yourself to confirm it has been done.
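For reference, here is roughly what that boils down to. This is a sketch only – the session name and the package prefixes on the events come from my test instance, so use whatever the Script button actually generates on yours:

CREATE EVENT SESSION [debugwhat] ON SERVER
ADD EVENT sqlos.stack_trace,		-- package prefixes are illustrative; take them from the scripted output
ADD EVENT sqlserver.sql_exit_invoked
ADD TARGET package0.event_file (SET filename = N'debugwhat')
WITH (TRACK_CAUSALITY = ON);
GO

/* The "start the event session immediately" option does not get scripted,
   so the start has to be issued (or scripted) separately. */
ALTER EVENT SESSION [debugwhat] ON SERVER
STATE = START;
GO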

A third gem is one that I have not shown here but one to play with in your free time. If you change the name of the session and then script it, see what happens.

These are just a few more gems I have shown in the GUI tools for Extended Events. I recommend using a script where possible (which is just about everywhere), and I have not hidden that fact. While the GUI is useful to a small degree, I do not recommend it for most tasks – again, use a script instead. Using a script has numerous benefits above and beyond the use of the GUI.

If for some reason you are not using Extended Events simply because you wanted a graphical interface, I recommend reconsidering. The series of posts I have on Extended Events has plenty of useful scripts that provide the functionality and ease of use to mitigate that internal lust for the GUI.

This has been another article in the 60 Days of XE series. If you have missed any of the articles, or just want a refresher, check out the TOC.

 

Filtering Logged Data

Over the course of the past couple of articles, I have shown some interesting and useful functionality in the user interface for working with Extended Event log files. Those articles on merging files and customizing the view can be read here and here.

Today, I am going to explore how to further customize the view in the GUI for working with Extended Event log files, or XEL files. This further customization includes working with filters. Prior to proceeding, it would be beneficial to review the aforementioned articles – they outline some building blocks for this article.

Sifting through it all

Even with a well-tuned XE session, sometimes there is just a deluge of data. That deluge can make it difficult to find the most likely candidate for the cause of the problem at hand (whatever that may be).

When dealing with a large quantity of data, it becomes necessary to filter it down. Filtering the data makes it significantly easier to handle – when an appropriate filter is utilized. The GUI permits the implementation of filters in a couple of different ways to help with these dives into the XE logged data.

If you are convinced to use the GUI to peruse the log data, there are a few ways to create a filter (and yes, it would be better to do it with a script). The first of these methods is actually quite simple. Let’s start with a grid display of the same session used in the previous couple of articles:

[Image: starter_grid]

From here, if I right click one of the cells as shown below, I will be prompted with a context menu:

[Image: grid_filter]

Right-clicking a value in the grid exposes an option to “Filter by this value” in the context menu. Selecting that option will open a new window allowing for further configuration of the filter, or the filter can simply be accepted as-is.

[Image: filter_window2]

From the “Filters” window, you can see there is the option to set a time-based filter, to create a filter on values of the fields within the trace log, or to combine the two. This is pretty straightforward. That said, recall that filters configured here can be saved in the viewsetting file discussed in the prior article (mentioned at the beginning of this article).

Further Analysis

After filtering down the results, what if you wish to perform more complex analysis? Maybe the analysis is for trending purposes, or maybe something else. Regardless of the purpose, the need is there and you wish to know how to do it (but for some reason have a serious allergy to doing the work via a script). Luckily, there just happens to be a magic button within the GUI that helps to perform this task.

Within the GUI, we are given the opportunity to group and aggregate data. The aggregations cannot be performed without first performing a grouping (very similar to the requirements via script). The grouping can contain multiple fields or just a single field. Let’s take a quick peek.

[Image: groups_xe]

Clicking on the Grouping button opens the following window, permitting the configuration of the groups.

[Image: group_wind]

This looks pretty familiar and standard. It is rather simple in design and function. Move columns from left to right to add to the grouping. To remove from the grouping, move from right to left.

[Image: aggregs_xe]

With the group established, then comes the fun part to help with analysis. This is where aggregations come into play. Within an aggregation, we can perform counts, max, min, avg, and sum. Depending on the field, the aggregations that can be performed may be restricted. Clicking the Aggregation button, one will see the following window:

[Image: agg_wind]

Given the session being evaluated and the columns that have been chosen previously, I am restricted to the columns shown in the preceding image. Aggregations can only be performed on columns selected in the “Column Chooser”. The rest of the configuration is merely an exercise in plug and play to determine which fields and aggregations serve your needs best.

For this exercise and the chosen grouping, I am going to go with Average on the duration field and then sort the aggregation by that same field in descending order. For your purposes, feel free to choose something more meaningful or appropriate to your needs.
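For comparison, the script route is to shred the XEL data and let T-SQL do the filtering, grouping, and aggregating. The sketch below makes a few assumptions – the file path is a placeholder, the grouping on event name is a stand-in for whatever grouping you chose, and the duration field (typically reported in microseconds) only exists for events that capture one:

WITH parsed AS (
	SELECT CAST(event_data AS xml) AS event_data
	FROM sys.fn_xe_file_target_read_file(N'C:\Database\XE\MySession*.xel', NULL, NULL, NULL)
), shredded AS (
	SELECT event_data.value('(event/@name)[1]', 'varchar(128)') AS event_name
		, event_data.value('(event/@timestamp)[1]', 'datetimeoffset') AS event_time	-- XE timestamps are UTC
		, event_data.value('(event/data[@name="duration"]/value)[1]', 'bigint') AS duration
	FROM parsed
)
SELECT event_name
	, COUNT(*) AS event_count
	, AVG(duration) AS avg_duration
FROM shredded
WHERE event_time >= DATEADD(HOUR, -4, SYSUTCDATETIME())	-- time-based filter, akin to the Filters window
GROUP BY event_name
ORDER BY avg_duration DESC;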

Cautionary Tale

If there happens to be a rather large number of events in the session file(s) being evaluated, don’t expect this to work without a hitch or three. Trying to load a 1GB trace file (read that as the default size for an XE trace file) with over 600,000 events, I found that SSMS regularly crashed while applying a filter, a group, and even the aggregation. None of this has been an issue when performing similar groupings and aggregations against large trace files via script.

If you are going to attempt to manipulate aggregations, groupings, filters etc. via the GUI with large trace files, understand that you may run into errors and crashes when using the graphical interface rather than a script. The choice is yours!

This has been another article in the 60 Days of XE series. If you have missed any of the articles, or just want a refresher, check out the TOC.

 

Customize the XEvent Log Display

In the previous article I showed a couple of methods to merge Extended Event log files. In that article I mentioned the ability to customize the display in the GUI for the merged log files.

Today, I am going to explore how to configure the GUI with filters and different column sets or orders. Proceeding will require a bit of a refresher from the prior article. If you need a primer on how to merge XEL files, or how to open XEL files from within the GUI, I recommend taking the time now to read that prior article.

Getting Sticky Again

This should be a rather quick foray into the world of the GUI and how to customize. Just bear in mind that it is a small building block.

Taking up from the prior article, I will continue to use the same sample session and sample data.

Let’s start with a screenshot where we left off the last time:

[Image: merge_gui_display]

By itself, this is not the prettiest of layouts. Sure, there is good information, but the layout is rather simple and leaves one wanting more. The “more” comes easily enough after some familiarity with the GUI. Suppose you desire to change the order of the columns presented in this view, or even to select different columns; there are two methods to achieve that goal. The first option is shown here:

[Image: method1_xe_gui_options]

Right-clicking an empty space in the title bar brings up a context menu. There are several options here that can be a bit of fun to peruse. I will just focus on the “Choose Columns” option for now.

The alternative for this method is to look in the toolbar. There is an XE-specific toolbar available when looking at an XE log file, and in it there is a “Choose Columns” button. This method will get us to the same end result as the first.

[Image: method2_xe_gui_options]

Selecting “Choose Columns” via either method just presented will open a new window such as the following:

[Image: choosecolumns]

This kind of screen should be fairly familiar and easily discernible. On the left is a set of available columns that can be added to the display. On the right is the list of columns currently being displayed in the table. In between is a set of buttons to add, remove, add all, or remove all columns. In the top right is a means of re-ordering the columns as presented in the display table. Pick a few and sort the columns into a suitable display order and you are all set.

Recall from the image showing method one that there is an option to remove the column. This same function can be performed from the context menu or from this “Choose Columns” window.

For the order of the columns, there is another fancy means to do that beyond the “Choose Columns” window. Columns can be dragged and dropped from the initial screen to change the presentation order easily enough. Sometimes it is far easier to use the “Choose Columns” window to do that. Pick your poison here and run with it.

Super Sticky

This is fantastic if the manipulation of the display settings is a one-off adventure. What if you need to do this multiple times? Maybe, just maybe, this needs to be done for 100 servers. The process I have just shown can become exceedingly tedious. In addition, every time an XEL file is opened, it defaults back to the same settings I have already shown. What do you do in cases like that?

Beyond using a script (the recommended method by the way), there is an alternative.

[Image: displaysett_xe_gui_options]

The alternative is found in the toolbar discussed earlier in this article. The option is “Display Settings.” After picking the columns and the order and getting the display dialed in just right, the view can be saved. The view settings are saved as an XML file with the extension of viewsetting. That view can then be immediately applied to a freshly opened log file. Sure, it is still an extra step or two to open that view, and it can still become a bit tedious, but it is far better than rebuilding the view every time. As a bonus, the saved viewsetting file will also save merged columns, groupings, aggregations, column order, and filters defined in the view.

Then again, all of this is easily achieved through a T-SQL script, which is far more scalable when dealing with multiple servers or the need to review the data more than once. The choice is yours when using XE.

Though I mentioned the creation of a filter for the displayed log data, I will not be delving into that topic today. It would be a very suitable exercise for the reader to figure out how to apply a filter through this tool. Or wait until the next time when that topic will be covered along with aggregations and groups. In the meantime, I recommend trying to figure it out with a little picking and plucking through the GUI-ness.

This has been another article in the 60 Days of XE series. If you have missed any of the articles, or just want a refresher, check out the TOC.

 

Log Files from Different Sources

Well, it has been quite a pause between articles in the series, and it is well past time to continue. With such a delay, it is prudent to remind everybody that this is part of a series on Extended Events I did over the course of two months. The complete table of contents for the series can be found here. Truth be told, I had intended to start this series again as my usual 12 Days of Christmas series. Sadly, I am a bit behind on that series, so maybe there will be multiple posts a day to help catch up.

As was the case throughout the series, I intend over the next few articles to continue producing building blocks that may be helpful for later articles or concepts. With that in mind, the chosen topic today is very basic in nature. While basic, it is a rather important topic.

While working with Extended Events, or with any sort of logging or tracing utility, it becomes critical to be able to consume data from disparate sources, or even to consume data captured on a different server. The question becomes: how do you consume that data?

Getting Sticky

When it comes to working with log files from different sources, or most tasks within SQL Server for that matter, there are two general methods. The first is the type frequently frowned upon by the more senior database professionals, sysadmins, and so forth: the graphical user interface, or GUI. This is more of a point-and-click method, and one I will generally shy away from for most tasks. That said, this task is easily performed through the GUI and is reasonably safe.

The first step is illustrated with the following image:

[Image: XE_merge]

From within Management Studio, navigate the File menu as shown in the preceding image. After clicking on “Merge Extended Event Files”, a dialog will open as shown in the background of the next image:

[Image: XE_merge_opendialog]

Clicking Add from the “Merge Extended Event Files” window will open a dialog that permits you to navigate to the location of the XEL files that need to be opened and merged. Once located, a single file or multiple files can be selected. Proceed carefully here – opening too many files, or files that are too large, can cause various unwanted effects.

Once the file(s) are open, a grid-like window will open in SSMS that looks like the following:

[Image: merge_gui_display]

From here, one can manipulate the display into something more suitable to his/her preferences (e.g. filtering or column layout). I will not cover that here but will be saving it (the display customization tutorial) for another time. That aside, I have just shown how easy it is to look into an Extended Event log file through the use of the GUI.

Not So Sticky

For the less sticky (less GUI) method, I have shown a similar routine in the past that lays the groundwork for how to do this via script. You can read a little about that here. Building on that method, there is a tweak to be made – we must specifically name the path to the XEL files to be merged rather than try to build it dynamically.

Using the same session files as shown in the GUI examples, I will show how to proceed with the merge via script.
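Something along these lines gets the job done. Treat it as a sketch – the database, table name, and file paths are placeholders for your environment:

USE SomeWorkDB;	-- placeholder: any database of your choosing
GO

IF OBJECT_ID('dbo.XEMergedEvents') IS NOT NULL
	DROP TABLE dbo.XEMergedEvents;

SELECT object_name AS event_name
	, CAST(event_data AS xml) AS event_data
	, file_name
	, file_offset
INTO dbo.XEMergedEvents
FROM sys.fn_xe_file_target_read_file(
	N'C:\Database\XE\RemoteSession*.xel'	-- explicit path to the copied XEL files
	, N'C:\Database\XE\RemoteSession*.xem'	-- XEM metadata file: required for a 2008/R2 source, NULL for 2012 and later
	, NULL, NULL);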

This script lays out a rather simple process: load the log data into a table (in a database of your choosing) and then, once loaded, query the data to better understand what has been captured. I left a few notes in the script. It is important to understand that if you are pulling in logs from a SQL 2008 or 2008 R2 instance, the XEM file must be included (as I have done for this particular example).

To parse the data into a friendly format that I can use for analysis, I would do something like the following:
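A sketch of that parsing pass, against the table loaded above, follows. The duration data element and the sql_text action are stand-ins for whatever fields and actions the session actually captured, so adjust to suit:

SELECT event_name
	, event_data.value('(event/@timestamp)[1]', 'datetimeoffset') AS event_time	-- XE timestamps are stored in UTC
	, event_data.value('(event/data[@name="duration"]/value)[1]', 'bigint') AS duration
	, event_data.value('(event/action[@name="sql_text"]/value)[1]', 'nvarchar(max)') AS sql_text
FROM dbo.XEMergedEvents
ORDER BY event_time;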

From here it is really easy to add or remove columns, or re-order them into a friendlier format for the person reviewing the data. This is incredibly easy – especially given the appropriate recipe / script / process. I would also venture that this method requires fewer resources and leaves less room for error.

Enjoy working with evaluating and merging these XEL files from different sources. It should be fun!

This has been another article in the 60 Days of XE series. If you have missed any of the articles, or just want a refresher, check out the TOC.

 

Tricks and Treats with XE

Halloween is a great time of year. It is unfortunate that it is just one day of the year. That said, I do like to think of the month of October as Halloween Month. I have several posts over the years that geek out over the cross-over between Halloween and SQL Server.

With the proximity to Halloween, it’s not hard (knowing me) to figure out why I originally decided to terminate this series on Halloween Eve. Then again, as it turns out, I decided to bring the series back to life on the other side of Halloween. In short, it was to be killed on Halloween Eve and then implicitly converted to some living dead creature. You pick whether it is a zombie (which is far better suited to SQL Server) or a vampire.

If you are interested in the previous Halloween posts, here is a list of a few of them:

XE Related

Ghosts in your Database I

Ghosts in your Database II

Ghosts – an eXtrasensory Experience

DB and Fun Related

All Halloween posts

That list is my Halloween treat this year. Now for the trick with a very strong warning. Because of this warning, I am not posting any code showing how to perform the trick.

Warning

DO NOT ATTEMPT THIS ON ANY PRODUCTION SERVER. I BEAR NO RESPONSIBILITY FOR ANY PRODUCTION SERVER CHANGES DUE TO THE READING OF THIS ARTICLE (OR ANY OTHER ARTICLE I HAVE WRITTEN). SOLE RESPONSIBILITY OF CHANGES OR ACTIONS TAKEN BELONG TO THE PERSON PERFORMING THE ACT OR CHANGE. IN OTHER WORDS, YOU ARE SOLELY RESPONSIBLE FOR BREAKING YOUR PRODUCTION ENVIRONMENT IF YOU IMPLEMENT THIS IN PRODUCTION!!!

I have debated seriously over whether to even publish this “trick” for Halloween because of how deliciously evil it is. I will try to paint the picture in broad strokes and leave it up to you to connect the dots.

Problem

A third party vendor has provided an application along with some database requirements. Among these requirements is that the application account must use the ‘sa’ login. You have a strict policy that ‘sa’ must never be used by any application or end-user; it is an extremely protected account. The dilemma you have been presented is that the CEO insists this application must be used (never happens, right?) and the vendor insists the application will not function properly without ‘sa’ (I can hear you chortle at that).

Now you, as the DBA, are stuck between a rock and an even harder place. Being smart (and somewhat sadistic), you insist that the ‘sa’ login should not be used because it will break SQL Server (in reality we know the login does not break SQL Server, but something inevitably will break due to a mis-step by the person using it a) when they shouldn’t, and b) because they lack proper training). Inside you are grinning from ear to ear because you have some devilish ideas, some of which you learned by attending a session by Rob Volk (twitter) where he shows some devilish methods to deter the use of ‘sa’.

For the sake of the scenario, let’s just say you have a policy preventing the implementation of logon triggers (as suggested by Rob) due to a couple of mishaps a few months back. Somebody implemented a logon trigger that wasn’t properly configured and it happened to prevent all users from accessing the server (including the DBAs). Ruh roh!

And then…

Later in the week, after reading about the power of Extended Events, you decide to play around and do a little discovery. You come across this blog post that shows you how to find all of the possible events within the XEvent engine. So you run the script that you found, shown here:
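That script is little more than a stroll through the XE metadata views – something along these lines:

/* every event the XEvent engine exposes, with its description */
SELECT xp.name AS package_name
	, xo.name AS event_name
	, xo.description
FROM sys.dm_xe_objects xo
	INNER JOIN sys.dm_xe_packages xp
		ON xo.package_guid = xp.guid
WHERE xo.object_type = 'event'
ORDER BY xp.name, xo.name;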

And while looking through the events and descriptions you discover that there is an event to log all of the successful logins. Not thinking anything of the third party vendor (because it just makes your blood boil) you begin to dream of the potential for auditing all successful logins (established connections) for all users and documenting who may be doing what and when.

After taking a few notes about the potential of the login event and getting access and logins under better control, you continue along your journey through Extended Events by reading this article about Actions. Feeling comfortable enough from the article, you do what any good data professional trying to learn a new topic would do – you start exploring using the provided query:
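The provided query is the same style of metadata query, pointed at actions instead of events – roughly:

/* every action available to attach to an event */
SELECT xp.name AS package_name
	, xo.name AS action_name
	, xo.description
FROM sys.dm_xe_objects xo
	INNER JOIN sys.dm_xe_packages xp
		ON xo.package_guid = xp.guid
WHERE xo.object_type = 'action'
ORDER BY xp.name, xo.name;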

While looking through the available actions, you see one in particular that causes you to raise an eyebrow (not both, just one). There is an action called “debug_break”. You squirm and ponder for quite a bit over the name and definition of this particular object. You wonder out loud, “Why would anybody ever want that in there?”

Your co-worker interrupts with “Do what?”

To which you reply, “Oh, nothing important. It was just something I read.” After which you dismiss it, realize the time of day, pack up, and head home for the evening. Hours later, long after the work day has finished, you awaken in a cold sweat with an “Aha!” that startles your slumbering neighbor’s dog. Immediately you pull out your laptop, connect to your local sandbox instance, and get to work with a wry smile and a devious giggle.

Upon returning to work the next day, you call the vendor and work out a solution to build them a sandbox server as a proof of concept. You grant them the use of sa for their application login. Sitting down together, connected to the new sandbox, the vendor attempts to log in and the following happens:

[Image: DEBUG_broke]

Slack-jawed and speechless, the vendor pales in complexion. Deep down inside you giggle like a grade school girl – content with yourself. BATTLE WON!

After the vendor departs with a promise to rework the security requirements, you restart the service on the sandbox and go about the rest of your work week with a silly grin from ear to ear and nobody the wiser.

That concludes the trick. In the tradition of Halloween, you knocked on my virtual door of Extended Events and, in keeping with that tradition, I gave you both a trick and a treat.

I hope this helps you to understand the power of Extended Events. Thinking through a problem and reaching a solution is what XEvents can help to provide. I hope this Halloween XE article was enjoyable to you. Stay tuned as the XE 60 Day series goes full Zombie starting next week.

This has been another article in the 60 Days of XE series. If you have missed any of the articles, or just want a refresher, check out the TOC.

 
