Auditing Needs Reporting

Published on: October 13, 2015



Welcome to the second Tuesday of the month. In the database world of SQL Server and the SQL Server community, that means it is time for TSQL2SDAY. This month the host is Sebastian Meine (blog / twitter), and the topic he wants us to write about is: “Strategies for managing an enterprise”. Specifically, Sebastian has requested that everybody contribute articles about auditing. Auditing doesn’t have to be just “another boring topic”; it can be interesting, and there is a lot to it.

For me, just like last month, this will be a real quick entry. I have been more focused on my 60 Days of Extended Events series and was looking for something that ties into both topics well without necessarily being covered in the series. Since I have auditing scheduled for later in the series, I was hoping to find something that fits both the XE topic and the topic of auditing.

No matter the mechanism used to capture the data to fulfill the “investigation” phase of the audit, if the data is not analyzed and reports are not generated, then the audit did not happen. With that in mind, I settled on a quick intro to how to get at the audit data in order to generate reports.


An audit can cover just about any concept, phase, or action within a database. If you want to monitor and track performance and decide to store various performance metrics, that is an audit for all intents and purposes. If you are more interested in tracking the access patterns and sources of the sa login, trapping and storing that data would also be an audit. The data is different between the two, but the base concept boils down to the same thing: data concerning the operations or interactions within the system is being trapped and recorded somewhere.

That said, it would be an incomplete audit if all that is done is to trap the data. If the data is never reviewed, how can one be certain the requirements are being met for that particular data-trapping exercise? In other words, unless the data is analyzed and some sort of report is generated from the exercise, it is pretty fruitless and just a waste of resources.

There are plenty of means to capture data for an audit. Some of those means were mentioned in Sebastian’s invite to the blog party. I want to focus on just two of them because of how closely they are related – SQL Server Audits and Extended Events. And as I previously stated, I really only want to get into the how behind getting to the audit data. Once the data can be retrieved, generating a report is bound only by the imagination of the intended consumer of the report.

SQL Server Audits

Audits from within SQL Server are a feature introduced at the same time as Extended Events (with SQL Server 2008). In addition to being released at the same time, some of the audit metadata is recorded alongside the XEvents metadata. Even some of the terminology is the same. Digging deep enough, one can even find all of the targets for Audits listed within the XEvents objects.

Speaking of Targets, looking at the documentation for audits, one will see this about the Targets:

The results of an audit are sent to a target, which can be a file, the Windows Security event log, or the Windows Application event log. Logs must be reviewed and archived periodically to make sure that the target has sufficient space to write additional records.

That doesn’t look terribly different from what we have seen with XEvents thus far. Well, except for the addition of the Security and Application Event Logs. But the Target concept is well within reason and what we have become accustomed to seeing.

If the audit data is being written out to one of the event logs, it would be reasonable to expect that one knows how to find and read them. The focus today will be on the file target, with some very basic examples.

I happen to have an Audit running on my SQL Server instance currently. I am not going to dive into how to create the audit. Suffice it to say the audit name in this case is “TSQLTuesday_Audit”. This audit is being written out to a file with rollover. In order to access the data in the audit file(s), I need to use a function (strikingly similar to the function used to read XE file targets) called fn_get_audit_file. The name is simple and task oriented – making it pretty easy to remember.

Using the audit I mentioned and this function, I would get a query such as the following to read that data. Oh, and the audit in question is set to track the LOGIN_CHANGE_PASSWORD_GROUP event.
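A minimal sketch of that query follows. The file path and name pattern here are assumptions for illustration; substitute the audit log path configured for the TSQLTuesday_Audit audit on your instance.

```sql
-- Read all rollover files for the TSQLTuesday_Audit audit.
-- The wildcard in the path picks up every file in the rollover set.
SELECT af.event_time
    , af.action_id
    , af.succeeded
    , af.server_principal_name
    , af.database_name
    , af.statement
FROM sys.fn_get_audit_file(
      N'C:\Database\Audits\TSQLTuesday_Audit*.sqlaudit' -- assumed path
    , DEFAULT  -- initial_file_name: start with the first matching file
    , DEFAULT) -- audit_record_offset: read from the beginning
    AS af
ORDER BY af.event_time;
```

With the audit tracking LOGIN_CHANGE_PASSWORD_GROUP, each password-change attempt surfaces as a row, with the login involved in server_principal_name.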

There are some tweaks that can be made to this, but I will defer to the 60 day XE series where I cover some of the tweaks that could/should be made to the basic form of the query when reading event files / audit files.

XE Audits

Well, truth be told, this one is a bit of trickery. Just as I mentioned in the preceding paragraph, I am going to defer to the 60 day series, where I cover in detail how to read the data from the XE file target. Suffice it to say, the method for reading the XE file target is very similar to the one just shown for reading an Audit file. In the case of XEvents, the function name is sys.fn_xe_file_target_read_file.

Capturing data to track performance, access patterns, policy adherence, or other processes is insufficient for an audit by itself. No audit is complete without data analysis and reporting attached to it. In this article, I introduced how to get to this data, which will lead you down the path to creating fantastic reports.

Intro to Reading Event_File Target Data

Having covered the metadata for a deployed session from just about every angle (yes, there is more metadata that could be discussed, has not yet been discussed, and that I am not planning to cover at this time), it is high time to start figuring out how to query the payload trapped by the deployed session.

Early on in this series, I introduced the catalog views, DMVs, and the supporting cast for Extended Events. In those articles, one critical player in this arena was intentionally neglected: the function that helps one read the payload data captured via the session definition. That function is sys.fn_xe_file_target_read_file.

Functions and Files – an intro to Reading Event_File Target Data

The event_file target is an asynchronous consumer for Extended Events. This target stores the received payload in a proprietary binary format. Because of this, one needs to use the sys.fn_xe_file_target_read_file function, which converts the data into a somewhat more usable and friendly format: XML.

With the session data in a more human-friendly form, a little more work still needs to be done for it to be really helpful to the data professional. To get the results into a format more conducive to consumption by most data professionals, one must use XQuery. This article is just going to focus on getting the data out of the event_file target and into XML; working with the XML is a topic for another time.

The sys.fn_xe_file_target_read_file function takes a few parameters, the most important of which are the first two – and even then, that only applies to SQL Server 2008 and 2008 R2. Since SQL Server 2012, one really only needs to focus on the first parameter: path. That said, the file_offset can be an extremely useful field, especially under circumstances where the session data is to be “warehoused” or a monitoring and alerting solution is to be built from Extended Events.

The path parameter specifies the on-disk path to the trace file that has been created – not just one trace file, but all of the trace files associated with the session. A single file name (with path) can be specified here, or, more commonly, a wildcard can be added to the file name (with path) so all files are included. If a wildcard is not used, then the initial_file_name parameter is pretty useless since only one file will be read anyway. If the wildcard is used, then initial_file_name can be used to determine the starting point for reading the session data.
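The two call shapes can be sketched as follows. The paths and file names here are assumptions for illustration, not values from the original session.

```sql
-- Shape 1: wildcard path, read every file in the rollover set.
-- mdpath, initial_file_name, and initial_offset are NULL
-- (none of them are needed on SQL Server 2012 and later).
SELECT COUNT(*) AS event_count
FROM sys.fn_xe_file_target_read_file(
    N'C:\Database\XE\demosession*.xel', NULL, NULL, NULL);

-- Shape 2: wildcard path, but begin reading at a specific file.
-- Note: initial_file_name and initial_offset must be supplied together;
-- the offset value should come from the file_offset column of a
-- previous read, not an arbitrary number.
SELECT COUNT(*) AS event_count
FROM sys.fn_xe_file_target_read_file(
      N'C:\Database\XE\demosession*.xel'
    , NULL                                          -- mdpath (2008/R2 only)
    , N'C:\Database\XE\demosession_0_1310000.xel'   -- assumed file name
    , 4608);                                        -- assumed prior offset
```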

The second parameter is mdpath and is only applicable to SQL Server 2008 and 2008 R2. It specifies the path of the metadata file that would have been created along with the event file as part of the session in those versions of SQL Server. On SQL Server 2012 and later, this parameter is unnecessary.

The initial_offset parameter instructs the function what to ignore and what to process when reading session data from the target. If storing all session data into a table (basically warehousing the data), this is an extremely helpful parameter: it lets the import process pull only new data from the target. It would be a nightmare to import the same monitoring data every time the load process ran.
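An incremental load along those lines might be sketched as below. The staging table, path, and restart logic are all assumptions for illustration; note that a read restarted at a saved file_offset begins at that offset, so the load would re-read the last block and should deduplicate accordingly.

```sql
-- Assumed staging table for warehoused session data.
IF OBJECT_ID(N'dbo.XEWarehouse') IS NULL
    CREATE TABLE dbo.XEWarehouse (
          event_data  XML            NOT NULL
        , file_name   NVARCHAR(260)  NOT NULL
        , file_offset BIGINT         NOT NULL
    );

DECLARE @last_file NVARCHAR(260), @last_offset BIGINT;

-- Find where the previous load stopped. Rollover file names sort
-- roughly chronologically, so the max name/offset pair is the newest.
SELECT TOP (1) @last_file = file_name, @last_offset = file_offset
FROM dbo.XEWarehouse
ORDER BY file_name DESC, file_offset DESC;

-- Pull only events from that point forward. If the table is empty,
-- both variables are NULL and the whole file set is read.
INSERT INTO dbo.XEWarehouse (event_data, file_name, file_offset)
SELECT CONVERT(XML, t.event_data), t.file_name, t.file_offset
FROM sys.fn_xe_file_target_read_file(
    N'C:\Database\XE\demosession*.xel', NULL, @last_file, @last_offset) AS t;
```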

Using the same demosession session that I have used throughout the series, here is a basic example of how to retrieve that session data.
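A minimal sketch, assuming the demosession files live at a path like the one below:

```sql
-- Pull the raw payload from the demosession event_file target.
-- CONVERT to XML makes the event_data easier to inspect in SSMS.
SELECT CONVERT(XML, sd.event_data) AS event_data
FROM sys.fn_xe_file_target_read_file(
    N'C:\Database\XE\demosession*.xel', NULL, NULL, NULL) AS sd;
```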

In this example, I am pulling the session data straight back without any manipulation. I have CONVERTED the event_data to XML only to make it easier to evaluate. If I did not convert the event_data, the function would return the XML as an NVARCHAR string.

If I needed to be able to explore the data in a more friendly format, then I need to shred the XML such as I have done in the following example:
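A sketch of that shredding follows. The event attributes and the sql_text action are assumptions; substitute whatever events and actions the demosession actually captures.

```sql
-- Shred the XML payload into columns with XQuery.
SELECT x.ed.value('(event/@name)[1]', 'varchar(128)')    AS event_name
    , x.ed.value('(event/@timestamp)[1]', 'datetime2')   AS event_time
    , x.ed.value('(event/action[@name="sql_text"]/value)[1]'
        , 'nvarchar(max)')                               AS sql_text
FROM (
    SELECT CONVERT(XML, sd.event_data) AS ed
    FROM sys.fn_xe_file_target_read_file(
        N'C:\Database\XE\demosession*.xel', NULL, NULL, NULL) AS sd
) AS x;
```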

Using the function in either of these ways will work consistently and reliably. One major drawback is the need to know the filepath for the session in question. Even if the filepath was known at one time and saved in a script (such as the previous examples), it is not far-fetched for another person to change the filepath and not notify anybody or update any of the scripts.

In the next article, I show a more robust means to access this target data from the files.

This has been another article in the 60 Days of XE series. If you have missed any of the articles, or just want a refresher, check out the TOC.

