We all have our favorite scripts, tools, or utilities. Those are the things that help make our jobs easier. Some of us may have an unhealthy relationship with some of those scripts (similar in nature to the relationship many have with their phone). Whether or not the need to cut that proverbial cord exists, today we are not discussing the health of that dependence. Suffice it to say, sometimes we simply need to upgrade our scripts. And how else can we get better scripts, or make our scripts better, than by sharing them?
If you are interested in reading the original invite, you can find that here.
“For this month’s T-SQL Tuesday, I want you to write about code you’ve written that you would hate to live without.
Maybe you built a maintenance script to free up disk space, wrote a query to gather system stats for monitoring, or coded some PowerShell to clean up string data. Your work doesn’t need to be completely original either – maybe you’ve improved the code in some open source project to better solve the problem for your particular situation.”
There is a high probability that through the sharing of your script, somebody out there can benefit from it. In addition, it is very likely that somebody will make a suggestion to help make your script better. Worst case (emphasis on worst case here), you have the script stored somewhere with half-decent instructions on what it does, easily accessible for you to use again and again. And just in case you forget you have it out there – you can Google for it and find it on your own blog ;).
Personally, I have been able to find and re-use some of my older scripts. Not only do I get to re-discover them, but I also get to re-imagine a new use or improvement for the script.
A shout out is absolutely necessary for Adam Machanic (twitter) for starting a blog meme that has managed to survive so long in the SQLFamily. This party has helped many people figure out fresh topics as well as enabled them to continue to learn.
While pondering the topic for today, I had the thought occur about how frequently I post a script on my blog already anyway. An easy out for this topic would have been to re-share one of those old scripts. For instance, I could easily redo a recent article about server access that has a couple scripts demonstrated in it. Or I could go back a few years to my articles about foreign keys (here or here) and space use (here or here). Even more intriguing could be to re-envision some of my articles on Extended Events. But where would the fun in that be?
Rather than take the easy road and rehash something, I have something different. This one goes hand in hand with the numerous articles and scripts I have previously provided on auditing – yet it is different.
Not every shop can afford third-party software or even Enterprise edition, so they have to come up with a different way to audit their database instances. One of the challenges with a home-grown solution is ensuring the data is not stored local to the server (there are lots of good reasons for that). Here is an example of what I did for one client. That client had a developer who had found a back door giving him SA access to the SQL Server instance, and he was changing things and trying to cover his tracks – even after being warned.
First the query
This query will be run from a job on a different server, one restricted to just a select few people. I do rely on the default trace in this query. I am also relying on a little bit of sneaky behavior. If I run this from a separate server, prying eyes are unlikely to find that it is running, which makes it easier to catch them red-handed. In addition, even if they discover (via some sort of trace and a lot of luck) that it is running, they have no access to the remote server to alter anything that was captured.
The query goes out to the default trace and pulls back any changes to permissions or principals on the server in question. The captured data is then stored in a database that is also restricted to a select few people. Lastly, the captured data can be routinely queried, or automated reports can be created to send email notifications of changes encountered.
/* Run from the administrative server. [RemoteServer] is a placeholder
   for the linked server name pointing at the instance being audited. */
INSERT INTO DBA.[AUDIT].[DefTracePermissions]
		( SvrName, EventTimeStamp, EventCategory, spid
		, DatabaseName, ObjectName, TargetUserName, TargetLoginName
		, SchemaName, RoleName, TraceEvent )
	SELECT *
	FROM OPENQUERY([RemoteServer],
'DECLARE @Path VARCHAR(512)
	,@StartTime DATE
	,@EndTime DATE = getdate();

/* These date ranges will need to be changed */
SET @StartTime = dateadd(dd, datediff(dd, 0, @EndTime) - 1, 0);

SELECT @Path = REVERSE(SUBSTRING(REVERSE([path]),
			CHARINDEX(''\'', REVERSE([path])), 260)) + N''LOG.trc''
	FROM sys.traces
	WHERE is_default = 1;

SELECT @@SERVERNAME AS SvrName, gt.StartTime AS EventTimeStamp, tc.name AS EventCategory, gt.spid
	,gt.DatabaseName, gt.ObjectName, gt.TargetUserName, gt.TargetLoginName, gt.ParentName AS SchemaName
	,gt.RoleName, te.name AS TraceEvent
	FROM ::fn_trace_gettable( @Path, DEFAULT ) gt
		INNER JOIN sys.trace_events te
			ON gt.EventClass = te.trace_event_id
		INNER JOIN sys.trace_categories tc
			ON te.category_id = tc.category_id
		INNER JOIN sys.trace_subclass_values tv
			ON gt.EventSubClass = tv.subclass_value
			AND gt.EventClass = tv.trace_event_id
	WHERE 1 = 1
		AND CONVERT(date, gt.StartTime) >= @StartTime
		AND CONVERT(date, gt.StartTime) <= @EndTime
		AND tc.name = ''Security Audit''
		AND gt.TargetLoginName IS NOT NULL
	ORDER BY gt.StartTime;');
The second part of the trickery here is that I am using a linked server to perform the queries (with a slight change, I could also do this via PowerShell, which will be shown in a future article). The linked server query uses the OPENQUERY format and sends the default trace query to the remote server. Since I am running this from a job on an administrative server that pulls a limited data set, I am not overly concerned with the linked server setup here.
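For illustration, here is a minimal sketch of how such a linked server might be created on the administrative server. Every name in it ([RemoteServer], the data source, and the logins) is a placeholder, not the actual configuration from this client – adapt the security mapping to your own standards:

```sql
/* Sketch only: create a linked server from the administrative server
   to the audited instance. All names below are placeholders. */
EXECUTE master.dbo.sp_addlinkedserver
	@server = N'RemoteServer'				-- placeholder linked server name
	, @srvproduct = N''
	, @provider = N'SQLNCLI'
	, @datasrc = N'AuditedInstanceName';	-- placeholder instance to audit

/* Map a single restricted local login rather than allowing all logins */
EXECUTE master.dbo.sp_addlinkedsrvlogin
	@rmtsrvname = N'RemoteServer'
	, @useself = N'FALSE'
	, @locallogin = N'AuditJobLogin'		-- placeholder local login
	, @rmtuser = N'AuditReader'				-- placeholder remote login
	, @rmtpassword = N'StrongPasswordHere';
```

Mapping only one local login keeps the attack surface small: nobody but the audit job owner can even use the linked server.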
Once I query the data, I need to put it somewhere on my administrative server. The table setup for that is very straightforward.
IF SCHEMA_ID('AUDIT') IS NULL
EXECUTE ('CREATE SCHEMA [AUDIT]');
CREATE TABLE [AUDIT].[DefTracePermissions](
[DTPermID] [bigint] IDENTITY(1,1) NOT NULL,
[SvrName] [varchar](128) NOT NULL,
[EventTimeStamp] [datetime] NOT NULL,
[EventCategory] [varchar](128) NULL,
[spid] [int] NULL,
[subclass_name] [varchar](128) NULL,
[LoginName] [varchar](128) NULL,
[DBUserName] [varchar](128) NULL,
[HostName] [varchar](128) NULL,
[DatabaseName] [varchar](128) NULL,
[ObjectName] [varchar](128) NULL,
[TargetUserName] [varchar](128) NULL,
[TargetLoginName] [varchar](128) NULL,
[SchemaName] [varchar](256) NULL,
[RoleName] [varchar](64) NULL,
[TraceEvent] [varchar](128) NULL,
[ApplicationName] [varchar](256) NULL,
 CONSTRAINT [PK_DefTracePermissions] PRIMARY KEY CLUSTERED
	( [DTPermID] ASC
	) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY];
After creating this table, I am ready to store the data. All I need to do is throw the audit query into an agent job and schedule it to run on a regular schedule. For my purposes, I usually only run it once a day.
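The agent job itself can be sketched roughly as follows. This is an illustrative outline only – the job and schedule names are placeholders, and the step command would contain the full INSERT ... OPENQUERY audit query shown earlier:

```sql
/* Sketch only: schedule the audit capture via SQL Agent on the
   administrative server. All names here are placeholders. */
USE msdb;
GO
EXECUTE dbo.sp_add_job
	@job_name = N'DBA - DefTrace Permission Audit';

EXECUTE dbo.sp_add_jobstep
	@job_name = N'DBA - DefTrace Permission Audit'
	, @step_name = N'Capture permission changes'
	, @subsystem = N'TSQL'
	, @database_name = N'DBA'
	, @command = N'/* the INSERT ... OPENQUERY audit query goes here */';

/* Daily schedule, running once per day shortly after midnight so the
   previous day''s trace activity is captured in one pass */
EXECUTE dbo.sp_add_schedule
	@schedule_name = N'Daily - Early AM'
	, @freq_type = 4				-- daily
	, @freq_interval = 1
	, @active_start_time = 001500;	-- 00:15:00

EXECUTE dbo.sp_attach_schedule
	@job_name = N'DBA - DefTrace Permission Audit'
	, @schedule_name = N'Daily - Early AM';

EXECUTE dbo.sp_add_jobserver
	@job_name = N'DBA - DefTrace Permission Audit';
```

Since the default trace rolls over, running the capture once a day (as I do) is generally frequent enough, though a busier server might warrant a shorter interval.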
This has been my diatribe about service and giving back to the community. When done properly, there is a natural effect of enriching one's personal life in some proportion to the amount of effort given to the community.
Oh, and if you are interested in some of my community contributions (which, according to Jens Vestergaard, are an awesome contribution), read this series I have published.