T-SQL Tuesday #096: Inspiration Abounds

Comments: 2 Comments
Published on: November 14, 2017

Who Inspires You?

Today is a bit of a divergence from the usual geekery that may abound on my blog. As prompted by Ewald Cress (blog | twitter), it is time to take a step back and put forth a bit of reflection. Ewald has requested we do this as a part of the TSQLTuesday monthly event. If interested, here is the invitation to participate along with all the nitty gritty details and those things we love to ignore (rules).

While looking at the invitation, it dawned on me how long this monthly event has been running. This is the 96th installment which also means it is the 8th birthday of this event. Birthdays are a good time to do a bit of self reflection. And right now with Thanksgiving around the corner, the invite to reflect upon those people that may have served up some sort of inspiration to us is a really cool idea.

For me, this is about much more than just a story or two about people that have impacted me in my career choices. This topic is also about much more than just a handful of years of inspiration. Inspiration comes in many different flavors and should help to build you into who you are over time.

With that, I give you my 96 for 96. There is no way that just one or two people could have inspired me to this point in my life. So, I set out to see if I could think of 96 people that have had some impact in shaping and inspiring me. One person for each edition of TSQLTuesday on this the 8th Anniversary of a really cool thing started by Adam Machanic (twitter).

96 for 96

My apologies up front for what may be a really long article.

Personal

  • My Wife (1) – How could I have a list of those who have inspired me and not include the rock and stability of most of my adult life? She has been there since the beginning, pushing me along and working me into something more refined.
  • Hursts (5) – My adopted family that taught me a lot about compassion and hard work. They have been very instrumental in helping me to learn more about giving to others.
  • Grandparents (2) – I wrote about my grandfather when he passed. My grandmother is tough and inspiring for somebody so petite. The two of them helped to inspire me to have a good time but also how to be resolute.
  • Children (4) – Wow, my kids are crazy balls of energy all the time. They learn and grow and absorb and try and fail and try again. They have so many talents that I would have never imagined. They embrace the challenges (mostly) and inspire me to develop new talents and continue to learn.
  • Ella (1) – My wife’s grandmother was stern and rigid. She was very precise in how things needed to be done. Oh, and boy was she strong willed. She influenced me in how to be stern yet accepting and how to balance the two nicely.
  • Football team (22) – A football team? Why would that be? This is the team that I coached most recently. These guys had it tough. Due to various things outside of my control or their control they had a mid-season coaching change. I became a part of the new coaching staff. We pushed them hard and saw them grow. They did not quit and would be a huge inspiration to many on how to persevere.
  • Pierre & Veronique (2) – Good friends from France. Pierre died many years ago but he was able to instill in me a “fire in the belly” type of mantra. Hunger for more and hunger to be better. Veronique is another very strong woman and very quiet. Between them they have helped to inspire me in how to be strong yet quiet.
  • language teachers (2) – This one is for my foreign language teachers from elementary school through high school. Between French and German, they were able to inspire a desire to learn language and culture.
  • Conway (1) – My High school Calculus teacher. He was the toughest teacher I ever had. There were no shortcuts with him. I learned how to think logically from him. I may not have liked it back then when I could just see the solutions (without having to do every stinking step) but had to do it his way. Turns out it was worth it because it helps to avoid the shortcuts and I can think through problems to hopefully reach a better solution now.
  • Mimi (1) – Mimi was my collegiate counselor. She helped me to understand the need to speak up but how to do it politically when the time warranted it.
  • Mandy Harvey (1) – What more can be said about this girl? If you haven’t seen it, you need to watch it now.
  • Kelvin Spendlove (1) – Kelvin was a family friend from Vegas. He came down with cancer and passed away earlier this year. Kelvin was an example of fighting and persevering. Despite all of his pain, he kept a pleasant demeanor and showed as much charity to others as he could. No matter how bad you have it, you can take a moment to uplift somebody else.
  • Jon Huntsman Sr (1) – Mr. Huntsman is somebody well known who does and says many things that are inspirational to me. His philanthropy is of great interest. Huntsman teaches that the more you give with the right intent, the more you will receive and thus be able to continue the cycle.

Hey Sport!

  • Coaches (3) – Different coaches for different sports that I had growing up inspired many things. One of those things is the ability to work through the fatigue and to be comfortable in that painful/difficult moment of any given competition. Are you able to pick up the pace in that last mile?
  • Thao Tran (1) – Thao was a great friend that got me back into endurance sports. I had found myself out of the routine and out of time. With Thao, I was able to find that time and get back to running for very long stretches at a time.
  • Dan & Dave (2) – Remember these guys? The decathletes that were in all the ads all the time? These guys made Track and Field super sexy. Seeing these guys and how cool it was to do Track helped encourage me to try other events.
  • Khalid Khannouchi (1) – Elite level marathoner. Khalid and Bernard are both elite level and record holding marathoners. This was something that I aspired to accomplish. That dream may have changed now, but the ability they showed to push harder and harder through the fatigue is something that I have found very helpful in my career.
  • Bernard Lagat (1) – same as Khalid.
  • Steve Prefontaine (1) – Runners everywhere know about “Pre”, right? His running and style were inspirational and he was one of the runners I looked to when I was running through High School.
  • Greg Lemond (1) – Despite the animosity between Greg and Lance, I am grouping them together. Lemond was a legend in my youth. He was winning the Tour de France and looking good doing it. Lance contracted and beat cancer and then went on to smash the Tour de France several times. No matter the allegations – the feats of these two were beyond comparison and well beyond awesome. I am a better runner than biker, but the two sports are similar in that they encourage you to go beyond and dig deeper in order to be successful.
  • Lance Armstrong (1) – same as Greg
  • Jerry Sloan (1) – Talk about a tough-nosed, get-it-done, gritty kind of professional. We could all learn a little about this type of workmanship.
  • Stockton (1) – Part of the trinity in Utah Jazz history. The humility and workman-like attitude he brought to the sport he loved is admirable. His attitude inspires me to work hard, be humble, and still be extremely talented and capable in my profession.
  • Malone (1) – Nobody outworked the Mailman. It is a tough act to follow but something to strive to accomplish.
  • Steve Young (1) – I had the opportunity to meet Steve Young at the Olympics where I served as a translator one year. I met Dale Murphy there as well. I was amazed at how approachable these men were. I grew up watching both of them play (football and baseball respectively) and became a fan of the style in which they played and the success they were able to achieve in life and on the field. This helps me to try and become more approachable with clients or at various SQLFamily community events. It is not easy! 😉
  • Dale Murphy (1) – same as Steve Young
  • Ozzie Smith (1) – Ozzie was a wizard at shortstop. I remember watching some of the things he did defensively and being floored. Doing the job, doing it well, and doing it with a little flair. Sometimes a little flair is needed in the job – just as long as the job is getting done well!
  • Walter Payton, Jim McMahon, Mike Singletary, William Perry, Mike Ditka (5) – Da Bears. These are the guys that got me to be a Bears fan for life – bad or good or well … Teamwork and doing whatever else was necessary to help the team succeed is what stuck with me from these guys. Need a lineman to play halfback a few plays? Call on the Fridge. In the workplace, sometimes we will just need to do something else in favor of the success of the group as a whole.

The Geeky!

  • Bill Gates / Steve Jobs (2) – From garage to tech giant, these guys inspire with the dreams of success.
  • Steve Jones (1) – Steve was instrumental in getting me to write. He was helpful while also being honest. I have been able to develop into a better writer thanks to his help.
  • Gail Shaw (1) – Gail is a good friend. Another strong woman I have had the pleasure of getting to know. She is a great person to have a low key conversation with about SQL or about life. Gail knows her stuff and we can all learn something from her.
  • Paul Randal (1) – I have had the opportunity to chat with Paul on a few occasions. The most memorable comment from him was that we can all learn from each other. He learns from us and we can learn from him. Paul knows a lot about SQL Server and is still able to learn more. We can all continue to learn about SQL Server.
  • Kimberly Tripp (1) – Kimberly is a genius with stats and indexes. I have learned plenty from her from her presentations and articles.
  • Kalen Delaney (1) – One of my favorite things about Kalen is her ability to tease and know when it would be effective. Kalen is a person with whom I have enjoyed some great conversations. Are you looking for somebody that knows a boatload about SQL Server? Kalen should be at the top of that list.
  • Kevin Kline (1) – Kevin has unwittingly bestowed some great lessons on me. He has some great internals information on SQL Server. He also has some awesome personal development stuff that people could learn from him.
  • Brent Ozar (1) – Brent is very smart with SQL Server. Probably the greater inspiration to me is his energy. He is great with marketing and branding. He does a great job at appearing to be outgoing. That is a difficult thing for many in the IT field that would self-classify as an introvert.
  • Joe Sack (1) – Joe and Jonathan are going to be grouped together. I had the opportunity to work with both of them when working on my first book. I learned a lot about the writing process and some tips for just plain old making it better. I hope I can retain what I learned.
  • Jonathan Gennick (1) – same as Joe
  • Bob Ward (1) – between Bob, Paul and Ewald I learn so much about the internals of SQL Server. These guys are smart and love to play with the debugger. It makes me a bit jealous. I would love to have adequate time to just dive in with the debugger on a regular basis – maybe daily. There is a lot that can be learned from these three. Find their stuff and start learning. They push me to keep pushing harder to learn more about the internals.
  • Paul White (1) – only addition here is that I appreciated the late night conversations with Paul. He had the added advantage over Ewald and Bob in having a more direct impact in inspiring some of my internals dives.
  • Ewald Cress (1) – same as Bob.
  • Pat Wright (1) – Pat is a monster in the community. Pat runs user groups, organizes events and works to bring so many people together for greater learning opportunities. He does not limit his efforts to just the SQL Server community. Rather, he is looking at all data related communities.
  • Ben Miller (1) – I met Ben many years ago. He introduced me to a few little tidbits for SQL Server and it sparked a greater interest in me to dive deeper and just get better at what I do.
  • Wayne Sheffield (1) – Well, this big teddy bear helped Steve Jones with getting me down the path to writing. Wayne did it in a little bit of a different way but was somebody that helped inspire in me the belief that writing technical papers is something that I could do.
  • Kendra Little (1) – the technical insight and ability of Kendra is top shelf. What I like about Kendra’s community presence and work is the character she brings to it. Learning can be fun and witty and personal. It doesn’t have to be technical and dry all the time like so much of the content out there.
  • Jes Borland (1) – Talk about an amazing ball of energy! Oh and Jes is an awesome talent in the SQL community too. If you want to learn, take a minute or three with Jes.
  • Jennifer Moser (1) – Jennifer is simply amazing if you ask me. She can herd cats err data professionals like it is nothing. She does so much for the community and I would dare say that much of what she does goes completely unnoticed. If you come across her, tell her thanks. We can all learn a bit about working tirelessly for the betterment of a community from Jennifer.
  • Dwaine Camps (1) – Dwaine was a SQL Super Stud in my opinion. He was a great help in solving many technical puzzles and he loved to apply himself to those types of problems. For him, those puzzles were like deep dives for me. I learned a lot from Dwaine. Rest in Peace.
  • Jeff Moden (1) – Jeff is, rbar none, a top shelf MVP in my opinion. He is the juggernaut of high-performing T-SQL solutions.
  • Andy Leonard (1) – Friend, mentor, wise man with a goatee. Nuf said! Andy is an easy going person that taught me an important lesson about community. Sometimes taking a step back is a far better contribution to the community than holding on to everything with white knuckles.
  • Andy Warren (1) – Andy has been very influential for me. Reading his articles and talking to him, I have had the opportunity to understand the managerial presence a little better. I don’t really know how to explain that very well, but there is a calming presence and an understanding of staying on an even keel with whatever issue pops up. He has a way of looking at various issues, thinking about them, presenting them in a seemingly unbiased fashion and just being factual. Sometimes we can benefit from the approach of studying it out and not acting too rashly.
  • Robert Davis (1) – Robert is another one of those internals studs. Robert has been influential to me with some of his articles about how different features work. Again, this is an inspiration to dive into SQL Server to better understand how things are working.
  • Aaron Bertrand (1) – When I first met Aaron, I have to admit I was surprised that he knew who I was. This impressed me quite a bit. It tells me that this well known community giant takes the time to get to know the little guy and that everybody in SQLFamily is important. Maybe I can learn from that and work that much harder at remembering who people are (I am very weak at remembering people and faces).
  • Thomas LaRock (1) – Thomas is an interesting character on my list. This is not a bad thing at all. He is an interesting person. The most influential thing I have picked up from Thomas is his ability to weave a story while presenting. He is an amazing presenter in my opinion. He has an ability to teach through story telling that is difficult for me. It is certainly something I am striving to become better at doing.
  • Midnight DBAs (2) – Jen and Sean are the Midnight DBAs. I would call them friends as well. When I think about their influence, “Don’t sweat the small stuff” comes to mind. That doesn’t mean we need not take care of the small stuff, but rather sometimes we can have zero impact on certain things. All we can do is try to make our case and hope that people will accept our input as the SME.
  • Grant Fritchey (1) – To be honest, I don’t know why I put Grant on this list. Just kidding. I enjoy chatting with Grant. The nice thing here is that we can chat about things that are not always about SQL Server. If you have a recommendation, Grant is all over listening to you and determining if he can test it out. I don’t think I have ever seen him be dismissive to anybody except that one time to me. Yes, he will probably think about that one for a bit. It is a story that could be told some day.

The Wrap

Wow, what a list! That is a list of 96 influencers in my life. True, some names have been partly or entirely obscured, but the people are real. You will probably notice that I did not include any links to twitter profiles or blog sites. I am leaving it to you to google the person.

I have just shared roughly 96 points of data with you about my development into a data professional. I still have a long way to go as well. Oh, and because it is 96 points of data, it at least loosely meets the requirement of being about data.

Don’t see your name on this list? I really had far too many names for the 96 and I do realize that some people that have been really influential in my life did not make it to this list. I am sure all of us could find far more than just one or two people that have influenced us in life. If you are reading this post, I challenge you to come up with your list of at least 20 people that have influenced you. I bet you will be pleased with the self reflection.

T-SQL Tuesday #089: The Cloud and Job Security

Comments: No Comments
Published on: April 11, 2017

The Cloud

Today I am doing a quick entry for my participation points in the monthly blog party called TSQL Tuesday. I have missed the past few opportunities for various reasons. Today when I saw the topic, I wanted to post a few quick thoughts. If you are interested, the host this month is Koen Verbeeck (blog | twitter) and the invite can be found here.

Koen invites us to explore the cloud, whether it be a stormy cloud or a silver lined cloud. Either way, explore it and how it relates to you. Here are some of the examples Koen posted:

  • What impact has this had on your job?
  • Do you feel endangered?
  • Do you have more exciting features/toys to work with?
  • Do you embrace the change and learn new skills?
  • Or do you hide in your cubicle and fear the robot uprising?

I guess the answer for me is “it depends” – buahaha. Just kidding.

The Future Is Bright

I think the cloud is a good thing for the data professional, when done right. I do not believe there is anything to fear with it, so I definitely don’t feel endangered. That said, I do proceed cautiously toward the silver-lined puffs of water in the air. It’s not from fear, but more of a caution to ensure it is the correct move for the business need in question. I don’t believe the cloud is the right answer for all business needs but it is an appropriate solution for many business requirements.

I like to ensure my clients are well informed of what the choices are and the implications may be when deciding to move to the cloud. I like to make sure they understand that a move to the cloud is not a knee jerk decision – it takes planning and considerable effort in many situations. I also like to remind them that the cloud is really just another data center hosting their data. Granted, some offerings from vendors like Amazon and Microsoft do permit considerable flexibility and the opportunity to move quickly to new demands or interests.

Playground

For me, one of the biggest benefits of the cloud is the constantly evolving sandbox that I get to use to learn and grow (obviously that means I get to play a lot). I don’t have the resources at my disposal (and most clients don’t either) to be able to stand up a brand new environment from scratch for a POC quickly and efficiently. If I want to play around with Machine Learning (ML), then I can spin up an environment to help learn and evaluate my options. Should I decide I want to learn how to set up a multi-site, multi-node Windows cluster, I could spin up that environment very quickly and start learning with minimal hardware requirements on my part.

The cloud offers great learning potential for those interested. That said, it is obviously not free. There is cost for the cloud services and of course one must still invest personal time into the “sandbox” in order to learn the technology properly.

The Wrap

Personally, I see no threat from the cloud movement. Some may worry about the cloud automating them out of a job. The truth is, data professionals are always trying to automate things. Automation is not really entirely new and it seems there is always more to be automated.

The cloud offers new avenues to grow one’s career. The technology is getting more and more interesting. Is the cloud blowing past you and your career or are you riding the Jet Stream through the clouds and into your future?

T-SQL Tuesday #081: Recap

Comments: 4 Comments
Published on: August 18, 2016

Sharpen Something

In case you missed it (many did), TSQL Tuesday was a challenging event this month. I invited people to put a little more into writing a post than they usually might. There were some very good reasons for this. If you are interested, take a look back at the invite and see if you maybe want to give it a go outside the bounds of TSQL Tuesday. You can check out the original post here.

Before I get into the nitty gritty I have a confession. This topic was as much a reminder to myself as it was a challenge to others – a nudge to continue to drive and improve in various areas as I see fit.

There are other dirty little secrets too. Some may become apparent as you read through the recap.

Recap of the Event

One of the tricks to becoming and staying a top tier data talent or professional is a perpetual cycle of learning, adapting, changing, and evolving. We must be in a continual cycle of self evaluation and self modification. Let’s call this something else – we must be agile. There, I said the five letter word. Think about it in broad strokes with your career – it is a development process with perpetual evaluation, review, and tweaks.

Now think about the invite and see how that fits with what I just said or with the, cough cough, agile flow. You start (albeit very basically) with a need for enhancement, then you plan which pieces of the enhancement you can accomplish, you then do the work (whether successful or not), then after you deliver the work you conclude with a retrospective (what went well and what needs to change). Yes! I do feel rather dirty for sneaking this on everybody like this. That said, when you think about the model and apply it in broad strokes to your career path – it has merit.

Another way of viewing this is to think in terms of the following flowchart to help improve your personal mindset or maybe improve your personal mental power. The process is repetitive and follows a natural course. Once you have acted on some plan, you must review the performance and results and then gauge where your mindset needs to go from there to improve.

[Image: mindset_improvement_560 – a mindset improvement flowchart; the image map links to this month's participants: Jason Brimhall, Rob Farley, Steve Jones, Robert Davis, Kenneth Fisher, Kennie Nybo Pontoppidan, Mala Mahadevan, and Wayne Sheffield]

In other Words

Did you just look at the picture or did you explore the picture? If you hover over the picture, you will find there are links to this month’s participants. There were only eight, so not a ton of exploration is necessary.

Here are my thoughts on each of the posts submitted this month:

Wayne Sheffield (blog | twitter) – You can find his link in the big arrow that restarts the cycle. I put his link here because he ran into a ton of blockers during his experiment and he is at a spot of practically restarting – again. This is not the first time he has restarted in his quest to learn more about Availability Groups. Wayne fully admits he is deficient in AG and states near the end of the post that he had to humble himself going through this exercise. That is awesome! We could all use a little humility on a more regular basis.

Mala Mahadevan (blog | twitter) – You can find her link in the “Results” circle. The reason for this choice is that Mala discusses her midlife crisis – erm, career change. Mala held out for quite a while looking for just the right opportunity. When it came, she snatched it up. Along with that career change, she has implemented a plan to become more active in blogging and to learn more and more through various avenues. The increase in blogging and the ability to stick to her guns resulted in a new job/career she seems to be happy with at the moment.

Robert Davis (blog | twitter) – Robert found himself placed in the performance circle thanks to his article involving a third party backup utility that should be heavy on the performance side. Robert needed something interesting to push him to reacquaint himself with this tool. Once he found that project that required just a touch of ingenuity, performance and a way to avoid the GUI, Robert found himself right at home with a great solution for his environment.

Kennie Nybo Pontoppidan (blog | twitter) – Kennie landed in the Actions node mostly because he decided to take the challenge and act on his long time desire to get better at the new temporal features. To do that, he decided to read a book by Snodgrass which seriously sounded like something from Harry Potter to me. Kennie outlines a bunch of information that he learned from the book such as tracking time based data from either a transaction or valid-time perspective.

[Image: personal_growth_brain]

Kenneth Fisher (blog | twitter) – I placed this one into the behavior node. Maybe it is a bit of a stretch, but it seems to make sense since he discussed some behavioral differences between Azure DB and SQL Server. Things just do not work exactly the same between the two. You will need to understand these differences if you find yourself in a spot where you must work with both.

Steve Jones (blog | twitter) – When looking through the image, you will find that Steve landed solidly in the mindset node. When I read his contribution, I got the full impression that his mind was 100% in the right place. He set out to learn something and try to get better at it. Additionally, he blogged about a topic that is near and dear to me – Extended Events. Have I mentioned before that I have a lot of content about XE? You can read a bunch of it here. Like Wayne, Steve was humble near the end of his article. He notes that he was clumsy as he started working with XE but that he is glad he did it as well. Read his article. He gave me a great idea of another use for XE and I am sure it may sound good to you too!

Rob Farley (blog | twitter) – I planted Rob firmly on the attitude node. It seems clear to me that Rob had loads of attitude throughout his article about Operational Analytics. The attitude I perceived was that of humility and yearning. Rob feels like he has a lot to learn and his attitude is in the right place it seems to keep him going while he tries to learn more in the field of Operational Analytics.

My Contribution link can be found by clicking on any spot in the image that is not already described. I wrote about my experiences with trying to pick up a little on JSON.

That is a wrap of all eight contributions. If you did not contribute this month, I recommend that you still try to do something with the challenge issued with this month’s TSQL Tuesday.

Edit: Added links to the articles with each person’s name in the event this page is being viewed with Firefox. There seems to be an issue with the links in the image map within Firefox.

T-SQL Tuesday #081: Getting Sharper

Comments: No Comments
Published on: August 16, 2016

Sharpen Something

This month I am the host of the TSQL Tuesday blog party. In the invite, which can be read here, I asked people to decide on something to work on, plan it out and then report the success/failure.

Not only am I the host, but I am also a participant this month. In my invite (and the reminder) I provided a few examples of what I was really looking for from participants this month. It became apparent that the topic may have been overthought. So, for my contribution, I decided to do something extremely simple.

There is so much about SQL Server that it would not be feasible nor should it be expected that one single person should know everything about the product. That said, within SQL Server alone, all of us have something to learn and improve upon within our skill-set. If we extend out to the professional development realm, we have even more we can explore as a skill sharpening experiment for this month.

I am going to keep it strictly within the SQL Server realm this month. I have chosen to develop my skills a little more with the topic of JSON. I should be an expert in JSON, but since it is spelled incorrectly – maybe I have something to learn. That said, I really do love being in the database now – haha.

JSON

Let’s just get this out there right now – I suck at JSON. I suck at XML. The idea of querying a non-normalized document to get the data is not very endearing to me. It is for that reason that I have written utilities or scripts to help generate my XML shredding scripts – as can be seen here.

Knowing that I have this allergy to features similar to XML, I need to build up some resistance to the allergy through a little learning and a little practice. Based on that, my plan is pretty simple:

  1. Read up on JSON
  2. Find some tutorials on JSON
  3. Practice using the feature
  4. Potentially do something destructive with JSON

With that plan set before me, it is time to sharpen some skills and then slice, dice, and maybe shred some JSON.

Sharpening

Nothing in this entire process was actually too terribly difficult. That is an important notion to understand. My plan was very lacking in detail and really just had broad strokes. This helps me to be adaptable to changing demands and time constraints. I dare say the combination of broad strokes and a very limited scope also allowed me an opportunity for easier success.

Researching JSON was pretty straightforward. This really meant a few Google searches. There was a little bit of time spent reading material from other blogs, a little bit from BOL and a little bit from MSDN. Nothing extravagant here. I did also have the opportunity to review some slides from a Microsoft presentation on the topic. Again, not terribly difficult or demanding in effort or time requirement. This research covers both steps one and two in the plan.

Now comes the more difficult task. It was time to put some of what had been seen and read to practice. A little experimentation was necessary. I have two easy enough looking examples that I was able to construct to start experimenting with in my learning endeavors.

Here is the first example. This is a bit more basic in construct. (Updated to use an image since the JSON was messing with the RSS feed and causing malformed XML.)

[Image: json_xmpl – the example JSON query]
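Since the original snippet only survives as an image, here is a minimal sketch of the kind of basic query being described, assuming SQL Server 2016 and a made-up JSON document with made-up column names:

DECLARE @json NVARCHAR(MAX) = N'{
    "DBA": { "Name": "Jason", "Specialty": "Extended Events", "YearsExperience": 20 }
}';

-- Shred the JSON object into a tabular result using an explicit schema.
SELECT  j.Name,
        j.Specialty,
        j.YearsExperience
FROM OPENJSON(@json, '$.DBA')
WITH (
    Name            NVARCHAR(50) '$.Name',
    Specialty       NVARCHAR(50) '$.Specialty',
    YearsExperience INT          '$.YearsExperience'
) AS j;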

And some basic results:

[Image: basic_json – results of the basic JSON query]

Pretty slick. Better yet is that this is many times easier than XML.

How about something a little different like the following:

Admittedly, this one is a bit more of a hack. In my defense, I am still learning how to work with this type of stuff. At any rate, I had an array of values for one of the attributes. The kludge I used reads up to three values from that array and returns those values as individual attributes. I am still learning in this area, so I can live with this for now.

[Image: array_json – results of the array example]
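For reference, a rough sketch of that kind of kludge (again with hypothetical JSON and column names) might read up to three array elements into separate columns like so:

DECLARE @json NVARCHAR(MAX) = N'{
    "DBA": { "Name": "Jason", "Certifications": ["MCM", "MCSE", "MVP"] }
}';

-- JSON_VALUE with an array index pulls a single element;
-- a missing element simply comes back as NULL.
SELECT  JSON_VALUE(@json, '$.DBA.Name')              AS Name,
        JSON_VALUE(@json, '$.DBA.Certifications[0]') AS Cert1,
        JSON_VALUE(@json, '$.DBA.Certifications[1]') AS Cert2,
        JSON_VALUE(@json, '$.DBA.Certifications[2]') AS Cert3;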

The last part of the plan involved doing something destructive. Why? Well just for the fun of it. I was unable to get to this stage but it is still in the plans.

Report on The Successes and Failures

 

I have written about some of the successes and failures along the way thus far. Overall, I would rate this a successful endeavor. The big reason for it being a success is because I do feel I learned more about JSON within SQL Server than I had prior to the experiment.

Taking a bite sized chunk of learning and acting on it sure makes it a lot easier to learn a new concept or to learn more about such a vast topic such as SQL Server.

*Note: This is a late publish because the post did not auto-post. It is a tad late, but I discovered it as I was prepping the roundup.

T-SQL Tuesday #081: Sharpen Something – Reminder

Comments: No Comments
Published on: August 2, 2016

Sharpen Something

Last week I sent out the invite for the August TSQL Tuesday blog party. In that invite I promised to send out a reminder seven days prior to the event. Well – it is that time.

You are cordially invited to read the invite for TSQL Tuesday 81 and plan your entry for the party.

In the invite, I shared the details for the event including examples of what I am looking for as an entry for the event.

I hope we will be seeing you next Tuesday, August 9th in attendance at this month’s party. I am sure it will prove to be an interesting experience one way or another.

Bonus Example

In the original invite I provided a list of examples of what one could do for this TSQL Tuesday. Today, I am providing one more example in a slightly different format. Recall that the invite requested that participants set out to accomplish something, make a plan and report on that “goal”, the plan, and the outcome.

So, let’s say I have discovered that I write too much in the passive voice. Based on that information, I would like to overcome the passivity in my writing voice; therefore, my goal would be to learn how to write more assertively (less passively). In order to accomplish that goal, I may need to read up on the topic and learn exactly what it means to write passively. Then I would need to examine articles that I have written. And then I would need to practice writing more assertively. After all of that is done, I may have somebody (or something) analyze a brand new article or two to determine if I have achieved my desire.

After having executed on that plan, I will write about the experience including what the initial goal and plan were and also on what worked or didn’t work while trying to reach that goal. To summarize, here is an outline of that example:

What I will Accomplish

I will learn how to write more assertively (or simply: write more assertively)

How Will I do that

Research what it means to write passively

Research what it means to write assertively

Evaluate “assertively” written articles

Take Notes on how to write assertively

Evaluate my articles

Practice writing assertively

Write a new article and have it reviewed to judge the voice whether it seems too passive or not

Report on The Successes and Failures

Write whether each step succeeded or failed.

Write if a step was unnecessary

Write about the experience and your thoughts on it.

Did you achieve or fail overall?

What is T-SQL Tuesday?

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements below. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

How to Participate

  • Your post must be published between 00:00 GMT Tuesday, August 9th, 2016, and 00:00 GMT Wednesday, August 10th, 2016.
  • Your post must contain the T-SQL Tuesday logo from above and the image should link back to this blog post.
  • Trackbacks should work. But, please do add a link to your post in the comments section below so everyone can see your work.
  • Tweet about your post using the hashtag #TSQL2sDay.

T-SQL Tuesday #081: Sharpen Something

Comments: 11 Comments
Published on: July 27, 2016

Sharpen Something

It has now been 30 months since the last time I hosted a TSQL Tuesday; that was TSQL Tuesday 51. I recapped that event here with the original invite here. I can’t believe it has been that long since I last hosted. It only seems like yesterday.

Coming into the present day, we are now at TSQL Tuesday 81. For this month, I would like to try and up the ante a bit. Usually we only get about a week’s notice prior to the event to think about the article to write for the event.

This time, I want to invite everybody just a little bit sooner and will follow up with a reminder seven days prior to the event. The reason I want to do this is that I think this may be a touch more difficult this time.

 

This month I am asking you to not only write a post but to do a little homework – first. In other words, plan to do something, carry out that plan, and then write about the experience. There is a lot going into that last sentence. Because of that, let me try to explain through a few examples of what I might like to see. Hopefully these examples will help you understand the intent and how this month’s topic relates to “Sharpening Something“.

EXAMPLES

  1. You have learned about a really cool feature called Azure DevTest Lab. Having heard about it, you wish to implement this feature to solve some need in your personal development or corporate environment. Develop a plan to implement the feature and tell us the problem it solves and about your experiences in getting it to work from start to end. An example of how I might try to use this might involve the creation of a disposable and easy setup environment for Precons, Workshops, and various other types of training.
  2. There is a really awesome book about SQL Server you heard about and you decided to buy it. Plan to sit down and read the book. Take a nugget or two from the book and tell us how you can use that nugget of information within your personal or professional environment.
  3. You know you are extremely deficient at a certain SQL Skill. Tell me what that skill is and develop a plan to get better at that skill. Report on the implementation of this skill and how you are doing at improving. Maybe that skill is about Extended Events, PoSH or availability groups.
  4. Similar to the skill deficiency, you know you do not understand a certain concept within SQL Server as well as you feel you should. Maybe that concept is indexing or statistics (for example). Create a two week plan to become more proficient at that concept. Follow that plan and report on your progress.

To recap, this is an invite to make a short-term goal covering the next two weeks. Tell everybody what that goal is (in your TSQL Tuesday post, of course), how you went about creating a plan for that goal, and how you progressed during the two-week interval.

What is T-SQL Tuesday?

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements below. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

How to Participate

  • Your post must be published between 00:00 GMT Tuesday, August 9th, 2016, and 00:00 GMT Wednesday, August 10th, 2016.
  • Your post must contain the T-SQL Tuesday logo from above and the image should link back to this blog post.
  • Trackbacks should work. But, please do add a link to your post in the comments section below so everyone can see your work.
  • Tweet about your post using the hashtag #TSQL2sDay.

SQL Server Desired Enhancements

Happy Belated Birthday

The monthly Data Professionals blog party has come and gone. It happens the second Tuesday of every month – or at least is supposed to happen on that day. This month, the formidable Chris Yates (blog | twitter) has invited everybody to a birthday party – of sorts. As with many birthdays, there is always somebody that wishes you a happy belated birthday. For this party, it is my turn to offer up that belated birthday. It just so happens there was some coordination between Chris and myself for this belated birthday.

Read all about the invite from Chris’ blog. If you missed the link, here it is again – right here.

Plastic Surgery – Desired Enhancements

SQL Server 2016 has come with a ton of cool features, bells, whistles and well cool stuff (yes redundant). That aside, what are some of the really cool features that I would love to see in SQL Server? Let’s run through them (And yes, I will be a bit greedy. It is standard operating procedure when asking for gifts, right?).

Gift #1

I need some way of being able to reproduce a production database cleanly and efficiently in a different environment. Sure, I can script everything and develop an elaborate process to ensure I got an exact duplicate of the stats, stat steps, stats histogram, schema, procedures, indexes, etc etc etc. Being able to do all of that cleanly and efficiently is the key. This is a pretty big want from clients and could be extremely useful.

Microsoft has heard the pleas. Introduced with SP2 for SQL Server 2014 there is a new DBCC statement to do exactly that – DBCC CloneDatabase. Check out all the details here.
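As a quick sketch (the database names here are made up), the statement itself is a one-liner:

-- Creates a schema-and-statistics-only copy of the source database;
-- available starting with SQL Server 2014 SP2.
DBCC CLONEDATABASE (ProdSales, ProdSales_Clone);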

Gift #1, let’s check that off the list.

Gift #2

Instant File Initialization is fantastic and a huge time saver. Unfortunately, this only applies to the data files. We need something like this implemented for the transaction log. Currently the transaction log must be “zeroed” or 0-stamped when new space is allocated. This mechanism can delay transactions and impact performance when a required file growth occurs, when manually growing the file, or even when restoring the database.

Believe it or not, Microsoft has addressed this request as well. Microsoft has changed how the transaction log is stamped for a significant performance improvement. This is a part of SQL Server 2016. Bob Dorr explains it very well in his blog post on the topic. You can read his blog post here.

Wow, two for two. We can check gift #2 off the list.

Gift #3

Availability Groups seem to get bogged down under heavy load. The redo and log send queues seem to get backed up and can have a significant impact on production operations. We need the log transport to be faster. No, check that. Not just faster – it needs to be 2-3x faster.

SQL Server 2016 comes to the rescue again. Amped up on SQL Steroids, Availability Groups has seen a significant improvement in log transport speeds to the secondaries. Some report it as at least twice as fast. The bottleneck has been moved out of the SQL Engine and it has really amped things up from a performance perspective. Here is a supporting article by Jimmy May on the topic – though it doesn’t go deep into the specifics.

Mark another one off the gift registry. Think we can maintain this pace?

Gift #4

Statistics seem to become stale for smaller tables, which dramatically affects the performance of certain queries. These tables will not see 20% of their rows updated in the leading edge any time before the turn of the year, but they would likely change within the six months following the turn of the new year. We need to be able to force these stats to auto-update more regularly without extra intervention.

Fair enough, we already have a trace flag that can help with that (TF 2371). Maybe the environment or management is resistant to having trace flags implemented for something such as this. You never know what the political red tape may dictate.
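For reference, enabling the flag on an instance where it is acceptable is a quick change (shown only as a sketch; weigh it against your own environment and that red tape):

-- Enable trace flag 2371 globally (-1 = instance wide) until the next restart.
-- On SQL Server 2016 and later this behavior is already on by default.
DBCC TRACEON (2371, -1);

-- To persist the flag across restarts, add -T2371 as a startup parameter instead.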

SQL Server 2016 to the rescue again!!! SQL Server 2016 has this trace flag enabled automatically. You don’t need to do anything extra special. What this means is that those stats on the smaller tables may actually get updated without intervention despite the lack of change to the rows in the table.

That is four for four. Should we take this birthday party to Vegas? Don’t assume I have stacked the deck either! ;0)

Gift #5

I am getting very frustrated with the constant clearing of usage stats every time I rebuild an index. Just because I rebuild the index, it does not mean that I no longer need the usage stats from prior to the index rebuild. I need to be able to see the usage stats for the time spanning before and after the rebuild without creating a custom process to capture that information. Sure, it may not be an insanely difficult task to perform, but it is an extra process I have to build out. It’s the principle of the matter.

SQL Server 2016 to the rescue yet again. This age old bug of usage stats being cleared is finally fixed. It is frustrating to say the least to have to deal with this kind of bug. It is a huge relief to have it fixed and be able to get a consistent clear picture of the usage information since the server has been up.

For more information, you could read this article by Kendra Little – here.
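A simple sketch of pulling those usage stats (run it in the database of interest) looks something like the following; prior to the fix, a rebuild would reset these counters for the rebuilt index:

SELECT  OBJECT_NAME(ius.object_id) AS TableName,
        i.name                     AS IndexName,
        ius.user_seeks,
        ius.user_scans,
        ius.user_lookups,
        ius.user_updates,
        ius.last_user_seek
FROM    sys.dm_db_index_usage_stats AS ius
        INNER JOIN sys.indexes AS i
            ON  i.object_id = ius.object_id
            AND i.index_id  = ius.index_id
WHERE   ius.database_id = DB_ID();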

Cha-ching. We are now five for five.

Gift #6

Digging a little deeper on this one. I would really love to see an enhancement to Resource Governor. Not just any enhancement will do. I need it to be enhanced so it will also affect the reporting services engine and the integration services engine in addition to the database engine. I want to be able to use RG to prevent certain reports from over consuming resources within the SSRS engine. Or for that matter, I want to make sure certain SSIS packages do not consume too much memory. If I can implement constraints on resources for these two engines it would be a huge improvement.

We will have to wait for a while on this one. It is currently not scheduled for delivery.

Gift #7

This one is going to be a little tougher. It’s not in place. It would be a fantastic gift in my opinion. I would like some tool such as Extended Events to be able to monitor the workload and determine the best recommended trace flags to implement. There are many trace flags that are far from well known but could be extremely helpful to production environments based on the workload and internal workflow. Not all trace flags are built for all environments. An analysis through some automated tool for best recommended flags to implement (again solely at your discretion) would be fantastic.

Gift #8

Get Profiler out of Management Studio finally. Enough said there. There really is no good solid reason in my opinion to keep it around. It is deprecated. It is hardly helpful with 2014 or 2016 and it is just dead weight. Extended Events really is the better way to go here.
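For anybody still reaching for Profiler out of habit, a comparable lightweight trace in Extended Events is only a few lines (a sketch; the session name is made up, and the results can be watched live in SSMS):

CREATE EVENT SESSION [QuickTrace] ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.client_app_name, sqlserver.username, sqlserver.database_name)
)
ADD TARGET package0.ring_buffer
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION [QuickTrace] ON SERVER STATE = START;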

Last Request

 

Can we please fix the spelling of JSON? It really needs to be spelled correctly. That spelling is: JASON.

Awesome SQL Server Feature

The second Tuesday of April 2016 is now upon us and you know what that means. Well, I hope you know what that means.

It is time for TSQL Tuesday. It is now the 77th edition of this monthly blog party. This month the host is Jens Vestergaard (blog | twitter) and he insists we do a little soul searching to figure out what about SQL Server really makes our hearts go pitter patter. Ok, so he didn’t really put it that way but you get the point, right? What is it about SQL Server that ROCKS in your opinion?

Well, I think there are a lot of really cool features in SQL Server that ROCK! It really is hard to pick just one feature because there are a lot of really good features that can make life so much easier as a database professional. Then again, there is that topic that bubbles to the top in my articles – a lot. If you haven’t followed my blog, here is a quick clue: click here.

Why is this feature so AWESOME?

Truth be told, there are a ton of reasons why I really like it. Before diving into the why, I need to share an experience.

A client using Microsoft Dynamics AX to manage the Point of Sale (POS) systems for their retail chain has been running into a problem with the POS database at each store. Approximately a year ago, this client had upgraded most of the store databases from SQL Server Express Edition to Standard Edition due to the size restriction of the Express Edition. This SKU upgrade was necessary because the database had grown to exceed 10GB. Most of this growth was explicitly related to the INVENTDIM table consuming 3.5GB of space in the data file.

Right here, you may be asking what the big deal is. Just upgrade the SKU to Standard Edition and don’t worry about the size of the database. I mean, that is an easy fix, right? Sure, that may be perfectly acceptable in an environment with one or maybe even a handful of servers. Imagine a retail chain with more than 120 stores and a database at each store. Now extrapolate Standard Edition licensing costs for all of those stores. Suddenly we are talking a pretty big expense just to upgrade. All of that just because one table chews up 35% of the size limitation of a data file in SQL Server Express Edition.

What if there was an alternative with SQL Express to mitigate that cost and maintain the POS functionality? Enter the SYNONYM! You may recall from a previous post a thing or two that I have said about synonyms in SQL Server. There is good and bad to be had with this feature and most of the bad comes from implementation and not the feature itself.

Using a synonym, I can extend this database beyond the 10GB limitation – or at least that is the proposed theory. To make this work properly, the plan was to create a new database, copy the INVENTDIM table from the POS database to this new database, rename the old INVENTDIM table in the POS database, create a synonym referencing the new table in the new database, and then select from the table to confirm functionality. Sounds easy right? Here is the script that basically goes with that set of steps.
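The original script is not reproduced here, so the following is only a rough sketch of those steps. The new database name (AxRetailDIM) comes from an error message later in this post; the POS database name (AxPOS) and the trimming predicate are hypothetical.

-- 1. Create the overflow database and copy the table into it.
CREATE DATABASE AxRetailDIM;
GO
SELECT *
INTO   AxRetailDIM.dbo.INVENTDIM
FROM   AxPOS.dbo.INVENTDIM;
GO

USE AxPOS;
GO
-- 2. Rename the original table out of the way.
EXEC sp_rename 'dbo.INVENTDIM', 'INVENTDIM_OLD';
GO
-- 3. Point a synonym at the copy in the new database.
CREATE SYNONYM dbo.INVENTDIM FOR AxRetailDIM.dbo.INVENTDIM;
GO
-- 4. Additional step mentioned below: trim rows not tied to an actual
--    inventory item/barcode for this store (the predicate is hypothetical).
-- DELETE FROM AxRetailDIM.dbo.INVENTDIM WHERE <no matching item/barcode>;

-- 5. Confirm the synonym resolves as expected.
SELECT TOP (10) * FROM dbo.INVENTDIM;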

This seems to make a fair amount of sense. Querying the INVENTDIM synonym produces results just as would be expected. Notice that there is one additional step in the script which I did not mention. That step removes unnecessary rows from the INVENTDIM table based on an actual inventory item or barcode for the particular dimension variant related to the item. This helps to trim the table to the specific rows related to items available for purchase at that retail store. In addition, it serves as a failsafe to get the data down to less than 10GB in case of failure with the synonym.

Testing from within SQL Server proved very optimistic. The synonym was working exactly as desired. Next up was to test the change by performing various transactions through the POS.

The solution not only failed, it failed consistently and dramatically. It didn’t even come close. How is this possible? What is Dynamics AX doing that could possibly subvert the synonym implementation? Time to start troubleshooting.

I checked through the logs. Nothing to be found. I checked and validated permissions. No Dice! I checked the ownership chaining. Still no dice! What in the world is causing this failure?

What if I switch to use a view instead of a synonym? I created a view with cross database ownership chains intact. I tested the application again and it still failed. What if I use a synonym pointed to a table in the same database? Testing from the application, all of a sudden we have success. Now the head-scratching gets a little more intense.

It is time to get serious. What exactly is the Dynamics AX POS application doing that leads to a failure that does not happen when we query directly from within Management Studio? The means to get serious is to now implement that awesome tool I alluded to previously – Extended Events (XE or XEvents).

With no clues available from any of the usual sources (including application error messages), XE or Profiler was about the only thing left to try to capture the root cause of this failure. Since this happens to be a SQL Server 2014 implementation (yeah, I omitted that fact), the only real option in my opinion was to use XE. Truth be told, even on SQL Server 2008 R2, my go-to tool is XE. In this case, here is what I configured to try and catch the problem:
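That session script is not reproduced here, but a minimal sketch of the sort of session used would be an error_reported session along these lines (the session and file names are made up; on 2008/R2 the file target is named package0.asynchronous_file_target instead):

CREATE EVENT SESSION [AxPOSErrors] ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.client_app_name,
            sqlserver.client_hostname,
            sqlserver.database_name,
            sqlserver.sql_text,
            sqlserver.username)
    WHERE ([severity] >= 11)   -- skip informational messages
)
ADD TARGET package0.event_file (SET filename = N'C:\Database\XE\AxPOSErrors.xel')
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION [AxPOSErrors] ON SERVER STATE = START;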

With the session running, I had the POS tests begin again. Bang! It failed again, but I expected it and wanted it to fail again. This time around, finding the problem turned out to be really easy. As soon as the error hit, I was able to check the trapped events and see what it was that had been missing and ultimately causing this string of failures.

[Image: xe_trappederror_ax – the trapped error shown in the XE session results]

Using the GUI (yeah, a rare occasion for me with XE), I filtered the displayed events down to just those pertinent to the problem, making it easier to see what these tests had found. Here is the highlighted text a little larger and easier to see:

Snapshot isolation transaction failed accessing database ‘AxRetailDIM’ because snapshot isolation is not allowed in this database. Use ALTER DATABASE to allow snapshot isolation.

Wow! Light bulb shines bright and the clue finally clicks. The POS databases for this client are all set to allow snapshot isolation. Since this error is coming at the time when the failure occurs in the application, it stands to reason that this is the root cause. Time to test by changing the snapshot isolation setting.
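The change itself is a one-liner (the database name comes straight from the error message above):

-- Allow snapshot isolation in the overflow database referenced by the synonym.
ALTER DATABASE AxRetailDIM SET ALLOW_SNAPSHOT_ISOLATION ON;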

That is a quick change and easy enough to test again. With the XE session still running, and the change in effect, it’s time to test via the POS application again. As I expected, the application is working now. This is good news! Time to test again and again and again to make sure it wasn’t a fluke that it worked and that it was only going to work just the once.

Not a single failure after the change to allow snapshot isolation. One small change with such a big impact and so few clues to be found except in that super Awesome Super Hero feature of SQL Server called Extended Events!

Being able to quickly find the root cause of so much pain is why I enjoy working with the Extended Events feature. It is an efficient way to find a ton of information while causing little overhead to the server.

The bonus here is that XE allowed us to pinpoint a problem with the proposed solution to help save costs while extending a database beyond the 10GB limitation of SQL Express.

Note: I left some notes in the XE session script. These notes help to point out differences between implementing this particular session on SQL Server 2012 (or later) and SQL Server 2008 (or R2).

All about the Change

Comments: 1 Comment
Published on: January 12, 2016

The second Tuesday of January 2016 is now upon us and you know what that means. Well, I hope you know what that means.

It is time for TSQL Tuesday. It is now the 74th edition of this monthly blog party. This month the host is Robert Davis (blog | twitter) and he has asked us to “Be the change”. Whether the inspiration for this topic is the new year and resolutions, or Gandhi (you must be the change), or Caddyshack (be the ball), we will be discussing “Change.”

Specifically, Robert requested that we discuss data changes and anything relating to data changes. Well, I am going to take that “anything” literally and stretch the definition of changing data just a bit. It will all make sense by the end (I hope).

Ch-ch-changes

Changes happen on a constant basis within a database. Data will more than likely be changing. Yes, there are some exceptions to that, but the expectation that data is changing is not an unreal expectation.

Where that expectation becomes unwanted is when we start talking about the data that helps drive the configuration of the server. Ok, technically that is a setting or configuration option or a button, knob, whirlygig or thingamajig. Seldom do we really think about these settings as data. Think about it for a moment though. We can certainly derive some data about these changes (if these settings themselves are not actually data).

So, while you may call it settings changes, I will still be capturing data about the changes. Good? Good! Another term for this is auditing. And auditing applies to all levels including ETL processes and data changes etc. By that fortune, I just covered the topic again – tangentially.

How does one audit configuration changes? Well, there are a few different methods to do this. One could use a server side trace, SQL Audit, Extended Events or (if somebody wants to) a custom solution not involving any of those, using some sort of variation of T-SQL and error log monitoring. The point is, there are options. I have discussed a few options for the custom solution path as well as (recently published article using…) the default trace path. Today I will dive into what it looks like via SQL Audit.

When creating an audit to figure out what changes are occurring within the instance, one would need to utilize the SERVER_OPERATION_GROUP audit action group (a sample audit definition follows the list below). This action group provides auditing of the following types of events:

  • Administer Bulk Operations
  • Alter Settings
  • Alter Resources
  • Authenticate
  • External Access
  • Alter Server State
  • Unsafe Assembly
  • Alter Connection
  • Alter Resource Governor
  • Use Any Workload Group
  • View Server State
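
As a point of reference, here is a minimal sketch of an audit built on this action group. The audit name, file path, and options shown are illustrative assumptions rather than a prescribed configuration.

USE master;
GO
-- hypothetical audit writing to a file target; adjust the name, path, and options to taste
CREATE SERVER AUDIT [ServerOperationAudit]
TO FILE (FILEPATH = N'C:\Database\Audits\', MAXSIZE = 50 MB, MAX_ROLLOVER_FILES = 10)
WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
GO
-- bind the SERVER_OPERATION_GROUP action group to the audit and enable both pieces
CREATE SERVER AUDIT SPECIFICATION [ServerOperationAuditSpec]
FOR SERVER AUDIT [ServerOperationAudit]
ADD (SERVER_OPERATION_GROUP)
WITH (STATE = ON);
GO
ALTER SERVER AUDIT [ServerOperationAudit] WITH (STATE = ON);
GO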

From this group of events, we can guess at the types of actions that might trigger one of these events to fire for the audit. Some of the possible actions would be:

  • Issue a bulk administration command: BULK INSERT TestDB.dbo.Test1 FROM 'c:\database\test1.txt';
  • Issue an alter connection command: KILL 66
  • Issue an alter resources command: CREATE RESOURCE POOL PrimaryServerPool WITH (...)
  • Issue an alter server state command: DBCC FREEPROCCACHE
  • Issue an alter server settings command: run sp_configure followed by RECONFIGURE
  • Issue a view server state command: SELECT * FROM sys.dm_xe_session_targets;
  • Issue an external access assembly command: CREATE ASSEMBLY SQLCLRTest FROM 'C:\MyDBApp\SQLCLRTest.dll' WITH PERMISSION_SET = EXTERNAL_ACCESS;
  • Issue an unsafe assembly command: CREATE ASSEMBLY SQLCLRTest FROM 'C:\MyDBApp\SQLCLRTest.dll' WITH PERMISSION_SET = UNSAFE;
  • Issue an alter resource governor command: ALTER RESOURCE GOVERNOR DISABLE
  • Authenticate: no distinct example; a view server state (VSST) type record occurs for authentication events
  • Use any workload group: see Resource Governor
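
To see the “Alter Settings” behavior discussed below firsthand, a settings change can be generated with something like the following; the option toggled here is an arbitrary, harmless example.

-- an arbitrary sp_configure change used to generate an Alter Settings audit record
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
-- the audit captures the RECONFIGURE, but not which option was changed
EXEC sp_configure 'show advanced options', 0;
RECONFIGURE;
GO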

This is quite a bit of interesting information. All of these events can be audited from the same audit group. The interesting ones of the bunch are the ones that indicate some sort of change has occurred. That covers everything except the “Authenticate”, “View Server State” and “Use Any Workload Group” events, though even those could be stretched to say that something has changed as well.

With all of that in mind, I find the “alter server settings” event to be the most problematic. While it does truly capture that something changed, it does not completely reveal what was changed – just that a reconfigure occurred. If a server configuration has changed, I can capture the spid and that reconfigure statement – sure. Once that is captured, I still have to do something more to figure out which configuration was “reconfigured”. This is highly frustrating.

Here’s an example from the audit I created:

(screenshot: audit results for the alter server settings event)

This is only a small snippet of the results. I can see who made the configuration change, the time, the spid, the source machine, etc. I just miss that nugget that tells me the exact change that was made. At least that is the case with changes made via sp_configure. There are fixes for that – as previously mentioned.

Here is another downside. If you have the default trace still running, a lot of this information will also be trapped in that trace. Furthermore, some of the events may be duplicated via the object_altered event session (e.g. the resource governor events). What does this really mean? Extra tracing and a bit of extra overhead. It is something to consider. As for the Extended Events equivalents and how to do this sort of thing via XE, I will explore that further in a future post.

Suffice it to say that, while not a complete solution, the use of SQL Audit can be viable to track the changes that may be occurring within your SQL Server – from a settings point of view.

Auditing Needs Reporting

Comments: No Comments
Published on: October 13, 2015

Welcome to the second Tuesday of the month. And in the database world of SQL Server and the SQL Server community, that means it is time for TSQL2SDAY. This month the host is Sebastian Meine (blog / twitter), and the topic that he wants us to write about is: “Strategies for managing an enterprise”. Specifically, Sebastian has requested that everybody contribute articles about auditing. Auditing doesn’t have to be just “another boring topic”; it can be interesting, and there is a lot to it.

Just like last month, I will be doing a real quick entry. I have been more focused on my 60 Days of Extended Events series and was looking for something that ties into both topics really well without being covered directly in the series. Since I have auditing scheduled for later in the series, I was hoping to find something that meets both the XE topic and the topic of auditing.

No matter the mechanism used to capture the data to fulfill the “investigation” phase of the audit, if the data is not analyzed and reports generated, then the audit did not happen. With that in mind, I settled on a quick intro on how to get at the audit data in order to generate reports.

Reporting

An audit can cover just about any concept, phase, or action within a database. If you want to monitor and track performance and decide to store various performance metrics, that is an audit for all intents and purposes. If you are more interested in tracking the access patterns and sources of the sa login, trapping and storing that data would also be an audit. The data is different between the two, but the base concept boils down to the same thing: data concerning the operations or interactions within the system is being trapped and recorded somewhere.

That said, it would be an incomplete audit if all that is done is to trap the data. If the data is never reviewed, how can one be certain the requirements of that particular data trapping exercise are being met? In other words, unless the data is analyzed and some sort of report is generated from the exercise, it is pretty fruitless and just a waste of resources.

There is a plenitude of means to capture data to create an audit. Some of those means were mentioned in Sebastian’s invite to the blog party. I want to focus on just two of them because of how closely they are related – SQL Server Audits and Extended Events. And as I previously stated, I really only want to get into the how behind getting to the audit data. Once the data can be retrieved, generating a report is bound only by the imagination of the intended consumer of that report.

SQL Server Audits

Auditing from within SQL Server is a feature that was introduced at the same time as Extended Events (with SQL Server 2008). In addition to being released at the same time, some of the audit metadata is recorded alongside the XEvents metadata. Even some of the terminology is the same. Looking deep enough, one can even find all of the targets for Audits listed within the XEvents objects.
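
For the curious, one way to peek at that overlap is to query the XEvents metadata for the audit (SecAudit) package. This is a quick sketch; the package name filter reflects my understanding of how the audit objects are registered, so adjust as needed.

-- list the audit-related targets that surface in the XEvents metadata
SELECT xp.name AS package_name
    , xo.name AS target_name
    , xo.description
FROM sys.dm_xe_objects xo
    INNER JOIN sys.dm_xe_packages xp
        ON xo.package_guid = xp.guid
WHERE xo.object_type = 'target'
    AND xp.name = 'SecAudit';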

Speaking of Targets, looking at the documentation for audits, one will see this about the Targets:

The results of an audit are sent to a target, which can be a file, the Windows Security event log, or the Windows Application event log. Logs must be reviewed and archived periodically to make sure that the target has sufficient space to write additional records.

That doesn’t look terribly different from what we have seen with XEvents thus far. Well, except for the addition of the Security and Application Event Logs. But the Target concept is well within reason and what we have become accustomed to seeing.

If the audit data is being written out to one of the event logs, it is reasonable to expect that one knows how to find and read it. The focus today will be strictly on the file target, with some very basic examples.

I happen to have an Audit running on my SQL Server instance currently. I am not going to dive into how to create the audit. Suffice it to say the audit name in this case is “TSQLTuesday_Audit”. This audit is being written out to a file with rollover. In order to access the data in the audit file(s), I need to employ a function (which is strikingly similar to the function used to read XE file targets) called fn_get_audit_file. The name is simple and task oriented – making it pretty easy to remember.

Using the audit I mentioned and this function, I would get a query such as the following to read that data. Oh, and the audit in question is set to track the LOGIN_CHANGE_PASSWORD_GROUP event.
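
A minimal version of that query looks something like the following; the file path is an assumption and should point at wherever the TSQLTuesday_Audit rollover files are actually written.

-- read the audit rollover files; the path shown is hypothetical
SELECT af.event_time
    , af.action_id
    , af.session_server_principal_name
    , af.server_instance_name
    , af.statement
FROM sys.fn_get_audit_file(N'C:\Database\Audits\TSQLTuesday_Audit*.sqlaudit', DEFAULT, DEFAULT) af;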

There are some tweaks that can be made to this, but I will defer to the 60 day XE series where I cover some of the tweaks that could/should be made to the basic form of the query when reading event files / audit files.

XE Audits

Well, truth be told, this one is a bit of trickery. Just as I mentioned in the preceding paragraph, I am going to defer to the 60 day series. In that series I cover in detail how to read the data from the XE file target. Suffice it to say, the method for reading the XE file target is very similar to the one just shown for reading an Audit file. In the case of XEvents, the function name is sys.fn_xe_file_target_read_file.
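
For completeness, here is a comparable sketch for the XE side; again, the target file path is an assumption.

-- read an XE file target; the .xel path shown is hypothetical
SELECT xef.object_name AS event_name
    , CONVERT(xml, xef.event_data) AS event_data
FROM sys.fn_xe_file_target_read_file(N'C:\Database\XE\TSQLTuesday*.xel', NULL, NULL, NULL) xef;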

Capturing data to track performance, access patterns, policy adherence, or other processes is insufficient for an audit by itself. No audit is complete unless data analysis and reporting is attached to the audit. In this article, I introduced how to get to this data which will lead you down the path to creating fantastic reports.
