What’s that SSIS Password

Categories: News, Professional, Scripts, SSC
Published on: October 30, 2017

A recurring theme over the past several weeks (there are always recurring themes, it seems) has been an issue that relates to SSIS. Clients call for help with an SSIS package that has started failing, and they are stuck trying to figure it out. They are all stuck at the same spot in the process – trying to get the package open. As it turns out, in each case somebody decided to password protect the package, and the client is unable to open the package because they don’t know the password.

Inevitably they are all stuck at the following screen:

Granted, this screenshot shows the password prompt for a project password – it really is the same issue if it is a package password. This seems like pretty good security, right? If nobody knows the password, then nobody can alter the package, right? At least that seems to be the prevailing course of thought.

Of course, this introduces various other issues, one of which I alluded to already – a failing package. What if the requirements change? What if the package needs to be migrated? What if the package needs to be documented from stem to stern? Even better – what if this type of security is not all it is purported to be?

SSIS Security

Microsoft has provided various “protection” levels for safeguarding sensitive information within a package. These levels can be set from within the GUI or from the dtutil utility. Here are some of those settings.

Setting | Description | Value
ServerStorage | Relies on SQL Server database roles for protection. Only valid if saved to msdb and not to the file system. | –
DontSaveSensitive | Suppresses the values of sensitive properties in the package when the package is saved. | 0
EncryptSensitiveWithUserKey | Uses a key that is based on the current user profile to encrypt only the values of sensitive properties in the package. | 1
EncryptSensitiveWithPassword | Uses DPAPI to encrypt sensitive values in the package based on a user-generated password. | 2
EncryptAllWithPassword | Encrypts the entire package based on a user-generated password. | 3
EncryptAllWithUserKey | Uses a key based on the current user profile to encrypt the package. Only the user that encrypted the package can open or run the package. | 4
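
For illustration, here is roughly what setting a file-based package to level 2 (EncryptSensitiveWithPassword) via dtutil might look like – the path and password are hypothetical stand-ins:

dtutil /FILE C:\SSIS\MyPackage.dtsx /ENCRYPT FILE;C:\SSIS\MyPackage.dtsx;2;MyP@ssw0rd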

Based on these descriptions, a look at the package contents (via notepad) would look something like this:
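
In the older package XML format, that fragment looks something like the following – treat this as a sketch, since the exact element layout varies by SSIS version:

<DTS:Property DTS:Name="ProtectionLevel">2</DTS:Property>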

The package will have that ProtectionLevel value added to the XML of the package, with the selected value based on the descriptions from the previous table. In this example, the package in question has been set to “EncryptSensitiveWithPassword”, which correlates to a value of 2. This is also what causes that password prompt to be displayed when opening the package.

So What’s that SSIS Password?

First, a bit of a birdwalk. Some will say you don’t need that password. They say you can go ahead and just change the “ProtectionLevel” value in the project/package XML and all will be rosy. That may or may not work. Try it at your own risk. That said, if your ProtectionLevel is at a value of 2, there is an alternative method you could use to retrieve that password. I am going to show you just how to do that.

This method assumes that the package is being executed via a SQL Agent job and that you also have access to view job properties on the server in question.

Let’s just jump straight to a script to help us with that.
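
Something along these lines should do the trick – a minimal sketch that assumes the job step invokes the package with a /DECRYPT switch and parses the token that follows it:

SELECT j.name AS JobName
	, js.step_name AS StepName
	, js.command AS JobStepCommand
	, SUBSTRING(js.command, ca.StartPos
		, CHARINDEX(' ', js.command + ' ', ca.StartPos) - ca.StartPos) AS ProbablePassword
FROM msdb.dbo.sysjobsteps js
	INNER JOIN msdb.dbo.sysjobs j
		ON js.job_id = j.job_id
	CROSS APPLY (SELECT CHARINDEX('/DECRYPT ', js.command) + 9 AS StartPos) ca -- first character after the switch
WHERE js.subsystem = 'SSIS'
	AND js.command LIKE '%/DECRYPT %';
-- note: if the password was supplied in quotes, the quotes come along for the ride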

Take note here that I am only querying the msdb database. There is nothing exceedingly top secret here – yet. Most DBAs should be extremely familiar with these tables and functions that I am using here.

What does this show me though? If I have a package that is being run via an Agent job, then the sensitive information needs to be decrypted somehow. So, in order to do that decryption, the password needs to be passed to the package. As it turns out, the password will be stored in the msdb database following the “DECRYPT” switch on the job step command line (dtexec-style syntax). Since I happen to have a few of these packages already available, when I run this particular query, I will see something like the following in my results.

Now, let’s go ahead and take one of those packages and test this out. I will just take the password that was listed for one of the packages, find the path of that package and then try to open the package using the password I just uncovered. Let’s keep it simple and say I am trying to open the first package called “Sports Adobe Ticket Sales Data Upload”.

After finding the package in question, I enter the password – shown here.

After I click the “OK” button, I am greeted with the following.

Tada! I now have full access to the fully decrypted package. If I need to change the ProtectionLevel, I can easily do that now as well. From here, I am set to proceed with troubleshooting the failure or any of the other reasons I noted for needing to gain access to the package.


Conclusion

Occasionally it becomes necessary to access an SSIS package that may be encrypted with a password. This is easily accomplished (unfortunately) if the package happens to be run via a job in SQL Server. While it is possible to gain access to the package, it would be far better not to need to circumvent the security at all. It would seem a much better idea to properly document the necessary passwords, etc., in a common vault so the appropriate people could access the package in the event of an emergency.

Drop That Schema

Categories: News, Professional, Scripts, SSC
Published on: October 27, 2017

An often underutilized, or maybe even mis-utilized, feature of SQL Server is the database object schema. In the event of the latter, there is an occasional requirement for change. When this need arises, it can turn into a bit of a problem – that is, of course, if we are not well prepared. In this article, I am going to explore one possible change – the dropping of a schema.

What is a Schema?

A schema is a distinct namespace to facilitate the separation, management, and ownership of database objects. It removes the tight coupling of database objects and owners to improve the security administration of database objects (Source: Technet).

Much like a skeleton is a distinct system that helps support the structural integrity of certain biological entities, a schema helps to support a distinct functioning and structure within a database. This is, of course, a very simplistic description and analogy, but it works.

Another way to look at a schema is almost like a blueprint of what has been or will be created within the database. Like many blueprints, a database schema can map out all sorts of “sub-systems” at our disposal within the database. Some of these may include functions, procedures, tables, views and so forth.
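
To make that concrete, here is a trivial sketch – the schema and table names echo the error example below, while the column definitions are purely hypothetical:

-- create a distinct namespace, then place an object inside it
CREATE SCHEMA Maintenance AUTHORIZATION dbo;
GO

CREATE TABLE Maintenance.AutoStatsHistory
(
	AutoStatsHistoryID int IDENTITY(1, 1) NOT NULL PRIMARY KEY
	, StatName sysname NOT NULL
	, CaptureDate datetime NOT NULL DEFAULT (GETDATE())
);
GO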

What happens when you try to take out the entire blueprint without understanding its relationship to all of the subsystems? In the case of the schema, it just so happens that you will be presented with an error message informing you that you are trying to do something that is either ill-advised, not supported, or both.

Here is an example of such an error.
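
For instance, a bare drop attempted while objects still live in the schema:

DROP SCHEMA [Maintenance];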

Msg 3729, Level 16, State 1, Line 2
Cannot drop schema 'Maintenance' because it is being referenced by object 'AutoStatsHistory'.

This error message proves to be helpful. I obviously have some objects bound to the schema that need to be blown away before I can drop the schema. If I only have a few objects, this may not be too terribly annoying to do one by one. But what if I have 100 or more objects? Now that becomes an entirely different story.

Drop that Schema!

I have run into this very issue where there are far too many objects in the schema to drop them one by one. Add to the problem that I am looking to do this via script. Due to the need to drop the schema and the (albeit self-imposed) requirement of doing it via script, I came up with the following, which will cover most cases that I have encountered.
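
What follows is a hedged reconstruction of that approach – it generates (but does not execute) ordered drop statements for the common object types, with the schema name as a parameter; foreign keys between tables and Service Broker objects are not handled in this sketch:

DECLARE @SchemaName sysname = N'Maintenance';

SELECT drops.DropStatement
FROM (
	SELECT 'DROP '
			+ CASE o.type
				WHEN 'U' THEN 'TABLE'
				WHEN 'V' THEN 'VIEW'
				WHEN 'P' THEN 'PROCEDURE'
				WHEN 'SN' THEN 'SYNONYM'
				ELSE 'FUNCTION' -- FN / IF / TF
			END
			+ ' ' + QUOTENAME(s.name) + '.' + QUOTENAME(o.name) + ';' AS DropStatement
		, CASE o.type
			WHEN 'V' THEN 1 -- views may reference tables, so drop them first
			WHEN 'U' THEN 3 -- tables go last among the objects
			ELSE 2
		END AS SortOrder
	FROM sys.objects o
		INNER JOIN sys.schemas s
			ON o.schema_id = s.schema_id
	WHERE s.name = @SchemaName
		AND o.type IN ('U', 'V', 'P', 'FN', 'IF', 'TF', 'SN')
	UNION ALL
	SELECT 'DROP SCHEMA ' + QUOTENAME(@SchemaName) + ';'
		, 99 -- the schema itself is always the final statement
) drops
ORDER BY drops.SortOrder;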

And a sample of the output:

As you can see here in the output, I have set the script to generate a series of drop statements for each of the dependent objects within the schema. In addition, the drop statements are ordered to remove objects that may have dependencies on other objects first. A big key here is that this script does not drop the objects for you. Rather, it just creates the scripts to do the work. You must review the output and then execute the scripts separately. That means you are entirely responsible for the results.

Once all of the objects are out of the way, the last statement in the sequence is to finally drop the schema.

You may notice that there are a few things not included in this script. The most notable may be that the Service Broker related objects are not accounted for in this script. I leave that for a later revision.

Conclusion

Occasionally it becomes necessary to remove a schema from the database for one reason or another (e.g. somebody decided to do both Dev and Prod in the same database separated only by schemas). Dropping a schema can become a little bit of a pain without the right tools. This script will help get you on your way to a scriptable solution (to help with documentation of what was changed) and a little less pain while doing it.

Seattle SQL Pro Workshop 2017 Schedule

Categories: News, Professional, SSC
Published on: October 26, 2017


You may be aware of an event that some friends and I are putting together during the week of PASS Summit 2017. I have created an Eventbrite page with all the gory details here.

With everybody being in a mad scramble to get things done to pull this together, the one task we left for last was to publish a schedule. While this is coming up very late in the game, rest assured we are not foregoing some semblance of order for the day. 😉 That said, there will still be plenty of disorder / fun to be had during the day.

So the entire point of this post is to publish the schedule and have a landing page for it during the event. *

Session | Start | Duration (min) | Presenter | Topic
Registration | 8:30 AM | | All |
Intro/Welcome | 9:00 AM | 10 | Jason Brimhall |
1 | 9:10 AM | 60 | Jason Brimhall | Dolly, Footprints and a Dash of EXtra TimE
Break | 10:10 AM | 5 | |
2 | 10:15 AM | 60 | Jimmy May | Intro to Monitoring I/O: The Counters That Count
Break | 11:15 AM | 5 | |
3 | 11:20 AM | 60 | Gail Shaw | Parameter sniffing and other cases of the confused optimiser
Lunch | 12:20 PM | 60 | | Networking / RG
4 | 1:20 PM | 60 | Louis Davidson | Implementing a Hierarchy in SQL Server
Break | 2:20 PM | 5 | |
5 | 2:25 PM | 60 | Andy Leonard | Designing an SSIS Framework
Break | 3:25 PM | 5 | |
6 | 3:30 PM | 60 | Wayne Sheffield | What is this “SQL Inj/stuff/ection”, and how does it affect me?
Wrap | 4:30 PM | 30 | | Swag and Thank You
END | 5:00 PM | | | Cleanup

*This schedule is subject to change without notice.

Seattle SQL Pro Workshop 2017

Categories: News, Professional, SSC
Published on: October 19, 2017


October is a great time of year for the SQL Server and Data professional. There are several conferences, but the biggest happens to be in the Emerald City – Seattle.

Some friends and I have come together the past few years to put on an extra day of learning leading up to this massive conference. We call it the Seattle SQL Pro Workshop. I have created an Eventbrite page with all the gory details here.

That massive conference I have mentioned – you might have heard of it as well. It is called PASS Summit, and you can find a wealth of info on the website. Granted, there are plenty of paid precon events sanctioned by PASS; we are by no means competing against them. We are trying to supplement the training and offer an extra avenue to any who could not attend the paid precons or who may be in town for only part of the day on Tuesday.

This year, we have a collision of sorts with this event. We are holding the event on Halloween – Oct 31, 2017. With it being Halloween, we welcome any who wish to attend the workshop in FULL costume.

So, what kinds of things will we cover at the event? I am glad you asked. Jimmy May will be there to talk about IO. Gail Shaw will be talking about the Query Optimizer (QO). Louis (Dr. SQL) will be taking us deep into Hierarchies. Andy Leonard will be exploring BIML and Wayne Sheffield will be showing us some SQL Injection attacks.

That is the 35,000 foot view of the sessions. You can read more about them from the EventBrite listing – HERE. What I do not yet have up on the listing is what I will be discussing.

My topic for the workshop will hopefully be something as useful and informative as the cool stuff everybody else is putting together. I will be sharing some insights about a tool from our friends over at Red-Gate that can help change the face of the landscape in your development environments. This tool, as illustrated so nicely by my Trojan Sheep, is called SQL Clone.

I will demonstrate the use of this tool to reduce the storage footprint required in Dev, Test, Stage, QA, UAT, and so on. Based on a client case study involving a 2TB database, we will see how this tool can help shrink that footprint to just under 2% – give or take. I will share some discoveries I made along the way, and I even hope to show some internals from the SQL Server perspective when using this technology (can somebody say Extended Events to the rescue?).

Why Attend?

Beyond getting some first-rate training from some really awesome, community-driven data professionals, this is a prime opportunity to network with those same top-notch individuals. These people are more than MVPs. They are truly technical giants in the data community.

This event gives you an opportunity to learn great stuff while at the same time you will have the chance to network on a more personal level with many peers and professionals. You will also have the opportunity to possibly solve some of your toughest work or career related problems. Believe me, the day spent with this group will be well worth your time and money!

Did I mention that the event is Free (with an optional paid lunch)?

Linked Servers and Stats

Linked Servers

A linked server is a fabulous feature in SQL Server to help accomplish various data tasks between local and remote servers. There is a time and a place for the use of linked servers. Unfortunately, some are of the belief that a linked server is the solution to all problems including the way to end world hunger.

You may recall that I have written about linked servers on a few occasions, usually due to the relationship between a poorly performing process and the linked server. There is a prime example here or here. On the other hand, here are some examples of using a linked server in a much better light: here and here.

Using a linked server is entirely up to you. If you choose to use a linked server, you will need to understand there are risks involved – beyond just the documentation and performance nightmares it can cause. Today, I will share one of those fabulous gotchas – mostly because it has cropped up more than once with my clients.

Remote Stats

On more than one occasion I have had an emergency request because everything was broken. The “everything” in almost every incident is an SSIS package that is failing with error messages. The error message will typically have text similar to the following:

Could not locate statistics '_WA_Sys_00000015_346C780E' in the system catalogs.

Due to the error, the package fails processing and grinds to a halt. When diving into the package it is discovered that the missing stats happen to be coming from a linked server query. This raises a big bright blaring alarm for me. Why is the SSIS package accessing the data via a linked server? This is rather counter-productive and definitely contrary to what is desired from a performance perspective.

You can certainly see why this is a sudden emergency, right? Absolutely nothing is working anymore, right? Well, at least that is the way it seems for the person reporting the issue. Understanding that their realm of work has come to a halt is important in order to get to a resolution more quickly. Knowing that the point of failure is not as grand in scale as claimed is critical because it can help you keep your cool while assessing the real problem.

Since the problem presents itself when accessing a specific object or even set of objects across a linked server, we have our scope of the issue scaled down quite a bit already. Now, what do we do from here? Obviously the statistic with that name is missing, right? Don’t be so hasty to assume it is actually missing. Let’s verify that the stat is indeed missing first. Even better – let’s just eliminate as many variables as possible. We can do this by querying the affected objects directly from SSMS.

Easy Troubleshooting for the DBA


For me, it is essential to eliminate variables when we run into a failure. In this case, I have just a few major variables that can potentially be causing this problem. Those variables are:

  1. SSIS
  2. Linked Server
  3. The query itself
  4. Security
  5. Bad Plan
  6. The stat is legitimately missing

I can easily assess the legitimacy of each variable through various quick tests. To eliminate or validate the “query” variable, I can issue a very simple query to retrieve data from the affected object. So let’s eliminate variables 1 and 3 in one fell swoop.
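
A trivial sketch of that test follows – the server, database, and table names are hypothetical stand-ins for the client’s actual objects:

SELECT TOP (100) *
FROM [RemoteServer].[RemoteDB].[dbo].[AffectedTable];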

This query, in this scenario, produces the very same “Could not locate statistics” error shown above.

This, while disappointing, is actually quite productive. It has eliminated two variables for me. I now know that SSIS is not related to the problem. I also know that query complexity is not related to the problem. There are still a few variables left to either validate or eliminate. Since I know the problem occurs when querying via the linked server, let’s try querying the remote server directly (not via the linked server).
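
In other words, connect to the remote server itself and run the equivalent local query (same hypothetical names):

-- executed while connected directly to RemoteServer
SELECT TOP (100) *
FROM [RemoteDB].[dbo].[AffectedTable];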

Well, while that does not entirely eliminate or validate any variables, it does tell me that the problem is still specifically related to the linked server. What if I try to use OPENQUERY instead of the traditional linked server query?
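
Something like this, sticking with the hypothetical names:

SELECT *
FROM OPENQUERY([RemoteServer], 'SELECT TOP (100) * FROM RemoteDB.dbo.AffectedTable;');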

Wow, look at that! OK, so this is a bit of trickery, because I have told SQL Server to execute that query on the linked server as if it were a “local” query on that server. This works without error and pushes the findings more conclusively toward a linked server problem.

While the OPENQUERY version works, I still need to eliminate some variables from the problem. One of those variables is security. Since the error mentions sp_table_statistics2_rowset, I googled about for that proc and found some mentions that column denies related to the stats in question can give the linked server some major fits. It could also be insufficient permission to execute DBCC SHOW_STATISTICS. I highly doubt either is the issue here, since the OPENQUERY version works while using the same pass-through authentication of the linked server that the traditional linked server query would use.

In order to eliminate security as a potential cause, the test is simple (while it could be more complex, I went for the jugular to eliminate the theory as quickly as possible) – I added my account as the pass-through account (which is a sysadmin on the remote server) and then queried the linked server all over again. Suffice it to say, there was no change in the result – the error persisted.

That does not quite eliminate the security variable yet, because there could be a cached plan somewhere. So, just to be sure, I chose to flush the cache on both the linked server and the local server. Running the simple “SELECT *” query all over again yielded no difference in the results. I can now safely say that the problem is related neither to a cached plan nor to security. At this point, I set the linked server security back to the way it was. I have effectively eliminated all variables but 2 and 6 (linked server, and the stat is legitimately missing).
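
For reference, the cache flush on each instance was nothing fancier than the following (use it with care anywhere near production, as it clears every cached plan):

DBCC FREEPROCCACHE;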

Let’s eliminate the missing stat variable right now.
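
Here is a hedged sketch of that check – it queries the stats catalog across the linked server (hypothetical server and database names once more):

SELECT s.name AS StatName
	, o.name AS TableName
	, 'DROP STATISTICS ' + QUOTENAME(sch.name) + '.' + QUOTENAME(o.name)
		+ '.' + QUOTENAME(s.name) + ';' AS DropStatement -- run the generated DROP on the remote server itself
FROM [RemoteServer].[RemoteDB].sys.stats s
	INNER JOIN [RemoteServer].[RemoteDB].sys.objects o
		ON s.object_id = o.object_id
	INNER JOIN [RemoteServer].[RemoteDB].sys.schemas sch
		ON o.schema_id = sch.schema_id
WHERE s.name = N'_WA_Sys_00000015_346C780E';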

Undoubtedly you have noticed that I built a drop statement into the result set from this particular query. That aside, the green highlighted row is the very stat that was producing the error. This stat is most certainly available on the source server (linked server). This entirely eliminates the sixth variable because the stat is not missing.

This brings us to the final step – see what happens when we drop the stat and try the query again. I have the drop statement ready to roll, so let’s just run that and see what happens. My risk here is very small. This is an auto-generated stat and will be recreated if needed. After dropping the stat, I run the simple “Select *” query across the linked server again and guess what? It works perfectly fine.

In my opinion, we can just skip a few of these steps if the problem is on a system generated stat and just go straight to dropping the stat and trying again.

That said, the better solution in this case would be to do one of two things.

  1. Modify the Package to not use the linked servers in this way and create a data source connection for each of the different server connections.
  2. Modify the Package to use OPENQUERY style queries.

My preference between those two options would be to create specific data sources. It seems very silly to pull data across the linked server to then be consumed via SSIS package.

I have just illustrated what I would classify as a nuisance problem related to linked servers. Simplifying the approach as I described with the two potential solutions would alleviate this nuisance while also giving an inevitable performance boost.

Your Name is Your Brand

Categories: Blogging, Professional, SSC
Published on: October 13, 2017

This topic is one that has been stewing for a while and finally this week it decided to boil over.

My first thought with this post was to write a rant. Instead, I want to try and turn it on its end and try to put a different spin on the problem. Yes – I said problem.

Know your Audience

This started when I was setting up a piece of software (to demo to a client) for a vendor that I respect – A LOT. I noticed something peculiar about the software that differed from the documentation (and therefore from the expected result). I reached out to my contact at this company, and he escalated a ticket to their support staff. After a few back-and-forth threads, my contact noticed something troubling – as had I. I was going to set it aside, but my contact was bothered by it too. The support personnel started calling me by the wrong name.

Addressing somebody by the appropriate name or title is pretty important. In this case, not only did they start addressing me by some other name, they had also lost sight of the existing relationship I had with this company, as well as any previous information provided to them demonstrating my level of expertise in the area of SQL Server or with the specific observation being reported. This is demonstrative of a lack of attention to detail – both in regards to the audience and the technical problem.

Your Name is Your Brand

If I throw a few names out there like Paul Randal, Steve Jones, Grant Fritchey or Brent Ozar, chances are you will know who those people are (given you are reading this blog post). These are people (like many other giants in SQL Server) that have built a brand based off their name. This is a good thing. You recognize the name and you recognize that they are very good at what they do.

Building your brand is not an overnight sensation. It does take a while and possibly a bit of luck here or there along the way. The one thing it always starts with is your name. This leads me to my next story.

This week I have been at Summit. This is not the traditional DBA Summit hosted by PASS; rather, this one is related to Dynamics (AX, NAV, GP, CRM, 365). Just like most conferences, we all get a nice, easy-to-read name badge.

Hopefully you can easily read that mine clearly says my name is “Jason”. Besides the name, there is a QR code on the badge. QR code aside, the name is large enough that you can generally read it from afar.

While meandering through the Exhibitor Hall, I stopped at one particular booth that was raffling off an R2D2 that day (they also raffled off a C3PO and a Yoda on other days). And yes, it is very rare for me to stop at one of these raffles – but it was R2D2.

As I stopped to fill out the form for the raffle, I was approached by an eager booth dude. “Hey Larry, how’s it going?” he said. He was obviously talking to me, so I turned to him and said “Jason” while showing him the name on my badge. Not even 30 seconds later he did it again: “So, Larry…”. At the sound of that, I had to cut him off and correct him again, reminding him that I had just told him my correct name and even shown it to him via the badge. At this point I abandoned the R2 raffle and told him I was no longer interested.

This was coming from a vendor that was more than likely hoping to try and pitch their software. A lack of attention to the audience at hand just cost him a lead (albeit small compared to the number they gained). Now, that vendor has to worry about the word of mouth that may come from their lack of attention and focus.

Now, my name is pretty important to me. It may not be as important to you – but it is to me. Similarly, your name should be extremely important to you. You and I are all working at a similar task – building our own brand based on our names.

I am sure both of these vendors I have illustrated are also keen on building the brand around the corporate name they have established. Brand and name are recognizable. Now, put yourself in either of these two scenarios I have just described. Have you ever made a mistake with somebody else’s brand? Or, have you ever been on the receiving end of this kind of mistake? Suddenly the world is spinning a slightly different direction, right?

I won’t divulge the names of either vendor in this case (partly because I still have a good relationship with the one and I have already forgotten the name of the second), but think about this: What do you do for your brand when somebody steps on it? How we react (and granted I probably could have taken a higher road in the second case by gently reminding the vendor yet again what my name was) can do quite a bit for building a brand. It’s not always about what you know, but also about how you conduct yourself towards and around others.
