2010 Recap

Categories: News, Professional, SSC
Comments: 2 Comments
Published on: December 28, 2010

As part of a recap of the year that was 2010, I should start with a review of my goals.  For that review, I had to go back to my Q3 review.  As I looked it over, I found that nothing has really changed since that review that needs to be reported.

I have been extraordinarily busy this quarter with things other than that list of goals and things other than my blog.  That is not necessarily bad, and is actually quite good.  Here are some of the posts that talk about what has kept me busy.

  1. New Baby Girl born
  2. Job Change
  3. User Group
  4. Holidays and Birthdays (4 in December alone between myself and children)

In all reality, despite all the busyness of the past quarter, it has been good stuff.

As I ponder the past year, plenty of good things have happened to me and my career.  Several things have changed for me over the year, and I have blogged about many of them.  Some of the professional highlights are:

  1. Started Tweeting
  2. TSQL Tuesday (even hosted one)
  3. Actively Involved with the Local UG
  4. Blogger Awards
  5. SQLSat

All in all, there have been some things that have helped me a lot in my career this past year.  This has been a full year with plenty to show for it.  For me, the items that are harder to quantify over this past year have been the friendships I have gained.  In addition to those friendships, my network has grown significantly.  More friends, bigger network, I guess, is one way to describe that.  There is also the community involvement that has increased exponentially (over prior years) for me.  I have put myself out there more (blog, speak, tweet, etc.).  I have written a few articles that have been published, and I have more that I need to write and will write.  It is a bit weird to think of being an MVP, but some people out there have brought it up.  I know I was nominated, but I am completely OK with not being a recognized MVP.

Another aspect of my career that has improved over the past year is my communication and writing skills.  This is a side effect of the blogging and article writing.  I just need to become more consistent with it and maybe start producing more technical, as well as longer, articles.

Next year's goals will be coming soon.

SSIS Multiple File Export

Comments: No Comments
Published on: December 28, 2010

Have you ever needed to export different data sets to different flat files?  Each of these data sets could be customer information for different clients, but they all require the same fields.  I have run into this requirement a few times and have found that it is not as hard as it may sound.  Extending that requirement, I have also had to export multiple different files for each of the clients.  For the sake of this post, though, I will just stick to how to do the first level.  (Adding the extra requirement is only a little more complex, but not much different.)

SSIS to Save the Day

SSIS has a looping mechanism built in to help achieve this.

If you want to learn more about this particular control flow object, here is the MSDN resource on it.

When trying to create an SSIS package solution that meets our requirements, the first step (in my opinion) is to make sure you have an Object variable created for use by the Foreach Loop Container.

To use this variable in the Foreach Loop Container, we first need to load data into it.  This is accomplished by using an Execute SQL Task.

Note:  There is a red-x in this image simply because I have not defined any properties for this object.

The Execute SQL Task needs to be given a data source and a SQL command to run.  On the General page, the result set of this task should be set to Full result set.  On the Result Set page, map the result to the object variable that we created already, and set the result name to a value of 0 (zero).
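The query itself will vary by environment.  As a minimal sketch (the table and column names here are hypothetical), something like this returns one row per client to drive the loop:

[codesyntax lang="tsql"]
/* Hypothetical example query -- returns one row per client to drive the loop. */
SELECT ClientID
FROM dbo.Clients;
[/codesyntax]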

Now we can start looking more at the Foreach Loop Container.  When we open the properties of the Foreach Loop Container, we want to go to the Collection page and adjust two settings.  The first is to select the appropriate type from the dropdown.

In this case, we will select ForEach ADO Enumerator.  The next change to be made is the next drop down labeled “ADO Object source variable:”.  In this box we will select the object variable that we already created.

Now we need to configure the Variable Mappings page.  For this, you can either create a new variable from this page or use variables you have already defined.  Let's create some variables from here.

Once that is established, we can work on configuring the Data Flow task.  For the sake of the example, we can do something simple.  It is a good idea to make sure you have your connection managers created by this point.  If they aren't there, then let's create some.  First, let's create an OLEDB connection.  Second, we need to create a Flat File Connection Manager.  I won't cover how to create those connection objects, but I will cover how to make a configuration change to the Flat File Connection Manager.  To make sure each file is written to a different destination, we should use an expression on this connection manager.

In your Flat File Connection Manager, you will need to look at the Properties tab in order to create an expression.  In the Property Expressions Builder, select the ConnectionString property from the drop down.  On the expression side, build something like this:

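Something along these lines should work, assuming the hypothetical folder C:\Exports and that the variable @[User::TestSub] (used again later in this post) holds the client identifier:

[codesyntax lang="sql"]
"C:\\Exports\\Client_" + (DT_WSTR, 50) @[User::TestSub] + ".txt"
[/codesyntax]

With this expression in place, each iteration of the loop writes to a file named for the current client rather than overwriting the same file.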

Now back to that Data Flow task I mentioned already.  In the OLE DB Source of your Data Flow Task, select the appropriate OLEDB connection manager that you have created.  Then select SQL Command in the data access mode dropdown menu.  In the SQL command text box, type an appropriate query to retrieve your data.

Example:

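Here is a minimal sketch, again with hypothetical object names; the ? placeholder is the OLE DB parameter marker that will be mapped to a package variable in the next step:

[codesyntax lang="tsql"]
/* Hypothetical source query -- the ? is an OLE DB parameter placeholder. */
SELECT CustomerID,
       CustomerName,
       CustomerAddress
FROM   dbo.CustomerInfo
WHERE  ClientID = ?;
[/codesyntax]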

Then click on the Parameters button.  From here, you will assign the appropriate variable to the parameter that was created in the query.  In our example, we should use the variable [User::TestSub].

That pretty much covers it.  In a follow-up, I will go through an actual setup that I have created, with more screenshots and a slightly different configuration than this baseline.  Hope you enjoy.

Powershell, Me and the MCM part II

Comments: 1 Comment
Published on: December 20, 2010

Last week I posted the first part of this series.  I thought it would be a good idea to give a little follow-up on how my foray into PowerShell went as I attempted to download these videos.

Attempt 1:  Failed miserably.  I did not have enough disk space and thus only downloaded about half of the videos.  I tried to remedy this by changing the path to a different volume, but that threw out some error messages.  The script works best when continuing to use the $env variable in the path for the location to dump the files.

Attempt 2:  From a different machine, this was working beautifully until I noticed that every once in a while a file would error out and the download would resume with the next video in the list.  When I attempted to look up those particular videos, they were all present.  So I restarted, thinking it was a local environment problem.

Attempt 3:  I figured out that attempt 2 was failing because my laptop was going to standby – duh!!!  I disabled standby and finally got the download to work from start to finish without fail.

Now I can place these videos on my Android and watch or listen from anywhere I have my phone.  Since I have an auxiliary port in my car, I can just plug the phone in, let it charge, and listen to the videos through the car speakers while I commute.  It is nice to have such high-quality training material available for a quick refresher or for that first time through.  I recommend people get on board and take advantage of what Microsoft, Paul Randal, and others have given to the community through this training.  Since I pay out of pocket for my training, this is the right price for me.

Thanks for the Christmas Gift.

Powershell, Me and the MCM

Comments: 3 Comments
Published on: December 16, 2010

If you haven’t heard yet, Microsoft has made some significant changes to the MCM program.  The changes make the certification more accessible to the masses.

You can read more about that from sources such as the following:

  1. Grant Fritchey
  2. Jason Strate
  3. Glenn Berry
  4. Microsoft

This post is not so much about the changes.  Rather, it is about the dissemination of information related to the MCM.  I saw on Twitter that there were some videos available to watch concerning the MCM training.  I was curious where these videos were, so I went looking for them.  In my search, I came across the blog post referenced above by Jason Strate.  In that post, he has a link to the videos.  I decided to check out the videos and start downloading them so I could use them as study aids as I work on the MCITP.

Downloading those videos one by one is somewhat tedious and time consuming.  Thus, while some were in the process of downloading, I started working on a few other things and saw another Twitter post come across about a PowerShell script to download those MCM training videos.  This is great news, except I have never run a PowerShell script.

Well, no time like the present to learn.  The first thing to do is to check out the blog article about the script; it has a few downloads.  The blog post is by Eric Humphrey and can be found here.  After reading what he had to say about the script and downloading the necessary files that he lists, it was time to find out how to run a PowerShell script.  A little looking, and I came across this article.

Excellent, I am well under way now to running this script.  After following some of the recommendations in that article (e.g., security settings), it was time to look at the script and see what I needed to know from there, such as paths required for add-in files or paths for destination files.  Some quick adjustments to the userprofile path and to the download directory, as well as copying the HTMLAgilityPack into the userprofile path, and we are all set.

Now I have the script running and downloading the files, though it looks like it is going to be running for a long time.  This is a very useful tool for me at this moment, and it demonstrates another use for PowerShell as well.  I haven't yet decided that I will pursue the MCM; however, I will use these videos to improve my knowledge base on SQL Server.  I would recommend that people go check out these videos.  It was somewhat surprising to me to see that many of the pages had not been viewed very frequently at all.  This is great training material.  I recommend getting it and adding it to the reference material you may already have.

If you are just looking to browse the videos individually, you can find the site here.

Tis the Season

Categories: News, Professional, SSC
Comments: 2 Comments
Published on: December 15, 2010

This month has been quite the month for change in my family.  The changes that we are experiencing did not end with the birth of our baby girl two weeks ago.  My employment is changing too.  Along with the employment change comes another change – moving back to Utah.

My wife and I feel that it is considerably better for our family to live in Utah closer to extended family.  The schools are better, we have family, and more room to grow.  The added benefit is that my car windows are less likely to be shot out while living in Utah (as has happened to me pulling into my driveway here in Las Vegas).

Since we are moving, and my employer is in Las Vegas, it stands to reason that my job would be changing too.  My employer has been gracious enough to allow me to telecommute part time from Utah (woohoo).  I am also working on filling in the work gaps with some other projects and contracts.  This is a step in the direction of becoming a consultant and running my own business.  All of that actually makes me nervously excited.

Here’s to changes and here’s to Happy Holidays.  Both of these changes are huge gifts for my family.  My children are extremely excited to be able to live closer to cousins and grandparents (not to mention aunts and uncles).

T-SQL Tuesday #13 – Business Requirements

Comments: 1 Comment
Published on: December 13, 2010

Business Requirements

We have made it yet another month and to yet another episode in the continuing saga known as TSQL Tuesday.  This month we are being hosted by Steve Jones ( Blog | @Wayoutwest ) of SQLServerCentral fame.  Steve has asked us to speak a little bit about business requirements, interpreting those requirements, and some of the pitfalls that may or may not exist in the whole communication process of getting a project done.  Now, can somebody please explain the requirements to me?

What issues have you had in interacting with the business to get your job done?

I think a common area that is easily overlooked when it comes to requirements, and the interpretation of requirements, is report creation.  A common problem is that there are no defined or written requirements for the creation of a report.  When requirements are inadequate, it is easy to miss the intention of the report and thus frustrate the business group that requested it.  Another problem that can arise is the perceived inaccuracy of a report, even when all business requirements are properly met and signed off by the requesting business group.  How can that be?  Let me explain.

Perceptions

A recent problem was brought to light that revolved around the creation of two similar reports.  The reports were to be used by different business groups, and each was requested by a different group.  The values in the reports no longer match due to a change requested on ReportA by GroupA.  GroupB doesn't think this is accurate and wants both reports to match.  The small problem is that the two reports should not produce the exact same results, based on their requirements and usage.  The report for GroupA has some extra requirements and filters placed on it to prevent the users from seeing data about former employees.  The report for GroupB is different in that it should show an overall summary of all data, even data for former employees.

Re-Alignment

Managing business requirements is almost as much about managing the perceptions of the business users as it is about understanding what the business truly wants.  It is important that you are able to help them understand what it is you will be doing, what you can accomplish, when you can do it, and what impact it may have on other parts of the business (if they are known).  When an issue arises, it is best to approach the business and try to understand what the disconnect is.  In the case of the two reports outlined above, I had to research and test both reports to get a better understanding of what the problem was.  Once I did that, it was a matter of explaining the findings to the business.  From there, a discussion between the business analyst and the two departments the reports serve needs to take place, and an agreement made.

Conclusion

This is a rather short and straightforward entry for TSQL Tuesday this month.  This is a delicate subject, and I am sure many people will have many more tales to tell.  I think it is most important to approach the business, try to help them understand what you understand from the requirements, and get it hammered out before too much development has been done.

Wait Stats resources

Comments: 2 Comments
Published on: December 13, 2010

Today I was out and about looking for past roundups of TSQL Tuesday.  While doing that, I came across a post from Paul Randal (Blog | Twitter) about wait stats.  The thing that strikes me about this post is the great query and the excellent references to more resources.  I recommend that you check out the article he has written on the topic.  Also, make sure you read the reference material he has provided.

You can find it here.

SSIS Job Execution

Comments: No Comments
Published on: December 13, 2010

While working on a process to refresh the QA environment (data purge, reload data, and reapply changes made over time as part of release cycles), I ran into a self-imposed requirement.  I was attempting to create a process that would perform the entire refresh consistently.  Among the items that I needed to include in this process to "refresh" the QA environment was the need to run a couple of SSIS packages.  SSIS was not installed on the particular server where I was creating this process, so I needed a way of executing two SSIS packages without installing SSIS on the local server.

With that need, I decided to test remote execution of a job that had SSIS steps.  The method I decided to use was to employ OSQL.  Here is what I did for that.

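One way to do this is a one-line OSQL call that starts a job on the remote server; the server and job names below are hypothetical:

[codesyntax lang="tsql"]
REM Hypothetical server and job names -- adjust for your environment.
osql -S RemoteServerName -E -Q "EXEC msdb.dbo.sp_start_job @job_name = 'LoadQAData'"
[/codesyntax]

Keep in mind that sp_start_job returns as soon as the remote job starts, so this step will not wait for (or report on) the outcome of the remote job.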

I added that line of code to a SQL Agent Job step.  The job step type is Operating System (CMDExec).

This method works well for me since I had already tested the package on the remote server and knew that it worked.  There are several other methods as well; this one simply happened to be the quickest for me to implement at the time.

SQL Confessions

Comments: 1 Comment
Published on: December 13, 2010

I had a brainstorm of an idea for a group of articles on my blog a couple of weeks ago and am finally getting around to putting it together.  The idea came about while I was struggling with a problem at work, trying to meet the requirements in the code.  As the title states, this group is going to be about things done in SQL Server that I would normally avoid.

The first article came about while trying to find a solution to some business requirements.  The requirements called for an average to be calculated, and then a revised average if certain criteria were met.  Then items had to be moved from one group to another group depending on a series of criteria.  When I first looked at these requirements, I thought I would need several cursors.  As it turns out, I had to implement just one cursor; I could find no way of getting away from using a cursor for this one requirement.  The essence of it was that I had two variables: one was the number of items to move, and the second was the person to whom the items needed to be moved.  Not only did it require those two variables, but there was the additional requirement that the distribution be randomized.

First, let's take a look at how I did the randomizing.  I took care of this requirement by dumping all of my prep data into a temp table that had a default on a column explicitly to hold a random value.  The table looked something like this.

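What follows is a minimal sketch with hypothetical table and column names; the important piece is the default constraint on the Randomizer column, which stamps every inserted row with a random value:

[codesyntax lang="tsql"]
/* Hypothetical reconstruction -- the key is the NEWID() default,
   which gives every row a random value to sort by later. */
CREATE TABLE #PrepData
(
    ItemID     INT              NOT NULL,
    AssignedTo INT              NULL,
    Randomizer UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID()
);
[/codesyntax]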

Using this table, I can just run a set-based insert to populate it and thus end up with random values in the Randomizer column, courtesy of the default constraint.

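A sketch of that insert, assuming a hypothetical source table; the Randomizer column is simply left out of the column list so the default fires for every row:

[codesyntax lang="tsql"]
/* Set-based insert -- omitting Randomizer lets the default populate it. */
INSERT INTO #PrepData (ItemID)
SELECT ItemID
FROM dbo.SourceItems;   -- hypothetical source table
[/codesyntax]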

That piece of the puzzle was solved rather easily.  Now for the next part, the confession part.  I had to write a cursor.  The next part of that confession: I had to look up how to write the cursor.  Without going into a lot of detail, this is how I used a cursor to solve this particular problem.

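Here is a minimal sketch of the approach, assuming a hypothetical driver table of (person, item count) pairs.  The cursor walks the pairs, and each pass performs a set-based update against the randomized prep table:

[codesyntax lang="tsql"]
/* Hypothetical sketch of the hybrid cursor.  The driver table
   #MoveRequests (who gets items, and how many) is assumed. */
DECLARE @AssignToID INT,
        @MoveCount  INT;

DECLARE move_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT AssignToID, ItemsToMove
    FROM #MoveRequests;

OPEN move_cur;
FETCH NEXT FROM move_cur INTO @AssignToID, @MoveCount;

WHILE @@FETCH_STATUS = 0
BEGIN
    /* Set-based action inside the loop: grab the next @MoveCount
       unassigned items in random order, courtesy of the Randomizer
       default, and move them to the current person. */
    WITH RandomPick AS
    (
        SELECT TOP (@MoveCount) ItemID, AssignedTo
        FROM #PrepData
        WHERE AssignedTo IS NULL
        ORDER BY Randomizer
    )
    UPDATE RandomPick
    SET AssignedTo = @AssignToID;

    FETCH NEXT FROM move_cur INTO @AssignToID, @MoveCount;
END

CLOSE move_cur;
DEALLOCATE move_cur;
[/codesyntax]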

Note that this cursor is a bit of a hybrid: I am using it to perform some set-based actions, and I am fine with that.  I am happy with the random distribution and the fact that this performs well.  This was done for a one-off requirement that will not be used again (or at least is not planned to be used again), and it met all of the requirements.  I was blocked on this particular requirement by the required variables and couldn't figure out how to do this in a 100% set-based fashion.  If you have any input, let me know.

Do you have any such stories to share?

Bundle of Joy

Categories: News, SSC
Tags:
Comments: No Comments
Published on: December 8, 2010

After baking for 9 months, our bouncing baby girl has finally arrived.  She arrived Thursday, Dec 2, 2010 at 8:28 AM (PST).  There were no complications, and the delivery went smoothly.  Family is doing well.  My wife is doing well.  And baby is doing well.  Our newborn came in at 8 lbs 4 oz and 20 inches.  Today she had her first doctor's visit, and we found out that she is only at the 50th percentile for weight and the 95th percentile for height.  And here I thought she was a pretty big baby.

We are extremely pleased to have this new child in our family.
