1.04.2012

I’m speaking at SQL Saturday #104

For some bizarre reason, I have been selected to speak at SQL Saturday #104 this Saturday in Colorado Springs, CO.

So come see me give my session, PowerShell for the Reluctant DBA / Developer.

This will be my 6th time presenting this talk. I have previously presented it at all three Colorado SQL chapters (Denver, Boulder, and Colorado Springs), as well as at SQL Saturday #99 in Minneapolis last year. While I won't promise to make you a PowerShell expert overnight, I hope that everyone will leave feeling comfortable with the basics of PowerShell and how it can be used with SQL Server to automate administrative and development tasks.

Did I mention there will be Laser Tag?

Set Phasers for “Stunning”

12.30.2011

Time to be a Master

Today I passed the 70-453 Transition Your MCITP SQL Server 2005 DBA to MCITP SQL Server 2008 exam. This was the last test I needed to complete the Microsoft Certified Master prerequisites.

I had taken the test earlier this week and failed (by one question, in my estimation). I usually score in the mid-800s, but I found this test a little difficult: with only 48 questions, there didn't seem to be much room for missing many.

As for the exam itself, I felt there were only a handful of questions that either needed more context or did not represent real-world scenarios.

Hopefully I can find time to take 88-970, the SQL Server 2008 Microsoft Certified Master Knowledge Exam, in the upcoming months. I would also like to take 70-452, which would give me the MCITP: Business Intelligence certification.

5.24.2011

PowerShell Denver Event

Today I went to an in-person TechNet event on PowerShell at the Denver Microsoft office. Don Jones (Blog | Twitter) led the event, a 3-hour training/demo session covering an introduction to PowerShell. There were over 100 people in attendance.

As a recreational user (and sometimes abuser) of PowerShell for several years, I still came away with a lot of tips. Don's presentation style was engaging and 90% demo-based. He had a very conversational style and encouraged attendee interaction.

Here were a few of my takeaways:

Dot Source is Dead

With version 2.0, there is no longer any need to "dot source" scripts. Modules are the preferred approach, and they don't take much extra effort.
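
For example, converting a dot-sourced script to a module is mostly a rename (a minimal sketch; MyFunctions is a hypothetical script):

# The old habit: dot-source a script to load its functions into the session
. .\MyFunctions.ps1

# The module approach (v2.0+): rename the file to MyFunctions.psm1, place it
# in a MyFunctions folder under a path in $env:PSModulePath, then:
Import-Module MyFunctions

# List the commands the module exposes
Get-Command -Module MyFunctions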

Put it in your pipeline and smoke it

Definitely consider making your scripts pipeline-aware. It's really not that hard, and it opens up a lot of potential reuse. Personally, a lot of my scripts did not take advantage of this; I'm going to go back over them and see if it makes sense to update them.
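
Here's a minimal sketch of a pipeline-aware function (the function and server names are hypothetical):

function Get-ServerInfo {
    param(
        # ValueFromPipeline lets callers pipe values straight into the function
        [Parameter(Mandatory=$true, ValueFromPipeline=$true)]
        [string]$ServerName
    )
    process {
        # The process block runs once per pipeline item
        Write-Output "Checking $ServerName"
    }
}

# Any list of names can now be piped through for easy reuse
"server1", "server2" | Get-ServerInfo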

Remoting Rocks

PSRemoting is definitely something that can be used in a lot of different ways. I need to spend some more time playing with this, as I often RDP into a remote server just to use the PowerShell console.
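
For example (the server names are hypothetical, and Enable-PSRemoting must already have been run on the targets):

# A one-to-one interactive session, instead of a full RDP login
Enter-PSSession -ComputerName SQL01

# Or fan one command out to several servers at once
Invoke-Command -ComputerName SQL01, SQL02 -ScriptBlock {
    Get-Service -Name MSSQLSERVER
}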

Back to Basics

Compare-Object and PSObject: I need to spend more time looking into these. Using them would have simplified several of my scripts.
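
A quick sketch of both, with made-up database names:

# Compare-Object: diff two lists (for example, database names before and after)
$before = "master", "model", "msdb", "Sales"
$after  = "master", "model", "msdb", "Sales", "Staging"
Compare-Object -ReferenceObject $before -DifferenceObject $after

# PSObject: emit structured objects instead of formatted strings
$db = New-Object PSObject -Property @{ Name = "Sales"; SizeMB = 1024 }
$db | ft Name, SizeMB -auto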

The Road less Traveled

There were a few topics that I would have liked to see covered. I think these would have been useful to the target audience, which I took to be primarily desktop support staff and system administrators:

  • PowerShell job control
  • Passing parameters to legacy Windows commands (robocopy, for example; see the sketch after this list)
  • Dealing with UAC and credentials (Don did cover this a little; I was surprised at how many people had UAC turned off)
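
For the robocopy case, one common pattern is to build the argument list as an array and splat it to the legacy command (the paths here are hypothetical):

# Each array element is passed to robocopy as a separate argument
$roboArgs = @("C:\Data", "\\backupserver\Data", "/MIR", "/R:2", "/W:5")
robocopy @roboArgs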

Beef it’s what’s for Dinner

My only real complaint about this event was that a few concepts, such as $_, script blocks, and the difference between ( ) and { }, had to be glossed over due to time constraints. This wasn't much of a hindrance to me, but I could see where someone brand new to PowerShell would have gotten confused.
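
For anyone left scratching their head, a two-line illustration of those concepts:

# { } defines a script block; inside it, $_ is the current pipeline object
Get-Service | Where-Object { $_.Status -eq "Running" }

# ( ) groups an expression so it is evaluated first
(Get-Date).Year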

I suspect most of the audience had enough experience with PowerShell that this wasn't a big deal.

I'm definitely glad I went. I hope to incorporate some of the things I learned into the PowerShell for the Reluctant DBA / Database Developer presentation that I give.

Note to Microsoft: please bring back the free soda machines

11.29.2010

SSIS OLE DB Command Transform Operand Type Clash Error Workaround

I recently needed to use an SSIS OLE DB Command Transformation to call a stored procedure (sproc) to update some values in a database table for an SSIS package I was working on.

One of the input parameters to the sproc was a datetimeoffset:

CREATE PROCEDURE [dbo].[usp_Update_DateTime]
    @Datetime datetimeoffset
AS
SET NOCOUNT ON;

-- Update code simplified
PRINT @Datetime;
GO

I kept getting the following validation error from the OLE DB Command Transform:

Error at Data Flow Task [OLE DB Command [19]]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.

An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "Operand type clash: int is incompatible with datetimeoffset".

Error at Data Flow Task [OLE DB Command [19]]: Unable to retrieve destination column descriptions from the parameters of the SQL command.

As a side note: even though the text box doesn't look selectable, you can click inside it and do a "Select All" (Ctrl+A) and a "Copy" (Ctrl+C) to get the validation error text onto your clipboard.

Workaround

My workaround was to add a Data Conversion transformation and cast the datetimeoffset to DT_DBTIMESTAMP. I then had to change the sproc parameter to a datetime, modifying the sproc to convert the datetime back to a datetimeoffset:

CREATE PROCEDURE [dbo].[usp_Update_DateTime]
    @Datetime datetime,
    @TimeZone char(6) = '+00:00'
AS
SET NOCOUNT ON;

DECLARE @DatetimeOffset datetimeoffset;

SET @DatetimeOffset = TODATETIMEOFFSET(@Datetime, @TimeZone);

-- Update code simplified
PRINT @DatetimeOffset;
GO

This only worked because all of my incoming times were already normalized to UTC. If they weren't, I would have needed to pass in a value for the @TimeZone offset parameter.

Conclusion

I suspect that this problem occurs with the other newer date/time parameter types as well, but I did not have time to test. While the workaround seemed to work OK, it was certainly a bunch of extra hoops, and it took me about 30 minutes of fighting with SSIS to figure out that I wasn't doing anything wrong. (The actual sprocs and data flow were much more complex, and I had convinced myself SSIS was somehow caching the parameter data types.)

Links

Date and Time Functions (Transact-SQL)

Integration Services Data Types

10.30.2010

The 2010 FogBugz and Kiln World Tour

I recently attended The 2010 FogBugz and Kiln World Tour event in Denver. I previously attended the last FogBugz event in Boulder; this one was just as good, and was not plagued by the logistical issues of the 2007 Boulder event.

I have been watching DVCS (Distributed Version Control Systems) gain traction, but have been hesitant to move any work-related projects over. My main concerns were how well these systems would work in a Windows environment and the potential pain of switching from our existing source control provider, SVN.

Here are the key points I took away from the presentation about DVCS:

  • It makes a copy of the full history locally, using compressed diffs (change sets) instead of versions.
  • It separates the act of committing changes from the act of sharing them (see the sketch after this list).
  • You tend to have many smaller repositories as opposed to a single monolithic repository.
  • It makes three-way merges easier and gives you more options when you need to do one.
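
That second point is the big mental shift coming from SVN. In Mercurial, the day-to-day commands look something like this (the commit message is made up; these run fine from a PowerShell prompt):

hg commit -m "Local change"   # records a change set in the local repository only
hg push                       # shares your accumulated change sets with a remote repo
hg pull                       # fetches other people's change sets
hg merge                      # three-way merge of their work with yours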

The new features in FogBugz were awesome, and their support for bulk operations against many cases looks well thought out and very user-friendly. FogBugz and Kiln are like peanut butter and jelly: they just go better together.

The coolest integration between FogBugz and Kiln was the ability to create and perform code reviews.

Joel also hinted that future versions of Kiln may support pluggable SCM providers, allowing Git to be used instead of Mercurial as the underlying version control system.

Here are some links that were given out during the presentation:

http://hginit.com/

http://www.fogcreek.com/kiln/

http://www.fogcreek.com/fogbugz/

7.16.2010

PowerShell Simple RSS Reader

I subscribe to numerous RSS feeds. Some of them contain useful information as-is, but others require some filtering to be useful.

For example, I subscribe to a feed from my favorite local computer bookstore, SoftPro Books. I have set up a small snippet of PowerShell to access their New Arrivals RSS feed and look for any new titles matching specific text, in this case "asp".

This script is fairly simple, but I could easily modify it to search a text file of keywords (perhaps author names or specific titles), or even go one step further and scrape the HTML from the item link to search the information about the book.

Another option would be to have this script run on a schedule or via my PowerShell profile. I could even create and publish a filtered RSS feed that showed just the titles I'm interested in.

#Create a WebClient to fetch the feed
$wc = New-Object System.Net.WebClient
$feedUrl = "http://www.softprorss.com/web-arrivals.rss"

#Download the feed as XML
$rssContent = [xml]$wc.DownloadString($feedUrl)

#Read the XML and filter items by title
$rssContent.rss.channel.item | ? {$_.title -match "asp"} | ft title, link -auto
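
For instance, the keyword-file variation mentioned above might look something like this (keywords.txt is a hypothetical file with one pattern per line):

#Load the keyword patterns, then keep items whose title matches any of them
$keywords = Get-Content "keywords.txt"
$rssContent.rss.channel.item |
    ? { $title = $_.title; $keywords | ? { $title -match $_ } } |
    ft title, link -auto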

9.05.2009

Using the GQM Approach to Simplify BI Analysis and Design

I have often seen organizations struggle to identify KPIs during the analysis and design phase of the Business Intelligence projects I have worked on. Often, the Project Champion or Executive Sponsor has lofty goals for the project; however, getting down to brass tacks becomes problematic. This is usually compounded by a desire to focus on the "sexy" metrics with no regard for the basic building blocks (descriptive statistics) that enable tracking those metrics.

The Goal Question Metric (GQM) approach can be applied to help organizations translate business needs into actionable requirements, which ultimately become KPIs inside a Business Intelligence solution.

Here is a simplified outline of how to use this approach:

Step 1: Define the Goal

The goal should be written in plain, meaningful English and have the SMART attributes (Specific, Measurable, Achievable, Relevant, Time-bound).

Example: A college wants to increase student completion rates.

Step 2: Define the Questions

What information do we need to measure our progress toward meeting this goal?

Example:

What is the current completion rate for this term?
What is the trend? (Is it increasing or decreasing from last year to this year?)

Step 3: Define the metric

The metric definition should include a complete description of what is included in and excluded from the measurement. You must also assess the feasibility of actually obtaining the measure.

Example:

Completion Rate: the ratio of students who begin a term and are still enrolled at the end of the term, irrespective of grade. Only those students present at both snapshots are considered.
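
To make that concrete, here is a toy calculation of the metric (the numbers are made up):

#Students in the start-of-term snapshot
$enrolledAtStart = 1200
#Of those, students still enrolled at the end-of-term snapshot
$stillEnrolledAtEnd = 1044
#Completion rate as a percentage
"Completion rate: {0:N1}%" -f ($stillEnrolledAtEnd / $enrolledAtStart * 100)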

As you can see, applying this method often leads to a more complete analysis of the business domain at hand. It also helps discover related areas of interest.

GQM Web Resources:

GQM entry on Wikipedia: http://en.wikipedia.org/wiki/GQM