Debate Script: Invoke-BigBird

So it’s on hand and at the ready tonight for you, Mitt:

function Invoke-BigBird {
	try {
		$clip = New-Object System.Media.SoundPlayer

		# File extension is .doc because my host won't allow audio files.  🙂
		# (SoundPlayer only cares that the content is a .wav, not what the extension says.)
		$clip.SoundLocation = "https://sqlfamilyguy.files.wordpress.com/2012/10/big-bird.doc"

		$clip.Play()
	}
	catch {
		Write-Error $_
	}
}
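
Paste it into a session and it’s ready to fire on cue:

Invoke-BigBird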

SQL Saturday #145 – My First SQL Saturday

I am sitting here now basking in the afterglow of spending the day at my first SQL Saturday event.  I had planned on traveling to Nashville back in 2010 for SQL Saturday #51, but life, work, and family kept me from making that trip a reality.  Had I known then what I was missing out on, I probably would have found a way to make it happen.  This time I had no reason not to go since I took a job in Nashville, TN last spring – I’m glad that I made it, and now I can see what the ‘big deal’ is about these events!

I started the day off by getting up early, anticipating a slow start because I came down with a nasty cold on Thursday and figured I’d need some time to get moving.  Thankfully, when I woke up this morning my cold had miraculously disappeared!  I was truly blessed, as now I didn’t have to worry about contending with fatigue and feeling generally crappy while trying to focus on the materials at hand.  Things went smoothly and I showed up at the venue at 7:47 am.

People were already showing up en masse and I was excited to be a part of this community.  As a Twitter user I had been following the chatter leading up to the event and was really excited about the crowd that was gearing up to be there.  Once I got inside, I got myself registered and got my bearings – where are the bathrooms, where are the exits, where are the session rooms, etc.  Once that was all done, I started hitting the vendors – I had some specific questions for SQLSentry about their new Plan Explorer as well as some of their monitoring tools.  After making my rounds with the vendors, I headed off to my first session: Gather SQL Server Performance Data With PowerShell by Allen White.

I am a PowerShell aficionado and absolutely love the technology.  It is rare for me to go more than a day or two without opening up a PoSH editor for one reason or another, so I was really excited to attend this session.  Allen did a great job – I really enjoyed his presentation style and it was a good session to start the day with.  Although I didn’t learn anything I hadn’t already known coming in, it was nice to see that his approach to parsing some of the counter process data was very similar to my own.  I had wondered for quite a while if I was just convoluting the process – surely there must be a more elegant way to parse out the process prefixes to get at the raw values.  Allen put those thoughts to rest, as he had many of the same kinds of parsing steps set up to get at those values as I did.  I think that I would really enjoy attending an advanced-level session presented by Allen.
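
For anyone curious, the kind of counter parsing I’m talking about looks something like this – my own sketch, not Allen’s code, assuming PowerShell 2.0’s Get-Counter:

$samples = (Get-Counter '\Process(*)\% Processor Time').CounterSamples

foreach ($sample in $samples) {
     # Each sample's Path looks like \\machine\process(instance)\% processor time;
     # InstanceName and CookedValue spare you from hand-parsing that prefix
     Write-Output ("{0} : {1:N2}" -f $sample.InstanceName, $sample.CookedValue)
}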

My second session was “TempDB Performance Troubleshooting and Optimizing” by Eddie Wuerch.  After hearing about the production environment that he supports on a daily basis, all I can say is WOW!  His demonstration of the differences between temporary tables and table variables, as well as best practices for getting the best performance out of your tempdb, was great!  The unexpected deep dive on data file structure and page allocations was both a good refresher and revealing.  Seeing how these structures affect database response definitely got me thinking harder about some of the practices that we follow in our day-to-day operations.  At this point my mind was really moving and I was geared up for my third session of the day.

My next session was “Indexing Deep Dive” by Kevin Boles.  This was another great session.  Although most of the information here was nothing new to me, a few points came up that I had never really thought much about – specifically, populating new columns with NULL values and trying to ensure (as much as possible) that our IAM pages have contiguous 1s in them.

After lunch I started back with a session by David Klee entitled “Virtualizing Business Critical SQL Servers”.  Once again I was “WOW’ed” – David definitely knows his stuff and understands how to effectively virtualize a SQL Server environment.  Little nuggets such as learning that overallocation of CPU cores can actually do more harm than good were among the many things I took away from his presentation.  All I can say after this is that I would LOVE to have his group set up our virtualization environment and train our administrators – I think that we could go all virtual without a hitch!

The fifth session I attended was by Jason Strate – Extended Events: Work Smarter, Not Harder.  I’ve done some work around deadlocking with extended events recently and was looking forward to learning *anything* that I could about them.  Jason delivered in his presentation and showed us how easy administration of SQL Server can be once you learn how to use extended events.  I think I’m going to start digging in around extended events – I believe they can make my life a lot easier, and I want to get ahead of the curve before Profiler and tracing are no longer options.

My final session for the day was by Tim Ford and covered DMOs.  I use DMOs on a daily basis, but one of the things I always struggle with is remembering all of them.  Tim has come up with a great way to organize them so that they are easier to keep up with – a periodic table of DMOs.  Overall this was another great presentation and I look forward to getting a copy of that ‘periodic table’ to hang in my workspace.

After everything came to a close, I came home to spend some time with the family and unwind a bit.  I had a great time at this event and can’t wait to make it to another.  I have been inspired to dig deeper on several facets of SQL Server and my interest in learning more about the technology has been refreshed.

A special thanks goes to all of the people, presenters, and vendors that made this SQL Saturday event possible – I hope to be more involved with the next one that takes place locally.  Who knows, maybe I’ll even submit a session…

Getting SQL Server Database Mail Settings With PowerShell

Here’s a quick & dirty PoSH script that shows you how to get the mail settings from a SQL Server 2008 instance; it works with PoSH v2.0 and above.

# Load SMO and connect to the target instance
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') "MyServer\MyInstance"

# The Mail property exposes the instance's Database Mail configuration
$MAIL = $srv.Mail

Clear-Host

Write-Host ($srv.Name + " Mail Servers")
Write-Host "-----------------"

foreach ($Account in $MAIL.Accounts) {
     Write-Host ("Account: " + $Account.Name)
     Write-Host "*****************"

     foreach ($MailServer in $Account.MailServers) {

          Write-Host ("Enable SSL: " + $MailServer.EnableSsl)
          Write-Host ("Port: " + $MailServer.Port)
          Write-Host ("Server Type: " + $MailServer.ServerType)
          Write-Host ("Use Default Credentials: " + $MailServer.UseDefaultCredentials)
          Write-Host ("User Name: " + $MailServer.UserName)
          Write-Host ("No Credential Change: " + $MailServer.NoCredentialChange)
          Write-Host ("Name: " + $MailServer.Name)
          Write-Host

     }

     Write-Host "*****************"

}

That should give you output that looks something like this:

MyServer\MyInstance Mail Servers
-----------------
Account: MailAccount
*****************
Enable SSL: False
Port: 25
Server Type: SMTP
Use Default Credentials: False
User Name:
No Credential Change: False
Name: mail.mydomain.com
*****************

I wouldn’t use these objects like this, but I wanted to demonstrate the concept here.  You can get to different properties, such as the mail server name, in other ways as well – for example, by going through the MailServer.Parent.MailServers collection – but I feel this is the easiest method for demonstration purposes.  To view more information about these objects, check out the Microsoft.SqlServer.Management.Smo.Mail namespace.
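
If I were writing this for real use, I would emit objects instead of Write-Host strings – something like this sketch, which leans on the same properties shown above:

foreach ($Account in $srv.Mail.Accounts) {
     foreach ($MailServer in $Account.MailServers) {
          # Select-Object emits real objects that can be sorted, filtered, or exported
          $MailServer | Select-Object Name, Port, ServerType, EnableSsl, UseDefaultCredentials, UserName
     }
}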

SQL Server Replication Monitor – Hinky Behaviors…

I was working with replication today and testing out some commands when I ran across some strange behaviors with the monitor sproc.  I’m going to outline what I’ve done here in an effort to document it and hopefully receive some sort of explanation as to what is happening.

Environment:

I’m running the following versions for this test:

Microsoft SQL Server 2008 R2 (SP1) – 10.50.2500.0 (X64)
Jun 17 2011 00:54:03
Copyright (c) Microsoft Corporation
Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)

I have multiple instances located on the development machine that I am using, but here is the relevant setup information:

Instance Name: DEVELOPMENT08
Publication Database: AutomationTestDB_Master
Publication Name: AutomationTest_TRANS
Publication Details: Transactional, Push
Distribution Agent: Continuous, located on same instance (DEVELOPMENT08)

Instance Name: DEVELOPMENT08_01
Subscription Database: AutomationTestDB_Replicate1

Instance Name: DEVELOPMENT08_03
Subscription Database: AutomationTestDB_Replicate3

Producing the behavior:

After profiling the DEVELOPMENT08 instance while using the replication monitor, I found that it inserts a tracer token by executing the following command:

EXEC dbo.Monitor_Replication @publication = [AutomationTest_TRANS]

With replication running and all subscribers up to date, when I execute this command I get the following output:

This information can also be seen on the ‘Tracer Tokens’ tab of the Replication Monitor for this publication:

Everything is as expected – the tracer token was sent to the distributor and returned, reporting latency at each stop along the path.  The problem appeared when I started testing what happens during a simulated network outage.

To simulate the network outage, I stopped one of the subscriber instances (DEVELOPMENT08_01) and ran the replication monitor procedure.  I expected to get a message indicating failure here, but much to my surprise, the sproc reported that the token made it to the subscriber and back without any issue:

WHAT JUST HAPPENED HERE?  I inserted a token and was told by the monitoring sproc that it came back, but the pending commands view shows that there is an undistributed command.  There is absolutely no other traffic on this machine, as it is kept in a virtual development environment that I have full control over.  As stated before, I expected to see a failure indicating that the subscriber was unavailable, or at the very least to see the procedure continue to run.

Now I was curious, so I ran the monitoring sproc again to see what would happen:

This time the sproc just continues to spin, and there are now two undistributed commands.  Why didn’t it complete this time?  I haven’t changed anything about what I am doing here – same procedure as with the first token, but this time the monitoring sproc continues to wait for the token to return (as I expected it to do with the first token).

After allowing this to run for about seven minutes, I started the subscriber instance and observed the following:

The tokens went through, the sproc completed, and there were no undistributed commands left once replication had a chance to catch up.

My next step was to test if this was just a byproduct of executing the command from T-SQL rather than through the replication monitor GUI.  I stopped the subscriber (DEVELOPMENT08_01) again, and used the replication monitor GUI to insert a tracer token:

As seen here, even though replication monitor shows that there is an issue with replication (denoted by the red stop sign icon), the token was reported as having gone through to the subscriber.  Unfortunately, that token is still showing up as an undistributed command, just as when executing this from T-SQL.

I wanted to reproduce the first case again, so I inserted another token without starting the subscriber instance and observed the following:

This time the replication monitor GUI is showing a ‘Pending…’ status and there are now two undistributed commands showing.  Just as in the first example!

Following the first example, it was time to start the subscriber again and observe the results:

Everything is caught up and the tokens have come through!

What is going on here and why does this sproc return successfully on the first attempt when the subscriber is powered down?  I cannot understand this behavior at all and am beginning to think this is a bug.

If anyone can offer any insight into this behavior then it would be greatly appreciated – I’d really like to know what is going on!
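
For anyone who wants to poke at the same behavior, here is roughly how I would post and check a token by hand using the documented procedures (sp_posttracertoken and sp_helptracertokenhistory) instead of the monitoring sproc above.  This is just a sketch against my setup; it assumes Invoke-Sqlcmd is available (via the SqlServerCmdletSnapin100 snap-in on 2008 R2) and that "localhost\DEVELOPMENT08" resolves to the publisher instance:

Add-PSSnapin SqlServerCmdletSnapin100 -ErrorAction SilentlyContinue

# Post a tracer token from the publication database on the publisher
$post = @"
DECLARE @t int;
EXEC sys.sp_posttracertoken @publication = N'AutomationTest_TRANS', @tracer_token_id = @t OUTPUT;
SELECT @t AS TracerTokenId;
"@
$token = Invoke-Sqlcmd -ServerInstance "localhost\DEVELOPMENT08" -Database "AutomationTestDB_Master" -Query $post

# Check latency to the distributor and each subscriber for that token
Invoke-Sqlcmd -ServerInstance "localhost\DEVELOPMENT08" -Database "AutomationTestDB_Master" -Query ("EXEC sys.sp_helptracertokenhistory @publication = N'AutomationTest_TRANS', @tracer_id = {0};" -f $token.TracerTokenId)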

PowerShell – Things I’ve Learned (Part 1)

In my PowerShell adventuring over the last couple of months I have learned quite a few things.  In this series of blog posts I am going to outline each of those items, both for my own reference and in the hopes that others out there can avoid repeating the same mistakes.

When joining any team, we all want to add value to that team.  Existing teams tend to have their strengths and weaknesses, and it is up to you, as the new member, to find out where you can best fit in.  In my new position, I have managed to set myself up as the “PowerShell Guy” – which I am fine with.  Before starting this position I realized that PowerShell was something I wanted to work with as a DBA to help automate my life and simplify my daily tasks.  Taking a position where there was a need (and I was worried about losing my .NET skills) was just what I needed to get things started.  Enough with all of that – now on to some substance!

One of the first mistakes that I found myself making (in serial fashion) was my usage of ‘Write-Host’.  I used this everywhere – for debugging, to return variables, to write information to the screen, etc.  This may not sound too bad to some of you, but there are much better ways to achieve what I was using Write-Host for (see the sketch after this list):

  1. Debugging: When I needed to debug something I would just use Write-Host and comment/uncomment as needed.  I no longer need to do that since I started using Write-Debug.  Now I just add -Debug when calling my functions if I need that functionality.
  2. Informational: I might want to find out where I am in code, where output is coming from, or just provide feedback to the user when running a function.  I was using Write-Host and commenting/uncommenting where applicable.  This became difficult when my supervisor asked me to add a flag that would allow him to see the comments ad hoc.  I initially wrote a lot of logic to handle this request – bad plan.  I found that I could just use Write-Verbose and add -Verbose when calling my functions to get the same functionality.
  3. Output: I was initially using either Write-Host or return $variable to get an object from one function to either the display or another function.  This was another case where I wrote custom logic to determine whether the user wanted to view the output or use it for something else.  I eventually found Write-Output, and it solved my problems.
  4. Displaying Errors: I was using the ‘Write-Host, comment/uncomment’ strategy here too.  Life is easy now that I have employed Write-Error.
  5. Displaying Warnings: Again, this is a similar case to my usage of Write-Error.  Now I use Write-Warning.
  6. Displaying Progress: Instead of using Write-Host to display textual progress, I now use Write-Progress, which provides a nice little progress bar.
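
Here is a minimal sketch pulling all of these together – a hypothetical function, not production code – showing how [CmdletBinding()] lights up the common -Verbose and -Debug switches:

function Test-WriteStreams {
	[CmdletBinding()]   # enables the common -Verbose and -Debug parameters
	param([int]$Count = 3)

	Write-Debug "Entering Test-WriteStreams"    # shown only when called with -Debug
	Write-Verbose "Processing $Count items"     # shown only when called with -Verbose

	if ($Count -lt 1) {
		Write-Warning "Nothing to do."
		return
	}

	for ($i = 1; $i -le $Count; $i++) {
		Write-Progress -Activity "Working" -Status "Item $i of $Count" -PercentComplete (($i / $Count) * 100)
		Write-Output $i                     # real pipeline output, usable downstream
	}
}

# Usage: Test-WriteStreams -Count 5 -Verbose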

PowerShell 2.0 provides so many useful Write-* cmdlets, and it is up to you to use them correctly.  Unfortunately, many of the solutions I found for the tasks I was working on as a newcomer were written for PowerShell 1.0, so I wasn’t aware of all of these newer cmdlets.  Since I’ve found them they have made my code cleaner and more dynamic.  I have even standardized some of my snippets to use Write-Verbose and Write-Error in key sections.

There will be many more entries in the “Things I’ve Learned” series.  I hope that as long as I’m using PowerShell I’ll be able to keep posting to it.  I already have a long list after just 2.5 months in the environment.  My problem now is that I need to break away from the ISE long enough to write about what I’m learning.

New Blog Site!

After getting too busy with moving and life in general I managed to let my previous domain and Joomla! site expire, so I’m starting anew here at WordPress!  This should give me an outlet to share my ponderings on personal experiences and to continue sharing my SQL knowledge with anyone who is interested.  I hope to stay active here and build a solid foundation for (at the very least) my own reference when I have those “how’d I do that before..” moments at work.  It looks like I have quite a bit of configuration and customization to do here, but this is a good time for it as my family hasn’t made the move to our new location yet and I have some time to kill waiting on them to arrive.  I hope that you come back again and that your visit provided some value to your day.