Musings of a Data professional

Stuart Moore

Month: July 2013

PowerShell to move Windows Roles and Features between servers

I've been working a lot on building test and development environments for systems that didn't have them. And as those 'unimportant' environments had been missed out, it wasn't too surprising to find that the build documentation for the servers was less than stellar.

So, how could I make sure that the test and dev boxes had the same Windows Roles and Features without wasting too much time? Well, PowerShell is quickly becoming my default answer to anything like this, and it turns out there's a nice one-liner you can put together to do it as well (well, a one-liner if you've already got the module imported 🙂 )


Import-Module ServerManager
Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Install-WindowsFeature -computername devbox

First up we import the ServerManager module. It's loaded by default if you're running Windows Server 2012, but it's always good practice to make sure you've got it.

Get-WindowsFeature returns a list of all the available Roles and Features on the server. We filter that with Where-Object so we're left with only the roles and features that are installed, then pass the whole lot to Install-WindowsFeature and tell it to install them onto the remote box.

And the nice thing about Install-WindowsFeature is that it will only install anything that's missing, so this snippet can also be used to check a box for missing sub-features.
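If you just want to see what would change before committing, you can lean on -WhatIf. This is only a rough sketch (devbox is the hypothetical remote server from the snippet above), but it should list anything that would be installed without actually touching the box:

Import-Module ServerManager
# Preview which installed roles and features on this server would be added to devbox
Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Install-WindowsFeature -ComputerName devbox -WhatIf

If nothing comes back as needing installation, the two boxes already match.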

Speaking at Cambridge SQL Saturday


I was very happy on Friday morning to open my email and find an invitation from SQL Saturday Cambridge organiser Mark Broadbent (blog|twitter) offering me a speaker slot at the event. Accepting didn't require much thought; it's not the sort of offer you turn down.

So I know where I'll be on the 28th September 2013: Queens' College, Cambridge. I'll be talking about using PowerShell to automate your SQL Server backups and restores, offering some ideas about how to offload some of the more monotonous bits of a DBA's job, and how to ensure you can sleep soundly knowing your backups are good and that you'll be able to prove it.

Looking at the 'leaked' news about the Friday pre-cons on the event's Facebook page, I may well head down for one of those as well. It's looking like another good SQL Saturday event to me.

SQL Southwest User Group presentation on Automating SQL Server backups, restores and verifications with PowerShell


Just a quick post to put up the slides from a presentation I did at SQL Southwest. Also, here are the demo scripts to go along with the presentation: Demo scripts from powershell backup presentation July 2013

Hopefully everything's self-explanatory, but if you want to get in touch with any questions then please do.

T-SQL Tuesday #44 – The second chance

It's the second Tuesday of the month, so it's another T-SQL Tuesday. This month's is being hosted by Bradley Ball (blog | twitter), who's set us the assignment topic of "Second Chances".

Now I'm sure most SQL Server DBAs (and all other techies for that matter) have lots of stories where something went horribly wrong, and they got another chance to get it right, either with or without their employer's best wishes. But those are mostly horror stories to scare people onto the right path!

Instead I wanted to talk about those 'second chances' you get even when the first chance went well (and was probably ignored by the PTB because of that, c'est la vie). Things go well, and we feel happy as we leave the office, but there's always the niggling feeling that we could have done it better somehow. Maybe it could have been faster, or we could have taken the chance to implement more features, and that's where our second chances come in.

Even if I’ve done the same task before, I want another chance to try something different when I do it again. Perhaps I’ve read something online, and it looks good on my test rig, and now I want to try it on something large but not too important. Or I’ve been to some training (or conference) and picked up some useful information from a presenter. This is a perfect opportunity to try it out, but very carefully, unless you want to end up needing one of those less pleasant second chances.

And I take it as good practice to regularly review previously written scripts or setups. Three months after you've written something, it's amazing what can jump out at you as a less than brilliant solution, or how a bit of fresh insight offers a better way to set things up. That's another good reason for keeping notes on how things went when you do a large piece of work. For example, I have a folder full of notes I've built up while building SQL Server clusters over the years. Because I can refer back, I can make sure that I don't repeat mistakes (and there have been some), but I can also look at what can be improved (even the non-technical aspects; my recording spreadsheet for clusters has become a thing of great use rather than an afterthought).

As SQL Server professionals we should always be learning and refining our skills, so we need to take every opportunity we get to improve and learn, even if it's just doing something we've done before, or something we think we can shortcut by reusing the same scripts we've used in the past.

So grab hold of second chances when they come along; they can be a good thing.

Add-WindowsFeature cmdlet fails with the error code 0x800f0922 adding IIS role/feature

(Screenshot: Add-WindowsFeature failing with error 0x800f0922)

I was working on trying to install SSI (Server Side Includes) on a Windows 2012 IIS 8 box set up by someone else. Doing everything the usual way, I was using PowerShell (as I do for all server admin these days, it's just so much faster) like so:

import-module ServerManager
Add-WindowsFeature web-includes

Only to get the wonderful response shown above in the screenshot. Not very informative really. So I double-checked the syntax, but that looked fine. I tried going through the GUI to do it: same error, it just took longer to pop up. I spun up a Windows 2012 VM I had kicking around on my laptop and ran the same snippet: success. So what could it be? Googling turned up some MS notes that didn't offer much information relevant to this particular case, and in one case seemed to suggest a complete rebuild, as the error was down to a corrupted system/IIS config.

Not fancying a rebuild, I thought I'd delve deeper into the IIS configuration to see if there was anything that could be fixed manually. Doing this I discovered that the box had once been part of a shared config setup across a number of boxes, and was now the only one left. BUT it was still set up with a shared config folder. That was quickly removed and the config exported locally. I reran the Add-WindowsFeature, and this time everything added itself properly.

So it looks quite likely that the error is a catch-all (for IIS at least) for when the process can't read or get a lock on the IIS config for one reason or another. So if it crops up, checking through the config files for anything funny is probably a very good first step.
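If you want a quick way to see whether a box is still pointing at a shared configuration, something along these lines should do it. It's only a sketch that reads the redirection.config file IIS uses to store the shared configuration setting (the path below is the default install location, so adjust it if yours differs):

# Read the IIS shared configuration setting from redirection.config
[xml]$redirection = Get-Content "$env:windir\System32\inetsrv\config\redirection.config"
# If enabled is true, the box is still reading its config from a shared location
$redirection.configuration.configurationRedirection

If that comes back with enabled set to true on a box that's supposed to be standalone, the shared configuration wants unpicking (the Shared Configuration page in IIS Manager will do it) before you retry the Add-WindowsFeature.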

Getting Perfmon Data from SQL Server into R for analysis

In the last couple of posts we've looked at how to migrate perfmon data from CSV files into SQL Server (Migrating perfmon CSV files into SQL Server for analysis) and how to use R to graph perfmon data (Simple plot of perfmon data in R).

So, the next obvious step is to combine the two, and use R to graph Perfmon data you’ve imported into SQL Server. It turns out that R makes this pretty simple via the RODBC Package.

Assuming you've created a SQL Server system DSN called perfmon_odbc, the following code snippet will plot you a lovely graph of these counters:

  • % Processor Time
  • Avg. Disk Queue Length
  • Current Connections
  • Processor Queue Length

as recorded on Server A on the 19th June 2013:

install.packages(c("ggplot2","reshape2","RODBC"))
library("ggplot2")
library("reshape2")
library("RODBC")

perfconn<-odbcConnect("perfmon_odbc")
perf<-sqlQuery(perfconn,"select distinct a.CounterDateTime, b.CounterName, a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName='\\\\Server A' and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-19' and a.CounterDateTime<'2013-06-20'")

perf$CounterDateTime <- as.POSIXct(perf$CounterDateTime)
ggplot(data=perf, aes(x=CounterDateTime,y=CounterValue, colour=CounterName)) +
geom_point(size=.5) +
stat_smooth() +
ggtitle("Server A OS Metrics - 11/06/2013")
odbcCloseAll()

Note that this time we didn’t have to melt our data like we did before, as the resultset from RODBC is already in the format we need. But also spot that we’ve had to add some extra \’s to the machine name. \ is an escape character in R, so we need the extra \ to escape the escape character.

Now you've got all the flexibility of creating the dataset via T-SQL. This means you can start doing things like comparing two servers across the same time period:

install.packages(c("ggplot2","reshape2","RODBC"))
library("ggplot2")
library("reshape2")
library("RODBC")

perfconn<-odbcConnect("perfmon_odbc")
perf<-sqlQuery(perfconn,"select distinct a.CounterDateTime, b.MachineName+' '+b.CounterName as 'CounterName', a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName in('\\\\Server A', '\\\\Server B') and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-19' and a.CounterDateTime<'2013-06-20'")

perf$CounterDateTime <- as.POSIXct(perf$CounterDateTime)
ggplot(data=perf, aes(x=CounterDateTime,y=CounterValue, colour=CounterName)) +
stat_smooth() +
ggtitle("Server A &amp; B OS Metrics - 11/06/2013")
odbcCloseAll()

Note that in this SQL query we've specified two machine names, and then to make sure that R can distinguish between the counters we've appended the machine name to the counter name in the column list. I've also taken out the geom_point(size=.5) line, as with the number of counters now being plotted, having the data points makes the curves hard to see and compare.

You can extend this to pull the counters for a server from two different days. This makes it easy to check whether a spike is a normal daily occurrence or really is the source of your current issues:

install.packages(c("ggplot2","reshape2","RODBC"))
library("ggplot2")
library("reshape2")
library("RODBC")

perfconn<-odbcConnect("perfmon_odbc")
perf<-sqlQuery(perfconn,"select distinct a.CounterDateTime, 'Day 1 -'+b.CounterName as 'CounterName', a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName='\\\\Server A' and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-19' and a.CounterDateTime<'2013-06-20' union select distinct a.CounterDateTime, 'Day 2 -'+b.CounterName, a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName='\\\\Server A' and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-12' and a.CounterDateTime<'2013-06-13'")

perf$CounterDateTime <- as.POSIXct(paste("1900-01-01", substr(perf$CounterDateTime,12,28)))
ggplot(data=perf, aes(x=CounterDateTime,y=CounterValue, colour=CounterName)) +
stat_smooth() +
ggtitle("Server A Metrics - 13/06/2013 and 19/06/2013")
odbcCloseAll()

In this case we union the two result sets, and rename the counters to make sure R can identify which set belongs to which date. We also set the date component of the timestamps to the same day (I use 01/01/1900 as it's nice and obvious when looking at a chart later that it's been reset); this is to make sure R plots the time values against each other correctly.
