Musings of a Data professional

Stuart Moore


Day -2 of 31 days of SQL Server backup and Restore using Powershell: Introduction

Welcome to day -2 of 31 days of posts talking about using PowerShell to automate SQL Backup and Recovery.

Yes, it is day minus 2. Before I get onto the how-tos and scripts, and to avoid wasting any of the 31 days, I thought I'd post some background information about why I'd use PowerShell for these tasks, how to set yourself up if you're not already using PowerShell, and also explain some of the assumptions I'll be making in my scripts throughout this series.

So before we talk about the how, let's think about the why.

SQL Server database backups are pretty simple to set up and schedule. If you're running a simple setup, you can use the Maintenance Plan wizard to quickly get your databases backing up on a schedule. But what if you want to do something not so simple? How about being able to choose where a particular database is backed up to? Or catching newly created databases for a full backup during the day? And what can you do about scheduling SQL Express backups when it doesn't come with SQL Server Agent? The scripts and ideas presented here should help you come up with simple solutions to achieve all of this.
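As a small taste of what's coming, here's a minimal sketch (the instance name and backup folder are placeholders, not anything from a real setup) that uses SMO to catch any user database that's never had a full backup and take one – exactly the sort of gap a fixed Maintenance Plan misses, and something you could schedule with Windows Task Scheduler on a SQL Express box:

Import-Module SQLPS -DisableNameChecking
Set-Location c:\

$server = New-Object Microsoft.SqlServer.Management.Smo.Server "MyServer\Instance1"
foreach ($db in $server.Databases | Where-Object {-not $_.IsSystemObject}) {
    # LastBackupDate comes back as DateTime.MinValue for a database that's never been backed up
    if ($db.LastBackupDate -eq [DateTime]::MinValue) {
        $backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
        $backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
        $backup.Database = $db.Name
        $backup.Devices.AddDevice("c:\backups\" + $db.Name + ".bak", [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
        $backup.SqlBackup($server)
    }
}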

We all know we should be regularly restoring our databases to make sure that our backups are valid, and that we have a proper idea of our Recovery Time Objective. But it's not the most riveting part of our job, and SQL Server doesn't offer any way of easily automating database restores without having to write scripts for each individual case. Using PowerShell and its SQLPS module you can easily write scripts that will cope with all restore eventualities. This means you can automate the testing, and only have to intervene when a problem is found.
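Even before you build a full test-restore harness, SMO will let you ask SQL Server to verify a backup file for you. A rough sketch, with a made-up server name and backup path – and note that SqlVerify (a wrapper around RESTORE VERIFYONLY) only checks the backup media, so it's a supplement to proper test restores, not a replacement:

Import-Module SQLPS -DisableNameChecking
Set-Location c:\

$server  = New-Object Microsoft.SqlServer.Management.Smo.Server "MyServer"
$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Devices.AddDevice("c:\backups\App1.bak", [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
# SqlVerify returns $true if SQL Server thinks the backup is readable and complete
if (-not $restore.SqlVerify($server)) {
    Write-Output "Backup failed verification - time to investigate"
}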

And once you can automate backups and restores, it becomes easy to create scripts to migrate environments between servers, restore production data into a dev environment, or speed up creating Mirrored or AlwaysOn High Availability setups.

Tomorrow’s post will discuss the components needed to perform these backups – Day -1 of 31 days of SQL Server backup and Restore using Powershell: SQL Server Powershell Components

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Migrating Databases From SQL Server 2000 to SQL Server 2012 using PowerShell

I seem to have a spate of 3rd party applications finally moving from SQL Server 2000 to a version that's a lot newer (SQL Server 2012). This means a lot of database migration, and as they want to keep the applications up while they test the new versions this means using backup and restore.

As Microsoft only supports restoring backups from the previous three versions (SQL Server 2005, 2008 and 2008 R2) with SQL Server 2012, the process actually has to go like this:

  1. Backup the SQL Server 2000 database
  2. Restore the database onto an instance of SQL Server 2005, 2008 or 2008 R2
  3. “Upgrade” the database by setting its Compatibility Level to the new server version
  4. Backup the new database
  5. Restore the database onto a SQL Server 2012 instance
  6. “Upgrade” the database by setting its Compatibility Level to the new server version

Which to me is a lot of manual handling for quite a lot of databases. And did I mention that steps 2 and 5 will probably also mean relocating data and log files (potentially multiples of each), and some full text indexes as well for good measure? All of which makes this an unappetising prospect to do manually.

So time for some PowerShell automation. First off we import our good friend the SQLPS module, and then define 2 simple functions:

Import-Module SQLPS -DisableNameChecking

# SQLPS leaves us on the SQLSERVER: drive, so move back to the filesystem
Set-Location c:\

function Database-Restore
{
param([String]$SQLServerName, [String]$DataBaseName, [String]$BackupFilePath, [String]$RestorePath, [Boolean]$Upgrade=$FALSE)

    # Connect to the instance we're restoring onto
    $sqlsvr_rest = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server ($SQLServerName)
    $restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
    $devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

    $restoredevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($BackupFilePath,$devicetype)
    $restore.Database = $DataBaseName
    $restore.ReplaceDatabase = $false

    $restore.Devices.Add($restoredevice)

    # If we've been given a restore path, relocate every file in the backup to it
    if ($RestorePath.Length -gt 0){
        $restorefiles = $restore.ReadFileList($sqlsvr_rest)
        foreach($file in $restorefiles){
            $rf = New-Object -TypeName Microsoft.SqlServer.Management.Smo.RelocateFile
            $rf.LogicalFileName = $file.LogicalName
            $rf.PhysicalFileName = Join-Path $RestorePath (Split-Path $file.PhysicalName -Leaf)
            $restore.RelocateFiles.Add($rf)
            Remove-Variable rf
        }
    }

    $restore.SqlRestore($sqlsvr_rest)
    $restore.Devices.Remove($restoredevice)

    # Optionally bring the restored database up to the server's current compatibility level
    if ($Upgrade -eq $TRUE){
        $sqlsvr_rest.Databases.Item($DataBaseName).CompatibilityLevel = $sqlsvr_rest.Databases.Item("Model").CompatibilityLevel
        $sqlsvr_rest.Databases.Item($DataBaseName).Alter()
    }
}

function Database-Backup
{
param([String]$SQLServerName, [String]$DataBaseName, [String]$BackupFilePath)
    $sqlsvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($SQLServerName)
    $backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
    $backup.Database = $DataBaseName

    $devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

    $backupdevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem -ArgumentList $BackupFilePath,$devicetype
    $backup.Devices.Add($backupdevice)
    $backup.SqlBackup($sqlsvr)
    $backup.Devices.Remove($backupdevice)
}

The eagle-eyed amongst you will note the Set-Location c:\ just after the Import-Module. That's because the last thing SQLPS does is a Set-Location to move onto the SQLSERVER: drive, which would make some of the later file copying much trickier, so we just make sure we're back on a filesystem drive.
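You can see the behaviour for yourself in a fresh session:

Import-Module SQLPS -DisableNameChecking
Get-Location    # now sitting on the SQLSERVER: provider drive
Set-Location c:\
Get-Location    # back on the filesystem, so Copy-Item and friends behave as expected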

The first function, Database-Restore, takes the following parameters:

  • SQLServerName – string of the server name (and instance name if it’s a named instance)
  • DataBaseName – string of your database name
  • BackupFilePath – string of where the backup you want to restore is held
  • RestorePath – the path you want to relocate the restored files to
  • Upgrade – optional boolean determining whether to upgrade the restored database to the server’s current compatibility level

Nothing too unusual in the function; going through it quickly, we perform the following operations:

  • Create a connection to SQL Server
  • Create a restore object, set its database, build a restore device and add our backup file to it. We set the option not to overwrite an existing database, just in case…
  • If we’ve passed in a folder to relocate to, read the list of files from the backup, then loop through it creating a RelocateFile object for each of them, adding them to our Restore object
  • Perform the actual restore
  • Then, if we’ve set Upgrade to TRUE, we set the compatibility level equal to that of Model, and call Alter() to write it back

The second function, Database-Backup, takes the following parameters:

  • SQLServerName – string of the server name (and instance name if it’s a named instance)
  • DataBaseName – string of your database name
  • BackupFilePath – string of where to put the backup

So how are these going to help us migrate our databases? Simple: we can call them repeatedly and leave them running while we get on with something more interesting. If you’re lucky enough to have a shared backup drive that all your SQL Server instances can read and write to, then the upgrade is as easy as:

Database-Backup -SQLServerName "SQL2000\InstanceA" -DatabaseName "App1" -BackupFilePath "\\shared\backupdrive$\app1-2000.bak"
Database-Restore -SQLServerName "SQL2008" -DatabaseName "App1" -BackupFilePath "\\shared\backupdrive$\app1-2000.bak" -RestorePath "e:\dbs" -Upgrade $TRUE
Database-Backup -SQLServerName "SQL2008" -DatabaseName "App1" -BackupFilePath "\\shared\backupdrive$\app1-2005.bak"
Database-Restore -SQLServerName "SQL2012\InstanceB" -DatabaseName "App1" -BackupFilePath "\\shared\backupdrive$\app1-2005.bak" -RestorePath "f:\dbs" -Upgrade $TRUE

If not, then you’ll have to do some file moving yourself like so:

Database-Backup -SQLServerName "SQL2000\InstanceA" -DatabaseName "App1" -BackupFilePath "g:\backups\app1-2000.bak"
Copy-Item "\\SQL2000\g$\backups\app1-2000.bak" "\\SQL2008\f$\backups\app1-2000.bak"
Database-Restore -SQLServerName "SQL2008" -DatabaseName "App1" -BackupFilePath "f:\backups\app1-2000.bak" -RestorePath "e:\dbs" -Upgrade $TRUE
Database-Backup -SQLServerName "SQL2008" -DatabaseName "App1" -BackupFilePath "f:\backups\app1-2008.bak"
Copy-Item "\\SQL2008\f$\backups\app1-2005.bak" "\\SQL2012\f$\backups\app1-2005.bak"
Database-Restore -SQLServerName "SQL2012\InstanceB" -DatabaseName "App1" -BackupFilePath "f:\backups\app1-2005.bak" -RestorePath "f:\dbs" -Upgrade $TRUE

When placing the files, remember that the script runs as the Windows account calling it, but the SQL Server backups and restores run as the Database Engine service account.

Now I can happily migrate those databases to a happier place repeatedly without having to take time out from other larger projects. Win for me, and win for the people I’m working for.

Downloads for “Automating SQL Server Backups and Restores with PowerShell” presentation from Southampton SQL Server User Group 07/08/2013

Hi, if you’ve just landed here, you might be interested in the series of posts 31 days of SQL Server Backup and Restore with PowerShell, where I’ll be providing more information about the concepts and scripts in this presentation.

Scripts and Demos for PowerShell SQL Server Backup and Restore presentation given at Southampton SQL Server User Group 7th August 2013

Includes the presentation as a pdf, all the PowerShell scripts used during the presentation, the SQL scripts to build the demo databases, plus the backups of a couple of the databases to speed things up.

Readme.txt gives a quick overview of each script and the order to run through them in.

Any problems or questions, please get in touch.

SQL Southwest User Group presentation on Automating SQL Server backups, restores and verifications with PowerShell


Just a quick post to put up the slides from a presentation I did at SQL Southwest. Also here are the demo scripts to go along with the presentation – Demo scripts from powershell backup presentation July 2013

Hopefully everything’s self-explanatory, but if you want to get in touch with any questions then please do.

T-SQL Tuesday #44 – The second chance

It’s the 2nd Tuesday of the month, so it’s another T-SQL Tuesday. This month’s is being hosted by Bradley Ball (blog | twitter), who’s set us the assignment topic of “Second Chances”.

Now I’m sure most SQL Server DBAs (and all other techies for that matter) have lots of stories where something went horribly wrong, and they got another chance to get it right, either with or without their employer’s best wishes. But those are mostly horror stories to scare people onto the right path!

Instead I wanted to talk about those ‘second chances’ you get even when the first chance went well (and was probably ignored by the PTB because of that, c’est la vie). They go well, and we feel happy as we leave the office. But there’s always that niggling feeling that we could have done it better somehow. Maybe it could have been faster, or we could have taken the chance to implement more features, and that’s where our second chances come in.

Even if I’ve done the same task before, I want another chance to try something different when I do it again. Perhaps I’ve read something online, and it looks good on my test rig, and now I want to try it on something large but not too important. Or I’ve been to some training (or conference) and picked up some useful information from a presenter. This is a perfect opportunity to try it out, but very carefully, unless you want to end up needing one of those less pleasant second chances.

And I take it as good practice to regularly review previously written scripts or setups. Three months after you’ve written something, it’s amazing what can jump out at you as a less than brilliant solution, or what a bit of fresh insight can offer as a better way to set things up. That’s another good reason for keeping notes on how things went when you do a large piece of work. For example, I have a folder full of notes I’ve built up building SQL Server clusters over the years. Because I can refer back, I can make sure I don’t repeat mistakes (and there have been some), but I can also look at what can be improved (even non-technical aspects: my recording spreadsheet for clusters has become a thing of great use rather than an afterthought).

As SQL Server professionals we should always be learning and refining our skills, so we need to take every opportunity we get to improve and learn – even if it’s just doing something we’ve done before again, or we think we can shortcut things by reusing the same scripts we’ve used before.

So grab hold of second chances when they come along as they can be a good thing.

Getting Perfmon Data from SQL Server into R for analysis

In the last couple of posts we’ve looked at how to migrate perfmon data from CSV files into SQL Server (Migrating perfmon CSV files into SQL Server for analysis) and how to use R to graph Perfmon data (Simple plot of perfmon data in R).

So, the next obvious step is to combine the two, and use R to graph Perfmon data you’ve imported into SQL Server. It turns out that R makes this pretty simple via the RODBC Package.

Assuming you’ve created a SQL Server system DSN called perfmon_odbc, then the following code snippet would plot you a lovely graph of the counters:

  • % Processor Time
  • Avg. Disk Queue Length
  • Current Connections
  • Processor Queue Length

as recorded on Server A on the 19th June 2013:

install.packages(c("ggplot2","reshape2","RODBC"))
library("ggplot2")
library("reshape2")
library("RODBC")

perfconn<-odbcConnect("perfmon_odbc")
perf<-sqlQuery(perfconn,"select distinct a.CounterDateTime, b.CounterName, a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName='\\\\Server A' and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-19' and a.CounterDateTime<'2013-06-20'")

perf$CounterDateTime <- as.POSIXct(perf$CounterDateTime)
ggplot(data=perf, aes(x=CounterDateTime,y=CounterValue, colour=CounterName)) +
geom_point(size=.5) +
stat_smooth() +
ggtitle("Server A OS Metrics - 11/06/2013")
odbcCloseAll()

Note that this time we didn’t have to melt our data as we did before, because the result set from RODBC is already in the format we need. But also spot that we’ve had to add some extra \’s to the machine name: \ is an escape character in R, so we need the extra \ to escape the escape character.

Now you’ve got all the flexibility of creating the dataset via T-SQL. This means you can start doing things like comparing two servers across the same time period:

install.packages(c("ggplot2","reshape2","RODBC"))
library("ggplot2")
library("reshape2")
library("RODBC")

perfconn<-odbcConnect("perfmon_odbc")
perf<-sqlQuery(perfconn,"select distinct a.CounterDateTime, b.MachineName+' '+b.CounterName as 'CounterName', a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName in('\\\\Server A', '\\\\Server B') and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-19' and a.CounterDateTime<'2013-06-20'")

perf$CounterDateTime <- as.POSIXct(perf$CounterDateTime)
ggplot(data=perf, aes(x=CounterDateTime,y=CounterValue, colour=CounterName)) +
stat_smooth() +
ggtitle("Server A &amp; B OS Metrics - 11/06/2013")
odbcCloseAll()

Note that in this SQL query we’ve specified 2 machine names, and then, to make sure R can distinguish between the counters, we’ve appended the machine name to the counter name in the column list. I’ve also taken out the geom_point(size=.5) line, as with the number of counters now being plotted, having the data points makes the curves hard to see and compare.

You can extend this to pull the counters for a server from 2 different days. This makes it easy to check whether a spike is a normal daily occurrence, or really is the source of your current issues:

install.packages(c("ggplot2","reshape2","RODBC"))
library("ggplot2")
library("reshape2")
library("RODBC")

perfconn<-odbcConnect("perfmon_odbc")
perf<-sqlQuery(perfconn,"select distinct a.CounterDateTime, 'Day 1 -'+b.CounterName as 'CounterName', a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName='\\\\Server A' and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-19' and a.CounterDateTime<'2013-06-20' union select distinct a.CounterDateTime, 'Day 2 -'+b.CounterName, a.CounterValue from CounterData a inner join CounterDetails b on a.CounterID=b.CounterID where b.MachineName='\\\\Server A' and b.CounterName in('% Processor Time','Avg. Disk Queue Length','Current Connections','Processor Queue Length') and a.CounterDateTime>'2013-06-12' and a.CounterDateTime<'2013-06-13'")

perf$CounterDateTime <- as.POSIXct(paste("1900-01-01", substr(perf$CounterDateTime,12,28)))
ggplot(data=perf, aes(x=CounterDateTime,y=CounterValue, colour=CounterName)) +
stat_smooth() +
ggtitle("Server A Metrics - 13/06/2013 and 19/06/2013")
odbcCloseAll()

In this case we union the 2 result sets, and rename the counters to make sure R can identify which set belongs to which date. We also set the date component of the timestamps to the same day (I use 01/01/1900 as it’s nice and obvious, when looking at a chart later, that it’s been reset); this is to make sure R plots the time values against each other correctly.

Calling a PowerShell function in a Start-Job script block when it’s defined in the same script

While writing some scripts for some upcoming SQL Server User Group presentations I had to remind myself of how to do something in PowerShell, and thought I’d put a reminder up here for myself and for anyone else who needs it.

The problem was that I wanted to show firing off multiple SQL Server backup jobs asynchronously from within a PowerShell script. Sounds nice and simple: write some custom functions, and then fire them off with Start-Job as required. And while that’s the best way to do things (PowerShell rule 1: write functions, not scripts), in this case I wanted to keep everything in one script as it would make things easier when presenting.

So the basic setup is:

function backup_db_id
{
param([int]$m, [int]$md)

    Import-Module "SQLPS" -DisableNameChecking
    # ...do stuff...
}

while($i -lt $x){
    Start-Job -ScriptBlock <WhatDoIPutHere?>
}

Normally I’d just add the other script or module in there, but what about the function? Well, the trick is to pass the function in as a variable via -InitializationScript, and then call it in the -ScriptBlock like so:

$func = {function backup_db_id
{
param([int]$m, [int]$md)

    Import-Module "SQLPS" -DisableNameChecking
    # ...do stuff...
}
}

while($i -lt $x){
    Start-Job -ScriptBlock {backup_db_id} -InitializationScript $func
}

So that’s the first step. But I also want to pass through the two variables, so I’ll need to parameterise the function call. Which means stacking things like this:

$func = {function backup_db_id
{
param([int]$m, [int]$md)

    Import-Module "SQLPS" -DisableNameChecking
    # ...do stuff...
}
}

while($i -lt $x){
    Start-Job -ScriptBlock {param($tm,$ti) backup_db_id $tm $ti} -InitializationScript $func -ArgumentList $x, $i
    $i++
}

And now everything is passed through happily.

The other thing to note from the function definition here is that I’ve had to include the Import-Module to load the SQLPS extensions, as PowerShell starts up a new instance for every Start-Job, and that instance won’t have the SQLPS extensions loaded (unless you’ve set that as a system-wide default, but I don’t always want to rely on it).
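Once the jobs are away, the standard job cmdlets (nothing specific to this script) will wait for them all and gather any output:

# Wait for all the backup jobs to finish, collect their output, then tidy up
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job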

Upcoming SQL Server User Group sessions I’m speaking at

Just a quick note about some upcoming sessions I’m presenting at SQL Server User groups over the next couple of months:

18th July – SQL SouthWest (Exeter) – Using PowerShell to Automate SQL Server Backups and Restores 

7th August – SQL Southampton – Using PowerShell to Automate SQL Server Backups and Restores

17th October – Cardiff SQL User Group – Using PowerShell to Automate SQL Server Backups and Restores

The session will be about how I’m using PowerShell to manage awkward backup scheduling (instances with hundreds of databases), performing automated restores to test those backups, and then to verify the restored database. With all of this being logged somewhere to keep the auditors happy.

Hopefully see some of you there.

Migrating perfmon CSV files into SQL Server for analysis

Up to a certain size, and for quick sampling, keeping perfmon data in a CSV file makes life easy: it’s easily pulled into Excel, dumped into a pivot table, and then analysed and distributed.

But once the files get up to a decent size, or you’re working with multiple files, or you want to correlate logs from different servers, it makes sense to migrate them into SQL Server. Windows comes with a handy utility to do this: relog, which can convert perfmon output to other formats, including SQL Server.

First off you’ll need to set up an ODBC DSN on the machine you’re running the import from. Nothing strange here, but you need to make sure you use the standard SQL Server driver.

Make sure you use the standard SQL Driver when setting up an ODBC connection for relog to load perflog data into SQL Server
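If you’re on Windows 8 or Server 2012 (or later) you can script the DSN creation as well. A sketch using the Add-OdbcDsn cmdlet, with placeholder server and database names – note we’re picking the standard “SQL Server” driver, not the Native Client:

# "SQL Server" is the standard driver; the server and database names here are made up
Add-OdbcDsn -Name "perfmon_odbc" -DsnType "System" -Platform "64-bit" `
    -DriverName "SQL Server" `
    -SetPropertyValue @("Server=MyServer", "Database=perfmon_data", "Trusted_Connection=Yes")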

If you use the SQL Server Native Client you’re liable to run into this uninformative error:

0.00% Error: A SQL failure occurred. Check the application event log for any errors.

and this equally unhelpful 3402 error in the Application Event log:

The informative 3402 message relog returns if you're using the wrong ODBC driver

Once you’ve got that setup it’s just the syntax of the relog command to get through. The documentation says you need to use:

-o { output file | DSN!counter_log }

What it doesn’t say is that counter_log is the name of the database you want to record the data into. So assuming you want to migrate your CSV file, c:\perfmon\log.csv, into the database perfmon_data using the DSN perfmon_odbc, you’d want to use:

relog "c:\perfmon\log.csv" -f SQL -o SQL:perfmon_odbc!perfmon_data

Running that you should get a nice whirling % count as your records are loaded up to SQL Server.
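And since you’ll often have more than one log to load, a quick PowerShell loop (the folder path is an assumption) deals with the multiple-file case mentioned earlier:

# Push every perfmon CSV in the folder into SQL Server through the same DSN
Get-ChildItem "c:\perfmon\*.csv" | ForEach-Object {
    relog $_.FullName -f SQL -o "SQL:perfmon_odbc!perfmon_data"
}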

