Musings of a Data professional

Stuart Moore


Day 8 of 31 Days of SQL Server Backup and Restore using PowerShell: Asynchronous Backups – Part 2 – PowerShell Jobs

Yesterday we looked at using PowerShell Events to track asynchronous backups fired off concurrently from the same script. All of those backups ran independently of each other, so what do you do if you want to run batches of backups concurrently (e.g. all the dbs on Server A and all the dbs on Server B at the same time)?

That’s where PowerShell Jobs come in. Introduced in PowerShell 2.0, they let you fire off a block of script and have it run in the background. You can then wait for the jobs to finish, or poll them yourself.
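If you've not used jobs before, the basic lifecycle looks like this, stripped of anything SQL Server specific:

```powershell
# Fire off a script block to run in the background
$job = Start-Job -ScriptBlock { 1 + 1 }

# Block until the job has finished
Wait-Job $job | Out-Null

# Collect the job's output, then remove it from the job list
$result = Receive-Job $job
Remove-Job $job

$result    # 2
```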

We’ll use the example above of wanting to back up all the databases on 2 SQL Server installs concurrently:

$JobFunction = {function backup_server_dbs
{
    param([String]$ServerName)

    Import-Module "SQLPS" -DisableNameChecking

    $SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)

    $output = ""
    foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $BackupSetDescription = "Full Backup of "+$Db.Name
        $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        $failed = 0
        try{
            Backup-SqlDatabase -ServerInstance $ServerName -Database $Db.Name -BackupFile $BackupName -BackupSetDescription $BackupSetDescription
        }
        catch [System.Exception]{
            $output += "`nError with $($Db.Name) at "+(Get-Date)
            $output += ($error[0].ToString())
            $failed = 1
        }
        finally{
            if ($failed -ne 1){
                $output += "`nFinished backup of $($Db.Name) at "+(Get-Date)
            }
        }
    }
    return $output
}
}

$servers = @("Server1","Server2")
$jobs = @()

foreach ($servername in $servers){
    $jobs += Start-Job -ScriptBlock {backup_server_dbs $args[0]} -InitializationScript $JobFunction -ArgumentList $servername
}

$jobs | Wait-Job

$FullOutput = ""
foreach ($job in $jobs){
    Receive-Job $job | Tee-Object -Variable joutput
    $FullOutput += $joutput
    Remove-Job $job
}

Write-Output $FullOutput

This script starts off a little differently from every backup script we’ve looked at so far in this series. The first thing we do is create a variable, $JobFunction, which holds the definition of a function called backup_server_dbs. The reason for this will become clear slightly later on in the script.

backup_server_dbs accepts one parameter, a string called $ServerName. After that it should look pretty familiar from our previous articles: we build a SQL Server connection, loop through all of its databases (ignoring poor TempDB) and back each one up. The main difference is the inclusion of a Try, Catch, Finally block around the actual backup. If you’ve not come across this before, it’s a simple way of handling errors in PowerShell. At a high level we are:

  • Try: Try doing something
  • Catch: If “trying” has caused an error, Catch it
  • Finally: Whatever happened when trying, Finally we’ll do this

So here we Try to perform the backup; if there’s an error we Catch it and add the error message to our output string, and Finally, if we’ve had no errors, we append the completion time to our output string.
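As a standalone sketch of the pattern (nothing SQL Server specific here):

```powershell
$log = ""
try {
    # Try: do something that may fail
    throw "backup failed"
}
catch {
    # Catch: record the error message
    $log += "Caught: $($_.Exception.Message)"
}
finally {
    # Finally: this runs whether or not the try succeeded
    $log += " - cleanup ran"
}
```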

Once we’ve attempted to back up all of our databases we exit the function, returning our $output variable.

Moving on from the function, we set up 2 array variables: the first, $Servers, holds our server names; the 2nd starts unpopulated and is going to hold the jobs we create to run the backups.

Using foreach we loop through each server we’ve provided, and start a new job for each one with the Start-Job cmdlet.

And this is the reason we created a variable holding a function: so we can pass it into Start-Job as an InitializationScript and then call the function contained within it from the ScriptBlock parameter. A slightly cleaner way of doing this is to put the function in a secondary PowerShell script file and dot-source that file in the InitializationScript, but that’s a little hard to show in a simple blog posting, and this way also keeps everything in one file.
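If you did want to go the script-file route, the shape is something like the following sketch. The file name and the stub function body are purely hypothetical stand-ins here; in real use the file would hold the full backup_server_dbs definition:

```powershell
# Hypothetical example: write a shared function out to a script file,
# then dot-source that file into each job via -InitializationScript
$FuncFile = Join-Path ([System.IO.Path]::GetTempPath()) "BackupFunctions.ps1"
'function backup_server_dbs { param([String]$ServerName) "would back up $ServerName" }' | Set-Content $FuncFile

$InitFromFile = [ScriptBlock]::Create(". '$FuncFile'")
$job = Start-Job -ScriptBlock { backup_server_dbs $args[0] } -InitializationScript $InitFromFile -ArgumentList "Server1"
$result = $job | Wait-Job | Receive-Job
Remove-Job $job
```

This keeps the function in one place on disk for several scripts to share, at the cost of the script no longer being self-contained.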

Once we’ve submitted the jobs we pass the array containing them ($jobs) into Wait-Job. This pauses execution until all the jobs in $jobs have completed.

Once all the jobs have completed we ForEach through each $job in the $jobs array, use the Receive-Job cmdlet to get the return information from the job, pass it through Tee-Object to grab the output into a temporary holder, append that to a full output variable, and then pass the job to Remove-Job to clear it from the jobs list.

And finally we output the Success and Error Messages back to the console.

This basic approach can be used anytime you want batches of work running at the same time. By extending the function to take a Filegroup name and a Backup location you could back up each filegroup to a different set of disks, thereby maximising your throughput by not hitting just a single controller, NIC or HBA:

Import-Module SQLPS -DisableNameChecking

$JobFunction={function backup_server_dbs
{param([String]$ServerName, [String]$DatabaseName, [String]$FileGroup, [String]$BackupLocation)
    
    Import-Module "SQLPS" -DisableNameChecking

    $BackupSetDescription = "Filegroup backup of Filegroup "+$FileGroup+" of "+$DatabaseName
    
    $BackupName = $BackupLocation+"\"+$DatabaseName+"_"+$FileGroup+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $failed = 0
    try{
        Backup-SqlDatabase -ServerInstance $ServerName -Database $DatabaseName -BackupFile $BackupName -BackupSetDescription $BackupSetDescription -DatabaseFileGroup $FileGroup -BackupAction Files
    }
    catch [System.Exception]{
        $output += "`nError with $DatabaseName at "+(Get-Date)
        $output += "`n vars = $ServerName, $DatabaseName, $BackupName, $BackupLocation"
        $output += ($error[0].ToString())
        $failed = 1
    }
    finally{
        if ($failed -ne 1){
            $output += "`nFinished backup of $DatabaseName at "+(Get-Date)
        }
    }

    return $output
}
} 

$servername = "Server1"
$DatabaseName = "fg_test"
$BackupLocations = @("c:\psbackups\fg1","\\Server2\shareA","\\Server3\ShareB")
$jobs = @()
$i=0

$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$db = $SQLSvr.Databases.Item($DatabaseName)

foreach ($FileGroup in $db.FileGroups){
    $jobs += Start-Job -ScriptBlock {backup_server_dbs $args[0] $args[1] $args[2] $args[3]} -InitializationScript $JobFunction -ArgumentList $servername, $DatabaseName, $FileGroup.Name, $BackupLocations[$i]

    $i++
}


$jobs | Wait-Job

$FullOutput = ""
foreach ($job in $jobs){
    Receive-Job $job | Tee-Object -Variable joutput
    $FullOutput += $joutput
    Remove-Job $job
}

Write-Output $FullOutput

If you were writing the above script for production use, you’d probably want to include a check to ensure you’ve provided at least as many backup locations as you have filegroups.
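A minimal sketch of that check, using stand-in values (in the real script the counts would come from $db.FileGroups and $BackupLocations):

```powershell
# Stand-in values for illustration
$FileGroupNames = @("PRIMARY","secondary","tertiary")
$BackupLocations = @("c:\psbackups\fg1","\\Server2\shareA")

# Flag the shortfall before submitting any jobs
$EnoughLocations = $BackupLocations.Count -ge $FileGroupNames.Count
if (-not $EnoughLocations){
    Write-Warning ("{0} filegroups but only {1} backup locations supplied" -f $FileGroupNames.Count, $BackupLocations.Count)
}
```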

Tomorrow we’ll be looking at scheduling PowerShell Tasks.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 7 of 31 days of SQL Server Backup and Restore using PowerShell: Asynchronous backups – Part 1

So far all of the backup scripts we’ve looked at have been firing off backups synchronously, that is, each one happens after the previous one has completed. This is the same as happens with SQL Server Maintenance Plans.

Within PowerShell we have 2 ways of submitting multiple backup jobs at the same time, each with its own advantages and drawbacks.

Method 1 – SMO’s SQLBackupAsync method and PowerShell Events

SMO’s backup object offers its own SqlBackupAsync method which can simply be fired off. This can also be extended with PowerShell’s Event subsystem to allow you to capture the status of each backup.

Method 2 – PowerShell Jobs:

PowerShell 2.0 introduced Jobs into its bag of tricks. These allow you to fire off a block of script, let it run in the background while you do other work, and then pick up the results at a later point. Jobs are simpler to work with, and allow you to group elements of work that have to happen together (e.g. if you have 2 dbs that must be backed up together, or need to make sure a filesystem backup happens at the same time as the SQL Server backup).

Today we’ll look at Method 1, and then move onto Method 2 tomorrow.

So without further ado, here’s the basics of an asynchronous backup script:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$BackupPath = "c:\psbackups"

$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

$Backups = @()
$i = 0

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $btmp = @{}
    $BackupConn = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupName = $BackupPath+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $Db.Name
    $BackupDevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)
    $Backup.Devices.Add($BackupDevice)
    $btmp["backup"] = $Backup
    $btmp["connection"] = $BackupConn
    $backups += $btmp

    $backups[$i]["backup"].SqlBackupAsync($backups[$i]["connection"])
    $i++
}

This script will perform a full database backup of each database on the specified server except for TempDB. But it’s the PowerShell equivalent of throwing something and not worrying about where it lands or whether it shatters.

Working through the script:

It starts off normally with loading the SQLPS module, setting the now usual variables, and creating the familiar Database and Server Connection objects.

Then we have this line

$Backups=@() 

For those of you not familiar with PowerShell, this creates a new array variable called $Backups. This is because for asynchronous backups we need to hold a connection object for EACH backup. Up till now, as we’ve been submitting the backups synchronously, they are queued up and submitted one by one, so they can all use the same connection. As we’re submitting these asynchronously they each need their own connection. The simplest way to keep track of which connection goes with which backup is to track them all in a multidimensional array, and we build the first dimension here.

My old friend the counter $i is also set to 0; this will be used to keep track of our array elements.

Then we drop into our normal ForEach loop through the databases on our server.

First we build a temporary hash $btmp which will hold the backup information before we add it to $Backups. If you’ve not met a PowerShell Hash before, it’s an associative array, so it holds data like:

$hash["author"]="stuart"
$hash["topic"]="PowerShell"

And since this is PowerShell, each value can in fact be an object, which is exactly what we’re going to use it for.

We build a new connection object for our backups called $BackupConn, and then a Backup object as normal. We then add these to our $btmp hash as follows:

$btmp["backup"]= $Backup
$btmp["connection"] = $BackupConn

And then we append this to our $Backups array.

As demonstrated in the next line when we actually submit the backup:

$backups[$i]["backup"].SqlBackupAsync($backups[$i]["connection"])

We can reference our Backup and its specific connection by the index into our $Backups array, and then the key of the contained hash.

If you’re backing up enough databases with enough data you’ll have time to switch to SQL Server Management Studio and run sp_who2 to see all the connections and backup processes running.

All the backups will now be submitted on their own connections and SQL Server will process them in whichever order it wants. That last point is very important to remember: when running backups asynchronously there is no guarantee of the order in which the backups will be run, so if you need to ensure a certain order you’ll either need to stick to running them synchronously or look at the PowerShell Jobs method in tomorrow’s post, which allows you to batch them up.

Now, this is all great, but as mentioned at the start of this post we’re just throwing the backups at SQL Server, and have no way of knowing what’s completed. So we’ll now add some extra code to the script so we can get this information back.

To do this we’re going to be using PowerShell Events. Many PowerShell methods expose events which fire at certain points during the execution of the method. Your script can poll for these events, and then take appropriate action when they fire.

For this script we’re going to register the following 2 events:

    Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Complete" -SourceIdentifier "asyncbkup-complete-$($db.name)" -MessageData "$($db.name)-$i" | Out-Null
    Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Information" -SourceIdentifier "asyncbkup-info-$($db.name)" -MessageData "$($db.name)-$i" | Out-Null

These events fire when our backup completes, or when it wants to return some other informational message (e.g. an error). In each case we register the event against the specific backup object using our multidimensional array $backups, specify the event we want to catch, give this particular registration an identifier to distinguish it from all other events, and then give it a ‘payload’ which we can retrieve later.

We then need to poll the events till we get one of ours. There are a number of ways this loop can be run: either as a continuous while loop, or from a Timer which fires at regular intervals.
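For the Timer variant, a System.Timers.Timer raises an Elapsed event at a fixed interval, which can be registered and consumed in exactly the same way as the backup events. A rough sketch (not part of the script below):

```powershell
# Register for the timer's Elapsed event, just like the backup events
$timer = New-Object System.Timers.Timer
$timer.Interval = 500      # milliseconds
Register-ObjectEvent -InputObject $timer -EventName Elapsed -SourceIdentifier "poll-tick" | Out-Null
$timer.Start()

# Wait for one tick; a real script would check backup status on each tick
$tick = Wait-Event -SourceIdentifier "poll-tick"
Remove-Event -EventIdentifier $tick.EventIdentifier

$timer.Stop()
Unregister-Event -SourceIdentifier "poll-tick"
```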

In this example we’re going to add a “complete” value to our hash which we’ll initially set to 0. As each backup completes (successfully or unsuccessfully) we’ll set the value to 1. If we sum all the “complete” values in our $backups array, when it’s equal to the number of entries in the array then we know that all the backups have finished and we can exit the loop.
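With each hash carrying that “complete” flag, the exit test for the loop is just a sum over the array. A standalone sketch with dummy values:

```powershell
# Dummy entries standing in for the real backup hashes
$backups = @( @{complete=1}, @{complete=0}, @{complete=1} )

# Member enumeration (PowerShell 3.0+) collects every "complete" value
$done = ($backups.complete | Measure-Object -Sum).Sum

# All finished only when every flag is 1
$AllFinished = $done -eq $backups.Length
```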

While in the loop we use the Wait-Event cmdlet to catch any events. We use Tee-Object to get the information from the event while still being able to pass it on to Remove-Event; we need to do that to remove the event from the queue and prevent processing it repeatedly.

Then we split the payload into an array index and the database name, pull the ExecutionStatus from the Backup object firing the event, and get the error message (if there is one). Then, using a switch statement to catch the possible event statuses, we build up an $output string to record the information, and mark the appropriate records as completed.

After the loop exits we clear all Event registrations, and output the message we built up:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$BackupPath = "c:\psbackups"

$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

$Backups = @()
$i = 0

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $btmp = @{}
    $BackupConn = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $BackupName = $BackupPath+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $Db.Name
    $Backup.Devices.AddDevice($BackupName,'File')
    $btmp["backup"] = $Backup
    $btmp["connection"] = $BackupConn
    $btmp["complete"] = 0
    $backups += $btmp

    Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Complete" -SourceIdentifier "asyncbkup-complete-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null
    Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Information" -SourceIdentifier "asyncbkup-info-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null

    $backups[$i]["backup"].SqlBackupAsync($backups[$i]["connection"])
    $i++
}

$output = ""

while (($backups.complete | Measure-Object -Sum).Sum -lt $backups.Length){
    Wait-Event | Tee-Object -Variable event | Remove-Event

    $ErrorText = $event.SourceEventArgs.Error.Message
    $DbName,$i = $event.MessageData.Split('-')

    $status = $event.SourceArgs[0].AsyncStatus.ExecutionStatus

    if ($backups[$i]["complete"] -ne 1){
        $now = Get-Date
        switch ($status){
            "Succeeded" {
                $output += "`n $DbName Completed at $now"
                $backups[$i]["complete"] = 1
                break}
            "Failed" {
                $output += "`n $DbName Failed at $now with $ErrorText"
                $backups[$i]["complete"] = 1
                break}
            "InProgress" {
                $output += "`n $DbName In Progress at $now"
                break}
            "Inactive" {
                $output += "`n $DbName Inactive at $now"
                break}
        }
    }
}

Get-EventSubscriber | Unregister-Event
Write-Output $output

As an example of the output from this script, here’s a sample from one of my test boxes where I’d marked one DB as offline:

InternetSales Failed at 09/09/2013 21:32:08 with Database 'InternetSales' cannot be opened because it is offline.
 dbgrow In Progress at 09/09/2013 21:32:09
 dbgrow In Progress at 09/09/2013 21:32:09
 AWDataWarehouse In Progress at 09/09/2013 21:32:09
 AWDataWarehouse In Progress at 09/09/2013 21:32:09
 fg_test In Progress at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 HumanResources In Progress at 09/09/2013 21:32:10
 dbgrow Completed at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 HumanResources In Progress at 09/09/2013 21:32:10
 msdb In Progress at 09/09/2013 21:32:11
 model In Progress at 09/09/2013 21:32:11
 AWDataWarehouse Completed at 09/09/2013 21:32:11
 modellock2 In Progress at 09/09/2013 21:32:11
 modellock In Progress at 09/09/2013 21:32:11
 model In Progress at 09/09/2013 21:32:11
 msdb In Progress at 09/09/2013 21:32:11
 psrestore In Progress at 09/09/2013 21:32:12
 modellock In Progress at 09/09/2013 21:32:12
 modellock2 In Progress at 09/09/2013 21:32:12
 ResellerSales In Progress at 09/09/2013 21:32:12
 SomeDB In Progress at 09/09/2013 21:32:12
 SomeDB In Progress at 09/09/2013 21:32:12
 psrestore In Progress at 09/09/2013 21:32:12
 SomeDB In Progress at 09/09/2013 21:32:12
 ResellerSales In Progress at 09/09/2013 21:32:12
 master In Progress at 09/09/2013 21:32:13
 HumanResources Completed at 09/09/2013 21:32:13
 restoretime In Progress at 09/09/2013 21:32:13
 restoredt In Progress at 09/09/2013 21:32:13
 fg_test Completed at 09/09/2013 21:32:13
 master In Progress at 09/09/2013 21:32:13
 Staging In Progress at 09/09/2013 21:32:13
 restoretime In Progress at 09/09/2013 21:32:13
 Staging In Progress at 09/09/2013 21:32:13
 restoredt In Progress at 09/09/2013 21:32:13
 psrestore Completed at 09/09/2013 21:32:13
 msdb Completed at 09/09/2013 21:32:13
 model Completed at 09/09/2013 21:32:13
 ResellerSales Completed at 09/09/2013 21:32:13
 modellock Completed at 09/09/2013 21:32:13
 modellock2 Completed at 09/09/2013 21:32:13
 test1 In Progress at 09/09/2013 21:32:13
 test1 In Progress at 09/09/2013 21:32:13
 Products In Progress at 09/09/2013 21:32:13
 TSQL2012 In Progress at 09/09/2013 21:32:13
 Products In Progress at 09/09/2013 21:32:13
 TSQL2012 In Progress at 09/09/2013 21:32:13
 master Completed at 09/09/2013 21:32:13
 Staging Completed at 09/09/2013 21:32:13
 SomeDB Completed at 09/09/2013 21:32:13
 restoretime Completed at 09/09/2013 21:32:13
 restoredt Completed at 09/09/2013 21:32:13
 test1 Completed at 09/09/2013 21:32:13
 Products Completed at 09/09/2013 21:32:13
 TSQL2012 Completed at 09/09/2013 21:32:13

Showing that we have multiple backups happening at once, with the finish order mostly being determined by the size of the Database and the speed of my test rig’s hard drive.

And if you switch to SSMS while the script is running and look at connections/processes, you’ll see all the backups working and queueing as well.

PowerShell Events are a powerful topic, and can be used in lots of other SQL Server scripting tasks. I’ve only scratched the surface of them here; a pair of good articles on them are:

Use Asynchronous Event Handling in PowerShell

Manage Event Subscriptions with PowerShell

which are well worth a read.

Tomorrow we’ll take a look at the PowerShell Job way of doing this.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 6 of 31 days of SQL Server Backup and Restore using PowerShell: Redirecting backups

A very useful feature of performing your backups with PowerShell is that you can redirect your backups easily on pretty much any condition you want, or even run multiple backups asynchronously to different network drives to save saturating a single NIC.

At the simplest level you can push each database backup into its own folder. However, just like with T-SQL backups, if the folder doesn’t exist the backup will fail, so we’ll introduce a quick check:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name
    $BackupFolder = "c:\psbackups\"+$Db.name+"\"
    if ((Test-Path $BackupFolder) -eq $False){
        New-Item $BackupFolder -type Directory
    }
    $BackupName = $BackupFolder+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

Or we can monitor the space left on the backup drive, and if it falls below what we need (plus a margin for safety), move the backups to another area:

Import-Module "SQLPS" -DisableNameChecking

$folders = @()
$folders += "c:\psbackups"
$folders += "\\server2\backups$"
$folders += "\\server3\panicspace$"

$FolderCount = 0
$FSO = New-Object -ComObject Scripting.FileSystemObject
$MarginForError = 200

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
    $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name

    $CurrentDrive = $FSO.GetDrive($FSO.GetDriveName($folders[$FolderCount]))
    if (($Db.Size + $MarginForError) -gt ($CurrentDrive.AvailableSpace/1024/1024)){
        $FolderCount++
    }
    $BackupName = $Folders[$FolderCount]+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

Here we build up an array of possible backup folders ($folders), set a $MarginForError in MB (in this case 200MB) and create a Scripting.FileSystemObject COM object so we can query drive space. As we loop through the databases we check that the AvailableSpace on the current drive is greater than our requirements (AvailableSpace returns space in bytes, so we need to convert that to MB). I check every time through the loop just in case something else is using the disk, so we don’t get caught out. If we are short of space we increase the $FolderCount variable by 1.

When building up the backup target we use $FolderCount to pick the current path from the $folders array, and then continue with the backup as normal.

Using $Db.Size gives us a worst-case scenario for the potential size of the backup, though if you have a lot of free space in your DB it’ll be skewed. This could be corrected by also using $Db.SpaceAvailable, correcting for the differences in return values (Size is in MB, SpaceAvailable in KB).
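A quick worked example of that correction, with stand-in numbers in the units SMO uses (Size in MB, SpaceAvailable in KB):

```powershell
# Stand-in values as SMO would report them for a database
$DbSizeMB = 500            # $Db.Size is in MB
$SpaceAvailableKB = 204800 # $Db.SpaceAvailable is in KB

$WorstCaseMB = $DbSizeMB                              # allocated size
$CorrectedMB = $DbSizeMB - ($SpaceAvailableKB/1024)   # allocated minus free space
```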

Tomorrow we’ll be looking at asynchronous backups, allowing you to fire off multiple concurrently running backups from a single script.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 5 of 31 days of SQL Server Backup and Restore using PowerShell: File and Filegroup backups

So far we’ve only looked at backup types supported by the Maintenance Plans (Full, Differential and Log); now we’re going to go past those and look at File and FileGroup backups. Normally to perform these you’d need a 3rd-party tool or your own home-rolled SQL scripts.

The script examples are based on using the following database, with 3 filegroups, 1 of which (tertiary) is read-only:

USE [master]
GO

CREATE DATABASE [fg_test]
 ON  PRIMARY
( NAME = N'fg_test', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test.mdf' ),
 FILEGROUP [secondary]
( NAME = N'fg_test_2', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_2.ndf' ),
( NAME = N'fg_test_2a', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_2a.ndf' ),
 FILEGROUP [tertiary]
( NAME = N'fg_test_3', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_3.ndf' )
 LOG ON
( NAME = N'fg_test_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_log.ldf');
GO

alter database fg_test modify filegroup tertiary readonly;
go

Using SMO, to back up a single filegroup you use the Add method of the Backup object’s DatabaseFileGroups property, and everything else stays the same:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-C0BP65U3D4G"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item("fg_test")
$FileGroupName="Secondary"

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Files
$Backup.BackupSetDescription = "Filegroup Backup of Filegroup "+$FileGroupName+" of "+$Db.Name
$Backup.Database = $db.Name
$backup.DatabaseFileGroups.add($FileGroupName)

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

The line that does the work is:

$Backup.DatabaseFileGroups.Add($FileGroupName)

If you want to backup multiple FileGroups then you can call the method multiple times:

$Backup.DatabaseFileGroups.Add("primary")
$Backup.DatabaseFileGroups.Add("secondary")

Or you can pass them in via the AddRange operator:

$Backup.DatabaseFileGroups.AddRange(("primary","secondary"))

It can also be done with a loop if you can identify which filegroups you want to back up. For example, if you wanted to back up all non-read-only filegroups you could replace the line with this script snippet:

foreach ($fg in $db.FileGroups | where-object {$_.ReadOnly -eq $FALSE}){
    $Backup.DatabaseFileGroups.Add($fg.Name)
}

This is one of the cases where Backup-SqlDatabase is slightly harder to use. To replicate the first SMO example you’d use:

Import-Module SQLPS -DisableNameChecking

Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFileGroup "secondary" -BackupFile ("c:\psbackups\fg_test_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

For the second:

Import-Module SQLPS -DisableNameChecking

Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFileGroup "primary","secondary" -BackupFile ("c:\psbackups\fg_test_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

Note passing in the filegroup names as a comma separated list of strings.

And for the looped version:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-C0BP65U3D4G"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item("fg_test")
$Filegroups = @()
foreach ($fg in $db.FileGroups | Where-Object {$_.ReadOnly -eq $FALSE}){
    $Filegroups += $fg.Name
}
Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFileGroup $Filegroups -BackupFile ("c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

Here we still have to build the connection to the SQL Server instance so we can get the information about the filegroups, and then loop through them, building up an array of filegroup names to pass in as the -DatabaseFileGroup parameter.

Database file backups work in exactly the same way, but using the $Backup.DatabaseFiles.Add() method or the -DatabaseFile parameter.

Tomorrow we’ll be looking at redirecting the backup files.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 4 of 31 days of SQL Server Backup and Restore using PowerShell: Catching new databases for a full backup

As the PowerShell backups are just calling the standard T-SQL backup commands, all the usual backup rules a SQL Server DBA is used to still apply.

The main one of these is that before a Transaction Log or Differential backup can be taken, the database must have had at least one full backup. This ensures that there is a start to the backup chain.

A common setup at many ISVs is for a new database to be created for each new customer that signs up. If that’s an automated process kicked off by an online payment, then a new database could appear at any time, and if you’re only taking full backups once a day the database may not get a full backup for nearly 24 hours! If you’re lucky your developers or suppliers wrote in a check to do an initial backup. Do ya feel lucky?

And it’s usually in that first 24 hours that the new customer will be very keen, loading up lots of data and making lots of configuration choices; but they’re not yet used to the product, potentially leading to an unexpected deletion of data. If that happens, you’ll have no way to recover the data.

By using the following PowerShell script to run your transaction log backups, it will catch any database without a previous backup.

Import-Module "SQLPS" -DisableNameChecking

$Server = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($Server)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb" -and $_.RecoveryModel -eq "Full"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    if ($Db.LastBackupDate -lt $Db.CreateDate){
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $BackupName = "c:\psbackups\"+$db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        $Backup.BackupSetDescription = "Full Backup of "+$db.Name
    }else{
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
        $BackupName = "c:\psbackups\"+$db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".trn"
        $Backup.BackupSetDescription = "Log Backup of "+$Db.Name
        $Backup.LogTruncation = [Microsoft.SqlServer.Management.Smo.BackupTruncateLogType]::Truncate
    }

    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $Backup.Database = $Db.Name

    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)
    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

We make use of the fact that if a new database doesn’t have a backup then SQL Server returns LastBackupDate as “Monday, January 01, 0001 12:00:00 AM”. We don’t match on that value directly: if a database with the same name previously existed and nobody ran sp_delete_database_backuphistory to clear its backup history, SQL Server will return the old database’s last backup date. But that will still be earlier than the CreateDate of the new database, so we compare against that instead.
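To illustrate the comparison, here’s a minimal sketch of the logic with hard-coded dates (the CreateDate value is invented for the example):

```powershell
# A never-backed-up database reports [DateTime]::MinValue for LastBackupDate,
# which is always earlier than any real CreateDate.
$LastBackupDate = [DateTime]::MinValue        # "Monday, January 01, 0001 12:00:00 AM"
$CreateDate = Get-Date "2013-09-04"           # invented example date
if ($LastBackupDate -lt $CreateDate){
    Write-Output "No backup since creation - take a full backup first"
}
```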

Tomorrow we’ll be looking at FileGroup and File backups

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 3 of 31 days of SQL Server backup and Restore using Powershell: Transaction and Differential backups

So far we’ve only looked at performing full database backups. PowerShell and SMO are perfectly happy to handle all the other types available in SQL Server:

  • Transaction Log backup
  • Differential
  • File or FileGroup
  • File or FileGroup Differential

In this post we’ll just look at the first 2 types, and then Files and Filegroups in tomorrow’s post. But before we do, be aware that SQL Server and PowerShell neither check nor correct your backup file names or extensions, so they will quite happily write your transaction log backups out to filename.bak. As we’ll cover in the restore section, this isn’t a problem if you’re looking inside the files, but it may cause some confusion if someone just looks in the backup folder! For an SMO Transaction Log Backup we just need to change the backup action to Log:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log

By default under this action the backup will truncate the logs. If that’s not what you want to happen, then you can override it by using the LogTruncation property of the Backup object:

#To Not truncate the log and leave all the transactions in the log use
$Backup.LogTruncation = "NoTruncate"

#For completeness if you just want to truncate the log, and not back anything up use
$Backup.LogTruncation = "TruncateOnly"

For the Backup-SQLDatabase you use the following parameters:

Backup-SqlDatabase -BackupAction Log -LogTruncationType Truncate ...

Backup-SqlDatabase -BackupAction Log -LogTruncationType NoTruncate ...

Backup-SqlDatabase -BackupAction Log -LogTruncationType TruncateOnly ...
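Putting those parameters together, a hedged example of a complete log backup call (server, database and path are placeholders):

```powershell
# Plain transaction log backup with the default truncation behaviour.
Backup-SqlDatabase -ServerInstance "Server1" -Database "psdb1" `
    -BackupFile "c:\psbackups\psdb1_log.trn" `
    -BackupAction Log -LogTruncationType Truncate
```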

For a SMO Differential backup the Action is left as Database, but you set the Backup object’s Incremental property to $TRUE:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$Backup.Incremental = $TRUE

And for the Backup-SQLDatabase cmdlet you would use:

Backup-SqlDatabase -BackupAction Database -Incremental ...

So putting this together, a simple script to backup a database with the following schedule:

  • Sunday Evening at 18:00 – Full Backup
  • Every Evening (except Sunday) at 18:00 – Differential Backup
  • Every hour – Transaction Log

would look like:

$ScriptStartTime = Get-Date
$BackupFileSuffix = ""
Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
$DataBase = "DB1"

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item($DataBase)
$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Database = $db.Name

if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $false
        $backup.BackupSetDescription = "Full Backup of "+$Db.Name
        $BackupFileSuffix = ".bak"
    }else{
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $true
        $backup.BackupSetDescription = "Differential Backup of "+$Db.Name
        $BackupFileSuffix = ".bck"
    }
}else{
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
    $Backup.Incremental = $false
    $backup.BackupSetDescription = "Log Backup of "+$Db.Name
    $BackupFileSuffix = ".trn"

}

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+$BackupFileSuffix
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

The sections doing the work are:

$ScriptStartTime = Get-Date
...
if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        ....
    }else{
        ....
    }
}else{
    ....
}

As the script starts we record the time and use that as a reference throughout. This prevents an issue where, if we were looping through a number of databases on a Sunday evening and it took more than an hour to get to the last ones, those databases would end up getting just transaction log backups. By referencing the start time we ensure that every database touched by that run of the script gets the appropriate backup. We then check whether the script started at 18:* (we match a little fuzzily to allow for a delayed start due to other system jobs). If it did, we check whether it’s a Sunday: if so we set up a full Database backup, ensure that we aren’t doing a differential, populate the description appropriately and set the suffix to .bak as normal. If it’s 18:00 on any other day we set up a Database backup, but this time an Incremental one, set the suffix to .bck and populate the description accordingly. On any other run of the script, we set the parameters up for a Transaction log backup.

Tomorrow we’ll look at an example of how to use these ideas to catch newly created databases and ensure we start a new backup chain before it’s too late!

Day 2 of 31 days of SQL Server backup and Restore using Powershell: Looping through Databases

Yesterday we looked at backing up a single SQL Server database with PowerShell. There is certainly more code involved than a good old T-SQL style backup. But thanks to the wonders of PowerShell we now have a very reusable piece of code.

As a good example, if you wanted to loop through every database in a SQL Server instance we can now take the central part of the script and loop through it as many times as we want. And if we want to ignore certain DBs then that’s simple as well:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-4B40IEFH4CR\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $backup.BackupSetDescription = "Full Back of "+$Db.Name
    $Backup.Database = $db.Name

    $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

By changing one line we suddenly make the script a lot more useful:

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    code
}

We get our SQL Server to return an object containing all the database objects present on the server ($SQLSvr.Databases). We use the Where-Object cmdlet to filter down to only the databases we want. In this case, we’ve asked for all databases where the name is not equal (-ne) to tempdb. You can modify this filtering to exclude anything you’d like based on any Database property, for example:

  • {$_.Name -notlike "*_test"} – returns all databases that don’t end in _test
  • {$_.IsSystemObject -eq $FALSE} – returns all non system databases

Then using foreach we loop through all the Database objects in the Databases collection, executing our backup code for each one.
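For example, the two filters above can be combined into a single Where-Object — a sketch that skips system databases and anything ending in _test:

```powershell
foreach ($db in $SQLSvr.Databases |
        Where-Object {$_.IsSystemObject -eq $FALSE -and $_.Name -notlike "*_test"}){
    # backup code as above
}
```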

This can be extended further to loop through a list of servers as well:

Import-Module "SQLPS" -DisableNameChecking

$ServerList = @("server1","server2","server3")

foreach ($ServerName in $ServerList){
    $SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

    $Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

    foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $backup.BackupSetDescription = "Full Back of "+$Db.Name
        $Backup.Database = $db.Name

        $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
        $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

        $Backup.Devices.Add($BackupDevice)
        $Backup.SqlBackup($SQLSvr)
        $Backup.Devices.Remove($BackupDevice)
    }
}

Now to look at getting the same outcome using the Backup-SQLDatabase cmdlet:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-4B40IEFH4CR\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $BackupFile = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    Backup-SQLDatabase -InputObject $SQLSvr -Database $Db.Name -BackupFile $BackupFile -BackupAction Database -BackupSetDescription ("Full Backup of "+$Db.Name)
}

Again the backup script is shorter, though in this case we still need to use the SMO methods to create a connection to the SQL Server instance so we can loop through its collection of databases. And as we have the connection open, we can also pass it to the Backup-SQLDatabase cmdlet, though note that we have to use the -InputObject parameter rather than the -ServerInstance parameter we used previously.

This can be extended to multiple servers in exactly the same way as the SMO version:

Import-Module "SQLPS" -DisableNameChecking

$ServerList = @("server1","server2","server3")

foreach ($ServerName in $ServerList){
    $SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
    $Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

    foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $BackupFile = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        Backup-SQLDatabase -InputObject $SQLSvr -Database $Db.Name -BackupFile $BackupFile -BackupAction Database -BackupSetDescription ("Full Backup of "+$Db.Name)
    }
}

Now we’ve looked at performing full database backups, tomorrow we’ll move onto looking at transaction log and differential backups.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.


Day 1 of 31 days of SQL Server backup and Restore using Powershell: Simple Database Backup

We’ll start with the simplest form of backup script, taking a full backup of a database. This demonstrates the basic principles that will then be expanded on to perform more complex operations later on in the series.

The script is:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item("psdb1")

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Full Backup of "+$Db.Name
$Backup.Database = $db.Name

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

Now to break it down:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$DbName = "psdb1"
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item($DbName)

First we import the SQL PowerShell module; we use -DisableNameChecking to hide this warning message:

WARNING: The names of some imported commands from the module 'SQLPS' include unapproved verbs that might make them less discoverable. To find the commands with unapproved
verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.

which is just letting us know that Microsoft don’t always follow Microsoft’s recommended best practice.

Next we build our connection to the SQL Server by putting the server and instance name into a string variable and passing it into the constructor for a new SMO Server object. This example assumes that you are using Windows Authentication and that you are running the script under an account that has permissions on the server. You can pass the value straight to the constructor, but I find that well-named variables towards the top of the script are much easier to find and can be reused throughout, so you don’t have to search and replace each time you want to work on a different server.
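If Windows Authentication isn’t an option, SMO’s ConnectionContext property lets you switch the connection to SQL authentication before anything touches the server. A sketch with placeholder credentials:

```powershell
$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)
$SQLSvr.ConnectionContext.LoginSecure = $false      # use SQL authentication
$SQLSvr.ConnectionContext.Login = "backup_user"     # placeholder login
$SQLSvr.ConnectionContext.Password = "P@ssw0rd!"    # placeholder -- store securely in practice
```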

Then we create a new Database object, and look up the database we want to back up by passing its name to the Databases collection’s Item method.

The next section:

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Full Backup of "+$Db.Name
$Backup.Database = $Db.Name

This creates a new Backup object, and then we set what type of backup it is; in this case, the Database type. For later ease we give the backup set a description, then attach the name of the database to be backed up.

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

Now we create our backup device. For the File backup device, $BackupName is the path to the .bak file. Remember that the path is relative to the SQL Server instance you’re asking to perform the backup. ie; that c:\psbackups folder is on Server1, not the machine you’re running the PowerShell on. The device type and path are passed into the constructor for a new backup device.

And now finally, we come to do the actual backup:

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

Add the newly created backup device to the Backup object, then perform the actual backup by calling SqlBackup, passing it the SMO Server object we created before. The referenced SQL Server instance will now perform the backup before handing control back to the script, so if you’ve pointed this at a 50GB database and are backing it up to slow disks this might be a good time to grab a coffee.
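For long-running backups SMO can also report progress while SqlBackup runs. A hedged sketch using the Backup object’s PercentComplete event; the 10% notification interval is an arbitrary choice:

```powershell
$Backup.PercentCompleteNotification = 10
$Backup.add_PercentComplete({
    param($sender, $e)
    Write-Host ("{0}% complete" -f $e.Percent)
})
```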

Once the backup has completed the script continues, and we remove the backup device from the backup object.

In this case the same operation using Backup-SQLDatabase requires much less script:

Import-Module SQLPS -DisableNameChecking
$ServerName = "Server1\SQL2012"
$DbName = "psdb1"
$BackupFile = "c:\psbackups\"+$DbName+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
Backup-SQLDatabase -ServerInstance $ServerName -Database $DbName -BackupFile $BackupFile -BackupAction Database

As you can see, this cmdlet removes the need for the scriptwriter to build the SMO objects manually; here it just takes in strings and does the work on your behalf.

Tomorrow, we’ll be looking at looping this script across all the databases in a SQL Server Instance excluding those we don’t want.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 0 of 31 days of SQL Server backup and Restore using Powershell: Why not T-SQL?

Last day of introductory material, tomorrow will feature an actual example and some code, promise.

So why do I think that it’s better to step outside of SQL Server into the world of PowerShell for doing backups?

SQL Server Maintenance plans are pretty good for nice simple environments. A few clicks and 2 maintenance plans later you’ve got some nice scheduled jobs that back your database up once a day, and your transaction logs each hour. Excellent.

But what about if you want to treat your system databases (master, model, tempdb, msdb) differently? Ok, you create another set of maintenance plans. Those databases are pretty fixed (you hope!) so it’s not too much work.

But what if you have hundreds of databases in a single instance and you want to vary when their backups run to spare your IO system? You could build up a number of maintenance plans, but they’ll still need manual maintenance. How much easier would it be to have a set of simple scripts so you can define rules such as ‘Backup the first 10% of databases at 01:00’, ‘Backup the second 10% of databases at 02:00’? Much less of your time will be spent maintaining maintenance plans, so you can get on with more of the fun parts of your role.

SQL Server doesn’t even offer a ‘Maintenance Plan’ to help automate restoring your databases to check your backups, so you’re on your own from the start. If you only want to perform a very simple automated restore, then you could write some simple T-SQL and schedule it via SQL Server Agent. But this would be very hard to write so that it could work with any database, or catch unusual setups. You end up in the world of cursors and Dynamic SQL, and are limited by SQL Server’s limited access to the filesystem (assuming you’ve set your security up for security rather than ease of administration (and we all do that don’t we?)).

By using PowerShell we can create scripts that can loop through folders and build up complex restores, can access OS and filesystems to work out how best to layout files, or to abort cleanly if there’s a problem. And if we combine that with a database behind it containing schedules and information, then we can have an automated system that will cope with restoring any of our databases. So when management (or the auditors) ask me how I can be sure that our backups work and that my RTO estimates are accurate I can just pull the numbers out and show them.

Hopefully the next 31 days of posts and scripts will help you towards that goal as well.

Tomorrow, the first script!

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day -1 of 31 days of SQL Server backup and Restore using Powershell: SQL Server Powershell Components

Continuing my pre-31 Days of SQL Server Backup and Restore using PowerShell intro, today we’ll look at getting the PowerShell SQL Server components, and a quick mention of GUIs.

All Microsoft operating systems since Windows XP have shipped with PowerShell either installed or as an option. At the time of writing the current version is 3.0 (Download PowerShell v3.0). (PowerShell 4.0 is currently in public beta, but won’t be covered in this series)

Version 3 introduced enough new features to make it worthwhile upgrading to, especially for automating jobs. These include:

  • Better Scheduled Job Integration
  • Improved Integrated Scripting Environment
  • Workflows

A basic install of PowerShell doesn’t include the SQL PowerShell module which allows you to access the SQL Management Objects. These are installed when you install the client tools from your SQL Server media. They are also now freely available as part of SQL Server 2012 Express Service Pack 1 (download here).

The SMO capabilities are the same across the PowerShell module in each SQL Server edition, the limitations in usage are controlled by the edition of SQL Server you connect to. This means you can install the free SQL Express version on your desktop to control your Enterprise edition Servers, and that scripts developed on Express will also work with Standard and Enterprise editions.

SMO also relies on standard T-SQL calls, so as long as the version of SQL Server supports the functionality you can use it. For instance, all the scripts in this series will work back to SQL Server 2005 (and the basic backups will also work back to SQL Server 7 and 2000). But just because PowerShell supports AlwaysOn Availability Groups, it doesn’t mean you can enable them on SQL Server 2005.

PowerShell 3.0 ships with a fairly usable Integrated Scripting Environment GUI. This gives you IntelliSense, some basic debugging functionality and an easier-to-read interface to Get-Help.

If you start developing more complex PowerShell scripts then a more fully featured IDE may be preferable. There are a number available, both free and paid-for.

Which one you pick will mostly come down to personal preference, so I’d suggest trying a few and then choosing one.

Tomorrow I’ll lay out some of the reasons why I’ve moved away from T-SQL for a lot of my backup and restore automation.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.
