So far all of the backup scripts we’ve looked at have been firing off backups synchronously; that is, each one starts after the previous one has completed. This is the same behaviour as SQL Server Maintenance Plans.
Within PowerShell we have 2 ways of submitting multiple backup jobs at the same time, each with its own advantages and drawbacks.
Method 1 – SMO’s SQLBackupAsync method and PowerShell Events
SMO’s backup object offers its own SQLBackupAsync method which can simply be fired off. This can also be extended with PowerShell’s Event subsystem to allow you to capture the status.
Method 2 – PowerShell Jobs:
PowerShell 3 introduced Jobs into its bag of tricks. These allow you to fire off a block of script, let it run in the background while doing other work, and then pick up the results at a later point. Jobs are simpler to work with, and allow you to group elements of work that have to happen together (i.e. if you have 2 dbs that must back up together, or need to make sure a filesystem backup happens at the same time as the SQL Server backup).
Today we’ll look at Method 1, and then move onto Method 2 tomorrow.
So without further ado, here’s the basics of an asynchronous backup script:
Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$BackupPath = "c:\psbackups\"

$SQLSvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Backups = @()
$i = 0

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $btmp = @{}
    $BackupConn = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupName = $BackupPath+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $Db.Name
    $BackupDevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)
    $Backup.Devices.Add($BackupDevice)
    $btmp["backup"] = $Backup
    $btmp["connection"] = $BackupConn
    $Backups += $btmp
    $Backups[$i]["backup"].SqlBackupAsync($Backups[$i]["connection"])
    $i++
}
This script will perform a full database backup of each database on the specified server except for TempDB. But it’s the PowerShell equivalent of throwing something and not worrying about where it lands or whether it shatters.
Working through the script:
It starts off normally with loading the SQLPS module, setting the now usual variables, and creating the familiar Database and Server Connection objects.
Then we have this line
$Backups=@()
For those of you not familiar with PowerShell, this creates a new array variable called $Backups. We need this because for asynchronous backups we need a connection object for EACH backup. Up until now, as we’ve been submitting the backups synchronously, they are queued up and submitted one by one, so they can share the same connection. As we’re submitting these asynchronously, each needs its own connection. The simplest way to keep track of which connection goes with which backup is to track them all in a multidimensional array, and we build the first dimension here.
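To make that structure concrete, here’s a minimal sketch of the array-of-hashes pattern we’re building, using made-up string values in place of real SMO Backup and Server objects:

```powershell
# An array of hashtables, one entry per backup,
# pairing each backup with its own connection.
$Backups = @()
foreach ($name in "db1","db2","db3") {
    $btmp = @{}
    $btmp["backup"]     = "backup object for $name"   # stand-in for an SMO Backup object
    $btmp["connection"] = "connection for $name"      # stand-in for an SMO Server connection
    $Backups += $btmp
}

# Any backup can now be retrieved alongside its matching connection:
$Backups[1]["backup"]       # "backup object for db2"
$Backups[1]["connection"]   # "connection for db2"
```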
My old friend the counter $i is also set to 0; this will be used to keep track of our array elements.
Then we drop into our normal ForEach loop through the databases on our server.
First we build a temporary hash $btmp which will hold the backup information before we add it to $Backups. If you’ve not met a PowerShell hash before, it’s an associative array, so it holds data like:
$hash["author"] = "stuart"
$hash["topic"] = "PowerShell"
And since this is PowerShell, each value can in fact be an object, which is exactly what we’re going to use it for.
We build a new connection object for our backups called $BackupConn, and then a Backup object as normal. We then add these to our $btmp hash as follows:
$btmp["backup"] = $Backup
$btmp["connection"] = $BackupConn
And then we append this to our $Backups array.
As demonstrated in the next line when we actually submit the backup:
$backups[$i]["backup"].SqlBackupAsync($backups[$i]["connection"])
We can reference our backup and its specific connection by referencing the index of our $Backups array, and then referencing the contained hash.
If you’re backing up enough databases with enough data, you’ll have time to switch to SQL Server Management Studio and run sp_who2 to see all the connections and backup processes running.
All the backups will now be submitted on their own connections and SQL Server will process them in whichever order it wants. That last point is very important to remember. When running backups asynchronously there is no guarantee of the order in which the backups will run, so if you need to ensure a certain order you’ll either need to stick to running them synchronously or look at the PowerShell Job method in tomorrow’s post, which allows you to batch them up.
Now, this is all great, but as mentioned at the start of this post we’re just throwing the backups at SQL Server, and have no way of knowing what’s completed. So we’ll now add some extra code to the script so we can get this information back.
To do this we’re going to be using PowerShell Events. Many PowerShell methods expose events which fire at certain points during the execution of the method. Your script can poll for these events, and then take appropriate action when they fire.
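Before we apply this to backups, here’s a self-contained illustration of the mechanics, using a .NET timer instead of an SMO backup so it runs anywhere; the identifier and payload strings are just made-up examples:

```powershell
# Create an object that raises events - here a simple .NET timer.
$timer = New-Object System.Timers.Timer
$timer.Interval = 100   # fire after roughly 100ms

# Subscribe to its Elapsed event, tagging the subscription with an
# identifier and attaching a payload we can read back later.
Register-ObjectEvent -InputObject $timer -EventName "Elapsed" `
    -SourceIdentifier "demo-elapsed" -MessageData "my payload" | Out-Null

$timer.Start()

# Block until the event fires, then read the payload back.
$evt = Wait-Event -SourceIdentifier "demo-elapsed"
$timer.Stop()
$evt.MessageData          # "my payload"

# Tidy up: drop the event from the queue and remove the subscription.
$evt | Remove-Event
Unregister-Event -SourceIdentifier "demo-elapsed"
```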
For this script we’re going to register the following 2 events:
Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Complete" -SourceIdentifier "asyncbkup-complete-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null
Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Information" -SourceIdentifier "asyncbkup-info-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null
These events fire when our backup completes, or when it wants to return some other informational message (i.e. an error). In each case we register the event against the specific backup object using our multidimensional array $backups, specify the event we want to catch, give this particular event an identifier to distinguish it from all other events, and then give it a ‘payload’ which we can retrieve later.
We then need to poll the events until we get one of ours. There are a number of ways this loop can be run, either as a continuous while loop or from a Timer which fires at regular intervals.
In this example we’re going to add a “complete” value to our hash, which we’ll initially set to 0. As each backup completes (successfully or unsuccessfully) we’ll set the value to 1. If we sum all the “complete” values in our $backups array, then when the sum equals the number of entries in the array we know that all the backups have finished and we can exit the loop.
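The loop condition is easy to sketch in isolation; with one hashtable per backup, each carrying a “complete” flag, summing the flags tells us how many have finished (the three fake entries here are just for illustration):

```powershell
# Three fake backup entries; two have already finished.
$Backups = @( @{"complete"=0}, @{"complete"=1}, @{"complete"=1} )

# Sum the "complete" flags across the array.
$done = ($Backups | ForEach-Object { $_["complete"] } | Measure-Object -Sum).Sum

# The polling loop keeps running while $done is less than the entry count.
$allFinished = ($done -eq $Backups.Length)   # $false here - one backup still running
```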
While in the loop we use the Wait-Event cmdlet to catch any events. We use Tee-Object to get the information from the event while still allowing us to pass it on to Remove-Event; we need to do that to remove the event from the queue and prevent processing it repeatedly.
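This pattern can be demonstrated with a hand-raised event (New-Event) standing in for a backup event; the identifier and payload are made-up examples:

```powershell
# Raise a custom event by hand, standing in for a backup's Complete event.
New-Event -SourceIdentifier "demo-complete" -MessageData "SomeDB-3" | Out-Null

# Catch it, keep a copy in $evt, and drop it from the queue in one pipeline.
Wait-Event -SourceIdentifier "demo-complete" | Tee-Object -Variable evt | Remove-Event

# The payload we attached comes back on MessageData,
# and splits into the database name and array index.
$DbName,$i = $evt.MessageData.Split('-')
$DbName   # "SomeDB"
$i        # "3" (a string - cast with [int] if needed for arithmetic)
```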
Then we split the payload into an array index and the database name, pull the ExecutionStatus from the Backup object firing the event, and get the error message (if any). Then, using a switch statement to catch the possible event statuses, we build up an $output string to record the information, and mark the appropriate records as completed.
After the loop exits we clear all Event registrations, and output the message we built up:
Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$BackupPath = "c:\psbackups\"

$SQLSvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Backups = @()
$i = 0

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $btmp = @{}
    $BackupConn = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $BackupName = $BackupPath+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $Db.Name
    $Backup.Devices.AddDevice($BackupName,'File')
    $btmp["backup"] = $Backup
    $btmp["connection"] = $BackupConn
    $btmp["complete"] = 0
    $Backups += $btmp
    Register-ObjectEvent -InputObject $Backups[$i]["backup"] -EventName "Complete" -SourceIdentifier "asyncbkup-complete-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null
    Register-ObjectEvent -InputObject $Backups[$i]["backup"] -EventName "Information" -SourceIdentifier "asyncbkup-info-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null
    $Backups[$i]["backup"].SqlBackupAsync($Backups[$i]["connection"])
    $i++
}

$output = ""
while (($Backups.complete | Measure-Object -Sum).Sum -lt $Backups.Length){
    Wait-Event | Tee-Object -Variable event | Remove-Event
    $ErrorText = $event.SourceEventArgs.Error.Message
    $DbName,$i = $event.MessageData.Split('-')
    $status = $event.SourceArgs[0].AsyncStatus.ExecutionStatus
    if ($Backups[$i]["complete"] -ne 1){
        $now = Get-Date
        switch ($status){
            "Succeeded"  { $output += "`n $DbName Completed at $now"
                           $Backups[$i]["complete"] = 1
                           break }
            "Failed"     { $output += "`n $DbName Failed at $now with $ErrorText"
                           $Backups[$i]["complete"] = 1
                           break }
            "InProgress" { $output += "`n $DbName In Progress at $now"
                           break }
            "Inactive"   { $output += "`n $DbName Inactive at $now"
                           break }
        }
    }
}

Get-EventSubscriber | Unregister-Event
Write-Output $output
As an example of the output from this script, here’s a sample from one of my test boxes where I’d marked one DB as offline:
InternetSales Failed at 09/09/2013 21:32:08 with Database 'InternetSales' cannot be opened because it is offline.
dbgrow In Progress at 09/09/2013 21:32:09
dbgrow In Progress at 09/09/2013 21:32:09
AWDataWarehouse In Progress at 09/09/2013 21:32:09
AWDataWarehouse In Progress at 09/09/2013 21:32:09
fg_test In Progress at 09/09/2013 21:32:10
fg_test In Progress at 09/09/2013 21:32:10
fg_test In Progress at 09/09/2013 21:32:10
fg_test In Progress at 09/09/2013 21:32:10
HumanResources In Progress at 09/09/2013 21:32:10
dbgrow Completed at 09/09/2013 21:32:10
fg_test In Progress at 09/09/2013 21:32:10
HumanResources In Progress at 09/09/2013 21:32:10
msdb In Progress at 09/09/2013 21:32:11
model In Progress at 09/09/2013 21:32:11
AWDataWarehouse Completed at 09/09/2013 21:32:11
modellock2 In Progress at 09/09/2013 21:32:11
modellock In Progress at 09/09/2013 21:32:11
model In Progress at 09/09/2013 21:32:11
msdb In Progress at 09/09/2013 21:32:11
psrestore In Progress at 09/09/2013 21:32:12
modellock In Progress at 09/09/2013 21:32:12
modellock2 In Progress at 09/09/2013 21:32:12
ResellerSales In Progress at 09/09/2013 21:32:12
SomeDB In Progress at 09/09/2013 21:32:12
SomeDB In Progress at 09/09/2013 21:32:12
psrestore In Progress at 09/09/2013 21:32:12
SomeDB In Progress at 09/09/2013 21:32:12
ResellerSales In Progress at 09/09/2013 21:32:12
master In Progress at 09/09/2013 21:32:13
HumanResources Completed at 09/09/2013 21:32:13
restoretime In Progress at 09/09/2013 21:32:13
restoredt In Progress at 09/09/2013 21:32:13
fg_test Completed at 09/09/2013 21:32:13
master In Progress at 09/09/2013 21:32:13
Staging In Progress at 09/09/2013 21:32:13
restoretime In Progress at 09/09/2013 21:32:13
Staging In Progress at 09/09/2013 21:32:13
restoredt In Progress at 09/09/2013 21:32:13
psrestore Completed at 09/09/2013 21:32:13
msdb Completed at 09/09/2013 21:32:13
model Completed at 09/09/2013 21:32:13
ResellerSales Completed at 09/09/2013 21:32:13
modellock Completed at 09/09/2013 21:32:13
modellock2 Completed at 09/09/2013 21:32:13
test1 In Progress at 09/09/2013 21:32:13
test1 In Progress at 09/09/2013 21:32:13
Products In Progress at 09/09/2013 21:32:13
TSQL2012 In Progress at 09/09/2013 21:32:13
Products In Progress at 09/09/2013 21:32:13
TSQL2012 In Progress at 09/09/2013 21:32:13
master Completed at 09/09/2013 21:32:13
Staging Completed at 09/09/2013 21:32:13
SomeDB Completed at 09/09/2013 21:32:13
restoretime Completed at 09/09/2013 21:32:13
restoredt Completed at 09/09/2013 21:32:13
test1 Completed at 09/09/2013 21:32:13
Products Completed at 09/09/2013 21:32:13
TSQL2012 Completed at 09/09/2013 21:32:13
Showing that we have multiple backups happening at once, with the finish order mostly being determined by the size of the Database and the speed of my test rig’s hard drive.
And if you switch to SSMS while the script is running and look at connections/processes, you’ll see all the backups working and queueing as well.
PowerShell Events are a powerful topic, and can be used in lots of other SQL Server scripting tasks. I’ve only covered the tip of the iceberg here; a pair of good articles on them are:
Use Asynchronous Event Handling in PowerShell
Manage Event Subscriptions with PowerShell
which are well worth a read.
Tomorrow we’ll take a look at the PowerShell Job way of doing this.
This post is part of a series posted between 1st September 2013 and 3rd October 2013; an index for the series is available here.
Anthony
Why do you set up $SQLServer for the connection and then again inside the foreach $DB loop set up another connection $BackupConn?
It seems like a superfluous step?
Stuart Moore
Hi,
Sorry for the slow reply, been moving house.
With asynchronous processes I like to have them running on their own connections, this means I know that I can close down the main connection or use it for something else without interfering with the asynchronous jobs running in the background.