Musings of a Data professional

Stuart Moore


Day 11 of 31 Days of SQL Server Backup and Restore using PowerShell: Simple Database Restore

As we’ve seen in the past 10 days, performing SQL Server backups with PowerShell is pretty straightforward. Now we’re going to move on to restores. This is where PowerShell really starts to shine: SQL Server doesn’t offer any real options for automating database restores for checking or verification. You can schedule specific pieces of T-SQL, but that only applies to specific databases, or you end up tying yourself in knots with dynamic SQL.

As we did with backups, we’ll start off with the simplest form of restore to introduce the basic concepts we’ll be building on. Here we’re just going to take a full database backup where we know which database it’s from, and restore and recover it.

So here are the scripts. First off, here’s the full SMO version:

Import-Module "SQLPS" -DisableNameChecking

$sqlsvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server("Server1")
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

$backupname = "C:\psbackups\psrestore.bak"

$restoredevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($backupname,$devicetype)

$restore.Database = "psrestore"
$restore.ReplaceDatabase = $True
$restore.Devices.Add($restoredevice)
$restore.SqlRestore($sqlsvr)

As you can see, this is pretty much identical to the basic PowerShell backup script from Day 1. The only real differences are that we have a $restore object of type Microsoft.SqlServer.Management.Smo.Restore and we’re calling its SqlRestore method; all the device creation methods are exactly the same.

And here’s the simple form of the Restore-SQLDatabase cmdlet:

Restore-SqlDatabase -ServerInstance "Server1" -Database "psrestore" -BackupFile "C:\psbackups\psrestore.bak"

Which is even simpler.

So far, so simple. But as we’ll explore over the coming days, this gets a lot more complicated when you try to make it flexible, able to cope with complex restores, and robust enough to ensure a properly recovered database.

The next step towards that is tomorrow’s post on restoring transaction logs.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 10 of 31 Days of SQL Server Backup and Restore using PowerShell: Automating with Task Scheduler

Yesterday we looked at scheduling PowerShell scripts with SQL Server Agent, but what do you do if you’re running SQL Server Express and don’t have that luxury?

Well, it turns out that it’s pretty easy to schedule PowerShell tasks with the Windows Task Scheduler.

Images in this post are from Task Scheduler on Windows 2012, but they are pretty much identical back to Windows 2008.

Task Scheduler offers a nice “Create Basic Task Wizard”, which quickly runs you through the pertinent options from the full task creation process, but unfortunately this doesn’t let you set a more complicated schedule than once a day:

basic-task-wizard

To do that you need to create a full scheduled task. Stepping through the process isn’t too complicated, but there are a few gotchas to watch out for.

schedule-properties

The first pane you’ll come to is the general properties one. Here you can provide a title for your task, and change the user under whose security context the job will run. You’ll need to tick the box “Run whether user is logged on or not”, otherwise it won’t run when the user is logged off. If you run it as any user other than the one creating the task, then any time you modify the task you’ll be prompted to provide that user’s credentials:

schedule-password

This is to prevent someone modifying a job to abuse any elevated privileges it runs under.

The next step along is ‘Triggers’:

Schedule-multiple-triggers

Here we can build up the conditions under which our scheduled task will fire. As shown in the above screenshot, you can set multiple triggers for the same task. For instance, you might want to run transaction log backups every 15 minutes during 9-5 production hours, but only every 2 hours outside of that window.

You build or edit a trigger with the following screen:

schedule-triggers

You can set pretty much any schedule you want here. Don’t feel constrained by the values in the drop down boxes, you can type your own values if you want (for instance every 7 minutes).

Once we have our triggers we can define the actions that we want to take when they trigger:

schedule-action

Just like with Triggers, you can have multiple Actions that will all fire on a trigger:

schedule-action-detail

Note that the Action we take is to start PowerShell, not to run our script. This is because .ps1 PowerShell scripts are not themselves executable, but need to be passed to PowerShell to run. So PowerShell is our program; here I’m relying on the system %PATH% variable to find it, but if you want to ensure the correct exe is started you can provide the full path to it. As PowerShell is fired up, we pass our script in as an Argument. As long as all file references in your scripts are fully qualified you probably don’t need to populate the “Start In” box, but if you are loading custom modules or other sub-scripts then it may be easier to start the script in a specific location.
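For example, the Action fields might be filled in along these lines (the script path is illustrative, and -NoProfile / -ExecutionPolicy are optional switches that make scheduled runs more predictable):

```text
Program/script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments:  -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\backup_all_dbs.ps1"
Start in:       C:\scripts
```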

Now, as this is a series on PowerShell, it’d be a bit remiss not to show you how to create a new scheduled task using PowerShell itself. The bad news is that the ScheduledTasks cmdlets are only implemented in Windows 2012 and won’t be back-ported:

$action = New-ScheduledTaskAction -Execute PowerShell.exe -Argument "C:\scripts\backup_all_dbs.ps1"
$trigger = New-ScheduledTaskTrigger -Once -At 00:01 -RepetitionInterval (New-TimeSpan -Minutes 30) -RepetitionDuration (New-TimeSpan -Days 1)
Register-ScheduledTask -TaskName "SQL Backup" -TaskPath "\" -Action $action -Trigger $trigger -User "Domain\User" -Password "p@55w0rd"

On earlier versions you have to use schtasks. schtasks has a large and flexible syntax which is beyond the scope of this article, but it is well documented with plenty of examples. For completeness, here is the schtasks command line for the above PowerShell:

schtasks /create /sc minute /mo 30 /tn "SQL Backup" /tr "powershell.exe -c c:\scripts\SQL_backup.ps1" /ru Domain\User

When you run this, schtasks will prompt you for the user’s password.

Tomorrow we’ll begin to look at using PowerShell to restore SQL Server databases.


Day 9 of 31 Days of SQL Server Backup and Restore using PowerShell: Automating with SQL Server Agent

So far in this series we’ve looked at a variety of ways you can use PowerShell to run your SQL Server backups, but we’ve not touched on how you can automate them. That is going to be the topic of the next 2 posts. Today we’ll look at automating the scripts with SQL Server Agent, and tomorrow we’ll look at doing it with Task Scheduler.

SQL Agent has been able to schedule PowerShell jobs since 2008, and the basic method hasn’t changed. Screenshots here are from SQL Server 2012:

PowerShell is just another Step type in SQL Server Agent that you can pick from the dropdown:

New job step for PowerShell SQL Agent task

And then you can enter your PowerShell into the box provided. The Open button allows you to browse for existing files, but it only copies the file’s current content into the box:

New-job-step-powershell

Here I’ve used the basic looped database backup script from Day 2. One crucial point to spot here is that by default the script will be run as the “SQL Server Agent Service Account”. If you’re following SQL Server best practice, this account won’t have much access to your local disks or to network shares, which is likely to cause your backup script to fail.

The way around this is to create a SQL Server Agent Proxy for PowerShell and associate it with a credential that does have access to your backup locations, which keeps you in line with best security practice.

The other problem that can occur is that, so far, we have incorporated very little checking and error reporting into our scripts. This means that if an error occurs you are likely to get back less than useful error messages. Looking through the SQL Server errorlog can often provide more details. I’ll be covering checking and error reporting in more detail towards the end of this series.

If you’d rather run from saved .ps1 PowerShell scripts then you’ll need to use the “Operating system (CmdExec)” Step type:

new-job-step-cmdexec

You can then enter the call to your script into the Command box. Note that we are actually calling PowerShell.exe and then passing our script in as a parameter. You can’t call a PowerShell script directly.
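A minimal command line for the CmdExec box might look like this (the script path is illustrative):

```text
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\backup_all_dbs.ps1"
```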

new-job-step-cmdexec-powershell2

The same security restrictions apply to this step as to the PowerShell one, so you may have to create a second SQL Server Agent Proxy or grant the CmdExec subsystem to an existing proxy.

Now you’ve created your job, you can schedule it like any other SQL Server job.

So this is all great if you’ve got SQL Server Agent, but what about if you’re using SQL Server Express and don’t have SQL Agent to schedule your backups? Well, we’ll cover scheduling PowerShell with Windows Task Scheduler tomorrow.


Day 8 of 31 Days of SQL Server Backup and Restore using PowerShell: Asynchronous Backups – Part 2 – PowerShell Jobs

Yesterday we looked at using PowerShell Events to track asynchronous backups fired off as concurrent backup sessions from the same script. All of those backups were being run independently of each other, so what would you do if you want to run batches of backups concurrently (i.e. all dbs on Server A and all dbs on Server B at the same time)?

That’s where PowerShell Jobs come in. Introduced in PowerShell 2.0, they let you fire off a block of script and have it run in the background. You can then wait for the jobs to finish, or poll them yourself.

We’ll use the example above of wanting to back up all the databases on 2 SQL Server installs concurrently:

$JobFunction={function backup_server_dbs
{
    param([String]$ServerName)

    Import-Module "SQLPS" -DisableNameChecking

    $SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)

    foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $BackupSetDescription = "Full Backup of "+$Db.Name
        $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        # $Error is a read-only automatic variable in PowerShell, so we track failures in our own flag
        $ErrorFlag = 0
        try{
            Backup-SqlDatabase -ServerInstance $ServerName -Database $Db.Name -BackupFile $BackupName -BackupSetDescription $BackupSetDescription
        }
        Catch [System.Exception]{
            $output += "`nError with $($Db.Name) at "+(Get-Date)
            $output += ($_.ToString())
            $ErrorFlag = 1
        }
        Finally{
            if ($ErrorFlag -ne 1){
                $output += "`nFinished backup of $($Db.Name) at "+(Get-Date)
            }
        }
    }
    return $output
}
}

$servers = @("Server1","Server2")
$jobs = @()

foreach ($servername in $servers){
    $jobs += Start-job -ScriptBlock {backup_server_dbs $args[0]} -InitializationScript $JobFunction -ArgumentList($servername)
}

$jobs | Wait-Job

$FullOutput = ""
foreach ($job in $jobs){
    Receive-Job $job | Tee-Object -Variable joutput
    $FullOutput += $joutput
    Remove-Job $job
}

Write-Output $FullOutput

This script starts off a little differently from every backup script we’ve looked at so far in this series. The first thing we do is create a variable $JobFunction which holds the definition of a function, backup_server_dbs. The reason for this will be explained slightly later on.

backup_server_dbs accepts one parameter, a string called $ServerName. After that it should look pretty familiar from our previous articles. We build a SQL Server connection, loop through all of its databases (ignoring poor tempdb) and back them up. The main difference is the inclusion of a Try, Catch, Finally block around the actual backup. If you’ve not come across this before, it’s a simple way of handling errors in PowerShell. At a high level we are:

  • Try: Try doing something
  • Catch: If “trying” has caused an error, Catch it
  • Finally: Whatever happened when trying, Finally we’ll do this

So here we Try to perform the backup; if there’s an error we Catch it and add the error message to our output string, and Finally, if we’ve had no errors, we append the completion time to our output string.
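As a standalone illustration of the pattern (a sketch that doesn’t touch SQL Server at all):

```powershell
$output = ""
try{
    # Simulate a failing backup with a terminating error
    throw "Simulated backup failure"
}
catch [System.Exception]{
    # $_ holds the error record inside the catch block
    $output += "Caught: " + $_.Exception.Message
}
finally{
    # This runs whether or not the try block threw
    $output += "; cleanup always runs"
}
Write-Output $output
```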

Once we’ve attempted to back up all of our databases, we exit the function, returning our $output variable.

Moving on from the function, we set up 2 array variables: the first, $Servers, holds our server names, and the 2nd is unpopulated but is going to hold the jobs we create to run the backups.

Using foreach we loop through each server we’ve provided, and start a new job for each one with the Start-Job cmdlet.

And this is the reason we created a variable holding a function: so we can pass it into Start-Job as an InitializationScript and then call the function contained within it from the ScriptBlock parameter. A slightly cleaner way of doing this is to keep the function in a secondary PowerShell script file and reference that file in the InitializationScript, but that’s a little hard to show in a simple blog posting, and this way also keeps everything synced in one file.
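That file-based variant might look something like this (the path C:\scripts\backup_functions.ps1 is hypothetical; the file would hold the backup_server_dbs definition):

```powershell
# Dot-source the function file in the InitializationScript so the
# ScriptBlock can see backup_server_dbs inside the job's session
$init = [ScriptBlock]::Create(". C:\scripts\backup_functions.ps1")
$jobs += Start-Job -ScriptBlock {backup_server_dbs $args[0]} -InitializationScript $init -ArgumentList $servername
```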

Once we’ve submitted the jobs we pass the array containing them ($jobs) into Wait-Job. This pauses execution until all the jobs in $jobs have completed.

Once all the jobs have completed we ForEach through each $job in the $jobs array: we use the Receive-Job cmdlet to get the return information from the job, pass it through Tee-Object to grab the output into a temporary holder ($joutput), append that to a full output variable, and then pass the job to Remove-Job to clear it from the jobs list.

And finally we output the Success and Error Messages back to the console.

This basic approach can be used any time you want batches of work running at the same time. By extending the function to take a Filegroup name and a Backup location, you could back up each filegroup to a different set of disks, thereby maximising your throughput by not just hitting a single controller, NIC or HBA:

Import-Module SQLPS -DisableNameChecking

$JobFunction={function backup_server_dbs
{param([String]$ServerName, [String]$DatabaseName, [String]$FileGroup, [String]$BackupLocation)
    
    Import-Module "SQLPS" -DisableNameChecking

    $BackupSetDescription = "Filegroup backup of Filegroup "+$FileGroup+" of "+$DatabaseName
    
    $BackupName = $BackupLocation+"\"+$DatabaseName+"_"+$FileGroup+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $ErrorFlag = 0
    try{
        Backup-SqlDatabase -ServerInstance $ServerName -Database $DatabaseName -BackupFile $BackupName -BackupSetDescription $BackupSetDescription -DatabaseFileGroup $FileGroup -BackupAction Files
    }
    Catch [System.Exception]{
        $output += "`nError with $DatabaseName at "+(Get-Date)
        $output += "`n vars = $ServerName, $DatabaseName, $BackupName, $BackupLocation"
        $output += ($_.ToString())
        $ErrorFlag = 1
    }
    Finally{
        if ($ErrorFlag -ne 1){
            $output += "`nFinished backup of $DatabaseName at "+(Get-Date)
        }
    }

    return $output
}
} 

$servername = "Server1"
$DatabaseName = "fg_test"
$BackupLocations = @("c:\psbackups\fg1","\\Server2\shareA","\\Server3\ShareB")
$jobs = @()
$i=0

$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$db = $SQLSvr.Databases.Item($DatabaseName)

foreach ($FileGroup in $db.FileGroups){
     $jobs += Start-job -ScriptBlock {backup_server_dbs $args[0] $args[1] $args[2] $args[3]} -InitializationScript $JobFunction -ArgumentList ($servername,$DatabaseName, $Filegroup.Name,  $BackupLocations[$i])
 
    $i++
}


$jobs | wait-job

$FullOutput = ""
foreach ($job in $jobs){
    Receive-Job $job | Tee-Object -Variable joutput
    $FullOutput += $joutput
    Remove-Job $job
}

Write-Output $FullOutput

If you were writing the above script for production use, you’d probably want to include a check to ensure you’ve provided as many backup locations as you have filegroups.
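Such a check could be as simple as this sketch, reusing the $db and $BackupLocations variables from the script above:

```powershell
# Bail out early if there aren't enough backup locations to go round
if ($db.FileGroups.Count -gt $BackupLocations.Count){
    throw "Only $($BackupLocations.Count) backup locations provided for $($db.FileGroups.Count) filegroups"
}
```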

Tomorrow we’ll be looking at scheduling PowerShell Tasks.


Day 6 of 31 days of SQL Server Backup and Restore using PowerShell: Redirecting backups

A very useful feature of performing your backups with PowerShell is that you can redirect your backups easily on pretty much any condition you want, or even run multiple backups asynchronously to different network drives to avoid saturating a single NIC.

At the simplest level you can push each database backup into its own folder. However, just like with T-SQL backups, if the folder doesn’t exist the backup will fail, so we’ll introduce a quick check:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name
    $BackupFolder = "c:\psbackups\"+$Db.name+"\"
    if ((Test-Path $BackupFolder) -eq $False){
        New-Item $BackupFolder -type Directory
    }
    $BackupName = $BackupFolder+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

Or we can monitor the space left on the backup drive, and if it falls below what we need (plus a margin for safety), move the backups to another area:

Import-Module "SQLPS" -DisableNameChecking

$folders = @()
$folders += "c:\psbackups"
$folders += "\\server2\backups$"
$folders += "\\server3\panicspace$"

$FolderCount = 0
$CurrentDrive = New-Object -Com Scripting.FileSystemObject
$MarginForError = 200

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name

    $Drive = $CurrentDrive.GetDrive($CurrentDrive.GetDriveName($folders[$FolderCount]))
    if ($Db.Size + $MarginForError -gt ($Drive.AvailableSpace/1024/1024)){
        $FolderCount++
    }
    $BackupName = $Folders[$FolderCount]+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

Here we build up an array of possible backup folders ($folders), set a $MarginForError in MB (in this case 200MB) and create a Scripting.FileSystemObject so we can query the drive holding each folder. As we loop through the databases we check that the AvailableSpace on the current drive is greater than our requirements (AvailableSpace returns space in bytes, so we need to convert that to MB). I check each time, just in case something else is using the disk, so we don’t get caught out. If we are short of space we increase the $FolderCount variable by 1.

When building up the backup target we use $FolderCount to pick the current path from the $folders array, and then continue with the backup as normal.

Using $Db.Size gives us a worst-case figure for the potential size of the backup. If you have a lot of free space in your DB it’ll be skewed; this could be corrected by using $Db.SpaceAvailable as well, correcting for the differences in return values (Size is in MB, SpaceAvailable in KB).
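A sketch of that correction, assuming SMO’s units (Size in MB, SpaceAvailable in KB):

```powershell
# Estimate the space a full backup actually needs, rather than the worst case:
# Size is reported in MB, SpaceAvailable in KB, so convert before subtracting
$RequiredMB = $Db.Size - ($Db.SpaceAvailable/1024)
```

You could then compare $RequiredMB (plus the margin) against the drive’s available space in the same way as before.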

Tomorrow we’ll be looking at asynchronous backups, allowing you to fire off multiple concurrently running backups from a single script.


Day 5 of 31 days of SQL Server Backup and Restore using PowerShell: File and Filegroup backups

So far we’ve only looked at the backup types supported by Maintenance Plans (Full, Differential and Log); now we’re going to go past those and look at File and FileGroup backups. Normally to perform these you’d need a 3rd party tool or your own home-rolled SQL scripts.

The script examples are based on the following database, with 3 filegroups, 1 of which (tertiary) is read-only:

USE [master]
GO

CREATE DATABASE [fg_test]
 ON  PRIMARY
( NAME = N'fg_test', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test.mdf' ),
 FILEGROUP [secondary]
( NAME = N'fg_test_2', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_2.ndf' ),
( NAME = N'fg_test_2a', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_2a.ndf' ),
 FILEGROUP [tertiary]
( NAME = N'fg_test_3', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_3.ndf' )
 LOG ON
( NAME = N'fg_test_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_log.ldf');
GO

alter database fg_test modify filegroup tertiary readonly;
go

Using SMO, to back up a single filegroup you use the Add method of the Backup object’s DatabaseFileGroups property, and everything else stays the same:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-C0BP65U3D4G"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item("fg_test")
$FileGroupName="Secondary"

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Files
$Backup.BackupSetDescription = "Filegroup Backup of Filegroup "+$FileGroupName+" of "+$Db.Name
$Backup.Database = $db.Name
$backup.DatabaseFileGroups.add($FileGroupName)

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

The line that does the work is:

$Backup.DatabaseFileGroups.Add($FileGroupName)

If you want to back up multiple FileGroups then you can call the method multiple times:

$Backup.DatabaseFileGroups.Add("primary")
$Backup.DatabaseFileGroups.Add("secondary")

Or you can pass them in via the AddRange method:

$Backup.DatabaseFileGroups.AddRange(("primary","secondary"))

It can also be done with a loop if you can identify which filegroups you want to backup. For example, if you wanted to backup all non readonly filegroups you could replace the line with this script snippet:

foreach ($fg in $db.FileGroups | where-object {$_.ReadOnly -eq $FALSE}){
    $Backup.DatabaseFileGroups.Add($fg.Name)
}

This is one of the cases where the Backup-SqlDatabase cmdlet is slightly harder to use. To replicate the first SMO example you’d use:

Import-Module SQLPS -DisableNameChecking

Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFileGroup "secondary" -BackupFile ("c:\psbackups\fg_test_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

For the second:

Import-Module SQLPS -DisableNameChecking

Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFileGroup "primary","secondary" -BackupFile ("c:\psbackups\fg_test_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

Note passing in the filegroup names as a comma separated list of strings.

And for the looped version:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-C0BP65U3D4G"
$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)

$Db = $SQLSvr.Databases.Item("fg_test")
$FileGroups = @()
foreach ($fg in $Db.FileGroups | Where-Object {$_.ReadOnly -eq $FALSE}){
    $FileGroups += $fg.Name
}
Backup-SqlDatabase -ServerInstance $ServerName -Database $Db.Name -DatabaseFileGroup $FileGroups -BackupFile ("c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

Here we still have to build the connection to the SQL Server instance so we can get the information about the filegroups. We then loop through them, adding the name of each non read-only filegroup to an array, which we pass in as the value of the -DatabaseFileGroup parameter.

Database file backups work in exactly the same way, but using the $Backup.DatabaseFiles.Add() method or the -DatabaseFile parameter.
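For instance, to back up just the fg_test_2 data file from the database created above (a sketch, reusing the same server and logical file names):

```powershell
# SMO version: switch the action to Files and add the logical file name
$Backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Files
$Backup.DatabaseFiles.Add("fg_test_2")

# Or the cmdlet version
Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFile "fg_test_2" -BackupAction Files -BackupFile "c:\psbackups\fg_test_file.bak"
```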

Tomorrow we’ll be looking at redirecting the backup files.


Day 4 of 31 days of SQL Server backup and Restore using Powershell: Catching new databases for a full backup

As the PowerShell backups are just calling the standard T-SQL backup commands, all the usual backup rules a SQL Server DBA is used to still apply.

The main one of these, is that before a Transaction or Differential Backup can be taken the database must have had at least one full backup. This ensures that there is a start to the backup chain.

A common setup at many ISVs is for a new database to be created for each new customer that signs up. If that’s an automated process kicked off by an online payment, then the new database could appear at any time, and if you’re only taking full backups once a day the database may not get a full backup for nearly 24 hours! If you’re lucky your developers or suppliers wrote in a check to do an initial backup. Well, do ya feel lucky?

And it’s usually in that first 24 hours that the new customer will be very keen, loading up lots of data and making lots of configuration choices, but not yet used to the product, potentially leading to an unexpected deletion of data. If that happens, you’ll have no way to recover the data.

By using the following PowerShell script to run your transaction log backups, you will catch any database without a previous full backup.

Import-Module "SQLPS" -DisableNameChecking

$Server = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($Server)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb" -and $_.RecoveryModel -eq "Full"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    if ($Db.LastBackupDate -lt $Db.CreateDate){
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $BackupName = "c:\psbackups\"+$db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        $Backup.BackupSetDescription = "Full Backup of "+$db.Name
    }else{
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
        $BackupName = "c:\psbackups\"+$db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".trn"
        $Backup.BackupSetDescription = "Log Backup of "+$Db.Name
        $Backup.LogTruncation = [Microsoft.SqlServer.Management.Smo.BackupTruncateLogType]::Truncate
    }

    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $Backup.Database = $Db.Name

    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)
    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($sqlsvr)
    $Backup.Devices.Remove($BackupDevice)
}

We make use of the fact that if a new database doesn’t have a backup, SQL Server returns LastBackupDate as “Monday, January 01, 0001 12:00:00 AM”. We don’t match on that value because, if a database has previously existed with the same name and nobody has run sp_delete_database_backuphistory to clear its backup history, SQL Server will return the old database’s last backup date. But that will still be less than the CreateDate of the new database, so we use that comparison instead.

Tomorrow we’ll be looking at FileGroup and File backups.


Day 3 of 31 days of SQL Server backup and Restore using Powershell: Transaction and Differential backups

So far we’ve only looked at performing Full Database backups. PowerShell and SMO are perfectly happy to handle all the other types available in SQL Server:

  • Transaction Log backup
  • Differential
  • File or FileGroup
  • File or FileGroup Differential

In this post we’ll just look at the first 2 types, and then Files and Filegroups in tomorrow’s post. But before we do, be aware that SQL Server and PowerShell do not check or correct your backup file names or extensions, so they will quite happily write your transaction log backups out to filename.bak. As we’ll cover in the restore section, this isn’t a problem if you’re looking into the files, but it may cause some confusion if someone just looks in the backup folder!

For an SMO Transaction Log backup we just need to change the backup action to Log:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log

By default under this action the backup will truncate the logs. If that’s not what you want to happen, then you can override it by using the LogTruncation property of the Backup object:

#To Not truncate the log and leave all the transactions in the log use
$Backup.LogTruncation = "NoTruncate"

#For completeness if you just want to truncate the log, and not back anything up use
$Backup.LogTruncation = "TruncateOnly"

For the Backup-SQLDatabase you use the following parameters:

Backup-SqlDatabase -BackupAction Log -LogTruncationType Truncate ...

Backup-SqlDatabase -BackupAction Log -LogTruncationType NoTruncate ...

Backup-SqlDatabase -BackupAction Log -LogTruncationType TruncateOnly ...

For an SMO Differential backup the Action stays as Database, but you set the Backup object’s Incremental property to $TRUE:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$Backup.Incremental = $TRUE

And for the Backup-SQLDatabase cmdlet you would use:

Backup-SqlDatabase -BackupAction Database -Incremental ...

So putting this together, a simple script to backup a database with the following schedule:

  • Sunday Evening at 18:00 – Full Backup
  • Every Evening (except Sunday) at 18:00 – Differential Backup
  • Every hour – Transaction Log

would look like:

$ScriptStartTime = Get-Date
$BackupFileSuffix = ""
Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
$DataBase = "DB1"

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item($DataBase)
$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Database = $db.Name

if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $false
        $backup.BackupSetDescription = "Full Backup of "+$Db.Name
        $BackupFileSuffix = ".bak"
    }else{
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $true
        $backup.BackupSetDescription = "Differential Backup of "+$Db.Name
        $BackupFileSuffix = ".bck"
    }
}else{
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
    $Backup.Incremental = $false
    $backup.BackupSetDescription = "Log Backup of "+$Db.Name
    $BackupFileSuffix = ".trn"

}

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+$BackupFileSuffix
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

The sections doing the work are:

$ScriptStartTime = Get-Date
...
if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        ....
    }else{
        ....
    }
}else{
    ....
}

As the script starts we record the time and use it as a reference throughout. This avoids a problem when looping through a number of databases on a Sunday evening: if it took more than an hour to reach the last ones, they would otherwise end up with just transaction log backups. By referencing the start time we ensure that every database touched by that run of the script gets the appropriate backup type. We then check whether the script started at 18:* (matching just the hour allows for a delayed start due to other system jobs). If it did, we check whether it’s a Sunday; if so we set up a full Database backup, ensure that we aren’t doing a differential, populate the description appropriately, and set the suffix to the usual .bak. If it’s 18:00 on any other day we still set a Database backup, but this time an incremental (differential) one, set the suffix to .bck, and populate the description accordingly. On any other run of the script we set the parameters up for a Transaction Log backup.

Tomorrow we’ll look at an example of how to use these ideas to catch newly created databases and ensure we start a new backup chain before it’s too late!

Day 2 of 31 days of SQL Server backup and Restore using Powershell: Looping through Databases

Yesterday we looked at backing up a single SQL Server database with PowerShell. There is certainly more code involved than a good old T-SQL style backup. But thanks to the wonders of PowerShell we now have a very reusable piece of code.

As a good example, if you wanted to loop through every database in a SQL Server instance we can now take the central part of the script and loop through it as many times as we want. And if we want to ignore certain DBs then that’s simple as well:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-4B40IEFH4CR\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name

    $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

By changing one line we suddenly make the script a lot more useful:

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    code
}

We get our SQL Server to return a collection containing all the database objects present on the server ($SQLSvr.Databases). We use the Where-Object cmdlet to filter down to only the databases we want. In this case, we’ve asked for all databases whose name is not equal (-ne) to tempdb. You can modify this filtering to exclude anything you’d like based on any database property, for example:

  • {$_.Name -notlike "*_test"} – returns all databases whose names don’t end in _test
  • {$_.IsSystemObject -eq $FALSE} – returns all non system databases

Then using Foreach we loop through all the DB objects in the Databases object executing our backup code.
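These filters can also be combined. A hedged sketch (assuming an SMO Server object $SQLSvr is already connected, as above) that skips both the system databases and any test copies:

```powershell
# Skip system databases and anything named like a test copy
$DbsToBackup = $SQLSvr.Databases |
    Where-Object {$_.IsSystemObject -eq $FALSE -and $_.Name -notlike "*_test"}
foreach ($db in $DbsToBackup){
    # backup code as before
}
```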

This can be extended further to loop through a list of servers as well:

Import-Module "SQLPS" -DisableNameChecking

$ServerList = @("server1","server2","server3")

foreach ($ServerName in $ServerList){
    $SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

    $Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

    foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $backup.BackupSetDescription = "Full Backup of "+$Db.Name
        $Backup.Database = $db.Name

        $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
        $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

        $Backup.Devices.Add($BackupDevice)
        $Backup.SqlBackup($SQLSvr)
        $Backup.Devices.Remove($BackupDevice)
    }
}

Now to look at getting the same outcome using the Backup-SQLDatabase cmdlet:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-4B40IEFH4CR\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $BackupFile = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    Backup-SQLDatabase -InputObject $SQLSvr -Database $Db.name -BackupFile $BackupFile -BackupAction Database -BackupSetDescription ("Full Backup of "+$Db.Name)
}

Again the backup script is shorter, though in this case we still need to use the SMO methods to create a connection to the SQL Server instance so that we can loop through its collection of databases. And since we have the connection open, we can also pass it to the Backup-SqlDatabase cmdlet, though note that we have to use the -InputObject parameter rather than the -ServerInstance parameter we used previously.

This can be extended to multiple servers in exactly the same way as the SMO version:

Import-Module "SQLPS" -DisableNameChecking

$ServerList = @("server1","server2","server3")

foreach ($ServerName in $ServerList){
    $SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
    $Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

    foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $BackupFile = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        Backup-SQLDatabase -InputObject $SQLSvr -Database $Db.name -BackupFile $BackupFile -BackupAction Database -BackupSetDescription ("Full Backup of "+$Db.Name)
    }
}

Now we’ve looked at performing full database backups, tomorrow we’ll move onto looking at transaction log and differential backups.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.


Day 1 of 31 days of SQL Server backup and Restore using Powershell: Simple Database Backup

We’ll start with the simplest form of backup script, taking a full backup of a database. This demonstrates the basic principles that will be expanded on to perform more complex operations later in the series.

The script is:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item("psdb1")

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Full Backup of "+$Db.Name
$Backup.Database = $db.Name

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

Now to break it down:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$DbName = "psdb1"
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item($DbName)

First we import the SQL PowerShell module; we use -DisableNameChecking to suppress this warning:

WARNING: The names of some imported commands from the module 'SQLPS' include unapproved verbs that might make them less discoverable. To find the commands with unapproved
verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.

which is just letting us know that Microsoft don’t always follow Microsoft’s own recommended best practice.

Next we build our connection to the SQL Server by putting the server and instance name into a string variable and passing it into the constructor for a new SQL Server object. This example assumes that you are using Windows Authentication to connect to your server, and that you are running the script under an account that has permissions on the server. You could pass the value straight to the constructor, but I find well-named variables towards the top of the script much easier to find, and you can reuse them throughout, so you don’t have to search and replace each time you want to work on a different server.
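If Windows Authentication isn’t an option, the SMO Server object’s ConnectionContext lets you switch to SQL authentication before any other call is made. A minimal sketch — the login name here is hypothetical, not from the original post:

```powershell
# Sketch: connect with SQL authentication instead of Windows Authentication
$SQLSvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)
$SQLSvr.ConnectionContext.LoginSecure = $false            # disable Windows auth
$SQLSvr.ConnectionContext.Login = "backup_user"           # hypothetical SQL login
$SQLSvr.ConnectionContext.SecurePassword = Read-Host -AsSecureString "Password"
```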

Then we get a Database object for the database we want to back up, by passing its name to the Item method of the server’s Databases collection.

The next section:

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Full Backup of "+$Db.Name
$Backup.Database = $Db.Name

This creates a new Backup object, then sets what type of backup it is; in this case we use the Database type. For later ease we give the backup set a description, then attach the name of the database to be backed up.

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

Now we create our backup device. $BackupName for the File backup device is the path to the .bak file. Remember that the path is relative to the SQL Server instance you’re asking to perform the backup, i.e. that c:\psbackups folder is on Server1, not the machine you’re running PowerShell on. The device type and path are passed into the constructor for a new backup device.

And now finally, we come to do the actual backup:

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

We add the newly created backup device to the backup object, then perform the actual backup by calling SqlBackup, passing it the SQL Server object we created before. The referenced SQL Server instance will now perform the backup before handing control back to the script, so if you’ve pointed this at a 50GB database and are backing it up to slow disks, this might be a good time to grab a coffee.

Once the backup has completed the script continues, and we remove the backup device from the backup object.
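You don’t have to wait blind during that coffee break: the SMO Backup object raises a PercentComplete event you can hook before calling SqlBackup. A hedged sketch — the 10% interval and the message are my own choices, not part of the original script:

```powershell
# Report progress every 10 percent while SqlBackup runs
$Backup.PercentCompleteNotification = 10
$Backup.add_PercentComplete({
    param($sender, $e)
    Write-Host ("Backup " + $e.Percent + "% complete")
})
$Backup.SqlBackup($SQLSvr)
```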

In this case the same operation using Backup-SqlDatabase requires much less script:

Import-Module SQLPS -DisableNameChecking
$ServerName = "Server1\SQL2012"
$DbName = "psdb1"
$BackupFile = "c:\psbackups\"+$DbName+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
Backup-SQLDatabase -ServerInstance $ServerName -Database $DbName -BackupFile $BackupFile -BackupAction Database

As you can see, this cmdlet removes the need for the scriptwriter to build the SMO objects manually; here it’s just taking in strings and doing the work on your behalf.
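As the parameter list grows, PowerShell’s splatting syntax keeps the call readable — a sketch of the same backup, assuming the variables defined in the snippet above:

```powershell
# Collect the parameters in a hashtable and splat them into the cmdlet
$BackupParams = @{
    ServerInstance = $ServerName
    Database       = $DbName
    BackupFile     = $BackupFile
    BackupAction   = "Database"
}
Backup-SqlDatabase @BackupParams
```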

Tomorrow, we’ll be looking at looping this script across all the databases in a SQL Server Instance excluding those we don’t want.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

