Musings of a Data professional

Stuart Moore


Day 9 of 31 Days of SQL Server Backup and Restore using PowerShell: Automating with SQL Server Agent

So far in this series we’ve looked at a variety of ways you can use PowerShell to run your SQL Server backups, but we’ve not touched on how you can automate them. That is going to be the topic of the next 2 posts. Today we’ll look at automating the scripts with SQL Server Agent, and tomorrow we’ll look at doing it with Task Scheduler.

SQL Agent has been able to schedule PowerShell jobs since 2008, and the basic method hasn’t changed. Screenshots here are from SQL Server 2012:

PowerShell is just another Step type in SQL Server you can pick from the dropdown:

[Screenshot: New Job Step dialog with the PowerShell step type selected]

And then you can enter your PowerShell into the box provided. The Open button allows you to browse for an existing .ps1 file, but it only copies the file’s current contents into the box; later changes to the file won’t be picked up:

[Screenshot: PowerShell job step with the script entered in the command box]

Here I’ve used the basic looped database backup script from Day 2. One crucial point to spot here is that by default the script will be run as the “SQL Server Agent Service Account”. If you’re following SQL Server best practice, this account won’t have much access to your local disks or to network shares, which is likely to cause your backup script to fail.

The way around this is to create a SQL Server Agent Proxy for PowerShell and associate it with a credential that does have access to your backup locations, which keeps you in line with best security practice.
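As a sketch, creating the credential and proxy looks something like this. All the names and the Windows account are illustrative, not from any real setup:

```sql
USE [msdb];
GO
-- Server-level credential mapped to a Windows account that can reach the backup locations
CREATE CREDENTIAL [PSBackupCredential]
    WITH IDENTITY = N'DOMAIN\BackupService', SECRET = N'AStrongPasswordHere';
GO
-- Create the proxy and allow it to run PowerShell job steps
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'PSBackupProxy',
    @credential_name = N'PSBackupCredential';
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'PSBackupProxy',
    @subsystem_name = N'PowerShell';
GO
```

Once the proxy exists, it appears in the “Run as” dropdown on the job step.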

The other problem that can occur is that so far we have incorporated very little checking and error reporting into our scripts. This means that if an error occurs you are likely to get back less than useful error messages. Looking through the SQL Server errorlog can often provide more details. I’ll be covering checking and error reporting in more detail towards the end of this series.

If you’d rather run from saved .ps1 PowerShell scripts then you’ll need to use the “Operating system (CmdExec)” Step type:

[Screenshot: New Job Step dialog with the Operating system (CmdExec) step type selected]

You can then enter the call to your script into the Command box. Note that we are actually calling PowerShell.exe and then passing our script in as a parameter. You can’t call a PowerShell script directly.

[Screenshot: CmdExec job step calling PowerShell.exe with the script as a parameter]
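The command behind that screenshot would look something like this; the script path is illustrative, and the -ExecutionPolicy switch is a common addition when the server’s policy would otherwise block script files:

```powershell
powershell.exe -ExecutionPolicy RemoteSigned -File "C:\Scripts\DatabaseBackup.ps1"
```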

The same security restrictions apply to this step as to the PowerShell one, so you may have to create a second SQL Server Agent Proxy or grant CmdExec to an existing proxy.

Now you’ve created your job, you can schedule it like any other SQL Server job.

So this is all great if you’ve got SQL Server Agent, but what if you’re using SQL Server Express and don’t have SQL Agent to schedule your backups? Well, we’ll cover scheduling PowerShell with Windows Task Scheduler tomorrow.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 8 of 31 Days of SQL Server Backup and Restore using PowerShell: Asynchronous Backups – Part 2 – PowerShell Jobs

Yesterday we looked at using PowerShell events to track asynchronous backups, letting us fire off concurrent backup sessions from the same script. All of those backups were run independently of each other, so what would you do if you wanted to run batches of backups concurrently (i.e. all databases on Server A and all databases on Server B at the same time)?

That’s where PowerShell Jobs come in. Introduced in PowerShell 2.0, they let you fire off a block of script and have it run in the background. You can then wait for the jobs to finish, or poll them yourself.

We’ll use the example above of wanting to backup all the databases on 2 SQL Server installs concurrently:

$JobFunction = {function backup_server_dbs
{
    param([String]$ServerName)

    Import-Module "SQLPS" -DisableNameChecking

    $SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)
    $output = ""
    $BackupFailed = 0

    foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $BackupSetDescription = "Full Backup of "+$Db.Name
        $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        try{
            Backup-SqlDatabase -ServerInstance $ServerName -Database $Db.Name -BackupFile $BackupName -BackupSetDescription $BackupSetDescription
        }
        catch [System.Exception]{
            $output += "`nError with $($Db.Name) at "+(Get-Date)
            $output += ($error[0].ToString())
            $BackupFailed = 1
        }
        finally{
            if ($BackupFailed -ne 1){
                $output += "`nFinished backup of $($Db.Name) at "+(Get-Date)
            }
            $BackupFailed = 0
        }
    }
    return $output
}
}

$servers = @("Server1","Server2")
$jobs = @()

foreach ($servername in $servers){
    $jobs += Start-Job -ScriptBlock {backup_server_dbs $args[0]} -InitializationScript $JobFunction -ArgumentList $servername
}

$jobs | Wait-Job

$FullOutput = ""
foreach ($job in $jobs){
    Receive-Job $job | Tee-Object -Variable joutput
    $FullOutput += $joutput
    Remove-Job $job
}

Write-Output $FullOutput

This script starts off a little differently to every backup script we’ve looked at so far in this series. The first thing we do is create a variable, $JobFunction, which holds the definition of a function, backup_server_dbs. The reason for this will be explained slightly later on.

backup_server_dbs accepts one parameter, a string called $ServerName. After that it should look pretty familiar from our previous articles: we build a SQL Server connection, loop through all of its databases (ignoring poor TempDB) and back them up. The main difference is the inclusion of a Try, Catch, Finally block around the actual backup. If you’ve not come across this before, it’s a simple way of handling errors in PowerShell. At a high level we are saying:

  • Try: Try doing something
  • Catch: If “trying” caused an error, Catch it
  • Finally: Whatever happened while trying, Finally we’ll do this

So here we Try to perform the backup; if there’s an error we Catch it and add the error message to our output string, and Finally, if we’ve had no errors, we append the completion time to our output string.
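Stripped down to its bare bones, the pattern looks like this (the server, database and path are placeholders):

```powershell
$output = ""
try{
    # Try doing something - here, the backup itself
    Backup-SqlDatabase -ServerInstance "Server1" -Database "DB1" -BackupFile "c:\psbackups\DB1.bak"
}
catch [System.Exception]{
    # Runs only if the Try block threw an error
    $output += "`nError: " + $error[0].ToString()
}
finally{
    # Runs whether the Try block succeeded or failed
    $output += "`nBackup attempt finished at " + (Get-Date)
}
```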

Once we’ve attempted to back up all of our databases we exit the function, returning our $output variable.

Moving on from the function, we set up 2 array variables: the first, $Servers, holds our server names, and the second starts off empty but is going to hold the jobs we create to run the backups.

Using foreach we loop through each server we’ve provided, and start a new job for each one with the Start-Job cmdlet.

And this is the reason we created a variable holding a function: it lets us pass the function into Start-Job via the InitializationScript parameter, and then call it from the ScriptBlock parameter. A slightly cleaner way of doing this is to keep the function in a secondary PowerShell script file and reference that file in the InitializationScript, but that’s a little hard to show in a simple blog post, and this way also keeps everything synced in one file.
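If you did want the separate-file approach, the idea is to dot-source the file inside the initialization script block. The file path here is hypothetical, and BackupFunctions.ps1 would be assumed to contain the backup_server_dbs definition:

```powershell
# Build an initialization script block that dot-sources a shared functions file
$InitFromFile = [ScriptBlock]::Create(". 'C:\Scripts\BackupFunctions.ps1'")
$jobs += Start-Job -ScriptBlock {backup_server_dbs $args[0]} -InitializationScript $InitFromFile -ArgumentList $servername
```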

Once we’ve submitted the jobs we pass the array containing them ($jobs) into Wait-Job. This pauses execution until all the jobs in $jobs have completed.
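If you’d rather poll the jobs yourself than block on Wait-Job (to log progress as you go, say), a minimal sketch using the same $jobs array might look like this:

```powershell
# Poll every 10 seconds until no job in $jobs is still running
while (@($jobs | Where-Object {$_.State -eq "Running"}).Count -gt 0){
    Write-Output ("Jobs still running: " + @($jobs | Where-Object {$_.State -eq "Running"}).Count)
    Start-Sleep -Seconds 10
}
```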

Once all the jobs have completed we loop through each $job in the $jobs array, use the Receive-Job cmdlet to get its return information, pass that through Tee-Object to grab the output into a temporary holder, append it to a full output variable, and then pass the job to Remove-Job to clear it from the jobs list.

And finally we output the Success and Error Messages back to the console.

This basic approach can be used any time you want batches of work running at the same time. By extending the function to take a filegroup name and a backup location, you could back up each filegroup to a different set of disks, thereby maximising your throughput by not hitting just a single controller, NIC or HBA:

Import-Module SQLPS -DisableNameChecking

$JobFunction = {function backup_server_dbs
{param([String]$ServerName, [String]$DatabaseName, [String]$FileGroup, [String]$BackupLocation)

    Import-Module "SQLPS" -DisableNameChecking

    $output = ""
    $BackupFailed = 0

    $BackupSetDescription = "Filegroup backup of Filegroup "+$FileGroup+" of "+$DatabaseName

    $BackupName = $BackupLocation+"\"+$DatabaseName+"_"+$FileGroup+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    try{
        Backup-SqlDatabase -ServerInstance $ServerName -Database $DatabaseName -BackupFile $BackupName -BackupSetDescription $BackupSetDescription -DatabaseFileGroup $FileGroup -BackupAction Files
    }
    catch [System.Exception]{
        $output += "`nError with $DatabaseName at "+(Get-Date)
        $output += "`n vars = $ServerName, $DatabaseName, $BackupName, $BackupLocation"
        $output += ($error[0].ToString())
        $BackupFailed = 1
    }
    finally{
        if ($BackupFailed -ne 1){
            $output += "`nFinished backup of $DatabaseName at "+(Get-Date)
        }
    }

    return $output
}
}

$servername = "Server1"
$DatabaseName = "fg_test"
$BackupLocations = @("c:\psbackups\fg1","\\Server2\shareA","\\Server3\ShareB")
$jobs = @()
$i=0

$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($servername)
$Db = $SQLSvr.Databases.Item($DatabaseName)

foreach ($FileGroup in $Db.FileGroups){
    $jobs += Start-Job -ScriptBlock {backup_server_dbs $args[0] $args[1] $args[2] $args[3]} -InitializationScript $JobFunction -ArgumentList ($servername, $DatabaseName, $FileGroup.Name, $BackupLocations[$i])
    $i++
}


$jobs | Wait-Job

$FullOutput = ""
foreach ($job in $jobs){
    Receive-Job $job | Tee-Object -Variable joutput
    $FullOutput += $joutput
    Remove-Job $job
}

Write-Output $FullOutput

If you were writing the above script for production use, you’d probably want to include a check to ensure you’ve provided as many backup locations as you have filegroups.
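Placed before the foreach loop, and using the variables already defined above, such a guard could be as simple as:

```powershell
# Stop before starting any jobs if we don't have a location for every filegroup
if ($Db.FileGroups.Count -gt $BackupLocations.Count){
    Write-Error ("Not enough backup locations: " + $Db.FileGroups.Count + " filegroups but only " + $BackupLocations.Count + " locations supplied")
    return
}
```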

Tomorrow we’ll be looking at scheduling PowerShell Tasks.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 6 of 31 Days of SQL Server Backup and Restore using PowerShell: Redirecting backups

A very useful feature of performing your backups with PowerShell is that you can redirect your backups easily on pretty much any condition you want, or even run multiple backups asynchronously to different network drives to save saturating a single NIC.

At the simplest level you can push each database backup into its own folder. However, just like with T-SQL backups, if the folder doesn’t exist the backup will fail, so we’ll introduce a quick check:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name
    $BackupFolder = "c:\psbackups\"+$Db.name+"\"
    if ((Test-Path $BackupFolder) -eq $False){
        New-Item $BackupFolder -type Directory
    }
    $BackupName = $BackupFolder+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

Or we can monitor the space left on the backup drive and, if it falls below what we need (plus a margin for safety), move the backups to another area:

Import-Module "SQLPS" -DisableNameChecking

$folders = @()
$folders += "c:\psbackups"
$folders += "\\server2\backups$"
$folders += "\\server3\panicspace$"

$FolderCount = 0
$FSO = New-Object -Com Scripting.FileSystemObject
$MarginForError = 200

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name

    $CurrentDrive = $FSO.GetDrive($FSO.GetDriveName($folders[$FolderCount]))
    if ($Db.Size + $MarginForError -gt ($CurrentDrive.AvailableSpace/1024/1024)){
        $FolderCount++
    }
    $BackupName = $Folders[$FolderCount]+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

Here we build up an array of possible backup folders ($Folders), set a $MarginForError in MB (in this case 200MB) and create a Scripting.FileSystemObject so we can ask for drive information. As we loop through the databases we check that the AvailableSpace on the current drive is greater than our requirements (AvailableSpace returns space in bytes, so we convert it to MB). We check on every pass, just in case something else is using the disk, so we don’t get caught out. If we are short of space we increase the $FolderCount variable by 1.

When building up the backup target we use $FolderCount to pick the currently used path from the $Folders array, and then continue with the backup as normal.

Using $Db.Size gives us a worst-case figure for the potential size of the backup. If you have a lot of free space in your database it will be skewed; this could be corrected by also using $Db.SpaceAvailable, correcting for the difference in units (Size is in MB, SpaceAvailable in KB).
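That correction might look something like this; treat it as a rough estimate only, since the backup’s actual size also depends on factors like compression:

```powershell
# Estimated backup size in MB: allocated size (MB) minus free space inside the files (KB -> MB)
$EstimatedBackupMb = $Db.Size - ($Db.SpaceAvailable / 1024)
```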

Tomorrow we’ll be looking at Asynchronous backups, allowing you to fire off multiple concurrently running backups from a single script

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 5 of 31 Days of SQL Server Backup and Restore using PowerShell: File and Filegroup backups

So far we’ve only looked at backup types supported by Maintenance Plans (Full, Differential and Log); now we’re going to go past those and look at File and FileGroup backups. Normally to perform these you’d need a 3rd-party tool or your own home-rolled SQL scripts.

The script examples are based on the following database, with 3 filegroups, 1 of which (tertiary) is read-only:

USE [master]
GO

CREATE DATABASE [fg_test]
 ON  PRIMARY
( NAME = N'fg_test', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test.mdf' ),
 FILEGROUP [secondary]
( NAME = N'fg_test_2', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_2.ndf' ),
( NAME = N'fg_test_2a', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_2a.ndf' ),
 FILEGROUP [tertiary]
( NAME = N'fg_test_3', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_3.ndf' )
 LOG ON
( NAME = N'fg_test_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\fg_test_log.ldf');
GO

alter database fg_test modify filegroup tertiary readonly;
go

Using SMO, to back up a single filegroup you use the Add method of the Backup object’s DatabaseFileGroups property, and everything else stays the same:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-C0BP65U3D4G"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item("fg_test")
$FileGroupName="Secondary"

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Files
$Backup.BackupSetDescription = "Filegroup Backup of Filegroup "+$FileGroupName+" of "+$Db.Name
$Backup.Database = $db.Name
$backup.DatabaseFileGroups.add($FileGroupName)

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

The line that does the work is:

$Backup.DatabaseFileGroups.Add($FileGroupName)

If you want to backup multiple FileGroups then you can call the method multiple times:

$Backup.DatabaseFileGroups.Add("primary")
$Backup.DatabaseFileGroups.Add("secondary")

Or you can pass them in via the AddRange method:

$Backup.DatabaseFileGroups.AddRange(("primary","secondary"))

It can also be done with a loop if you can identify which filegroups you want to back up. For example, to back up all non-read-only filegroups you could replace the line with this script snippet:

foreach ($fg in $db.FileGroups | where-object {$_.ReadOnly -eq $FALSE}){
    $Backup.DatabaseFileGroups.Add($fg.Name)
}

This is one of the cases where Backup-SqlDatabase is slightly harder to use. To replicate the first SMO example you’d use:

Import-Module SQLPS -DisableNameChecking

Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFileGroup "secondary" -BackupFile ("c:\psbackups\fg_test_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

For the second:

Import-Module SQLPS -DisableNameChecking

Backup-SqlDatabase -ServerInstance WIN-C0BP65U3D4G -Database fg_test -DatabaseFileGroup "primary","secondary" -BackupFile ("c:\psbackups\fg_test_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

Note passing in the filegroup names as a comma separated list of strings.

And for the looped version:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-C0BP65U3D4G"
$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)

$Db = $SQLSvr.Databases.Item("fg_test")

$FileGroups = @()
foreach ($fg in $Db.FileGroups | Where-Object {$_.ReadOnly -eq $FALSE}){
    $FileGroups += $fg.Name
}
Backup-SqlDatabase -ServerInstance $ServerName -Database $Db.Name -DatabaseFileGroup $FileGroups -BackupFile ("c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")

Here we still have to build the connection to the SQL Server instance so we can get the information about the filegroups. We then loop through them, collecting the names of the writeable filegroups into an array, which can be passed straight to the -DatabaseFileGroup parameter.

Database file backups work in exactly the same way, but using the $Backup.DatabaseFiles.Add() method or the -DatabaseFile parameter.
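For example, to back up just the fg_test_2 data file from the demo database, a sketch of both versions (reusing the example instance name from above) might look like:

```powershell
# SMO version: switch the action to Files and name the data file
$Backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Files
$Backup.DatabaseFiles.Add("fg_test_2")

# Cmdlet version
Backup-SqlDatabase -ServerInstance "WIN-C0BP65U3D4G" -Database "fg_test" -DatabaseFile "fg_test_2" -BackupAction Files -BackupFile ("c:\psbackups\fg_test_fg_test_2_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak")
```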

Tomorrow we’ll be looking at redirecting the backup files.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 4 of 31 Days of SQL Server Backup and Restore using PowerShell: Catching new databases for a full backup

As the PowerShell backups are just calling the standard T-SQL backup commands, all the usual backup rules a SQL Server DBA is used to still apply.

The main one of these, is that before a Transaction or Differential Backup can be taken the database must have had at least one full backup. This ensures that there is a start to the backup chain.

A common setup at many ISVs is for a new database to be created for each new customer that signs up. If that’s an automated process kicked off by an online payment, then the new database could appear at any time, and if you’re only taking full backups once a day the database may not get a full backup for nearly 24 hours! If you’re lucky, your developers or suppliers wrote in a check to take an initial backup. Do ya feel lucky?

And it’s usually in that first 24 hours that the new customer will be very keen, loading up lots of data and making lots of configuration choices; but, not yet being used to the product, they could easily delete data unexpectedly. If that happens, you’ll have no way to recover it.

By using the following PowerShell script to run your transaction log backups, you will catch any database without a previous backup.

Import-Module "SQLPS" -DisableNameChecking

$Server = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($Server)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb" -and $_.RecoveryModel -eq "Full"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    if ($Db.LastBackupDate -lt $Db.CreateDate){
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $BackupName = "c:\psbackups\"+$db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        $Backup.BackupSetDescription = "Full Backup of "+$db.Name
    }else{
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
        $BackupName = "c:\psbackups\"+$db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".trn"
        $Backup.BackupSetDescription = "Log Backup of "+$Db.Name
        $Backup.LogTruncation = [Microsoft.SqlServer.Management.Smo.BackupTruncateLogType]::Truncate
    }

    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $Backup.Database = $Db.Name

    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)
    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($sqlsvr)
    $Backup.Devices.Remove($BackupDevice)
}

We make use of the fact that if a new database doesn’t have a backup, SQL Server returns a LastBackupDate of “Monday, January 01, 0001 12:00:00 AM”. We don’t match on that value, because if a database with the same name previously existed and nobody ran sp_delete_database_backuphistory to clear its backup history, SQL Server will return the old database’s last backup date. But that will still be earlier than the CreateDate of the new database, so we compare against that instead.

Tomorrow we’ll be looking at FileGroup and File backups

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 3 of 31 Days of SQL Server Backup and Restore using PowerShell: Transaction and Differential backups

So far we’ve only looked at performing Full Database backups. PowerShell and SMO are perfectly happy to handle all the other types available in SQL Server:

  • Transaction Log backup
  • Differential
  • File or FileGroup
  • File or FileGroup Differential

In this post we’ll just look at the first 2 types, leaving Files and Filegroups for tomorrow’s post.

Before we do, be aware that SQL Server and PowerShell do not check or correct your backup file names or extensions, so they will quite happily write your transaction log backups out to filename.bak. As we’ll cover in the restore section, this isn’t a problem if you’re looking inside the files, but it may cause some confusion if someone just looks in the backup folder!

For an SMO transaction log backup we just need to change the backup action to Log:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log

By default under this action the backup will truncate the logs. If that’s not what you want to happen, then you can override it by using the LogTruncation property of the Backup object:

#To Not truncate the log and leave all the transactions in the log use
$Backup.LogTruncation = "NoTruncate"

#For completeness if you just want to truncate the log, and not back anything up use
$Backup.LogTruncation = "TruncateOnly"

For Backup-SqlDatabase you use the following parameters:

Backup-SqlDatabase -BackupAction Log -LogTruncationType Truncate ...

Backup-SqlDatabase -BackupAction Log -LogTruncationType NoTruncate ...

Backup-SqlDatabase -BackupAction Log -LogTruncationType TruncateOnly ...

For an SMO Differential backup the Action is left as Database, but you set the backup object’s Incremental property to $TRUE:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$Backup.Incremental = $TRUE

And for the Backup-SQLDatabase cmdlet you would use:

Backup-SqlDatabase -BackupAction Database -Incremental ...

So putting this together, a simple script to backup a database with the following schedule:

  • Sunday Evening at 18:00 – Full Backup
  • Every Evening (except Sunday) at 18:00 – Differential Backup
  • Every hour – Transaction Log

would look like:

$ScriptStartTime = Get-Date
$BackupFileSuffix = ""
Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
$DataBase = "DB1"

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item($DataBase)
$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Database = $db.Name

if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $false
        $backup.BackupSetDescription = "Full Backup of "+$Db.Name
        $BackupFileSuffix = ".bak"
    }else{
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $true
        $backup.BackupSetDescription = "Differential Backup of "+$Db.Name
        $BackupFileSuffix = ".bck"
    }
}else{
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
    $backup.BackupSetDescription = "Log Backup of "+$Db.Name
    $BackupFileSuffix = ".trn"

}

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+$BackupFileSuffix
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

The sections doing the work are:

$ScriptStartTime = Get-Date
...
if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        ....
    }else{
        ....
    }
}else{
    ....
}

As the script starts we grab the time it started and use that as a reference. This avoids a problem on Sunday evenings: if we were looping through a number of databases and it took more than an hour to reach the last one, the later ones would end up getting just transaction log backups. By referencing the start time we ensure that all the databases touched by one run of the script get the appropriate backup type.

We then check whether the script started at 18:* (matching a little fuzzily to allow for a delayed start due to other system jobs). If it did, we check whether it’s a Sunday. If it is, we set up a Database backup, ensure we aren’t doing a differential, populate the description appropriately and set the suffix to .bak as normal. If it’s 18:00 on any other day we set up a Database backup but mark it as Incremental, set the suffix to .bck and populate the description to match. On any other run of the script, we set the parameters up for a transaction log backup.

Tomorrow we’ll look at an example of how to use these ideas to catch newly created databases and ensure we start a new backup chain before it’s too late!

Day 2 of 31 Days of SQL Server Backup and Restore using PowerShell: Looping through Databases

Yesterday we looked at backing up a single SQL Server database with PowerShell. There is certainly more code involved than a good old T-SQL style backup. But thanks to the wonders of PowerShell we now have a very reusable piece of code.

As a good example, if you wanted to loop through every database in a SQL Server instance we can now take the central part of the script and loop through it as many times as we want. And if we want to ignore certain DBs then that’s simple as well:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-4B40IEFH4CR\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
    $backup.BackupSetDescription = "Full Backup of "+$Db.Name
    $Backup.Database = $db.Name

    $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

    $Backup.Devices.Add($BackupDevice)
    $Backup.SqlBackup($SQLSvr)
    $Backup.Devices.Remove($BackupDevice)
}

By changing one line we suddenly make the script a lot more useful:

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    code
}

We get our SQL Server to return an object containing all the database objects present on the server ($SQLSvr.Databases). We use the Where-Object cmdlet to filter down to only the databases we want. In this case, we’ve asked for all databases whose name is not equal (-ne) to TempDB. You can modify this filtering to exclude anything you’d like based on any database property, for example:

  • {$_.Name -notlike "*_test"} – returns all databases that don’t end in _test
  • {$_.IsSystemObject -eq $FALSE} – returns all non system databases

Then using Foreach we loop through all the DB objects in the Databases object executing our backup code.
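Conditions can also be combined with -and. For example, to skip both the system databases and any test copies in one filter:

```powershell
# Combine filters: skip system databases and anything ending in _test
foreach ($db in $SQLSvr.Databases | Where-Object {$_.IsSystemObject -eq $FALSE -and $_.Name -notlike "*_test"}){
    # backup code as before
}
```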

This can be extended further to loop through a list of servers as well:

Import-Module "SQLPS" -DisableNameChecking

$ServerList = @("server1","server2","server3")

foreach ($ServerName in $ServerList){
    $SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

    $Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

    foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
        $Backup.Database = $db.Name

        $BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        $DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
        $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

        $Backup.Devices.Add($BackupDevice)
        $Backup.SqlBackup($SQLSvr)
        $Backup.Devices.Remove($BackupDevice)
    }
}
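With multiple servers, one unreachable instance shouldn’t stop the rest being backed up. A minimal sketch of wrapping the outer loop in try/catch (error handling is covered properly towards the end of the series):

```powershell
foreach ($ServerName in $ServerList) {
    try {
        $SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerName)
        # ... inner database backup loop as above ...
    }
    catch {
        # Log the failure and carry on with the next server
        Write-Warning "Backup failed on $ServerName : $_"
    }
}
```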

Now to look at getting the same outcome using the Backup-SQLDatabase cmdlet:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "WIN-4B40IEFH4CR\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
    $BackupFile = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    Backup-SQLDatabase -InputObject $SQLSvr -Database $Db.Name -BackupFile $BackupFile -BackupAction Database -BackupSetDescription ("Full Backup of "+$Db.Name)
}

Again the backup script is shorter, though in this case we still need to use the SMO methods to create a connection to the SQL Server instance so we can loop through its collection of databases. And as we have the connection open, we can also pass it to the Backup-SQLDatabase cmdlet, though note that we have to use the -InputObject parameter rather than the -ServerInstance parameter we used previously.
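To make the difference explicit, here are the two ways of pointing the cmdlet at an instance side by side (a sketch, with hypothetical parameter values):

```powershell
# By name: the cmdlet builds its own connection from a string
Backup-SQLDatabase -ServerInstance "Server1\SQL2012" -Database "psdb1" -BackupFile $BackupFile -BackupAction Database

# By object: reuse an SMO Server connection we've already opened
Backup-SQLDatabase -InputObject $SQLSvr -Database "psdb1" -BackupFile $BackupFile -BackupAction Database
```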

This can be extended to multiple servers in exactly the same way as the SMO version:

Import-Module "SQLPS" -DisableNameChecking

$ServerList = @("server1","server2","server3")

foreach ($ServerName in $ServerList){
    $SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
    $Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

    foreach ($db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
        $BackupFile = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
        Backup-SQLDatabase -InputObject $SQLSvr -Database $Db.Name -BackupFile $BackupFile -BackupAction Database -BackupSetDescription ("Full Backup of "+$Db.Name)
    }
}

Now we’ve looked at performing full database backups, tomorrow we’ll move onto looking at transaction log and differential backups.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.


Day 1 of 31 days of SQL Server backup and Restore using Powershell: Simple Database Backup

We’ll start with the simplest form of backup script, taking a full backup of a database. This demonstrates the basic principles that will then be expanded on to perform more complex operations later in the series.

The script is:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item("psdb1")

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Full Backup of "+$Db.Name
$Backup.Database = $db.Name

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

Now to break it down:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$DbName = "psdb1"
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Db = $SQLSvr.Databases.Item($DbName)

First we import the SQL PowerShell module, we use the -DisableNameChecking to hide this error message:

WARNING: The names of some imported commands from the module 'SQLPS' include unapproved verbs that might make them less discoverable. To find the commands with unapproved
verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.

which is just letting us know that Microsoft don’t always follow their own recommended best practice.
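If you’re curious which commands triggered the warning, a quick sketch to list them (assuming the SQLPS module imports successfully):

```powershell
Import-Module SQLPS -DisableNameChecking

# List any commands in the module whose verb isn't in the approved set
$Approved = (Get-Verb).Verb
Get-Command -Module SQLPS -CommandType Cmdlet, Function |
    Where-Object { $Approved -notcontains $_.Verb }
```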

Next we build our connection to our SQL Server by putting the server and instance name into a string variable and then passing it into the constructor for a new SQL Server object. This example assumes that you are using Windows Authentication and that you are running the script under an account that has permissions on the server. You could pass the value straight to the constructor, but I find that well-named variables towards the top of the script are much easier to find, and you can reuse them throughout, so you don’t have to search and replace each time you want to work on a different server.

Then we get the Database object for the database we want to back up from the server’s Databases collection.

The next section:

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Full Backup of "+$Db.Name
$Backup.Database = $Db.Name

This creates a new Backup object, and then we say what type of backup it is; in this case we use the Database type. For later ease we give the backup set a description, and then attach the name of the database to be backed up.

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

Now we create our backup device. $BackupName for the File backup device is the path to the .bak file. Remember that the path is relative to the SQL Server instance you’re asking to perform the backup; i.e. that c:\psbackups folder is on Server1, not on the machine you’re running the PowerShell on. The device type and path are passed into the constructor for a new backup device.
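Since the path is resolved by the SQL Server instance, backing up a remote server to somewhere you can reach usually means a network share. A sketch with a hypothetical UNC path (the SQL Server service account needs write access to the share):

```powershell
# Hypothetical share; the path is opened by the SQL Server service, not by your session
$BackupName = "\\backupserver\sqlbackups\" + $Db.Name + "_" + [DateTime]::Now.ToString("yyyyMMdd_HHmmss") + ".bak"
```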

And now finally, we come to do the actual backup:

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

Add the newly created backup device to the backup object, then perform the actual backup by calling the SqlBackup method, passing it the SQL Server object we created before. The referenced SQL Server instance will now perform the backup before handing control back to the script, so if you’ve pointed this at a 50GB database and are backing it up to slow disks this might be a good time to grab a coffee.

Once the backup has completed the script continues, and we remove the backup device from the backup object.
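Because SqlBackup blocks until the server finishes, the SMO Backup object’s progress event can be used to show that something is happening. A minimal sketch, added before calling SqlBackup:

```powershell
# Fire the PercentComplete event every 10% of the backup
$Backup.PercentCompleteNotification = 10
$Backup.add_PercentComplete({
    param($sender, $e)
    Write-Host ("{0}% complete" -f $e.Percent)
})
```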

In this case the same operation using Backup-SQLDatabase requires much less script:

Import-Module SQLPS -DisableNameChecking
$ServerName = "Server1\SQL2012"
$DbName = "psdb1"
$BackupFile = "c:\psbackups\"+$DbName+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
Backup-SQLDatabase -ServerInstance $ServerName -Database $DbName -BackupFile $BackupFile -BackupAction Database

As you can see, this new cmdlet removes the need for the script writer to build the SMO objects manually, as it just takes in strings and does the work on your behalf.

Tomorrow, we’ll be looking at looping this script across all the databases in a SQL Server Instance excluding those we don’t want.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 0 of 31 days of SQL Server backup and Restore using Powershell: Why not T-SQL?

Last day of introductory material, tomorrow will feature an actual example and some code, promise.

So why do I think that it’s better to step outside of SQL Server into the world of PowerShell for doing backups?

SQL Server Maintenance plans are pretty good for nice simple environments. A few clicks and 2 maintenance plans later you’ve got some nice scheduled jobs that back your database up once a day, and your transaction logs each hour. Excellent.

But what about if you want to treat your system databases (master, model, tempdb, msdb) differently? Ok, you create another set of maintenance plans. Those databases are pretty fixed (you hope!) so it’s not too much work.

But what if you have hundreds of databases in a single instance and you want to vary when their backups run to spare your IO system? You could build up a number of maintenance plans, but they’ll still need manual maintenance. How much easier would it be to have a set of simple scripts so you can define rules such as ‘Backup first 10% of databases at 01:00’, ‘Backup second 10% of databases at 02:00’? Much less of your time will be spent maintaining maintenance plans, so you can get on with more of the fun parts of your role.
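A sketch of that batching idea, splitting the user databases into tenths which separate scheduled jobs could then pick up (variable names and batch sizes are hypothetical):

```powershell
# All user databases on the instance, as an array
$Databases = @($SQLSvr.Databases | Where-Object { $_.IsSystemObject -eq $FALSE })
$BatchSize = [Math]::Ceiling($Databases.Count / 10)

# First batch, e.g. for the 01:00 job; later jobs skip earlier batches
$Batch1 = $Databases | Select-Object -First $BatchSize
$Batch2 = $Databases | Select-Object -Skip $BatchSize -First $BatchSize
```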

SQL Server doesn’t even offer a ‘Maintenance Plan’ to help automate restoring your databases to check your backups, so you’re on your own from the start. If you only want to perform a very simple automated restore, then you could write some simple T-SQL and schedule it via SQL Server Agent. But this would be very hard to write so it could work with any database, or be able to catch unusual setups. You end up in the world of cursors and Dynamic SQL, and are limited by SQL Server’s limited access to the filesystem (assuming you’ve set your security up for security rather than ease of administration, and we all do that, don’t we?).

By using PowerShell we can create scripts that can loop through folders and build up complex restores, can access OS and filesystems to work out how best to layout files, or to abort cleanly if there’s a problem. And if we combine that with a database behind it containing schedules and information, then we can have an automated system that will cope with restoring any of our databases. So when management (or the auditors) ask me how I can be sure that our backups work and that my RTO estimates are accurate I can just pull the numbers out and show them.

Hopefully the next 31 days of posts and scripts will help you towards that goal as well.

Tomorrow, the first script!

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day -1 of 31 days of SQL Server backup and Restore using Powershell: SQL Server Powershell Components

Continuing my pre-series introduction to 31 Days of SQL Server Backup and Restore using PowerShell, today we’ll look at getting the PowerShell SQL Server components, with a quick mention of GUIs.

All Microsoft operating systems since Windows XP have shipped with PowerShell either installed or as an option. At the time of writing the current version is 3.0 (Download PowerShell v3.0). (PowerShell 4.0 is currently in public beta, but won’t be covered in this series)

Version 3 introduced enough new features to make it worthwhile upgrading to, especially for automating jobs. These include:

  • Better Scheduled Job Integration
  • Improved Integrated Scripting Environment
  • Workflows

A basic install of PowerShell doesn’t include the SQL PowerShell module which allows you to access the SQL Management Objects. These are installed when you install the client tools from your SQL Server media. They are also now freely available as part of SQL Server 2012 Express Service Pack 1 (download here).

The SMO capabilities are the same across the PowerShell module in each SQL Server edition, the limitations in usage are controlled by the edition of SQL Server you connect to. This means you can install the free SQL Express version on your desktop to control your Enterprise edition Servers, and that scripts developed on Express will also work with Standard and Enterprise editions.

SMO also relies on standard T-SQL calls, so as long as the version of SQL Server supports the functionality you can use it. For instance, all the scripts in this series will work back to SQL Server 2005 (and the basic backups will also work back to SQL Server 7 and 2000). But just because PowerShell supports AlwaysOn Availability Groups, it doesn’t mean you can enable them on SQL Server 2005.

PowerShell 3.0 ships with a fairly usable Integrated Scripting Environment (ISE) GUI. This gives you IntelliSense, some basic debugging functionality, and an easier-to-read interface to Get-Help.

If you start developing more complex PowerShell scripts then a more fully featured IDE may be preferable. There are a number available, both free and paid for; which one you prefer will mostly come down to personal preference, so I’d suggest trying a few and then choosing one.

Tomorrow I’ll lay out some of the reasons why I’ve moved away from T-SQL for a lot of my backup and restore automation.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.
