Day 3 of 31 days of SQL Server backup and Restore using Powershell: Transaction and Differential backups

So far we’ve only looked at performing Full Database backups. PowerShell and SMO are perfectly happy to handle all the other types available in SQL Server:

  • Transaction Log backup
  • Differential
  • File or FileGroup
  • File or FileGroup Differential

In this post we’ll just look at the first two types, and then at Files and Filegroups in tomorrow’s post. But before we do, be aware that SQL Server and PowerShell neither check nor correct your backup file names or extensions, so they will quite happily write a transaction log backup to filename.bak. As we’ll cover in the restore section, this isn’t a problem if you’re looking inside the files, but it may cause some confusion if someone just looks in the backup folder! For an SMO Transaction Log backup we just need to change the backup action to Log:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log

By default this action truncates the log after backing it up. If that’s not what you want to happen, you can override it using the LogTruncation property of the Backup object:

# To not truncate the log, leaving all the transactions in place, use:
$Backup.LogTruncation = "NoTruncate"

# For completeness, if you just want to truncate the log without backing anything up, use:
$Backup.LogTruncation = "TruncateOnly"

For the Backup-SqlDatabase cmdlet you use the following parameters:

# Default behaviour: back up the log, then truncate it
Backup-SqlDatabase -BackupAction Log -LogTruncationType Truncate ...

# Back up the log without truncating it
Backup-SqlDatabase -BackupAction Log -LogTruncationType NoTruncate ...

# Truncate the log without backing anything up
Backup-SqlDatabase -BackupAction Log -LogTruncationType TruncateOnly ...
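As a fuller illustration, a complete log-backup call might look like the sketch below. It reuses the Server1/DB1 names from the example script in this post; the backup path is a placeholder, and you’d need a live instance for this to actually run:

```powershell
# Hypothetical example: the instance name, database name and backup path
# are placeholders - substitute your own before running.
Import-Module "SQLPS" -DisableNameChecking

Backup-SqlDatabase -ServerInstance "Server1" `
    -Database "DB1" `
    -BackupAction Log `
    -LogTruncationType Truncate `
    -BackupFile "c:\psbackups\DB1_log.trn"
```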

For an SMO Differential backup the Action is left as Database, but you set the Backup object’s Incremental property to $true:

$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$Backup.Incremental = $true

And for the Backup-SQLDatabase cmdlet you would use:

Backup-SqlDatabase -BackupAction Database -Incremental ...

So putting this together, a simple script to back up a database on the following schedule:

  • Sunday Evening at 18:00 – Full Backup
  • Every Evening (except Sunday) at 18:00 – Differential Backup
  • Every hour – Transaction Log

would look like:

$ScriptStartTime = Get-Date
$BackupFileSuffix = ""
Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$SQLSvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $ServerName
$DataBase = "DB1"

$Db = $SQLSvr.Databases.Item($DataBase)
$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Database = $db.Name

if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $false
        $Backup.BackupSetDescription = "Full Backup of "+$Db.Name
        $BackupFileSuffix = ".bak"
    }else{
        $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
        $Backup.Incremental = $true
        $Backup.BackupSetDescription = "Differential Backup of "+$Db.Name
        $BackupFileSuffix = ".bck"
    }
}else{
    $Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
    $Backup.BackupSetDescription = "Log Backup of "+$Db.Name
    $BackupFileSuffix = ".trn"
}

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+$BackupFileSuffix
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem -ArgumentList $BackupName,$DeviceType

$Backup.Devices.Add($BackupDevice)
$Backup.SqlBackup($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

The sections doing the work are:

$ScriptStartTime = Get-Date
...
if ($ScriptStartTime.Hour -eq 18){
    if ($ScriptStartTime.DayOfWeek -eq "Sunday"){
        ....
    }else{
        ....
    }
}else{
    ....
}

As the script starts we record the time and use it as a reference throughout. This avoids a problem when looping through a number of databases on a Sunday evening: if it took more than an hour to reach the last ones, they would end up with just transaction log backups. By referencing the start time we ensure that every database touched by that run of the script gets the appropriate backup type.

We then check whether the script started at 18:* (matching the hour a little fuzzily allows for a delayed start due to other system jobs). If it did, we check whether it’s Sunday. If so, we set up a Database backup, ensure that we aren’t doing a differential, populate the description appropriately and set the suffix to the usual .bak. At 18:00 on any other day we still set up a Database backup, but this time an incremental (differential) one, set the suffix to .bck and populate the description to match. On any other run of the script, we set the parameters up for a transaction log backup.
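That branching can be pulled out into a small standalone function, which makes the schedule logic easy to check without touching a server. This is just a sketch mirroring the script’s decisions; the function name is my own, not part of the original script:

```powershell
# Sketch of the schedule logic: map a run time to the backup type the
# script would choose. Hours and suffixes mirror the script above.
function Get-BackupType {
    param([datetime]$RunTime)

    if ($RunTime.Hour -eq 18) {
        if ($RunTime.DayOfWeek -eq "Sunday") {
            # Sunday evening: full backup
            return [pscustomobject]@{ Action = "Database"; Incremental = $false; Suffix = ".bak" }
        } else {
            # Other evenings: differential backup
            return [pscustomobject]@{ Action = "Database"; Incremental = $true; Suffix = ".bck" }
        }
    } else {
        # Any other hour: transaction log backup
        return [pscustomobject]@{ Action = "Log"; Incremental = $false; Suffix = ".trn" }
    }
}

# Usage: 2013-09-01 was a Sunday, so 18:05 that day selects a full backup
Get-BackupType ([datetime]"2013-09-01T18:05:00")
```

Keeping the decision in one place like this also means the real backup script and any tests agree on exactly one definition of the schedule.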

Tomorrow we’ll look at an example of how to use these ideas to catch newly created databases and ensure we start a new backup chain before it’s too late!


5 thoughts on “Day 3 of 31 days of SQL Server backup and Restore using Powershell: Transaction and Differential backups”

  1. George says:

    Great series!
    How does the server know to execute this script @ 18:00 everyday? Is this built in into PS or do you need to set it up in the Task Scheduler?

    • Thank You. Glad you’re enjoying it

      Good spot, I’d not mentioned it. Yes, you’d need something to schedule the running: either Task Scheduler or, if you’re using a non-Express version of SQL Server, SQL Server Agent. In fact, if I stay on schedule, days 9 and 10 cover those very options (I’m just debating whether to split the asynchronous backups article across 2 days as it introduces quite a bit of “deep” PowerShell).

      • George says:

        Thanks for such a quick response! Here’s my motivation in studying the series. My clients use mainly SQL Express for obvious reasons. I’m always looking for ways to automate the backup tasks such that the client has the least intervention possible & I can distribute the solution easily. In fact I’ve programmed a user control which encapsulates SQL SMO objects inside the application. For many years it worked fine but suddenly it’s breaking & my guess is database/backup size. This coincided with the addition of Filestream columns for images & Word docs.

        Anyway, I’m looking for a way to automate the Sql backup with a schedule very similar to the one you presented. Since this backup file(s) will be created in a server attached disk, it would also be nice to copy them to a client computer in the same PS script.

        Thanks again.

        • That’s a strange problem. Do the backups work OK if you manually run the T-SQL statements against the database? As all SMO is doing is building up T-SQL in the background there shouldn’t be anything it can’t do size wise that a normal BACKUP can’t.

          How big are these DBs? Something in the back of my mind is wondering if SQL Express isn’t looking at the size of the DB correctly with filestream, but does notice when you read it all through the buffer at backup time and then hitting the 10GB limit. Have you tried the loading the database into an evaluation copy of Enterprise and backing that up?

          • George says:

            Hi Stuart:
            The db size is 9411MB (using SqlExpress 2008R2). I can only back up using SQL Management Studio T-SQL or using Sqlcmd. My app works fine with SQL Standard or Workgroup edition.
            I adapted your PS script (which uses SMO) but the backup starts & runs for a while; I see the backup file grow up to 8.9GB & after 5 minutes the process aborts & the backup file disappears. The same happens with my User Control which is based on SMO.
            I configured the Task Scheduler to execute a batch file using SqlCmd. The final size of the backup file is around 17GB.
            I think MS didn’t update SMO to work with the excess data that comes from the Filestream?
            Will be following you on the PS series.

            Thanks.
