Musings of a Data professional

Stuart Moore


Day 17 of 31 Days of SQL Server Backup and Restore using PowerShell: More complex restore scenarios with Differentials

Building on yesterday's technique for ensuring we restore the correct backup files in the correct order, today we're going to look at what happens when you bring differentials into the mix, and maybe have more than one day's backups for your target database.

Like yesterday, we’ll build a database to use as a demonstration:

create database DiffRestore;
go

use DiffRestore
go

create table steps(
StepID int identity(1,1),
StepTime datetime
)
go
backup database DiffRestore to disk='c:\DiffRestore\DiffRestore.bak'
go
insert into steps (steptime) values (current_timestamp)
go
backup log DiffRestore to disk='c:\DiffRestore\DiffRestore.trn'
go
insert into steps (steptime) values (current_timestamp)
go
backup database DiffRestore to disk='c:\DiffRestore\DiffRestore.bak'
go
insert into steps (steptime) values (current_timestamp)
go
backup log DiffRestore to disk='c:\DiffRestore\DiffRestore2.trn'
go
insert into steps (steptime) values (current_timestamp)
go
backup database DiffRestore to disk='c:\DiffRestore\DiffRestore.bak' with differential
go
insert into steps (steptime) values (current_timestamp)
go
backup log DiffRestore to disk='c:\DiffRestore\DiffRestore.trn'
go
insert into steps (steptime) values (current_timestamp)
go
backup database DiffRestore to disk='c:\DiffRestore\DiffRestore.bak' with differential
go
insert into steps (steptime) values (current_timestamp)
go
backup log DiffRestore to disk='c:\DiffRestore\DiffRestore.trn'
go

So we have a simple database, but with a more complicated backup structure. We have, in this order:

  1. Full Backup
  2. Transaction Log Backup
  3. Full Backup
  4. Transaction Log Backup
  5. Differential Backup
  6. Transaction Log Backup
  7. Differential Backup
  8. Transaction Log Backup

So to restore this database to its latest state we need to restore Backup 3, then Backup 7 (differential backups are cumulative, so we only need to restore the latest one) and then any following transaction log backups. In this case that's just Backup 8.

In this case we'll extract the backups we require, and build a new object to loop through for the actual restore. We'll assume that the functions we created on Day 14 and Day 15 have been stored in the file SQLBackupFunctions.ps1:

Import-Module sqlps -DisableNameChecking
."c:\PSScripts\SQLBackupFunctions.ps1"

$backupfiles = @()
$backups = @()

$backupfiles += Get-SQLBackupFiles -FilePath c:\DiffRestore -SQLServer "Server1"

foreach($file in $BackupFiles){
    $backups +=Get-BackupContents -FilePath $file -SQLServer "Server1"
}

$RestoreFiles = @()

$RestoreFiles += $Backups | Where-Object {$_.BackupTypeDescription -eq "Database"} | Sort-Object LastLSN | Select-Object -Last 1
$RestoreFiles += $Backups | Where-Object {$_.BackupTypeDescription -eq "Database Differential"} | Sort-Object LastLSN | Select-Object -Last 1
$tmpLSN = $RestoreFiles | Measure-Object -Property LastLSN -Maximum
$RestoreFiles += $Backups | Where-Object {($_.LastLSN -ge $tmpLSN.Maximum) -and ($_.BackupTypeDescription -eq "Transaction Log")}

$RestoreServer = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("RestoreServer")
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
foreach ($backup in $RestoreFiles | sort-object LastLSN){
    $restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
    $restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($backup.filepath,$devicetype)
    $restore.devices.add($restoredevice)
    $restore.FileNumber = $backup.position
    $restore.Database = $backup.DatabaseName
    if ($backup.LastLSN -eq ($RestoreFiles | Measure-Object -Property LastLSN -Maximum).Maximum){
        $restore.NoRecovery = $False
    }else{
        $restore.NoRecovery = $True
    }
    $restore.SqlRestore($RestoreServer)

    $restore.Devices.Remove($restoredevice)
}

We do our normal scan of the filesystem to find our backup files and read their headers into our $Backups array. Then, as we know we want to restore to the latest point, we can pick the Full backup and the Differential backup with the highest LastLSN. We then get all Transaction Log backups with a LastLSN greater than the Differential backup's. All of these are added to an array $RestoreFiles, which we then step through, restoring each file and populating the required values from our array.

So far we've only looked at restoring onto servers where we're assuming that our database doesn't already exist. Tomorrow we'll look at how we can easily redirect all the files in a restored database if needed.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 16 of 31 Days of SQL Server Backup and Restore using PowerShell: A more robust transaction restore

Over the last 2 days we've looked at getting backup information from 2 sources:

  • reading the headers of the backup files on disk (Day 14)
  • querying the backup history tables in msdb (Day 15)

and loading it into a PowerShell object. Today we're going to put that object to use, showing how we can make sure we're restoring files in the right order without having to rely on filenames or modified dates.

To setup we need a database and some backups. And we’ll make those backups a little complicated just to show off everything.

create database RobustRestore;
go

use robustrestore
go

create table steps(
StepID int identity(1,1),
StepTime datetime
)
go
backup database RobustRestore to disk='c:\robust\robustrestore.bak'
go
insert into steps (steptime) values (current_timestamp)
go
backup log RobustRestore to disk='c:\robust\robustrestore.trn'
go
insert into steps (steptime) values (current_timestamp)
go
backup log RobustRestore to disk='c:\robust\robustrestore2.trn'
go
insert into steps (steptime) values (current_timestamp)
go
backup log RobustRestore to disk='c:\robust\robustrestore.trn'
go
insert into steps (steptime) values (current_timestamp)
go
backup log RobustRestore to disk='c:\robust\robustrestore2.trn'
go

We now have a backup chain, and our transaction logs are interleaved across 2 files. Not something you’d want to see in production, but suitably complex for a simple demonstration.

First we use our disk scraping technique to load our object with all the backup information from the files. The function definitions are here – Day 14 of 31 Days of SQL Server Backup and Restore using PowerShell: Reading Backup information into a custom object

$backupfiles = @()
$backups = @()

$backupfiles += Get-SQLBackupFiles -FilePath c:\robust -SQLServer "Server1"

foreach($file in $BackupFiles){
    $backups +=Get-BackupContents -FilePath $file -SQLServer "Server1"
}

Assuming you get no errors, we now have a $backups object with 5 entries, 1 for each of the backups. So how are we going to use this to run a restore, and make sure we get everything in the right order? Well, let's take a look at one of the entries:

PS C:\> $backups.item(1)

FilePath               : C:\robust\robustrestore.trn
BackupName             :
BackupDescription      :
BackupType             : 2
ExpirationDate         :
Compressed             : 0
Position               : 1
DeviceType             : 2
UserName               : Server1\Administrator
ServerName             : Server1
DatabaseName           : RobustRestore
DatabaseVersion        : 706
DatabaseCreationDate   : 18/09/2013 09:14:01
BackupSize             : 77824
FirstLSN               : 40000000007600178
LastLSN                : 40000000016300001
CheckpointLSN          : 40000000007600178
DatabaseBackupLSN      : 40000000007600178
BackupStartDate        : 18/09/2013 09:14:02
BackupFinishDate       : 18/09/2013 09:14:02
SortOrder              : 0
CodePage               : 0
UnicodeLocaleId        : 1033
UnicodeComparisonStyle : 196609
CompatibilityLevel     : 110
SoftwareVendorId       : 4608
SoftwareVersionMajor   : 11
SoftwareVersionMinor   : 0
SoftwareVersionBuild   : 2100
MachineName            : Server1
Flags                  : 512
BindingID              : 815b85e5-d051-4afd-a192-bb9fb2f7302c
RecoveryForkID         : df243f5b-a451-427e-8622-ecc7996a390a
Collation              : Latin1_General_CI_AS
FamilyGUID             : df243f5b-a451-427e-8622-ecc7996a390a
HasBulkLoggedData      : False
IsSnapshot             : False
IsReadOnly             : False
IsSingleUser           : False
HasBackupChecksums     : False
IsDamaged              : False
BeginsLogChain         : True
HasIncompleteMetaData  : False
IsForceOffline         : False
IsCopyOnly             : False
FirstRecoveryForkID    : df243f5b-a451-427e-8622-ecc7996a390a
ForkPointLSN           :
RecoveryModel          : FULL
DifferentialBaseLSN    :
DifferentialBaseGUID   :
BackupTypeDescription  : Transaction Log
BackupSetGUID          : 2d5edca4-7ed2-473d-b2ca-124f9cca8784
CompressedBackupSize   : 77824
Containment            : 0

So we can see that we have all the information we could want in there to build up our restores. We can use BackupType or BackupTypeDescription to determine what sort of restore (database or log) we need, FilePath tells us where the file is, Position tells us where in that file the particular backup sits, and DatabaseName lets us confirm that we're restoring a backup from the correct database.

But how do we ensure we're restoring the backups in the right order? We have 2 options here: times or LSNs. We could order by BackupStartDate or BackupFinishDate, and most of the time this would be absolutely fine. But if you have backups that take a particularly long time to complete, you can end up with backups happening within the window of other backups.

A more reliable method is to use Log Sequence Numbers (LSNs). An LSN is a strictly increasing number that SQL Server assigns to every transaction, and it is written into the transaction log when the transaction is committed. LSNs are not contiguous and are of no use for arithmetic (i.e. subtracting FirstLSN from LastLSN for a transaction log backup won't tell you how many transactions the backup contains). What they do guarantee is that if a_LSN is greater than b_LSN, then a_LSN was committed to the transaction log AFTER b_LSN. This makes them ideal for working out the order in which to restore our files. For more information on LSNs, please see TechNet for an Introduction to Log Sequence Numbers.
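The ordering guarantee is easy to demonstrate without touching SQL Server at all. In this sketch the LSN values and file names are invented purely for illustration; sorting on LastLSN recovers the correct restore sequence regardless of how misleadingly the files are named:

```powershell
# Hypothetical backup entries with deliberately misleading file names;
# the LSN values are made up for this demonstration.
$demo = @(
    [pscustomobject]@{FilePath='z_oldest.bak'; LastLSN=40000000007600178; BackupTypeDescription='Database'}
    [pscustomobject]@{FilePath='a_newest.trn'; LastLSN=40000000016300001; BackupTypeDescription='Transaction Log'}
    [pscustomobject]@{FilePath='m_middle.trn'; LastLSN=40000000012000001; BackupTypeDescription='Transaction Log'}
)

# Sorting on LastLSN gives the correct restore order, whatever the names suggest
$demo | Sort-Object LastLSN | Select-Object FilePath, BackupTypeDescription
```

The same Sort-Object call is all the real restore loop below relies on.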

So knowing that, we can now look at our restore script:

$RestoreServer = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("RestoreServer")
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
foreach ($backup in $backups | sort-object LastLSN){
    $restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
    $restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($backup.filepath,$devicetype)
    $restore.devices.add($restoredevice)
    $restore.FileNumber = $backup.position
    $restore.Database = $backup.DatabaseName
    if ($backup.LastLSN -eq ($backups | Measure-Object -Property LastLSN -Maximum).Maximum){
        $restore.NoRecovery = $False
    }else{
        $restore.NoRecovery = $True
    }
    $restore.SqlRestore($RestoreServer)

    $restore.Devices.Remove($restoredevice)
}

As you can see, we can now dynamically populate everything from our $Backups object, except for the SQL Server we want to restore to.

In that example we were assuming that the folder we'd scanned held backups from only one database, and indeed just one SQL Server. By altering the filtering in the foreach we can ensure we only see the details we want to see. For example, if you wanted to restore the backup of database "dbprod" from SQL Server "DBServer1" then you'd do the following:

$RestoreServer = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("RestoreServer")
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$FilteredBackups = @($backups | Where-Object {($_.DatabaseName -eq "dbprod") -and ($_.ServerName -eq "DBServer1")} | Sort-Object LastLSN)
foreach ($backup in $FilteredBackups){
    $restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
    $restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($backup.filepath,$devicetype)
    $restore.devices.add($restoredevice)
    $restore.FileNumber = $backup.position
    $restore.Database = $backup.DatabaseName
    # Only the backup with the highest LastLSN in the filtered set is restored with recovery
    if ($backup.LastLSN -eq $FilteredBackups[-1].LastLSN){
        $restore.NoRecovery = $False
    }else{
        $restore.NoRecovery = $True
    }
    $restore.SqlRestore($RestoreServer)

    $restore.Devices.Remove($restoredevice)
}

From that you can see that we can really leverage PowerShell's built-in cmdlets to extract just the restore information we need from a whole range of backups.

Tomorrow we’ll look at how we can work with multiple days worth of data, and also bring Differential backups into the picture.


Day 15 of 31 Days of SQL Server Backup and Restore using PowerShell: Getting Backup information from SQL Server

Yesterday we looked at scraping backup information from SQL Server backup files on disk. There is another source of information available: every backup in SQL Server is also recorded within tables in msdb. For the simple examples we're looking at so far, we just need to consider 2 of them. These are:

  • msdb.dbo.backupset – Holds the information about every backup
  • msdb.dbo.backupmediafamily – Holds the location of the physical backup devices (in our case, the backup files)

There are a number of other tables that also contain information that is needed when dealing with File or Filegroup backups, or with backups that have been spanned across multiple media sets.

The only issue with using this data, as opposed to the data 'scraped' from disk, is that most people manage their backup files outside of SQL Server, so there's no guarantee that the file is where SQL Server left it. It could have been archived, compressed or just plain deleted. This is the case even if you're using the 'Maintenance Cleanup Task' in a maintenance plan to delete your old backups: deleting the files doesn't remove the history. Running the 'History Cleanup Task' will prune your backup history down to your specifications (SQL Server won't do this unless told to, and I have seen msdb databases that have ballooned to a ridiculous size with years of backup records!). Because of this you may want to filter the data you retrieve from SQL Server by your retention policy, and also check that any file you reference does in fact exist, to avoid problems.
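If you do want to keep msdb trimmed in line with your retention policy without a maintenance plan, the system procedure msdb.dbo.sp_delete_backuphistory does the same job as the History Cleanup Task and can be driven from PowerShell. A minimal sketch (the server name and the 90-day retention period are placeholders to adjust for your environment):

```powershell
Import-Module sqlps -DisableNameChecking

# Prune backup history older than 90 days (adjust to your retention policy)
$oldest = (Get-Date).AddDays(-90).ToString('yyyyMMdd')
Invoke-Sqlcmd -ServerInstance "Server1" -Database msdb `
    -Query "exec dbo.sp_delete_backuphistory @oldest_date = '$oldest'"
```

Run regularly, this keeps the backupset and backupmediafamily tables at a sensible size for the queries below.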

The data returned from ReadBackupHeader and from msdb.dbo.backupset is very similar, but with differences in naming convention, and the single-character codes in the backupset type column need a CASE statement to translate them into the BackupType and BackupTypeDescription values that ReadBackupHeader returns. The basic query we'll be running is:

select
 a.backup_finish_date as BackupFinishDate,
 a.backup_set_uuid as BackupSetGUID,
 a.backup_size as BackupSize,
 a.backup_start_date as BackupStartDate,
 a.begins_log_chain as BeginsLogChain,
 a.checkpoint_lsn as CheckpointLSN,
 a.collation_name as Collation,
 a.compatibility_level as CompatibilityLevel,
 a.compressed_backup_size as CompressedBackupSize,
 a.database_backup_lsn as DatabaseBackupLSN,
 a.database_creation_date as DatabaseCreationDate,
 a.database_name as DatabaseName,
 a.database_version as DatabaseVersion,
 a.description as BackupDescription,
 a.differential_base_guid as DifferentialBaseGUID,
 a.differential_base_lsn as DifferentialBaseLSN,
 a.expiration_date as ExpirationDate,
 a.family_guid as FamilyGUID,
 a.first_lsn as FirstLSN,
 a.first_recovery_fork_guid as FirstRecoveryForkID,
 a.flags as Flags,
 a.fork_point_lsn as ForkPointLSN,
 a.has_backup_checksums as HasBackupChecksums,
 a.has_bulk_logged_data as HasBulkLoggedData,
 a.has_incomplete_metadata as HasIncompleteMetaData,
 a.is_copy_only as IsCopyOnly,
 a.is_damaged as IsDamaged,
 a.is_force_offline as IsForceOffline,
 a.is_password_protected,
 a.is_readonly as IsReadOnly,
 a.is_single_user as IsSingleUser,
 a.is_snapshot as IsSnapshot,
 a.last_family_number,
 a.last_lsn as LastLSN,
 a.last_media_number,
 a.machine_name as MachineName,
 a.name as BackupName,
 a.position as Position,
 a.recovery_model as RecoveryModel,
 a.server_name as ServerName,
 a.software_build_version as SoftwareVersionBuild,
 a.software_major_version as SoftwareVersionMajor,
 a.software_minor_version as SoftwareVersionMinor,
 a.sort_order as SortOrder,
 a.unicode_compare_style as UnicodeComparisonStyle,
 a.unicode_locale as UnicodeLocaleId,
 a.user_name as UserName,
 case a.type
	when 'D' then 'Database'
	when 'I' then 'Database Differential'
	when 'L' then 'Transaction Log'
	when 'F' then 'File or filegroup'
	when 'G' then 'Differential file'
	when 'P' then 'Partial'
	when 'Q' then 'Differential partial'
 end as BackupTypeDescription,
 case [type]
	when 'D' then 1
	when 'I' then 5
	when 'L' then 2
	when 'F' then 4
	when 'G' then 6
	when 'P' then 7
	when 'Q' then 8
 end as BackupType,
 b.physical_device_name as 'FilePath'
 from
    msdb.dbo.backupset a inner join msdb.dbo.backupmediafamily b
        on a.media_set_id=b.media_set_id

To fill our data from SQL Server we’ll create 2 functions:

  1. Get-SQLServerBackups
  2. Test-SQLServerBackupExistence

The first will retrieve a set of backup information from a specified SQL Server instance, optionally filtering the data by a specific date or by only asking for the last x days.

Our 2nd function will take in an instance of our $Backups object, and will step through it testing whether each file exists, removing those that don't.

So with our first function:

function Get-SQLServerBackups
{param([string]$SQLServer,[int]$ExcludeDaysOlder=0,[string]$BackupNewerThan='01/01/1900')
push-location
import-module "SQLPS" -DisableNameChecking
pop-location

$sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($SQLServer)

$sql="select a.backup_finish_date as BackupFinishDate,
 a.backup_set_uuid as BackupSetGUID,
 a.backup_size as BackupSize,
 a.backup_start_date as BackupStartDate,
 a.begins_log_chain as BeginsLogChain,
 a.checkpoint_lsn as CheckpointLSN,
 a.collation_name as Collation,
 a.compatibility_level as CompatibilityLevel,
 a.compressed_backup_size as CompressedBackupSize,
 a.database_backup_lsn as DatabaseBackupLSN,
 a.database_creation_date as DatabaseCreationDate,
 a.database_name as DatabaseName,
 a.database_version as DatabaseVersion,
 a.description as BackupDescription,
 a.differential_base_guid as DifferentialBaseGUID,
 a.differential_base_lsn as DifferentialBaseLSN,
 a.expiration_date as ExpirationDate,
 a.family_guid as FamilyGUID,
 a.first_lsn as FirstLSN,
 a.first_recovery_fork_guid as FirstRecoveryForkID,
 a.flags as Flags,
 a.fork_point_lsn as ForkPointLSN,
 a.has_backup_checksums as HasBackupChecksums,
 a.has_bulk_logged_data as HasBulkLoggedData,
 a.has_incomplete_metadata as HasIncompleteMetaData,
 a.is_copy_only as IsCopyOnly,
 a.is_damaged as IsDamaged,
 a.is_force_offline as IsForceOffline,
 a.is_password_protected,
 a.is_readonly as IsReadOnly,
 a.is_single_user as IsSingleUser,
 a.is_snapshot as IsSnapshot,
 a.last_family_number,
 a.last_lsn as LastLSN,
 a.last_media_number,
 a.machine_name as MachineName,
 a.name as BackupName,
 a.position as Position,
 a.recovery_model as RecoveryModel,
 a.server_name as ServerName,
 a.software_build_version as SoftwareVersionBuild,
 a.software_major_version as SoftwareVersionMajor,
 a.software_minor_version as SoftwareVersionMinor,
 a.sort_order as SortOrder,
 a.unicode_compare_style as UnicodeComparisonStyle,
 a.unicode_locale as UnicodeLocaleId,
 a.user_name as UserName,
 case a.type
	when 'D' then 'Database'
	when 'I' then 'Database Differential'
	when 'L' then 'Transaction Log'
	when 'F' then 'File or filegroup'
	when 'G' then 'Differential file'
	when 'P' then 'Partial'
	when 'Q' then 'Differential partial'
	end as BackupTypeDescription,
	case [type]
	when 'D' then 1
	when 'I' then 5
	when 'L' then 2
	when 'F' then 4
	when 'G' then 6
	when 'P' then 7
	when 'Q' then 8
	end as BackupType,
	b.physical_device_name as 'FilePath'
	 from msdb.dbo.backupset a inner join msdb.dbo.backupmediafamily b on a.media_set_id=b.media_set_id
where 1=1"

if ($ExcludeDaysOlder -ne 0){
    $sql = $sql+" and a.backup_start_date>dateadd(dd,-"+$ExcludeDaysOlder.tostring()+",current_timestamp)"
}

if ($BackupNewerThan -gt '01/01/1900'){
    $sql = $sql+" and a.backup_start_date>'"+$BackupNewerThan+"'"
}

$dt = Invoke-Sqlcmd -ServerInstance $SQLServer -Query $sql

return $dt
}

This is a very simple function; it just looks huge due to the SQL statement in $sql. If the parameters are supplied then we append the appropriate WHERE filter. The query is then sent to the SQL Server using the Invoke-Sqlcmd cmdlet, which returns the data in a DataTable that we then return.

And then our second function:

function Test-SQLServerBackupExistence
{param ($BackupArray)

Set-location c:\

foreach ($row in $BackupArray.filepath | sort -Unique){
    if ((Test-Path $row) -ne $true){
        $BackupArray = $BackupArray | Where-Object {$_.FilePath -ne $row}
     }
}
return $BackupArray
}

This takes in an array as a parameter. By piping the $BackupArray.FilePath property through Sort -Unique we only need to check each file once. If it doesn't exist then we remove all 'rows' with that FilePath from the array. We then return what remains of the array.

The one drawback with this method is that if the same file is being used repeatedly for backups, then all those backups will stay in the array, as we are only checking the existence of the file, not what it actually contains.
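If reused files are a concern, one way to tighten the check is to re-read the headers of each surviving file and keep only the msdb rows whose BackupSetGUID still appears in that file. This is only a sketch: it reuses the Get-BackupContents function from Day 14 (assumed to be dot-sourced already), "Server1" is a placeholder instance, and it trades accuracy for the cost of reading every header:

```powershell
# Assumes the Day 14 functions have been loaded, e.g.:
# . "c:\PSScripts\SQLBackupFunctions.ps1"

$verified = @()
foreach ($path in ($Backups.FilePath | Sort-Object -Unique)){
    # GUIDs of the backup sets actually present in the file right now
    $onDisk = (Get-BackupContents -FilePath $path -SQLServer "Server1").BackupSetGUID
    # Keep only the recorded backups that are still inside the file
    $verified += $Backups | Where-Object {
        ($_.FilePath -eq $path) -and ($onDisk -contains $_.BackupSetGUID)
    }
}
```

Because every header has to be read, this is best reserved for files you already suspect of being overwritten.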

To load the data we’d call the functions as follows:

$Backups = Get-SQLServerBackups -SQLServer Server1 -ExcludeDaysOlder 7
$Backups = Test-SQLServerBackupExistence $Backups

As we've ensured that we've kept the same structure for the data returned from the disk scrape and the SQL query, we can combine them easily like so:

$Backups = Get-SQLServerBackups -SQLServer Server1 -ExcludeDaysOlder 7
$Backups = Test-SQLServerBackupExistence $Backups
$BackupFiles = @()
$BackupFiles += Get-SQLBackupFiles -FilePath c:\psbackups2 -SQLServer "Server2"
foreach($File in $BackupFiles){
    $Backups += Get-BackupContents -FilePath $File -SQLServer "Server2"
}

As we're using a standard PowerShell object we can also use standard PowerShell cmdlets to save it to disk, and then reload it at a later date. This means we can scrape disks overnight during periods of low usage and then load the results up during the day. Or we can update our records from SQL Server on a daily basis to speed the process up.

To save the information:

$Backups | Export-CliXML c:\backup-store.xml

To load the information:

$Backups = Import-CliXML c:\backup-store.xml

Tomorrow we'll be looking at how we can use this new object to ensure we restore the right files in the right order, in a more robust manner than just relying on file names and last modified dates.



Day 14 of 31 Days of SQL Server Backup and Restore using PowerShell: Reading Backup information into a custom object

Yesterday we looked at finding SQL Server Backup files and then examining their contents. Today we’re going to look at how we can best record this information for later use in our scripts.

We’re going to create 2 functions:

  1. Scan a given folder structure for SQL Server Backup files, and record them in a global variable
  2. Read a given file and return the SQL Server Backups contained in it

Function 1 will be Get-SQLBackupFiles:

function Get-SQLBackupFiles
{param ([string]$FilePath, [string]$SQLServer)
    import-module "SQLPS" -DisableNameChecking
    $tmpout = @()
    $sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($SQLServer)
    $restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
    $devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

    $files = Get-ChildItem $FilePath

    foreach ($file in $files){
        $restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($file.FullName,$devicetype)
        $restore.Devices.add($restoredevice) | Out-Null
        $errcnt=0
        try{
            $restore.ReadMediaHeader($sqlsvr) | Out-Null
        }
        catch [System.Exception]{
             $errcnt =1
        }
        finally {
            if ($errcnt -ne 1){
                $tmpout += $file.FullName
            }
            $errcnt = 0
        }
        $restore.Devices.remove($restoredevice) | out-null
        Remove-Variable restoredevice
    }
    return $tmpout
 }

This is based on the scriptlet we looked at yesterday. We pass in the folder we want to search, and the SQL Server instance we want to run the query on. Remember that the SQL Server service on that server will need to have read access to the file area you're scanning. The function returns an array of all the files that have the correct headers. Out-Null is appended to many of the lines to prevent the output from those commands coming through in the return output.

By having this as a discrete function, we could also make use of the Start-Job asynchronous method we looked at in Day 8 (Asynchronous Backups – Part 2 – PowerShell Jobs) if we wanted to scan lots of files, by splitting the areas into logical divisions and then scanning them in parallel across a number of SQL instances.
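As a sketch of that idea (the folder-to-instance mapping and the path to the functions file are placeholders for your environment), each area can be scanned in its own background job and the results merged at the end:

```powershell
# Map each folder to the SQL instance that will read it (placeholder values)
$folders = @{ 'c:\psbackups' = 'Server1'; 'c:\psbackups2' = 'Server2' }

$jobs = foreach ($folder in $folders.Keys){
    Start-Job -ArgumentList $folder, $folders[$folder] -ScriptBlock {
        param($FilePath, $SQLServer)
        # Each job runs in its own session, so it loads the shared functions itself
        . "c:\PSScripts\SQLBackupFunctions.ps1"
        Get-SQLBackupFiles -FilePath $FilePath -SQLServer $SQLServer
    }
}

# Wait for all the scans to finish and merge the results into one array
$BackupFiles = $jobs | Wait-Job | Receive-Job
```

The merged $BackupFiles array can then be fed to Get-BackupContents exactly as in the sequential version.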

Our 2nd function is Get-BackupContents:

function Get-BackupContents
{param ([string]$FilePath,[string]$SQLServer)
    import-module "SQLPS" -DisableNameChecking

    $sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($SQLServer)
    $restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
    $devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

    $restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($FilePath,$devicetype)
    $restore.Devices.add($restoredevice) | Out-Null
    $Temp = $Restore.ReadBackupHeader($sqlsvr)
    $Temp | Add-Member -MemberType NoteProperty -Name FilePath -value $FilePath
    $restore.Devices.Remove($restoredevice) | Out-Null
    return $temp
}

Here we pass in a filepath to the backup file, and a SQL Server instance to use to read the backup. We store the output in a temporary object, add the filepath to the object, and then return it.

So we can call these like so:

$BackupFiles = @()
$Backups = @()

$BackupFiles += Get-SQLBackupFiles -FilePath c:\psbackups -SQLServer "Server1"
$BackupFiles += Get-SQLBackupFiles -FilePath c:\psbackups2 -SQLServer "Server2"

foreach($File in $BackupFiles){
    $Backups += Get-BackupContents -FilePath $File -SQLServer "Server1"
}

So we now have a PowerShell object $Backups which contains all of our backup information. This means we can now use PowerShell’s standard cmdlets to pull information back. For example, this code snippet:

$backups | Where-Object {($_.DatabaseName -eq "psdb1") -and ($_.BackupTypeDescription -eq "Database")} | Sort-Object BackupStartDate -Descending | Format-List -Property DatabaseName, FilePath, BackupTypeDescription, BackupStartDate, BackupFinishDate

will find all of the Full Database backups of psdb1 and return them in descending date order, letting us simply extract the information we're interested in.

Tomorrow we’ll look at getting the same information from the original SQL Server that performed the backups and putting it into the same object. We’ll also cover writing this information back to a SQL Server, so you can reuse the information without having to scan the disks every time.

Then on Thursday (Day 16) we'll look at how we can use this object to ensure we restore backups in the correct order, and also start working towards being able to perform a Point In Time restore to any random point in time for which we have backups available.


Day 13 of 31 Days of SQL Server Backup and Restore using PowerShell: Reading the contents of a Backup File

Yesterday we saw how easy it is to restore a SQL Server database from backups with PowerShell if we have a folder that contains:

  • Only SQL backup files
  • All files are in chronological/LSN order
  • The oldest file is the Full backup
  • There are no differential or partial backups in the mix
  • Only one database’s backups
  • Each backup file only contains 1 backup

Unfortunately we don't always get that. So we need to be able to find out what we have and work out the best way to restore our database. Over the next 2 days I'm going to introduce how we can read all the files we have, extract the information we need and then store it in a custom PowerShell object so we can manipulate it for ourselves. We will then look at how we can populate this object if we're lucky enough to also have access to the original server.

Today we’ll look at reading the contents of the backup files and finding out what’s inside.

Let's start with just working out which files in a folder are actually of interest to us, taking the worst case scenario: a folder full of files, some of which may be SQL Server backups of one kind or another, but someone hasn't put file extensions on, or has been very lax with the ones they have. Like so:

PS C:\psbackups> Get-ChildItem

    Directory: C:\psbackups

Mode          LastWriteTime     Length Name
----          -------------     ------ -----

-a---  14/09/2013     07:46          0 file1
-a---  14/09/2013     07:46    5605888 file2

PS C:\psbackups>

OK, in this example you can probably deduce which one is not a backup file! But if you had folders full of these, how would you work out which ones were of interest? You'd use the ReadMediaHeader method of your $Restore object to quickly try to read the header that SQL Server writes at the start of all its backup files. This is a very quick operation, even on huge backup files, as the header is right at the start of the file. Unfortunately, ReadMediaHeader doesn't return a nice "No, this isn't a SQL Server backup" message, so we have to wrap it in a Try, Catch, Finally block and handle it ourselves:

import-module "SQLPS" -DisableNameChecking

$sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("WIN-C0BP65U3D4G")
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

$files = gci C:\psbackups

$BackupFiles = @()

foreach ($file in $files){
    $restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($file.FullName,$devicetype)

    $restore.Devices.add($restoredevice)
    $errcnt=0
    try{
        $restore.ReadMediaHeader($sqlsvr)
    }
    catch [System.Exception]{
        write-output "$file is not a sql backup file `n"
        $errcnt =1
    }
    finally {
        if ($errcnt -ne 1){
            write-output "$file is a sql backup file `n"
            $BackupFiles += $file
        }
        $errcnt = 0
    }
    $restore.Devices.remove($restoredevice)
    Remove-Variable restoredevice
}

We add the positive matches to a $BackupFiles array so we can refer to them later.

So, what can you do if someone’s running their backups like this:

backup database psdb1 to disk='c:\psbackups\psbd1_backup.bak'
backup log psdb1 to disk='c:\psbackups\psbd1_backup.bak'
backup database psdb1 to disk='c:\psbackups\psbd1_backup.bak'
backup log psdb1 to disk='c:\psbackups\psbd1_backup.bak'

I have seen something similar out in the wild; they had a BIG backup file for quite a small database, and making sure you restored the version you wanted took a lot of looking up. In this case we use PowerShell's implementation of RESTORE HEADERONLY, ReadBackupHeader:

Import-Module "SQLPS" -DisableNameChecking

$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("WIN-C0BP65U3D4G")
$Restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

$file = gci c:\psbackups\psbd1_backup.bak

$RestoreDevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($file.FullName,$DeviceType)
$Restore.Devices.Add($RestoreDevice)
$Restore.ReadBackupHeader($SQLSvr) | Format-List -Property DatabaseName, BackupStartDate, FirstLSN, BackupTypeDescription, Position
$Restore.Devices.Remove($RestoreDevice)

Here we pipe the output from ReadBackupHeader through Format-List to extract just the values we want:

DatabaseName          : psdb1
BackupStartDate       : 16/09/2013 07:18:37
FirstLSN              : 40000000049700037
BackupTypeDescription : Database
Position              : 1

DatabaseName          : psdb1
BackupStartDate       : 16/09/2013 07:18:37
FirstLSN              : 40000000049000001
BackupTypeDescription : Transaction Log
Position              : 2

DatabaseName          : psdb1
BackupStartDate       : 16/09/2013 07:18:37
FirstLSN              : 41000000004700037
BackupTypeDescription : Database
Position              : 3

DatabaseName          : psdb1
BackupStartDate       : 16/09/2013 07:18:38
FirstLSN              : 41000000002600001
BackupTypeDescription : Transaction Log
Position              : 4

ReadBackupHeader returns the same information as RESTORE HEADERONLY does.

Now that we have the position of each backup within the file, we can restore the one we want by using the FileNumber property:

Import-Module "SQLPS" -DisableNameChecking

$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("WIN-C0BP65U3D4G")
$Restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

$file = gci c:\psbackups\psbd1_backup.bak

$RestoreDevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($file.FullName,$DeviceType)
$Restore.Devices.Add($RestoreDevice)
$Restore.Database = "psdb1"
$Restore.FileNumber = 3
$Restore.SqlRestore($SQLSvr)
$Restore.Devices.Remove($RestoreDevice)

This will restore the most recent full backup from our example.
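Rather than hard-coding FileNumber = 3, you can derive the position from ReadBackupHeader. A sketch, assuming the $Restore object still has the backup device attached (i.e. run before the Devices.Remove call in the script above):

```powershell
# Find the position of the most recent full backup within the file.
# ReadBackupHeader returns a DataTable, so we work through its Rows collection.
$Headers = $Restore.ReadBackupHeader($SQLSvr)
$LatestFull = $Headers.Rows |
    Where-Object {$_.BackupTypeDescription -eq "Database"} |
    Sort-Object FirstLSN |
    Select-Object -Last 1
$Restore.FileNumber = $LatestFull.Position
```

This picks the "Database" row with the highest FirstLSN, which in our example output is Position 3.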

So now we can find valid SQL Server backups on a given filesystem, and then find out what’s contained within them. Tomorrow we’ll look at taking this information, and putting it into a custom object so we can collate the results from multiple files and build up our restore plan.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 12 of 31 Days of SQL Server Backup and Restore using PowerShell: Simple Transaction Restore

Yesterday we looked at the simplest possible example of a SQL Server restore: restoring a single backup file. Now we’re going to get a little bit smarter and also restore transaction logs.

So the scenario is that you have a folder containing your full database backup and the subsequent transaction logs, like so:

PS C:\> gci c:\psbackups

    Directory: C:\psbackups

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---        13/09/2013     11:23    2648576 psbd1_backup_201309131100.bak
-a---        13/09/2013     11:23    1800704 psbd1_backup_201309131105.trn
-a---        13/09/2013     11:23      90624 psbd1_backup_201309131110.trn
-a---        13/09/2013     11:23      90624 psbd1_backup_201309131115.trn

PS C:\>

So how do we do this? Well, exactly how we’d do it in T-SQL: restore the full backup with NoRecovery, then restore all the transaction log backups in order, also with NoRecovery, and then recover the database after the last one has been restored.

Normally that means writing lots of lines of T-SQL (it’s always just before the next full backup, isn’t it, and you have 92 transaction logs to restore!). Thanks to PowerShell’s objects we can do better than that.

So to the script:

Import-Module SQLPS -DisableNameChecking

$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server -ArgumentList ("Server1")
$Restore = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Restore
$DeviceType =[Microsoft.SqlServer.Management.Smo.DeviceType]::File

$Restore.Database='psrestore'
$Restore.NoRecovery = $TRUE

$Files = Get-ChildItem C:\psbackups\
$i=1

foreach ($File in $Files){
	if($i -eq $Files.count){
	    $Restore.NoRecovery= $FALSE
	}
	$BackupDevice = New-Object Microsoft.SqlServer.Management.Smo.BackupDeviceItem -ArgumentList $file.FullName, $DeviceType
	$Restore.Devices.Add($BackupDevice)
	$Restore.SQLRestore($SQLSvr)
	$Restore.Devices.Remove($BackupDevice)
	remove-variable BackupDevice
	$i++
}

First we do our normal setup, loading the SQLPS module and creating some basic objects. Next we set the name of the database we’re restoring, and tell our $Restore object that we don’t want to recover the database at the end of the restore (the default is to recover it).

We use the Get-ChildItem cmdlet to load the contents of our backup directory into an object called $Files, and create a counter variable $i and initialise it to 1.

We now step through each $File in our $Files object. We use the $Files.count property and $i to work out when we are restoring the last file in the list, and when we are we set $Restore.NoRecovery to $FALSE, thereby recovering the database after the restore.

We restore the file and increment the counter variable by 1, and then go through the loop again if we aren’t on the last file.
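One caveat: the script trusts Get-ChildItem’s default alphabetical ordering, which only matches restore order if your file names sort chronologically (as the timestamped names above do). If they don’t, you could sort on the files’ timestamps instead — a sketch, assuming LastWriteTime reflects when each backup finished:

```powershell
# Sort by write time so the full backup comes first and the logs follow in order
$Files = Get-ChildItem C:\psbackups\ | Sort-Object LastWriteTime
```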

We can do the same with the Restore-SqlDatabase cmdlet, introduced with SQL Server 2012:

Import-Module SQLPS -DisableNameChecking

$ServerName = "Server1"

$RestoreDatabase='psrestore'
$Files = Get-ChildItem C:\psbackups\
$i=1

foreach ($File in $Files){
    if($i -eq $Files.count){
        # Last file: restore with recovery to bring the database online
        Restore-SqlDatabase -ServerInstance $ServerName -Database $RestoreDatabase -BackupFile $File.FullName
    }
    else {
        Restore-SqlDatabase -ServerInstance $ServerName -Database $RestoreDatabase -BackupFile $File.FullName -NoRecovery
    }
    $i++
}

When we use this cmdlet we can’t flip the NoRecovery property as we did with SMO, as it’s a switch rather than a property we can set. So we use the same logic to spot when we’re restoring the last file, and for that file we run the cmdlet without the -NoRecovery switch so the database is recovered.
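If you’d rather not maintain a counter, you can key off the last element of $Files instead. A sketch, assuming the folder holds more than one file so $Files is an array:

```powershell
foreach ($File in $Files){
    # Pass -NoRecovery for every file except the last one
    $NoRec = ($File -ne $Files[-1])
    Restore-SqlDatabase -ServerInstance $ServerName -Database $RestoreDatabase `
        -BackupFile $File.FullName -NoRecovery:$NoRec
}
```

The -NoRecovery:$NoRec syntax lets you pass an explicit boolean to a switch parameter.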

This is fairly straightforward, and simple to write if you know that you only have the files for one database in a folder, and that they’re in the correct chronological/LSN order with the full backup being the oldest. But what do you do if you just have a folder full of random files from multiple databases from multiple days, with a mix of full, differential and transaction log backups? Well, that’s what we’re going to explore over the next couple of days. Tomorrow we’ll begin by looking at how to look inside the backup files.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 11 of 31 Days of SQL Server Backup and Restore using PowerShell: Simple Database Restore

As we’ve seen in the past 10 days, performing SQL Server backups with PowerShell is pretty straightforward. Now we’re going to move on to restores. This is where PowerShell really starts to shine: SQL Server doesn’t offer any real options for automating database restores for checking or verification. You can schedule specific pieces of T-SQL, but that only applies to specific databases, or you end up tying yourself in knots with dynamic SQL.

As we did with backups, we’ll start off with the simplest form of restore to introduce the basic concepts we’ll be building on. Here we’re just going to take a full database backup where we know which database it’s from, and restore and recover it.

So here are the scripts; first off, the full SMO version:

import-module "SQLPS" -DisableNameChecking

$sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("Server1")
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

$backupname = "C:\psbackups\psrestore.bak"

$restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($backupname,$devicetype)

$restore.Database = "psrestore"
$restore.ReplaceDatabase = $True
$restore.Devices.add($restoredevice)
$restore.sqlrestore($sqlsvr)

As you can see, this is pretty much identical to the basic PowerShell backup script from Day 1. The only real differences are that we have a $restore object of type Microsoft.SqlServer.Management.Smo.Restore and we’re calling its SqlRestore method; all the device creation methods are exactly the same.

And here’s the simple form of the Restore-SQLDatabase cmdlet:

Restore-SqlDatabase -ServerInstance "Server1" -Database "psrestore" -BackupFile "C:\psbackups\psrestore.bak"

Which is even simpler.
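One difference from the SMO script: that version set ReplaceDatabase to $True. The cmdlet equivalent is the -ReplaceDatabase switch, which you’ll need if the database already exists on the instance:

```powershell
# Overwrite an existing psrestore database, matching the SMO script's behaviour
Restore-SqlDatabase -ServerInstance "Server1" -Database "psrestore" -BackupFile "C:\psbackups\psrestore.bak" -ReplaceDatabase
```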

So far, so simple. But as we’ll explore over the next few days, this gets a lot more complicated when you try to make it flexible, able to cope with complex restores, and robust enough to ensure a properly recovered database.

The next step towards that is tomorrow’s post on restoring transaction logs.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 10 of 31 Days of SQL Server Backup and Restore using PowerShell: Automating with Task Scheduler

Yesterday we looked at scheduling PowerShell scripts with SQL Server Agent, but what do you do if you’re running SQL Server Express and don’t have that luxury?

Well, it turns out that it’s pretty easy to schedule PowerShell tasks with the Windows Task Scheduler.

Images in this post are from Task Scheduler on Windows 2012, but they are pretty much identical back to Windows 2008.

Task Scheduler offers a nice “Create Basic Task Wizard”, which quickly runs you through the pertinent options from the full task creation process, but unfortunately this doesn’t let you set a more complicated schedule than once a day:

basic-task-wizard

To do that you need to create a full scheduled task. Stepping through the process isn’t too complicated, but there are a few gotchas to watch out for.

schedule-properties

The first pane you’ll come to is the general properties one. Here you can provide a title for your task, and change the user under whose security context the job will run. You’ll need to tick the box “Run whether user is logged on or not”, otherwise it won’t run when the user is logged off. If you run it as any user other than the one creating the task, then any time you modify the task you’ll be prompted to provide that user’s credentials:

schedule-password

This is to prevent someone modifying a job to abuse any elevated privileges it runs under.

The next step along is ‘Triggers’:

Schedule-multiple-triggers

Here we can build up the conditions under which our scheduled task will fire. As shown in the above screenshot, you can set multiple triggers for the same task. For instance, you might want to run transaction log backups every 15 minutes during 9-5 production hours, but only every 2 hours outside of that window.

You build or edit a trigger with the following screen:

schedule-triggers

You can set pretty much any schedule you want here. Don’t feel constrained by the values in the drop down boxes, you can type your own values if you want (for instance every 7 minutes).

Once we have our triggers we can define the actions that we want to take when they trigger:

schedule-action

Just like with Trigger you can have multiple Actions that will all fire on a trigger:

schedule-action-detail

Note that the Action we take is to start PowerShell, not to run our script. This is because .ps1 PowerShell scripts are not themselves executable, but need to be passed to PowerShell to run. So PowerShell is our program; here I’m relying on the system %PATH% variable to find it, but if you want to ensure the correct exe is started you can provide the full path to it. As PowerShell fires up, we pass our script in as an argument. As long as all file references in your scripts are fully qualified you probably don’t need to populate the “Start In” box, but if you are loading custom modules or other sub-scripts then it may be easier to start the script in a specific location.
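Put together, those Action fields amount to running something like the line below (the script path is just an example). The -NoProfile and -ExecutionPolicy flags are worth considering: the first skips profile scripts the task account may not have, and the second saves the task from failing on a restricted execution policy.

```powershell
# Program: powershell.exe; Arguments: everything after it
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\backup_all_dbs.ps1"
```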

Now, as this is a series on PowerShell, it’d be a bit remiss not to show you how to create a new scheduled task using PowerShell itself. The bad news is that this is only implemented in Windows 2012 and won’t be back-ported:

$action = New-ScheduledTaskAction -Execute PowerShell.exe -Argument "C:\scripts\backup_all_dbs.ps1"
$trigger = New-ScheduledTaskTrigger -once -at 00:01 -RepetitionInterval (New-TimeSpan -Minutes 30) -RepetitionDuration (new-timespan -Days 1)
Register-ScheduledTask -TaskName "SQL Backup" -TaskPath "\" -Action $action -Trigger $trigger -user "Domain\User" -Password "p@55w0rd"

In earlier versions you have to use schtasks. schtasks has a large and flexible syntax which is beyond the scope of this article, but it is well documented here with plenty of examples. For completeness, here is the schtasks command line for the above PowerShell:

schtasks /create /sc minute /mo 30 /tn "SQL Backup" /tr "powershell.exe -c C:\scripts\backup_all_dbs.ps1" /ru Domain\User

When you run this, schtasks will prompt you for the user’s password.

Tomorrow we’ll begin to look at using PowerShell to restore SQL Server databases.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 9 of 31 Days of SQL Server Backup and Restore using PowerShell: Automating with SQL Server Agent

So far in this series we’ve looked at a variety of ways you can use PowerShell to run your SQL Server backups, but we’ve not touched on how you can automate them. That is going to be the topic of the next two posts. Today we’ll look at automating the scripts with SQL Server Agent, and tomorrow we’ll look at doing it with Task Scheduler.

SQL Agent has been able to schedule PowerShell jobs since 2008, and the basic method hasn’t changed. Screenshots here are from SQL Server 2012:

PowerShell is just another step type in SQL Server Agent that you can pick from the dropdown:

New job step for PowerShell SQL Agent task

And then you can enter your PowerShell into the box provided. The Open button allows you to browse for existing files, but it only copies the file’s current contents into the box:

New-job-step-powershell

Here I’ve used the basic looped database backup script from Day 2. One crucial point to spot here is that by default the script will be run as the “SQL Server Agent Service Account”. Now, if you’re following SQL Server best practice, this account won’t have much access to your local disks or to network shares, which is likely to cause your backup script to fail.

The way around this is to create a SQL Server Agent proxy for the PowerShell subsystem, and associate it with a credential that does have access to your backup locations, which keeps you in line with best security practice.
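As a rough sketch of that setup (all account names, proxy names and the password here are hypothetical; you’d run this once as an administrator): create a credential for an account that can reach the backup location, then create a proxy tied to it and grant it the PowerShell subsystem:

```powershell
# Hypothetical names and secret throughout - adjust to your environment
Invoke-Sqlcmd -ServerInstance "Server1" -Query @"
CREATE CREDENTIAL BackupCredential
    WITH IDENTITY = 'Domain\BackupUser', SECRET = 'p@55w0rd';
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'PowerShellBackupProxy',
    @credential_name = N'BackupCredential',
    @enabled = 1;
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'PowerShellBackupProxy',
    @subsystem_name = N'PowerShell';
"@
```

You’d then pick the proxy from the “Run as” dropdown in the job step.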

The other problem is that, so far, we have incorporated very little checking and error reporting into our scripts. This means that if an error occurs you are likely to get back less than useful error messages. Looking through the SQL Server errorlog can often provide more details. I’ll be covering checking and error reporting in more detail towards the end of this series.

If you’d rather run from saved .ps1 PowerShell scripts then you’ll need to use the “Operating system (CmdExec)” step type:

new-job-step-cmdexec

You can then enter the call to your script into the Command box. Note that we are actually calling PowerShell.exe and then passing our script in as a parameter. You can’t call a PowerShell script directly.

new-job-step-cmdexec-powershell2

The same security restrictions apply to this step as to the PowerShell one, so you may have to create a second SQL Server Agent proxy or grant the CmdExec subsystem to an existing proxy.

Now you’ve created your job, you can schedule it like any other SQL Server job.

So this is all great if you’ve got SQL Server Agent, but what if you’re using SQL Server Express and don’t have SQL Agent to schedule your backups? Well, we’ll cover scheduling PowerShell with Windows Task Scheduler tomorrow.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.
