No matter how hard the dbatools team tries, there's always someone who wants to do something we'd never thought of. This is one of the great things about getting feedback directly from a great community. Unfortunately, a lot of these ideas are either too niche to implement, or would need a lot of complex code for a single use case.

As part of the Restore-DbaDatabase stack rewrite, I wanted to make it easier for users to get their hands dirty within the Restore stack. Not necessarily by diving into the core code and the world of GitHub Pull Requests, but by manipulating the data flowing through the pipeline using standard PowerShell techniques, all the while letting our code do the heavy lifting.

So, below the fold we'll be looking at some examples of how you can start going to town with your restores.

Improving header scan performance

One of the most common requests has been for ways of speeding up reading the headers from the backup files. Unfortunately we have to do this so we can be sure of what they contain, and there aren't any shortcuts (I've looked into it in some depth before).

So we can now offer a couple more options to do this:

Spread the load over time.

Using Get-DbaBackupInformation it's now possible to scan the headers ahead of time, and then just add to them in small batches going forward. So you could scan every 3 hours, and if you needed a more recent file, you'd only have to scan the files written since the last scan:

#First Scan:
Get-DbaBackupInformation -SqlInstance Server1\Instance1 -Path c:\backups -ExportPath c:\BackupInformation.xml
#Some time later:
$BackupInformation = Get-DbaBackupInformation -Import -Path c:\BackupInformation.xml
$LastScanTime = (Get-Item c:\BackupInformation.xml).LastWriteTime
$BackupInformation += Get-ChildItem c:\backups | Where-Object {$_.LastWriteTime -ge $LastScanTime} | Get-DbaBackupInformation -SqlInstance Server1\Instance1
$BackupInformation | Export-CliXml -Path C:\BackupInformation.xml
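Once you've got that scanned information saved, using it is just a case of piping it back in to the restore. As a rough sketch (the target instance here is a placeholder for your own):

#Restore from the merged scan results; -TrustDbBackupHistory tells the restore the files have already been scanned
$BackupInformation = Get-DbaBackupInformation -Import -Path c:\BackupInformation.xml
$BackupInformation | Restore-DbaDatabase -SqlInstance Server1\Instance1 -TrustDbBackupHistory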

Run parallel jobs on a single server

Another option is to run multiple scans at the same time. For this example I'm using the PoshRSJob module as it's one I use a bit, but any other runspace/jobs option would work just as well (there's a rough Start-Job sketch after the example below).

In the first example we scan multiple directories on the same SQL Server instance. We pass an array of folders into Start-RsJob and scan each folder in parallel as a separate job.

Import-Module poshrsjob
$folders = ('C:\dbatools\RestoreTimeClean','C:\dbatools\RestoreTimeDiffDemo','c:\dbatools\restoretimeDiff')
$job = $folders | Start-RsJob -ModulesToImport dbatools -ScriptBlock {
    param($Folder)
    Get-DbaBackupInformation -SqlInstance localhost\SqlExpress2016 -Path $Folder
}
$job | Wait-RsJob -ShowProgress
$BackupInformation = $job | Receive-RsJob
Get-RsJob | Remove-RSJob
$BackupInformation | Restore-DbaDatabase -SqlInstance Server1\Instance -TrustDbBackupHistory
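If you'd rather not pull in an extra module, the same idea can be roughed out with PowerShell's built-in background jobs. This is only a sketch using the same placeholder instance and folders as above; bear in mind that Start-Job runs each job in a separate process, so the results come back as deserialized copies of the objects:

#Scan each folder in its own background job, then collect the results
$folders = ('C:\dbatools\RestoreTimeClean','C:\dbatools\RestoreTimeDiffDemo','c:\dbatools\restoretimeDiff')
$jobs = ForEach ($folder in $folders){
    Start-Job -ArgumentList $folder -ScriptBlock {
        param($Folder)
        Import-Module dbatools
        Get-DbaBackupInformation -SqlInstance localhost\SqlExpress2016 -Path $Folder
    }
}
$BackupInformation = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job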

Run parallel jobs on multiple servers

Perhaps you want to spread the load even more? Then run the scans across multiple SQL instances. This example uses a simple allocation routine that just 'cross joins' the options:

Import-Module PoshRsJob
$servers = ('localhost\sqlexpress2016','localhost\developer2016')
$folders = ('\\localhost\dbatools$\RestoreTimeClean','\\localhost\dbatools$\RestoreTimeDiffDemo','\\localhost\dbatools$\restoretimeDiff')
#Round-robin the folders across the instances
$InputObject = @()
$Counter = 0
ForEach ($folder in $folders){
    $InputObject += [PSCustomObject]@{
        ServerInstance = $servers[$Counter % $servers.Count]
        Folder         = $folder
    }
    $Counter++
}
$job = $InputObject | Start-RsJob -ModulesToImport dbatools -Name {"test"} -ScriptBlock {
    param($Object)
    Get-DbaBackupInformation -SqlInstance $Object.ServerInstance -Path $Object.Folder
}
$job | Wait-RsJob -ShowProgress
$BackupInformation = $job | Receive-RsJob
Get-RsJob | Remove-RSJob
$BackupInformation | Restore-DbaDatabase -SqlInstance Server1\Instance -TrustDbBackupHistory

Custom Log and Data Folders for each database

Perhaps you want to restore a number of databases, and you want separate Data and Log folders for each database, e.g.:

db1 - c:\folder\data\db1 and c:\folder\log\db1
db2 - c:\folder\data\db2 and c:\folder\log\db2

By dipping into the restore pipeline we can loop through the databases contained within the backup information and apply custom formatting to each database:

$Backups = Get-DbaBackupInformation -sqlinstance localhost\sqlexpress2016 -Path C:\dbatools\RestoreTimeClean
$Backups += Get-DbaBackupInformation -sqlinstance localhost\sqlexpress2016 -Path C:\dbatools\RestoreTimeDiffStripe
$BackupSelection = $Backups | Select-DbaBackupInformation
$BackupSelection | Format-DbaBackupInformation
$basedir = "C:\some\path\of\yourchoosing\"
$BackupSelection | ForEach-Object {
    ForEach ($file in $_.FileList){
        $t = [System.IO.FileInfo]$file.PhysicalName
        #Map the backup file type to a folder suffix
        $type = switch ($file.Type){'D'{'Data'} 'L'{'Log'}}
        #Rewrite the physical name to <basedir>\<Database>_<Data|Log>\<original file name>
        $file.PhysicalName = [IO.Path]::Combine($basedir, $_.Database + "_" + $type, $t.Name)
    }
}
$BackupSelection | Test-DbaBackupInformation -SqlInstance localhost\developer2016
$BackupSelection | Invoke-DbaAdvancedRestore -sqlinstance localhost\developer2016

Creating multiple environments

Perhaps you need to refresh multiple environments at the same time. Using Get-DbaBackupInformation to create a backup history, you can use the same scanned files multiple times, saving you from scanning them repeatedly:

$BackupInformation = Get-DbaBackupInformation -sqlinstance localhost\sqlexpress2016 -Path C:\dbatools\RestoreTimeDiffDemo
$Environments = @('Dev','Test','Stage')
ForEach ($Environment in $Environments){
    $BackupInformation | Restore-DbaDatabase -sqlinstance localhost\sqlexpress2016 -DestinationFilePrefix $Environment -RestoredDatabaseNamePrefix ("$($Environment)_") -TrustDbBackupHistory
}

Creating different points in time for comparison

Perhaps you're trying to find out when an issue occurred. This example restores the same database to multiple points in time so you can compare the state between them.

Get-DbaBackupInformation -sqlinstance localhost\sqlexpress2016 -Path C:\dbatools\RestoreTimeDiffDemo -ExportPath C:\temp\bk.xml
$BackupSelection = New-Object System.Collections.ArrayList
$startDate = Get-Date '24/11/2017 09:22'
$BaseDirectory = 'c:\restores\'
for ($i = 0; $i -lt 35; $i += 7){
    $BackupInfo = Get-DbaBackupInformation -Import -Path c:\temp\bk.xml
    $BackupInfo = $BackupInfo | Select-DbaBackupInformation -RestoreTime $startDate.AddMinutes($i)
    $DbName = $BackupInfo[0].Database
    #Restore each point in time to its own folder and database name, suffixed with the restore time
    $RestorePath = [IO.Path]::Combine($BaseDirectory, $DbName, $startDate.AddMinutes($i).ToString('hhmmddMMyyyy'))
    $NewDbName = $DbName + "_" + $startDate.AddMinutes($i).ToString('hhmmddMMyyyy')
    $null = $BackupInfo | Format-DbaBackupInformation -DataFileDirectory $RestorePath -ReplaceDatabaseName $NewDbName
    $null = $BackupSelection.Add($BackupInfo)
}
$null = $BackupSelection | Test-DbaBackupInformation -SqlInstance localhost\sqlexpress2016 -Withreplace
$RestoreOutput = $BackupSelection | Invoke-DbaAdvancedRestore -SqlInstance localhost\sqlexpress2016 -WithReplace
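Once they're all restored, comparing the copies is just a case of running the same query against each one in turn. The table and query below are purely hypothetical stand-ins for whatever you're actually investigating; the database names are rebuilt the same way the loop above names them:

#Run the same (hypothetical) query against every restored copy to see where the data changes
$DbName = (Get-DbaBackupInformation -Import -Path c:\temp\bk.xml)[0].Database
for ($i = 0; $i -lt 35; $i += 7){
    $NewDbName = $DbName + "_" + $startDate.AddMinutes($i).ToString('hhmmddMMyyyy')
    Invoke-Sqlcmd2 -ServerInstance localhost\sqlexpress2016 -Database $NewDbName -Query "select count(*) as StepCount from steps"
}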

Rolling forward looking for data

Building on the one above, perhaps you'd rather something else did the checking for you? With this script we gradually roll a database forward a minute at a time, and use a SQL query (quite a simple one here) to indicate when the process should stop.

Get-DbaBackupInformation -sqlinstance localhost\sqlexpress2016 -Path C:\dbatools\RestoreTimeDiffDemo -ExportPath C:\temp\bk.xml
$BackupSelection = New-Object System.Collections.ArrayList
$startDate = Get-Date '24/11/2017 09:22'
$BaseDirectory = 'c:\restores\'
$Continue = $true
$LoopCount = 0
$NewDbName = 'Creeping'
$RestoreResults = @()
While ($Continue -eq $true){
    $BackupInfo = Get-DbaBackupInformation -Import -Path c:\temp\bk.xml
    $BackupInfo = $BackupInfo | Select-DbaBackupInformation -RestoreTime $startDate.AddMinutes($LoopCount)
    $null = $BackupInfo | Format-DbaBackupInformation -DataFileDirectory $BaseDirectory -ReplaceDatabaseName $NewDbName
    #Restore to standby so the database is readable between steps, rolling forward a minute at a time
    $RestoreResults += $BackupInfo | Invoke-DbaAdvancedRestore -SqlInstance localhost\sqlexpress2016 -StandbyDirectory $BaseDirectory -Continue
    #Stop rolling forward once the data we're looking for has appeared
    $results = Invoke-Sqlcmd2 -ServerInstance localhost\sqlexpress2016 -Query "select * from steps where step='20'" -Database $NewDbName
    if ($null -ne $results){
        $Continue = $false
    }
    $LoopCount++
}

Complex Rebasing of backups

When people use Get-DbaBackupHistory to get the pre-created backup history directly from SQL Server, a very common request is for a way of changing the recorded paths from where the backup files were originally written to where they are now. One thing that catches a lot of people out is that the backup history object stores the backup file paths as an array, which lets us easily cope with striped backup sets where the backup consists of multiple files. The simplest way to rewrite the paths is a nested ForEach loop:

$BackupHistory = Get-DbaBackupHistory -SqlInstance localhost\sqlexpress2016 -Database RestoreTimeStripe
#FullName is an array of paths, so rewrite each entry individually
$BackupHistory | ForEach-Object {
    $_.FullName = $_.FullName | ForEach-Object {
        ($_ -replace 'c:\\dbatools\\RestoreTimeStripe\\stripe1','\\new\stripeA') -replace 'c:\\dbatools\\RestoreTimeStripe\\stripe2', '\\old\StripeB'
    }
}

With this method you can manipulate the strings to your heart's content using any standard PowerShell technique.
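For example, if you have a lot of locations to move, you could drive the same rewrite from a lookup table of old and new paths. This is just a sketch; the paths are placeholders, and the keys are regular expressions just like in the -replace example above:

#Hypothetical lookup of old backup locations to their new homes (keys are regex patterns)
$PathMap = @{
    'c:\\dbatools\\RestoreTimeStripe\\stripe1' = '\\new\stripeA'
    'c:\\dbatools\\RestoreTimeStripe\\stripe2' = '\\old\StripeB'
}
$BackupHistory | ForEach-Object {
    $_.FullName = $_.FullName | ForEach-Object {
        $Path = $_
        ForEach ($OldPath in $PathMap.Keys){
            $Path = $Path -replace $OldPath, $PathMap[$OldPath]
        }
        $Path
    }
}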

I'm sure there are plenty more things out there that people want to do with their restores. If you've got something you want to do and don't know how to do it, then please drop a comment below, ping me on Twitter (@napalmgram) or come over to our Slack channel for a chat.