Complex SQL Server restore scenarios with the dbatools Restore-DbaDatabase pipeline

No matter how hard the dbatools team try, there’s always someone who wants to do things we’d never thought of. This is one of the great things about getting feedback direct from a great community. Unfortunately, a lot of these ideas are either too niche to implement, or would require a lot of complex code for a single use case.

As part of the Restore-DbaDatabase stack rewrite, I wanted to make it easier for users to get their hands dirty within the restore stack. Not necessarily needing to dive into the core code and the world of GitHub Pull Requests, but by manipulating the data flowing through the pipeline using standard PowerShell techniques, all the while letting our code do the heavy lifting.

So, below the fold we’ll be looking at some examples of how you can start going to town with your restores.


Deep copy arrays in PowerShell

Just a quick post for something I had to deal with for another post I’m writing. I wanted to do some work where I’d reuse a base array over a number of passes in a loop, and didn’t want any changes made in one pass impacting on later iterations. If you’ve tried copying an array containing arrays or other objects in PowerShell in the usual manner, then you’ll have come across this problem:

[Image: copying an array of arrays fails in PowerShell]

As is common in .Net, the copy is ‘Copy by Reference’, so you don’t get a nice shiny new independent array to play with. All that’s copied is the references to the original’s place in memory. Therefore any changes to either object affect both, as both variable names are looking at the same piece of memory. This is nice and efficient in terms of storage and speed of copying, but not great for my purposes.
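If the screenshot above isn’t handy, here’s a minimal repro of the problem (any small array of arrays will do):

$arr1 = @( @(1,2,3), @(4,5,6) )
$arr2 = $arr1.Clone()   # shallow copy: the inner arrays are still shared
$arr2[0][0] = 99
$arr1[0][0]             # returns 99 - changing the 'copy' changed the original too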

There are various workarounds kicking around if you’re using simple arrays, but they tend to break down when you’ve got arrays that contain arrays or other PowerShell objects. My method for copying them is a little down and dirty, but it works 95% of the time for what I want. The trick is to serialize the object, and then deserialize it into the new one:

[Image: deep copying a PowerShell array using serialization]

And voila, we have the outcome I wanted. Just to make the line of code easier to read, here it is:

$arr3 = [Management.Automation.PSSerializer]::DeSerialize([Management.Automation.PSSerializer]::Serialize($arr2))

Now, I mentioned up front that this works ~95% of the time. The times it doesn’t work for me are when the underlying object type doesn’t serialize nicely. The most common one I come across is BigInt. This ‘deserializes’ back in as a non-integer type, and then won’t play nice when compared to ‘real’ BigInt values, so make sure to check you have the values you think you should.
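As a quick check you can run (the exact type you get back can vary by PowerShell version, so treat this as a sketch):

$big = [System.Numerics.BigInteger]::Parse('90123456789012345678901234567890')
$copy = [Management.Automation.PSSerializer]::DeSerialize([Management.Automation.PSSerializer]::Serialize($big))
$big.GetType().FullName    # System.Numerics.BigInteger
$copy.GetType().FullName   # may no longer be BigInteger after the round trip
$copy -eq $big             # confirm the value survived before trusting the copy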

Debugging the new dbatools Restore-DbaDatabase pipeline

A new version of the dbatools Restore-DbaDatabase command was released into the wild this week. One of the main aims of this release was to make it easier to debug failures in the restore process, and to drag information out of the pipeline easily (and anonymously) so we can increase our Pestering of the module with unit and integration tests.

So I’d like to share some of the features I’ve put in so you can take part.

The biggest change is that Restore-DbaDatabase is now a wrapper around 5 public functions:

  • Get-DbaBackupInformation
  • Select-DbaBackupInformation
  • Format-DbaBackupInformation
  • Test-DbaBackupInformation
  • Invoke-DbaAdvancedRestore

These can be used individually for advanced restore scenarios; I’ll go through some examples in a later post.

For now it’s enough to know that Restore-DbaDatabase is a wrapper around this pipeline:

Get-DbaBackupInformation | Select-DbaBackupInformation | Format-DbaBackupInformation | Test-DbaBackupInformation | Invoke-DbaAdvancedRestore

and its other job is passing parameters into these sub functions as needed.

With previous versions of Restore-DbaDatabase you were restricted to throwing data into one end and seeing what came out of the other, with some insight provided by Verbose messages. Now things can be stepped through, and data extracted as needed, in a format that plugs straight into our testing functions.


Get-DbaBackupInformation

This is the function that gathers all of the information about the backup files. It scans the given paths, and uses Read-DbaBackupHeader to extract the information from them. This is stored in a dbatools BackupHistory object (the same format as the output from Get-DbaBackupHistory, so we are standardising on a format for backup information to be passed between functions).

So this would be a good place to check that you’ve got the files you think you should have, and it’s also the first place we’d look if you reported a break in the LSN chain.

To get the output from the pipeline at this point we use the GetBackupInformation parameter:

Restore-DbaDatabase [Usual Parameters] -GetBackupInformation gbi

This will create a globally scoped variable $gbi containing the output from Get-DbaBackupInformation. Note that when passing the name to Restore-DbaDatabase you do not need to specify the $.

If you want to stop execution at this point, then use the -StopAfterGetBackupInformation switch. This will stop Restore-DbaDatabase from going any further.
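For example (the instance and path names here are made up, but the parameters are as described above):

Restore-DbaDatabase -SqlInstance server1\prod -Path \\backupserver\backups -GetBackupInformation gbi -StopAfterGetBackupInformation
$gbi | Select-Object Database, Type, Start, LastLsn   # sanity check what was found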

This is also a good way of saving time on future runs, as the BackupHistory object can be passed straight in, saving the overhead of reading all the file headers again:

$gbi | Restore-DbaDatabase [Usual Parameters] -TrustDbBackupHistory


Select-DbaBackupInformation

Here we filter down the output from Get-DbaBackupInformation to restore to the point in time requested, or to the latest point we can. This means we find:

  • the last full backup before the point in time
  • the latest differential backup between that full backup and the point in time
  • and then all the transaction log backups needed to get us to the requested time

This is done for every database found in the BackupHistory object.

Here is where we’d begin looking for issues if you had a ‘good’ LSN chain from Get-DbaBackupInformation and then it broke.

To get this data you use the SelectBackupInformation parameter, passing in the name of the variable you want to store the data in (without the $, as per GetBackupInformation above).

There is also a corresponding StopAfterSelectBackupInformation switch to halt processing at this point. We stop processing at the first stop point in the pipeline, so specifying multiple StopAfter* switches won’t have any additional effect.
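So a point in time debug run might look something like this (server name, path and time are hypothetical):

Restore-DbaDatabase -SqlInstance server1\prod -Path \\backupserver\backups -RestoreTime (Get-Date '2017-07-07 18:22:05') -SelectBackupInformation sbi -StopAfterSelectBackupInformation
$sbi | Select-Object Database, Type, FirstLsn, LastLsn   # inspect the chain that was selected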


Format-DbaBackupInformation

This function performs the transforms on the BackupHistory object per the parameters passed in. This includes renaming databases, and file moves and renames. For everything we touch, we add an extra property prefixed with Original to the BackupHistory object. For example, the original name of the database will be in OriginalDatabase, and the target name will be in Database.

So this is a good spot to test why transforms aren’t working as expected.

To get data out at this pipeline stage, use the FormatBackupInformation parameter with a variable name. As normal, it has an accompanying StopAfterFormatBackupInformation switch to halt things there.
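For example, to check that a rename and file move are being transformed as you expect (names and paths here are made up):

Restore-DbaDatabase -SqlInstance server1\prod -Path \\backupserver\backups -DatabaseName NewDbName -DestinationDataDirectory c:\restores -FormatBackupInformation fbi -StopAfterFormatBackupInformation
$fbi | Select-Object OriginalDatabase, Database, OriginalFullName, FullName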


Test-DbaBackupInformation

Before passing the BackupHistory object off to be restored, we run some checks to make sure everything is OK. The following checks are made:

  • LSN chain complete
  • Does a destination file already exist? If it’s owned by a different database, then fail
  • Does a destination file already exist? If it’s owned by the database being restored, is WithReplace specified
  • Can SQL Server see and write to all the destination folders
  • Can SQL Server create any destination folders missing
  • Can SQL Server see all the backup files

If a database passes all these checks then its backup history is marked as restorable by setting the IsVerified property to $True.

To get the data stream out at this point use the TestBackupInformation parameter.
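A quick way to see which databases failed the checks, using -OutputScriptOnly so nothing is actually restored while you’re debugging (names here are hypothetical):

Restore-DbaDatabase -SqlInstance server1\prod -Path \\backupserver\backups -OutputScriptOnly -TestBackupInformation tbi | Out-Null
$tbi | Where-Object {$_.IsVerified -ne $true} | Select-Object Database, IsVerified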

General Errors with restores

Once we’re past these stages, then our error reporting is at the mercy of the SMO Restore class. This doesn’t always provide an obvious cause straight away. Usually the main error can be found with:

$error[1] | Select-Object *

We like to think we capture most restore failure scenarios nicely, but if you find something we don’t, then please let us know, either on Slack or by raising a Github issue.

As usual, the dbatools terminating error will be in $error[0].

Providing the information for testing or debugging

If you’re running in to problems then the dbatools team may ask you to provide the output from one of these stages so we can debug it, or incorporate the information into our tests.

Of course you won’t want to share confidential information with us, so we would recommend anonymising your data. My normal way of doing this is with a couple of stubbing functions:
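The stubbing functions themselves were embedded alongside the post; as a rough illustration, here’s a minimal sketch of what such a stub might look like. The property list comes from further down this post, but the hashing approach is an assumption rather than the actual dbatools implementation:

function Filter-DbaToolsHelpRequest {
    param(
        [Parameter(Position = 0, ValueFromPipeline)]
        [object[]]$BackupHistory
    )
    begin {
        # String properties to anonymise; list-valued properties like FileList
        # would need per-element treatment in a real implementation
        $propsToStub = 'ComputerName', 'InstanceName', 'SqlInstance', 'Database',
                       'UserName', 'Path', 'FullName', 'OriginalDatabase',
                       'OriginalFullName', 'ReplaceDatabaseName'
        $sha = [System.Security.Cryptography.SHA256]::Create()
    }
    process {
        foreach ($history in $BackupHistory) {
            foreach ($prop in $propsToStub) {
                if ($null -ne $history.$prop) {
                    # Hashing means the same input always produces the same stub
                    $bytes = $sha.ComputeHash([Text.Encoding]::UTF8.GetBytes("$($history.$prop)"))
                    $history.$prop = ([BitConverter]::ToString($bytes) -replace '-').Substring(0, 16)
                }
            }
        }
    }
}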

So if we’ve asked for the Select-DbaBackupInformation the process would be:

Restore-DbaDatabase [Normal parameters] -SelectBackupInformation sbi -StopAfterSelectBackupInformation
Filter-DbaToolsHelpRequest $sbi
$sbi | Export-CliXml -Depth 10 -Path c:\some\path\file.xml

And then upload the resulting xml file.

This method will anonymise the values in ComputerName, InstanceName, SqlInstance, Database, UserName, Path, FullName, FileList, OriginalDatabase, OriginalFileList, OriginalFullName and ReplaceDatabaseName, but will produce the same output for the same input, so we can still work with multiple database sets at once.

I hope that’s been of some help. As always if you’ve a question then drop a comment below, ping me on twitter (@napalmgram) or raise an issue with dbatools on Slack or Github


SQL Relay Nottingham 2017

Once again I’m very happy to announce that the travelling SQL Server conference SQL Relay will be visiting Nottingham on Tuesday 10th October 2017. We’ll once again be offering a whole day of free training around the Microsoft Data Platform, covering topics like

  • SQL Server
  • Azure
  • Machine learning
  • Data Science
  • Azure Data Lake
  • PowerBI

and much more. Our last visits to Nottingham have gone down very well with great feedback from attendees.

For more information on sessions, and to register for this free event (we even provide lunch) please head here – Nottingham SQL Relay registration

And if you’re based in Nottingham and interested in SQL Server, then you may want to check out the Nottingham SQL Server user group as well; we’d love to welcome you along to future meetings.

A new addition to UK events – PSDay UK

The UK has a great range of techie events these days, but it’s always good to see another one starting up. PSDay.Uk is the first in an international series of one day PowerShell events.

The idea is to provide a local set of events that complement the 3 big international PowerShell conferences (PSConf.EU, PSConf.Asia and the PowerShell Global Summit). This makes it easier for local attendees to get to events (no airfare, possibly no hotels, much easier to get agreement from employers), gives an opportunity for local speakers, but also provides a framework to let the organisers tap into the international speakers.

My good friend Rob has written more on the idea, and on the formation of the organising body of which he’s a member, over on his blog here – What’s a PSDay

The initial UK event is happening on Friday 22nd September 2017 at CodeSkills in London. Full details and tickets are available on the website, and more information is available on Twitter (@PSDayUK) and Facebook.

I’m completely chuffed at being picked to speak at this event as well. I’ll be presenting a new session titled “DevOps by the back door, via the Helpdesk”, where I’ll be sharing examples and ideas of how you can start your DevOps journey with small bite sized actions which will start to show your employer (and colleagues) that this stuff really does work.

Hope to see lots of people there. This sounds like a great project, so make sure you can say you were at the first one in years to come!

Return debugging information from PowerShell functions via variables

You’re writing a series of PowerShell functions that are to be used in a pipeline (pseudo code):

$Output = Get-Data | Where-Data 

and you want to grab the output of each command without interfering with the pipeline throughput? Say you want to see what Get-Data is getting and passing into Where-Data.

Well, we’re going to look at implementing something close to the built in ErrorVariable and WarningVariable parameters I’ve written about before. We’ll start with this pipeline:

$output = Get-Data -Path 'C:\dbatools\standby' | Where-Object {$_.Name -like 'Restore-DbaDatabase.ps1'} 

where we’ll have the following Get-Data function:

function Get-Data {
    param($Path)
    Process {
        Get-ChildItem -Path $Path
    }
}

So nothing too fancy, really just a wrapper around Get-ChildItem, but it gets me a decent data set that I can then filter. Now, if this function was also doing some filtering of the data and we weren’t seeing the output we wanted in $output, then it would be nice to grab the data before it’s passed on. To do this we’ll take a variable name as a parameter and then assign that within the function using Set-Variable:

function Get-Data {
    param($Path, $ReturnVar)
    Process {
        $Data = Get-ChildItem -Path $Path
        if ($ReturnVar) {
            Set-Variable -Name $ReturnVar -Value $Data -Scope Global
        }
        # pass the data on down the pipeline as normal
        $Data
    }
}

$output = Get-Data -Path 'C:\dbatools\standby' -ReturnVar ReturnVar | Where-Object {$_.Name -eq 'PointInTime20170707182205.bak'}

and we’ll get the single filtered file in $output, while the global variable $ReturnVar holds everything Get-Data found before the Where-Object filter.

Things to note in the function:

  • We use Set-Variable as this copes better if the variable already exists. Be careful though, as you will clobber the existing variable.
  • Wrap Set-Variable in an if block, or it’ll throw when passed an empty variable name. (If you wanted to be really solid, you’d put some validation around the parameter to make sure you only take in legal variable names.)
  • We set the scope of the variable to Global to make sure we can see it outside of our function or module.

If you have a large function with lots of points where you’d like the option to return data from various points for future debugging, then you can always have multiple return variable parameters and populate them as you need them.
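A sketch of how that might look, extending the Get-Data example above (the extra parameter names are just for illustration):

function Get-Data {
    param($Path, $RawVar, $FilteredVar)
    Process {
        $raw = Get-ChildItem -Path $Path
        if ($RawVar) { Set-Variable -Name $RawVar -Value $raw -Scope Global }
        $filtered = $raw | Where-Object {$_.Extension -eq '.ps1'}
        if ($FilteredVar) { Set-Variable -Name $FilteredVar -Value $filtered -Scope Global }
        # pass the filtered data on down the pipeline as normal
        $filtered
    }
}

Get-Data -Path 'C:\dbatools\standby' -RawVar rawFiles -FilteredVar ps1Files | Out-Null
$rawFiles.Count    # everything found
$ps1Files.Count    # just the .ps1 files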

Creating a Time Series database for SQL Server Restore testing

As a prelude to a post describing some new features of the dbatools Restore-DbaDatabase function, I thought I’d just go through how I create a time series database, to give me something to test restoring to a point in time with minimal overhead.

This sort of database can also be really handy for your own training if you want to play with point in time restores, or explore what can be done with standby restores.


I want to end up with:

  • A database – basic unit of restore
  • Single table – Let’s keep this small and simple
  • Table contains a series of rows written at a set time interval

So the table should end up like this:

3    08/06/2017 09:46:45
4    08/06/2017 09:47:15

We should end up with a nice spread of times so we can easily say ‘Restore to this point in time’, and then be able to check we’ve actually gotten there.
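As a taster, here’s a minimal sketch of the sort of thing involved, assuming a local instance and the Invoke-Sqlcmd cmdlet (the database and table names are just examples):

Invoke-Sqlcmd -ServerInstance localhost -Query "CREATE DATABASE RestoreTimeTest"
Invoke-Sqlcmd -ServerInstance localhost -Database RestoreTimeTest -Query "CREATE TABLE dbo.steps (StepID int IDENTITY(1,1), StepTime datetime2 DEFAULT SYSDATETIME())"
1..20 | ForEach-Object {
    Invoke-Sqlcmd -ServerInstance localhost -Database RestoreTimeTest -Query "INSERT INTO dbo.steps DEFAULT VALUES"
    Start-Sleep -Seconds 30   # one row every 30 seconds gives a nice spread of times
}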


PowerShell, using ErrorVariable and WarningVariable

If you’re looking to record the error and warning outputs from your PowerShell commands whilst keeping your console free of nasty red text, then the ErrorVariable and WarningVariable parameters are just what you’re looking for.

Let’s take a look at a quick simple example:

Get-ChildItem -Path C:\DoesNotExist\
$ErrVar = 'ErrorVariable'
Get-ChildItem -Path C:\DoesNotExist\ -ErrorVariable ErrVar

Which, assuming you don’t have C:\DoesNotExist\ on your workstation, will throw a couple of ‘cannot find path’ errors. From this quick example a couple of things stand out:

  • You pass the name for the error variable in without the $
  • If you reuse a variable name it’ll get clobbered, so be careful (or read on)
  • The error is still output to the screen

The last one is not what we want. So one more little tweak will make that disappear:

Get-ChildItem -Path C:\DoesNotExist\ -ErrorVariable ErrVar -ErrorAction SilentlyContinue

That’s much better, no more red text! -ErrorAction SilentlyContinue does exactly what it says on the tin.

WarningVariable and WarningAction work in exactly the same way, just on Warning events rather than Errors.
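The same trick works for anything that emits warnings; a quick hypothetical example:

function Test-WarningVariable {
    [CmdletBinding()]
    param()
    Write-Warning 'Something non-fatal happened'
}

Test-WarningVariable -WarningAction SilentlyContinue -WarningVariable WarnVar
$WarnVar   # holds the warning, and nothing was written to the console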

Even better news is that any advanced functions you write automatically get these options:

function Test-ErrorVariable {
    [CmdletBinding()]   # this is what makes it an advanced function
    param($Path)
    Process {
        Get-ChildItem -Path $Path
    }
}

Test-ErrorVariable -Path c:\DoesNotExist -ErrorAction SilentlyContinue -ErrorVariable ErrVar

That’s it, just write an advanced function and PowerShell puts all this in behind the scenes for you.

Now, what happens if you’re calling multiple commands and want to store the error output from all of them. Do you need to specify a new ErrorVariable for each one? That could get horribly messy with auto generating variable names in a loop. Luckily there’s a built in method for handling that, prepend the variable name with + and the new data will be added to the variable (which is an array):

Test-ErrorVariable -Path c:\DoesNotExist -ErrorAction SilentlyContinue -ErrorVariable ErrVar
Test-ErrorVariable -Path c:\DoesNotExistEither -ErrorAction SilentlyContinue -ErrorVariable +ErrVar

So you can reuse the same name throughout your script. Don’t worry about when to add the + to the string; behind the scenes PowerShell checks if the variable exists, and if it doesn’t, creates it the first time it’s used.
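Once collected, the variable is just an array of ErrorRecord objects, so it’s easy to poke at:

$ErrVar.Count                                      # how many errors were collected
$ErrVar | ForEach-Object {$_.Exception.Message}    # just the messages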

These Error and Warning parameters are very useful for collecting information about failed runs, and for hiding nasty screens of red text from end users if you don’t want to worry them.

Using the Chronometer PowerShell module to analyse module performance

Whilst going through my blog roll on Feedly this weekend, I spotted Kevin Marquette’s blog post announcing his new Chronometer module.

At a high level it generates a script profiler timeline as you run through a script, showing you how long each line takes to run, which lines weren’t run, and how many times each line was called.

I’ve just finished writing a number of functions for the dbatools module to implement some new restore functionality, this seemed like a great way to peer into my functions and see what was going on inside, and spot where I was spending most of the time. This should make it easier to work out where I needed to focus attention to improve performance.

At its simplest it can be called like this:

$start = Get-Date
$Chronometer = @{
    Path = 'c:\dbatools\module\functions\Restore-DbaDatabase.ps1'
    Script = {Restore-DbaDatabase -SqlServer localhost\sqlexpress2016 -Path C:\dbatools\backups -WithReplace -OutputScriptOnly}
}
$results = Get-Chronometer @Chronometer -Verbose
(New-TimeSpan -Start $start -End (Get-Date)).TotalSeconds

This took about 10 seconds to run on my test box and dumped out all the coverage for the Restore-DbaDatabase function.


Using regex to find PowerShell functions in PowerShell scripts

Whilst having a play with someone else’s code I wanted to quickly find all the function definitions within a module, and then all the function calls within a function definition.

Having had a quick bingle around for a prewritten regex example, I didn’t come up with much that fitted the bill. So, in the hope that this will help the next person trying to do this, here they are:


My assumptions were:

  • A PowerShell function name is of the form Word-Word
  • A PowerShell function definition is of the form “Function Word-Word”
  • A PowerShell function call can be preceded by a '|', '(', or ' '
  • The script is written using a reasonable style, so there is a ' ' after the call

So to find the function definition I ended up using:

 -match 'function\s(\w+-\w+)'

The function name ends up in $Matches[1]

And for a PowerShell function call:

-match '[^\s|(]\w+-\w+'

The function name ends up in $Matches[0]
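Putting the two together, you can do a quick and dirty scan of a script (the path here is just an example):

Get-Content c:\dbatools\module\functions\Restore-DbaDatabase.ps1 | ForEach-Object {
    if ($_ -match 'function\s(\w+-\w+)') {
        "Defines: $($Matches[1])"
    }
    elseif ($_ -match '[^\s|(]\w+-\w+') {
        "Calls:   $($Matches[0])"
    }
}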

This works accurately enough for what I needed it to do. Feel free to let me know if you can spot any improvements on it.