Category Archives: powershell

dbachecks – SQL Server compliance testing with simple configuration management

If you’ve not heard yet, the people behind the dbatools PowerShell module (including me) have a new toolset for you: dbachecks. dbachecks uses Pester to let you validate your SQL Server estate in a simple way and generate meaningful graphical reports. The official launch of the module is at SQL Bits 2018.

Out of the box, dbachecks uses test values that we’ve found to be the most appropriate from our years of experience with SQL Server. But these may not be the best values for your particular organisation. For example, we expect to see a Full backup less than 24 hours old for each database; in your case you might only take a Full backup once a week and use differential backups during the week. So we needed a flexible and simple system to let you change the values. dbatools friend Friedrich Weinmann (b | t) has written a great PowerShell framework module called PSFramework, which we’ve integrated into dbachecks to handle the configuration of the tests.

Out of the box dbachecks will load its default config values from the file .\internal\configurations\configuration.ps1 (#! link to GH). To see the current values use Get-DbcConfig:
Get-DbcConfig output

I’ve just picked the top few rows for the screenshot; as of writing (15/02/2018) there are 98 config options available. We’ve tried to make the configuration option names obvious. The configuration values are grouped into these main sections:

  • app – Configuration for the module
  • domain – Configuration for authentication
  • mail – Configuration for sending reports
  • policy – The values for the tests
  • skip – Controls which tests should be excluded
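
If you just want to browse one section at a time, standard PowerShell filtering does the job (a minimal sketch; it assumes the Name property shown in the screenshot above):

Get-DbcConfig | Where-Object { $_.Name -like 'policy.*' }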

Each option also has a useful description to make it easy to find what you’re after. In the screenshot you’ll notice that I’ve configured the mail.* options to suit my test environment, so now I can use Send-DbcMailMessage to email my test results without having to specify all the parameters. To set a configuration value you use the Set-DbcConfig command:

Set-DbcConfig -Name mail.subject -Value "DbcChecks report"

Set-DbcConfig example

Or perhaps, as mentioned at the start of this post, you don’t want a Full backup in the last day, but in the last 7 days. You can easily configure that:

Set-DbcConfig -Name policy.backupfullmaxdays -Value 7

And now your tests will be checking that you’re never more than 7 days from a Full backup. By default we’re also checking that you’ve taken a Differential backup in the last 25 hours, so you’re now good to go!
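
If your differential schedule needs the same treatment, the matching policy value can be adjusted in exactly the same way. The option name below is an assumption for illustration; check Get-DbcConfig for the exact name in your install:

Set-DbcConfig -Name policy.backupdiffmaxhours -Value 26   # assumed option name - verify with Get-DbcConfig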

And don’t worry, you don’t need to reset these every time you run the tests. The PSFramework module persists any non-default settings in the Windows registry at:

Computer\HKEY_CURRENT_USER\Software\Microsoft\WindowsPowerShell\PSFramework\Config\Default

So on my system we can see:
DbaChecks registry storage

Now, what happens if you want to distribute these config changes to multiple servers, or share them with colleagues to make sure you’re all singing the same tune? We’d strongly recommend you don’t modify the .\internal\configurations\configuration.ps1 file directly, as it will be replaced whenever you update the module. To make your life easier (which is the entire point of the module) we’ve included export and import functionality. To export the configuration to an easy-to-parse JSON file you simply run:

Export-DbcConfig -Path c:\path\of\yourchoice\filename.json

If you omit the path value then by default we will export the results to $script:localapp\config.json. Now you’ve got a simple JSON file, it’s easy to source control (you are using source control, aren’t you?) to keep track of changes and make sure they were implemented as expected. If you want to apply that configuration to another install, it’s simply a case of running the import command:

Import-DbcConfig -Path c:\path\of\yourchoice\filename.json

and you’re done. We support UNC paths, so it’s simple to have a central repository and apply the same configuration everywhere. It also works with fragments of configuration, so if all you wanted to control were the email settings you could create a JSON file like this:

[
    {
        "Name":  "mail.failurethreshhold",
        "Value":  0,
        "Description":  "Number of errors that must be present to generate an email report"
    },
    {
        "Name":  "mail.from",
        "Value":  "null@stuart-moore.com",
        "Description":  "Email address the email reports should come from"
    },
    {
        "Name":  "mail.smtpserver",
        "Value":  "smtp.stuart-moore.com",
        "Description":  "Store the name of the smtp server to send email reports"
    },
    {
        "Name":  "mail.subject",
        "Value":  "DbcChecks report from stuart",
        "Description":  "Subject line of the email report"
    },
    {
        "Name":  "mail.to",
        "Value":  "SqlReports@stuart-moore.com",
        "Description":  "Email address to send the report to"
    }
]

Then you can import just this snippet to set the configuration for those options. This makes it easy to separate enterprise-level configuration changes from the actual SQL test options. The same technique can be used to ‘force’ the correct backup testing parameters in all cases, while letting other tests be customised as needed.
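
Pulling a shared fragment from a central location is then a one-liner (the UNC path below is just an example):

Import-DbcConfig -Path \\fileserver\dbachecks\mailsettings.json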

Running simple PowerShell commands against multiple servers with a timeout

Many years ago I wrote a post on passing functions into Start-Job (Calling a PowerShell function in a Start-Job script block when it’s defined in the same script). Over the years I’ve had a number of emails from people asking how to use it to fix their own situation. In many of these cases there’s been a simpler way to achieve what they wanted to do.

Sometimes a non-obvious solution using PowerShell basics can be much simpler. Patrick posted that he was having problems running a command against a list of servers and wanted to be able to skip those that time out. Using an injected function for this was overkill. I had a pre-made solution I use, which relies on Invoke-Command and is simpler:

# Create sessions on all the target servers, silently skipping any that are unreachable
$session = New-PSSession -ComputerName Server1, Server2, Server3 -ErrorAction SilentlyContinue
$ServiceName = 'MSSQL$'   # fragment of the service name to match
$timeout = 10             # seconds to wait before giving up on a server
# Run Get-Service in every session as one background job
$remote = Invoke-Command -Session $session -ScriptBlock { Get-Service -Name *$using:ServiceName* } -AsJob
# Wait up to $timeout seconds; any job still running after that gets left behind
$remote | Wait-Job -Timeout $timeout
$output = $remote | Receive-Job
$output | ForEach-Object { "$($_.Name) on $($_.PSComputerName) is $($_.Status)" }

We set up new PSSessions using New-PSSession. I set ErrorAction to SilentlyContinue just in case a host isn’t available for some reason (if I was being good I’d try/catch here).

As we’re just using standard PowerShell functionality with Get-Service there’s no need to build a new function; we can call it directly. By calling Invoke-Command against a session pointed at numerous hosts we let PowerShell handle all the connection management and just assume the command will be run against each host. If we were running against a lot of hosts we’d want to look at the -ThrottleLimit parameter to limit the number of concurrent hosts we’re hitting. The one little trick here is the using scope modifier, so PowerShell pulls in the variable defined in our main scope (gory details on scoping here).

As we called Invoke-Command with the -AsJob switch we can now treat it as a job, so we can use the PowerShell jobs cmdlets to manage it. The first thing we want to do is skip those jobs that are taking longer than our specified timeout value (in this case 10 seconds). So we pass our Invoke-Command job into Wait-Job with a -Timeout parameter. PowerShell will now keep an eye on each job and drop those exceeding our timeout limit.

Once we’ve gotten all the jobs that met our time limit, we grab the output using Receive-Job and then just process it like any other PowerShell object.
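
If you also want to know which servers got dropped, the child jobs of the Invoke-Command job carry that information; a quick sketch using standard job properties:

# Servers whose jobs didn't complete within the timeout
$remote.ChildJobs | Where-Object { $_.State -ne 'Completed' } |
    ForEach-Object { "$($_.Location) did not respond within $timeout seconds" }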

As we’ve removed a lot of complexity it’s much easier to revise or reuse the framework at a later date.

Nottingham and East Midlands PowerShell User Group

Interested in any form of PowerShell usage and based around the East Midlands and Nottingham? Then this could be the group for you. We’re looking to cover anything that uses (or can be used with) PowerShell, so topics that are fair game include:

  • AD Management
  • Scripting
  • Source Control
  • DevOps
  • Azure/AWS/Cloud Provider of your choice
  • Exchange management
  • SQL Server
  • Pester testing
  • Continuous Integration
  • .Net Internals
  • Generally anything that would interest someone using PowerShell or give their career a boost!

Nothing is set in stone yet as we want to get some feedback from potential members. There’s a date booked for the 8th February (Kick Off meeting), but what happens is up to you.

Would you prefer a traditional user group with booked speakers giving presentations in a formal setting, or something more informal like roundtable/whiteboard sessions? Or perhaps half and half?

We’d love to know what you’d like to see or learn. So please either drop a comment below or sign up for the Kick Off Meeting and give us feedback on Meetup.

Complex SQL Server restore scenarios with the dbatools Restore-DbaDatabase pipeline

No matter how hard the dbatools team try, there’s always someone who wants to do things we’d never thought of. This is one of the great things about getting feedback direct from a great community. Unfortunately a lot of these ideas are either too niche to implement, or would need a lot of complex code for a single use case.

As part of the Restore-DbaDatabase stack rewrite, I wanted to make it easier for users to get their hands dirty within the restore stack: not necessarily by diving into the core code and the world of GitHub Pull Requests, but by manipulating the data flowing through the pipeline using standard PowerShell techniques, all the while letting our code do the heavy lifting.

So, below the fold we’ll be looking at some examples of how you can start going to town with your restores.


Deep copy arrays in PowerShell

Just a quick post for something I had to deal with for another post I’m writing. I wanted to reuse a base array over a number of passes in a loop, without any change made in one iteration impacting later ones. If you’ve tried copying an array containing arrays or other objects in PowerShell in the usual manner, you’ll have come across this problem:

Copying an array of arrays fails in PowerShell

As is common in .Net, the copy is ‘copy by reference’, so you don’t get a nice shiny new independent array to play with. All that’s copied is a reference to the original’s place in memory, so any change to either object affects both: both variable names are looking at the same piece of memory. This is nice and efficient in terms of storage and speed of copying, but not great for my purposes.
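
A minimal repro along the lines of the screenshot above:

$arr1 = @( @(1, 2), @(3, 4) )
$arr2 = $arr1       # copies the reference, not the data
$arr2[0][0] = 99
$arr1[0][0]         # also 99: both variables point at the same inner array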

There are various workarounds kicking around if you’re using simple arrays, but they tend to break down when you’ve got arrays that contain arrays or other PowerShell objects. My method for copying them is a little down and dirty, but it works 95% of the time for what I want. The trick is to serialize the object, and then deserialize it into the new one:

Deep Copy Powershell array using serialization

And voila, we have the outcome I wanted. Just to make the line of code easier to read, here it is:

$arr3 = [Management.Automation.PSSerializer]::DeSerialize([Management.Automation.PSSerializer]::Serialize($arr2))

Now, I mentioned up front that this works ~95% of the time. The times it doesn’t work for me are when the underlying object type doesn’t serialize nicely. The most common one I come across is BigInt. This ‘deserializes’ back in as a non-integer type that won’t compare nicely with ‘real’ BigInt values, so make sure to check you have the values you think you should.
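
A quick check along those lines, sketched on the assumption that the type changes on the round trip as described:

$big = [bigint]::Parse('12345678901234567890')
$copy = [Management.Automation.PSSerializer]::Deserialize([Management.Automation.PSSerializer]::Serialize($big))
$copy.GetType().FullName    # compare with $big.GetType().FullName before trusting any comparisons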

Debugging the new dbatools Restore-DbaDatabase pipeline

A new version of the dbatools Restore-DbaDatabase command was released into the wild this week. One of the main aims of this release was to make it easier to debug failures in the restore process, and to drag information out of the pipeline easily (and anonymously) so we can increase our Pestering of the module with unit and integration tests.

So I’d like to share some of the features I’ve put in so you can take part.

The biggest change is that Restore-DbaDatabase is now a wrapper around 5 public functions:

  • Get-DbaBackupInformation
  • Select-DbaBackupInformation
  • Format-DbaBackupInformation
  • Test-DbaBackupInformation
  • Invoke-DbaAdvancedRestore

These can be used individually for advanced restore scenarios; I’ll go through some examples in a later post.

For now it’s enough to know that Restore-DbaDatabase is a wrapper around this pipeline:

Get-DbaBackupInformation | Select-DbaBackupInformation | Format-DbaBackupInformation | Test-DbaBackupInformation | Invoke-DbaAdvancedRestore

and its other job is passing parameters into these sub-functions as needed.

With previous versions of Restore-DbaDatabase you were restricted to throwing data into one end and seeing what came out of the other, with some insight provided by Verbose messages. Now things can be stepped through, and data extracted as needed, in a format that plugs straight into our testing functions.

Get-DbaBackupInformation

This is the function that gets all of the information about backup files. It scans the given paths, and uses Read-DbaBackupHeader to extract the information from them. This is stored in a dbatools BackupHistory object (this is the same as the output from Get-DbaBackupHistory, so we are standardising on a format for Backup information to be passed between functions).

So this would be a good place to check that you’ve got the files you think you should have, and it’s also the first place we’d look if you reported a break in the LSN chain.

To get the output from the pipeline at this point we use the GetBackupInformation parameter:

Restore-DbaDatabase [Usual Parameters] -GetBackupInformation gbi

This will create a globally scoped variable $gbi containing the output from Get-DbaBackupInformation. Note that when passing the variable name to Restore-DbaDatabase you do not need to specify the $.

If you want to stop execution at this point, then use the -StopAfterGetBackupInformation switch. This will stop Restore-DbaDatabase from going any further.

This is also a good way of saving time on future runs, as the BackupHistory object can be passed straight in, saving the overhead of reading all the file headers again:

$gbi | Restore-DbaDatabase [Usual Parameters] -TrustDbBackupHistory

Select-DbaBackupInformation

Here we filter the output from Get-DbaBackupInformation down to restore to the point in time requested, or the latest point we can. This means we find:
– the last Full backup before the point in time
– the latest Differential backup between that Full backup and the point in time
– and then all the transaction log backups needed to get us to the requested time
This is done for every database found in the BackupHistory object.

Here is where we’d begin looking for issues if you had a ‘good’ LSN chain from Get-DbaBackupInformation and then it broke.

To get this data you use the SelectBackupInformation parameter, passing in the name of the variable you want to store the data in (without the $, as per GetBackupInformation above).

There is also a corresponding StopAfterSelectBackupInformation switch to halt processing at this point. We stop processing at the first stop in the pipeline, so specifying multiple StopAfter* switches won’t have any effect.
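
Putting that together, capturing the filtered backup set and stopping there looks like this (same pattern as the GetBackupInformation example above):

Restore-DbaDatabase [Usual Parameters] -SelectBackupInformation sbi -StopAfterSelectBackupInformation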

Format-DbaBackupInformation

This function performs the transforms on the BackupHistory object per the parameters passed in. This includes renaming databases, and file moves and renames. For everything we touch we add an extra Original* property to the BackupHistory object. For example, the original name of the database will be in OriginalDatabase, and the target name will be in Database.

So this is a good spot to test why transforms aren’t working as expected.

To get data out at this pipeline stage use the FormatBackupInformation parameter with a variable name. As usual it has an accompanying StopAfterFormatBackupInformation switch to halt things there.
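
A sketch of capturing the transformed object for inspection (Database and OriginalDatabase are the properties described above):

Restore-DbaDatabase [Usual Parameters] -FormatBackupInformation fbi -StopAfterFormatBackupInformation
$fbi | Select-Object Database, OriginalDatabase    # check the rename transforms landed as expected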

Test-DbaBackupInformation

Before passing the BackupHistory object off to be restored we do some checks to make sure everything is OK. The following checks are made:

  • Is the LSN chain complete?
  • If a destination file exists and is owned by a different database, fail
  • If a destination file exists and is owned by the database being restored, is WithReplace specified?
  • Can SQL Server see and write to all the destination folders?
  • Can SQL Server create any destination folders that are missing?
  • Can SQL Server see all the backup files?

If a database passes all these checks then its backup history is marked as restorable by setting the IsVerified property to $True.

To get the data stream out at this point use the TestBackupInformation parameter.
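
Following the same pattern as the earlier stages (the StopAfterTestBackupInformation switch here is assumed from the StopAfter* convention described above), you can then look for anything that failed the checks:

Restore-DbaDatabase [Usual Parameters] -TestBackupInformation tbi -StopAfterTestBackupInformation
$tbi | Where-Object { -not $_.IsVerified }    # databases that failed one of the checks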

General Errors with restores

Once we’re past these stages, our error reporting is at the mercy of the SMO Restore class, which doesn’t always provide an obvious cause straight away. Usually the main error can be found with:

$error[1] | Select-Object *

We like to think we capture most restore failure scenarios nicely, but if you find something we don’t then please let us know, either on Slack or by raising a GitHub issue.

As usual, the dbatools terminating error will be in $error[0].

Providing the information for testing or debugging.

If you’re running in to problems then the dbatools team may ask you to provide the output from one of these stages so we can debug it, or incorporate the information into our tests.

Of course you won’t want to share confidential information with us, so we would recommend anonymising your data. My normal way of doing this is to use these 2 stubbing functions:

So if we’ve asked for the output from Select-DbaBackupInformation, the process would be:

Restore-DbaDatabase [Normal parameters] -SelectBackupInformation sbi -StopAfterSelectBackupInformation
Filter-DbaToolsHelpRequest $sbi
$sbi | Export-CliXml -Depth 10 -Path c:\some\path\file.xml   # 10 is an arbitrary depth, deep enough to capture the nested objects

And then upload the resulting xml file.

This method will anonymise the values in ComputerName, InstanceName, SqlInstance, Database, UserName, Path, FullName, FileList, OriginalDatabase, OriginalFileList, OriginalFullName and ReplaceDatabaseName, but will produce the same output for the same input, so we can work with multiple database sets at once.

I hope that’s been of some help. As always if you’ve a question then drop a comment below, ping me on twitter (@napalmgram) or raise an issue with dbatools on Slack or Github


A new addition to UK events – PSDay UK

The UK has a great range of techie events these days, but it’s always good to see another one starting up. PSDay.UK is the first in a new set of international one-day PowerShell events.

The idea is to provide a local set of events that complement the 3 big international PowerShell conferences (PSConf.eu, PSConf.Asia and the PowerShell Global Summit). This makes it easier for local attendees to get to events (no airfare, possibly no hotels, much easier to get agreement from employers), gives an opportunity to local speakers, and also provides a framework to let the organisers tap into international speakers.

My good friend Rob has written more on the idea and the formation of the organising body, of which he’s a member, over on his blog here – What’s a PSDay.

The initial UK event is happening on Friday 22nd September 2017 at CodeSkills in London. Full details and tickets are available on the website – https://psday.uk/ – and more information is available on Twitter (@PSDayUK) and Facebook.

I’m completely chuffed at being picked to speak at this event as well. I’ll be presenting a new session titled “DevOps by the back door, via the Helpdesk”, where I’ll be sharing examples and ideas of how you can start your DevOps journey with small, bite-sized actions that will start to show your employer (and colleagues) that this stuff really does work.

Hope to see lots of people there. This sounds like a great project, so make sure you can say you were at the first one in years to come!

Using the Chronometer PowerShell module to analyse module performance

Whilst going through my blog roll on Feedly this weekend, I spotted Kevin Marquette’s blog post announcing his new Chronometer module.

At a high level it generates a script profiler timeline as you run through a script, showing you how long each line takes to run, which lines weren’t run, and how many times each line was called.

I’ve just finished writing a number of functions for the dbatools module to implement some new restore functionality, so this seemed like a great way to peer into my functions, see what was going on inside, and spot where I was spending most of the time. That should make it easier to work out where to focus attention to improve performance.

In its simplest form it can be called like this:

$start = Get-Date
$Chronometer = @{
    Path = 'c:\dbatools\module\functions\Restore-DbaDatabase.ps1'
    Script = {Restore-DbaDatabase -SqlServer localhost\sqlexpress2016 -Path C:\dbatools\backups -WithReplace -OutputScriptOnly}
}
$results = Get-Chronometer @Chronometer -Verbose
(New-TimeSpan -Start $start -End (Get-Date)).TotalSeconds

This took about 10 seconds to run on my test box and dumped out all the coverage for the Restore-DbaDatabase function.


Using regex to find PowerShell functions in PowerShell scripts

Whilst having a play with someone else’s code I wanted to quickly find all the function definitions within a module, and then all the function calls within a function definition.

Having had a quick bingle around for a prewritten regex example I didn’t come up with much that fitted the bill. So, in the hope that this will help the next person trying to do this, here they are:

Assumptions:

  • A PowerShell function name is of the form Word-Word
  • A PowerShell function definition is of the form “Function Word-Word”
  • A PowerShell function call can be preceded by a '|', '(', or ' '
  • The script is written in a reasonable style, so there is a ' ' after the call

So to find the function definition I ended up using:

 -match 'function\s(\w+-\w+)'

The function name ends up in $Matches[1]

And for a PowerShell function call:

-match '[^\s|(]\w+-\w+'

The function name ends up in $Matches[0]
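
As a quick illustration of wiring the first pattern up against a file (the path is just an example):

# List every function defined in a script file
Get-Content .\MyModule.psm1 | ForEach-Object {
    if ($_ -match 'function\s(\w+-\w+)') { $Matches[1] }
}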

This works accurately enough for what I needed it to do. Feel free to let me know if you can spot any improvements on it.

Removing multiple Backup devices from an SMO restore object in PowerShell

Probably a bit niche this one, but as I scratched my head trying to work it out I’ll share it to save others in future.

The Problem

Working on some new functionality for dbatools, adding the ability to cope with striped SQL Server backup sets in fact. Everything’s going pretty swimmingly; I’ve got a process for finding the backup files and grouping them together.

Adding them to the restore device collection is nice and easy, as I have an object to loop through:

# $Restore is an existing Microsoft.SqlServer.Management.Smo.Restore object
Foreach ($RestoreFile in $RestoreFiles)
{
    $Device = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem
    $Device.Name = $RestoreFile.BackupPath
    $Device.DeviceType = "File"
    $Restore.Devices.Add($Device)
}

Nice and simple.

The problem comes when I want to reuse my restore object as I loop through the differentials and the transaction logs. If these devices aren’t removed, SQL Server tries to reuse them on every run. As this code is designed to be run anywhere by anyone, I’ve no idea of the number of devices needed. Not only will people stripe their backups across differing numbers of files, it’s possible to stripe your backups differently for each type! So this is perfectly fine:

  • Full Backup – Across 4 files
  • Differential Backup – Across 2 files
  • Transaction Backup – Across 1 file (you CAN stripe transaction log backups if you want)

and you can even do it differently on different runs.
I’ve found C# examples that rely on knowing the device names, which don’t work in PowerShell. I’ve found methods using Get-Member that don’t appear to be documented, and generally worn a dent in my work desk with my forehead (annoying my co-workers with the dull thud).

The classic PowerShell way of using ForEach fails:

ForEach ($Device in $Restore.Devices)
{
    $Restore.Devices.Remove($Device)
}

Fails with the nice red error message:

An error occurred while enumerating through a collection: Collection was modified; 
enumeration operation may not execute..

The Solution

The oft-forgotten while loop!

# Keep removing the first device until the collection is empty
while ($Restore.Devices.Count -gt 0)
{
    $Device = $Restore.Devices[0]
    $Restore.Devices.Remove($Device)
}

No need to know beforehand how many objects you need to remove. As the condition is checked before each pass, the while loop copes even with zero devices (a do…while would always run the body at least once). Here I’m expecting at least one anyway: if there wasn’t even one backup device to remove, I’d have had an error during the restore, so I’m fairly safe.