Stuart Moore

Musings of a Data professional


Converting SQL Server LSNs with PowerShell

If you’ve spent much time playing with SQL Server backups and restores then you’ll have discovered Log Sequence Numbers (LSNs). I’ve discussed before How SQL Server LSNs are generated, and walked through how to manually convert a hexadecimal LSN to a numeric value.

As part of some work I was doing on the dbatools module I needed something to convert the Log Sequence Numbers coming back from the transaction log into numeric LSNs so I could use them to query the backup history tables in msdb.

Thinking this might be useful to other people, I’ve chopped out the dbatools specific code, and present the function here:

It’s standard PowerShell good practice to include a decent help and examples block.

I use a couple of regex expressions to make sure we’re getting what we expect. Both are based on the Microsoft documentation for the LSN formats. For the numeric LSN we’re looking for 26 numeric digits, and for the hexadecimal LSN representation we’re after two 8-character hex numbers and one 4-character hex number separated by colons (‘:’). If the LSN passed in doesn’t match either of those, we raise a warning as we can’t do anything with it.

For hex to numeric, we split the string on ‘:’, convert each section from hex to int64 (using int64 as LSNs can get really big), then convert each to a string and pad it with the requisite number of 0s.

For numeric to hex we work from the back of the LSN, as we have to cope with potential leading 0s.
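
The embedded function itself isn’t shown here, so below is a minimal sketch of the conversion logic described above. The function name, the exact regex patterns and the padding widths (10 digits for the middle section, 5 for the last) are my assumptions, not the dbatools code:

function Convert-LsnSketch {
    # Hypothetical sketch, not the dbatools implementation.
    # Assumes a 26 digit zero padded numeric form and 10/5 digit widths
    # for the second and third sections.
    param (
        [Parameter(Mandatory)]
        [string]$Lsn
    )

    if ($Lsn -match '^[0-9a-fA-F]{8}:[0-9a-fA-F]{8}:[0-9a-fA-F]{4}$') {
        # Hex to numeric: convert each colon separated section to int64,
        # pad the second and third sections, then pad the whole string
        $sections = $Lsn -split ':'
        $part1 = [convert]::ToInt64($sections[0], 16)
        $part2 = ([convert]::ToInt64($sections[1], 16)).ToString().PadLeft(10, '0')
        $part3 = ([convert]::ToInt64($sections[2], 16)).ToString().PadLeft(5, '0')
        return "$part1$part2$part3".PadLeft(26, '0')
    }
    elseif ($Lsn -match '^\d{26}$') {
        # Numeric to hex: work from the back to cope with leading zeros
        $part3 = [int64]$Lsn.Substring($Lsn.Length - 5, 5)
        $part2 = [int64]$Lsn.Substring($Lsn.Length - 15, 10)
        $part1 = [int64]$Lsn.Substring(0, $Lsn.Length - 15)
        return ('{0:X8}:{1:X8}:{2:X4}' -f $part1, $part2, $part3)
    }
    else {
        Write-Warning "LSN passed in does not match either expected format"
    }
}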

PsDay.UK 2018 incoming

Time sure seems to fly. It’s been just under a year since the first PsDay.UK appeared on the UK PowerShell scene. After the success of that event it’s back for another edition! The 2018 event is on 10th October 2018.

The PsDay.UK team have gathered up a great set of speakers and sessions again, have a look at the quality of the agenda. With 3 tracks of sessions, it’s going to be tough picking which one to go to. Well, except at 15:00 when I’ll be in Room 3 (aka Shift) presenting on ChatOps for PowerShell. I’ll be covering what ChatOps offers the PowerShell developer and how you can leverage your current skills and scripts to join the gif-filled party.

Tickets are available here – PsDay.UK 2018 Tickets – at a very reasonable price for a full day of quality sessions. There’s even a decent refund programme, so you’re covered if things change.

Hopefully I’ll be seeing some of you at CodeNode on the 10th October. Feel free to wander up and say hi.

Rolling Back SQL Server agent job owner changes

So in this post I showed you a technique for updating the owner on all your SQL Agent jobs.

Whilst this isn’t a major change, you do need to have checked that the new ownership account has the correct permissions for some things (e.g. EXECUTE AS or CmdExec steps). With this in mind someone asked me how I’d back this out, either for all jobs or for that one specific MUST NOT FAIL job that is now failing.

As we have 2 ways to change them, we have 2 ways to back out the changes. Note that both of these require you to have done a small amount of work up front; there is no magic rollback option.

PowerShell and dbatools

This one is a little simpler as we can use a lot of PowerShell plumbing to do the lifting for us. To set this up we use:

Get-DbaAgentJob -SqlInstance server1 | Select-Object Name, OwnerLoginName | ConvertTo-Csv | Out-File .\server1-JobOwners.csv

If you’re not a PowerShell user I’ll just break this down into steps:

  1. We use dbatools’ `Get-DbaAgentJob` cmdlet to get all the Agent jobs on Server1
  2. We pipe the output to `Select-Object`, selecting just the job Name and its owner’s login name
  3. The output is then converted to a CSV object
  4. Finally we write the data to a file in the current folder

So we’ve created a nice csv list of the original owners of our jobs, great. Now how do we use those to roll back?
To roll back every job:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

To roll back just one job it’s very similar; we just insert a Where-Object step into the pipeline:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Where-Object Name -eq 'My Job' | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

In both cases we read in our CSV and convert it back into a PowerShell object. For a single job we use Where-Object to filter down to that one job; you could also use -like if you wanted to pattern match. The remaining records then get piped through a ForEach-Object loop, where we use Set-DbaAgentJob to reset the job’s owner.
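
As a hedged variant of the above (the job name pattern is just an example), -like lets you roll back a group of related jobs in one pass:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Where-Object Name -like '*Finance*' | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}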

T-SQL

The pre setup for this one involves creating and populating a table to store the old values:

CREATE TABLE JobOwnerHistory(
  Job_Id char(36),
  JobName nvarchar(128),
  OwnerLoginName nvarchar(128)
)

INSERT INTO JobOwnerHistory
  SELECT 
    sj.job_id, 
    sj.name as 'JobName',
    sp.name as 'OwnerLoginName' 
  FROM 
    msdb..sysjobs sj 
    INNER JOIN sys.server_principals sp on sj.owner_sid=sp.sid

So now, resetting a job’s owner is just a modification of our original script:

DECLARE @job_id char(36)
DECLARE @OwnerLoginName nvarchar(128)
DECLARE JobCursor CURSOR
FOR
SELECT 
  job_id,
  OwnerLoginName
FROM
  JobOwnerHistory
--WHERE JobName LIKE '%Finance%'
OPEN JobCursor 
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
WHILE (@@FETCH_STATUS <> -1)
BEGIN
exec msdb..sp_update_job
    @job_id = @job_id,
    @owner_login_name = @OwnerLoginName
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
END 
CLOSE JobCursor 
DEALLOCATE JobCursor 

As written that will reset every job in JobOwnerHistory; if you want to filter down to a subset of jobs you’d uncomment and modify the WHERE line.

Hope those examples are helpful. If you’ve stumbled across this page and it doesn’t quite fix your problem, please drop a comment and I’ll try to lend a hand.

Granting Azure Roles to Users or groups without a SignInName in Azure AD

Whilst sorting some permissions out on a resource group I stumbled across an issue with assigning them to a group. We have a replicated on-premises and Azure Active Directory solution in place. In the on-premises AD not all groups are distribution groups, so they don’t have a ‘group1@contoso.com’ sign-in name. This causes a problem when trying to assign a role in the usual PowerShell way:

New-AzureRmRoleAssignment -SignInName 'group1@contoso.com' -RoleDefinitionName Contributor -ResourceGroupName MyRg

or

New-AzureRmRoleAssignment -SignInName 'group1' -RoleDefinitionName Contributor -ResourceGroupName MyRg

both return a nice red error message:

'New-AzureRmRoleAssignment : The provided information does not map to an AD object id.
At line:1 char:1
+ New-AzureRmRoleAssignment -SignInName group1@contoso.com -RoleD ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : CloseError: (:) [New-AzureRmRoleAssignment], KeyNotFoundException
    + FullyQualifiedErrorId : Microsoft.Azure.Commands.Resources.NewAzureRoleAssignmentCommand'

The workaround for this is to use the ObjectID parameter, and to grab the ObjectId like this:

$objectid = (Get-AzureRmADGroup -DisplayName "BSD Developers").Id
New-AzureRmRoleAssignment -ObjectId $objectid -RoleDefinitionName Contributor -ResourceGroupName MyRg

Just another one of those little oddities that crop up when you’ve got a long-established infrastructure that doesn’t quite fit the modern templates.

dbachecks – SQL Server compliance testing with simple configuration management

If you’ve not heard yet, the people behind the dbatools PowerShell module (including me) have a new toolset for you: dbachecks. dbachecks uses Pester to let you validate your SQL Server estate in a simple way and generate meaningful graphical reports. The official launch of the module is at SQL Bits 2018.

Out of the box dbachecks uses test values that we’ve found to be the most appropriate from our years of experience with SQL Server. But these may not be the best values for your particular organisation. For example, we expect to see a Full backup less than 24 hours old for each database; in your case you might only take a Full backup once a week and use differential backups during the week. So we needed a flexible and simple system to let you change the values. dbatools friend Friedrich Weinmann (b | t) has written a great PowerShell module framework called PSFramework, which we’ve integrated into dbachecks to handle the configuration of the tests.

Out of the box dbachecks will load its default config values from the file .\internal\configurations\configuration.ps1. To see the current values use Get-DbcConfig:
Get-DbcConfig output

I’ve just picked the top few rows for the screenshot; as of writing (15/02/2018) there are 98 config options available. We’ve tried to make the configuration option names obvious. The configuration values are grouped into these main sections (there’s a quick filtering example after the list):

  • app – Configuration for the module
  • domain – Configuration for authentication
  • mail – Configuration for sending reports
  • policy – The values for the tests
  • skip – Controls which tests should be excluded
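
As a quick hedged example (this assumes the output objects expose a Name property, as the export format later in this post suggests), you can view a single group with standard PowerShell filtering:

Get-DbcConfig | Where-Object Name -like 'policy.*'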

Each option has a useful description as well, to make it easy to find what you’re after. In the screenshot you’ll notice that I’ve configured the mail.* options to suit my test environment, so now I can use Send-DbcMailMessage to email my test results without having to specify all the parameters. To set a configuration parameter you use the Set-DbcConfig command:

Set-DbcConfig -Name mail.subject -Value "DbcChecks report"

Set-DbcConfig example

Or perhaps as mentioned at the start of this post, you don’t want a Full backup in the last day, but in the last 7 days. You can easily configure that:

Set-DbcConfig -Name policy.backupfullmaxdays -Value 7

And now your tests will be checking you’re never more than 7 days from a full backup. By default we’re checking that you’ve taken a Differential backup in the last 25 hours, so you’re now good to go!

And don’t worry, you don’t need to reset these every time you run the tests. The PSFramework module persists these non-default settings in the Windows registry at:

Computer\HKEY_CURRENT_USER\Software\Microsoft\WindowsPowerShell\PSFramework\Config\Default

So on my system we can see:
DbaChecks registry storage

Now, what happens if you want to distribute these config changes to multiple servers, or share them with colleagues to make sure you’re all singing the same tune? We’d strongly recommend you don’t modify the .\internal\configurations\configuration.ps1 file directly, as it will be replaced whenever you update the module. To make your life easier (which is the entire point of the module) we’ve included export and import functionality. To export the configuration to an easy-to-parse JSON file you simply run:

Export-DbcConfig -Path c:\path\of\yourchoice\filename.json

If you omit the path value then by default we will export the results to $script:localapp\config.json. Now you’ve got a simple JSON file, it’s easy to source control (you are using source control, aren’t you?) to keep track of changes and make sure they were implemented as expected. If you want to apply that configuration to another install, it’s simply a case of running the import command:

Import-DbcConfig -Path c:\path\of\yourchoice\filename.json

and you’re done. We support UNC paths, so it’s simple to have a central repository and apply the same configuration everywhere. It works well with fragments of configuration too, so if all you wanted to control were the email settings you could create a JSON file like this:

[
    {
        "Name":  "mail.failurethreshhold",
        "Value":  0,
        "Description":  "Number of errors that must be present to generate an email report"
    },
    {
        "Name":  "mail.from",
        "Value":  "null@stuart-moore.com",
        "Description":  "Email address the email reports should come from"
    },
    {
        "Name":  "mail.smtpserver",
        "Value":  "smtp.stuart-moore.com",
        "Description":  "Store the name of the smtp server to send email reports"
    },
    {
        "Name":  "mail.subject",
        "Value":  "DbcChecks report from stuart",
        "Description":  "Subject line of the email report"
    },
    {
        "Name":  "mail.to",
        "Value":  "SqlReports@stuart-moore.com",
        "Description":  "Email address to send the report to"
    }
]

Then you can import just this snippet to set the configuration for those options. This makes it easy to separate out enterprise-level configuration changes from the actual SQL test options. The same technique can be used to ‘force’ the correct backup testing parameters in all cases, while letting other tests be customised as needed.

Running simple PowerShell commands against multiple servers with a timeout

Many years ago I wrote a post on passing functions into Start-Job (Calling a PowerShell function in a Start-Job script block when it’s defined in the same script). Over the years I’ve had a number of emails from people asking how to use it to fix their own situation. In many of these cases there’s been a simpler way to achieve what they wanted to do.

Sometimes a non-obvious solution using PowerShell basics can be much simpler. Patrick posted that he was having problems running a command against a list of servers and wanted to be able to skip those that time out. Using an injected function for this was a bit of overkill. I had a premade solution I use, which relies on Invoke-Command and is simpler:

$session = New-PSSession -ComputerName Server1, server2, server3 -ErrorAction SilentlyContinue
$ServiceName = 'MSSQL$'
$timeout = 10
$remote = Invoke-Command -Session $session -ScriptBlock {Get-Service -Name *$using:ServiceName*} -AsJob
$remote | Wait-Job -Timeout $timeout
$output = $remote | Receive-Job
$output | ForEach-Object {"$($_.Name) on $($_.PsComputerName) is $($_.Status)"}

We set up new PSSessions using New-PSSession. I set ErrorAction to SilentlyContinue just in case a host isn’t available for some reason (if I was being good I’d try/catch here).

As we’re just using standard PowerShell functionality here with Get-Service there’s no need to build a new function; we can just call it directly. By calling Invoke-Command against a session pointed at numerous hosts we let PowerShell handle all the connection management and just assume the command will be run against each host. If we were running against a lot of hosts then we’d want to look into using the -ThrottleLimit parameter to limit the number of concurrent hosts we’re hitting. The one little trick here is using the using scope modifier so PowerShell pulls in the variable defined in our main scope (gory details on scoping here).
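
As a hedged illustration (16 is just an arbitrary value for this example), throttling is simply an extra parameter on the same call:

$remote = Invoke-Command -Session $session -ScriptBlock {Get-Service -Name *$using:ServiceName*} -ThrottleLimit 16 -AsJob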

As we called Invoke-Command with the -AsJob switch we can now treat it as a job, so we can use the PS job cmdlets to manage it. The first thing we want to do is skip those jobs that are taking longer than our specified timeout value (in this case 10 seconds). So we pass our Invoke-Command job into Wait-Job with a Timeout parameter. PS will now keep an eye on each job and drop those exceeding our timeout limit.

Once we’ve got all the jobs that met our time limit we grab the output using Receive-Job, and then just process it like any other PowerShell object.

As we’ve removed a lot of complexity it’s much easier to revise or reuse the framework at a later date.

Nottingham and East Midlands PowerShell User Group

Interested in any form of PowerShell usage and based around the East Midlands and Nottingham? Then this could be the group for you. We’re looking to cover anything that uses (or can be used with) PowerShell, so topics that are fair game include:

  • AD Management
  • Scripting
  • Source Control
  • DevOps
  • Azure/AWS/Cloud Provider of your choice
  • Exchange management
  • SQL Server
  • Pester testing
  • Continuous Integration
  • .Net Internals
  • Generally anything that would interest someone using PowerShell or give their career a boost!

Nothing is set in stone yet as we want to get some feedback from potential members. There’s a date booked for the 8th February (Kick Off meeting), but what happens is up to you.

Would you prefer a traditional usergroup with booked speakers given presentations in a formal setting, or something more informal like a roundtable/whiteboard sessions? Or perhaps half and half?

We’d love to know what you’d like to see or learn. So please either drop a comment below or sign up for the Kick Off Meeting and give us feedback on Meetup.

Complex SQL Server restore scenarios with the dbatools Restore-DbaDatabase pipeline

No matter how hard the dbatools team try, there’s always someone who wants to do things we’d never thought of. This is one of the great things about getting feedback direct from a great community. Unfortunately a lot of these ideas are either too niche to implement, or would require a lot of complex code for a single use case.

As part of the Restore-DbaDatabase stack rewrite, I wanted to make it easier for users to get their hands dirty within the restore stack. Not necessarily by diving into the core code and the world of GitHub pull requests, but by manipulating the data flowing through the pipeline using standard PowerShell techniques, all the while letting our code do the heavy lifting.

So, below the fold we’ll be looking at some examples of how you can start going to town with your restores.


Deep copy arrays in PowerShell

Just a quick post for something I had to deal with for another post I’m writing. I wanted to do some work where I’d reuse a base array over a number of passes in a loop, and didn’t want any change made impacting on later iterations. If you’ve tried copying an array containing arrays or other objects in PowerShell in the usual manner then you’ll have come across this problem:

Copying an array of arrays fails in PowerShell

As is common in .Net, the copy is ‘Copy by Reference’ so you don’t get a nice shiny new independent array to play with. All that’s copied is the references to the original’s place in memory. Therefore any changes to either object affects both, as both variable names are looking at the same piece of memory. This is nice and efficient in terms of storage and speed of copying, but not great for my purposes.
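
Since the screenshot doesn’t reproduce here, this is a minimal sketch of that behaviour (the variable names are just for illustration):

$arr1 = @( @(1, 2, 3), @(4, 5, 6) )
$arr2 = $arr1.Clone()   # Clone() only copies references to the inner arrays

$arr2[0][0] = 99
$arr1[0][0]             # also 99 - both variables share the same inner array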

There are various workarounds kicking around if you’re using simple arrays, but they tend to break down when you’ve got arrays that contain arrays or other PowerShell objects. My method for copying them is a little down and dirty, but it works 95% of the time for what I want. The trick is to serialize the object, and then deserialize it into the new one:

Deep Copy Powershell array using serialization

And, voila we have the outcome I wanted. Just to make the line of code easier to read, here it is:

$arr3 = [Management.Automation.PSSerializer]::DeSerialize([Management.Automation.PSSerializer]::Serialize($arr2))

Now, I mentioned up front that this works ~95% of the time. The times it doesn’t work for me are when the underlying object type doesn’t serialize nicely. The most common one I come across is BigInt. This ‘deserializes’ back in as a non-integer type and then won’t play nicely when compared to ‘real’ BigInt values, so make sure to check you have the values you think you should have.
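
As a hedged illustration of that caveat (the exact type you get back may vary by PowerShell version), it’s worth checking the type after the round trip:

$original = [bigint]::Parse('12345678901234567890')
$copy = [Management.Automation.PSSerializer]::Deserialize([Management.Automation.PSSerializer]::Serialize($original))

$copy.GetType().FullName   # check whether this is still System.Numerics.BigInteger
$copy -eq $original        # comparisons may not behave as you expect after the round trip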

Debugging the new dbatools Restore-DbaDatabase pipeline

A new version of dbatools Restore-DbaDatabase command was released into the wild this week. One of the main aims of this release was to make it easier to debug failures in the restore process, and to drag information out of the pipeline easily (and anonymously) so we can increase our Pestering of the module with Unit and Integration tests.

So I’d like to share some of the features I’ve put in so you can take part.

The biggest change is that Restore-DbaDatabase is now a wrapper around 5 public functions. The 5 functions are:

  • Get-DbaBackupInformation
  • Select-DbaBackupInformation
  • Format-DbaBackupInformation
  • Test-DbaBackupInformation
  • Invoke-DbaAdvancedRestore

These can be used individually for advanced restore scenarios; I’ll go through some examples in a later post.

For now it’s enough to know that Restore-DbaDatabase is a wrapper around this pipeline:

Get-DbaBackupInformation | Select-DbaBackupInformation | Format-DbaBackupInformation | Test-DbaBackupInformation | Invoke-DbaAdvancedRestore

and its other function is passing parameters into these sub-functions as needed.

With previous versions of Restore-DbaDatabase you were restricted to throwing data into one end and seeing what came out of the other, with some insight provided by Verbose messages. Now things can be stepped through, and data extracted as needed, in a format that plugs straight into our testing functions.

Get-DbaBackupInformation

This is the function that gets all of the information about backup files. It scans the given paths, and uses Read-DbaBackupHeader to extract the information from them. This is stored in a dbatools BackupHistory object (this is the same as the output from Get-DbaBackupHistory, so we are standardising on a format for Backup information to be passed between functions).

So this would be a good place to check that you’ve got the files you think you should have, and it’s also the first place we’d look if you reported a break in the LSN chain.

To get the output from the pipeline at this point we use the GetBackupInformation parameter:

Restore-DbaDatabase [Usual Parameters] -GetBackupInformation gbi

This will create a globally scoped variable $gbi containing the output from Get-DbaBackupInformation. Note that when passing the name to Restore-DbaDatabase you do not need to specify the $.

If you want to stop execution at this point, then use the -StopAfterGetBackupInformation switch. This will stop Restore-DbaDatabase from going any further.
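
Putting those two parameters together, a hedged example looks like this (the instance name and backup path are placeholders, and the selected properties assume the usual BackupHistory fields):

Restore-DbaDatabase -SqlInstance server1 -Path '\\backupshare\sql' -GetBackupInformation gbi -StopAfterGetBackupInformation

$gbi | Select-Object Database, Type, Start, Path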

This is also a good way of saving time on future runs, as the BackupHistory object can be passed straight in, saving the overhead of reading all the file headers again:

$gbi | Restore-DbaDatabase [Usual Parameters] -TrustDbBackupHistory

Select-DbaBackupInformation

Here we filter down the output from Get-DbaBackupInformation to restore to the point in time requested, or to the latest point we can. This means we find:
– the last full backup before the point in time
– the latest differential backup between that full backup and the point in time
– and then all the transaction log backups needed to get us to the requested time
This is done for every database found in the BackupHistory object.

Here is where we’d begin looking for issues if you had a ‘good’ LSN chain from Get-DbaBackupInformation and then it broke.

To get this data you use the SelectBackupInformation parameter, passing in the name of the variable you want to store the data in (without the $ as per GetBackupInformation above)

There is also a corresponding StopAfterSelectBackupInformation switch to halt processing at this point. We stop processing at the first stop in the pipeline, so specifying multiple StopAfter* switches won’t have any extra effect.

Format-DbaBackupInformation

This function performs the transforms on the BackupHistory object per the parameters passed in. This includes renaming databases, and moving and renaming files. For everything we touch we add an extra property prefixed with Original to the BackupHistory object. For example, the original name of the database will be in OriginalDatabase, and the target name will be in Database.

So this is a good spot to test why transforms aren’t working as expected.

To get data out at this pipeline stage use the FormatBackupInformation parameter with a variable name. As before, it has an accompanying StopAfterFormatBackupInformation switch to halt things there.
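
A hedged way to sanity check a transform (fbi is just an example variable name; the server, path and data directory are placeholders) is to compare the original and target values side by side:

Restore-DbaDatabase -SqlInstance server1 -Path '\\backupshare\sql' -DestinationDataDirectory 'D:\Data' -FormatBackupInformation fbi -StopAfterFormatBackupInformation

$fbi | Select-Object OriginalDatabase, Database, OriginalFullName, FullName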

Test-DbaBackupInformation

Before passing the BackupHistory object off to be restored we do some checks to make sure everything is OK. The following checks are made:

  • LSN chain complete
  • Does a destination file already exist? If it’s owned by a different database then we fail
  • Does a destination file already exist? If it’s owned by the database being restored, has WithReplace been specified
  • Can SQL Server see and write to all the destination folders
  • Can SQL Server create any destination folders missing
  • Can SQL Server see all the backup files

If a database passes all these checks then its backup history is marked as restorable, by the IsVerified property being set to $True.

To get the data stream out at this point use the TestBackupInformation parameter.
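
For example, to see which databases failed verification you could capture the tested BackupHistory and filter on IsVerified (tbi is just an example variable name, and I’m assuming a StopAfterTestBackupInformation switch exists alongside the others):

Restore-DbaDatabase -SqlInstance server1 -Path '\\backupshare\sql' -TestBackupInformation tbi -StopAfterTestBackupInformation

$tbi | Where-Object IsVerified -ne $true | Select-Object Database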

General Errors with restores

Once we’re past these stages, then our error reporting is at the mercy of the SMO Restore class. This doesn’t always provide an obvious cause straight away. Usually the main error can be found with:

$error[1] | Select-Object *

We like to think we capture most restore failure scenarios nicely, but if you find something we don’t then please let us know, either on Slack or by raising a GitHub issue.

As usual, the dbatools terminating error will be in $error[0].

Providing the information for testing or debugging.

If you’re running in to problems then the dbatools team may ask you to provide the output from one of these stages so we can debug it, or incorporate the information into our tests.

Of course you won’t want to share confidential information with us, so we would recommend anonymising your data. My normal way of doing this is to use these 2 stubbing functions:
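
The original embedded functions aren’t reproduced here, so the following is only a rough sketch of the idea: replace identifying values with a deterministic hash so the same input always produces the same output. The function names and the property list are illustrative, not the actual dbatools stubs.

function Protect-ExampleValue {
    # Deterministic one-way masking of a single value
    param([string]$Value)
    $md5 = [System.Security.Cryptography.MD5]::Create()
    $hash = $md5.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($Value))
    ([System.BitConverter]::ToString($hash)) -replace '-', ''
}

function Invoke-ExampleAnonymise {
    # Mask identifying properties on each record in a BackupHistory object
    param([object[]]$BackupHistory)
    $mask = 'ComputerName', 'InstanceName', 'SqlInstance', 'Database', 'UserName'
    foreach ($record in $BackupHistory) {
        foreach ($property in $mask) {
            if ($null -ne $record.$property) {
                $record.$property = Protect-ExampleValue -Value "$($record.$property)"
            }
        }
    }
    $BackupHistory
}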

So if we’ve asked for the Select-DbaBackupInformation the process would be:

Restore-DbaDatabase [Normal parameters] -SelectBackupInformation sbi -StopAfterSelectBackupInformation
Filter-DbaToolsHelpRequest $sbi
$sbi | Export-CliXml -Depth 10 -Path c:\some\path\file.xml

And then upload the resulting xml file.

This method will anonymise the values in ComputerName, InstanceName, SqlInstance, Database, UserName, Path, FullName, FileList, OriginalDatabase, OriginalFileList, OriginalFullName and ReplaceDatabaseName. But will produce the same output for the same input, so we can work with multiple database sets at once.

I hope that’s been of some help. As always if you’ve a question then drop a comment below, ping me on twitter (@napalmgram) or raise an issue with dbatools on Slack or Github
