Stuart Moore

Musings of a Data professional

Category: powershell Page 3 of 5

Using the SQLAutoRestores PowerShell Module

The SQLAutoRestores module as currently stands (04/08/2016, v0.9.0.0) is very much based on my personal usage patterns, but the plan is to make it as flexible as possible. So if you see something in this workflow that you’d like to change, or need something to cope with your SQL Server backup procedures, then please leave a comment below, or drop in a feature request at GitHub – https://github.com/Stuart-Moore/SQLAutoRestores/issues

Current practice at work is that all SQL Server databases are backed up, either to a central share, or locally and then transferred to the central share. So we end up with a backup folder structure like:

Root 
 |
 +-Server1
 |    +-DB1
 |    +-DB2
 |
 +-Server2
      +-DB1
      +-DB2

I want to randomly pick a set of backup files to restore. So the first thing I want to do is to get all the ‘Bottom folders’ of this directory tree:

$folders = Get-BottomFolders \\backupserver\share

And if I’ve split the backups across multiple shares I can add more:

$folders += Get-BottomFolders \\server2\backups$\

So now I want to pick a folder at random from those. But not everyone cleans up after themselves, so there could be folders that don’t contain anything we’re interested in. There’s a Test-DBBackupsExist function that makes sure the folder holds at least one backup file, so we loop until we get something useful:

$RestoreFolder = Get-RandomElement $folders
while (!(Test-DBBackupsExist $RestoreFolder)){
    $RestoreFolder = Get-RandomElement $folders
}
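
For reference, those two helpers boil down to something like this (a sketch only; the module’s own implementations may differ, and the .bak/.trn extensions are my assumption):

function Get-RandomElement ($Array)
{
    # Get-Random does the heavy lifting
    return ($Array | Get-Random)
}

function Test-DBBackupsExist ($Folder)
{
    # Assumes backups use the common .bak/.trn extensions
    return ((Get-ChildItem -Path $Folder -File | Where-Object {$_.Extension -in '.bak', '.trn'}).Count -gt 0)
}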

Great, we’ve got a folder with some SQL Server backups in it. Now we need to see what’s in the backup files. So we do a scan of the file headers, which needs a SQL Server, so we build a SQL Server connection as well:

$SQLconnection = New-SQLConnection 'server1\instance2'
$BackupObjects = Get-DBBackupObject -InputPath $RestoreFolder -ServerInstance $SQLconnection

This returns a simple PowerShell object containing the header highlights from each file in the folder.

Note: at this point we’ve not checked that we’ve got a complete restorable set of files. For all we know, we got 30 transaction log files and no full backup to start from!

I prefer to restore databases to random points in time rather than just the latest available. This gives a wider range of options to compare, and might just mean that you discover your SAN is corrupting the 22:15 t-log backup.

The next function checks we’ve got at least one ‘anchoring’ full backup, picks the earliest point in time that backup covers, then gets the latest point in time covered by the backup files, and returns a random point between those 2 extremes. This will be our Recovery Point Objective:

$TimeToRestore = Get-PointInTime -BackupsObject $BackupObjects
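
Under the hood that works out to something along these lines (a sketch only; BackupType, BackupStartDate and BackupFinishDate are my guesses at the header object’s property names):

# Anchor on the earliest full backup, then pick a random point between
# there and the latest time the backup files cover
$fulls = $BackupObjects | Where-Object {$_.BackupType -eq 'Database'}
$start = ($fulls | Sort-Object BackupStartDate | Select-Object -First 1).BackupStartDate
$end = ($BackupObjects | Sort-Object BackupFinishDate -Descending | Select-Object -First 1).BackupFinishDate
$range = ($end - $start).TotalSeconds
$TimeToRestore = $start.AddSeconds((Get-Random -Minimum 0 -Maximum $range))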

We then filter the backup files down to just those needed to hit this point in time:

$Objective = Get-RestoreSet -BackupsObject $BackupObjects -TargetTime $TimeToRestore

Or if you did just want the latest point then you can:

$Objective = Get-RestoreSet -BackupsObject $BackupObjects -Latest

Now we deal with moving the restored database files to a different location:

$Objective = Get-FileRestoreMove -BackupsObject $Objective -DestinationPath e:\some\path

And now we run some tests before the ‘expensive’, time-consuming restore itself. First off, we check we’re not about to clobber another database:

Test-DatabaseExists -RestoreSQLServer $SQLconnection -DatabaseName $Objective[0].DatabaseName
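
Assuming $SQLconnection wraps an SMO Server object (my assumption), that check is roughly:

# Bail out rather than clobber an existing database of the same name
if ($SQLconnection.Databases[$Objective[0].DatabaseName]) {
    throw "$($Objective[0].DatabaseName) already exists on the restore instance"
}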

Then we check we have enough space to restore. This includes checking for any file growth during the restore (if your transaction log grows to a stupid size during the day, then it’ll be grown to that size during the restore and sized down later on, so you need to accommodate the largest amount of space your database occupies, not just its final size):

Test-RestoreSpace -BackupsObject $Objective -RestoreSQLServer $SQLconnection -RestorePath e:\some\Path

And then we test the difference between the SQL Server version of the instance that did the backup and the SQL Server instance we’re asking to perform the restore. Microsoft state that restoring across more than 2 major versions isn’t allowed, so we fail it in this case (non SQL Server backups aren’t supported (yet!)):

Test-DBRestoreVersion -BackupsObject $Objective -RestoreSQLServer $SQLconnection
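
The rule looks roughly like this sketch (SoftwareVersionMajor is my guess at the header property; an SMO Server object exposes VersionMajor):

# Fail if the restoring instance is older than the backup,
# or more than 2 major versions newer
$backupMajor = ($Objective | Sort-Object SoftwareVersionMajor -Descending | Select-Object -First 1).SoftwareVersionMajor
$restoreMajor = $SQLconnection.VersionMajor
if (($restoreMajor -lt $backupMajor) -or (($restoreMajor - $backupMajor) -gt 2)) {
    throw "Version gap between backup and restore instance is too large"
}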

And finally we restore the database:

Restore-Database -BackupsObject $Objective -RestoreSQLServer $SQLconnection -RestoreTime $TimeToRestore

Now we want to check the restore is fine. It is possible to restore a corrupt database with no errors! (Demo and example here):

Test-Database -DatabaseName $Objective.Databasename -RestoreSQLServer $SQLconnection
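
At its simplest such a check is a DBCC CHECKDB, along these lines (a sketch using Invoke-Sqlcmd rather than the module’s own plumbing):

# With NO_INFOMSGS, any output from CHECKDB means trouble
Invoke-Sqlcmd -ServerInstance 'server1\instance2' -Query "DBCC CHECKDB ([$($Objective[0].DatabaseName)]) WITH NO_INFOMSGS"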

And then clean up after ourselves:

Remove-Database -DatabaseName $Objective.Databasename -RestoreSQLServer $SQLconnection

Rinse, repeat ad infinitum. I have this process running 24×7 on a dedicated restore instance. On average I restore 80 databases a day and cover every production database in a 2 week window (it’s random so not guaranteed, but I have a priority list that skews it!).

Currently I collect my statistics with some simple Send-MailMessage usage, but I want something more robust in this module, so that’s on the list of things to get fixed before we go to 1.0.0.0 properly.
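
Think along these lines (the addresses, SMTP server and $elapsed timer are all illustrative):

Send-MailMessage -To 'dba@contoso.com' -From 'restores@contoso.com' `
    -Subject "Restore test: $($Objective[0].DatabaseName)" `
    -Body "Restored to $TimeToRestore in $($elapsed.TotalMinutes) minutes" `
    -SmtpServer 'smtp.contoso.com'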

Hopefully that’s given some ideas on how to use the module. I’d love to hear any ideas on improvements or how you’d want to use it in your environment. Comment here, drop me an email, or ping me on twitter (accounts all linked top right).

PowerShell module to automate restoring SQL Server backups

I’ve long been a proponent of automatically restoring as many of your SQL Server backups as you can. There are many benefits of doing this, amongst them:

  • Checking your backups are working; I’ve seen people with backups they can’t restore
  • Knowing how long it will take to recover; your boss isn’t going to be happy if they’re still waiting 3 hours after you said the business would be back up and running
  • Having logs of having restored them; auditors love this sort of thing
  • Peace of mind knowing that your restores are going to work when the brown stuff hits the reciprocating draught machine

I’ve been using a collection of scripts and functions based on my previous blogs and presentations on the subjects. But they’ve never really been in a state that I felt comfortable sharing with the world.

Until now:

Read More

New-AzureRmAutomationCredential returning “Input string was not in a correct format”

Having banged my head against this misleading error message, I thought I’d post this.

It’s nothing to do with any of the parameters you’re passing in; believe me, I checked.

It’s actually complaining that the Credential already exists! I was doing a redeployment and the credential hadn’t been removed.

And in other useful annoyances, Find-AzureRmResource doesn’t work for Automation Credentials. So now my deployment scripts looks like this:

# If the credential is left over from a previous deployment, remove it first
if (Get-AzureRmAutomationCredential -ResourceGroupName $rgname -Name "meadmin" -AutomationAccountName $AutoName) {
    Remove-AzureRmAutomationCredential -ResourceGroupName $rgname -Name "meadmin" -AutomationAccountName $AutoName
}

New-AzureRmAutomationCredential -AutomationAccountName $autoname -ResourceGroupName $rgname -Name "meadmin" -Value $acred -Verbose

The term ‘Test-AzureResource’ is not recognized as the name of a cmdlet, function, script file, or operable program

Microsoft have followed through with the deprecation warning on Test-AzureResource and it’s finally been removed. So there are now a lot of PowerShell scripts out there that no longer work.

A quick alternative that involves nothing more than a search and replace (at least in my scripts so far!) is Find-AzureRmResource; the behaviour is the same, returning $FALSE if the Azure resource doesn’t exist and $TRUE if it does. So this snippet:

    if (!(Test-AzureResource `
        -ResourceName $vmName `
        -ResourceType "Microsoft.Compute/virtualMachines" `
        -ResourceGroupName $rgName)) {

can simply be replaced with:

    if (!(Find-AzureRmResource `
        -ResourceName $vmName `
        -ResourceType "Microsoft.Compute/virtualMachines" `
        -ResourceGroupName $rgName)) {

and everything’s good.

Using PowerShell to push registry fixes around multiple machines

We needed to push this registry fix (Microsoft KB 911272) around a bunch of web servers in a hurry. Luckily, we had 2 good design decisions on our side:

  • Enabling PS remoting
  • A sane machine naming convention of the form server-1, server-2

So we could push the change around with a small bit of PowerShell:

$ccred = Get-Credential

$ScriptBlock = {
    # Add FCNMode under both the 64-bit and 32-bit (Wow6432Node) ASP.NET keys
    New-ItemProperty "HKLM:\Software\Microsoft\ASP.NET" -Name "FCNMode" -Value 1 -PropertyType "DWord"
    New-ItemProperty "HKLM:\SOFTWARE\Wow6432Node\Microsoft\ASP.NET" -Name "FCNMode" -Value 1 -PropertyType "DWord"
    # Restart IIS so the new values take effect
    iisreset
}

$i = 1
while ($i -lt 9) {
    $cname = "app-prod-web-$i"
    Invoke-Command -ComputerName $cname -ScriptBlock $ScriptBlock -Credential $ccred
    $i++
}

We grab some credentials for the remote boxes (remember, separate accounts for desktop and server admin duties is a GOOD idea).

As we’re passing multiple commands/lines to Invoke-Command, using a scriptblock is the easiest way, so we create one of those. We quickly use New-ItemProperty to add the required DWORD values under the right keys, and then an iisreset to make sure they take effect.

A nice quick and dirty while loop allows us to push the scriptblock to the 8 servers in this set, and we’re done. A quick change of the $cname variable and I can push it to the next set as well. And another edit and it’s pushed into the staging environments as well.

Finding the bottom folders of a file tree using PowerShell

There’s a simple way of getting PowerShell to tell you the last folder in a known path:

$path = '\\server\share\folder1\folder2\folder3'
$folder = Split-Path $path -Leaf

and that’s all you need.

Now, how do you go about doing that if you don’t know the path ahead of time? That is, you have a large collection of directories, and all you want are the folders that make up the bottom levels of the branches. In the graphic below it would be all the folders underlined.

[Image: example directory tree with the bottom-level folders underlined]

As an example, current $EMPLOYER holds backups from all SQL databases on a share, with each database’s backups being in a folder. Due to various DBAs over the years the depth of the tree to the actual backups isn’t fixed. So I need to find the “bottom” folder so I can process the backup files in there.

Well, that’s a little trickier. So let’s have a look at a couple of ways of doing that:
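
As a taster, here’s a minimal sketch of one approach, on the basis that a bottom folder is simply one with no subfolders of its own (the -Directory switch needs PowerShell 3 or later):

function Get-BottomFolders ($Path)
{
    # A folder with no child folders is a leaf of the tree
    Get-ChildItem -Path $Path -Recurse -Directory |
        Where-Object {(Get-ChildItem -Path $_.FullName -Directory).Count -eq 0} |
        Select-Object -ExpandProperty FullName
}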

Read More

Bulk adding Active Directory service accounts with PowerShell

I’ve recently had to create a lot of new service accounts to support some new infrastructure projects. As these are supporting a lot of web services, there’s been a large number of service accounts that needed creating to own all the app pools.

To make life simpler when this moves from implementation to support, all of these accounts want to be set up the same way. And we want secure passwords as well.

And then those accounts also need to be setup for each of our prod, stage, dev, test environments.

So: a dull, repetitive task where details must be replicated exactly. This sounds like the perfect job to automate with some PowerShell.

[Reflection.Assembly]::LoadFile('C:\Windows\Microsoft.NET\Framework\v4.0.30319\system.web.dll') | Out-Null
Import-Module ActiveDirectory

function New-Password ($length = 12)
{
    $p = [System.Web.Security.Membership]::GeneratePassword($length, 2)
    $password = $p | ConvertTo-SecureString -AsPlainText -Force
    $retpass = @{"string" = $p; "secstring" = $password}
    return $retpass
}

$environments = @()

$environments += @{"env" = "prod";
                   "prefix" = "pwa-";
                   "adpath" = "ou=Prod Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso, dc=com";
                   "identgroup" = "name1";
                   "ro-group" = "group1"}

$environments += @{"env" = "test";
                   "prefix" = "twa-";
                   "adpath" = "ou=Test Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso, dc=com";
                   "identgroup" = "name2";
                   "ro-group" = "group2"}

$environments += @{"env" = "dev";
                   "prefix" = "dwa-";
                   "adpath" = "ou=Dev Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso, dc=com";
                   "identgroup" = "name3";
                   "ro-group" = "group3"}

$Users = Import-Csv -Path "c:\temp\users.csv"

$output = @()

foreach ($environment in $environments){
    foreach ($user in $Users){
        $password = New-Password 15
        $username = $environment.prefix + $user.user.Trim()
        New-ADUser $username -Path $environment.adpath -AccountPassword $password.secstring -DisplayName $username -PasswordNeverExpires $true -ChangePasswordAtLogon $false -CannotChangePassword $true -Enabled $true
        Add-ADGroupMember $environment.identgroup -Members $username
        Add-ADGroupMember $environment.'ro-group' -Members $username
        $output += [pscustomobject]@{"username" = $username; "password" = $password.string; "Environment" = $environment.env}
    }
}

First off, we’ll want to be creating secure, complex passwords. Rather than rewrite everything from scratch, we’ll use the built-in function from the .NET System.Web assembly, which we load in the first line.

Then we’ll wrap this call in a PowerShell function to make calling things easier:

function New-Password ($length = 12)
{
    $p = [System.Web.Security.Membership]::GeneratePassword($length, 2)
    $password = $p | ConvertTo-SecureString -AsPlainText -Force
    $retpass = @{"string" = $p; "secstring" = $password}
    return $retpass
}

We pass in a length parameter for later use, but set a default to guarantee at least 12 characters. Then we create a new password of the required length; the 2nd parameter to GeneratePassword specifies the minimum number of special characters (@!# etc.) that the password has to contain. GeneratePassword returns the password as a string, but we need it to be a SecureString to pass to an Active Directory cmdlet later on, so we convert it. Then we return the password both as a string and a SecureString via a hash table.
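
Calling it looks like this:

$pass = New-Password 15
$pass.string      # plain text copy, for recording somewhere safe
$pass.secstring   # SecureString copy, ready for -AccountPassword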

Now we set up the environments we’ll be creating our accounts for. I like to do this using arrays and hashtables. We have an array $environments which holds the details of all of the environments we want to create. For my needs, each environment is defined as follows:

$environments += @{"env" = "dev";
                   "prefix" = "dwa-";
                   "adpath" = "ou=Dev Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso, dc=com";
                   "identgroup" = "name3";
                   "ro-group" = "group3"}

The values in the hash table are:

  • The environment Name
  • The prefix for the usernames for this environment
  • The path in Active Directory for the OU the user account should be created in
  • A group for all the accounts in this environment to belong to
  • Another group for all the accounts in this environment to belong to

This is based on the AD setup I’m currently working with; your needs may be different. If you’re not the Active Directory admin, then please go talk to them about how they’d like things configured.

Now that we’ve got everything setup, let’s create some accounts!

I load the usernames from a csv file that looks like this:

user
Account1
web-user
web-user2
app-user
app-user2

The first line stays as user, so that we can use it to refer to the column by name later on.

We create an array $output to hold our output.

Now we loop through each of our environments, and then for each environment we loop through our list of users.

For each user:

  • We create a new secure password using our previously defined function
  • Build a full username using the supplied username and the appropriate prefix for the environment
  • Create the user in Active Directory based on our environment
  • Add them to any groups we want them in
  • Add the newly created user’s details to our output object

And we’re done. In the great tradition of PowerShell I’ve kept the user details in an object $output so they can easily be passed along to another script, command or whatever you want.

If you want to just dump them to screen, then you can use:

$output | Format-Table 

Or redirect the object to your secure password storage utility of choice.
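
For example, a quick way to hand the details over as a file (the path is illustrative, and remember the passwords are in plain text, so treat the file accordingly):

$output | Export-Csv -Path 'c:\temp\new-accounts.csv' -NoTypeInformation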

And another dull repetitive task is handed off to a simple script.

Adding SQL Server jobs using PowerShell

A sudden requirement for a new SQL Server agent job to be put on to every instance in our enterprise was hardly greeted with joy. At least by those who hadn’t been playing with PowerShell.

Just like everything else in SQL Server, we can automate this through SMO. And once we’ve got it working on one SQL Server instance, it’s simple to extend to hundreds.
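
To give a flavour before the full walkthrough, here’s a minimal sketch of creating a job through SMO (the instance, job and step names are all illustrative):

# Load SMO and connect to the instance
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo')
$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'server1\instance1'

# Create the job, give it a T-SQL step, and target the local server
$job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job $server.JobServer, 'DBA - Nightly Check'
$job.Create()
$step = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep $job, 'Run check'
$step.SubSystem = 'TransactSql'
$step.Command = 'EXEC dbo.SomeMaintenanceProc'
$step.Create()
$job.ApplyToTargetServer($server.Name)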

Read More

Get cluster size for all disks and volumes on a Windows machine using PowerShell and WMI

Having been building some new SQL Server boxes recently I wanted to check that the volumes have been formatted with the appropriate block size. (64k as per this document – http://msdn.microsoft.com/en-us/library/dd758814.aspx)

I’m lucky, and have nice storage admins who’ll give me what I ask for, but it’s always better to check rather than assume. It can be a bit tricky if you discover they’re wrong after you’ve put the data on the disks.

Most examples on the interwebs appeared to rely on Get-PSDrive or fsutil. These are great little tools, but don’t play nicely with mount points, and since we use a lot of mount points they were out.

So I came up with this quick snippet to list all volumes, be they drive, mountpoint or unattached:

$wql = "SELECT Label, Blocksize, Name FROM Win32_Volume WHERE FileSystem='NTFS'"
Get-WmiObject -Query $wql -ComputerName '.' | Select-Object Label, Blocksize, Name

And now I’ve got a nice table listing everything owned by the box, with enough information to work out which volume, drive or mountpoint needs a bit of attention:

Label                                      Blocksize Name                               
-----                                      --------- ----                               
SQL-Web-Mnt                                    65536 L:\                                
SQL-Web-Mnt-Prod                               65536 L:\data1\                          
SQL-Acct-Mnt                                   65536 M:\                                
SQL-Acct-Mnt-Prod                              65536 M:\prod\                           
SQL-Acct-Mnt-Rep                               65536 M:\reporting\                      
SQL-Acct-Mnt-Other                             65536 M:\data\       

Out of the 60 volumes I checked across various boxes, only 1 was incorrect. But fixing it now will save some time further down the line.
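
And since Get-WmiObject happily takes a list of computer names, the same check pushes out across a set of servers (names illustrative):

$servers = 'sql-web-1', 'sql-acct-1', 'sql-acct-2'
Get-WmiObject -Query $wql -ComputerName $servers |
    Select-Object PSComputerName, Label, Blocksize, Name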

Using PowerShell to add images to mp3 files

I spent a fair chunk of the Christmas break ripping yet more of my CD collection to mp3s so I can shove them on the home NAS, put the CDs in the loft and reclaim a load of space. Ripping them was a fairly predictably repetitive job of shoving CDs into the drive, waiting for the FreeDB lookup to run (or fail, which meant a bit of typing) and then hitting the go button.

Not the most fun of jobs, but still a lot less work than when I digitised some large parts of my vinyl collection a few years ago. After a couple of days I had a nice collection of mp3s all sat there on a disk ready to be imported into the main collection folder.

Just one job remained: getting the album art together. This is always a fun job of getting the right image. I’ve been using Album Art Downloader for this for a while, as you can set plenty of options to remove most of the effort. But it doesn’t embed the images into the files, which as an iDevice user I want so the pictures transfer across when I sync. This got me thinking: I’ve got a Windows file system with a nice hierarchical folder structure containing files, and I want to perform an action on each of the files in that structure.

Now if only there was some sort of Windows scripting language that was good at this sort of thing…
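
As a hint of what’s coming, a rough sketch using the third-party TagLib# library (the DLL path and music folder are illustrative, and it assumes each album folder holds a folder.jpg from Album Art Downloader):

# Load TagLib#, then walk every mp3 embedding the folder's artwork
[Reflection.Assembly]::LoadFile('C:\tools\taglib-sharp.dll') | Out-Null
Get-ChildItem -Path 'D:\mp3s' -Recurse -Filter *.mp3 | ForEach-Object {
    $art = Join-Path $_.DirectoryName 'folder.jpg'
    if (Test-Path $art) {
        $file = [TagLib.File]::Create($_.FullName)
        $file.Tag.Pictures = @(New-Object TagLib.Picture $art)
        $file.Save()
    }
}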

Read More
