Musings of a Data professional

Stuart Moore

Author: Stuart Moore

Nottingham based SQL Server DBA who dabbles with many other technologies. 15+ years of experience with databases and still keen to learn and explore.

Day 31 of 31 Days of SQL Server Backup and Restore using PowerShell: Rounding Up

So here we are: post 34 in my 31-post series about using PowerShell to perform SQL Server backups and restores. We’ve taken a fairly whistle-stop tour of the main points. There have been a couple of omissions, but hopefully I’ve provided you with plenty of examples you can use in your own situations, or at least some ideas about how to approach automating your restores. I also hope I’ve shown you that automating those restores needn’t be the nightmare you might have been led to believe from reading through some of the dynamic T-SQL out there.

Much of the more advanced use of these techniques comes down to using deeper features of PowerShell rather than SMO features. If you want to learn more about PowerShell then I recommend the following books:

And the following blogs:

I have also been presenting on this topic at UK SQL Server User Groups this year. A set of slides, scripts and demos from one at Cardiff SQL User Group from October 2013 are available for download (cardiff-usergroup-powershell-backups-and-restores-01012013).

I’m also consolidating some of my production functions and routines into a PSBackup module, which I’m hoping to put up on GitHub so it can be worked on as an open project by anyone who wants to contribute. When it’s up I’ll post an announcement.

I hope you’ve enjoyed the posts. Please feel free to leave a comment, drop me a mail or ping me on Twitter if you have any questions.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Scripts and Demos from Cardiff SQL Server User Group presentation, 1st October 2013

All the files from my presentation on using PowerShell for SQL Server Backups and Restores at Cardiff SQL Server User Group on 1st October 2013.

cardiff-usergroup-powershell-backups-and-restores-01012013

Any problems, just drop me a line

Day 30 of 31 Days of SQL Server Backup and Restore using PowerShell: Recording for the Auditors

One of the reasons I became so keen on automating restores was that I’d regularly get requests from various auditors asking for examples of valid restores carried out in the last 3 months, or wanting me to justify my Recovery Time Objectives, or needing to ensure we had adequate DR testing. And there’s always a manager wanting reassurance that we could meet our SLA commitments.

By automating restores of our major production systems and recording the results, whenever a request came in I could quickly dump the information into Excel for them (remember, it’s not Real Information™ unless it’s in Excel).

So what sort of information should be audited about restores? I find the following are a minimum that covers most requests, though be sure to check for any industry/business specifics that may apply in your own case.

  • Time of restore
    • Restores should ideally be attempted at different times throughout the day. This will highlight any potential slowdowns due to other activity on the hardware or network
  • What database was restored
  • Which server was the restore performed on
    • If you have multiple restore/DR servers, it’s important you have a log of testing a restore on all of them, so you don’t find out at a critical point that the one you’re relying on doesn’t work.
  • How long it took
  • How much data was written out
    • This could be the amount of data on disk at the end of the restore, or the total size of all the backup files restored, or both
  • To what size did the database files grow during the restore
    • This may not be the same as the previous metric. This value will also include the empty space within data files, and accommodate any ‘shrinks’ that happened during the period being restored
  • User running the restore
    • Just so you can recreate any permissions issues
  • Original server of backup
  • Location of all backup files restored
  • Output (or lack of) from DBCC
    • If you’re using NO_INFOMSGS you may still want to log the fact that you had no reported errors, just to record that it had been run
  • Output from in house check scripts
  • Log of any notifications sent to DBAs for further investigations
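To make that data easy to hand over, I log each test restore as one record. A minimal sketch of the idea; the CSV path and the property names below are my own invention, not a fixed schema (and Export-Csv -Append needs PowerShell 3.0 or later):

```powershell
# Sketch only: path and property names are hypothetical
$result = [PSCustomObject]@{
    RestoreTime   = Get-Date
    DatabaseName  = "ProdDB1"
    TargetServer  = $env:COMPUTERNAME
    DurationSecs  = 542
    DataWrittenMB = 10240
    RestoredBy    = $env:USERNAME
    DBCCResult    = "No errors reported"
}

# Append to a running log that can be dumped straight into Excel for the auditors
$result | Export-Csv -Path "\\FileServer\Audit\RestoreAudit.csv" -Append -NoTypeInformation
```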

Once you have this information you can start to mine it for your own use as well. You can make sure that all your ‘matched’ hardware is actually working at the same speed, and check that restoring whilst the network is under normal business load won’t add an extra hour to your RTO.

You can also start looking for trends. Are your restores taking a lot longer since the new SAN was put in? Is Server A producing a lot more alerts on checking? Perhaps there’s an underlying hardware problem to be investigated there.

A side bonus of this is that your recovery hardware is being used. Rather than just sitting there waiting for a disaster, you’re actually reading and writing data from the drives. So at 3am during a panic restore you can also be confident that you don’t have a dead spindle or a flaky drive controller in your server.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 29 of 31 Days of SQL Server Backup and Restore using PowerShell: Why I don’t use the checktables SMO method

Two different occasions at SQL Saturday Cambridge 2013 made me realise that I needed to ‘justify’ my use of Invoke-SQLCmd to run DBCC, back on Day 21 of 31 Days of SQL Server Backup and Restore using PowerShell: Verifying a restored database. One was during my session (slides and scripts here), and the other was during Seb Matthews’ session, where he said to view every use of Invoke-SQLCmd as a failure.

The first thing I don’t like about this particular message is what happens when you check a ‘good’ database:

Import-Module sqlps -DisableNameChecking

$sqlsvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server("Server1")
$db = $sqlsvr.Databases.Item("GoodDB")
$db.CheckTables("None")

If you run this, you’ll notice you get nothing back, not a thing. This is because CheckTables runs with NO_INFOMSGS by default, and there’s no way to override it. This might work for some applications, but in some environments you would want informational messages returned and recorded, as proof that all checks were run and passed.

That’s strike 1 for me.

If you’re wondering what the parameter to CheckTables is, it’s the repair level. Accepted values are:

  • None
  • Fast
  • Rebuild
  • AllowDataLoss

These operate in the same way as they do under T-SQL.
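For example, assuming $db is the SMO Database object from above, a repair run would look like this (and, as with DBCC itself, repair levels other than None need the database in single-user mode):

```powershell
# "Fast" here maps to DBCC CHECKDB ... REPAIR_FAST
$db.CheckTables("Fast")
```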

If you’re lucky enough not to have a corrupt database to play with, then Paul Randal has some you can download (Paul Randal, SQLSkills.com, Corrupt SQL Server Databases). Assuming you’ve restored the db as chkDBCC, you can check it like so:

Import-Module sqlps -DisableNameChecking

$sqlsvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server("Server1")
$db = $sqlsvr.Databases.Item("chkDBCC")
$db.CheckTables("None")

This time we do get some output, though unfortunately it’s a generic, not hugely useful error message:

Exception calling "CheckTables" with "1" argument(s): "Check tables failed for Database 'chkdbcc'. "
At line:5 char:1
+ $db.CheckTables("None")
+ ~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : Fai

Strike 2

We can improve on this if we examine the error in a bit more detail, by putting the call in a try/catch block:

Import-Module sqlps -DisableNameChecking

$sqlsvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server("Server1")
$db = $sqlsvr.Databases.Item("chkDBCC")

try{
    $db.CheckTables("None")
}
catch [System.Exception] {
    $err = $_.Exception
    while ( $err.InnerException ){
        $err = $err.InnerException
        Write-Output $err.Message
    }
}

And now we get a more informative response:

Check tables failed for Database 'chkdbcc'.
An exception occurred while executing a Transact-SQL statement or batch.
Check Catalog Msg 3853, State 1: Attribute (object_id=1977058079) of row (object_id=1977058079,column_id=1) in
 sys.columns does not have a matching row (object_id=1977058079) in sys.objects.
Check Catalog Msg 3853, State 1: Attribute (object_id=1977058079) of row (object_id=1977058079,column_id=2) in
 sys.columns does not have a matching row (object_id=1977058079) in sys.objects.
CHECKDB found 0 allocation errors and 2 consistency errors not associated with any single object.
CHECKDB found 0 allocation errors and 2 consistency errors in database 'chkDBCC'.

But can we get the full information we saw returned in our previous examples, like this?

Error       : 8992
Level       : 16
State       : 1
MessageText : Check Catalog Msg 3853, State 1: Attribute (object_id=1977058079) of row
              (object_id=1977058079,column_id=2) in sys.columns does not have a matching row
              (object_id=1977058079) in sys.objects.
RepairLevel :
Status      : 16
DbId        : 23
DbFragId    : 1
ObjectId    : 0
IndexId     : -1
PartitionId : 0
AllocUnitId : 0
RidDbId     : 23
RidPruId    : 0
File        : 0
Page        : 0
Slot        : 0
RefDbId     : 23
RefPruId    : 0
RefFile     : 0
RefPage     : 0
RefSlot     : 0
Allocation  : 1

Well, we can’t. As well as enforcing the NO_INFOMSGS clause, CheckTables doesn’t allow the use of TABLERESULTS.

Which is strike 3, and it’s out of here!

This is all personal opinion, but this is one of the very few DBA tasks where I will always opt to use Invoke-SQLCmd over an SMO object.
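For completeness, here’s a rough sketch of the Invoke-SQLCmd route, which keeps both the informational messages (via the verbose stream) and the TABLERESULTS detail. The server and database names are placeholders:

```powershell
Import-Module sqlps -DisableNameChecking

# WITH TABLERESULTS returns the detail as rows; -Verbose surfaces the
# informational messages that CheckTables swallows
$results = Invoke-SqlCmd -ServerInstance "Server1" `
    -Query "DBCC CHECKDB ('chkDBCC') WITH TABLERESULTS" -Verbose
$results | Select-Object Error, Level, MessageText
```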

Hopefully we might see a better SMO option with the release of SQL Server 2014…

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 28 of 31 Days of SQL Server Backup and Restore using PowerShell: Setting up Database Mirroring

A very practical use for automated SQL Server backups and restores with PowerShell is setting up High Availability solutions. Usually these involve bringing two or more SQL Server instances into sync with each other, the usual solution being backups and restores. Database Mirroring may have been deprecated in SQL Server 2012 in favour of AlwaysOn Availability Groups, but I still see plenty of installs that haven’t migrated yet, so it’s still heavily in use. To set up SQL Server mirroring successfully there are a number of conditions that must be fulfilled.

  • All SQL Server instances should be of the same version. While you can set up mirroring from a lower version to a higher version (i.e. 2008 -> 2008 R2), if you fail over you get a free database upgrade thrown in and can never mirror back.
  • All SQL Server instances should be of the same edition (Enterprise, Standard, etc). You can get away with this if you are NOT running any edition-specific features, but the minute you enable an “Enterprise” feature your database won’t start on any instance whose edition doesn’t support that feature
  • Instances should have the same start ‘Log Sequence Number’ in their active transaction log. This ensures that all the ‘new’ transactions transmitted in the initial sync are working from the same start point. Prior to SQL Server 2008R2 you could just restore a current full backup; now you need to restore a full backup and at least 1 transaction log
  • There should be a valid TCP mirroring endpoint on each instance
  • There should be a valid SQL Login with permissions to access the above Endpoints

All of this we can handle within PowerShell to make our lives nice and easy. The following is going to assume that you have a shared folder which both SQL Server instances have access to. If not, you’ll just need to use the Copy-Item cmdlet to manually copy the backup files across. We’re also going to assume that both instances are running under the same domain account, so there’s no need to grant permissions on the endpoints.

Import-Module SQLPS -DisableNameChecking

#Setup some basic variables
$ServerPrimaryName = "Server1"
$ServerSecondaryName = "Server2"

#Let's do multiple databases
$Databases = ("db1","ProdDB2", "MirrorDB1")

#Both SQL Server instances should have read/write to here
$FileShare = "\\FileServer\DBShare\"

#Get all dbs onto the same log:
foreach($db in $Databases){
    Backup-SQLDatabase -Database $db -ServerInstance $ServerPrimaryName -BackupFile ($FileShare+$db+"_full.bak")
    Backup-SQLDatabase -Database $db -ServerInstance $ServerPrimaryName -BackupFile ($FileShare+$db+"_log.trn") -BackupAction Log
    Restore-SQLDatabase -Database $db -ServerInstance $ServerSecondaryName -BackupFile ($FileShare+$db+"_full.bak") -NoRecovery
    Restore-SQLDatabase -Database $db -ServerInstance $ServerSecondaryName -BackupFile ($FileShare+$db+"_log.trn") -RestoreAction Log -NoRecovery
}

#Now we need to create a TCP Mirroring EndPoint on each Server
$ServerPrimary = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerPrimaryName)
$ServerSecondary = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($ServerSecondaryName)
$EPName = "DBMirror-PS"
$EPPort = 7022

$PrimaryEP = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Endpoint($ServerPrimary, $EPName)
$PrimaryEP.ProtocolType = [Microsoft.SqlServer.Management.Smo.ProtocolType]::Tcp
$PrimaryEP.EndpointType = [Microsoft.SqlServer.Management.Smo.EndpointType]::DatabaseMirroring
$PrimaryEP.Protocol.Tcp.ListenerPort = $EPPort
$PrimaryEP.Payload.DatabaseMirroring.ServerMirroringRole = [Microsoft.SqlServer.Management.Smo.ServerMirroringRole]::Partner
$PrimaryEP.Create()
$PrimaryEP.Start()

$SecondaryEP = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Endpoint($ServerSecondary, $EPName)
$SecondaryEP.ProtocolType = [Microsoft.SqlServer.Management.Smo.ProtocolType]::Tcp
$SecondaryEP.EndpointType = [Microsoft.SqlServer.Management.Smo.EndpointType]::DatabaseMirroring
$SecondaryEP.Protocol.Tcp.ListenerPort = $EPPort
$SecondaryEP.Payload.DatabaseMirroring.ServerMirroringRole = [Microsoft.SqlServer.Management.Smo.ServerMirroringRole]::Partner
$SecondaryEP.Create()
$SecondaryEP.Start()

foreach ($db in $Databases){
    #Point each side at its partner's endpoint, mirror first then principal
    $ServerSecondary.Databases.Item($db).MirroringPartner = "TCP://"+$ServerPrimary.NetName+":"+$EPPort
    $ServerSecondary.Databases.Item($db).Alter()
    $ServerPrimary.Databases.Item($db).MirroringPartner = "TCP://"+$ServerSecondary.NetName+":"+$EPPort
    $ServerPrimary.Databases.Item($db).Alter()
}

As you can see this is pretty simple, and easily reusable whenever you need to set up DB Mirroring. For this example we entered the database names explicitly, but it could easily be modified to mirror every database on an instance with the following changes:

Import-Module SQLPS -DisableNameChecking

#Setup some basic variables
$ServerPrimaryName = "Server1"
$ServerSecondaryName = "Server2"
$ServerPrimary = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerPrimaryName)
$ServerSecondary = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerSecondaryName)

#Load all the databases in an instance
$Databases = $ServerPrimary.databases

#Both SQL Server instances should have read/write to here
$FileShare = "\\FileServer\DBShare\"

#Get all dbs onto the same log:
foreach($db in $Databases){
    Backup-SQLDatabase -Database $db.Name -ServerInstance $ServerPrimaryName -BackupFile ($FileShare+$db.Name+"_full.bak")
    Backup-SQLDatabase -Database $db.Name -ServerInstance $ServerPrimaryName -BackupFile ($FileShare+$db.Name+"_log.trn") -BackupAction Log
    Restore-SQLDatabase -Database $db.Name -ServerInstance $ServerSecondaryName -BackupFile ($FileShare+$db.Name+"_full.bak") -NoRecovery
    Restore-SQLDatabase -Database $db.Name -ServerInstance $ServerSecondaryName -BackupFile ($FileShare+$db.Name+"_log.trn") -RestoreAction Log -NoRecovery
}

#Now we need to create a TCP Mirroring EndPoint on each Server

$EPName = "DBMirror-PS"
$EPPort = 7022

$PrimaryEP = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Endpoint($ServerPrimary, $EPName)
$PrimaryEP.ProtocolType = [Microsoft.SqlServer.Management.Smo.ProtocolType]::Tcp
$PrimaryEP.EndpointType = [Microsoft.SqlServer.Management.Smo.EndpointType]::DatabaseMirroring
$PrimaryEP.Protocol.Tcp.ListenerPort = $EPPort
$PrimaryEP.Payload.DatabaseMirroring.ServerMirroringRole = [Microsoft.SqlServer.Management.Smo.ServerMirroringRole]::Partner
$PrimaryEP.Create()
$PrimaryEP.Start()

$SecondaryEP = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Endpoint($ServerSecondary, $EPName)
$SecondaryEP.ProtocolType = [Microsoft.SqlServer.Management.Smo.ProtocolType]::Tcp
$SecondaryEP.EndpointType = [Microsoft.SqlServer.Management.Smo.EndpointType]::DatabaseMirroring
$SecondaryEP.Protocol.Tcp.ListenerPort = $EPPort
$SecondaryEP.Payload.DatabaseMirroring.ServerMirroringRole = [Microsoft.SqlServer.Management.Smo.ServerMirroringRole]::Partner
$SecondaryEP.Create()
$SecondaryEP.Start()

foreach ($db in $Databases){
    $db.MirroringPartner = "TCP://"+$ServerSecondary.NetName+":"+$EPPort
    $db.Alter()
    $ServerSecondary.Databases.Item($db.Name).MirroringPartner = "TCP://"+$ServerPrimary.NetName+":"+$EPPort
    $ServerSecondary.Databases.Item($db.Name).Alter()
}

And this certainly speeds things up if you need to set up a large number of mirrored dbs in one fell swoop.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 27 of 31 Days of SQL Server Backup and Restore using PowerShell: Why the Long version?

You might have noticed over the last 26 days that I seem to prefer using the long form of PowerShell backups over the more concise cmdlet versions.

There’s no real reason for this, as they’re all building up the same T-SQL BACKUP statements at the end of the day. However, I find that when automating things, building up a backup or restore one option at a time, like:

$Restore.Replace=$True
$Restore.NoRecovery = $False

gives me very easy control of what gets passed in to my backup or restore:

if ($x -eq y){
    $Restore.Replace=$True
}else{
    $Restore.Replace=$False
}

and so on for all the parameters you might want to add.

But with the newer Restore-SQLDatabase cmdlet you can’t use:

if ($x -eq y){
    $RecoveryOption = $True
}else{
    $RecoveryOption = $False
}
Restore-SQLDatabase -Database $DBName -BackupFile $backupfile -NoRecovery $RecoveryOption

As -NoRecovery is a switch, not a parameter, just specifying it turns the feature on; there is no actual value to pass through to it. This can be a bit limiting when you want complete control within a flexible set of scripts.
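One workaround I sometimes use (a sketch, not a fix for the switch itself) is splatting: build the parameters up in a hashtable, only adding the switch when it’s wanted, then pass the lot with @:

```powershell
$params = @{
    Database   = $DBName
    BackupFile = $backupfile
}

# Only add the switch to the hashtable when we actually want it on
if ($x -eq $y){
    $params['NoRecovery'] = $true
}

Restore-SqlDatabase @params
```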

I find the new cmdlets are much easier when you just want to quickly back something up from the command line. Today I used this two-liner to quickly back up all the databases on a dev/test server before we tried something destructive:

cd SQLSERVER:\SQL\DEVBOX1\DEFAULT\DATABASES
ls|%{$_|Backup-SQLDatabase}

Much simpler and quicker to type than anything else I can think of!

(Just in case you can’t read what is a great example of bad code, I’m making use of a couple of shortcuts in PowerShell:
ls is a system alias for Get-ChildItem, which coming from a Unix background I tend to use naturally, and % is an alias for ForEach-Object, which steps through the objects in a collection, loading $_ on each iteration. Plus PowerShell doesn’t mind about missing spaces if things aren’t ambiguous. This type of code is great for one-offs when you just want to do something quickly, but not something you ever want to leave in proper scripts or functions, and it’s not fun to discover when debugging, as it probably means shortcuts were also taken elsewhere!)
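Written out in full, those two lines expand to:

```powershell
Set-Location SQLSERVER:\SQL\DEVBOX1\DEFAULT\DATABASES
Get-ChildItem | ForEach-Object { $_ | Backup-SqlDatabase }
```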

So to answer the original question, it’s just down to a personal preference, and using the right tool at the right time.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 26 of 31 Days of SQL Server Backup and Restore using PowerShell: Restoring SQL Server Analysis Services databases

Yesterday we looked at backing up SQL Server Analysis Services databases, and today we’ll look at restoring them.

Just as with backups, prior to SQL Server 2012 SQLPS restores had to be done a little differently. Unlike Backup, which is a method of a Database object, Restore is a method of the Server class. This makes sense: if you’re restoring a database, it won’t necessarily exist for you to create an object from. In its simplest form you just need to provide the location of the backup file:

Import-Module SQLPS -DisableNameChecking

$SSASInstance = "SSASServer1\Instance1"
$BackupFile = "c:\backups\db1.abf"

$SSASServer = New-Object Microsoft.AnalysisServices.Server
$SSASServer.Connect($SSASInstance)

$SSASServer.Restore($BackupFile)

However, the Restore method can take more parameters to allow for more flexibility, in much the same way as we saw with the Backup method yesterday. For Restore these are:

  • BackupFile (String): Path to the backup file
  • DatabaseName (String): Name to restore the database as (default is the name it was backed up from)
  • OverWrite (Boolean): Whether to overwrite an existing db, default is False
  • RestoreLocation (an array of RestoreLocations): For restoring remote dbs
  • RestoreSecurity (RestoreSecurity enum): What to do with security roles after the restore (Copy, Ignore, Skip)
  • Password (String): Password to restore a password-protected backup
  • StorageLocations (String): A different storage location for the restored db

These can also be specified by creating a RestoreInfo object and passing that to the Restore method. Just as with the Backup method yesterday, to supply a value you must provide all the preceding values as well, even if they’re just empty strings or NULL.
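So, as a sketch using the parameter order above, restoring the backup under a new name while overwriting any existing copy would look something like this (the new database name is made up):

```powershell
# BackupFile, DatabaseName, OverWrite
$SSASServer.Restore($BackupFile, "db1_Copy", $true)
```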

With the release of the SQL Server 2012 PowerShell provider, MS also released a Restore-ASDatabase cmdlet. This can make restores much simpler to automate:

Import-Module SQLPS -DisableNameChecking

$SSASInstance = "SSASServer1\Instance1"
$BackupFiles = Get-ChildItem "c:\SSASbackups\"

foreach ($file in $BackupFiles){
    Restore-ASDatabase -Server $SSASInstance -RestoreFile $file.FullName -AllowOverwrite
}

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 25 of 31 Days of SQL Server Backup and Restore using PowerShell: Backing up SQL Server Analysis Services databases

So far in this series we’ve only looked at SQL Server database backups, but PowerShell can also be used to backup and restore SQL Server Analysis Server objects.

Prior to the release of the latest version of SQLPS with SQL Server 2012, SSAS backups were achieved via the Backup method of an Analysis Services Database object:

Import-Module SQLPS -DisableNameChecking

$SSASInstance = "SSASServer1\Instance1"
$DBName = "db1"

$BackupFile = $DBName+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".abf"

$SSASServer = New-Object Microsoft.AnalysisServices.Server
$SSASServer.Connect($SSASInstance)

$DB = $SSASServer.databases[$DBName]
$DB.backup($BackupFile)

Note that we don’t have to build a backup object, or any devices, so this is all a lot simpler than a Data Platform backup. However, there is less flexibility. Some other options are available, as the Backup method is ‘overloaded’ (it has a number of different implementations). All the possible options are listed here:

  • File (String): Filename to back up to
  • allowOverwrite (Boolean): Whether the backup can overwrite an existing file, default is no
  • backupRemotePartitions (Boolean): Whether remote partitions should be backed up, default is no
  • RemoteLocations (an array of Analysis Services backup locations): Where remote partitions should be backed up to
  • ApplyCompression (Boolean): Whether to compress the backup or not, default is off
  • Password (String): Password to encrypt backup files

Just to make life slightly more awkward, you need to specify every parameter before the last one you want to use. So if you want to set compression on, you need to specify File, allowOverwrite, backupRemotePartitions and RemoteLocations as well, even if just passing empty values like so:

$DB.Backup($BackupFile,$False,$False,$null,$True)

With SQLPS as provided with SQL Server 2012, there is another way: the cmdlet Backup-ASDatabase was introduced, which works in much the same way as Backup-SQLDatabase:

Backup-ASDatabase -BackupFile $BackupFile -Server $SSASInstance -Name $DBName -ApplyCompression

This is slightly easier to work with, and can be automated for all the dbs in an instance just like we did for the Data Platform examples:

Import-Module SQLPS -DisableNameChecking

$SSASInstance = "SSASServer1\Instance1"
$SSASServer = New-Object Microsoft.AnalysisServices.Server
$SSASServer.Connect($SSASInstance)

foreach($DB in $SSASServer.Databases){
    $BackupFile = $DB.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".abf"
    Backup-ASDatabase -BackupFile $BackupFile -Server $SSASInstance -Name $DB.Name -ApplyCompression
}

And as a bonus, this format is backwards compatible with SQL Server 2008 and SQL Server 2008R2; you only need the SQL Server 2012 SQLPS module installed, which comes with the SQL Server client tools (free to download and use if you grab SQL Server 2012 Express SP1 or better). So it could make your life easier even if you upgrade nothing else to 2012.

Note that all the usual Analysis Services restrictions on what is actually backed up apply when backing up via PowerShell, just as they would via any other method.

One advantage of backing up SSAS via PowerShell is that it’s very easy to ensure that the underlying datastores are also backed up at the same time.
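As a sketch (all the server, database and path names here are made up), backing up a cube and its relational source in the same script is just two cmdlet calls:

```powershell
Import-Module SQLPS -DisableNameChecking

# Back up the SSAS database and the warehouse it's built from, back to back
Backup-ASDatabase -BackupFile "c:\backups\SalesCube.abf" -Server "SSASServer1\Instance1" -Name "SalesCube"
Backup-SqlDatabase -ServerInstance "Server1" -Database "SalesDW" -BackupFile "c:\backups\SalesDW.bak"
```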

Tomorrow we’ll look at restoring SSAS Databases.

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 24 of 31 Days of SQL Server Backup and Restore using PowerShell: Error Checking

So far most of our scripts have been a little bereft of error checking. This has mainly been to make them easier to read. If you’ve been experimenting with the scripts, it’s almost certain that you’ll have come across the following unhelpful error message from PowerShell:

Exception calling "SqlBackup" with "1" argument(s): "Backup failed for Server 'Server1'. "
At line:1 char:1
+ $Backup.SqlBackup($SQLSvr)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : FailedOperationException

You’ll probably also have discovered that amongst many things it can be a response to:

  • Wrong Server name
  • Permissions error on backup destination
  • Contradictory parameters
  • DB exists and you’ve not specified replace
  • You’ve specified a filename that’s already in use
  • Many, many, more

This means that for scripts you want to rely on for automation, you need to make sure you’ve got robust checking in place, or at least a good logging system so you can go back and review the settings for each run. Common issues, and how to check for them, are listed below.

Missing folders, or files that already exist: PowerShell provides Test-Path, which does what it says on the tin; it tests a path to see if it exists, returning $True or $False as appropriate, so you can quickly check a path before writing to it. To make sure that your backup folder exists before writing to it:

if ((Test-Path "c:\mybackups\") -eq $False){
    write-output "Missing folder"
}

This could be modified to create the missing path if required, or, if you actually wanted the file not to be there, you can redirect your process.
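Creating the missing path is only a line more:

```powershell
if (-not (Test-Path "c:\mybackups\")){
    # -Force also creates any missing parent folders
    New-Item -Path "c:\mybackups\" -ItemType Directory -Force | Out-Null
}
```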

Getting the SQL Server name wrong or the server not being available. When you build your new SQL Server object like so:

$sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("WrongServer")

you might think that PowerShell would warn you if the server didn’t exist, but it doesn’t, and you might not find out until you get the error message above. To check the connection as soon as you create the object, you need to try to use it:

$sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("WrongServer")
try{
    $sqlsvr.databases | Out-Null
}
catch{
    Write-Host "SQL Server Not Available"
}

You’ll also need to check that the database you’re targeting for backup or restore exists, though for different reasons. If you’re restoring and the database exists, you need to set the Restore object’s Replace property to $True:

$DBname="fdg"

try{
    $sqlsvr.Databases.Item($DbName) | Get-Member -ErrorAction Stop
}
catch{
    <#
    DB does not exist. Choose to abort the backup attempt.
    Set a flag that can be checked in the Finally block
    #>
    $DBexists = $false
}
finally{
    if ($DBexists -ne $False){
        <#
        DB exists, choose whether to abort the restore
        or set .Replace = $True
        #>
    }
}

A common problem when backing up or restoring is checking that the SQL Server service account has the correct permissions to access the area. Remember that PowerShell is just submitting T-SQL to SQL Server to execute, so the account running the script may have permissions that the service account doesn’t.

Checking the permissions takes a bit more code, but is also possible. This assumes we have a validated SQL Server connection called $sqlsvr, and that the account running the script has permissions to see the permissions on the requested object:

Import-Module ActiveDirectory #Provides Get-ADPrincipalGroupMembership

$serviceaccount = $sqlsvr.ServiceAccount

#Need to append \\ to the end of the domain so it works in -replace
$domain= $env:USERDOMAIN+"\\"

#Path to Folder we want to check
$path = "c:\folder1\folder2"

#Get Group memberships for Service Account
$groups = Get-ADPrincipalGroupMembership ($ServiceAccount -ireplace $domain,"")

#Build up an Array to hold all the group and account names
$accounts = @()
$accounts += $groups.samaccountname
$accounts += $ServiceAccount -ireplace $domain,""

#Set a check for whether we have write permissions
$write=0
$read=0

#Get the ACL on our specified path
$perms = get-acl $path

#For each ACL in array
foreach ($acl in $perms.access){
    #Then for each acccount/group
    foreach ($acc in $accounts){
        #If a right is assigned to a group or account
        if ($env:USERDOMAIN+"\"+$acc -ieq $acl.IdentityReference){
            #Check if it's a permission that gives us write access
            if (($acl.FileSystemRights -ilike "*FullControl*") -or ($acl.FileSystemRights -ilike "*Modify*")){
                if($acl.AccessControlType -ne "Allow"){
                    #Hit a Deny, overrides any other permission, so drop out of checking
                    $write = 0
                    break
                }
                #Increment our write check. Keep looping as we may not have hit a deny yet
                $write++

                #If we have write we have read
                $read++

            }elseif ($acl.FileSystemRights -ilike "*Read*"){
                $read++
            }
        }
    }
}

if ($write -eq 0){
    write-output "SQL Server service account $serviceaccount doesn't have write access to $path"
    #Take action
}

if ($read -eq 0){
    write-output "SQL Server service account $serviceaccount doesn't have read access to $path"
    #Take action
}

If you’re accessing your path through a share, then you also need to check the share permissions, as they may override your filesystem permissions (same assumptions as the filesystem example above):

$serviceaccount = $sqlsvr.ServiceAccount

#Need to append \\ to the end of the domain so it works in -replace
$domain= $env:USERDOMAIN+"\\"

#Path to Folder we want to check
$path="\\Server1\Share3$\folder1"
#Split the path on \ to get the bits we want
$ShareArray = $path -Split [regex]::Escape('\')

#Get Group memberships for Service Account
$groups = Get-ADPrincipalGroupMembership ($ServiceAccount -ireplace $domain,"")

#Build up an Array to hold all the group and account names
$accounts = @()
$accounts += $groups.samaccountname
$accounts += $ServiceAccount -ireplace $domain,""

#Set a check for whether we have write permissions
$write=0
$read=0

$share=$sharearray[3].Trim()
$ShareSecurity = Get-WMIObject -Class Win32_LogicalShareSecuritySetting -Filter "name='$Share'"  -ComputerName $ShareArray[2]
$ShareSecurityDesc = $ShareSecurity.GetSecurityDescriptor().Descriptor
foreach($ace in $ShareSecurityDesc.DACL){
    foreach ($acc in $accounts){
        #Convert the raw ACE into a FileSystemAccessRule so we can inspect it
        $acl = New-Object Security.AccessControl.FileSystemAccessRule($ace.Trustee.Name, $ace.AccessMask, $ace.AceType)
        #If a right is assigned to a group or account
        if ($acc -ieq $acl.IdentityReference){
             #Check if it's a permission that gives us write access
            if (($acl.FileSystemRights -ilike "*FullControl*") -or ($acl.FileSystemRights -ilike "*Modify*")){
                if($acl.AccessControlType -ne "Allow"){
                    #Hit a Deny, overrides any other permission, so drop out of checking
                    $write = 0
                    break
                }
                #Increment our write check. Keep looping as we may not have hit a deny yet
                $write++
                 #If we have write we have read
                $read++

            }elseif ($acl.FileSystemRights -ilike "*Read*"){
                $read++
            }
        }
    }

}

Those examples cover a number of the most common errors I see when leaving backups and restores to run on their own. You can add more to cover cases as you discover them.
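For instance, one more pre-flight check worth adding is confirming the backup drive has enough free space before you start. The sketch below is just an illustration: the server name, drive letter and 10GB threshold are all placeholders you'd swap for your own values:

#Check free space on the backup drive before we start
$drive = Get-WMIObject -Class Win32_LogicalDisk -Filter "DeviceID='C:'" -ComputerName "Server1"
if ($drive.FreeSpace -lt 10GB){
    write-output "Less than 10GB free on C: - backup may fail"
    #Take action
}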

And at a minimum, ensure that everything is wrapped in try/catch/finally blocks (making sure to add -ErrorAction Stop to non-terminating cmdlets). The next most useful thing is to use the Script() method to keep a record of what you're asking SQL Server to do; a lot of the time you'll look at it and realise what you've done wrong.
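As a minimal sketch of that pattern (the instance and database names are just examples):

try{
    #Cmdlet errors are non-terminating by default; -ErrorAction Stop makes them catchable
    Backup-SqlDatabase -ServerInstance "Server1\SQL2012" -Database "psdb1" -ErrorAction Stop
}
catch{
    write-output "Backup failed: $($_.Exception.Message)"
    #Take action - alert, log, retry
}
finally{
    #Cleanup that must happen whether the backup succeeded or not
}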

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.

Day 23 of 31 Days of SQL Server Backup and Restore using PowerShell: It’s all just T-SQL, really

Yesterday we compared the performance of running backups with T-SQL and with PowerShell. The conclusion was that on small backups there's a small overhead, which comes from building up the T-SQL behind the scenes.

At the end of the day, we shouldn't have been expecting much of a difference in performance. All PowerShell's SQLPS module is doing is using SQL Management Objects (SMO) to create T-SQL, which is then submitted to SQL Server. So most of the small performance overhead is purely the work of creating that T-SQL and then connecting to our SQL Server instance to submit it for execution.

We can exploit this feature in 2 ways:

  • Rather than executing the T-SQL, we can just get PowerShell to give us the scripts so we can run them ourselves. This might be something you want to do in the case of an important restore, where you want to confirm the actions before committing to them.
  • Fault finding. SQLPS doesn’t offer the most informative error messages (in fact it quite often returns the same one for any error), so by extracting the T-SQL you can quickly see if the logic was correct, or if you’ve forgotten to create part of your backup or restore.

Getting the SQL Script is very simple, it’s just a substitution at the point of restore or backup:

  • Switch $Backup.SqlBackup() for $Backup.Script()
  • Switch $Restore.SqlRestore() for $Restore.Script()
  • Add the -Script switch to the Restore-SqlDatabase or Backup-SqlDatabase cmdlets

With a simple example, it’s easy to see what will come out:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1\SQL2012"
$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)

$Db = $SQLSvr.Databases.Item("psdb1")

$Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Full Backup of "+$Db.Name
$Backup.Database = $db.Name

$BackupName = "c:\psbackups\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
$DeviceType = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)

$Backup.Devices.Add($BackupDevice)
$Backup.Script($SQLSvr)
$Backup.Devices.Remove($BackupDevice)

will produce:

BACKUP DATABASE [psdb1] TO  DISK = N'c:\psbackups\psdb1_20130924_181155.bak' WITH  DESCRIPTION = N'Full Backup of psdb1', NOFORMAT, NOINIT, NOSKIP, REWIND, NOUNLOAD,  STATS = 10

But if we go for a more complex restore, it provides a lot more. As an example, let’s modify our Restore-SQLDatabasePointInTime function from Day 20:

function Restore-SQLDatabasePointInTimeScript
{param($BackupFiles, [string]$SQLServer, [datetime]$PointInTime=(Get-Date).AddDays(1))

    #Build a Server object from the instance name so Script() has a target
    $RestoreServer = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server($SQLServer)

    $ScriptOutput = @()
    $restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
    $restore.ToPointInTime = get-date($PointInTime) -format "MMM dd, yyyy hh:mm tt"

    $devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
    foreach ($backup in $BackupFiles){
        $restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($backup.filepath,$devicetype)

        $restore.Devices.Add($restoredevice) | Out-Null
        $restore.FileNumber = $backup.position
        $restore.Database = $backup.DatabaseName
        $ScriptOutput += $restore.Script($RestoreServer)

        if ($backup.LastLSN -eq $BackupFiles[-1].LastLSN){
            $ScriptOutput += "Restore database "+$restore.Database.ToString()+" with recovery"
        }
        $restore.Devices.Remove($restoredevice) | Out-Null
    }
    return $ScriptOutput
}

And then call the same sequence of functions for our point in time restore operation, except substituting our new function at the very end. Then we can see that we get a very handy amount of code built for us:

RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_3_18.bak' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_.bcf' WITH  FILE = 4,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_5_19.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_5_20.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_5_21.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_5_22.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_5_23.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_0.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_1.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_2.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_3.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_4.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_5.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_6.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_7.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_8.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_9.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
RESTORE DATABASE [PointInTime] FROM  DISK = N'C:\piit\PointInTime_6_10.trn' WITH  FILE = 1,  NOUNLOAD,  STATS = 10,  STOPAT = N'Sep 20, 2013 05:32 PM'
Restore database PointInTime with recovery

It is in fact fine to do the restore and retrieve the script in the same operation, so we could modify our original function to include the .Script() method as well, or make it optional via a switch or parameter. Collecting the script for every restore and backup and then storing it means you’d have a head start when trying to debug issues, and more information to provide for auditors.
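One lightweight way to store those scripts (the log path here is just an example) is to append each run's output to a dated file:

#Append the generated T-SQL to a per-day log file
$ScriptLog = "c:\psbackups\logs\restore-scripts-"+[DateTime]::Now.ToString("yyyyMMdd")+".sql"
Restore-SQLDatabasePointInTimeScript -BackupFiles $BackupFiles -SQLServer "Server1\SQL2012" |
    Out-File -FilePath $ScriptLog -Append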

As the PowerShell submissions are ultimately just T-SQL, you can use all your normal tools to monitor or control them.

Tomorrow we’ll look at

This post is part of a series posted between 1st September 2013 and 3rd October 2013, an index for the series is available here.
