Musings of a Data professional

Stuart Moore

Day 17 – dbatools Restore-DbaDatabase and Azure – 31 days of dbatools backup and restores

Restoring to and from Azure with dbatools

All our talk about restores so far has been about restoring to on-premises SQL Server instances from on-premises file stores. How well does dbatools' Restore-DbaDatabase work with Azure?

Very well, in fact. In today’s post we’ll look at the options for restoring to and from Azure databases.

Prerequisites

The first thing I’d recommend with any of the Azure restore scenarios is to create an explicit dbatools SqlInstance connection, rather than just passing a name to -SqlInstance as we’ve been doing so far.

This is very easy to create and then use:

# To create it:
$sqlCred = Get-Credential
$sqlInstance = Connect-DbaInstance -SqlInstance 'MyNewMI.public.cus29s972e4513d6.database.windows.net,3342' -SqlCredential $sqlCred
# To use it:
Restore-DbaDatabase -SqlInstance $sqlInstance .......

The reason for doing this is that Connect-DbaInstance offers a lot more tuning options to handle the different connection and authentication options Azure supports:

  • AzureDomain
    • By default Azure SQL Databases are created under ‘database.windows.net’; if your database is under a different domain you can specify that with this parameter
  • Tenant
    • The TenantID for the Azure tenancy owning the database
  • Thumbprint
    • The Managed Instance certificate thumbprint
  • Store
    • The certificate store containing the certificate for Managed SQL Instance authentication
  • AuthenticationType
    • The type of authentication to be used. Should be specified if you want to force AD Universal with MFA support
  • ConnectionTimeout
    • Increases the time that dbatools waits for a connection request to time out

This is good practice for anything you do with dbatools and Azure-based SQL ‘things’, as it gives you a little more leeway if the network slows down or something causes the instance itself to slow as well.
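For example, a minimal sketch of a tuned connection might look like this (the server name, domain and timeout value are placeholders rather than recommendations):

    # Explicit connection with a longer timeout and the Azure domain spelled out
    $sqlCred = Get-Credential
    $sqlInstance = Connect-DbaInstance -SqlInstance 'mydatabase.database.windows.net' -SqlCredential $sqlCred -AzureDomain 'database.windows.net' -ConnectionTimeout 60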

The second thing you’ll need to do is ensure your SQL Server instance has access to your Azure Storage account. This isn’t too hard to set up, and I have a longer blog post on just this – Creating Azure Blob Storage accounts for sql server backups using PowerShell

I’d highly recommend you use Shared Access Signatures if you can; they offer a lot of benefits over the old Access Key authentication option. If you want to see how to back up SQL Server databases to Azure, have a look at day 7’s post, dbatools Backups to Azure, where we walk through it.
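For reference, here is a hedged sketch of what a backup straight to blob storage looks like (the instance, database and container URL are placeholders; with a SAS-based credential already in place only -AzureBaseUrl should be needed, see the day 7 post for the full walkthrough):

    # Back up directly to the blob container; the resulting URIs can later be fed into Restore-DbaDatabase
    Backup-DbaDatabase -SqlInstance Server1\Instance -Database MyDb -AzureBaseUrl 'https://azblogdemo.blob.core.windows.net/sql'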

Things we can’t do

and how to work around them!

Scan Azure storage

dbatools can’t natively scan Azure blob storage for backups. This is a design decision to save us having to take a dependency on the Az PowerShell module. But if you use the Az module, then we can work with its output:

# If you have more than one account, subscription or tenancy you may need to provide further parameters
Connect-AzAccount
$aSA = Get-AzStorageAccount -Name azblogdemo -ResourceGroupName dbatools
# Substitute your container name here
$container = Get-AzStorageContainer -Context $aSA.Context -Name sql
$blobsRaw = Get-AzStorageBlob -Container $container.Name -Context $aSA.Context
$blobsReady = $blobsRaw | Select-Object @{Name = 'URI'; Expression = { "$($aSA.Context.BlobEndPoint)$($container.Name)/$($_.Name)" }}

$blobsReady now contains a list of URIs for all of the files in the specified container. If you want to restrict the files by date you can do this:

    $blobsRaw = Get-AzStorageBlob -Container $container.Name -Context $aSA.Context | Where-Object {$_.LastModified -gt (Get-Date).AddDays(-1)}
    

This brings back the last 24 hours’ worth of backup files.

If you have anything other than SQL Server backup files in the blob storage container, then you’ll need to filter like this (assuming you’re using standard file extensions):

    $blobsRaw = Get-AzStorageBlob -Container $container.Name -Context $aSA.Context | Where-Object {$_.Name -match '\.((bak)|(trn))$'}

Depending on your internet connection and the size of your backups, you may find scanning the files takes a while, as they need to be pulled down to the SQL instance you’re using. If you find this happening, I’d suggest looking at the BackupHistory posts from this series for some advice (Day 15 – dbatools BackupHistory, Day 16 – Recreating or saving dbatools backuphistory).
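As a hedged sketch of that approach (instance and database names are placeholders), feeding backup history straight into the restore avoids scanning the blobs at all:

    # Grab the history for the most recent backup chain and trust it rather than re-scanning the files
    $history = Get-DbaDbBackupHistory -SqlInstance Server1\Instance -Database MyDb -Last
    $history | Restore-DbaDatabase -SqlInstance Server2\Instance -TrustDbBackupHistory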

Restoring to Azure SQL Database

Azure SQL Database is a great Platform as a Service database. Unfortunately, its one flaw is that you can’t restore to it from a SQL Server backup 🙁 If you want to move a database into Azure SQL Database then Restore-DbaDatabase isn’t going to help, but dbatools’ DacPac commands can.
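As a rough sketch of that route (the instance, database and export path are placeholders, the Path property on the export output is an assumption, and depending on your dbatools version you may also need a publish profile via -PublishXml):

    # Export the database to a dacpac, then publish it over a Connect-DbaInstance connection to the Azure SQL Database
    $dacpac = Export-DbaDacPackage -SqlInstance Server1\Instance -Database MyDb -Path C:\temp
    Publish-DbaDacPackage -SqlInstance $sqlInstance -Database MyDb -Path $dacpac.Path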

Now on to what you can do with Restore-DbaDatabase.

Restoring from Azure with Restore-DbaDatabase

Once you’ve got your URIs for the files, either as above or from Get-DbaDbBackupHistory, it’s just a case of pushing them into Restore-DbaDatabase:

    $blobsReady.Uri | Restore-DbaDatabase -SqlInstance Server1\Instance

This pipes the list of URIs into Restore-DbaDatabase, which works its way through the paths as normal to build up a restore plan and then restores the database to the specified instance.

This assumes you’re using Shared Access Signatures, and that the SAS is stored in a SQL Server credential on the target instance with a name that matches the base URI of the blob files being restored. If a suitable credential cannot be found, an error will be raised and the function will exit. It is possible to restore from multiple storage accounts at the same time, if credentials for each storage account are present.
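If you need to create that credential, here’s a minimal sketch using New-DbaCredential (the container URL and SAS token are placeholders; the credential name must match the base container URL and the identity must be ‘SHARED ACCESS SIGNATURE’):

    # Store the SAS token in a SQL Server credential named after the container URL.
    # The secret is the SAS token without the leading '?'.
    $sasToken = ConvertTo-SecureString 'sv=2019-07-07&sr=c&sig=...' -AsPlainText -Force
    New-DbaCredential -SqlInstance Server1\Instance -Name 'https://azblogdemo.blob.core.windows.net/sql' -Identity 'SHARED ACCESS SIGNATURE' -SecurePassword $sasToken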

If you are still using Access Keys, then you’ll need to specify the name of the SQL Server credential storing the key, like so:

    $blobsReady.Uri | Restore-DbaDatabase -SqlInstance Server1\Instance -AzureCredential myAzCredential

All of the examples so far work the same for on-premises SQL Server instances and SQL Server instances hosted in Azure Virtual Machines.

Restore-DbaDatabase also works well with Azure SQL Managed Instances. The backups to be restored must be in Azure, and the credentials must exist on the Managed Instance. We would really encourage you to use Connect-DbaInstance to make the connection, as detailed at the start of this post.

When restoring to a SQL Managed Instance a number of parameters are not available:
DestinationDataDirectory, DestinationLogDirectory, DestinationFileStreamDirectory, XpDirTree, FileMapping, UseDestinationDefaultDirectories, ReuseSourceFolderStructure, DestinationFilePrefix, StandbyDirectory, ReplaceDbNameInFile, KeepCDC

This is because a SQL Managed Instance doesn’t support these options. If they are provided, a warning will be raised and Restore-DbaDatabase will exit.
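Putting it together, a hedged sketch of a Managed Instance restore might look like this (the instance name, credential and database name are placeholders):

    # Explicit connection to the Managed Instance, then restore straight from the blob URIs gathered earlier
    $sqlCred = Get-Credential
    $mi = Connect-DbaInstance -SqlInstance 'MyNewMI.public.cus29s972e4513d6.database.windows.net,3342' -SqlCredential $sqlCred
    $blobsReady.URI | Restore-DbaDatabase -SqlInstance $mi -DatabaseName RestoredDb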

Conclusions

We’re supporting as many Azure SQL features as we can. We’d like to support more, but we need the corresponding functionality in Azure as well. As new features get added, we’ll be adding support for them too.

All posts in this series can be found at 31 Days of dbatools Backup and Restores

Previous

Day 16 – Saving and recreating BackupHistory – 31 days of dbatools backup and restores

Next

Day 18 – Ola Hallengren maintenance solutions and Restore-DbaDatabase – 31 days of dbatools backup and restores

4 Comments

  1. Kartheek

    Hi Stuart, how do I restore a bak file from Azure blob? I tried this command:
    PS C:\> Restore-DbaDatabase -SqlInstance 172.31.12.22 -Path http://demo.blob.core.windows.net/backups/dbbackup.bak -AzureCredential MyAzureCredential. But it is not working for me.

  2. Eug

    Hi Stuart,
    I might have left a comment on another of your posts today; it may or may not have gone through.
    I have been given a URI which has a folder called c04rt9, the URI is as follows (which I have changed):

    $URI='https://cbabc.blob.storage.azure.net/c04rt9?sv=2019-07-07&sr=c&sig=P7znfdsflm2Dxepc8fWfsdafsagdtretHmTwCQ2fdsffdsffsdfGanSCk%3D&st=2021-05-dfgertydfsgdsfg-05-19T09%3A52%3A59Z&sp=rwdl'

    when I do Restore-DbaDatabase -SqlInstance localhost -Path $uri
    I get this error:
    Stop-Function : A positional parameter cannot be found that accepts argument ‘https://cbabc.blob.storage.azure.net/c04rt9?sv=2019-07-07&sr=c&sig=P7znfdsflm2Dxepc8fWfsdafsagdtretHmTwCQ2fdsffdsffsdfGanSCk%3D&st=2021-05-dfgertydfsgdsfg-05-19T09%3A52%3A59Z&sp=rwdl) should be a file, not a folder’.
    I can list the files in the URI:
    azcopy.exe list $URI
    INFO: changelog.md; Content Length: 45.20 KiB
    INFO: dbatools-master.zip; Content Length: 25.28 MiB
    INFO: t1_11.bak; Content Length: 392.50 KiB
    INFO: t1_11.dif; Content Length: 38.00 KiB
    INFO: t1_11.ldf; Content Length: 10.50 KiB
    INFO: t1_12.ldf; Content Length: 5.00 KiB
    INFO: t2_11.bak; Content Length: 390.00 KiB
    INFO: t2_11.ldf; Content Length: 12.00 KiB
    INFO: t2_12.ldf; Content Length: 5.00 KiB

    • Stuart Moore

      Hi Eug,
      Sorry, I’ve been snowed under at work. I’ve just put up a new post, Scanning Azure storage blobs for dbatools’ Restore-DbaDatabase, where I show how I do this. Basically dbatools can’t talk directly to Azure storage, as we’d have to make the Azure PowerShell modules a dependency, and that doesn’t work for most of our users.

      Hope that helps you. If it doesn’t let me know and we’ll see if we can work out another solution

      Cheers
      Stuart

