Stuart Moore

Musings of a Data professional

Category: dbatools

Creating Azure Blob storage account for SQL Server backup and restore with PowerShell

In a previous post I looked at Creating Azure Blob storage account for SQL Server backup and restore via the portal. In this post I'm going to work through the same setup, but using PowerShell. To me this is the simpler and easier way to do things. Most of it can be left running in the background while you drink coffee or read Twitter, work smarter not harder 😉

Prerequisites

For these examples I'm going to assume you've installed the up to date versions of these 2 PowerShell modules:

  • dbatools
  • Az
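If you don't have them yet, both are available from the PowerShell Gallery, so something along these lines should pull them down:

#Install the 2 modules from the PowerShell Gallery if you don't already have them
Install-Module dbatools -Scope CurrentUser
Install-Module Az -Scope CurrentUser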

So now we'll step through the PowerShell to set up the Azure storage. I'll post a screenshot after each block of code so you can see what you should expect to see. This is running on PowerShell 6.1.1 on macOS (ain't cross platform compatibility grand!)

TL;DR – I've uploaded the whole script to GitHub in this gist – Setting up an Azure Blob container for SQL Server Backups and Restores

Setup

Import-Module dbatools
Import-Module Az

Connect-AzAccount
# Currently returns the wrong URL, use https://aka.ms/DeviceLogon instead

Subscription Check

Make sure you're in the right subscription. Here I've logged into my Visual Studio subscription and want to use my MVP Credits one:

#Check you're in the correct subscription
Get-AzContext
Get-AzContext -ListAvailable | Where-Object {$_.Name -Like 'MVP*'} | Set-AzContext
Get-AzContext

Check what’s there

#check existing Storage Accounts
Get-AzStorageAccount  

Create the account

Here I'm using PowerShell splatting, which is a great technique to avoid stupidly long parameter strings

I’m creating this storage account with the same settings as the one I created in the post Creating Azure Blob Storage for SQL Server Backups and Restores via the Azure Portal

#we're going to use splatting
$NewStorageAccount = @{
   ResourceGroupName  = "dbatools"
   AccountName = "azblogdemo"
   Location = "UK South"
   SKUName = "Standard_GRS"
   Kind = "StorageV2"
   AccessTier = "Hot"
   EnableHttpsTrafficOnly = $true
}
#Create the Storage Account
New-AzStorageAccount @NewStorageAccount

Getting the Access Keys

If you want to use the older Access Key method to authenticate, you're pretty much done at this point. If you're on a version of SQL Server that supports the Shared Access Signature method (2016 onwards), then it's a good idea to move over to that method as it's more secure and performs better.

$AzStorageKeys = Get-AzStorageAccountKey -ResourceGroupName $NewStorageAccount.ResourceGroupName -Name $NewStorageAccount.AccountName
$AzStorageKeys
$Sql = "CREATE CREDENTIAL [$($NewStorageAccount.AccountName)] WITH IDENTITY='$($NewStorageAccount.AccountName)', SECRET = '$($AzStorageKeys[0].Value)'" 
$Sql
Invoke-DbaQuery -SqlInstance server1 -Database master -Query $Sql

Onwards to Shared Access Signatures

Grab a context so we can pass it in to other cmdlets:

$AzStorageContext = New-AzStorageContext -StorageAccountName $NewStorageAccount.AccountName -StorageAccountKey $AzStorageKeys[0].Value

Now we create a new container on the Storage Context

#Create a blob container called SQL
$AzStorageContainer = New-AzStorageContainer -Context $AzStorageContext -Name sql

Create a new Shared Access Policy on the container we've just created


#Create a Shared Access Policy giving (r)ead, (w)rite, (l)ist and (d)elete permissions for 1 year from now
$SharedAccessPolicy = @{
    Context = $AzStorageContext
    Policy = $AzStorageContext.StorageAccountName+"Policy2"
    Container = "sql"
    ExpiryTime = (Get-Date).ToUniversalTime().AddYears(1)
    Permission = "rwld"
}
$AzSharedAccessPolicy = New-AzStorageContainerStoredAccessPolicy @SharedAccessPolicy

Now get the token for the Shared Access Policy; this is the actual 'credential' we need.

#Get the Shared Access Token
$AzSas = New-AzStorageContainerSASToken -Policy $SharedAccessPolicy.Policy -Context $AzStorageContext -Container sql

Grab the full URL to the sql container

#We need the URL to the blob storage container we've created:
$Url = $AzStorageContainer.CloudBlobContainer.uri.AbsoluteUri

And then a little bit of string manipulation to build our Query to create the credential

#The SASToken is prefixed with a '?' to make it easy to append to a HTTP querystring, but we don't need it, so use substring(1) to drop it
$AzSas
$SasSql = "CREATE CREDENTIAL [$Url] WITH IDENTITY='SHARED ACCESS SIGNATURE', SECRET='$($AzSas.SubString(1))'"
Invoke-DbaQuery -SqlInstance localhost -Database Master -Query $SasSql

And since we’re using dbatools to create the credentials, it’s very simple to do something like:

Invoke-DbaQuery -SqlInstance server1,server2,server3 -Database Master -Query $SasSql

And then you have the credential on a number of SQL Server instances with no extra effort

The next blog in this short series will have some examples of how you can then use these credentials to back up and restore via the dbatools commands.
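As a rough taste of what's coming, and assuming the Shared Access Signature credential created above is in place, a backup straight into the container looks something like this (the database name is just a placeholder):

#Sketch only - SQL Server picks the credential whose name matches the URL
Backup-DbaDatabase -SqlInstance server1 -Database AdventureWorks -AzureBaseUrl https://azblogdemo.blob.core.windows.net/sql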

To save you piecing it together I've uploaded the whole script to GitHub in this gist – Setting up an Azure Blob container for SQL Server Backups and Restores

If you have any questions then please comment below or ping me on Twitter

Creating Azure Blob storage account for SQL Server backup and restore via the portal

Since SQL Server 2012 SP1 you’ve been able to backup and restore SQL Server using Azure Blob storage. This can be useful to have offsite replicated backups or if you need to move data securely between sites, and now with Azure hosted SQL Server (of various stripes) it makes it easy to get data into those systems.

I'm just going to run through creating an Azure Blob Storage account for use with SQL Server using the Azure Portal. Another post will walk through doing the same thing using PowerShell. I normally use the PowerShell versions (no surprise there), but sometimes the graphical version is a good way of understanding what's going on under the hood.

First up, there are 2 ways that you authenticate your SQL Server with Azure storage:

Storage Account Identity with an Access Key

This is the original method introduced with SQL Server 2012, but it has been deprecated since SQL Server 2016. So it will still work, but will disappear at a time of Microsoft’s choosing.

With this method you’re handing over your Access Key, which gives the end service full rights over your storage account. As you can only have 2 active at any time, it can be hard to revoke a key if it’s being misused

Shared Access Signature

This is the more modern way of granting access to your storage. You can create as many signatures as you want, restricting them to only part of your storage account, with different permissions and different expiry dates, and they can be revoked individually.

So you could create one SAS for backup with create/write permissions and another for restore with read permissions to keep everything lined up.
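If you end up scripting it (as in the companion PowerShell post), that split might look roughly like this – the policy names and permission letters are purely illustrative, and $AzStorageContext comes from that post:

#Illustrative only: one stored access policy for backups, one read-only policy for restores
New-AzStorageContainerStoredAccessPolicy -Context $AzStorageContext -Container sql -Policy "BackupPolicy" -Permission "rwl" -ExpiryTime (Get-Date).ToUniversalTime().AddYears(1)
New-AzStorageContainerStoredAccessPolicy -Context $AzStorageContext -Container sql -Policy "RestorePolicy" -Permission "rl" -ExpiryTime (Get-Date).ToUniversalTime().AddYears(1)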

Creating the Storage Account

Go into the Azure Portal, and find Storage Accounts (I prefer to start typing in the search box):

Ignore the ‘Classic’ option as that won’t work here. We should now have the Storage Accounts blade up:

Storage accounts in Azure Portal

Select Add to create a new storage account

Setting up a Storage account for SQL Server backup and restore

Select which subscription you want to use. You can reuse an existing resource group or create a new one. You need to give your new account a name, which must be unique in the *.blob.core.windows.net namespace. The rest of the settings you can leave alone, unless you want a better spread of replication

Storage Account settings for SQL Server Backup and Restore

Under advanced you have a few more options. Enabling Secure transfer is a good idea, and is the default. If you need to restrict access to specific networks (if you’re using ExpressRoute for example), then you can specify them here.

Once you’ve clicked next you’ll have a chance to review your choices, and Azure will check they’re right as well.

Creating Storage container for SQL Server Backups and Restores

Click create and things will whir away in the Data Centre of your choice. After a minute or 2 your deployment will be complete.

Storage Account successfully created for SQL Server backup and restore

Creating a Blob container

Click on the Resource Name and you’ll be taken to your fresh new storage account

Storage Account successfully created for SQL Server backup and restore

Next thing we need to do is to create a container. Click on Blobs, and then on the ‘+ Container’ link

Creating Storage container for SQL Server Backups and Restores

Think of the container as a sort of folder, so give it a name. And as a best practice, don't allow anonymous access unless you really want to share your backups with the world!

Access Key Authentication

Now, if you’re using the older Access key method of authenticating you’re almost done. On the Storage account main page select Access Keys under Settings:

Storage Account keys for SQL Server Backups and Restores

There you have your 2 secure access keys. Be very aware that these keys give anyone who has them TOTAL access to the storage account. There is nothing they can’t do once they have these keys. You can only have 2 active keys at any one time. You can generate a new key at any time, but it will invalidate the old one immediately. This is why you have 2 keys, so you can move over to the alternate key and then regenerate the original.

To create the credential you use the following T-SQL:

CREATE CREDENTIAL <CredentialName>
WITH IDENTITY= '<StorageAccountName>',  
SECRET = '<Secure Access Key>'

So in my example, this would look like:

CREATE CREDENTIAL dbatoolslab
WITH IDENTITY= 'dbatoolslab',  
SECRET = 'OPoyubf7gK+aIKDXm6NurzYWlv7tb2GvRb4e1qddq5E5k/2O6KcaQ9+hnRopbrJqT7eWG/h2gwKrrBgDpssPrw=='

You can call the credential whatever you want, but if you've multiple Azure accounts then using the same name helps make sure you're using the one you want.

Shared Access Signature authentication

For Shared Access Signature authentication we've another step to go. On the Overview page for your storage account, select Shared Access Signature on the left hand side:

Shared Access Signature settings for SQL Server Backup and Restores

Here you can adjust the access that the SAS will grant to the holder. By default they are created with a life span of 8 hours, which you can extend as you want. As you can generate as many SAS as you want, you can create them with different permission sets for different access rights, or for different services/offices via allowed IP addresses. When you hit generate you'll be presented with:

Azure Shared Access Signatures for SQL Server Backups and Restores

The SAS token is the value you want, but without the leading '?' – that's only there for easy appending to an HTTP querystring. Your secret should start with 'sv'

To create the SAS credential you use the following T-SQL

CREATE CREDENTIAL [https://<storageaccount>.blob.core.windows.net/<container>] 
WITH IDENTITY='SHARED ACCESS SIGNATURE' 
 , SECRET = 'SAS token' 

So in our example that would be:

CREATE CREDENTIAL [https://dbatoolslas.blob.core.windows.net/sql] 
WITH IDENTITY='SHARED ACCESS SIGNATURE' 
 , SECRET = 'sv=2018-03-28&ss=b&srt=c&sp=rwdlac&se=2019-04-03T17:20:25Z&st=2019-04-03T09:20:25Z&spr=https&sig=PLpxNQCW%2FftHsC2NFgR3f4UUSIGGOtRRPLyLG5G90Ak%3D'

Common mistakes when creating this type of credential are:

  • leaving a trailing space on the URL in the name
  • Not removing the ? at the start of the SAS token
  • Changing the case of the SAS token (it is case sensitive)
  • Not setting the IDENTITY value correctly, it must be SHARED ACCESS SIGNATURE

And you’re done.

This is a bit long winded, so in the next post I'll show the quicker way of doing this via PowerShell.

If you run into any problems or anything isn't clear, hit me up in the comments or on Twitter and I'll give you a hand

Rolling Back SQL Server agent job owner changes

In a previous post I showed you a technique for updating the owner on all your SQL Agent jobs.

Whilst this isn't a major change, you do need to have checked that the new ownership account has the correct permissions for some things (e.g. EXECUTE AS or CMDEXEC steps). With this in mind, someone asked me how I'd back this out. Either for all jobs, or for that one specific MUST NOT FAIL job that is failing.

As we have 2 ways to change them, we have 2 ways to back out the changes. Note that both of these require you to have done a small amount of work up front; there is no magic rollback option 🙁

PowerShell and dbatools

This one is a little simpler as we can use a lot of PowerShell plumbing to do the lifting for us. To set this up we use:

Get-DbaAgentJob -SqlInstance server1 | Select-Object Name, OwnerLoginName | ConvertTo-Csv | Out-File .\server1-JobOwners.csv

If you’re not a PowerShell user I’ll just break this down into steps:

  1. We use dbatools' `Get-DbaAgentJob` cmdlet to get all the Agent jobs on Server1
  2. We pipe the output to `Select-Object`, selecting just the job Name and its owner's login name
  3. The output is then converted to a CSV object
  4. Finally we write the data to a file in the current folder

So we’ve created a nice csv list of the original owners of our jobs, great. Now how do we use those to roll back?
To roll back every job:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

To roll back just one job it's very similar, we just insert a Where-Object step into the pipeline:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Where-Object Name -eq 'My Job' | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

In both cases we read in our CSV and convert it from CSV into PowerShell objects. For a single job we use Where-Object to filter down to that one job; you could also use -like if you wanted to pattern match (see the sketch below). The remaining records then get piped through a ForEach-Object loop, where we use Set-DbaAgentJob to reset the job's owner.
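For example, a pattern-matched rollback might look something like this (the 'Finance*' pattern is just an illustration):

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Where-Object Name -like 'Finance*' | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}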

T-SQL

The pre setup for this one involves creating and populating a table to store the old values:

CREATE TABLE JobOwnerHistory(
  Job_Id char(36),
  JobName nvarchar(128),
  OwnerLoginName nvarchar(50)
)

INSERT INTO JobOwnerHistory
  SELECT 
    sj.job_id, 
    sj.name AS 'JobName',
    sp.name as 'OwnerLoginName' 
  FROM 
    msdb..sysjobs sj 
    INNER JOIN sys.server_principals sp on sj.owner_sid=sp.sid

So now, resetting a job’s owner is just a modification of our original script:

DECLARE @job_id char(36)
DECLARE @OwnerLoginName varchar(50)
DECLARE JobCursor CURSOR
FOR
SELECT 
  job_id,
  OwnerLoginName
FROM
  JobOwnerHistory
--WHERE JobName LIKE '%Finance%'
OPEN JobCursor 
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
WHILE (@@FETCH_STATUS <> -1)
BEGIN
exec msdb..sp_update_job
    @job_id = @job_id,
    @owner_login_name = @OwnerLoginName
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
END 
CLOSE JobCursor 
DEALLOCATE JobCursor 

As written that will reset every job in JobOwnerHistory; if you want to filter down to a subset of jobs you'd uncomment and modify the WHERE line

Hope those examples are helpful. If you’ve stumbled across this page and it doesn’t quite fix your problem, please drop a comment and I’ll try to lend a hand.

T-SQL Tuesday 104 – Code they won’t pry out of my hands

It's the 2nd Tuesday, so time for a T-SQL Tuesday post. This month's host Bert Wagner (b | t) posed the following topic for us:

For this month’s T-SQL Tuesday, I want you to write about code you’ve written that you would hate to live without

Off on a slight tangent, I'm actually going to write about the life of some code you'd never wrench out of my hands, just to show how learning to scratch a simple itch can lead to learning a lot more and getting involved with larger projects. And mainly because we didn't do source control properly back in those days, so the early code is lost to history (thankfully, as it wasn't pretty!)

About 6 years ago I needed to implement some SQL Server restore testing to keep our corporate auditors happy. I ran around with T-SQL for a while, but trying to find a solution that worked with all of our systems was a nightmare. So I dug out the PowerShell documentation and started reading up on SMO (that page looks a lot nicer than it did back in 2013). Soon I had a working piece of code that, with enough baling twine and candle wax, could do what we wanted.

By now I'd got lots of code examples, so decided to turn them into 31 days' worth of blog posts – 31 Days of SQL Server Backup and Restores with PowerShell – to help others work out where to go and save them running into the same loops and holes I had. I also wrote a presentation and took it around the UK to user groups, SQL Saturdays and conferences.

Now we move onto the bit where I can start showing off some actual code

Roll on forward to 2016 and I started to update my scripts to get them a little more up to date. In the process I decided to transform them from a ragtag collection of scripts into a shiny PowerShell module, and so SqlAutoRestores came into being. The code on GitHub is very much a work in progress. The basics worked, but a lot of supporting stuff to cope with other people's infrastructure was still needed.

Luckily I was asked to help with the dbatools project around then, mainly with the restore code. And the rest is history

So in 5 years my scrappy little bit of code has moved from this:

import-module "SQLPS" -DisableNameChecking
 
$sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("Server1")
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
 
$backupname = "C:\psbackups\psrestore.bak"
 
$restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($backupname,$devicetype)
 
$restore.Database = "psrestore"
$restore.ReplaceDatabase = $True
$restore.Devices.add($restoredevice)
$restore.sqlrestore($sqlsvr)

(Basic PS Restore)

to these:
(linking to source as these are a tad larger than earlier files)
Restore-DbaDatabase (697 lines)
Get-DbaBackupInformation (354 lines)
Select-DbaBackupInformation (172 lines)
Test-DbaBackupInformation (209 lines)
Invoke-DbaAdvancedRestore (367 lines)

All of which have been worked on by many hands. But I now have a restore solution that I use every day in my day job to reset development environments and restore-check my backups, and it's in use around the world by DBAs who need a solid, reliable restore solution (if you've done a migration with dbatools, guess what's underneath).

It’s probably my most used single piece of code, and not just because it’s running 24/7/365 testing! The muscle memory is now so ingrained that I can beat almost any other restore option to start.

The point of that story is what this T-SQL Tuesday is about: if you shout about some nifty code, or at least let someone know, there's a good chance other people will want to use it or help you develop it. It won't happen overnight; this story is about 5.5 years old and there are still debates about where this code is going to go in the future. And it's been great to see where this tiny project has eventually led me with my career and community involvement over the years.

Bulk uploading CSVs to Azure SQL Database with dbatools

Like most people we're busy moving ourselves over to Azure, and like a lot of people (even though they won't admit it) we've got years of data stashed away in CSV files. Go on, own up, there's years' worth of department membership stashed away in an HR CSV folder somewhere in your organisation 😉

To get some of this data usable for reporting we’re importing it into Azure SQL Database so people can start working their way through it, and we can fix up errors before we push it through into Azure Data Lake for mining. Being a fan of dbatools it was my first port of call for automating something like this.

Just to make life interesting, I want to add a time of creation field to the data to make tracking trends easier. As this information doesn't actually exist in the CSV columns, I'm going to use LastWriteTime as a proxy for the creation time.

$Files = Get-ChildItem \\server\HR\HandSTraining\Archive -Filter *.Csv
$SqlCredential = Get-Credential

ForEach ($File in $Files | Where-Object {$_.Length -gt 0}) {
    $InputObject = ConvertFrom-Csv -InputObject (Get-Content $File.fullname -raw) -Header UserName, StatusName
    $InputObject | Add-Member -MemberType NoteProperty -Value $File.LastWriteTime -Name DateAdded
    $DataTable = $InputObject | Out-DbaDataTable
    Write-DbaDataTable -SqlInstance superduper.database.windows.net -Database PreventPBI -Table Training -InputObject $DataTable -Schema dbo -SqlCredential $SqlCredential -RegularUser
    Remove-Variable InputObject
}

Working our way through that, we have:

$Files = Get-ChildItem \\server\HR\HandSTraining\Archive -Filter *.Csv
$SqlCredential = Get-Credential

Set up the basics we're going to need throughout. Grab all the CSV files off of our network share. I prefer grabbing credentials with Get-Credential, but if you'd prefer to embed them in the script you can use:
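Something along these lines would do it; the user name and password here are placeholders, and bear in mind that embedding credentials in plain text isn't great practice:

#Build the credential in the script - placeholder values, swap in your own
$Password = ConvertTo-SecureString 'YourStrongPasswordHere' -AsPlainText -Force
$SqlCredential = New-Object System.Management.Automation.PSCredential -ArgumentList 'YourSqlUserName', $Password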


We then ForEach through all the files, having filtered out the empty ones

    $InputObject = ConvertFrom-Csv -InputObject (Get-Content $File.fullname -raw) -Header UserName, StatusName
    $InputObject | Add-Member -MemberType NoteProperty -Value $File.LastWriteTime -Name DateAdded

Load the file contents into an object with ConvertFrom-Csv. These CSV files don't contain a header row, so I use the -Header parameter to force the column names in. This also helps with Write-DbaDataTable, as I can ensure that the object property names match the SQL column names for the upload

Then we add a new property to our InputObject. Doing it this way adds it to every 'row' in the object at once. If you want to add multiple new properties, just keep doing this for each one.
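For instance, something like this (a hypothetical SourceFile column) would also record which file each row came from:

#Hypothetical extra property - track which file each row came from
$InputObject | Add-Member -MemberType NoteProperty -Value $File.Name -Name SourceFile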

    $DataTable = $InputObject | Out-DbaDataTable
    Write-DbaDataTable -SqlInstance superduperdb.database.windows.net -Database HealthAndSafety -Table Training -InputObject $DataTable -Schema dbo -SqlCredential $SqlCredential -RegularUser

Convert our InputObject into a datatable, which is the format Write-DbaDataTable needs for input.

And then the command that does the loading, Write-DbaDataTable. There are only a couple of things here that you have to do differently for loading to an Azure SQL Database as opposed to a normal SQL Server instance. For Azure SQL Databases you have to use a SQL Credential, as the underlying dlls don't work (yet) with the various Integrated Authentication options. You also need to use the -RegularUser switch. Normally dbatools will assume you have sysadmin rights on your SQL Server instance, as they are needed for many of the tasks. In an Azure SQL Database you can't have those rights as they don't exist, so without -RegularUser you'll get a nice error message. Just something to look out for; I've tripped myself up in the past when repointing load scripts.

Then we drop InputObject and go round the loop again until we’re finished.

Easy and very quick, and now I can just point PowerBI at it and let the users and analysts work out what they want to do with it.

Complex SQL Server restore scenarios with the dbatools Restore-DbaDatabase pipeline

No matter how hard the dbatools team try, there's always someone who wants to do things we'd never thought of. This is one of the great things about getting feedback direct from a great community. Unfortunately a lot of these ideas are either too niche to implement, or would mean a lot of complex code for a single use case

As part of the Restore-DbaDatabase stack rewrite, I wanted to make it easier for users to get their hands dirty within the restore stack. Not necessarily needing to dive into the core code and the world of GitHub Pull Requests, but manipulating the data flowing through the pipeline using standard PowerShell techniques, all the while letting our code do the heavy lifting.
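As a rough sketch of the idea (server names and paths are placeholders), you can pull the backup history out, filter or reshape it with ordinary PowerShell, and then hand it back in for the restore:

#Sketch only: gather the backup history, filter it with plain PowerShell, then restore
$BackupHistory = Get-DbaBackupInformation -SqlInstance server1 -Path \\backupserver\sqlbackups
$BackupHistory | Where-Object {$_.Database -eq 'Finance'} | Restore-DbaDatabase -SqlInstance server2 -OutputScriptOnly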

So, below the fold we’ll be looking at some examples of how you can start going to town with your restores

