Musings of a Data professional

Stuart Moore

Category: Azure

Nottingham Global Azure Bootcamp 2019

For the last couple of years Microsoft has been encouraging communities to run a one-day, Azure-focused Bootcamp event on the same day all around the world.

For 2019 we’re pleased to announce that Nottingham will be hosting one of these events for the first time on the 27th April

Sessions and speaker information are available on the website at http://www.nottsdevworkshop.co.uk/azurebootcamp/. Registration is free, so please grab a ticket at Eventbrite

If you want to find out more about the SQL Server options available on Azure, how a company used Azure to produce PDFs on a massive scale or want to get hands on with Service Fabric then we’ve got sessions and workshops for you.

Creating Azure Blob storage account for SQL Server backup and restore with PowerShell

In a previous post I looked at Creating Azure Blob storage account for SQL Server backup and restore via the portal. In this post I'm going to work through setting up the same thing, but using PowerShell. To me this is the simpler and easier way to do things. Most of it can be left running in the background while you drink coffee or read twitter, work smarter not harder 😉

Prerequisites

For these examples I'm going to assume you've installed the up to date versions of these 2 PowerShell modules:

  • dbatools
  • Az

So now we'll step through the PowerShell to set up the Azure Storage. I'll post a screenshot after each block of code so you can see what you should expect to see. This is running on PowerShell 6.1.1 on macOS (ain't cross-platform compatibility grand!)

TL;DR – I've uploaded the whole script to GitHub in this gist – Setting up an Azure Blob container for SQL Server Backups and Restores

Setup

Import-Module dbatools
Import-Module Az

Connect-AzAccount
# Currently returns the wrong URL, use https://aka.ms/DeviceLogon instead

Subscription Check

Make sure you're in the right subscription. Here I've logged in with my Visual Studio subscription, but want to use my MVP Credits one:

#Check you're in the correct subscription
Get-AzContext
Get-AzContext -ListAvailable | Where-Object {$_.Name -Like 'MVP*'} | Set-AzContext
Get-AzContext

Check what’s there

#check existing Storage Accounts
Get-AzStorageAccount  

Create the account

Here I’m using PowerShell splatting which is a great technique to avoid stupidly long parameter strings

I’m creating this storage account with the same settings as the one I created in the post Creating Azure Blob Storage for SQL Server Backups and Restores via the Azure Portal

#we're going to use splatting
$NewStorageAccount = @{
   ResourceGroupName  = "dbatools"
   AccountName = "azblogdemo"
   Location = "UK South"
   SkuName = "Standard_GRS"
   Kind = "StorageV2"
   AccessTier = "Hot"
   EnableHttpsTrafficOnly = $true
}
#Create the Storage Account
New-AzStorageAccount @NewStorageAccount

Getting the Access Keys

If you want to use the older Access Keys method to authenticate, you're pretty much done at this point. If you're on a version of SQL Server that supports the Shared Access Signature method (2016 onwards), then it's a good idea to move over to that method as it's more secure and performs better.

$AzStorageKeys = Get-AzStorageAccountKey -ResourceGroupName $NewStorageAccount.ResourceGroupName -Name $NewStorageAccount.AccountName
$AzStorageKeys
$Sql = "CREATE CREDENTIAL [$($NewStorageAccount.AccountName)] WITH IDENTITY='$($NewStorageAccount.AccountName)', SECRET = '$($AzStorageKeys[0].Value)'" 
$Sql
Invoke-DbaQuery -SqlInstance server1 -Database master -Query $Sql

Onwards to Shared Access Signatures

Grab a context so we can pass it in to other cmdlets:

$AzStorageContext = New-AzStorageContext -StorageAccountName $NewStorageAccount.AccountName -StorageAccountKey $AzStorageKeys[0].Value

Now we create a new container on the Storage Context

#Create a blob container called SQL
$AzStorageContainer = New-AzStorageContainer -Context $AzStorageContext -Name sql

Create a new Shared Access Policy on the container we've just created, using that context:


#Create a Shared Access Policy giving (r)ead, (w)rite, (l)ist and (d)elete permissions for 1 year from now
$SharedAccessPolicy = @{
    Context = $AzStorageContext
    Policy = $AzStorageContext.StorageAccountName+"Policy2"
    Container = "sql"
    ExpiryTime = (Get-Date).ToUniversalTime().AddYears(1)
    Permission = "rwld"
}
$AzSharedAccessPolicy = New-AzStorageContainerStoredAccessPolicy @SharedAccessPolicy
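
If you want to keep backup and restore rights separate, you could add a second, more locked-down policy alongside this one, and generate a separate SAS token against it in exactly the same way as shown below. Here's a rough sketch using the same cmdlets and the $AzStorageContext from above; the policy name and permissions are just an example:

#Optional: a read/list only policy for servers that should only ever restore from this container
$RestorePolicy = @{
    Context = $AzStorageContext
    Policy = $AzStorageContext.StorageAccountName+"RestorePolicy"
    Container = "sql"
    ExpiryTime = (Get-Date).ToUniversalTime().AddYears(1)
    Permission = "rl"
}
New-AzStorageContainerStoredAccessPolicy @RestorePolicy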

Now get the token for the Shared Access Policy, this is the actual ‘credential’ we need.

#Get the Shared Access Token
$AzSas = New-AzStorageContainerSASToken -Policy $SharedAccessPolicy.Policy -Context $AzStorageContext -Container sql

Grab the full URL to the sql container

#We need the URL to the blob storage container we've created:
$Url = $AzStorageContainer.CloudBlobContainer.uri.AbsoluteUri

And then a little bit of string manipulation to build our Query to create the credential

#The SASToken is prefixed with a '?' to make it easy to append to a HTTP querystring, but we don't need it, so use substring(1) to drop it
$AzSas
$SasSql = "CREATE CREDENTIAL [$Url] WITH IDENTITY='SHARED ACCESS SIGNATURE', SECRET='$($AzSas.SubString(1))'"
Invoke-DbaQuery -SqlInstance localhost -Database Master -Query $SasSql

And since we’re using dbatools to create the credentials, it’s very simple to do something like:

Invoke-DbaQuery -SqlInstance server1,server2,server3 -Database Master -Query $SasSql

And then you have the credential on a number of SQL Server instances with no extra effort

The next blog in this short series will have some examples of how you can then use these credentials to backup and restore via the dbatools commands
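
As a taster, here's a rough sketch of what that can look like with dbatools. The instance and database names are placeholders; with a Shared Access Signature credential named after the container URL, Backup-DbaDatabase only needs to be pointed at that URL:

#Backup straight to the blob container we created above
#The SAS credential is picked out by matching its name against -AzureBaseUrl
Backup-DbaDatabase -SqlInstance server1 -Database AdventureWorks -AzureBaseUrl $Url -Type Full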

To save you piecing it together I've uploaded the whole script to GitHub in this gist – Setting up an Azure Blob container for SQL Server Backups and Restores

If you have any questions then please comment below or ping me on twitter

Creating Azure Blob storage account for SQL Server backup and restore via the portal

Since SQL Server 2012 SP1 you've been able to backup and restore SQL Server using Azure Blob storage. This can be useful for keeping offsite replicated backups, or if you need to move data securely between sites; and now with Azure hosted SQL Server (of various stripes) it makes it easy to get data into those systems.

I'm just going to run through creating an Azure Blob Storage account for use with SQL Server using the Azure Portal. Another post will walk through doing the same thing using PowerShell. I normally use the PowerShell versions (no surprise there), but sometimes the graphical version is a good way of understanding what's going on under the hood.

First up, there are 2 ways that you authenticate your SQL Server with Azure storage:

Storage Account Identity with an Access Key

This is the original method introduced with SQL Server 2012, but it has been deprecated since SQL Server 2016. So it will still work, but will disappear at a time of Microsoft’s choosing.

With this method you’re handing over your Access Key, which gives the end service full rights over your storage account. As you can only have 2 active at any time, it can be hard to revoke a key if it’s being misused

Shared Access Signature

This is the more modern way of granting access to your storage. You can create as many signatures as you want, restricting each one to only part of your storage account, with different permissions and different expiry dates, and they can be revoked individually.

So you could create one SAS for backup with create/write permissions and another for restore with read permissions to keep everything lined up.

Creating the Storage Account

Go into the Azure Portal, and find Storage Accounts (I prefer to start typing in the search box):

Ignore the ‘Classic’ option as that won’t work here. We should now have the Storage Accounts blade up:

Storage accounts in Azure Portal

Select Add to create a new storage account

Setting up a Storage account for SQL Server backup and restore

Select which subscription you want to use. You can reuse an existing resource group or create a new one. You need to give your new account a name, which must be unique in the *.blob.core.windows.net namespace. The rest of the settings you can leave alone, unless you want to have a better spread of replication

Storage Account settings for SQL Server Backup and Restore

Under advanced you have a few more options. Enabling Secure transfer is a good idea, and is the default. If you need to restrict access to specific networks (if you’re using ExpressRoute for example), then you can specify them here.

Once you’ve clicked next you’ll have a chance to review your choices, and Azure will check they’re right as well.

Creating Storage container for SQL Server Backups and Restores

Click create and things will whir away in the Data Centre of your choice. After a minute or 2 your deployment will be complete.

Storage Account successfully created for SQL Server backup and restore

Creating a Blob container

Click on the Resource Name and you’ll be taken to your fresh new storage account

Storage Account successfully created for SQL Server backup and restore

Next thing we need to do is to create a container. Click on Blobs, and then on the ‘+ Container’ link

Creating Storage container for SQL Server Backups and Restores

Think of the container as a sort of folder, so give it a name. And as best practice don’t allow anonymous access unless you really want to share your backups with the world!

Access Key Authentication

Now, if you’re using the older Access key method of authenticating you’re almost done. On the Storage account main page select Access Keys under Settings:

Storage Account keys for SQL Server Backups and Restores

There you have your 2 secure access keys. Be very aware that these keys give anyone who has them TOTAL access to the storage account. There is nothing they can’t do once they have these keys. You can only have 2 active keys at any one time. You can generate a new key at any time, but it will invalidate the old one immediately. This is why you have 2 keys, so you can move over to the alternate key and then regenerate the original.
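
If a key does need rotating, you can do it from this blade, or with the Az PowerShell module covered in the companion post. A quick sketch, using the resource group and account names from that post:

#Regenerate key1 - anything still authenticating with the old key1 stops working immediately
New-AzStorageAccountKey -ResourceGroupName dbatools -Name azblogdemo -KeyName key1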

To create the credential you use the following T-SQL:

CREATE CREDENTIAL <CredentialName>
WITH IDENTITY= '<StorageAccountName>',  
SECRET = '<Secure Access Key>'

So in my example, this would look like:

CREATE CREDENTIAL dbatoolslab
WITH IDENTITY= 'dbatoolslab',  
SECRET = 'OPoyubf7gK+aIKDXm6NurzYWlv7tb2GvRb4e1qddq5E5k/2O6KcaQ9+hnRopbrJqT7eWG/h2gwKrrBgDpssPrw=='

You can call the credential whatever you want, but if you've multiple Azure storage accounts then naming the credential after the account helps make sure you're using the one you want.

Shared Access Signature authentication

For Shared Access Signature authentication we've another step to go. On the Overview page for your storage account, select Shared Access Signature on the left hand side:

Shared Access Signature settings for SQL Server Backup and Restores

Here you can adjust the access that the SAS will grant to the holder. By default they are created with a life span of 8 hours, which you can extend as you want. As you can generate as many SAS as you want, you can create them with different permission sets for different access rights, or for different services/offices via allowed IP addresses. When you hit generate you'll be presented with:

Azure Shared Access Signatures for SQL Server Backups and Restores

The SAS token is the value you want, but without the leading '?' (that's only there for easy appending to an HTTP querystring), so the value you use should start with 'sv'.

To create the SAS credential you use the following T-SQL

CREATE CREDENTIAL [https://<storageaccount>.blob.core.windows.net/<container>] 
WITH IDENTITY='SHARED ACCESS SIGNATURE' 
 , SECRET = 'SAS token' 

So in our example that would be:

CREATE CREDENTIAL [https://dbatoolslas.blob.core.windows.net/sql] 
WITH IDENTITY='SHARED ACCESS SIGNATURE' 
 , SECRET = 'sv=2018-03-28&ss=b&srt=c&sp=rwdlac&se=2019-04-03T17:20:25Z&st=2019-04-03T09:20:25Z&spr=https&sig=PLpxNQCW%2FftHsC2NFgR3f4UUSIGGOtRRPLyLG5G90Ak%3D' 

Common mistakes when creating this type of credential are:

  • Leaving a trailing space on the URL in the name
  • Not removing the ? at the start of the SAS token
  • Changing the case of the SAS token (it is case sensitive)
  • Not setting the IDENTITY value correctly, it must be SHARED ACCESS SIGNATURE
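
If you want to double check what actually got created, dbatools can list the credentials on the instance. A small sketch (the instance name is a placeholder):

#List credentials so you can eyeball the name and identity
Get-DbaCredential -SqlInstance server1 | Select-Object SqlInstance, Name, Identity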

And you’re done.

This is a bit long winded, so in the next post I'm going to show the quicker way of doing this via PowerShell.

If you run into any problems or anything isn’t clear hit me up in the comments or on Twitter and I’ll give you a hand

azure cli error ‘NoneType’ object has no attribute ‘server_farm_id’

Been working on a pipeline that spins up Azure WebApp instances as needed for a development process. Trying to bind a custom cert to the instance using:

thumbprint=$(az webapp config ssl upload \
    --name stuartstestapp \
    --resource-group stuartstestapp  \
    --certificate-file ./stuartstestapp.pfx \
    --certificate-password '6ulY$KiD4@iEBQ3kG#8!nqz' \
    --query thumbprint \
    --output tsv)

And was getting back this error:

'NoneType' object has no attribute 'server_farm_id'
Traceback (most recent call last):
  File "/Users/stuart/lib/azure-cli/lib/python2.7/site-packages/knack/cli.py", line 206, in invoke
    cmd_result = self.invocation.execute(args)
  File "/Users/stuart/lib/azure-cli/lib/python2.7/site-packages/azure/cli/core/commands/__init__.py", line 343, in execute
    cmd.exception_handler(ex)
  File "/Users/stuart/lib/azure-cli/lib/python2.7/site-packages/azure/cli/command_modules/appservice/commands.py", line 54, in _polish_bad_errors
    raise ex
AttributeError: 'NoneType' object has no attribute 'server_farm_id'

Which appears all over the shop in search results, but without many actual solutions.

In my case the issue was that the wrong one of my Azure subscriptions was selected, for once I wasn’t testing stuff in my default subscription. So adding a quick line to the start of my script:

az account set --subscription WebAppTest

fixed it right up.

Get your Azure Subscription ID from the Azure portal

Normally getting an Azure Subscription ID isn't a problem for me; usually I'm sat at a machine with the PowerShell ARM modules already loaded, or the Azure CLI just a Docker spin-up away. But not this time, just the raw Azure Portal. Despite it trying to hide them from you, it's quite easy to find the Azure Subscription IDs once you know where to look.

Login to the Azure Portal as normal. Then start typing Subscriptions into the search bar:

Click on the subscription service and a nice list of all your available Azure Subscriptions along with their IDs will pop up:

Just copy and paste the Azure Subscription ID you want and you’re done. Hope that’s useful

Granting Azure Roles to Users or groups without a SignInName in Azure AD

Whilst sorting some permissions out on a resource group I stumbled across an issue with assigning them to a group. We have a replicated On Premises and Azure Active Directory solution in place. In the On Premises AD not all groups are Distribution groups, so they don't have a 'group1@contoso.com' sign in. This causes a problem when trying to assign a role in the usual PowerShell way:

New-AzureRmRoleAssignment -SignInName 'group1@contoso.com' -RoleDefinitionName Contributor -ResourceGroupName MyRg

or

New-AzureRmRoleAssignment -SignInName 'group1' -RoleDefinitionName Contributor -ResourceGroupName MyRg

both return a nice red error message:

'New-AzureRmRoleAssignment : The provided information does not map to an AD object id.
At line:1 char:1
+ New-AzureRmRoleAssignment -SignInName group1@contoso.com -RoleD ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : CloseError: (:) [New-AzureRmRoleAssignment], KeyNotFoundException
    + FullyQualifiedErrorId : Microsoft.Azure.Commands.Resources.NewAzureRoleAssignmentCommand'


The workaround for this is to use the ObjectID parameter, and to grab the ObjectId like this:

$objectid = (Get-AzureRmADGroup -DisplayName "Our Developers").Id
New-AzureRmRoleAssignment -ObjectId $objectid -RoleDefinitionName Contributor -ResourceGroupName MyRg

Just another one of those little oddities that crop up when you’ve got a long established infrastructure that doesn’t quite fit the modern templates 😉

Bulk uploading CSVs to Azure SQL Database with dbatools

Like most people we're busy moving ourselves over to Azure, and like a lot of people (even though they won't admit it) we've got years of data stashed away in CSV files. Go on, own up, there's years' worth of department membership stashed away in a HR csv folder somewhere in your organisation 😉

To get some of this data usable for reporting we’re importing it into Azure SQL Database so people can start working their way through it, and we can fix up errors before we push it through into Azure Data Lake for mining. Being a fan of dbatools it was my first port of call for automating something like this.

Just to make life interesting, I want to add a time of creation field to the data to make tracking trends easier. As this information doesn't actually exist in the CSV columns, I'm going to use LastWriteTime as a proxy for the creation time.

$Files = Get-ChildItem \\server\HR\HandSTraining\Archive -Filter *.Csv
$SqlCredential = Get-Credential

ForEach ($File in $Files | Where-Object {$_.Length -gt 0}) {
    $InputObject = ConvertFrom-Csv -InputObject (Get-Content $File.fullname -raw) -Header UserName, StatusName
    $InputObject | Add-Member -MemberType NoteProperty -Value $File.LastWriteTime -Name DateAdded
    $DataTable = $InputObject | Out-DbaDataTable
    Write-DbaDataTable -SqlInstance superduper.database.windows.net -Database PreventPBI -Table Training -InputObject $DataTable -Schema dbo -SqlCredential $SqlCredential -RegularUser
    Remove-Variable InputObject
}

Working our way through that, we have:

$Files = Get-ChildItem \\server\HR\HandSTraining\Archive -Filter *.Csv
$SqlCredential = Get-Credential

Setup the basics we're going to need throughout. Grab all the csv files off of our network share. I prefer grabbing credentials with Get-Credential, but you can embed them in the script if you'd rather.
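
That would look something like the snippet below. This is just a sketch with placeholder values; keeping passwords in plain text in a script is best avoided outside of a test rig:

#Placeholder username and password - swap in your own, or better still stick with Get-Credential
$Password = ConvertTo-SecureString 'YourStrongPasswordHere' -AsPlainText -Force
$SqlCredential = New-Object System.Management.Automation.PSCredential ('sqladmin', $Password)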


We then ForEach through all the files, having filtered out the empty ones

    $InputObject = ConvertFrom-Csv -InputObject (Get-Content $File.fullname -raw) -Header UserName, StatusName
    $InputObject | Add-Member -MemberType NoteProperty -Value $File.LastWriteTime -Name DateAdded

Load the file contents into an object with ConvertFrom-Csv. These csv files don't contain a header row, so I'm using the -Header parameter to force the column names in. This also helps with Write-DbaDataTable, as I can ensure that the object property names match the SQL column names for the upload.

Then we add a new property to our Input Object. Doing it this way we add it to every ‘row’ in the object at once. If you want to add multiple new properties just keep doing this for each one.
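
For example, if you also wanted to record which file each row came from, another Add-Member call would do it; the property name here is just an illustration:

#Hypothetical extra property recording the source file name
$InputObject | Add-Member -MemberType NoteProperty -Value $File.Name -Name SourceFile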

    $DataTable = $InputObject | Out-DbaDataTable
    Write-DbaDataTable -SqlInstance superduper.database.windows.net -Database PreventPBI -Table Training -InputObject $DataTable -Schema dbo -SqlCredential $SqlCredential -RegularUser

Convert our InputObject into a datatable, which is the format Write-DbaDataTable needs for input.

And then the command that does the loading, Write-DbaDataTable. There are only two things here that you have to do differently for loading to an Azure SQL database as opposed to a normal SQL Server instance. For Azure SQL Databases you have to use a SQL Credential, as the underlying dlls don't work (yet) with the various Integrated Authentication options. And you need to use the RegularUser switch. Normally dbatools will assume you have sysadmin rights on your SQL Server instance, as they are needed for many of the tasks. In an Azure SQL Database you can't have those rights as they don't exist, so without RegularUser you'll get a nice error message. Just something to look out for, I've tripped myself up in the past when repointing load scripts.

Then we drop InputObject and go round the loop again until we’re finished.

Easy and very quick, and now I can just point PowerBI at it and let the users and analysts work out what they want to do with it.

New-AzureRmAutomationCredential returning “Input string was not in a correct format”

Having banged my head against this misleading error message, I thought I’d post this.

It's nothing to do with any of the parameters you're passing in; believe me, I checked.

It’s actually complaining that the Credential already exists! I was doing a redeployment and the credential hadn’t been removed.

And in other useful annoyances, Find-AzureRmResource doesn’t work for Automation Credentials. So now my deployment scripts looks like this:

if ((Get-AzureRmAutomationCredential -ResourceGroupName $rgname  -name "meadmin" -AutomationAccountName $AutoName)){
    Remove-AzureRmAutomationCredential -ResourceGroupName $rgname  -name "meadmin" -AutomationAccountName $AutoName 
  }

New-AzureRmAutomationCredential -AutomationAccountName $autoname -ResourceGroupName $rgname -name "meadmin" -Value $acred -Verbose
