For the last couple of years Microsoft has been encouraging communities to run a one-day, Azure-focused Bootcamp event on the same day all around the world.
For 2019 we’re pleased to announce that Nottingham will be hosting one of these events for the first time, on the 27th April.
If you want to find out more about the SQL Server options available on Azure, how a company used Azure to produce PDFs on a massive scale or want to get hands on with Service Fabric then we’ve got sessions and workshops for you.
In a previous post I looked at Creating Azure Blob storage account for SQL Server backup and restore via the portal. In this post I’m going to work through setting up the same thing, but using PowerShell. To me this is the simpler and easier way to do things. Most of it can be left running in the background while you drink coffee or read Twitter, work smarter not harder 😉
Prerequisites
For these examples I’m going to assume you’ve installed the up to date versions of these 2 PowerShell modules:
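If they’re not already installed, they can be pulled down from the PowerShell Gallery. A minimal sketch, assuming the two modules in question are Az and dbatools:

#Install the modules from the PowerShell Gallery for the current user
Install-Module -Name Az -Scope CurrentUser
Install-Module -Name dbatools -Scope CurrentUser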
So now we’ll step through the PowerShell to set up the Azure Storage. I’ll post a screenshot after each block of code so you can see what you should expect to see. This is running on PowerShell 6.1.1 on macOS (ain’t cross-platform compatibility grand!)
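The broad shape of the setup looks like this. It’s just a minimal sketch assuming the Az module; the resource group, storage account and container names and the location are placeholders, so swap in your own:

#Log in to Azure and create a resource group to hold the storage account
Connect-AzAccount
New-AzResourceGroup -Name dbatoolslab-rg -Location 'UK South'

#Create the storage account, and a container within it to hold the backups
$AzStorageAccount = New-AzStorageAccount -ResourceGroupName dbatoolslab-rg -Name dbatoolslab -Location 'UK South' -SkuName Standard_LRS
$AzStorageContainer = New-AzStorageContainer -Name backups -Context $AzStorageAccount.Context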
If you want to use the older Access Keys method to authenticate you’re pretty much done at this point. If you’re on a version of SQL Server that supports the Shared Access Signature method (2016 onwards), then it’s a good idea to move over to that method as it’s more secure and performs better.
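To generate the Shared Access Signature from PowerShell you can use New-AzStorageContainerSASToken. A rough sketch, where the permissions and expiry are just examples to adjust to your own policy:

#Create a SAS token for the container, valid for a year, with enough rights for backup and restore
$AzSas = New-AzStorageContainerSASToken -Name $AzStorageContainer.Name -Context $AzStorageAccount.Context -Permission rwdl -ExpiryTime (Get-Date).AddYears(1)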
#We need the URL to the blob storage container we've created:
$Url = $AzStorageContainer.CloudBlobContainer.uri.AbsoluteUri
And then a little bit of string manipulation to build our query to create the credential:
#The SASToken is prefixed with a '?' to make it easy to append to a HTTP querystring, but we don't need it, so use substring(1) to drop it
$AzSas
$SasSql = "CREATE CREDENTIAL [$Url] WITH IDENTITY='SHARED ACCESS SIGNATURE', SECRET='$($AzSas.SubString(1))'"
Invoke-DbaQuery -SqlInstance localhost -Database Master -Query $SasSql
And since we’re using dbatools to create the credentials, it’s very simple to do something like:
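For example, something along these lines will back a database up straight to the container, with the database name being just an example:

#Back up straight to the Azure blob container; SQL Server uses the credential named after the container URL
Backup-DbaDatabase -SqlInstance localhost -Database MyDatabase -AzureBaseUrl $Url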
Since SQL Server 2012 SP1 you’ve been able to backup and restore SQL Server using Azure Blob storage. This can be useful to have offsite replicated backups or if you need to move data securely between sites, and now with Azure hosted SQL Server (of various stripes) it makes it easy to get data into those systems.
I’m just going to run through creating an Azure Blob Storage account for use with SQL Server using the Azure Portal. Another post will walk through doing the same thing using PowerShell. I normally use the PowerShell versions (no surprise there), but sometimes the graphical version is a good way of understanding what’s going on under the hood.
First up, there are 2 ways that you authenticate your SQL Server with Azure storage:
Storage Account Identity with an Access Key
This is the original method introduced with SQL Server 2012, but it has been deprecated since SQL Server 2016. So it will still work, but will disappear at a time of Microsoft’s choosing.
With this method you’re handing over your Access Key, which gives the end service full rights over your storage account. As you can only have 2 active at any time, it can be hard to revoke a key if it’s being misused.
Shared Access Signature
This is the more modern way of granting access to your storage. You can create as many signatures as you want, restricting them to accessing only parts of your storage account, with different permissions and different expiry dates, and they can be revoked individually.
So you could create one SAS for backup with create/write permissions and another for restore with read permissions to keep everything lined up.
Creating the Storage Account
Go into the Azure Portal, and find Storage Accounts (I prefer to start typing in the search box):
Ignore the ‘Classic’ option as that won’t work here. We should now have the Storage Accounts blade up:
Select Add to create a new storage account
Select which subscription you want to use. You can reuse an existing resource group or create a new one. You need to give your new account a name, which must be unique in the *.blob.core.windows.net namespace. The rest of the settings you can leave alone, unless you want a better spread of replication.
Under advanced you have a few more options. Enabling Secure transfer is a good idea, and is the default. If you need to restrict access to specific networks (if you’re using ExpressRoute for example), then you can specify them here.
Once you’ve clicked next you’ll have a chance to review your choices, and Azure will check they’re right as well.
Click create and things will whir away in the Data Centre of your choice. After a minute or 2 your deployment will be complete.
Creating a Blob container
Click on the Resource Name and you’ll be taken to your fresh new storage account
Next thing we need to do is to create a container. Click on Blobs, and then on the ‘+ Container’ link
Think of the container as a sort of folder, so give it a name. And as best practice don’t allow anonymous access unless you really want to share your backups with the world!
Access Key Authentication
Now, if you’re using the older Access key method of authenticating you’re almost done. On the Storage account main page select Access Keys under Settings:
There you have your 2 secure access keys. Be very aware that these keys give anyone who has them TOTAL access to the storage account. There is nothing they can’t do once they have these keys. You can only have 2 active keys at any one time. You can generate a new key at any time, but it will invalidate the old one immediately. This is why you have 2 keys, so you can move over to the alternate key and then regenerate the original.
To create the credential you use the following T-SQL:
CREATE CREDENTIAL dbatoolslab
WITH IDENTITY= 'dbatoolslab',
SECRET = 'OPoyubf7gK+aIKDXm6NurzYWlv7tb2GvRb4e1qddq5E5k/2O6KcaQ9+hnRopbrJqT7eWG/h2gwKrrBgDpssPrw=='
You can call the credential whatever you want, but if you’ve multiple Azure accounts then using the same name as the storage account helps make sure you’re using the one you want.
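Once the credential is in place it gets referenced in the WITH CREDENTIAL clause of the backup. A quick illustration, where the database, container and file names are made up:

BACKUP DATABASE MyDatabase
TO URL = 'https://dbatoolslab.blob.core.windows.net/backups/MyDatabase.bak'
WITH CREDENTIAL = 'dbatoolslab';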
Shared Access Signature authentication
For Shared Access Signature authentication we’ve another step to go. On the Overview page for your storage account, select Shared Access Signature on the left hand side:
Here you can adjust the access that the SAS will grant to the holder. By default they are created with a lifespan of 8 hours, which you can extend as needed. As you can generate as many SAS tokens as you want, you can create them with different permission sets for different access rights, or for different services/offices via allowed IP addresses. When you hit generate you’ll be presented with:
The SAS token is the value you want, but without the leading ‘?’. That’s only there for easy appending to a HTTP querystring; the part you want starts at ‘sv’.
To create the SAS credential you use the following T-SQL:
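The credential name has to be the full URL of the container, and the secret is the SAS token without the leading ‘?’. The storage account and container names here are placeholders, as is the shortened secret:

CREATE CREDENTIAL [https://dbatoolslab.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&sr=c&sig=ReplaceWithYourOwnSasToken';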
Been working on a pipeline that spins up Azure WebApp instances as needed for a development process. Trying to bind a custom cert to the instance using:
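A representative version of the binding call, where the app name, resource group and thumbprint are placeholders, looks like:

az webapp config ssl bind --name mywebapp --resource-group my-rg --certificate-thumbprint $thumbprint --ssl-type SNI

Only for it to fall over with: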
'NoneType' object has no attribute 'server_farm_id'
Traceback (most recent call last):
File "/Users/stuart/lib/azure-cli/lib/python2.7/site-packages/knack/cli.py", line 206, in invoke
cmd_result = self.invocation.execute(args)
File "/Users/stuart/lib/azure-cli/lib/python2.7/site-packages/azure/cli/core/commands/__init__.py", line 343, in execute
cmd.exception_handler(ex)
File "/Users/stuart/lib/azure-cli/lib/python2.7/site-packages/azure/cli/command_modules/appservice/commands.py", line 54, in _polish_bad_errors
raise ex
AttributeError: 'NoneType' object has no attribute 'server_farm_id'
Which appears all over the shop in search results, but without many actual solutions.
In my case the issue was that the wrong one of my Azure subscriptions was selected, for once I wasn’t testing stuff in my default subscription. So adding a quick line to the start of my script:
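Something like this, with the subscription name being a placeholder, makes sure the rest of the script runs against the subscription that actually holds the WebApp:

az account set --subscription 'My Dev Subscription'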
Normally getting an Azure Subscription ID isn’t a problem for me. Normally I’m sat at a machine with the PowerShell ARM modules already loaded or the Azure CLI a docker spin-up away. But not this time, just the raw Azure Portal. Despite it trying to hide them from you, it’s quite easy to find the Azure Subscription IDs once you know where to look.
Login to the Azure Portal as normal. Then start typing Subscriptions into the search bar:
Click on the subscription service and a nice list of all your available Azure Subscriptions along with their IDs will pop up:
Just copy and paste the Azure Subscription ID you want and you’re done. Hope that’s useful
Whilst sorting some permissions out on a resource group I stumbled across an issue with assigning them to a group. We have a replicated On Premises and Azure Active Directory solution in place. In the On Premises AD not all groups are Distribution groups, so they don’t have a ‘group1@contoso.com’ sign in. This causes a problem when trying to assign a role in the usual PowerShell way:
'New-AzureRmRoleAssignment : The provided information does not map to an AD object id.
At line:1 char:1
+ New-AzureRmRoleAssignment -SignInName group1@contoso.com -RoleD ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [New-AzureRmRoleAssignment], KeyNotFoundException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.Resources.NewAzureRoleAssignmentCommand'
The workaround for this is to use the ObjectID parameter, and to grab the ObjectId like this:
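A sketch of that, assuming the search brings back a single group, and with the role and resource group names as placeholders:

#Look the group up in Azure AD, then use its ObjectId for the role assignment
$Group = Get-AzureRmADGroup -SearchString 'group1'
New-AzureRmRoleAssignment -ObjectId $Group.Id -RoleDefinitionName 'Contributor' -ResourceGroupName 'MyResourceGroup'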
Just another one of those little oddities that crop up when you’ve got a long established infrastructure that doesn’t quite fit the modern templates 😉
Like most people we’re busy moving ourselves over to Azure, and like a lot of people (even though they won’t admit it) we’ve got years of data stashed away in CSV files. Go on, own up, there’s years’ worth of department membership stashed away in an HR csv folder somewhere in your organisation 😉
To get some of this data usable for reporting we’re importing it into Azure SQL Database so people can start working their way through it, and we can fix up errors before we push it through into Azure Data Lake for mining. Being a fan of dbatools it was my first port of call for automating something like this.
Just to make life interesting, I want to add a time of creation field to the data to make tracking trends easier. As this information doesn’t actually exist in the CSV columns, I’m going to use LastWriteTime as a proxy for the creation time.
Set up the basics we’re going to need throughout. Grab all the csv files off of our network share. I prefer grabbing credentials with Get-Credential, but if you’d prefer to embed them in the script you can use:
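A sketch of the embedded version, with obviously made-up values (please don’t leave real passwords sat in scripts):

#Build a credential object without prompting
$Password = ConvertTo-SecureString 'NotARealPassword' -AsPlainText -Force
$SqlCredential = New-Object System.Management.Automation.PSCredential ('datawriter', $Password)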
We then ForEach through all the files, having filtered out the empty ones.
Load the file contents into an object with ConvertFrom-Csv. These csv files don’t contain a header row, so I’m using the -Header parameter to force them in. This also helps with Write-DbaDataTable as I can ensure that the object names match the SQL column names for the upload.
Then we add a new property to our Input Object. Doing it this way we add it to every ‘row’ in the object at once. If you want to add multiple new properties just keep doing this for each one.
Convert our InputObject into a datatable, which is the format Write-DbaDataTable needs for input.
And then the command that does the loading, Write-DbaDataTable. There are only a couple of things here that you have to do differently for loading to an Azure SQL Database as opposed to a normal SQL Server instance. For Azure SQL Databases you have to use a SQL Credential, as the underlying dlls don’t work (yet) with the various Integrated Authentication options. You also need to use the RegularUser switch. Normally dbatools will assume you have sysadmin rights on your SQL Server instance as they are needed for many of the tasks. In an Azure SQL Database you can’t have those rights as they don’t exist, so without the RegularUser switch you’ll get a nice error message. Just something to look out for, I’ve tripped myself up in the past when repointing load scripts.
Then we drop InputObject and go round the loop again until we’re finished.
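Putting those steps together, a rough sketch of the whole loop looks like the following. The share path, header names, server, database and table are all placeholders, and $SqlCredential is the credential object from earlier:

#Grab the csv files off the share, skipping any empty ones
$CsvFiles = Get-ChildItem '\\fileserver\hr\*.csv' | Where-Object {$_.Length -gt 0}

ForEach ($CsvFile in $CsvFiles) {
    #Load the file, forcing header names on so they line up with the SQL column names
    $InputObject = Get-Content $CsvFile.FullName | ConvertFrom-Csv -Header 'EmployeeId', 'Department', 'StartDate'

    #Add the creation time to every 'row', using LastWriteTime as a proxy
    $InputObject | Add-Member -MemberType NoteProperty -Name 'CreationTime' -Value $CsvFile.LastWriteTime

    #Convert to a datatable and push it up to the Azure SQL Database
    $DataTable = ConvertTo-DbaDataTable -InputObject $InputObject
    Write-DbaDataTable -SqlInstance 'myserver.database.windows.net' -SqlCredential $SqlCredential -Database HRReporting -Table DepartmentHistory -InputObject $DataTable -RegularUser -AutoCreateTable

    #Drop the input object before going round again
    Remove-Variable InputObject
}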
Easy and very quick, and now I can just point PowerBI at it and let the users and analysts work out what they want to do with it.