Musings of a Data professional

Stuart Moore


Nottingham-based SQL Server DBA who dabbles with many other technologies. 15+ years of experience with databases and still keen to learn and explore.

Day 8 – Why is Restore-DbaDatabase like it is? – 31 days of dbatools backup and restores

Why is Restore-DbaDatabase like it is?

At first glance Restore-DbaDatabase looks like a slow, lumbering, complex beast. In reality it’s not that bad.

It’s the result of design decisions I took while aiming for a solid, versatile command that could cope with everything people would want from it.

In this post, we’ll go through the main decisions and points of contention one by one.

Day 7 – Backing up to Azure – 31 days of dbatools backup and restores

Day 7 – Backing up to Azure

Backing up to Azure? Why would you want to do that if you’re not running Azure SQL Database/Managed Instance?

There are a couple of reasons, which you may or may not agree with:

  • Reliable storage
  • Off site storage
  • Scalable storage
  • Cheap(ish) storage

All of those are of course debatable, depending on your circumstances. But if you’re starting from scratch and don’t already have an on-premises storage infrastructure, then it’s quicker to start there and build up. And if you’re running SQL Server inside an Azure VM, do you want to be piping data back down to the home network, or paying for another VM to act as a backup target?

The main reason I like Azure storage when I can use it is number 2 – Offsite storage.

My backups will survive a major incident, say a failed Data Centre or a major fire. OK, those could be mitigated with a secondary Data Centre, but that’s an extra level of expense that your organisation might not be able to justify.

And there are disasters that may affect all storage in your organisation. Ransomware being the one that springs to mind. If your SQL Server backups get encrypted by someone other than yourself, then they are worthless. You have NO way of recovering without paying the money. It has happened!

Companies who think they have good security have found that ransomware can still propagate onto their backup servers, as the software is written to exploit any weakness in NTLM or SMB it can find. However, I’ve yet to hear of one that can break into SQL Server and get hold of a credential, which is what would need to happen for it to even know about SQL Server backups stored in Azure.

And if you’re really paranoid (you’re probably a DBA, so there’s a good chance), you can even use an Azure Function to move all new backups to another Azure Storage account when they finish, so there’s no traceable link anywhere on premises.

Setting up Azure for SQL Server Backups

Now, I’m going to cheat here a little and point you at a couple of posts I’ve already written that cover setting up Azure Blob storage for SQL Server backups:

If you want to use the Azure Portal – Creating Azure Blob Storage Account for SQL Server Backup and Restore, using the Azure Portal

If you want to use PowerShell – Creating Azure Blob Storage Account for SQL Server Backup and Restore, using PowerShell

As I say in the above posts, I’d really, really recommend you go with Shared Access Signatures. Better security granularity, easier to work with, and they’re the future.

Once you’ve created your Azure Blob storage account and the SQL Server credential, you can use Backup-DbaDatabase to backup to Azure.

When backing up to Azure you are limited to a subset of parameters. This is purely because Azure Blob storage won’t support things like MaxTransferSize. The allowed parameters are:

  • CopyOnly
  • Type
  • CompressBackup
  • Checksum
  • Verify
  • AzureCredential
  • ReplaceInName
  • CreateFolder
  • OutputScriptOnly
  • EncryptionAlgorithm
  • EncryptionCertificate

When using a Shared Access Signature you can also use FileCount.

Backing up to Azure with Backup-DbaDatabase

There are 2 ways the command can be used to backup to Azure, depending on whether you’re using Access Keys or Shared Access Signatures.

With Access keys you use this format:

    Backup-DbaDatabase -SqlInstance server1\instance -Database db1 -AzureBaseUrl https://dbatoolsaz.blob.core.windows.net/azbackups/ -AzureCredential myAzureCred

Where myAzureCred is the name of the credential on server1\instance that holds the Azure Access Keys. When using the older Access Keys style you cannot stripe backups, so you can only provide one URL.

With a Shared Access Signature you use this format:

    Backup-DbaDatabase -SqlInstance server1\instance -Database db1 -AzureBaseUrl https://dbatoolsaz.blob.core.windows.net/azbackups/

This works because Backup-DbaDatabase will look for a credential named ‘https://dbatoolsaz.blob.core.windows.net/azbackups/’ on the instance ‘server1\instance’.
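If you haven’t created that credential yet, dbatools can do that for you as well. A minimal sketch, assuming you already have a SAS token for the container – the token value below is just a placeholder, and the credential name needs to match the AzureBaseUrl you’ll pass to Backup-DbaDatabase:

    # the identity must be 'SHARED ACCESS SIGNATURE' for SAS-based backup credentials
    $sasToken = ConvertTo-SecureString 'sv=2019-02-02&ss=b&srt=sco&sp=rwdlac...' -AsPlainText -Force
    New-DbaCredential -SqlInstance server1\instance -Name 'https://dbatoolsaz.blob.core.windows.net/azbackups' -Identity 'SHARED ACCESS SIGNATURE' -SecurePassword $sasToken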

With Shared Access Signature credentials you can stripe files, either within the same container:

    Backup-DbaDatabase -SqlInstance server1\instance -Database db1 -AzureBaseUrl https://dbatoolsaz.blob.core.windows.net/azbackups/ -FileCount 3

or by specifying multiple AzureBaseUrl values:

    Backup-DbaDatabase -SqlInstance server1\instance -Database db1 -AzureBaseUrl https://dbatoolsaz.blob.core.windows.net/azbackups/,https://azdbatools.blob.core.windows.net/azbackups/ 

This assumes you have valid credentials named ‘https://dbatoolsaz.blob.core.windows.net/azbackups/’ and ‘https://azdbatools.blob.core.windows.net/azbackups/’ on the SQL instance performing the backup.

Conclusion

Backing up your SQL instances to Azure can be a good option if your organisation is just starting out, and having your backups off site minimises some of the risks in a Disaster Recovery scenario. Plus it’s a great way to migrate databases up to Azure; in fact, that’s how Copy-DbaDatabase works.

For more ideas on how to use dbatools to migrate SQL Server databases to Azure, have a look at the book dbatools in a month of lunches.

All posts in this series can be found at 31 Days of dbatools Backup and Restores

Day 6 – Backup-DbaDatabase in scripts – 31 days of dbatools Backup and Restores

Day 6 – Backup-DbaDatabase in Scripts

One of the great things with dbatools is that you can pull all of the different commands together to make some really powerful scripts for yourself.

Backup-DbaDatabase is no exception. In today’s post I’m going to show you a couple of things you can do with it, and hopefully inspire you to come up with some ideas of your own.
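As a taster, here’s a minimal sketch of the kind of thing I mean: take a copy-only backup before a deployment and stop if it didn’t complete. I’m assuming the BackupComplete property on the output object here; we’ll dig into the output properly in the Day 2 post.

# take a copy-only backup before a risky change, and bail out if it didn't complete
$backup = Backup-DbaDatabase -SqlInstance server1\instance -Database db1 -CopyOnly
if (-not $backup.BackupComplete) {
    Write-Warning "Backup of db1 didn't complete - stopping here"
    return
}
# ...carry on with the deployment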

Day 5 – Performance Tuning Backups – 31 Days of dbatools Backup and Restores

Day 5 – Performance Tuning Backups

Why does backup performance matter?

As your database gets bigger, backups take longer to complete. This makes sense: more data has to be written to disk, which is going to take longer.

The reason it becomes a problem is that your organisation will have a Recovery Point Objective (RPO), which is the most data loss they’re willing to accept in the event of a disaster. Now, it might not be officially announced, but I’m sure management has a number in mind, and it’s probably lower than you think.

To hit your RPO you need less time than that between your backups. So let’s say your RPO is 15 minutes. This means you need a backup at LEAST every 15 minutes to hit the target. If your transaction log backups are taking 20 minutes, you’ll never make it.
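A quick way to see where you stand is to check how long your recent log backups have actually taken. A rough sketch, assuming Get-DbaDbBackupHistory and its Duration property in your dbatools version:

# how long have the log backups for db1 taken over the last 7 days?
Get-DbaDbBackupHistory -SqlInstance server1\instance -Database db1 -Type Log -Since (Get-Date).AddDays(-7) |
    Select-Object Database, Start, End, Duration |
    Sort-Object Start -Descending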

Obviously we can’t reduce the amount of data being created; your boss won’t thank you for rate limiting customers!

So all we can do is to make things flow out faster. Let’s see how we do that:

Day 4 – Backup Encryption with Backup-DbaDatabase – 31 days of dbatools backup and restores

Day 4 – Backup Encryption with Backup-DbaDatabase

Now, we all take our db security very seriously (or at least I hope we do?). But a lot of people don’t think this through to their database backups.

A database backup is a full copy of the database; by definition, all of the data is in there.

So if someone gets their hands on it? Then they may as well have dbo on your production database, as they can just restore it onto their own SQL instance and go to town on the data.

That is, unless you’ve implemented some security around your backups.
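We’ll go through the details in the rest of the post, but as a flavour, backup encryption is just a couple of parameters on Backup-DbaDatabase. A minimal sketch, assuming you’ve already created a backup certificate in master on the instance (BackupCert is a placeholder name):

# take a full backup encrypted with an existing server certificate
Backup-DbaDatabase -SqlInstance server1\instance -Database db1 -Type Full -EncryptionAlgorithm AES256 -EncryptionCertificate BackupCert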

Day 3 – Backup-DbaDatabase Filenames and paths – 31 days of dbatools backup and restores

Day 3 – Backup-DbaDatabase Filenames and paths

Yesterday we looked at the basics of using the Backup-DbaDatabase function. In all of those examples the backup files were going to the default backup location of our SQL Server instance, and using the default name generation process of Backup-DbaDatabase.

This often isn’t what you want, so we offer you plenty of options for creating and handling paths.
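As a preview, here’s a minimal sketch using the -Path, -FilePath and -ReplaceInName parameters (the share and file name are placeholders, and I’m assuming the dbname/backuptype/timestamp tokens in your dbatools version):

# write the backup to a specific folder, with a file name built from tokens
# -ReplaceInName swaps dbname, backuptype and timestamp for their real values
Backup-DbaDatabase -SqlInstance server1\instance -Database db1 -Path \\backupshare\sql\db1 -FilePath 'dbname-backuptype-timestamp.bak' -ReplaceInName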

Day 2 – Backup-DbaDatabase Basics – 31 days of dbatools backup and restores

Day 2 – Backup Basics

One of the most basic things a DBA needs to do is to take reliable backups. Before you undertake any piece of work you need to have somewhere to roll back to in case it all goes horribly wrong.

The dbatools Backup-DbaDatabase function can really help out here.

While it can be automated with SQL Agent jobs, that’s not its real strong point. For that you’re better off sticking with Ola Hallengren’s maintenance solution.
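If you do want Ola’s solution in place, dbatools will even install it for you. A quick sketch, assuming the defaults of Install-DbaMaintenanceSolution suit you (the backup share is a placeholder):

# install Ola Hallengren's maintenance solution, including the SQL Agent jobs
Install-DbaMaintenanceSolution -SqlInstance server1\instance -Database master -BackupLocation \\backupshare\sql -InstallJobs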

The Basics

At its most basic, all Backup-DbaDatabase needs is a SQL instance:

Backup-DbaDatabase -SqlInstance server1\instance 

This will take a full backup of every database on instance server1\instance into the instance’s default backup location.

This assumes that you’re running Windows authentication on your SQL instance. If not, you need to provide the credentials to connect with.

You do this with the SqlCredential parameter like so:

 
Backup-DbaDatabase -SqlInstance server1\instance -SqlCredential $credential

To create a credential you have 2 options:

Using Get-Credential. This will pop up a box for you to enter the username and password to create the credential:

$credential = Get-Credential

Or you can create one on the command line, though this will leave your password in the command history:

$password = ConvertTo-SecureString 'P@ssw0rd' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ('username', $password)

Just in case you don’t know what the default backup location for an instance is, you can grab it like so:

Get-DbaDefaultPath -SqlInstance server\instance

If you want to back up one or more specific databases, you can use the -Database parameter:

Backup-DbaDatabase -SqlInstance server1\instance -Database db1, db4

which will back up db1 and db4. Or you can pipe in values from Get-DbaDatabase like so:

Get-DbaDatabase -SqlInstance server\instance | Where-Object {$_.name -like '*test'} | Backup-DbaDatabase

Which will back up every database on server\instance whose name ends in test. Really handy for when the devs want to deploy something new.

Other backup types

Now, it’s not always just about full backups. Of course we can handle differential and log backups too, each just a parameter away:

Get-DbaDatabase -SqlInstance server\instance | Where-Object {$_.name -like '*test'} | Backup-DbaDatabase -Type Differential

Get-DbaDatabase -SqlInstance server\instance | Where-Object {$_.name -like '*test'} | Backup-DbaDatabase -Type Log

There is one other type of backup that is very important to be aware of: the COPYONLY backup. When a SQL Server database is using the full recovery model, you want to make sure that you don’t interrupt the backup chain, as that can ruin your chances of performing a successful restore.

For instance, if you’re using differential backups: each differential backup is the sum of all changes since a specific full backup. If you take a new full backup, all subsequent differentials are based on that one. That can be a problem if you’re just grabbing a backup to refresh test or to investigate an issue.

A COPYONLY backup does not break the chain, so it can be deleted once it’s no longer needed without breaking your recovery plan.

To take one of these, just use the -CopyOnly switch:

Get-DbaDatabase -SqlInstance server\instance | Where-Object {$_.name -like '*test'} | Backup-DbaDatabase -CopyOnly

Testing a backup command

Maybe you want to try something out and see what’s going to happen, but you don’t want to actually do the backup?

That’s what the -OutputScriptOnly switch is for. This goes through all the same processing as a full run; the only difference is that Backup-DbaDatabase will just spit out the T-SQL without running the command.

Running this:

Backup-DbaDatabase @Script:appsplat -Database roles1 -FileCount 3 -OutputScriptOnly

will not perform a backup, but will produce this for you:

BACKUP DATABASE [roles1] TO DISK = N'/var/opt/mssql/data\roles1_202004281714-1-of-3.bak', DISK = N'/var/opt/mssql/data\roles1_202004281714-2-of-3.bak', DISK = N'/var/opt/mssql/data\roles1_202004281714-3-of-3.bak' WITH NOFORMAT, NOINIT, NOSKIP, REWIND, NOUNLOAD, STATS = 1

We’ll be covering what some of those other options are in a post in a couple of days.

Backup-DbaDatabase Output

Backup-DbaDatabase produces a lot of output that can be of interest. Quite a lot of it isn’t shown at first glance, so let’s take a look at what’s available:

$output = Backup-DbaDatabase @Script:appsplat -Database roles1 -FileCount 3

Now, the standard formatted output gives you the basic information about the backup. But if you ask for every property on the output object, you get a lot more information:
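A minimal way to see everything that the default view is hiding:

# show every property on the backup output object, not just the default formatted view
$output | Select-Object *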

That’s just about every stat you might want for your backup, or everything you might want to test for after taking the backup.

Conclusion

Today we’ve looked at the basics of what Backup-DbaDatabase can do; tomorrow we’ll be looking at handling backup files.

All posts in this series can be found at 31 Days of dbatools Backup and Restores

 

An introduction to dbatools backup and restores – Day 1 of 31 days of dbatools Backup and Restores

Day 1 of 31 – Introduction

Welcome to Day 1 of 31 days of posts on dbatools backup and restore capabilities.

Wait, that sounds like a lot of posts about dbatools backup and restores? Well, some years ago I did a similar series on standard PowerShell Backup and Restores, and all of that plus a lot more is now built into the dbatools module.

This means there’s a lot of power hidden in Backup-DbaDatabase and Restore-DbaDatabase, but it also means it’s easy to miss features and options. So 31 days feels about right to cover everything, I think.

Currently it’s 1st May 2020, and we’re in lockdown in the UK. Which you’d think means lots of free time, right? It’s not quite working out like that, so there may be gaps when real life takes over. But I’m going to try my best for a post every day this month.

Why me?

I wrote a lot of the Restore and Backup code, so I’ve got a good idea of how it hangs together and just why it was designed in certain ways. It wasn’t just to annoy people; there was a plan to offer some big power features. Really!

About the examples

Just to prove the flexibility, I’ll be using as many different versions of PowerShell and SQL Server as I can to demo things. If you want to follow along, then Windows PowerShell 5.1 and SQL Server -gt 2008 will be fine.

Level

We’ll be starting off with simple concepts and examples, and then building on those. Most things we’ll be looking at won’t be too technical unless we’re inside the functions. If something doesn’t make sense, then please let me know and I’ll do my best to clear it up.

So, that’s Day 1. Tomorrow we’ll begin by taking a look at the basics of Backup-DbaDatabase.

31 Days of Backup and Restore with dbatools

So I’m doing another 31 days of posts, hopefully on consecutive days, but given the current pandemic and sporadic working hours it may not work out that way.

This is the index of all the posts; I’ll update it as each one is posted.

Posts

 

Please clean down your SQL Server backup history

From looking around at some of the Issues that get logged for dbatools, it looks like people never clear out their SQL backup history from msdb. This causes slow queries and potentially misleading backup trees to be returned.

In my (not so humble) opinion you really shouldn’t let your backup history fill up with, let’s be honest, useless junk. In this post I’ll go through why I think you should be keeping far less in there than most people do.

It’s not an audit trail

A common reason people give for keeping years’ worth of backup history records is that they need it to show the auditors. I’ve yet to find an auditor who wants a raw SQL Server database of backup history to look at.

The options I’ve used in the past are:

  • Export it to Excel or CSV on a regular basis. Easier to hand off to an auditor, compresses well and gets it out of your database. There’s a sketch of this just after the list.
  • Monitoring system records. You are monitoring your backups? In that case, you should have a record showing that no jobs failed.
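Here’s the sketch of the export option mentioned above, assuming Get-DbaDbBackupHistory is available in your dbatools version (the path and date range are placeholders):

# export the last year of backup history to CSV for the auditors, then it can go from msdb
Get-DbaDbBackupHistory -SqlInstance server1\instance -Since (Get-Date).AddYears(-1) |
    Export-Csv -Path C:\audit\server1-backup-history.csv -NoTypeInformation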

Please check what your auditors are actually asking for, and why. Unless they’re also asking for verification and restore test results, pure backup history is worthless. Because, as a number of people have said over the years (in various ways):

An untested backup is just a wish that may not be granted

Me, probably

Will you ever go that far back?

So you’ve got 2 years’ worth of SQL Server backup history tucked away in msdb. Now, be honest, are you ever going to use it? Really, are you going to roll a database back 6 months? Why?

That would be a hell of a lot of data to throw away to /dev/null or have to import all over again. Would there ever be a business case for doing that?

And once you’ve restored it, that leads on to…

Once it’s restored, it’s a new database

You restore a SQL Server database and you start 2 new things:

  • A new Recovery Fork
  • A new LSN chain

All databases are running along a Recovery Fork (identified by a RecoveryForkGUID); even a brand new database has one. When you perform a restore, SQL Server will generate a new Recovery Fork under the covers. Much like Ghostbusters, SQL Server won’t let you cross the streams.

As covered elsewhere on here, SQL Server uses LSNs (Log Sequence Numbers) to track database actions, and as backups are database actions they are also tied to LSNs. A restore takes the database back to an LSN in the past and restarts the count from there, which is why SQL Server starts a new recovery fork. Even if you were lucky enough to have an old backup starting on the same LSN (and that’s the sort of luck where I’d be asking you for the lottery numbers), you still wouldn’t be able to use it.
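If you want to see this for yourself, you can read the LSNs and recovery fork straight out of a backup file’s header. A sketch, assuming Read-DbaBackupHeader and these column names in your dbatools version (the path is a placeholder):

# peek at the LSN chain and recovery fork recorded in a backup file's header
Read-DbaBackupHeader -SqlInstance server1\instance -Path \\backupshare\sql\db1\db1_full.bak |
    Select-Object DatabaseName, FirstLSN, LastLSN, DatabaseBackupLSN, RecoveryForkID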

So if all the old backups are still in the backup history, they’re useless, and potentially very confusing when you come to do a restore.

Can you even restore it?

If you’re storing 2 years of old backup history, are you also storing the old backups themselves? If you don’t have the backups to restore, then again the history is worthless and might as well be deleted.

I’m pretty hot on backups, but even I don’t keep things that long. Currently we keep everything for 2 weeks, then we get rid of the Transaction Log and Differential backups. After 4 weeks we only keep the Full backup from the last day of the month, and then at the end of the year we only keep the last Full backup.

So again, why keep the history if I can’t use it to restore most of the databases it covers?

Do you need that granularity?

As mentioned in the last section, you may end up with just a few backups kicking around that may need restoring for investigations or legal reasons. Do you need the backup history to restore those?

With tools like Restore-DbaDatabase I’d say no. While generating the restore chain from a folder full of backups may take a while, for a single database backup you’re only looking at seconds to scan the file(s) and start restoring. Probably quicker than trying to find the records in a bloated backup history table 😉
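For a single database it really is close to a one-liner. A minimal sketch (the path and new database name are placeholders):

# point Restore-DbaDatabase at the folder of backup files and let it work out the restore chain
Restore-DbaDatabase -SqlInstance server1\instance -Path \\backupshare\sql\db1 -DatabaseName db1_investigation -ReplaceDbNameInFile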

So what do I suggest?

I’d suggest really looking at why you’re storing all that probably useless backup history.

Personally I consider anything more than a month’s worth as being too much; you (and your boss) may think that’s a bit tight. But I can’t ever see us needing to roll back to a point in time over a month ago. Most of the times we do need to, the end of the month will be good enough.

So, to clear down the backup history, schedule sp_delete_backuphistory to run once a week like so:

use msdb
go

declare @oldest DATETIME  = Getdate()-28;
exec sp_delete_backuphistory @oldest_date=@oldest;

This will get rid of ALL backup history over 4 weeks old for ALL databases. It will work on SQL Server from 2005 upwards.

Or you can enable the ‘History Cleanup Task’ in your maintenance plans.

Or, if you’re using Ola’s great SQL Server maintenance scripts, they’ll install an sp_delete_backuphistory job for you.
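And if you’d rather keep it in the dbatools family, there’s a wrapper around sp_delete_backuphistory as well. A sketch, assuming Remove-DbaDbBackupRestoreHistory and its -KeepDays parameter are available in your module version:

# same cleanup as the T-SQL above, keeping the last 28 days of history
Remove-DbaDbBackupRestoreHistory -SqlInstance server1\instance -KeepDays 28 -Confirm:$false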

So, go on. Get those deletions automated and make your life easier 🙂

Feedback

Obviously, all of the above is just my opinion and thoughts on best practice that I’ve picked up over the years. You may have a need to keep years of backup history, and I’d love to hear what it is.
