Musings of a Data professional

Stuart Moore

Category: community Page 1 of 3

Nottingham Global Azure Bootcamp 2019

For the last couple of years Microsoft has been encouraging communities to run a one-day, Azure-focused Bootcamp event on the same day all around the world.

For 2019 we're pleased to announce that Nottingham will be hosting one of these events for the first time, on the 27th April.

Sessions and speaker information are available on the website. Registration is free, so please grab a ticket at Eventbrite.

If you want to find out more about the SQL Server options available on Azure, how a company used Azure to produce PDFs on a massive scale or want to get hands on with Service Fabric then we’ve got sessions and workshops for you.

PsDay.UK 2018 incoming

Time sure seems to fly. It’s been just under a year since the first PsDay.UK appeared on the UK PowerShell scene. After the success of that event it’s back for another edition! The 2018 event is on 10th October 2018.

The PsDay.UK team have gathered a great set of speakers and sessions again; have a look at the quality of the agenda. With 3 tracks of sessions, it's going to be tough picking which one to go to. Well, except at 15:00, when I'll be in Room 3 (aka Shift) presenting on ChatOps for PowerShell. I'll be covering what ChatOps offers the PowerShell developer and how you can leverage your current skills and scripts to join the gif-filled party.

Tickets are available here – PsDay.UK 2018 Tickets – at a very reasonable price for a full day of quality sessions. There's even a decent refund programme, so you're covered if things change.

Hopefully I'll be seeing some of you at Code Node on the 10th October. Feel free to wander up and say hi.

T-SQL Tuesday 104 – Code they won’t pry out of my hands

It's the 2nd Tuesday, so it's time for a T-SQL Tuesday post. This month's host, Bert Wagner (b | t), posed the following topic for us:

For this month’s T-SQL Tuesday, I want you to write about code you’ve written that you would hate to live without

Off on a slight tangent, I'm actually going to write about the life of some code you'd never wrench out of my hands, just to show how learning to scratch a simple itch can lead to learning a lot more and getting involved with larger projects. And mainly because we didn't do source control properly back in those days, the early code is lost to history (thankfully, as it wasn't pretty!).

About 6 years ago I needed to implement some SQL Server restore testing to keep our corporate auditors happy. I messed around with T-SQL for a while, but trying to find a solution that worked with all of our systems was a nightmare. So I dug out the PowerShell documentation and started reading up on SMO (that page looks a lot nicer than it did back in 2013). Soon I had a working piece of code that, with enough baling twine and candle wax, could do what we wanted.

By now I'd got lots of code examples, so I decided to turn them into 31 days' worth of blog posts – 31 Days of SQL Server Backup and Restores with PowerShell – to help others work out where to go and save them running into the same loops and holes I had. I also wrote a presentation and took it around the UK to user groups, SQL Saturdays and conferences.

Now we move on to the bit where I can start showing off some actual code.

Roll on forward to 2016, and I started to bring my scripts a little more up to date. In the process I decided to transform them from a ragtag collection of scripts into a shiny PowerShell module, and so SqlAutoRestores came into being. The code on GitHub is very much a work in progress. The basics worked, but a lot of supporting stuff to cope with other people's infrastructure was still needed.

Luckily I was asked to help with the dbatools project around then, mainly with the restore code. And the rest is history.

So in 5 years my scrappy little bit of code has moved from this:

import-module "SQLPS" -DisableNameChecking
# Connect to the instance and build up the restore objects
$sqlsvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList "Server1"
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$backupname = "C:\psbackups\psrestore.bak"
$restoredevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem -ArgumentList $backupname, $devicetype
$restore.Database = "psrestore"
$restore.ReplaceDatabase = $True
# Attach the backup file and run the restore
$restore.Devices.Add($restoredevice)
$restore.SqlRestore($sqlsvr)

(Basic PS Restore)

to these:
(linking to source as these are a tad larger than earlier files)
Restore-DbaDatabase (697 lines)
Get-DbaBackupInformation (354 lines)
Select-DbaBackupInformation (172 lines)
Test-DbaBackupInformation (209 lines)
Invoke-DbaAdvancedRestore (367 lines)

All of which have been worked on by many hands. But I now have a restore solution that I use every day in my day job to reset development environments and to restore-check my backups, and it's in use around the world by DBAs who need a solid, reliable restore solution. (If you've done a migration with dbatools, guess what's underneath.)

It's probably my most used single piece of code, and not just because it's running 24/7/365 testing! The muscle memory is now so ingrained that I can get a restore started faster than with almost any other option.

The point of that story is what this T-SQL Tuesday is about: if you shout about some nifty code, or at least let someone know, there's a good chance other people will want to use it or help you develop it. It won't happen overnight; this story is about 5.5 years old, and there are still debates about where this code is going to go in the future. And it's been great to see where this tiny project has eventually led me with my career and community involvement over the years.

Now with added MVP!

Massively honoured that Microsoft have awarded me their Data Platform MVP award for 2018.

Looking at the other 19 UK Data Platform MVPs, I'm joining a pretty elevated set of individuals. Many of them were the people I looked up to and got advice from when I first started dabbling in the UK SQL Server community, so my getting this is a reflection of how much advice and knowledge is out there. And it means I've got some living up to do.

The MVP is an acknowledgement of a person's engagement with the community. I've done lots over the years:

But the really good stuff has been getting to meet people at all the events. There's so much to be learnt over that quick 10-minute coffee break or at the after-event shindig. Those are the conversations that have led me to interesting tech, made me consider speaking, and then led me into organising as well.

So thank you to everyone who's talked to me online or at an event. And if you haven't, feel free to bend my ear at the next event I'm at.

dbachecks – SQL Server compliance testing with simple configuration management

If you've not heard yet, the people behind the dbatools PowerShell module (including me) have a new toolset for you: dbachecks. dbachecks uses Pester to let you validate your SQL Server estate in a simple way and generate meaningful graphical reports. The official launch of the module is at SQL Bits 2018.

Out of the box, dbachecks uses test values that we've found to be the most appropriate from our years of experience with SQL Server. But these may not be the best values for your particular organisation. For example, we expect to see a Full backup less than 24 hours old for each database; in your case you might only take a Full backup once a week and use differential backups during the week. So we needed a flexible and simple system to let you change the values. dbatools friend Friedrich Weinmann (b | t) has written a great PowerShell framework called PsFramework, which we've integrated into dbachecks to handle the configuration of the tests.
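Conceptually, what PsFramework gives us here is a named key/value store with sensible defaults that callers can override. The following is a much-simplified sketch of that pattern, not PsFramework's actual implementation; the function and key names are invented for illustration:

```powershell
# Toy configuration store illustrating the get/set pattern (invented names,
# NOT PsFramework's real code)
$script:ConfigStore = @{
    'policy.backupfullmaxdays' = 1                   # default: expect a full backup daily
    'mail.subject'             = 'DbcChecks report'
}

function Get-ToyConfig ([string]$Name) { $script:ConfigStore[$Name] }
function Set-ToyConfig ([string]$Name, $Value) { $script:ConfigStore[$Name] = $Value }

# Override one default; the others keep their shipped values
Set-ToyConfig -Name 'policy.backupfullmaxdays' -Value 7
Get-ToyConfig -Name 'policy.backupfullmaxdays'
```

The real module adds validation, descriptions and persistence on top, but the mental model of named settings with overridable defaults is the same.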

Out of the box, dbachecks will load its default config values from the file .\internal\configurations\configuration.ps1 (#! link to GH). To see the current values, use Get-DbcConfig:
Get-DbcConfig output

I've just picked the top few rows for the screenshot; as of writing (15/02/2018) there are 98 config options available. We've tried to make the configuration option names obvious. The configuration values are grouped into these main sections:

  • app – Configuration for the module
  • domain – Configuration for authentication
  • mail – Configuration for sending reports
  • policy – The values for the tests
  • skip – Controls which tests should be excluded

And each has a useful description as well, to make it easy to find. In the screenshot you'll notice that I've configured the mail.* options to suit my test environment. So now I can use Send-DbcMailMessage to email my test results without having to specify all the parameters. To set a configuration parameter you use the Set-DbcConfig command:

Set-DbcConfig -Name mail.subject -Value "DbcChecks report"

Set-DbcConfig example

Or perhaps, as mentioned at the start of this post, you don't want a Full backup in the last day, but in the last 7 days. You can easily configure that:

Set-DbcConfig -Name policy.backupfullmaxdays -Value 7

And now your tests will be checking that you're never more than 7 days from a Full backup. By default we're also checking that you've taken a Differential backup in the last 25 hours, so you're now good to go!
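Under the covers, a backup-age policy like this boils down to simple date arithmetic. Here's an illustrative sketch of that logic, not dbachecks' actual test code, with a made-up last-backup date standing in for what would really come from the instance:

```powershell
# Illustrative backup-age policy check (not dbachecks' real test code)
$policyMaxDays  = 7                        # mirrors policy.backupfullmaxdays
$lastFullBackup = (Get-Date).AddDays(-3)   # in reality, read from the SQL Server instance
$ageDays        = ((Get-Date) - $lastFullBackup).TotalDays

if ($ageDays -le $policyMaxDays) {
    "Pass: last full backup is {0:N1} days old" -f $ageDays
} else {
    "Fail: last full backup is {0:N1} days old" -f $ageDays
}
```

With a 3-day-old backup and a 7-day policy, this check passes; drop the policy back to the default of 1 day and the same backup would fail it.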

And don't worry, you don't need to reset these every time you run the tests. The PsFramework module persists these non-default settings in the Windows registry at:


So on my system we can see:
DbaChecks registry storage

Now, what happens if you want to distribute these config changes to multiple servers, or share them with colleagues to make sure you're all singing the same tune? We'd strongly recommend you don't modify the .\internal\configurations\configuration.ps1 file directly, as it will be replaced whenever you update the module. To make your life easier (which is the entire point of the module) we've included export and import functionality. To export the configuration to an easy-to-parse JSON file, you simply run:

Export-DbcConfig -Path c:\path\of\yourchoice\filename.json

If you omit the path value, then by default we will export the results to $script:localapp\config.json. Now you've got a simple JSON file, it's easy to source control (you are using source control, aren't you?) to keep track of changes and make sure they were implemented as expected. If you want to apply that configuration to another install, it's simply a case of running the import command:

Import-DbcConfig -Path c:\path\of\yourchoice\filename.json

and you're done. We support UNC paths, so it's simple to have a central repository and apply the same configuration everywhere. It works well with fragments of configuration as well, so if all you wanted to control were the email settings, you could create a JSON file like this:

[
    {
        "Name":  "mail.failurethreshhold",
        "Value":  0,
        "Description":  "Number of errors that must be present to generate an email report"
    },
    {
        "Name":  "mail.from",
        "Value":  "",
        "Description":  "Email address the email reports should come from"
    },
    {
        "Name":  "mail.smtpserver",
        "Value":  "",
        "Description":  "Store the name of the smtp server to send email reports"
    },
    {
        "Name":  "mail.subject",
        "Value":  "DbcChecks report from stuart",
        "Description":  "Subject line of the email report"
    },
    {
        "Name":  "",
        "Value":  "",
        "Description":  "Email address to send the report to"
    }
]

Then you can import just this snippet to set the configuration for those options. This makes it easy to separate enterprise-level configuration changes from the actual SQL test options. The same technique can be used to 'force' the correct backup testing parameters in all cases, while letting other tests be customised as needed.
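Because the export and import are plain JSON underneath, the round trip is easy to reason about and to script yourself. A minimal sketch with PowerShell's built-in JSON cmdlets (illustrative only; the temp-file path and the single config entry are just examples):

```powershell
# Round-tripping a configuration fragment through JSON (illustrative)
$config = @(
    [pscustomobject]@{
        Name        = 'mail.subject'
        Value       = 'DbcChecks report'
        Description = 'Subject line of the email report'
    }
)
$path = Join-Path ([IO.Path]::GetTempPath()) 'dbcconfig-demo.json'

$config | ConvertTo-Json | Set-Content -Path $path           # "export"
$imported = Get-Content -Path $path -Raw | ConvertFrom-Json  # "import"
$imported.Name
```

Anything that survives `ConvertTo-Json`/`ConvertFrom-Json` unchanged will also survive source control and distribution over a UNC share, which is exactly what makes the fragment approach workable.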

T-SQL Tuesday #97 – What I’ll be working to improve in 2018

This month Mala has asked us to post about our learning goals for 2018. Now, there's nothing like nailing your colours to the mast so you can come back in a year and see if you actually did something (though let's not talk about my 2017 exercise plans!), so here goes:

Tech Learning Goals


Networking

I read TCP/IP Illustrated years ago, so apart from IPv6 I'm pretty solid on the fundamentals. But the number of ways these can be manipulated and managed has ballooned. And no matter where your data and applications are going to live in the future, it's still going to be essential to know how they can be connected. So I'm definitely pencilling in some time with our Network Infrastructure team to get an overview of what's out there and what they see as the future. I know I can bingle that sort of stuff, but it's even better to get some professional advice.

Then I'll be hitting Pluralsight and other video resources, and looking forward to building some very different labs from the ones I'm used to. For once the servers and the data won't be the focus.

The aim is to sit in a meeting with our network team and keep up with everything! Though I might draw the line at memorising the Cisco model numbers like they seem to have.

More Automation

I've increasingly automated a lot of my repetitive work tasks. But I need to automate a lot of the routine work that's come in and filled up my 'spare' time. Now, I'm as guilty as the next IT pro of sticking to the easy fix for automation and to technologies I know. Well, for 2018 I'm going to start branching out and use automation to force me to learn some new technologies. So rather than sticking to PowerShell, I'll be looking at Python and Go in more depth, looking at more serverless cloud technologies for data transformation and endpoints rather than spinning up more VMs or containers, and trying to let other cloud services take the strain rather than me.

If I get this right and make progress, then the other things on this list will come to pass. Freeing up time to work on the good stuff will be the biggest pay off.

New Personal skill

Improve my Writing

I’m not a natural writer by any stretch of the imagination, my natural inclination is numbers, equations and strange pencil diagrams that only I can decode. This tends to limit me when writing documentation, and blog posts take a disproportionate amount of time to write.

So I'm going to head to the Nottingham library to get some books on writing. And to make sure I put it into practice, I'm going to be keeping a journal from January 1st (don't worry, it won't be online!). Hopefully, forcing myself to write every day without an audience will help me with writing for an audience. So I've got 2 weeks to find a nice pen and notebook to try and inspire my inner Samuel Pepys!

New non tech skill

Get better at marketing myself

In the classic British manner, I'm happy to hide my lamp under a bushel. So for next year I'm going to try to put myself out there more. I'm hoping this will be helped along by improving my writing, as above. I'm going to make more use of other tech friends, asking them to review my session submissions and give useful feedback. I'll also ask organisers for feedback on my submissions, including when I succeed, so I can build on the good things as well as fixing the bad.

And I'm hoping that the increased blog posting from my better writing skills means I'll have more content to share here, and that the new tech skills will give me more interesting things to write about.

The metrics for this one will be the number of events I present at in 2018 and the number of visitors to my blog.


To me that looks like a good solid set of skills to work on. They complement each other, so an improvement in one should help with another one. They can also be worked on at different rates, which stops things getting stressful if something has to be put aside due to other pressures.

Nottingham and East Midlands PowerShell User Group

Interested in any form of PowerShell usage and based around the East Midlands and Nottingham? Then this could be the group for you. We're looking to cover anything that uses (or can be used with) PowerShell. So topics that are fair game include:

  • AD Management
  • Scripting
  • Source Control
  • DevOps
  • Azure/AWS/Cloud Provider of your choice
  • Exchange management
  • SQL Server
  • Pester testing
  • Continuous Integration
  • .Net Internals
  • Generally anything that would interest someone using PowerShell or give their career a boost!

Nothing is set in stone yet, as we want to get some feedback from potential members. There's a date booked for the 8th February (Kick Off meeting), but what happens is up to you.

Would you prefer a traditional usergroup with booked speakers giving presentations in a formal setting, or something more informal like roundtable/whiteboard sessions? Or perhaps half and half?

We’d love to know what you’d like to see or learn. So please either drop a comment below or sign up for the Kick Off Meeting and give us feedback on Meetup.

New year, New speaking dates

I'm lining up a few SQL Server Usergroup speaking sessions for the year already:

All sessions will be:

Indexing Nightmare – Cut Through the Clutter

Inherited a database with 30 indexes on every table? Has the vendor for your 3rd party app recommended more indexes over the years than you can remember? Got indexes that were added to fix a data load problem 4 years ago, but you're not sure if they're still being used? Indexes are key to SQL Server performance, but like everything, too much of a good thing is a bad thing. In this session we'll look at how you can analyse your current indexes with the aim of consolidating them into useful ones, even removing some completely, and how to improve the ones you've got left.

Except for Southampton, where it’ll be:

Get on the Bus

As time goes by, more systems are crossing hosting boundaries (On Premises, Azure, multi-cloud provider, ISVs). We need a simple reliable mechanism to transfer data between these systems easily, quickly and reliably. Microsoft have taken their Message Bus technology and moved it to the cloud as Service Bus. This session will introduce you to this service and give examples of how internal databases can safely process data from cloud hosted applications without having to be exposed to the InterTubes. Examples are predominantly .Net C#, but aren’t complex!


Nottingham SQL Server Usergroup – 12th January 2017
(Also presenting will be Steph Middleton, talking about Building a Robust SSIS Solution)
(More details and registration here)

Midlands/Birmingham SQL Server Usergroup – 19th January 2017
More Details and Registration here

SQL Surrey Server Usergroup (Guildford) – 20th February 2017
Link and details to be confirmed
Link and details to be confirmed

Southampton SQL Server Usergroup – 1st March 2017
More details and registration here

Hope to see some of you there. And if there are any other usergroups out there looking for speakers, let me know; I have presentations on SQL Server, PowerShell and general IT process to offer.

So you want to present at a SQL Server Event

So you’re thinking about stepping up to speak at a SQL Server (or any other technical event), or are having your arm gently twisted by an organiser to do so. How bad is it going to be?

tl;dr version:
– Just do it, it’s easy, and it’s great!

Long Version:
Not very. Let's break down the most common arguments:

1) – I'm not using the latest version
Doesn't matter. Most people out there won't be either. As of writing this, there are very few people running SQL Server 2016, but there are a lot of people still on SQL Server 2012 (and older!). So don't think you have to be talking about the latest, greatest feature.

2) – I’m not using the coolest technology
Yes, each SQL Server release has a must-use technology which people preach about. But that's not always what people want to hear about. Replication is as old as dust, but it's still something people want to learn about or know how to fix; a good replication talk always gets listeners. I talk a lot about backups, and not the new features either, and those talks go down well. What about indexing and performance? Well, those are perennial favourites, and everyone does them differently, so maybe you've got something to add there.

3) – I’m not doing anything exciting
Neither are most people out there! The lie in the marketing papers is that everyone should be doing a billion transactions a second and have a multi-terabyte, web-scale database!
The truth is, 90% of your audience aren't doing that either. Most of us have the same issues: too many databases and not enough time to look after them all. Those are topics that will grab people.

4) – I’m not going in depth enough.
I admit it, I love a good Bob Ward (w|t) or Bradley Balls (w|t) 500-level session on deep SQL Server internals, but then that's me!
For most people a good 200-level session on a topic is a great introduction, pushing into 300 for someone who wants to move on to the next level. So don't worry if you're not breaking out the debugger or tracing into DLL calls.

5) – I'm not an MVP or don't have some other high-end consultancy title
Neither are most of us doing the speaking. Don't let that hold you back. You think they got those titles before they started speaking? It's putting yourself out there that gets you noticed.

6) – I don't have enough content
You'll be surprised how easy it is to fill up 50 minutes with content. And that's without questions; once they come into the picture you'll find yourself accelerating to get everything in. Demos always take longer than you plan as well. Seriously, never underestimate how long a demo can take in front of an audience!

7) – I've not done it before
We all start somewhere (Birmingham SQL User Group for me, many years ago), and local user groups are good as you'll have friendly faces around. If you want to dip a toe in the water, keep an eye out for events offering a shorter, quicker intro, for example lightning talks of 10-12 minutes, or there are webinars, so you can present from the security of home.

8) – Don't be afraid of questions, or your answers (an addition suggested by Rob Sewell (w|t))
Yes, people will ask questions, but don't be scared of them. I've yet to see someone throw in a question explicitly to be nasty to a presenter. Most questions will be because someone's not quite followed what you're saying, so repeat yourself and see how that goes. If you get a question you really can't answer, can't answer in a reasonable amount of time, or that is going to lose the rest of your audience, you can always arrange to take it afterwards or give them your contact details and discuss it offline.

So there's nothing insurmountable there. All group leaders and organisers want to see new speakers, so don't be afraid to ask for help. We'll happily let you know of any topic requests we've had from our members, give you feedback on your topic, and go through your presentation with you before the big day to make sure it's good to go.

Post up below if there's anything else you're worrying about. And if there isn't, go and start writing that presentation!

SQL Relay Nottingham 2016

SQL Relay logo

After a great time last year, we're bringing SQL Relay back to Nottingham on October 6th 2016.

We will have 3 tracks covering SQL Server, BI and analytics, each made up of 1-hour sessions, plus a workshop track with half-day sessions.

We’ll be covering a wide range of topics such as SQL Server performance, management, and development, Azure/Cloud technology, R, big data, cubes, reporting and dashboarding. These topics will be covered at a variety of levels so there’s something to suit everyone whether an accidental DBA, a hotshot BI pro, or a jack of all trades.

All of this for the princely sum of 0p! We'll even provide lunch. You won't get a better SQL Server training offer in Nottingham in 2016.

Speaker submissions are open if you want to speak; the closing date is 19th August.

The timetable for the day should be out shortly after that.

Register to attend or speak here – Nottingham SQLRelay Registration

