Musings of a Data professional

Stuart Moore

Nottingham based SQL Server DBA who dabbles with many other technologies. 15+ years of experience with databases and still keen to learn and explore.

New Year, New Data Relay

It’s been two years since we last tried to run an in person Data Relay tour. With the thawing of Covid restrictions and a general air of optimism we’ve got the band back together and are heading back out on the road.

To test the waters we’re just doing a 4 date tour this time:

  • Leeds – 04/10
  • Birmingham – 05/10
  • Reading – 06/10
  • Bristol – 07/10

I’m looking forward to getting out there and seeing more people face to face. Seems like a long time since we had a Q&A session that wasn’t run via a chat window!

Details are still being finalised, but we plan to have more details and Session submission open before SQLBits.

Looking back through some ancient emails, it appears that it's 12 years since I first attended a Relay event, 10 years since I first presented at one, and 8 years since I got pulled onto the organising committee. That's a long time, but it's always been fun (well, there was one incident with a Leeds venue which got heated), so I can't wait to get back on the bus with the rest of the team.

Bulk renaming SQL Server tables with T-SQL or dbatools

While setting up for a large migration project involving a third party I had to move a load of tables out of the way to accommodate a sudden change in approach. This happens fairly often with our dev environments, so I thought I'd share a couple of approaches I use to do this. First up, renaming tables using T-SQL, and then renaming tables with dbatools.

WARNING!!!!!

Renaming any SQL Server object needs to be done carefully. Any existing references to the object will break, so any view or stored procedure referencing the table will stop working until you fix it.

This can be a useful trick if you've been told you can remove some tables. Rename them, leave them for a day, and then if anyone screams you can rename them back.

Renaming tables with T-SQL

This one is good for:

  • simple rename options
  • when you don’t have dbatools available

To do this we're going to generate the SQL statements using system views and then run them.

Let's say we have a bunch of tables we want to rename. For the first example we want to prefix all the tables whose names contain 'user' with 'tmp_' while other testing goes on in the database.

The key to this is using sys.all_objects. To get our list of tables to rename, we use some simple T-SQL:

select name from sys.all_objects where name like '%user%' and type_desc='USER_TABLE';

Adding something to specify the type of object is ALWAYS a good idea, just to avoid trying to rename a stored procedure or system table.

Now use the T-SQL to build up sp_rename statements:

select 'exec sp_rename '''+name+''', ''tmp_'+name+''';' from sys.all_objects where name like '%user%' and type_desc='USER_TABLE';

Which will give you a nice little list of SQL statements that you can then copy and paste into a query window to run.

Renaming tables with dbatools

Whilst the dbatools module doesn't have a Rename-DbaTable command (yet!), you can still use dbatools to rename tables with a little bit of added PowerShell.

The first part of the pipeline is going to grab all the tables from the database using Get-DbaDbTable. Then we pass that into Where-Object to filter down to the tables we want, and then into a ForEach-Object loop for the actual rename, which we do with the SMO Rename() method:

$instance = 'server1\instance'
$db = 'testdb'
Get-DbaDbTable -SqlInstance $instance -Database $db | Where-Object {$_.Name -like '*user*'} | ForEach-Object {$_.Rename('tmp_'+$_.Name)}

Being PowerShell there are a lot of options you can use to create the new name, which makes this more flexible than the T-SQL version above.
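
As a quick illustration, here's a minimal sketch (assuming the same $instance and $db variables as above) that appends a datestamp instead of a fixed prefix, and uses -replace to strip any old 'tmp_' prefix at the same time:

$suffix = Get-Date -Format 'yyyyMMdd'
Get-DbaDbTable -SqlInstance $instance -Database $db |
    Where-Object { $_.Name -like '*user*' } |
    ForEach-Object { $_.Rename(($_.Name -replace '^tmp_', '') + '_' + $suffix) }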

 

Data Platform Discovery Day returns

After the great feedback we got last year for the first Data Platform Discovery Day, we're going to be doing another one on September 21st 2021.

We're still going to have great intro sessions to Data Platform topics to help you kickstart your Data Platform career, or just so you know what your techies are talking about.

But this year we're also going to be having panel discussions between the sessions. We'll be picking a range of people to discuss Data Platform topics, different ways to start your Data Platform career, and also how people can make the Data Platform community and workplaces better spaces for everyone.

Also this year we're going to be running a single day that goes from 10:00 till 19:00 UTC. This allows us to cover a wide range of timezones and gives audiences a chance to maybe see speakers they wouldn't normally get to.

More details will be announced on the web site as we work them out – https://dataplatformdiscoveryday.com/

Our Call for Speakers is now open! So if you have some great intro/level 100 sessions you feel would help someone starting out then we'd love for you to submit. Do you have something you wish someone would have told you when you first started? Then submit it here – https://sessionize.com/data-platform-discovery-day-2021/

If you'd like to take part in the panels, then please read through the descriptions: DPDD 2021 panel descriptions. We're looking for anyone who can offer insights or commentary on any of these topics. If that sounds like you then please submit a 'session' with the title of the panel you're interested in, and use the abstract box to tell us a bit about yourself and why you'd be great on our panel.

Call for Speakers closes on the 31st August and we will be notifying everyone, successful or unsuccessful, as soon as we can.

So where have I been?

Covid Impact

So you might have noticed a break even longer than my usual disappearances. So where have I been and have I done anything interesting?

Like everyone else Covid-19 has had a major impact on my job. I work for a large Higher Education provider. The UK lockdown meant we suddenly had ~4k staff members and ~35k students working elsewhere than our Campus. This caused any number of issues and problems to solve.

It's certainly been an 'interesting' year and I've learnt a lot about networking and how the organisation actually operates. Lots of performance issues have become apparent now they aren't being masked by a good fast network.

That’s eaten up a lot of my time, and meant I’ve not really fancied spending extra time in the ‘office’/backroom as it feels a little too like staying late. Thanks to getting more efficient and some improved workflows I should get more time to do things outside of the job.

New Role

At the start of 2021 I moved into a new role. I'm now the Technical Lead for an Operations team. Still doing a lot of SQL DBA stuff, but the role is now focused more on setting out our future trajectory for a wider range of technologies.

A lot of the role is looking forward and creating strategy, so there have been some dark corners of our infrastructure I've had to poke around in and then come up with plans to bring things up to scratch, or convince the owners that we're just going to have to shut it down.

Technology

I’ve found myself moving back toward the *nix world with the job shuffle. There’s a lot more Linux in use as it makes Azure cheaper, and our developers are moving towards .Net Core so Linux hosting makes sense. So I’ve been spending more time at the bash prompt than the pwsh prompt, sometimes ending up with an unholy marriage to get things done.

There’s been quite a steep learning curve catching up with some more obscure bits that I’d never looked too deeply into before.

Still plenty of SQL Server in the mix, but certainly less than 12 months ago.

Mech Keyboards

My world has become a lot more text based during lockdown. What would have been a quick chat on the way to grab a coffee is now a Teams chat.

At the start of lockdown I was using an Apple Magic Keyboard, but soon found it wasn’t doing my fingers any favours. This led me down the mech keyboard rabbit hole. After a couple of iterations I’m now on a home built Kyria split keyboard using the Colemak Mod-DH layout.

I'll probably do a post on this in a bit, as I found thinking a bit more about my main user interface an interesting experience.

Dog

We got a dog. Say hello to Marx:

He's an energetic little Cocker Spaniel with a lot to learn, so I've been getting into a lot of dog training theory and practice for the last couple of months, getting him ready for some adventures once the weather improves.

He’s named for this Marx:

Gary Marx

Not this Marx:

Groucho Marx

So…….

All that means I’m going to be starting to blog a bit more regularly, but that the topics might change a bit. Not hugely, but there are posts about how to build dog training placeboards and what dog transport crate fits into an F40 BMW 1 series in the pipeline 😉

Scanning Azure Storage Blobs for dbatools Restore-DbaDatabase

In the comments here I was asked about using Restore-DbaDatabase when all you have is blobs in an Azure Storage account. This can be done, but it involves a couple of non-dbatools steps.

Restore-DbaDatabase doesn’t natively ‘talk’ Azure, nor do any of the other dbatools commands. The reason for this is that we didn’t want to force dbatools users to have to install the AzureRM/Az powershell modules just to use our module. We’ve gone to a lot of effort to make sure that dbatools is acceptable to Security Admins and that it has a small(ish) footprint, and adding those large modules as prerequisites would have broken that.

In this example I'm going to be using the Az module as that's the current recommended Azure module, and the one I've got to hand.

To follow through this you’re going to need to have already set up the restore target instance to talk to the Azure Blob Storage. Full instructions for that are here:

depending on your preference. For the examples here I'm assuming that you're using the SAS authentication method (as it's the current recommendation).
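
If it helps, here's a minimal sketch of that SAS setup using dbatools' New-DbaCredential (the storage account, container and SAS token below are placeholders, so swap in your own). For SAS auth the credential name has to be the container URL and the identity has to be the literal string 'SHARED ACCESS SIGNATURE':

# Placeholders - use your own storage account, container and SAS token (without the leading '?')
$containerUrl = 'https://sqlbackups.blob.core.windows.net/sqlbackup'
$sasToken = ConvertTo-SecureString 'sv=2020-08-04&ss=b&srt=co&sp=rl&sig=...' -AsPlainText -Force

# Create the credential on the restore target instance
New-DbaCredential -SqlInstance sql1 -Name $containerUrl -Identity 'SHARED ACCESS SIGNATURE' -SecurePassword $sasToken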

Once that’s setup we move into PowerShell to get the blob information we’re going to need.

Import-Module Az
Import-Module dbatools

Connect-AzAccount
Set-AzContext -Subscription 'MyStorage Subscription'

Start off by importing the modules we're going to be using. Then log in to Azure and set the current subscription/context to the one holding the storage account we're going to be querying.

$account = Get-AzStorageAccount -ResourceGroupName sqlbackups -Name sqlbackups
$context = $account.Context
$blobs = Get-AzStorageBlob -Context $context -Container sqlbackup
$filteredblobs = ($blobs | Where-Object{$_.LastModified -gt (Get-Date).AddDays(-1)}).ICloudBlob.Uri.AbsoluteUri 

We start by getting the storage account details, here they’re in a storage account called sqlbackups which lives in a resource group named sqlbackups.

Then we grab all the blobs in the specified container. We do the filtering in the next step, as Get-AzStorageBlob doesn't have any filtering other than -Prefix, which is of almost zero use in this case.

Here we're just filtering for all the blobs touched in the last 24 hours. You can filter for other things, but remember that there are no 'folders' below the containers, just /'s in the name. So if you're trying to filter for something like Ola's paths for a known database full backup you'll need to use something like this (using master as an example):

$filteredblobs = ($blobs | Where-Object{$_.Name -like '*master*FULL*'}).ICloudBlob.Uri.AbsoluteUri

All we need for the restore is the full URI (yes, it's a URL, but technically it's a URI).

Now we can pipe the filtered blobs into Restore-DbaDatabase and things will work as normal:

$filteredblobs | Restore-DbaDatabase -SqlInstance sql1 .............

So it's a little more work than a traditional disk-based restore, but not too much.
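
For completeness, here's a slightly fuller sketch of that last step, assuming the SAS credential above is already in place; -DatabaseName and -WithReplace are just examples of the usual Restore-DbaDatabase options rather than anything you have to use:

# Restore the filtered blobs to a new database name, overwriting it if it already exists
$filteredblobs | Restore-DbaDatabase -SqlInstance sql1 -DatabaseName master_copy -WithReplace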

Pi-Hole and Plusnet

As a nice little lockdown project I decided to up our home network security game a bit. And it was a good excuse to finally buy a Raspberry Pi. I’ve fancied one for years, but always worried it’d end up buried in a drawer not getting used.

I've been using Steven Black's blocking scripts (https://github.com/StevenBlack/hosts) on my Mac for ages now, and they do cut down the amount of annoying ads and other distractions that cover the internet these days. Seemed rude to not have the same protection running on all the hosts in the house. I am just running the lists that block known advertising sites and malicious sites. There are also lists to block adult sites, gambling sites and others if you want to restrict what your network can be used for.

But maintaining the lists across multiple machines would be a complete pain. The iOS devices don’t let you do it for a start, and some machines I’m not allowed to play with, like C’s work laptop.

Pi-Hole

Enter Pi-Hole, which is basically a DNS server that implements Steven's host lists at a network level. A DNS server is what turns domain names like stuart-moore.com into the IP addresses that the internet actually runs on, which in this case would be 77.72.1.19.

Pi-Hole works by stepping in between your hosts and the normal DNS, and if a request is made to a 'bad' domain then it will answer the request with the address 0.0.0.0 rather than forwarding it on for the real answer. Your browser will ignore this address and it'll be like your request just fell into a black hole.
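
If you want to see this in action from a Windows machine, a quick check like the one below works (just a sketch, assuming your Pi-Hole is answering DNS on 192.168.1.5, so swap in your own address). A blocked domain comes back as 0.0.0.0, while a normal one comes back with its real address:

# Ask the Pi-Hole directly, bypassing whatever DNS the machine is currently using
Resolve-DnsName -Name doubleclick.net -Server 192.168.1.5 -Type A
Resolve-DnsName -Name stuart-moore.com -Server 192.168.1.5 -Type A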

It has a very simple, almost automated (almost!), install which will get you up and running pretty quickly. Steven's various host lists can be selected, and you can add your own or lists from other sources as well.

Installation and Hardware

There are a lot of good PiHole installation documents out there so I'm not going to repeat them here. The one I followed was the official one.

Hardware-wise I ordered myself a Raspberry Pi 4 2GB kit from The Pi Hut. It came with almost everything I needed; the only extra bits I added were a heatsink and a Cat 5 cable, as shamefully I don't have any lying around the house now it's all WiFi.

2GB seemed plenty enough to run PiHole and some VPN stuff I wanted to use (more details on that in another post), and it saved a few shekels.

The install went really easily and soon I had my own DNS server running on a fixed IP address. During the install I'd selected to have all external DNS queries relayed to Cloudflare's DNS servers (1.1.1.1) via an encrypted channel, so no chance of them being sniffed. An extra layer of security there.

Now to make it really useful I had to get that address out to all of the devices on the network. This is where PlusNet comes in.

Configuring it with PlusNet

We have a nice simple router that came with our PlusNet fibre. Years ago I used to have custom firmware and things, but having a bit of an SLA with C we went to something simple and easy that I wasn't likely to break at an inopportune moment.

Unfortunately the PlusNet router doesn’t give you very much control over the networking settings. To make configuring all the devices (and any new ones) easy I wanted to push the new DNS IP address out via DHCP so they’d pick it up automatically.

While the PlusNet router gives you a few options to alter a very small number of configuration values, the DNS server address isn't one of them. Turns out this has been a known issue with these PlusNet routers for a while, with a long running thread on the PlusNet forums complaining about it.

So that option was out, but…..

The solution to the PlusNet problem

Luckily PiHole comes with a solution to this. It has its very own DHCP server built in! So all I had to do was to configure that. It's very simple to do.

Once you've installed Pi-Hole it provides you with a nice web admin interface to configure it with. Under Settings you can find the DHCP option:

All you need to provide is the range of IPs you want your devices to be on. Keeping it on the 192.168.1.x range makes it easier with the PlusNet router as they'll all be on the same subnet.

2 things to note in my screen shot:

  • A 1 hour lease is really short. I've set mine like that while I'm experimenting.
  • The PlusNet router gateway is 192.168.1.254, use that one. Mine is different due to my VPN setup.

You can only have 1 DHCP server on a network or the clients get confused over which one to get their configuration from. Now that we have an alternative to the PlusNet DHCP server, we need to disable DHCP on the PlusNet router.

To disable it:

  • Login to your PlusNet router by going to http://192.168.1.254
  • Click on the 'Advanced Settings' button
  • Login using the password from the back of your PlusNet router
  • Continue to Advanced Settings
  • Select 'Home Network'
  • Select 'IP Addresses'
You should now be at a screen that looks like this:

Tick No next to ‘DHCP Server Enable’. Then click on apply at the bottom right, confirm you’re sure, and you’re all done.

Your little Raspberry Pi and PiHole are now controlling your DNS and your DHCP settings for everything on your home network.
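
If you want to double check a client has picked the new settings up, something like this from a Windows machine does the trick (a quick sketch; the Pi-Hole address 192.168.1.5 is just an example, use your own):

# Renew the DHCP lease so the new settings are picked up straight away
ipconfig /release | Out-Null
ipconfig /renew | Out-Null

# Show which DNS server(s) each active adapter is now using - you want to see the Pi-Hole's IP here
Get-DnsClientServerAddress -AddressFamily IPv4 | Where-Object { $_.ServerAddresses } | Select-Object InterfaceAlias, ServerAddresses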

PiHole so far

I’ve only had mine in for a little less than a week, and we appear to be blocking about 8-10% of DNS queries atm. Which is a solid amount of web queries that we’re not having to transfer over the network, so things are feeling a bit speedier than before.

Also seeing fewer pop ups coming through, and some sites are much more readable without the inserted images and videos getting in the way.

Looking at the load averages on my little Pi, it seems to be coping with all this with its feet up. So there's plenty of room to add some more bits and pieces onto it. I've already added an outbound VPN connection for our outgoing traffic and there'll be more coming along as well.

Conclusion

Adding the Pi-Hole was a simple and easy way to block out annoying internet adverts and potentially malicious sites for every device on our network. If we had children then I'd definitely be using some of the stricter lists to keep a lid on what they could stumble across.

It would have been simpler if the PlusNet router allowed you to alter a few more settings, but disabling its DHCP server only takes a few clicks through a simple admin interface to sort out.

Microsoft Data Platform MVP – Year 3

I’m very proud to have been selected to be a Microsoft Data Platform MVP for the 3rd year running. Until you’ve finally ‘signed’ the NDA you’re not sure if you’re going to be re-confirmed or if there’s been an admin mistake.

It's quite a bit of waiting from the end of March to now to find out if your activities have been deemed worthy of renewal. No one's quite sure what you have to do to become an MVP, which is good because it allows Microsoft to reward different skill sets or pick people who support the community in different ways, but it also adds to the suspense as you don't even know what or how much they're looking for.

I gave myself a little talk the night before, about how I’d spent the year doing all of the things I normally do in the community and enjoying myself whilst doing them. So if I wasn’t reconfirmed then I’d lost absolutely nothing. My technical skills were going to be exactly the same and the talks I present would be just the same in terms of quality and content.

I think if you’re just doing community events to ‘rack up points’, you’re missing quite a big point. If you’re only doing something to try to become (or keep) being an MVP then it’ll probably be noticed, but not in the way you want to be.

Having run events before and after becoming an MVP, I can tell you that it's not the thought of 'MVP points' that keeps you going when things break down, it's wanting to provide something fun and interesting for people in your community.

All of the MVPs I know and have met, are doing this because it’s fun. This makes working with them or learning from them fun. Someone who’s just ‘grinding’ away trying to level up will not be as much fun to learn from or cooperate with, and people will notice.

So if you're a fellow recertified MVP, well done for keeping on doing what you've been doing so well. Hopefully we'll start bumping into each other again once life is less virtual.

Or if you fancy becoming an MVP one day, then get into things because they’re fun and things may happen along the way. If you don’t become an MVP, then at least you’ll have had a lot of fun along the way and met a lot of great people who are part of the technical community.

So this year I’m just going to keep on doing what I do each year and fingers crossed, this time in 2021 I’ll be looking at MVP number 4. If not, I’ll still have a grin thinking about everything I did and more knowledge than I do today, so nothing to be upset about there.

Journaling, my year so far

No, this isn’t an in depth discussion of various *nix filesystem internals. This is about the old fashioned idea of keeping a written journal on a regular basis.

I’ve been journaling for most of 2020 so far, 158 days of entries so you can work out the miss/hit ratio yourself 🙂

I started on the first of January, wanting to capture what I actually do and achieve over a year. And also as a way to record things like mood, sleep and my running. Not because of any great historical legacy, but just because I wanted to see how things correlated and see if I could improve things.

Starting off

I started off as cheaply and easily as I could, with a £1 reporters notebook and the first pen I could find with ink in it.

My original journal. Cheap as chips from Wilkos

The first entries were very perfunctory, and usually just a single side of writing. Picking a random day in February I seem to have been concerned about Project ‘X’ at work, the chance of getting rained on during the next day’s run to the office, and if I needed a smaller notebook for my ToDo list so I was more likely to carry it around.

So, no Earth shattering revelations or Zen like wisdom seeping out 38 days in then.

Getting into it

It really has come into its own as we've been locked down for COVID. Looking back now I can see we started our personal lockdown on 14th March (we're so cool, we started early) as that was the first mention of C's illness. In that way it's been good to keep track of what happens when time and life seem to go very slow or almost stop.

Not being in the office, and having to rely on the magic that is Teams to talk anything through, it's not always possible to gauge what's going on or what the end game is, and it's easy to end up simmering over insignificant issues.

Often, just writing something down is enough to see a solution or take some weight off. I'd quite often find it wasn't as big an issue as I expected, or that it's something I need to flag up for a more senior answer. Not having anyone looking over my shoulder I'm freer to write down the first answer that comes to mind, and then to negate that in the next sentence as I spot why it was a stupid idea. Gradually inching my way to a solution.

The longer I've been journaling, the longer my entries have become and the easier I've found it to just put down anything that comes to mind. When I started there was a bit of hesitancy that you're 'oversharing', but the realisation creeps in that you're only sharing what you already know with yourself.

Occasionally I'll throw in a prompt to make me think about something. Nothing too deep, so I stick to the 3 standards that all journaling sites recommend:

  • What are you grateful for today?
  • What do you want to achieve tomorrow?
  • What went well today?

I tend to throw these in if the entry seems to be a bit of a downer. Just because I’ve been struggling with a SQL Server bug for hours, it doesn’t mean that the day was a complete write off, so this forces me to think back and look for something good in the day. Or if the day has been a hell storm of despair, I can acknowledge that but try to give myself something to look forward to tomorrow.

It’s also a great way to spend 15-20 minutes concentrating on something that doesn’t involve either your work or home life. You might be writing about them, but the action is completely separate. I tend to finish the kitchen chores in the evening and then spend the time before I head up to bed sitting at the kitchen table writing it out. I find it leaves me calmer and a bit more peaceful as I head upstairs.

You don't need to handwrite it! I'm sure a lot of people reading this, like myself, don't write longhand much outside of meetings. I wanted to improve my handwriting this year, so writing things out in longhand makes sure I practice regularly. I also wanted to make sure I wasn't distracted by any incoming alerts or the 'Oh, I could fix that quickly' thoughts. If you want to just write a daily .txt file in your Journal folder, that's great as well.

To me the real benefit just comes from the doing, not how you do it. My journal is pretty plain, maybe a horizontal line between entries, but that's about it, and it works for me. If you prefer some drawing, or more colour a la Bullet Journals, then that's great as well.

I don't beat myself up if I miss a night. Real Life crops up a lot; if we've been out then it might be too late by the time we get back. Or if I've been out for a couple of pints my handwriting goes from bad to unreadable pretty quickly. I'll try not to miss more than one night on the trot though. With Covid we've not been anywhere, but I'd take my journal with me in future.

Routine

Like all habits, once you're on the treadmill and have a routine then it becomes second nature. I look forward to writing each evening now, and even treated myself to my first fountain pen since High School 28 years ago, some quality ink and a nice Lamy notebook.

my current journalling

Now I have the routine, I'm not sure if I'd want to stop really. It has been good to flick back to see what's happening or remind myself that things are changing. And there will always be something to write about.

Give it a go

So, go find an old notebook and a biro, or fire up your text editor of choice, and give it a go for a couple of weeks. You might find you enjoy it and keep on going for a lot longer.

The ‘Done It’ list

I love my todo lists. Writing down everything I need to do stops me worrying that I've forgotten something. I even put in todo items to remind me to stop working and take a break or do some exercise.

But, just like every other IT worker, something new will crop up. Your boss wants some information on a spreadsheet, an urgent all hands Teams call crops up, or that misbehaving app on your machine takes an hour to fix.

These things pop up and throw your entire day out of whack, and your To Do list looks more and more like a wish list than a plan. After a few days of this it can start to get really disheartening as you fall further and further back from where you want to be. I was finding this more and more as I settled into working from home during lockdown.

So now my To Do list also has a ‘Done It’ section. Anything that comes in and has to be done that day gets added to the list, but prefixed with ‘+’. This means I have an easy way to see what has come in and bumped things off of my list.

If it doesn't need to be actioned the same day then it goes onto my planner list for actioning at a more appropriate time in the future.

At the end of the day, if I have tasks that I haven’t ticked off on my to do list, then I can quickly see if they were pushed off by too many incoming tasks. I also have a better list of what I HAVE done, not just what I had planned to do that day.

After a couple of days I get a better feel for how much overhead I need to leave each day to cope with these unknown tasks that are going to come in. So I can make an allowance in how much I commit to each day to make sure I get things done. If I strike lucky and nothing unseen comes in, then I just pick more tasks off of my planning list to fill up the rest of the day.

Remember that your to do list should be a good indication of not only what you want to achieve on the day, but also a realistic list of what you CAN achieve that day. If you've listed more to do than you can, then you have a wish list, not a to do list.

Break in service

First off, apologies to everyone who was following the 31 days of dbatools Backup and Restores, I will finish it off! Just need to get back into a bit more of a stable routine so I've got the time to do that.

Like everyone else my life has had a big change with the current COVID lockdown in the UK. I’m pretty lucky that I can work from home and that my employer wants us to keep working so no risk of redundancy or furloughing (yet 🤞)

Not having a proper home office made the first couple of weeks quite painful, as long days at the kitchen table weren’t kind on my back. But with some luck on eBay, I’ve now got myself somewhere more ergonomic to work.

Getting used to this new way of working, and also having to pick up some new skills (today I’ve been debugging a JS app) has meant working on my setup, planning skills and the software I have available to me. So there’s going to be some posts on those topics coming up over the next couple of weeks as they’re easier to fit in around my current schedules.

Hope everyone else is doing well in lock down, and look forward to when we can start getting back to meeting up at Data Platform events.
