Stuart Moore

Musings of a Data professional


Rolling Back SQL Server agent job owner changes

So in this post I showed you a technique for updating the owner on all your SQL Agent jobs.

Whilst this isn’t a major change, you do need to have checked that the new ownership account has the correct permissions for some things (e.g. EXECUTE AS or CMDEXEC steps). With this in mind, someone asked me how I’d back this out, either for all jobs or for that one specific MUST NOT FAIL job that is failing.

As we have 2 ways to change them, we have 2 ways to back out the changes. Note that both of these require you to have done a small amount of work up front; there is no magic rollback option 🙁

PowerShell and dbatools

This one is a little simpler as we can use a lot of PowerShell plumbing to do the lifting for us. To set this up we use:

Get-DbaAgentJob -SqlInstance server1 | Select-Object Name, OwnerLoginName | ConvertTo-Csv | Out-File .\server1-JobOwners.csv

If you’re not a PowerShell user I’ll just break this down into steps:

  1. We use dbatools’ `Get-DbaAgentJob` cmdlet to get all the Agent jobs on Server1
  2. We pipe the output to `Select-Object`, selecting just the job Name and its owner’s login name
  3. The output is then converted to a CSV object
  4. Finally we write the data to a file in the current folder

So we’ve created a nice csv list of the original owners of our jobs, great. Now how do we use those to roll back?
To roll back every job:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

To rollback just one job it’s very similar, we just insert a Where-Object step into the pipeline:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Where-Object Name -eq 'My Job' | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

In both cases we read in our CSV and convert it from CSV into a PowerShell object. For a single job we use Where-Object to filter down to that one job; you could also use -like if you wanted to pattern match. The remaining records then get piped through a ForEach-Object loop, where we use Set-DbaAgentJob to reset each job’s owner.
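
For example, here’s a quick sketch that pattern matches on a hypothetical 'Finance' prefix instead of an exact job name:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Where-Object Name -like 'Finance*' | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}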

T-SQL

The pre setup for this one involves creating and populating a table to store the old values:

CREATE TABLE JobOwnerHistory(
  Job_Id char(36),
  JobName nvarchar(128),
  OwnerLoginName nvarchar(128)
)

INSERT INTO JobOwnerHistory
  SELECT 
    sj.job_id, 
    sj.name as JobName,
    sp.name as 'OwnerLoginName' 
  FROM 
    msdb..sysjobs sj 
    INNER JOIN sys.server_principals sp on sj.owner_sid=sp.sid

So now, resetting a job’s owner is just a modification of our original script:

DECLARE @job_id char(36)
DECLARE @OwnerLoginName nvarchar(128)
DECLARE JobCursor CURSOR
FOR
SELECT 
  job_id,
  OwnerLoginName
FROM
  JobOwnerHistory
--WHERE JobName LIKE '%Finance%'
OPEN JobCursor 
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
WHILE (@@FETCH_STATUS <> -1)
BEGIN
exec msdb..sp_update_job
    @job_id = @job_id,
    @owner_login_name = @OwnerLoginName
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
END 
CLOSE JobCursor 
DEALLOCATE JobCursor 

As written that will reset every job in JobOwnerHistory; if you want to filter down to a subset of jobs you’d uncomment and modify the WHERE line.

Hope those examples are helpful. If you’ve stumbled across this page and it doesn’t quite fix your problem, please drop a comment and I’ll try to lend a hand.

T-SQL Tuesday 104 – Code they won’t pry out of my hands

It’s the 2nd Tuesday, so time for a T-SQL Tuesday post. This month’s host Bert Wagner (b | t) posed the following topic for us:

For this month’s T-SQL Tuesday, I want you to write about code you’ve written that you would hate to live without

Off on a slight tangent, I’m actually going to write about the life of some code you’d never wrench out of my hands, just to show how learning to scratch a simple itch can lead to learning a lot more and getting involved with larger projects. And mainly because we didn’t do source control properly back in those days, so the early code is lost to history (thankfully, as it wasn’t pretty!).

About 6 years ago I needed to implement some SQL Server restore testing to keep our corporate auditors happy. I played around with T-SQL for a while, but trying to find a solution that worked with all of our systems was a nightmare. So I dug out the PowerShell documentation and started reading up on SMO (that page looks a lot nicer than it did back in 2013). Soon I had a working piece of code that, with enough baling twine and candle wax, could do what we wanted.

By now I’d got lots of code examples, so decided to turn them into 31 days’ worth of blog posts – 31 Days of SQL Server Backup and Restores with PowerShell – to help others work out where to go and save them running into the same loops and holes I had. I also wrote a presentation and took it around the UK to user groups, SQL Saturdays and conferences.

Now we move on to the bit where I can start showing off some actual code.

Roll on forward to 2016 and I started to update my scripts to get them a little more up to date. In the process I decided to transform them from a ragtag collection of scripts into a shiny PowerShell module, and so SqlAutoRestores came into being. The code on GitHub is very much a work in progress. The basics worked, but a lot of supporting stuff to cope with other people’s infrastructure was still needed.

Luckily I was asked to help with the dbatools project around then, mainly with the restore code. And the rest is history

So in 5 years my scrappy little bit of code has moved from this:

import-module "SQLPS" -DisableNameChecking
 
$sqlsvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server("Server1")
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$devicetype = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
 
$backupname = "C:\psbackups\psrestore.bak"
 
$restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($backupname,$devicetype)
 
$restore.Database = "psrestore"
$restore.ReplaceDatabase = $True
$restore.Devices.add($restoredevice)
$restore.sqlrestore($sqlsvr)

(Basic PS Restore)

to these:
(linking to source as these are a tad larger than earlier files)
Restore-DbaDatabase (697 lines)
Get-DbaBackupInformation (354 lines)
Select-DbaBackupInformation (172 lines)
Test-DbaBackupInformation (209 lines)
Invoke-DbaAdvancedRestore (367 lines)

All of which have been worked on by many hands. But I now have a restore solution that I use every day in my day job to reset development environments and to restore-check my backups, and it’s in use around the world by DBAs who need a solid, reliable restore solution (if you’ve done a migration with dbatools, guess what’s underneath it).
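
For illustration, a typical everyday call looks something like this (a sketch with made-up server, share and database names; Get-Help Restore-DbaDatabase covers the full parameter set):

Restore-DbaDatabase -SqlInstance DevServer1 -Path \\backups\Prod\AdventureWorks -DatabaseName AdventureWorks_Dev -WithReplace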

It’s probably my most used single piece of code, and not just because it’s running 24/7/365 testing! The muscle memory is now so ingrained that I can beat almost any other restore option to start.

The point of that story is what this T-SQL Tuesday is about: if you shout about some nifty code, or at least let someone know, there’s a good chance other people will want to use it or help you develop it. It won’t happen overnight; this story is about 5.5 years old and there are still debates about where this code is going to go in the future. And it’s been great to see where this tiny project has eventually led me with my career and community involvement over the years.

Granting Azure Roles to Users or groups without a SignInName in Azure AD

Whilst sorting some permissions out on a resource group I stumbled across an issue with assigning them to a group. We have a replicated on-premises and Azure Active Directory solution in place. In the on-premises AD not all groups are distribution groups, so they don’t have a ‘group1@contoso.com’ sign-in. This causes a problem when trying to assign a role in the usual PowerShell way:

New-AzureRmRoleAssignment -SignInName 'group1@contoso.com' -RoleDefinitionName Contributor -ResourceGroupName MyRg

or

New-AzureRmRoleAssignment -SignInName 'group1' -RoleDefinitionName Contributor -ResourceGroupName MyRg

both return a nice red error message:

'New-AzureRmRoleAssignment : The provided information does not map to an AD object id.
At line:1 char:1
+ New-AzureRmRoleAssignment -SignInName group1@contoso.com -RoleD ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : CloseError: (:) [New-AzureRmRoleAssignment], KeyNotFoundException
    + FullyQualifiedErrorId : Microsoft.Azure.Commands.Resources.NewAzureRoleAssignmentCommand'

The workaround for this is to use the ObjectID parameter, and to grab the ObjectId like this:

$objectid = (Get-AzureRmADGroup -DisplayName "BSD Developers").Id
New-AzureRmRoleAssignment -ObjectId $objectid -RoleDefinitionName Contributor -ResourceGroupName MyRg
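
The same ObjectId approach works for a user that can’t be resolved by SignInName; a sketch, with a made-up display name and assuming the search returns a single match, would be:

$objectid = (Get-AzureRmADUser -SearchString 'Jane Developer').Id
New-AzureRmRoleAssignment -ObjectId $objectid -RoleDefinitionName Contributor -ResourceGroupName MyRg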

Just another one of those little oddities that crop up when you’ve got a long established infrastructure that doesn’t quite fit the modern templates 😉

Now with added MVP!

Massively honoured that Microsoft have awarded me their Data Platform MVP award for 2018.

Looking at the other 19 UK Data Platform MVPs, I’m joining a pretty elevated set of individuals. Many of them were the people I looked up to and got advice from when I first started dabbling in the UK SQL Server community, so it’s a reflection of how much advice and knowledge is out there that I’ve got this. And that I’ve got some living up to do.

The MVP is an acknowledgement of a person’s engagement with the community, and I’ve done lots over the years.

But the really good stuff has been getting to meet people at all the events. There’s so much to be learnt over that quick 10 minute coffee break or at the after-event shindig. Those are the conversations that have led me down interesting tech routes or made me consider speaking, and then led me into organising as well.

So Thank You to everyone who’s talked to me online or at an event, and if you haven’t then feel free to bend my ear at the next event I’m at.

Bulk changing SQL Server Agent job owners with dbatools and T-SQL

Now we all know that having SQL Server Agent jobs owned by ‘Real’ users isn’t a good idea. But I don’t keep that close an eye on some of our test instances, so wasn’t surprised when I spotted this showing up in the monitoring:

The job failed. Unable to determine if the owner (OldDeveloper) of job important_server_job has server access (reason: Could not obtain information about Windows NT group/user 'OldDeveloper', error code 0x534. [SQLSTATE 42000] (Error 15404)).

Wanting to fix this as quickly and simply as possible, I just wanted to bulk move them to our job-owning account (let’s use the imaginative name of ‘JobAccount’).

I have 2 ways of doing this quickly. First up is an old T-SQL script I wrote years back (yeah, this happens all the time). In this case we’re looking for all the jobs owned by the user OldDeveloper and moving them to the JobAccount user. It uses a cursor, but they aren’t always the spawn of evil 🙂

DECLARE @job_id char(36)
DECLARE JobCursor CURSOR
FOR
SELECT job_id  FROM msdb..sysjobs sj 
    INNER JOIN sys.server_principals sp on sj.owner_sid=sp.sid
    WHERE sp.name = 'OldDeveloper'
OPEN JobCursor 
FETCH NEXT FROM JobCursor INTO @job_id
WHILE (@@FETCH_STATUS <> -1)
BEGIN
exec msdb..sp_update_job
    @job_id = @job_id,
    @owner_login_name = 'JobAccount'
FETCH NEXT FROM JobCursor INTO @job_id
END 
CLOSE JobCursor 
DEALLOCATE JobCursor 

And then, of course, there’s a simpler dbatools method in PowerShell:

Get-DbaAgentJob -SqlInstance server\instance | Where-Object {$_.OwnerLoginName -eq 'OldDeveloper'} | Set-DbaAgentJob -OwnerLogin JobAccount

Use Get-DbaAgentJob to get all the SQL Agent jobs from your instance, a quick pipe through Where-Object to filter down to just the ones with a ‘bad’ owner, and finally into Set-DbaAgentJob to set the new owner login.
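
And if you want an easy way back before making the change, the CSV snapshot trick from the rollback post above works just as well here:

Get-DbaAgentJob -SqlInstance server\instance | Select-Object Name, OwnerLoginName | ConvertTo-Csv | Out-File .\JobOwners.csv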

You can use whichever you prefer as they’ll do the same job. I find the PowerShell/dbatools one the easiest as I can remember the syntax; with the T-SQL one I have to remember where I left the original copy 😉

Can’t find a recently deployed SSIS package

Had a developer wander across confused about where the SSIS package he’d just redeployed had gone. No errors when he’d deployed the package to Integration Services, but when he looked at the properties of the package (or at least the one he thought he’d uploaded) it was showing a deployment date of 3 weeks ago.

A handy feature of having SSIS now using SSISDB to store its packages is that we can query it with some simple T-SQL to find out where the package had gone. In this case all I needed to do was to ask:

select 
	ip.name, 
	id.name as 'FolderName', 
	ip.deployed_by_name, 
	ip.last_deployed_time 
from 
	internal.projects ip inner join internal.folders id on ip.folder_id=id.folder_id
order by 
	ip.last_deployed_time desc

And I could see that the last time he’d deployed it, it had gone into Folder Y rather than Folder X. So after a quick clean up and redeploy everything was as he expected.

PsConf.Eu – a quick review

This was my second trip to Hannover for PsConf.EU, a 4 day conference covering every corner of PowerShell.

Sessions

Let’s start with the most important part of a conference, the content. Just a quick scan of the Agenda gave a hint of the quality lined up. With sessions from the PowerShell Dev team, and deep dives from community maestros there was plenty of highly technical content, but mixed up with some easier intros to new topics.

For me some of the highlights were getting a first view of some of the new features coming out from the DSC team for the LCM previews, straight from the horse’s mouth (well, from Michael Greene, the team leader: https://social.msdn.microsoft.com/profile/Michael+Greene), with more details here. And the look at Pester internals from Jakub Jares (b | t) was good as well; I now feel I know a lot more about the underpinnings, and the reasons behind some of the ways you have to use it make much more sense.

No bad sessions from my point of view, just a few issues with demos. From my point of view as a speaker: if you’re going to run a fragile multi-machine demo, record a good version ahead of time. While it’s always good to do a demo live, with those setups once it’s gone wrong it’s very hard to recover.

The days run long, 08:30 – 20:00 on the Wednesday with the 3 hour workshops at the end. For me this is a bonus. When I’m at a conference I like to concentrate on the learning. At some conferences I’ve been to it’s felt as though you spend almost as much time at lunch or coffee breaks as you do in the sessions. So while there were long days, it felt worth it.

The venue is well lit and the AV is great; even from the back of the room the screens were clear and the PA carried the speaker right to the back. The rooms were closer together this year and easier to navigate between. And despite the temperature being a step up from the UK, the rooms were kept at a pleasant but not overly cold temperature. Plenty of food and drink, and the lunch menus weren’t holding anything back.

Community

I’m lucky enough to know quite a few PowerShell community members already, but met plenty more at PSConf.EU. Everyone’s there for the same reason so it was always easy to spark up a conversation with someone, whether sat next to them waiting for a session or in the queue for lunch. It was good to see the number of PS user groups that were teased out over a lunchtime breakout session; hopefully the list will be up on the web shortly.

Party

Tuesday night was party night this year, back at Hannover Zoo for an evening of catching up and making new friends. The boat was pushed out with the venue, the demonstration from the Zoo (this year polar bear feeding), and free food and beer. I don’t think anyone went hungry or thirsty that night.

Logistics

From the UK this is a pretty easy conference to get to. From Nottingham I took the train down to Heathrow, then a 1 hour hop over with British Airways and a 20 minute train journey saw me in the centre of Hannover. Taxis are about €12 from the station to the conference hotel, but as I’d spent a lot of time sitting down I decided to take the 25 minute walk across.

Accommodation

This year I stayed in the conference hotel, the Congress Hotel Am StadtPark, next door to the main conference venue. Good hotel, the room was a decent size and had all the usual features. The bar was reasonably priced and usually full of PowerShell geeks if you wanted a chat over beers. The restaurant was good, but not cheap. The only downside was flaky WiFi, but that improved on Saturday once all the phones, tablets and laptops from the conference had left, for some reason…

Another great year from Tobias and his team. They’ve announced the dates for next year, 4th-7th June 2019, and they’ve been written into the calendar in ink as I’ll definitely be heading back.

Bulk uploading CSVs to Azure SQL Database with dbatools

Like most people we’re busy moving ourselves over to Azure, and like a lot of people (even though they won’t admit it) we’ve got years of data stashed away in CSV files. Go on, own up: there’s years’ worth of department membership stashed away in an HR csv folder somewhere in your organisation 😉

To get some of this data usable for reporting we’re importing it into Azure SQL Database so people can start working their way through it, and we can fix up errors before we push it through into Azure Data Lake for mining. Being a fan of dbatools it was my first port of call for automating something like this.

Just to make life interesting, I want to add a time of creation field to the data to make tracking trends easier. As this information doesn’t actually exist in the CSV columns, I’m going to use LastWriteTime as a proxy for the creation time.

$Files = Get-ChildItem \\server\HR\HandSTraining\Archive -Filter *.Csv
$SqlCredential = Get-Credential

ForEach ($File in $Files | Where-Object {$_.Length -gt 0}) {
    $InputObject = ConvertFrom-Csv -InputObject (Get-Content $File.fullname -raw) -Header UserName, StatusName
    $InputObject | Add-Member -MemberType NoteProperty -Value $File.LastWriteTime -Name DateAdded
    $DataTable = $InputObject | Out-DbaDataTable
    Write-DbaDataTable -SqlInstance superduper.database.windows.net -Database PreventPBI -Table Training -InputObject $DataTable -Schema dbo -SqlCredential $SqlCredential -RegularUser
    Remove-Variable InputObject
}

Working our way through that, we have:

$Files = Get-ChildItem \\server\HR\HandSTraining\Archive -Filter *.Csv
$SqlCredential = Get-Credential

Set up the basics we’re going to need throughout. Grab all the csv files off our network share. I prefer grabbing credentials with Get-Credential, but if you’d prefer to embed them in the script you can use:
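
Something along these lines would work (a minimal sketch with a made-up login name and password; embedding passwords in scripts is best avoided where possible):

$password = ConvertTo-SecureString 'Str0ngPassword!' -AsPlainText -Force
$SqlCredential = New-Object System.Management.Automation.PSCredential ('sqlloader', $password)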


We then ForEach through all the files, having filtered out the empty ones:

    $InputObject = ConvertFrom-Csv -InputObject (Get-Content $File.fullname -raw) -Header UserName, StatusName
    $InputObject | Add-Member -MemberType NoteProperty -Value $File.LastWriteTime -Name DateAdded

Load the file contents into an object with ConvertFrom-Csv. These csv files don’t contain a header row, so I’m using the -Header parameter to force them in. This also helps with Write-DbaDataTable as I can ensure that the object property names match the SQL column names for the upload.

Then we add a new property to our input object. Doing it this way we add it to every ‘row’ in the object at once. If you want to add multiple new properties, just keep doing this for each one.
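
For example, a sketch that also stamps each row with a hypothetical SourceFile column:

    $InputObject | Add-Member -MemberType NoteProperty -Value $File.Name -Name SourceFile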

    $DataTable = $InputObject | Out-DbaDataTable
    Write-DbaDataTable -SqlInstance superduper.database.windows.net -Database PreventPBI -Table Training -InputObject $DataTable -Schema dbo -SqlCredential $SqlCredential -RegularUser

Convert our InputObject into a datatable, which is the format Write-DbaDataTable needs for input.

And then the command that does the loading, Write-DbaDataTable. There are only a couple of things here that you have to do differently when loading to an Azure SQL Database as opposed to a normal SQL Server instance. For Azure SQL Databases you have to use a SQL Credential, as the underlying DLLs don’t work (yet) with the various Integrated Authentication options, and you need to use the -RegularUser switch. Normally dbatools will assume you have sysadmin rights on your SQL Server instance, as they are needed for many of the tasks; in an Azure SQL Database you can’t have those rights as they don’t exist, so without -RegularUser you’ll get a nice error message. Just something to look out for, I’ve tripped myself up in the past when repointing load scripts.

Then we drop InputObject and go round the loop again until we’re finished.

Easy and very quick, and now I can just point PowerBI at it and let the users and analysts work out what they want to do with it.

Changes to SQL Relay session selection process 2018

This year the SQL Relay committee has decided to make a couple of changes to our session selection procedure.

Our aim this year is to be as inclusive and transparent about our process as we can manage. To this end, this post is going to lay out our aims and how we plan to achieve them.

We want everyone to feel they can submit without worrying that their submission will be overlooked due to gender, disability, availability to do the whole tour, or any other reason you may be concerned about.

To do this we are moving to a blind selection process. I (Stuart) will be the only member of the committee who can see the list of submissions. At the end of the Call for Sessions I will release only the session abstracts to the rest of the committee members, who will then choose the sessions based only on the abstracts.

Then, should we have openings due to people’s availability we will go through again to fill in any holes in the agenda.

If you have any special requirements then please add them to your submission. They won’t be used during the selection process, but will allow us and our venues to ensure we cover them.
The Call for Papers is available here – SQL Relay 2018 CfP

If you are worried about any other reason that may make you feel your submission could be overlooked and want to talk it through, then please contact me (comments, Contact Form or on Twitter @napalmgram). Anything you ask will be treated in confidence.

If you would like some help with, or an independent review of, your abstract then please get in touch with Speaking Mentors, where some of the best SQL session submitters can help you out.

How are SQL Server Log Sequence Numbers (LSNs) generated?

If you’ve ever dug down into the SQL Server transaction logs or had to build up restore chains, then you’ll have come across Log Sequence Numbers (LSNs). Ever wondered why they’re so large, why they all look suspiciously similar, why they don’t start from 0, and just how SQL Server generates these LSNs? Well, here we’re going to take a look at them.

Below we’ll go through examples of how to look inside the current transaction log and backed up transaction logs. This will involve using some DBCC commands and the undocumented fn_dblog and fn_dump_dblog functions. The last 2 are very handy for digging into SQL Server internals, but be wary about running them on a production system without understanding what’s going on; they can leave file handles and processes behind that can impact your system.

So with that warning out of the way, let’s push on


