Musings of a Data professional

Stuart Moore

Category: sql server Page 3 of 5

Snipe hunting in the SQL Server Transaction Log

Nice quiet day in the office, busily cracking through the To Do list when suddenly something pops into the ticketing system as a P1. Data has gone ‘missing’ from a corporate system, and there’s nothing in the audit trail about when it went missing or how it went missing. For how, read ‘Whodunnit’!

Now if there’s nothing in the audit trails, and no one is confessing, then that doesn’t leave me with much to go on. A slight breakthrough is that someone clearly recalls seeing the data at 09:45 on the 3rd of February, but that’s as good as it gets. We’re now on the 6th of February, so that’s a window of just over 3 days in which it could have gone missing.

Now, as any DBA knows, SQL Server comes equipped with a built-in ‘audit’ for any operation that modifies data. It’s just awkward and complex to read and search.

The name of that audit is the Transaction Log. SQL Server won’t perform any data modification without writing it into the log first; a transaction doesn’t count as committed until it’s been logged, because the log is needed for recovery after a restart.

The transaction log holds a LOT of information. Searching 3 days’ worth of it would be like looking for something in a haystack. And at the moment we don’t even know if that something is a needle, a pin or an apple seed.

An aside about reading Transaction Logs

SQL Server comes with 2 built-in, but undocumented, functions: fn_dblog and fn_dump_dblog. Undocumented means Microsoft don’t publish documentation and reserve the right to modify them without notice. But they’re a pretty open secret, and there’s plenty of info out there about how to use them.

For the purposes of this post, you just need to know this about them:

  • fn_dblog reads the current transaction log
  • fn_dump_dblog reads transaction log backups

If you’re on a SQL Server version lower than SQL Server 2014 or SQL Server 2012 SP2, then there is a known bug that uses up threads and can cause hanging.

Finding the time of deletion

So the first thing to do is to work out a time window in which the data went missing. This was going to be a tedious process, so let’s automate it. If we’re automating a SQL Server task, then dbatools is the obvious answer.

The plan is:

  1. Restore the database to the last time the data was seen
  2. Roll forward in 6 hour increments until the data disappears
  3. Roll forward in 30 minute steps from the beginning of the last 6 hour slot until the data disappears
  4. Roll forward from the 30 minute start to the disappearance of the data in 5 minute increments

So we now have a 5 minute window we need to search in the transaction log backup, which is much better than 72 hours.
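The narrowing plan above can be sketched language-agnostically. Here's a minimal Python illustration of the idea, where the hypothetical `data_present` callback and `narrow_window` helper stand in for the real "restore the database to this time and query for the row" step:

```python
# Illustrative sketch of the window-narrowing loop. The hypothetical
# data_present(t) callback stands in for "restore the database to time t
# and query for the missing row".
from datetime import datetime, timedelta

def narrow_window(start, window_sizes_min, data_present):
    """For each granularity, roll forward while the data is still there,
    then refine the remaining gap with the next smaller step."""
    for step in window_sizes_min:
        while data_present(start + timedelta(minutes=step)):
            start += timedelta(minutes=step)
        # the data vanished somewhere in (start, start + step]
    return start, start + timedelta(minutes=window_sizes_min[-1])

# Example: pretend the delete actually happened at 14:32 on 4 Feb
deleted_at = datetime(2020, 2, 4, 14, 32)
seen_at = datetime(2020, 2, 3, 9, 45)
lo, hi = narrow_window(seen_at, [720, 30, 5], lambda t: t < deleted_at)
print(lo, hi)
```

Each `data_present` call costs a restore, so the step sizes trade restore count against precision: checking 3 days at 5-minute granularity directly would be roughly 860 restores, while stepping through 12-hour, 30-minute and 5-minute windows needs only a handful at each level.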

To speed things up we’ll scan all the backup headers first and save them into an object we can reuse throughout the process. We’ll also rename the database and change the filenames on restore. This is because we are restoring the database onto the instance it came from; the reason we have to do this will become clearer later on.

We also gather some extra information about the missing rows, which I’ll explain in a moment.

# Time to start Restoring from
$startTime = Get-Date ('03/02/2020 09:45')

# The time in minutes between restore points
$windowSize = @(720, 30, 5)

# Used to track how far through the windows we are
$counterLoop = 0

# Query to test if the object has been deleted during the window
# We also gather some internal SQL Server page information we'll need later
$sqlQuery = 'select db_id(), sys.fn_PhysLocFormatter(%%physloc%%), * from dbo.Objects where ObjectID=''1'''

# Scan all the backup headers first to speed things up
$backups = Get-DbaBackupInformation -Path c:\RestoredBackups -SqlInstance MyInstance

# Set a variable to tell restore whether to start a new restore, or to continue
$restoreContinue = $false
while ($counterLoop -lt $windowSize.Count) {
    $backups | Restore-DbaDatabase -SqlInstance MyInstance -DatabaseName RestoreDb -ReplaceDbNameInFile -TrustDbBackupHistory -RestoreTime $startTime.AddMinutes($windowSize[$counterLoop]) -StandbyDirectory c:\Standby -Continue:$restoreContinue

    # Prevent the old page information from being overwritten
    $oldResult = $sqlResult

    $sqlResult = Invoke-DbaQuery -SqlInstance MyInstance -Database RestoreDb -Query $sqlQuery

    if ($sqlResult.Count -eq 0) {
        # If we get no results, we're past when the data disappeared
        # So move to the next, smaller, window and start a new restore
        $counterLoop++
        $restoreContinue = $false
    } else {
        # If the data is still there, start the next increment
        $startTime = $startTime.AddMinutes($windowSize[$counterLoop])
        $restoreContinue = $true
    }
}
Write-Host "Data disappeared between $startTime and $($startTime.AddMinutes($windowSize[$counterLoop - 1]))`n"
Write-Host "Missing data was on the following pages:`n"
$oldResult

Now we know when the data disappeared; next we need to work out how.

Searching the Transaction Logs

SQL Server transaction logs store a LOT of information, so querying them will return more information than you really want. Also, the information in the transaction log is there for the benefit of SQL Server, so not all of it is easily readable for us humans.

For instance, when we’re looking for a specific missing row the easiest way is to search for transaction log records that modified the physical location the row existed on. While the actual row data and SQL details are in the log, they’re encoded, so getting to them is a lot of work.

We’re looking for a specific row in a specific table, so we need the following 3 bits of information to identify its physical location:

  • Datafile ID
  • Page ID
  • Slot ID

These can be found for a specific row with the following query, using the fn_PhysLocFormatter function, which handily returns the physical location of a row in a table:

select db_id(),sys.fn_PhysLocFormatter(%%physloc%%) from dbo.Objects where ObjectID='1'

This will give you a result like:

db_id | File:Page:Slot
5     | (1:336:58)

The reason we were saving this from the previous iteration of the loop is because it wouldn’t exist once the data had been deleted.

Now all we need to do is search the transaction log. However, there’s one more small step before we can do that: in the transaction log the FileID and PageID are stored as fixed-length hexadecimal values, and we’ve just pulled them out in decimal format.

I convert them with PowerShell as it’s a little easier for me to remember:

PS C:\> '{0:x4}' -f 1
0001
PS C:\> '{0:x8}' -f 336
00000150

Note how the FileID and PageID are padded out to 4 characters and 8 characters respectively. This query will return all the transactions that affected that slot on that page in that file. Depending on how busy that page is, that might be a lot.

select * from fn_dblog(NULL,NULL) where [Page ID]='0001:00000150' and [Slot ID]='58'
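As an aside, the same fixed-length padding exists in any language with printf-style formatting; here's a quick Python equivalent (illustrative only) for building the [Page ID] search string from the example values above:

```python
# Build the fixed-length hex [Page ID] value from the decimal File and Page IDs
file_id, page_id, slot_id = 1, 336, 58      # example values from fn_PhysLocFormatter
page_search = '{0:04x}:{1:08x}'.format(file_id, page_id)
print(page_search)   # 0001:00000150
```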

Finding out who and when

You’ll instantly find out just how much information there is in a transaction log record. So let’s trim the data down to a couple of things we want to know:

  • Who issued the command
  • Exactly when it was run

And also limit it to just delete records:

select [Transaction ID],[Begin Time],[Transaction SID] from fn_dblog(NULL,NULL) where [Page ID]='0001:00000150' and [Slot ID]='58' and Operation='LOP_DELETE_ROWS'

If you’re lucky and this returns a single row, you’ll find that only [Transaction ID] is populated, as the other goodies are recorded at the transaction wrapper level, not the statement level. So we take the Transaction ID and use that:

select [Transaction ID],[Begin Time],suser_sname([Transaction SID]) from fn_dblog(NULL,NULL) where [Transaction ID]='0000:000003ba'

This returns all the rows for the specified transaction in the transaction log. You’ll have 2 rows for the transaction wrapper, the BEGIN and the COMMIT; these will be the first and last rows.

The rows in between will be the actual deletion records. You might have one or more, depending on how much data was in the row, or whether it was part of a delete that removed multiple rows.

The BEGIN transaction record contains the information you wanted:

  • [Begin Time] gives you the time the delete started
  • [Transaction SID] gives you the Security Identifier of the account that executed the transaction.

We use SUSER_SNAME to convert the SID to a username. This is the reason I mentioned that you want to do this on the same instance the original database was on: it makes sure the SIDs match up and you get the right name!

Conclusion

As you can see, everything you’d ever want to know about what happened in SQL Server is in the transaction log. But it’s not an easy beast to work with; it’s much better to put proper auditing methods in place so you never have to do this.

PowerShell

Failing a PowerShell Sql Agent step

SQL Server Agent offers some great branching facilities to cope with step failures. These work really well with T-SQL and SSIS steps, but how can you get your PowerShell steps to report failure properly?

This question was put to me at SQL Bits over the weekend. The asker was using PowerShell to move files around prior to importing them. They wanted the PowerShell step to report failure properly so their Agent logic would rerun the step.

You might want the same response to make sure your monitoring is letting you know when jobs fail. On that note, it would also be nice if you could raise an error or failure message in your PowerShell step and have that propagate back up to SQL Server.

Unfortunately the usual scripting standbys of returning 0 or $false don’t work.

The solution is to Throw an error. This gives a simple way to register a failed step and raise an intelligible error back to SQL Server. At its most basic, it’s just a matter of adding one line to your PowerShell script.

For example:

Throw "File Not Found"

So in the example I was looking at, you might have something like this:

If (-not (Test-Path -Path c:\imports\file.csv)){
    Throw "File c:\imports\file.csv does not exist"
} 
# And the rest of the script goes here

So now if the file is missing, the job step will fail AND you’ll have a nice error message in the job history.

For even more information you can have multiple throws in your script:

If (-not (Test-Path -Path c:\imports\file.csv)){
    Throw "File c:\imports\file.csv does not exist"
} 
$content = Get-Content c:\imports\file.csv -raw
if ($content.length -eq 0){
    Throw "File is empty"
}


So you can really narrow down the reasons for the SQL Agent job step failing.

This technique works with both the PowerShell and CmdExec SQL Agent step methods of running PowerShell scripts.

PowerShell

Converting SQL Server LSNs with PowerShell

If you’ve spent much time playing with SQL Server backups and restores then you’ll have discovered Log Sequence Numbers (LSNs), I’ve discussed before How SQL Server LSNs are generated, and went through how to manually convert a hexadecimal LSN to a numeric value.

As part of some work I was doing on the dbatools module I needed something to convert the Log Sequence Numbers coming back from the transaction log into numeric LSNs so I could use them to query the backup history tables in msdb.

Thinking this might be useful to other people, I’ve chopped out the dbatools specific code, and present the function here:

function Convert-LSN {
    <#
    .SYNOPSIS
    Converts LSNs between hexadecimal and numeric formats
    .DESCRIPTION
    Function takes an LSN in either split hexadecimal format (00000014:000000f3:0001) or numeric format.
    It then returns both formats in an object
    .PARAMETER LSN
    The LSN value to be converted
    .EXAMPLE
    PS C:\> $output = Convert-LSN -LSN 00000014:000000f3:0001
    Will return object $output with the following values
    $output.Hexadecimal = 00000014:000000f3:0001
    $output.Numeric = 20000000024300001
    #>
    [CmdletBinding()]
    param(
        [string]$LSN
    )
    if ($LSN -match '^[a-fA-F0-9]{8}:[a-fA-F0-9]{8}:[a-fA-F0-9]{4}$') {
        Write-Verbose -Message 'Hexadecimal LSN passed in, converting to numeric'
        $sections = $LSN.Split(':')
        # Int64 as LSN sections can overflow Int32
        $sect1 = [System.Convert]::ToInt64($sections[0], 16).ToString()
        $sect2 = [System.Convert]::ToInt64($sections[1], 16).ToString().PadLeft(10, '0')
        $sect3 = [System.Convert]::ToInt64($sections[2], 16).ToString().PadLeft(5, '0')
        $Hexadecimal = $LSN
        $Numeric = $sect1 + $sect2 + $sect3
    } elseif ($LSN -match '^[0-9]{15}[0-9]+$') {
        Write-Verbose -Message 'Numeric LSN passed in, converting to hexadecimal'
        # Work from the back of the string: the last 5 digits are the third
        # section, the 10 before those the second, and whatever is left the first
        $sect1 = [System.Convert]::ToString([int64]$LSN.Substring(0, $LSN.Length - 15), 16).PadLeft(8, '0')
        $sect2 = [System.Convert]::ToString([int64]$LSN.Substring($LSN.Length - 15, 10), 16).PadLeft(8, '0')
        $sect3 = [System.Convert]::ToString([int64]$LSN.Substring($LSN.Length - 5, 5), 16).PadLeft(4, '0')
        $Numeric = $LSN
        $Hexadecimal = $sect1 + ':' + $sect2 + ':' + $sect3
    } else {
        Write-Warning -Message 'LSN passed in is neither numeric nor in the correct hexadecimal format'
        return
    }
    [PSCustomObject]@{
        Hexadecimal = $Hexadecimal
        Numeric     = $Numeric
    }
}

It’s standard PowerShell good practice to include a decent help and examples block.

I use a couple of regex expressions to make sure we’re getting what we expect. Both are based off of the Microsoft documentation for the LSN formats. So for the numeric LSN we’re looking for a long string of digits (at least 16), and for the hexadecimal LSN representation we’re after two 8-character hex numbers and a 4-character one, separated by colons (‘:’). If the LSN passed in doesn’t match either of those, we raise a warning as we can’t do anything with it.

For hex to numeric, we split the string on ‘:’, convert each section from hex to int64 (using int64 as LSNs can get really big), then convert to a string and pad with the requisite number of 0s.

For numeric to hex we work from the back of the LSN, as the first section is variable length so we can’t count from the front.
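If it helps to see the algorithm outside PowerShell, here's a rough Python sketch of the same two conversions (illustrative only; the section sizes follow the 10-digit and 5-digit padding used above, and the function names are my own):

```python
def lsn_hex_to_numeric(lsn):
    """Hex 'aaaaaaaa:bbbbbbbb:cccc' -> numeric string (pad middle to 10, last to 5)."""
    first, middle, last = lsn.split(':')
    return str(int(first, 16)) + str(int(middle, 16)).zfill(10) + str(int(last, 16)).zfill(5)

def lsn_numeric_to_hex(lsn):
    """Numeric string -> hex form, slicing from the back as the first section varies in length."""
    first, middle, last = lsn[:-15], lsn[-15:-5], lsn[-5:]
    return '{0:08x}:{1:08x}:{2:04x}'.format(int(first), int(middle), int(last))

print(lsn_hex_to_numeric('00000014:000000f3:0001'))   # 20000000024300001
```

The round trip works because each section is padded back to its fixed width on the way out.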

Hitting brick walls – T-SQL Tuesday #105

It’s time for another T-SQL Tuesday. This month Wayne (blog|Twit) asked us to write about when we’ve run up against a brick wall during our SQL Server careers.

I thought I’d start off with the first brick wall SQL Server placed in front of me:

Brickwall 1

My first foray into SQL Server was taking over a failing project to rewrite a business application into Visual Basic 6 and make it work better, and be less fragile, with multiple concurrent users. The brick wall here was that the backend was something nasty in MS Access. How could I work around that?

The small amount of the internet around at that time seemed to think that this thing called SQL Server, which Microsoft sold, was like Access on steroids. Could this be what I needed? No one else in the department at that time knew anything about it.

At the time we had a site license for Microsoft Office, so Access was free. SQL Server was going to cost some money, which was never a popular proposal. Oh, and we didn’t have anything to host it on. Our server estate consisted of a couple of NT 3.51 DCs, fileshare and print servers. More money needed there.

So my first SQL Server brick wall was learning to write a business case. Something I’d never done before; I’d always been a resource that other people’s business cases claimed for a couple of months. So I got hold of those business cases, read them, and came up with one myself.

Then I needed to make sure it wasn’t a complete turkey, so I got myself some time with someone who’d be reading it, for feedback. This kept going back and forth till it was ready to go to the money controllers.

After one small set of revisions they accepted it, and I had the money. Brilliant news, brickwall smashed through!

Unfortunately just behind it was

Brickwall 2

Had I mentioned that I didn’t know anything about SQL Server at that point? And now I had to install it, configure it, and get an application running and performing well on it.

This was back in the days before YouTube, PluralSight and all the other learning resources we take for granted now. So I loaded the credit card up with the latest books from Wrox and O’Reilly, cranked up the kettle and read late into many nights.

Suddenly I realised I had more problems than I thought. SQL Server wasn’t just Access on steroids, it was a different way of thinking! I couldn’t just throw the Access stuff I had at it and hope for the best.

So began a crash course in database design, learning what a view was and just why I should be using Stored Procedures. Oh, and the joys of left, right, inner, outer and cartesian joins.

I learnt to build quick demo databases and datasets so I could play with test cases. This skill has been so useful in the 20+ years since then. There’s been no end of other brickwalls that I chipped at brick by brick by running through test case after test case.

I got damn proficient at backups and restores so I could reset back to just before I’d made a major mistake. I learnt how to performance tune my queries so I could work out just what I was doing wrong.

This wasn’t a smash through like Wall 1; this was a constant chipping away for 2 months until eventually I could push the new app out of the door and be confident it wouldn’t come straight back to bite me. The fact it was still in use (with some tweaks) 8 years later backs up the idea that I learnt something.

Walls 3….n

And the walls since then? There was learning to cluster with Wolfpack. Surviving my first corrupted disks. Learning PowerShell. Tuning Dynamics CRM. Moving databases up to the cloud.

Just this week the wall I’m chipping away at is trying to get Teams, Poshbot and SQL Server talking nicely so we can get some SQL ChatOps going. This is definitely a chipper rather than a jumping wall.

My feeling is that if you’re not hitting brick walls, you’re probably not pushing yourself or SQL Server hard enough! Every time I’ve hit a brick wall, I’ve either improved as a DBA or as a human. Each wall looks like a problem, but knowing I’ve gone through, over or even around so many in the past just makes them all look a bit smaller. And just like dominoes, once I’d got the first down the rest have just kept going.

Generating a list of full text word breakers for SQL Server

The post “Generating a list of full text word breakers for SQL Server” was originally published for Leaf Node, but as that site is being wound down I’m republishing it here.

Why do you need a list of full text word breakers? SQL Server Full Text works by breaking text down into fragments we’d normally call words, and then working on those fragments. In English we know what most of those are likely to be: ‘ ‘, ‘.’, ‘,’ (space, full stop, comma) and some others. But what about other characters? What happens with a ‘? And what about in other collations? This is why a list of full text word breakers is very useful to have.

This isn’t going to be a nice handy list of full text word breakers, but it will show how to quickly get such a list.

The basic principle is going to rely on sys.dm_fts_parser. As described on the linked page this takes a string, and then breaks it down into ‘tokens’ based on the defined language, word breaker, thesaurus and stop list. As an example:

select * FROM sys.dm_fts_parser ('"the quick brown fox jumpsover"', 1033, 0, 0)

The typo is deliberate, for once! And this returns:
Splitting full text strings into tokens in SQL Server

So it’s returned a row for every token it can find. This example was nice and simple as it only contains words and spaces. The typo was to show that without a word breaker SQL Server will just return the string without interpreting it. So that was spaces; now to confirm for other characters. First of all, commas:

select * FROM sys.dm_fts_parser ('"quick,brown"', 1033, 0, 0)

Splitting full text strings with word breakers in SQL Server
So, that returns 2 rows as expected. How about hyphens?

select * FROM sys.dm_fts_parser ('"quick-brown"', 1033, 0, 0)

Interesting, we get 3 rows this time. This makes sense when you think that hyphenation isn’t an exact science: some people will use hyphens, and some won’t. So by combining both ‘sides’ of the hyphenated word, SQL Server can hopefully match both uses.

So, that’s the basic theory. This little piece of T-SQL is going to loop through all 255 ASCII characters. For each one we use it to join 2 ‘words’, and then run that string through sys.dm_fts_parser. If the function returns more than 1 row we know that it’s found a word breaker, so we then output the character, and the character code, as not all the characters are printable. You’ll also notice that code 34 throws an error; that’s because it’s " which is a reserved character within Full Text searches.

declare @i integer
declare @cnt integer
set @i=0
while @i<255
begin
  set @cnt=0
  select @cnt=COUNT(1) FROM sys.dm_fts_parser ('"word1'+CHAR(@i)+'word2"', 1033, 0, 0)
  if @cnt>1
  begin
    print 'this char - '+char(@i)+' - char('+convert(varchar(3),@i)+') is a word breaker'
  end
  set @i=@i+1
end

Which gives a nice long list:

this char -  - char(1) is a word breaker
this char -  - char(2) is a word breaker
this char -  - char(3) is a word breaker
this char -  - char(4) is a word breaker
this char -  - char(5) is a word breaker
this char -  - char(6) is a word breaker
this char -  - char(7) is a word breaker
this char -  - char(8) is a word breaker
this char - 	 - char(9) is a word breaker
this char - 
 - char(10) is a word breaker
this char -  - char(11) is a word breaker
this char -  - char(12) is a word breaker
this char - 
 - char(13) is a word breaker
this char -  - char(14) is a word breaker
this char -  - char(15) is a word breaker
this char -  - char(16) is a word breaker
this char -  - char(17) is a word breaker
this char -  - char(18) is a word breaker
this char -  - char(19) is a word breaker
this char -  - char(20) is a word breaker
this char -  - char(21) is a word breaker
this char -  - char(22) is a word breaker
this char -  - char(23) is a word breaker
this char -  - char(24) is a word breaker
this char -  - char(25) is a word breaker
this char -  - char(26) is a word breaker
this char -  - char(27) is a word breaker
this char -  - char(28) is a word breaker
this char -  - char(29) is a word breaker
this char -  - char(30) is a word breaker
this char -  - char(31) is a word breaker
this char -   - char(32) is a word breaker
this char - ! - char(33) is a word breaker
Msg 7630, Level 15, State 3, Line 7
Syntax error near 'word2' in the full-text search condition '"word1"word2"'.
this char - # - char(35) is a word breaker
this char - $ - char(36) is a word breaker
this char - % - char(37) is a word breaker
this char - & - char(38) is a word breaker
this char - ( - char(40) is a word breaker
this char - ) - char(41) is a word breaker
this char - * - char(42) is a word breaker
this char - + - char(43) is a word breaker
this char - , - char(44) is a word breaker
this char - - - char(45) is a word breaker
this char - . - char(46) is a word breaker
this char - / - char(47) is a word breaker
this char - : - char(58) is a word breaker
this char - ; - char(59) is a word breaker
this char - < - char(60) is a word breaker
this char - = - char(61) is a word breaker
this char - > - char(62) is a word breaker
this char - ? - char(63) is a word breaker
this char - @ - char(64) is a word breaker
this char - [ - char(91) is a word breaker
this char - \ - char(92) is a word breaker
this char - ] - char(93) is a word breaker
this char - ^ - char(94) is a word breaker
this char - { - char(123) is a word breaker
this char - | - char(124) is a word breaker
this char - } - char(125) is a word breaker
this char - ~ - char(126) is a word breaker
this char -  - char(127) is a word breaker
this char -  - char(129) is a word breaker
this char - ‚ - char(130) is a word breaker
this char - „ - char(132) is a word breaker
this char - … - char(133) is a word breaker
this char - † - char(134) is a word breaker
this char - ‡ - char(135) is a word breaker
this char - ‰ - char(137) is a word breaker
this char - ‹ - char(139) is a word breaker
this char -  - char(141) is a word breaker
this char -  - char(143) is a word breaker
this char -  - char(144) is a word breaker
this char - “ - char(147) is a word breaker
this char - ” - char(148) is a word breaker
this char - • - char(149) is a word breaker
this char - – - char(150) is a word breaker
this char - — - char(151) is a word breaker
this char - › - char(155) is a word breaker
this char -  - char(157) is a word breaker
this char -   - char(160) is a word breaker
this char - ¡ - char(161) is a word breaker
this char - ¦ - char(166) is a word breaker
this char - § - char(167) is a word breaker
this char - « - char(171) is a word breaker
this char - ­ - char(173) is a word breaker
this char - ± - char(177) is a word breaker
this char - ¶ - char(182) is a word breaker
this char - · - char(183) is a word breaker
this char - » - char(187) is a word breaker
this char - ¿ - char(191) is a word breaker
this char - × - char(215) is a word breaker
this char - ÷ - char(247) is a word breaker

As you can see there’s a lot of word breakers in there. And you can generate an even larger set if you use unicode, though this will need to loop through a lot more than 255 characters, and you’ll need to use nchar() instead of char().

Rolling Back SQL Server agent job owner changes

So in this post I showed you a technique for updating the owner on all your SQL Agent jobs.

Whilst this isn’t a major change, you do need to have checked that the new ownership account has the correct permissions for some things (e.g. EXECUTE AS or CmdExec steps). With this in mind, someone asked me how I’d back this out. Either for all jobs, or for that one specific MUST NOT FAIL job that is failing.

As we have 2 ways to change them, we have 2 ways to back out the changes. Note that both of these require you to have done a small amount of work up front, there is no magic rollback option 🙁

PowerShell and dbatools

This one is a little simpler as we can use a lot of PowerShell plumbing to do the lifting for us. To set this up we use:

Get-DbaAgentJob -SqlInstance server1 | Select-Object Name, OwnerLoginName | ConvertTo-Csv | Out-File .\server1-JobOwners.csv

If you’re not a PowerShell user I’ll just break this down into steps:

  1. We use dbatools’ `Get-DbaAgentJob` cmdlet to get all the Agent jobs on Server1
  2. We pipe the output to `Select-Object`, selecting just the job Name and its owner’s login name
  3. The output is then converted to a CSV object
  4. Finally we write the data to a file in the current folder

So we’ve created a nice csv list of the original owners of our jobs, great. Now how do we use those to roll back?
To roll back every job:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

To rollback just one job it’s very similar, we just insert a Where-Object step into the pipeline:

Get-Content .\server1-JobOwners.csv | ConvertFrom-Csv | Where-Object Name -eq 'My Job' | Foreach-Object {Set-DbaAgentJob -SqlInstance Server1 -Job $_.Name -OwnerLoginName $_.OwnerLoginName}

In both cases we read in our CSV and convert it into PowerShell objects. For a single job we use Where-Object to filter down to that one job; you could also use -like if you wanted to pattern match. The remaining records then get piped through a ForEach-Object loop, where we use Set-DbaAgentJob to reset the job’s owner.

T-SQL

The pre setup for this one involves creating and populating a table to store the old values:

CREATE TABLE JobOwnerHistory(
  Job_Id char(36),
  JobName nvarchar(128),
  OwnerLoginName nvarchar(128)
)

INSERT INTO JobOwnerHistory
  SELECT 
    sj.job_id, 
    sj.name as 'JobName',
    sp.name as 'OwnerLoginName' 
  FROM 
    msdb..sysjobs sj 
    INNER JOIN sys.server_principals sp on sj.owner_sid=sp.sid

So now, resetting a job’s owner is just a modification of our original script:

DECLARE @job_id char(36)
DECLARE @OwnerLoginName nvarchar(128)
DECLARE JobCursor CURSOR
FOR
SELECT 
  job_id,
  OwnerLoginName
FROM
  JobOwnerHistory
--WHERE JobName LIKE '%Finance%'
OPEN JobCursor 
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
WHILE (@@FETCH_STATUS <> -1)
BEGIN
exec msdb..sp_update_job
    @job_id = @job_id,
    @owner_login_name = @OwnerLoginName
FETCH NEXT FROM JobCursor INTO @job_id, @OwnerLoginName
END 
CLOSE JobCursor 
DEALLOCATE JobCursor 

As written, that will reset every job in JobOwnerHistory; if you want to filter down to a subset of jobs you’d uncomment and modify the WHERE line.

Hope those examples are helpful. If you’ve stumbled across this page and it doesn’t quite fix your problem, please drop a comment and I’ll try to lend a hand.

Bulk changing SQL Server Agent job owners with dbatools and T-SQL

Now we all know that having SQL Server Agent jobs owned by ‘Real’ users isn’t a good idea. But I don’t keep that close an eye on some of our test instances, so wasn’t surprised when I spotted this showing up in the monitoring:

The job failed. Unable to determine if the owner (OldDeveloper) of job important_server_job has server access (reason: Could not obtain information about Windows NT group/user 'OldDeveloper', error code 0x534. [SQLSTATE 42000] (Error 15404)).

Wanting to fix this as quickly and simply as possible I just wanted to bulk move them to our job owning account (let’s use the imaginative name of ‘JobOwner’).

I have 2 ways of doing this quickly. First up is an old T-SQL script I wrote years back (yeah, this happens all the time). In this case we’re looking for all the jobs owned by the user OldDeveloper and moving them to the JobAccount user. It uses a cursor, but they aren’t always the spawn of evil 🙂

DECLARE @job_id char(36)
DECLARE JobCursor CURSOR
FOR
SELECT job_id  FROM msdb..sysjobs sj 
    INNER JOIN sys.server_principals sp on sj.owner_sid=sp.sid
    WHERE sp.name = 'OldDeveloper'
OPEN JobCursor 
FETCH NEXT FROM JobCursor INTO @job_id
WHILE (@@FETCH_STATUS <> -1)
BEGIN
exec msdb..sp_update_job
    @job_id = @job_id,
    @owner_login_name = 'JobAccount'
FETCH NEXT FROM JobCursor INTO @job_id
END 
CLOSE JobCursor 
DEALLOCATE JobCursor 

And then, of course, there’s a simpler dbatools method in PowerShell:

Get-DbaAgentJob -SqlInstance server\instance | Where-Object {$_.OwnerLoginName -eq 'OldDeveloper'} | Set-DbaAgentJob -OwnerLogin JobAccount

Use Get-DbaAgentJob to get all the SQL Agent jobs from your instance, a quick pipe through Where-Object to filter down to just the ones with a ‘bad’ owner, and finally into Set-DbaAgentJob to set the new owner login.

You can use whichever you prefer as they’ll do the same job. I find the PowerShell/dbatools the easiest one as I can remember the syntax, the T-SQL one I have to remember where I left the original copy 😉

Can’t find a recently deployed SSIS package

Had a developer wander across confused about where the SSIS package he’d just redeployed had gone. No errors when he’d deployed the package to Integration Services, but when he looked at the properties of the package (or at least the one he thought he’d uploaded) it was showing a deployment date of 3 weeks ago.

A handy feature of SSIS now using SSISDB to store its packages is that we can query it with some simple T-SQL to find out where a package has gone. In this case all I needed to do was ask:

select 
	ip.name, 
	id.name as 'FolderName', 
	ip.deployed_by_name, 
	ip.last_deployed_time 
from 
	internal.projects ip inner join internal.folders id on ip.folder_id=id.folder_id
order by 
	ip.last_deployed_time desc

And I could see that the last time he’d deployed it, it had gone into Folder Y rather than Folder X. So after a quick clean up and redeploy everything was as he expected.

How are SQL Server Log Sequence Numbers (LSNs) generated?

If you’ve ever dug down in the SQL Server transaction logs or had to build up restore chains, then you’ll have come across Log Sequence Numbers (LSNs). Ever wondered why they’re so large, why they all look suspiciously the same, why they don’t start from 0, and just how SQL Server generates them? Well, here we’re going to take a look at them.

Below we’ll go through examples of how to look inside the current transaction log and backed-up transaction logs. This will involve using some DBCC commands and the undocumented fn_dblog and fn_dump_dblog functions. The last 2 are very handy for digging into SQL Server internals, but be wary about running them on a production system without understanding what’s going on; they can leave file handles and processes behind that can impact your system.

So with that warning out of the way, let’s push on.

Complex SQL Server restore scenarios with the dbatools Restore-DbaDatabase pipeline

No matter how hard the dbatools team try, there’s always someone who wants to do things we’d never thought of. This is one of the great things about getting feedback direct from a great community. Unfortunately a lot of these ideas are either too niche to implement, or would require a lot of complex code for a single use case.

As part of the Restore-DbaDatabase stack rewrite, I wanted to make it easier for users to get their hands dirty within the restore stack. Not necessarily by diving into the core code and the world of GitHub Pull Requests, but by manipulating the data flowing through the pipeline using standard PowerShell techniques, all the while letting our code do the heavy lifting.

So, below the fold we’ll be looking at some examples of how you can start going to town with your restores.

