Whilst going through my blog roll on Feedly this weekend, I spotted Kevin Marquette’s blog post announcing his new Chronometer module.
At a high level, it generates a script profiler timeline as you run through a script, showing you how long each line takes to run, which lines weren't run, and how many times each line was called.
I've just finished writing a number of functions for the dbatools module to implement some new restore functionality, so this seemed like a great way to peer inside my functions, see what was going on, and spot where most of the time was being spent. That should make it easier to work out where I need to focus my attention to improve performance.
At its simplest, it can be called like this:
$start = Get-Date
$Chronometer = @{
    Path = 'c:\dbatools\module\functions\Restore-DbaDatabase.ps1'
    Script = {Restore-DbaDatabase -SqlServer localhost\sqlexpress2016 -Path C:\dbatools\backups -WithReplace -OutputScriptOnly}
}
$results = Get-Chronometer @Chronometer -Verbose
(New-TimeSpan -Start $start -End (Get-Date)).TotalSeconds
This took about 10 seconds to run on my test box and dumped out all the coverage for the Restore-DbaDatabase function.
So what do we find under the hood?
The line taking the longest to execute was this:
[0370ms,001,370ms] 293: $FilteredFiles | Restore-DBFromFilteredArray -SqlServer $SqlServer -DBName $databasename -SqlCredential $SqlCredential -RestoreTime $RestoreTime -DestinationDataDirectory $DestinationDataDirectory -DestinationLogDirectory $DestinationLogDirectory -NoRecovery:$NoRecovery -Replace:$WithReplace -Scripts:$OutputScript -ScriptOnly:$OutputScriptOnly -FileStructure:$FileMapping -VerifyOnly:$VerifyOnly -UseDestinationDefaultDirectories:$UseDestinationDefaultDirectories -UseSourceDirectories:$UseSourceDirectories
Which isn't that unexpected, as Restore-DBFromFilteredArray is the worker function that actually does the restore once we've gathered all the required files.
So how could I look at what was happening inside that function as well? To do that, I'd need to add that function's file to the Path property of the Chronometer hashtable:
$start = Get-Date
$Chronometer = @{
    Path = ('c:\dbatools\module\functions\Restore-DbaDatabase.ps1',
            'c:\dbatools\module\internal\Restore-DbFromFilteredArray.ps1')
    Script = {Restore-DbaDatabase -SqlServer localhost\sqlexpress2016 -Path C:\dbatools\backups -WithReplace -OutputScriptOnly}
}
$results = Get-Chronometer @Chronometer -Verbose
(New-TimeSpan -Start $start -End (Get-Date)).TotalSeconds
So far, nice and simple. But that's only one of the helper functions Restore-DbaDatabase relies on, so I'd have to add them all for a proper check of the function. That's not too much of a problem when it's code I've written, as I know all the files I need, but it's a bit trickier for someone else trying to track down a problem in a couple of months' or years' time.
My first thought was to try adding all the ps1 files from the module to the path:
$path = (Get-ChildItem C:\dbatools\module\internal, c:\dbatools\module\ -Filter *.ps1 -Recurse).FullName
$start = Get-Date
$Chronometer = @{
    Path = $path
    Script = {Restore-DbaDatabase -SqlServer localhost\sqlexpress2016 -Path C:\dbatools\backups -WithReplace -OutputScriptOnly}
}
$results = Get-Chronometer @Chronometer -Verbose
(New-TimeSpan -Start $start -End (Get-Date)).TotalSeconds
Unfortunately that took 17 minutes, which is a little too long to run repeatedly. It's not that surprising, as the module works by adding a breakpoint to every line of the scripts passed in. For the dbatools module that's currently 39370 lines, which is a lot of breakpoints!
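You can estimate the breakpoint count for yourself before committing to a run, since it's essentially just the line count of everything you pass in. Here's a rough sketch of that check, demonstrated on an in-memory sample script so it runs anywhere; against a real module you'd pipe the Get-ChildItem results into the same pipeline (paths in the comment are from my dev box):

```powershell
# Rough sketch: estimate how many breakpoints Chronometer would set by
# counting the lines of script it's given. Uses an in-memory sample script
# rather than a real module path, so it runs anywhere.
$sampleScript = @'
function Get-Example {
    param ($Name)
    "Hello $Name"
}
'@

# Against a real module checkout you'd build $lines with something like:
#   Get-ChildItem C:\dbatools\module -Filter *.ps1 -Recurse | Get-Content
$lines = $sampleScript -split "`n"
$lineCount = ($lines | Measure-Object).Count
"Chronometer would set roughly $lineCount breakpoints"
```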
So then I started thinking about how to pass in only the functions that were actually called by the function I was interested in, and I came up with this:
Function Get-FunctionCalls {
<#
.SYNOPSIS
Gets the functions called in a script, and the files containing those definitions

.PARAMETER FunctionName
The root function name you want to process

.PARAMETER FunctionPaths
An object linking a function's name to the file containing its definition. Built by the first call, then passed to later calls.

.PARAMETER ModulePath
Path to the root of the PowerShell module you're analyzing

.PARAMETER Depth
How deep to recurse into the function calls. The default is 1, which just loads the functions called within the first function. Higher numbers increase the level of recursion, increasing the number of function definitions retrieved and slowing down the Chronometer run. 0 will only load the definition of the root function.

.EXAMPLE
$Path = Get-FunctionCalls -FunctionName Restore-DbaDataBase -ModulePath C:\dbatools\module\ -depth 1
$Chronometer = @{
    Path = $Path.FileName
    Script = {Restore-DbaDatabase -SqlServer localhost\sqlexpress2016 -Path C:\dbatools\backups -WithReplace -OutputScriptOnly}
}
$results = Get-Chronometer @Chronometer

Scans all the ps1 files in c:\dbatools\module for function definitions, then scans the definition of Restore-DbaDatabase for function calls. As depth is set to 1, it will then add the definitions of those called functions from within the module to the Chronometer path. Higher depth values will scan for the definitions of functions called in the next tier of functions, recursively.

.NOTES
Original Author: Stuart Moore (@napalmgram), https://stuart-moore.com
Source tracked at https://github.com/Stuart-Moore/VariousHelperFunctions/blob/master/Get-FunctionCalls.ps1

Written as a helper function to Kevin Marquette's Chronometer module ()

This parses the function to be analysed for its definition and the other functions it calls, so that an entire stack can be monitored without needing to manually build up the path, or add every file in the module and kill performance.

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>.
#>
    [CmdletBinding()]
    param (
        [string[]]$FunctionName,
        [object[]]$FunctionPaths,
        [String]$ModulePath,
        [int]$depth = 1
    )
    if ($null -eq $FunctionPaths) {
        Write-Verbose "$FunctionName - Need to populate the paths to Function calls"
        $Files = Get-ChildItem $ModulePath -Filter *.ps1 -Recurse
        $FunctionPaths = @()
        $null = $Files | ForEach-Object {
            Get-Content $_.FullName |
                Where-Object { $_ -match 'function\s(\w+-\w+)' } |
                ForEach-Object { $FunctionPaths += [PSCustomObject]@{ Function = $Matches[1]; FileName = $_.PsPath } }
        }
    }
    $FunctionFile = $FunctionPaths | Where-Object { $_.Function -eq $FunctionName }
    $FileContents = Get-Content ($FunctionFile.FileName)
    $Functions = $FileContents | Where-Object { $_ -match '[^\s|(]\w+-\w+' } | ForEach-Object { $Matches[0] } | Select-Object -Unique
    $results = @()
    if ($depth -ge 1) {
        foreach ($Function in $Functions) {
            $results += $FunctionPaths | Where-Object { $_.Function -eq $Function }
        }
    }
    else {
        $results = $FunctionFile
    }
    if ($depth -gt 1) {
        foreach ($call in $results) {
            $results += Get-FunctionCalls -FunctionName $call.Function -FunctionPaths $FunctionPaths -depth ($depth - 1) -ModulePath $ModulePath
        }
    }
    $results | Select-Object -Unique -Property Function, FileName
}
Which is then called like this:
$Path = Get-FunctionCalls -FunctionName Restore-DbaDataBase -ModulePath C:\dbatools\module\ -depth 1
$Chronometer = @{
    Path = $Path.FileName
    Script = {Restore-DbaDatabase -SqlServer localhost\sqlexpress2016 -Path C:\dbatools\backups -WithReplace -OutputScriptOnly}
}
$results = Get-Chronometer @Chronometer
(Latest version up on github – https://github.com/Stuart-Moore/VariousHelperFunctions/blob/master/Get-FunctionCalls.ps1 )
This returns in about 50 seconds on my dev laptop, which isn’t too shabby and makes it reasonable to run after small code revisions.
Thanks to this, I could now drill down into the function calls to see what was taking the time in Restore-DbFromFilteredArray. The goodish news is that there's nothing majorly wrong in there; it just does a lot of looping, so while each individual call is quite short, it makes that call a great many times. That could be an option for offering a -Turbo switch, if the caller is convinced their files are all correct and present.
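To make that idea concrete, here's a purely hypothetical sketch of what such a switch could look like; neither -Turbo nor Restore-Sketch exist in dbatools, this just shows the shape of a switch parameter that short-circuits per-file validation inside a hot loop:

```powershell
# Hypothetical sketch only - neither -Turbo nor Restore-Sketch are real
# dbatools commands. The idea: when the caller vouches for their files,
# skip the per-file checks that dominate the loop.
function Restore-Sketch {
    [CmdletBinding()]
    param (
        [string[]]$Files,
        [switch]$Turbo
    )
    foreach ($file in $Files) {
        if (-not $Turbo) {
            # the expensive per-file validation Turbo mode would skip
            if (-not (Test-Path $file)) {
                Write-Warning "Missing file: $file"
                continue
            }
        }
        # ... the actual restore work would go here ...
        $file
    }
}
```

With -Turbo the loop emits every file unchecked; without it, missing files are filtered out (at the cost of a Test-Path per iteration).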
Kevin's Chronometer module looks like it's got some real potential, so I'll certainly be keeping an eye on it.