SQL Server, getting the size of table partitions with T-SQL

Needed to quickly tell someone how large the partitions on a CRM 2011 auditbase table were; neither of us had access to the nice CRM GUI to answer it (ah, the joys of not being trusted!). So if anyone else needs something similar, this is what I came up with:

-- A page is 8KB, so dividing a page count by 128 gives MB
SELECT
    part.partition_number,
    SUM(alloc.total_pages) / 128 AS TotalTableSizeInMB,
    SUM(alloc.used_pages) / 128 AS UsedSizeInMB,
    SUM(alloc.data_pages) / 128 AS DataSizeInMB
FROM sys.allocation_units AS alloc
INNER JOIN sys.partitions AS part ON alloc.container_id = CASE WHEN alloc.type IN (1, 3) THEN part.hobt_id ELSE part.partition_id END
LEFT JOIN sys.indexes AS idx ON idx.object_id = part.object_id AND idx.index_id = part.index_id
WHERE part.object_id = OBJECT_ID('auditbase')
GROUP BY part.partition_number

If you compare the numbers with those provided by CRM you’ll notice a difference:
[Screenshot: partition sizes as returned by the T-SQL query]

[Screenshot: partition sizes as shown in Dynamics CRM]

And that’s just because Dynamics CRM doesn’t include the space the indexes take up.
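If you want a figure closer to the one CRM shows, you can count only the data pages of the heap or clustered index (index_id 0 or 1). A rough sketch of that variant, wrapped in Invoke-Sqlcmd so it's easy to run from PowerShell; the server and database names are placeholders, so adjust for your environment:

# A sketch, not tested against every CRM version: count only data pages on the
# heap/clustered index (index_id 0 or 1) to approximate CRM's own figure
$query = @"
SELECT part.partition_number,
       SUM(CASE WHEN idx.index_id IN (0, 1) THEN alloc.data_pages ELSE 0 END) / 128 AS DataOnlySizeInMB
FROM sys.allocation_units AS alloc
INNER JOIN sys.partitions AS part ON alloc.container_id = CASE WHEN alloc.type IN (1, 3) THEN part.hobt_id ELSE part.partition_id END
LEFT JOIN sys.indexes AS idx ON idx.object_id = part.object_id AND idx.index_id = part.index_id
WHERE part.object_id = OBJECT_ID('auditbase')
GROUP BY part.partition_number;
"@
# Server and database names below are made up for the example
Invoke-Sqlcmd -ServerInstance "crm-sql-01" -Database "CRM_MSCRM" -Query $query | Format-Table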

New-AzureRmAutomationCredential returning “Input string was not in a correct format”

Having banged my head against this misleading error message, I thought I’d post this.

It’s nothing to do with any of the parameters you’re passing in; believe me, I checked.

It’s actually complaining that the Credential already exists! I was doing a redeployment and the credential hadn’t been removed.

And in other useful annoyances, Find-AzureRmResource doesn’t work for Automation Credentials. So now my deployment scripts look like this:

if (Get-AzureRmAutomationCredential -ResourceGroupName $rgname -Name "meadmin" -AutomationAccountName $AutoName) {
    Remove-AzureRmAutomationCredential -ResourceGroupName $rgname -Name "meadmin" -AutomationAccountName $AutoName
}

New-AzureRmAutomationCredential -AutomationAccountName $AutoName -ResourceGroupName $rgname -Name "meadmin" -Value $acred -Verbose

The term ‘Test-AzureResource’ is not recognized as the name of a cmdlet, function, script file, or operable program

Microsoft have followed through with the deprecation warning on Test-AzureResource and it’s finally been removed. So there are now a lot of PowerShell scripts out there that no longer work.

A quick alternative that involves nothing more than a search and replace (at least in my scripts so far!) is Find-AzureRmResource. In a boolean test the behaviour is the same: it evaluates to $false if the Azure resource doesn’t exist and $true if it does. So this snippet:

    if (!(Test-AzureResource `
        -ResourceName $vmName `
        -ResourceType "Microsoft.Compute/virtualMachines" `
        -ResourceGroupName $rgName)) {

can simply be replaced with:

    if (!(Find-AzureRmResource `
        -ResourceName $vmName `
        -ResourceType "Microsoft.Compute/virtualMachines" `
        -ResourceGroupName $rgName)) {

and everything’s good.

Using Powershell to push registry fixes around multiple machines

Needed to push the fix from Microsoft KB 911272 around a bunch of web servers in a hurry. Luckily, thanks to two good design decisions:

  • Enabling PS remoting
  • A sane machine naming convention of the form server-1, server-2

We could push the change around with a small bit of PowerShell:

# Grab admin credentials for the remote boxes
$ccred = Get-Credential

# KB 911272: add FCNMode under both the 64-bit and 32-bit ASP.NET keys,
# then restart IIS so the change takes effect
$ScriptBlock = {
    New-ItemProperty "HKLM:\Software\Microsoft\ASP.NET" -Name "FCNMode" -Value 1 -PropertyType "DWord"
    New-ItemProperty "HKLM:\SOFTWARE\Wow6432Node\Microsoft\ASP.NET" -Name "FCNMode" -Value 1 -PropertyType "DWord"
    iisreset
}

# Push the scriptblock to app-prod-web-1 through app-prod-web-8
$i = 1
while ($i -lt 9) {
    $cname = "app-prod-web-$i"
    Invoke-Command -ComputerName $cname -ScriptBlock $ScriptBlock -Credential $ccred
    $i++
}

We grab some credentials for the remote boxes (remember, separate accounts for desktop and server admin duties is a GOOD idea).

As we’re passing multiple commands to Invoke-Command, a scriptblock is the easiest way to do it, so we create one of those. We quickly use New-ItemProperty to add the required DWORD values under the right keys, and then run an iisreset to make sure they take effect.

A nice quick and dirty while loop allows us to push the scriptblock to the 8 servers in this set, and we’re done. A quick change of the $cname variable and I can push it to the next set as well. And another edit and it’s pushed into the staging environments as well.
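If you have a few of these sets, you could go one step further and loop over the name prefixes as well. A quick sketch (the prefixes here are made up to match the naming convention above):

# Hypothetical prefixes following the server-1, server-2 naming convention
$prefixes = "app-prod-web", "app-stage-web"
foreach ($prefix in $prefixes) {
    # Reuses the $ScriptBlock and $ccred from above for each of the 8 servers
    1..8 | ForEach-Object {
        Invoke-Command -ComputerName "$prefix-$_" -ScriptBlock $ScriptBlock -Credential $ccred
    }
}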

Nottingham SQL Server User group begins

After many successful years running in Leicester we’re proud to announce that the PASS East Midlands chapter is moving to Nottingham for 2016.

Gavin had been running the group since its birth, but has now handed the reins over to myself and Tobiasz. As we’re both based in Nottingham, we have decided to move the SQL User group to Nottingham. With the response we had for Nottingham’s first SQL Relay event we wanted to build on that, and this also makes the group more accessible, with Nottingham’s connections to the rest of the East Midlands and Lincolnshire area.

We’re going to be based at Capital One’s Trent House on Station Street, easy to get to from Derby, Leicester, Loughborough, Lincoln, Sheffield and all parts of Nottingham.

We have 4 SQL Server User group dates lined up for next year:

As with all PASS SQL User groups there is no charge for attending, yep, that’s free training! We’ll even provide some food and drinks while you learn!

Myself and Tobiasz are speaking at the first event, and we’re currently lining up some great home and international speakers for the other sessions.

But we’d love to hear from you if there are any particular topics you’d like covered; I’m happy to go out and find speakers covering areas of special interest to our members. Also let us know at what level you’d like sessions presented: if you want an introduction to a particular topic that’s great, as is wanting a level 500 brain warper from the likes of Bob Ward!

All details for future meetings will be posted via our Meetup page – PASS East Midlands User Group.

Please register with the meetup group to keep up to date. You’ll also need to register for the event to attend. This is because we need to provide Capital One with a list of names to be signed in at security, so if you haven’t registered then you won’t be allowed in I’m afraid.

Look forward to seeing some of you there. Please post any comments here or at PASS East Midlands User Group.

SQL Relay bringing free SQL Server Training to Nottingham

Having spoken at the last couple of SQL Relays, I’m very happy to announce that in 2015 we’re visiting Nottingham for the first time.

On the 7th October SQL Relay will be at the Nottingham Albert Hall conference centre. Speaker submissions are now coming in (and if you fancy speaking, please submit a session), and we hope to release the full lineup early in August. Based on previous years, there will be a wide range of high quality presenters.

At Nottingham we will be having 4 tracks on the day:

  • SQL Server
  • Business Intelligence
  • Business Analytics
  • Workshops provided by our sponsors

Across these we’ll be covering topics such as SQL Server performance, management and development, Azure/Cloud technology, R, Big Data, Cubes, Reporting, Dashboarding and much more.

If you’re a SQL Server DBA, this could be a great opportunity to pick up some Business Intelligence crossover training. Or, if you’re a Business Intelligence developer, find out how to use Big Data solutions (on Azure, without the hardware costs) to analyse your data even further.

All of this great SQL Server content is available for the princely sum of nothing. That’s right, it’s free, and we’ll even provide you with lunch.

We’re open for registrations, so sign up today and look forward to some quality free SQL Server training in Nottingham.

If you’ve any questions, then please ask them.

Finding the bottom folders of a file tree using PowerShell

There’s a simple way of getting Powershell to tell you the last folder in a known path:

$path = "\\server\share\folder1\folder2\folder3"
$folder = Split-Path $path -Leaf

and that’s all you need.

Now, how do you go about doing that if you don’t know the path ahead of time? That is, you have a large collection of directories, and all you want are the folders that make up the bottom levels of the branches. In this graphic it would be all the folders underlined.

[Image: example folder tree, with the bottom folders underlined]

As an example, current $EMPLOYER holds backups from all SQL databases on a share, with each database’s backups being in a folder. Due to various DBAs over the years the depth of the tree to the actual backups isn’t fixed. So I need to find the “bottom” folder so I can process the backup files in there.

Well, that’s a little trickier. So let’s have a look at a couple of ways of doing that:
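One straightforward way (a sketch only; the share path is a placeholder, and the -Directory switch needs PowerShell 3.0 or later) is to treat any folder with no subfolders as a bottom folder:

# A folder is a "bottom" folder if it contains no subfolders
Get-ChildItem -Path "\\server\share" -Recurse -Directory |
    Where-Object { -not (Get-ChildItem -Path $_.FullName -Directory) } |
    Select-Object -ExpandProperty FullName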

Bulk adding Active Directory service accounts with PowerShell

I’ve recently had to create a lot of new service accounts to support some new infrastructure projects. As these are supporting a lot of web services, there’s been a large number of service accounts that needed creating to own all the app pools.

To make life simpler when this moves from implementation to support, all of these accounts want to be set up the same way. And we want secure passwords as well.

And then those accounts also need to be set up for each of our prod, stage, dev and test environments.

So: a dull, repetitive task where details have to be replicated exactly. This sounds like the perfect job to automate with some PowerShell.

# Load System.Web so we can use its built-in password generator
[Reflection.Assembly]::LoadFile('C:\Windows\Microsoft.NET\Framework\v4.0.30319\system.web.dll') | Out-Null

# New-ADUser and Add-ADGroupMember come from the ActiveDirectory module
Import-Module ActiveDirectory

function New-Password ($length = 12)
{
        $p = [System.Web.Security.Membership]::GeneratePassword($length,2)
        $password = $p | ConvertTo-SecureString -AsPlainText -Force
        $retpass = @{"string"=$p; "secstring"=$password}
        return $retpass
}

$environments = @()

$environments += @{"env"="prod";
                   "prefix"="pwa-";
                   "adpath"= "ou=Prod Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso,dc=com";
                   "identgroup"="name1";
                   "ro-group"="group1"}

$environments += @{"env"="test";
                    "prefix"="twa-";
                    "adpath"= "ou=Test Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso,dc=com";
                    "identgroup"="name2";
                    "ro-group"="group2"}

$environments += @{"env"="dev";
                    "prefix"="dwa-";
                    "adpath"= "ou=Dev Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso,dc=com";
                    "identgroup"="name3";
                    "ro-group"="group3"}


$Users = Import-Csv -Path "c:\temp\users.csv"

$output = @()

foreach ($environment in $environments){
    foreach ($user in $Users){
        $password = New-Password 15
        $username = $environment.prefix + $user.user.Trim()
        New-ADUser -Name $username -Path $environment.adpath -AccountPassword $password.secstring -DisplayName $username -PasswordNeverExpires $true -ChangePasswordAtLogon $false -CannotChangePassword $true -Enabled $true
        Add-ADGroupMember $environment.identgroup -Members $username
        Add-ADGroupMember $environment.'ro-group' -Members $username
        $output += @{"username"=$username; "password"=$password.string; "Environment"=$environment.env}
    }
}

First off, we’ll want to be creating secure, complex passwords. Rather than rewrite everything from scratch, we’ll use the built-in GeneratePassword method from the .NET System.Web.Security.Membership class, so we load the System.Web assembly at the top of the script.

Then we’ll wrap this call in a PowerShell function to make calling things easier:

function New-Password ($length = 12)
{
        $p = [System.Web.Security.Membership]::GeneratePassword($length,2)
        $password = $p | ConvertTo-SecureString -AsPlainText -Force
        $retpass = @{"string"=$p; "secstring"=$password}
        return $retpass
}

We pass in a length parameter for later use, but set a default to guarantee at least 12 characters. Then we create a new password of the required length; the 2nd parameter to GeneratePassword specifies the minimum number of special characters (@!# etc.) that the password has to contain. GeneratePassword returns the password as a string, but we need it to be a SecureString to pass to an Active Directory cmdlet later on, so we convert it. Then we return the password both as a string and a secure string via a hashtable.
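For example, a quick call looks like this (the string and secstring names are the keys of the hashtable we just built):

$p = New-Password 16
$p.string      # the plain-text password, ready for your password vault
$p.secstring   # the SecureString version, for New-ADUser's -AccountPassword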

Now we set up the environments we’ll be creating our accounts for. I like to do this using arrays and hashtables. We have an array $environments which holds the details of all of the environments we want to create accounts in. For my needs, each environment is defined as follows:

$environments += @{"env"="dev";
                    "prefix"="dwa-";
                    "adpath"= "ou=Dev Accounts, ou=Service Accounts, ou=Domain Controllers, dc=ad, dc=contoso,dc=com,";
                    "identgroup"="name3";
                    "ro-group"="group3"}

The values in the hash table are:

  • The environment Name
  • The prefix for the usernames for this environment
  • The path in Active Directory for the OU the user account should be created in
  • A group ("identgroup") for all the accounts in this environment to belong to
  • A second group ("ro-group") for all the accounts in this environment to belong to

This is based on the AD setup I’m currently working with; your needs may be different. If you’re not the Active Directory admin, then please go and talk to them about how they’d like things configured.

Now that we’ve got everything setup, let’s create some accounts!

I load the usernames from a csv file that looks like this:

user
Account1
web-user
web-user2
app-user
app-user2

The first line stays as user, so that we can use it to get the values ($user.user) later on.

Create an array $output to hold our output.

Now we loop through each of our environments, and then for each environment we loop through our list of users.

For each user:

  • Create a new secure password using our previously defined function
  • Build a full username using the supplied username and the appropriate prefix for the environment
  • Create the user in Active Directory based on our environment
  • Add them to any groups we want them in
  • Add the newly created users details to our output object

And we’re done. In the great tradition of PowerShell I’ve kept the user details in an object $output so they can easily be passed along to another script, command or whatever you want.

If you want to just dump them to screen, then you can use:

$output | Format-Table 
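Since $output is an array of hashtables, Format-Table will show them as key/value listings. If you’d rather see one row per account, a quick sketch (PowerShell 3.0+) is to cast each hashtable to an object first:

# Cast each hashtable to a PSCustomObject so Format-Table gives one row per account
$output | ForEach-Object { [PSCustomObject]$_ } |
    Format-Table username, Environment, password -AutoSize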

Or redirect the object to your secure password storage utility of choice.

And another dull repetitive task is handed off to a simple script.

Upcoming Presentations

Well, it’s been a while since I last did some presenting, mainly due to buying a new home and getting stuck into renovating it.

After a successful session up at the SQL Server User Group North East last week, talking about using PowerShell to migrate SQL Server instances, I’ve another three sessions lined up around the country: two as part of the SQLRelay set of events, and one at the Leicester SQL User Group.

All three sessions will be my presentation “Transaction Log Basics”; this abstract should give you a good idea of what to expect:

Every SQL DBA has at least 5 transaction logs that they need to take care of. But what exactly is going on inside that file? Why can it be a performance issue? Is it really that important? And why does it keep filling my disks when I’m not looking?

This session will answer all of those questions. We’ll be looking at how SQL Server relies on its transaction log, best practices that will keep it performing properly, and explaining why it’s so vital to your databases and your job to keep it safe!

It’s a perfect introduction to SQL Server transaction logs and also a good way to make sure you won’t fall into problems later on.

The dates are:

SQL Relay Southampton – 14th October
Leicester SQL Server User Group – 15th October
SQL Relay Birmingham – 29th October (unfortunately this event is fully subscribed, but there is a waiting list you can sign up for)

Looking forward to seeing plenty of people at these events. If you’re reading this and planning on attending, then let me know, or come and say hello on the day.

Deleting Large amounts of data in SQL Server – Part 1

There comes a time when you’ll be asked to remove a large amount of data from one of your SQL Server databases. Perhaps an archive of order information needs pruning, or those session records that aren’t needed anymore need removing.

You’ve been off to have a look at it, and it all looks simple enough. Looks like there’s a couple of million rows to delete, but you can identify them all easily. How easy is this going to be, you think:

DELETE FROM Table1 WHERE DateAdded < GETDATE() - 90;

And then it just sits there, and sits there, and sits there. Or you get a nice error message about breaking a key constraint. What to do about it?
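The usual way out is to delete in small batches, so each transaction stays short and the transaction log gets a chance to clear down between batches. A rough sketch, run from PowerShell via Invoke-Sqlcmd; the server, database and batch size are just placeholders:

# Keep deleting 10,000 rows at a time until nothing is left to delete
do {
    $result = Invoke-Sqlcmd -ServerInstance "sql-01" -Database "MyDb" -Query @"
DELETE TOP (10000) FROM Table1 WHERE DateAdded < GETDATE() - 90;
SELECT @@ROWCOUNT AS Deleted;
"@
} while ($result.Deleted -gt 0)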
