Ever wonder how some people always seem to know the answer in SQL Server? Or seem to come up with it with a minimum of digging around?
Well, they’re not necessarily cleverer than you; it could just be that they’re better at playing with SQL Server than you are. Knowing how to quickly and simply build up a test scenario means that whenever you want to try something out or prove a hunch, you can be most of the way to an answer while others are still Googling and Binging away. And if you’ve found something on the Internet, you can check it actually does what it claims to. Because, as Abraham Lincoln once said:
“Don’t believe everything you read on the Internet”
So first you need a playpit. This is somewhere you can do whatever you want and really not care about the consequences. I do mean whatever you want: finding out what happens if you use dd to write bad blocks directly into your master mdf should be fine here. So not your usual dev or test instances, because despite what people say, they do expect those to be actually usable most of the time. This is a perfect case for using SQL Server Evaluation Edition. If you’ve not trashed it enough to need a reinstall within 90 days, you’re not playing rough enough with it anyway. And you get to play with setting up SQL Server from the command line as well (it’s a one-liner, and you can reinstall in less time than it takes to find out what you did wrong).
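In case you’ve not tried it, that one-liner looks something like this (a sketch only: exact parameters vary by version, and the sysadmin account here is an assumption you’ll want to change to suit):

setup.exe /Q /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=MSSQLSERVER /SQLSYSADMINACCOUNTS="BUILTIN\Administrators" /IACCEPTSQLSERVERLICENSETERMS

Run it from the installation media folder, and a few minutes later you’ve got a fresh instance to trash all over again.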
Now you’ve got a SQL Server install you don’t care about, how do you build yourself a test system? Well, most things in SQL Server can be tested with something as simple as:
create database PlayTime
go
use PlayTime
go
create table Tab1 (
    c1 integer,
    c2 varchar(20)
);
go
And there we go, a database and a table ready to play with. This is about as much as I’ve needed for most of my playing with things like PowerShell backups, Extended Events or anything else I’ve wanted to look into. Just think: if you want to see raw data structures, pause between the two create statements, stop SQL Server, copy the mdf for your new database (hey, is that an opportunity to find out where SQL Server throws database files if you didn’t specify where?), start SQL Server up, create your new table, stop SQL Server again, and use a tool like VBinDiff to see the differences between the two files.
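And if you do want to know where those files went, the sys.master_files catalog view will tell you (assuming the PlayTime database from above):

select name, physical_name
from sys.master_files
where database_id = db_id('PlayTime');
go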
Now the next thing people want when playing is data. Depending on how picky you want to be, generating enormous amounts of data is pretty straightforward. Want a million-row table to play with partitioning? Nice and easy:
insert into Tab1 (c1, c2)
select top 1000000
    row_number() over (order by (Select 1)),
    'text'
from master.sys.all_columns a
cross join master.sys.all_columns b;
go
And you thought you’d never find a use for CROSS JOIN? We’re also making use of the master.sys.all_columns catalog view as a basis, which on a plain SQL Server install will give you about 6,000 rows, so if we didn’t limit the cross join we’d get about 36 million, which should be enough. But if you want to get into the billions range, then just cross join it again.
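Something like this, though you’ll want plenty of disk space and patience before you let it loose (a sketch; note that a billion still fits in c1’s integer range, which tops out at a little over 2.1 billion):

insert into Tab1 (c1, c2)
select top 1000000000
    row_number() over (order by (Select 1)),
    'text'
from master.sys.all_columns a
cross join master.sys.all_columns b
cross join master.sys.all_columns c;
go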
Ah, but you want that as dates, do you? Well, here we go:
create table Dates (
    c1 datetime,
    c2 varchar(20)
);
go
insert into Dates
select top 1000000
    DATEADD(mi, row_number() over (order by (Select 1)), getdate()),
    'text'
from master.sys.all_columns a
cross join master.sys.all_columns b;
go
Which should give you just under two years’ worth of minutes, plenty to check a partitioning function with.
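For instance, a minimal partition function and scheme to slice those dates up by year might look like this (the boundary values here are assumptions; pick ones that straddle the dates you just generated):

create partition function pf_DatesByYear (datetime)
as range right for values ('20240101', '20250101');
go
create partition scheme ps_DatesByYear
as partition pf_DatesByYear
all to ([PRIMARY]);
go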
Or perhaps you just want to bulk up a table or three for raw database size playing:
create table BigTable (
    c1 integer identity(1,1),
    c2 varchar(max)
);
go
insert into BigTable (c2) values (replicate('junk', 2000));
go 100000
Here we’re making use of REPLICATE to repeat ‘junk’ 2,000 times, giving us a nice 8,000 characters per row (the most REPLICATE will return unless you cast its input to VARCHAR(MAX)). We’re also using the ability to pass an integer value to GO, which causes osql, sqlcmd or SQL Server Management Studio to execute the previous batch the specified number of times; in this case 100,000 times should give us plenty of data.
You might have noticed that all the examples use T-SQL. Dropping down and typing the commands speeds things up, and makes it much easier to redo later. And practicing your T-SQL, and knowing just how much you can get away with to build up a quick test scenario, will make you quicker. Want to see the difference adding extra columns to an index makes?
create table IndexTest (
    c1 integer not null,
    c2 char(9) not null default 'test text',
    c3 char(9) not null default 'more text',
    c4 char(9) not null default 'some more'
);
go
insert into IndexTest (c1)
select top 1000000 row_number() over (order by (Select 1))
from master.sys.all_columns a
cross join master.sys.all_columns b;
go
create index IX_IndexTest on IndexTest (c1);
go
select index_type_desc, page_count
from sys.dm_db_index_physical_stats(db_id(), object_id('IndexTest'), NULL, NULL, 'sampled');
exec sp_spaceused 'IndexTest';
drop index IX_IndexTest on IndexTest;
create index IX_IndexTest on IndexTest (c1) include (c2);
go
select index_type_desc, page_count
from sys.dm_db_index_physical_stats(db_id(), object_id('IndexTest'), NULL, NULL, 'sampled');
exec sp_spaceused 'IndexTest';
drop index IX_IndexTest on IndexTest;
create index IX_IndexTest on IndexTest (c1) include (c2, c3);
go
select index_type_desc, page_count
from sys.dm_db_index_physical_stats(db_id(), object_id('IndexTest'), NULL, NULL, 'sampled');
exec sp_spaceused 'IndexTest';
drop index IX_IndexTest on IndexTest;
create index IX_IndexTest on IndexTest (c1) include (c2, c3, c4);
go
select index_type_desc, page_count
from sys.dm_db_index_physical_stats(db_id(), object_id('IndexTest'), NULL, NULL, 'sampled');
exec sp_spaceused 'IndexTest';
drop index IX_IndexTest on IndexTest;
Which is perfect to leave running on your test system while you go and get a coffee. Once you start getting used to running tests like this, then running complex scenarios or trying lots of combinations becomes so much easier than constantly wrestling with the SSMS GUI!
So go on, get out there and start playing. You’ve only your SQL Server skills and knowledge to improve!