Don’t Miss Your Vote!

On July 2, 1776, in Philadelphia, Pennsylvania, the Continental Congress cast a historic unanimous vote to adopt a resolution declaring our nation’s independence from British rule.  On July 4, 1776, the final wording of the Declaration of Independence was approved and sent to the printer.

With this historic event in mind, if you are attending the 2014 SQL PASS Summit later this year in Seattle, Washington, you could be missing out on one of your civic duties, one made possible by the actions of the Continental Congress in 1776: the right to vote.

All across the United States, a general election is scheduled for Tuesday, November 4, 2014.  Given that the PASS Summit 2014 starts on that day, if you are like me and do not live in Seattle, you could miss out on getting your voice heard in this year’s election.  Even if you do live in Seattle, trying to vote while partaking in all of the Summit’s glory might prove difficult.

You do have an option, though: an absentee ballot.  You should be able to contact the Secretary of State’s office for the state in which you reside to find out what the process is.


There are usually deadlines for requesting & submitting an absentee ballot, so please make sure that you confirm the dates and mark them on your calendar.  I usually set two reminders in my to-do list: 1) when I need to obtain the ballot by, and 2) when I need to submit it by.

While I am a huge proponent of the PASS Summit and the SQL community in general, please do not miss the opportunity to have your voice heard because of it.  Using an absentee ballot will give you peace of mind that while attending the first day of the 2014 Summit, your voice will still be heard.

Vote.  Make a difference.


PASS Summit 2014 – All Day Pre-Con

It all started at SQL Saturday #271 in beautiful Albuquerque, New Mexico.  My good friend Chris Shaw and I were sitting outside the event catching up on things as we usually do.  Our discussion turned to presenting, and we compared notes from our most recent speaking engagements.  After a few minutes of chatting, we both realized that relating the topic of choice to real-world examples was turning out to be a great way to further ingrain the technology in the attendees.  What better way to learn about things than to talk through real-world scenarios?

As our ramblings continued, we stumbled on the notion that having two data professionals, both with a wide range of skills and backgrounds, present in a “rumble” type of fashion could potentially be a fun and entertaining session.


As simple as that, a pre-con abstract was born!

So come join us for our full-day pre-con at the PASS Summit 2014 in beautiful Seattle and see who the victor is!  There might even be a trophy! I can assure you that it’s going to be a day filled with technology and real-world solutions, brought to you by two guys who have a lot of passion for Microsoft SQL Server and the PASS community!

In case you want to see the official abstract, we have included it below. We hope to see you there!! To register, check out this link.

Real World End-to-End Performance Solutions

Do you ever wonder how other people approach real-world business objectives from start to finish? The success of a project can be hampered by decisions made before a single row resides in a database, and yet there is often no single best solution: different people will approach a problem from different angles, and there are countless variables along the path that are critical to success. If you make a mistake at the beginning, costly redesigns may haunt you for years to come.

In this powerful yet entertaining full-day pre-conference session, we will start with two different database professionals with two different skill sets. The speakers both have very different ways of approaching a project (and are both set in their ways!), but both have the same measurable goal. Whose process runs faster? Join this session and see the results.

In other words . . . “Let’s get ready to Rumble!”

What may appear as the end of the session is really only the beginning as variables and limitations are added and various questions are answered: What happens to our project when we are forced to use GUIDs (Globally Unique Identifiers), or when we push our solution to the cloud? Does the introduction of new hardware improve the overall outcome of our test? Do we see faster performance with SQL Server 2014 over SQL Server 2012? Join us and find out!

The Combatants

Chris Shaw:  In the red corner, representing Colorado Springs, standing tall at 5 foot 6 with nothing but attitude. Armed with a staggering 17 years of experience for a right hand and striking fast wit for the knockout on the left. Your 5-year SQL Server MVP, PASS Regional Mentor, and Colorado Springs User Group leader… Chris Shaw.

You can find Chris on Twitter at @SQLShaw

John Morehouse: In the blue corner, backed by Cornhuskers and firefighters, a true hero dedicated to destroying poor database performance and sharing database knowledge. Lightning-fast reflexes and in-depth knowledge are unleashed to deliver lethal blows. The towering, the intimidating, and the inspirational to so many… John Morehouse.

You can find John on Twitter at @SqlRUs

So go register, sign up, and come see Chris and me battle it out to the bitter end.  Fun & learning will be had by all!


SQL Saturday #307 – Iowa City, IA

Although I will admit that I’m late in getting this blog post out, it’s that time again.

I am honored to have been selected to speak at SQL Saturday #307 in Iowa City, Iowa this year.  So later today I’ll be making the four-hour drive over to see the awesome folks in Iowa.  It looks to be a great day for a scenic drive, and I’m looking forward to some quiet time on the road.

So if you have nothing better to do on Saturday, June 7th, 2014, come on out and get your learn on.  The schedule for this event is quite impressive and there will be a TON of things to learn.

I triple dog dare you to come out (name that movie), introduce yourself to me, and get your learn on!

10 Things Every DBA Should Know!

This is a session I’ve given before and very much enjoy giving.  There are a ton of things that DBAs need to know, but in this session I go over some of the things that I’ve seen in the trenches.  We’ll talk about things such as SELECT *, data types, Auto Shrink, and a couple of other things; I don’t want to ruin the whole thing for you.  So if you are new to the DBA realm or just want to come watch me present, please stop by.  Bring your energy and your questions and we’ll get it done!

I’ll be in Room 2520B at 1:00 PM.  It’s the session right after lunch, so I reserve the right to throw things at you if you fall asleep from lunch! I’m a former lineman, so I can’t throw worth a darn; you might be safe.  Your neighbor sitting next to you, on the other hand, might be in trouble.

Free Training

Remember that this event is a FREE day of awesome SQL Server training and professional development.  Check out the schedule and register.  In today’s economy, how can you afford to pass up FREE training?!?!

If you make it out to my session, make sure to come up and introduce yourself, as I’d love to meet you! I know I can look like a mean DBA, but I promise that I’m not.  =)

See you in Iowa City!


Query Optimizer Wins Again

In a previous post, I mentioned that I had to use dynamic SQL to switch the context of a query in order to accomplish what I wanted.  I also mentioned that I would be putting out another blog post about that dynamic SQL, and here we are!

As a recap, in that post I was attempting to dynamically calculate the server memory requirements based on the physical memory installed on the machine.  I’ll also mention again that Microsoft changed the structure of the relevant DMV between SQL Server 2008 R2 and SQL Server 2012.  In order to run the query on all servers, I had to find a way to dynamically handle the appropriate structure for the various versions of SQL Server we have.

This is how I ended up doing it:

-- Setup the dynamic SQL
-- We need the physical memory values in GB since that's the scale we are working with.
IF @serverVersion > 10
	SET @sql = '(SELECT @memOut = (physical_memory_KB/1024/1024) FROM sys.dm_os_sys_info)'
ELSE
	SET @sql = '(SELECT @memOut = (physical_memory_in_bytes/1024/1024/1024) FROM sys.dm_os_sys_info)'

-- Get the amount of physical memory on the box
EXEC sp_executesql @sql, @paramdefs, @memOut = @mem OUTPUT

This, on the other hand, is the way I first tried (several times!) to handle it:

-- We need the physical memory values in GB since that's the scale we are working with.
IF @serverVersion > 10
	SELECT @memOut = (physical_memory_KB/1024/1024) FROM sys.dm_os_sys_info
ELSE
	SELECT @memOut = (physical_memory_in_bytes/1024/1024/1024) FROM sys.dm_os_sys_info

Of course, I know now that this won’t work at all.  I’ll be honest and say that it took me more than a few minutes to figure out why my logic wasn’t right and why SQL Server kept yelling at me.

The Truth Shall Set You Free


Doh!  Of course, it was the query optimizer! The first thing the optimizer does is validate the code: parsing the syntax and binding the objects and columns it references.

Knowing that this happens first, look at the ‘IF’ statement in the wrong (bottom) example: depending on which version of SQL Server is examining the code, the column in one branch is valid, while the column in the other branch doesn’t exist in that version of the DMV at all.

When it couldn’t find the column, the optimizer had to bow out, simply because it couldn’t verify that the column existed.  This forced me to move the query into a dynamic statement, because dynamic SQL isn’t parsed until it is actually executed through sp_executesql.

When using dynamic SQL, the optimizer treats the building of the statement as regular text and happily continues along its merry way.

Once the dynamic SQL was in place, the query optimizer was able to validate the query, and everything fell into place.
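
If you want to see the behavior in isolation, here’s a minimal sketch; bad_column_name is a made-up column that doesn’t exist in sys.dm_os_sys_info on any version.  Even though the branch referencing it can never execute, the whole batch fails at compile time with an “Invalid column name” error, because columns referenced against an existing object are checked before anything runs.

-- Minimal repro sketch: bad_column_name is hypothetical
IF 1 = 0
	SELECT bad_column_name FROM sys.dm_os_sys_info	-- this branch can never run...
ELSE
	SELECT cpu_count FROM sys.dm_os_sys_info	-- ...yet the batch still fails to compile
GO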

Some References

SQL Guru has a great post on how the optimizer works.  I highly recommend reading it if you’re not familiar with the optimizer.  Heck, even if you are familiar with it, read it anyway as a refresher.

Another SQL Server Guru, Paul White, also has a deep dive into the query optimizer and you can find the first post in his series here.

Another Lesson Learned

This little experience, while probably trivial, was also a great reminder that we don’t always know or remember everything about SQL Server.  It is such a massive platform that you will at times forget even the simplest of things, and that’s OK when it happens.  We are human, after all.

Summary

The next time you are trying to get some code to compile and it won’t, take a moment to see if the query optimizer can do its job.  If it can’t parse and bind your code, the query won’t even get off the ground.

Enjoy!


SQL Saturday #297 – Colorado Springs, CO

Yes, folks, it’s that time of year again.  I’ll be heading to Colorado Springs, CO this upcoming weekend to join the ranks of speakers as one of the chosen ones.  As usual, I’m honored to be speaking at their event.  They always have an outstanding list of speakers, and to be among them is truly an honor.

I very much enjoy traveling to Colorado.  The scenery is breathtaking and the people there are incredible.  If you have never had the opportunity to visit, I would highly recommend it.

This year the Colorado Springs group is doing something a little different.  They are hosting a food drive to help the local community, and I for one will be stopping at a local grocery store to load up on canned food before the event.  I think this is a fantastic way to give back to the community.

Microsoft MVP Chris Shaw did a blog post about the food drive here.

SQL Server Databaseology 201 – The Study of Database Internals

This is probably one of my favorite presentations, as I get to talk about the internal structures of a Microsoft SQL Server database.  I’m not talking about tables, views, or procedures, but rather things such as how each row is structured, how SQL Server keeps track of your data, and various ways to go diving into the depths of the internals.  This presentation is packed with demos and fantastic #Sharknado references!  We’ll also discuss how your data types matter and that, despite what you’ve been told, disk isn’t always cheap.

If you haven’t checked out the schedule yet, you should.  There are some fantastic speakers going to this event.  Speakers like Mike Fal (B|T), Chris Shaw (B|T),  Jason Kassay (T), Jason “Cannonball” Horner (B|T), Steve Jones (B|T), Meagan Longoria (B|T), Thomas LaRock (B|T), Keith Tate (B|T), Glenn Berry (B|T), Jason Strate (B|T) and many others.  I assure you that this is a schedule you don’t want to miss!

Free Training

SQL Saturdays are a great venue for FREE training, and I highly encourage you to attend if you can.  It’s one Saturday out of the year for this particular event, and I promise that it’ll be worth your time.  If you haven’t yet experienced the awesomeness that is SQL Saturday, you are missing out, and now’s the time to change that!

If you are in my session or even just wandering about, please feel free to come up and introduce yourself!  I enjoy meeting new people and I promise I won’t bite!

Still here?  What are you waiting for?  Go.  Now.  Register.   Take the leap and move forward in your learning & career! DO IT!!


SQL Saturday #287 – Madison, WI

This upcoming weekend will mark a first for me: I get to speak at SQL Saturday #287 in Madison, WI.  This will be my first time speaking at this particular event, and I’m very much looking forward to it.

I’m also bringing my family along for the ride, as my wife has relatives in the area, so we are making a vacation out of the trip.  Nothing like a mini-vacation with some #sqlawesomesauce mixed in!  Whoo Hoo!!

So if you have nothing better to do on Saturday, March 29, 2014, come on out and get your learn on.  The schedule for this event is quite impressive, and there will be a TON of things to learn.  Come join us and learn something.  As my wise father always told me, you should never stop learning.

Oh yeah, I’m speaking!

Social Networking Is Alive!

Social media is everywhere these days, and if you’re not on the train, you could be missing out.  This session is about how you can use social media to improve not only your professional life but your personal one as well.  This session isn’t geared specifically to geeks (it’s totally non-technical) but to anybody who wishes to improve their online presence through the use of social networking.  It’s a fun session and I hope the attendees enjoy it!!

HINT: I’ll show you how to get free SQL Server help from one of the popular social networks.  How would you like to tell your management that you solved a real-world, enterprise-level issue with the help of social media?  For free?

Better yet, if someone in the audience has a SQL question that they want answered, we’ll see if we can get the #sqlfamily to help out.  I’m guessing that we will!

You might even see reference to a popular YouTube video.

Free Training

Can you really pass up free training?  Ok, even with lunch it’s only $10!  So if you’re still on the fence about attending, get off the fence, register, come on out, and learn something.  Meet some new people.  Grab some swag and a fantastic lunch.  I promise you that the speakers won’t bite.  Register now!!

Get your learn on!  I triple dog dare you!

Why are you still here?  Go.  Register.  Do it now! See you in Madison!


Configuration Validation – Max Memory

We are in the midst of rolling out SQL Server 2012 to our environment, and like most shops, we customize our SQL Server instances for our environment.  Since we don’t just let things go straight out of the box, we have to verify that things are done in a manner that matches our deployment process.  We don’t deploy new servers often, so the need to fully automate this process isn’t here yet.

What we want is an easy way to validate an install and ensure that no settings were missed.  So I started to put together a script that I could easily run on a server to validate that certain settings were in place.  This script checks for things like certain trace flags, what max memory is set to, how many data files TempDB has, and so on.  For this post, we’re going to focus on the max memory setting.

Max Memory

Setting the max memory (and even the minimum memory) for a SQL Server instance is a very important thing to address when installing SQL Server.  SQL Server is a memory hog, and it will gladly take over all of the memory if you let it, thus starving the operating system or any other applications you might have installed.

One of the tools that has served us very well when configuring is a blog post by Jonathan Kehayias (B|T), “How Much Memory Does My SQL Server Actually Need“.  It’s a great post, and it fits our environment pretty well: we have dedicated SQL Servers, and they aren’t virtualized, so it’s a great place to start.

Essentially, Jonathan recommends this as a starting point:

  • Reserve 1GB of RAM for the Operating System
  • Reserve 1GB of RAM for each 4GB of RAM installed from 4-16GB
  • Reserve 1GB for every 8GB of RAM installed above 16GB of RAM

For example, for a server with 192GB of total physical memory installed:

  • 1GB = initial OS reserve
  • 4GB = reserve for the 4-16GB range, if you have more than 16GB of physical memory; otherwise (Total / 4)
  • 22GB = (total memory minus 16GB) divided by 8, for anything greater than 16GB of physical memory.  If you don’t have 16+ GB of memory, this is zero.  Here: (192 – 16) / 8 = 22
  • 27GB = 1GB + 4GB + 22GB, the total amount to reserve for the OS
  • 165GB = 192GB – 27GB.  This is the total amount to give to SQL Server
    • NOTE: To set this value, you would take 165 * 1024 = 168,960 (in MB)

Setting Max Memory

Armed with this information, a simple spreadsheet was born to handle the math.  This worked great! Of course, only until we wanted to script things out.  Once we reached that milestone, it was obvious that the real need was to handle this dynamically in a script.  So I started looking into how to accomplish that.

The Script

To start with, I had to find out where SQL Server keeps the total amount of physical memory installed on the server.  I’ll be honest and say that I had to look for it.  I thought I might have to read the registry, but thankfully I found a DMV, sys.dm_os_sys_info, that gives you that information.  You can read more about this particular DMV here.

If you look at the information from TechNet, you’ll see that ‘physical_memory_KB‘ is the column we’re interested in, as it represents the amount of physical memory available.  However, one thing of note is that the column name changed between SQL 2008 R2 and SQL 2012.  In the older versions (2005 – 2008 R2) the column name is ‘physical_memory_in_bytes‘.

Along with the name, the unit the number represents is also different.  In 2012, ‘physical_memory_KB’ is in kilobytes; in 2008 R2 (and lower), ‘physical_memory_in_bytes’ is in bytes.

I had to account for not only the name difference but also the different units (KB versus bytes) in the script, since we have a mixture of instances running and I thought it would be handy to be able to run this across the enterprise.

Once I knew where the information was kept, it was relatively easy to do the math I needed and spit back the results.  Since we have a mixture of SQL Server versions, I used ‘SERVERPROPERTY‘ to determine which version of SQL Server the script was running against, allowing me to dynamically adjust for the differences in the DMV mentioned above.

You’ll also probably notice that I am using the sp_executesql command to dynamically handle the DMV differences between SQL Server versions.  That will be another blog post.

I tested the script on SQL 2005, 2008, and 2012.  Anything older than that and you’re on your own, since SQL 2000 didn’t have these DMVs at our disposal.

DECLARE @mem INT -- INT avoids the implicit string conversions in the math below
DECLARE @memOut INT
DECLARE @totalOSReserve INT
DECLARE @4_16 INT
DECLARE @8Above16 INT
DECLARE @serverVersion INT
DECLARE @sql NVARCHAR(MAX)
DECLARE @paramdefs NVARCHAR(500)
-- Put this into a parameter so it's configurable
DECLARE @osMem INT

-- Done this way to handle older versions of SQL Server
SET @osMem = 1 -- leave 1GB for the OS
SET @8Above16 = 0 -- leave it 0 by default if you don't have more than 16GB of memory
SET @serverVersion = LEFT(CAST(SERVERPROPERTY('productversion') AS VARCHAR(100)),CHARINDEX('.',CAST(SERVERPROPERTY('productversion') AS VARCHAR(100)),1)-1)
SET @paramdefs = N'@memOut INT OUTPUT'

-- Setup the dynamic SQL
-- We need the physical memory values in GB since that's the scale we are working with.
IF @serverVersion > 10
	SET @sql = '(SELECT @memOut = (physical_memory_KB/1024/1024) FROM sys.dm_os_sys_info)'
ELSE
	SET @sql = '(SELECT @memOut = (physical_memory_in_bytes/1024/1024/1024) FROM sys.dm_os_sys_info)'

-- Get the amount of physical memory on the box
EXEC sp_executesql @sql, @paramdefs, @memOut = @mem OUTPUT

-- Start the Math
IF @mem > 16
	BEGIN
		SET @4_16 = 4
		SET @8Above16 = (@mem-16)/8
	END
 ELSE
	BEGIN
		SET @4_16 = @mem/4
	END

-- Total amount of memory reserved for the OS
SET @totalOSReserve = @osMem + @4_16 + @8Above16
SET @mem = (@mem-@totalOSReserve)*1024

-- Use sys.configurations to find the current value
SELECT (@mem/1024)+@totalOSReserve AS 'Total Physical Memory'
	, @totalOSReserve AS 'Total OS Reserve'
	, @mem AS 'Expected SQL Server Memory'
	, value_in_use AS 'Current Configured Value'
FROM sys.configurations
WHERE name = 'max server memory (MB)'
GO

Running this query against our Central Management Server was a great way to validate that all of our installs (both new & old) were set correctly.

Any server that isn’t set correctly can be adjusted to the appropriate value on the fly.  Keep in mind that if you are increasing the amount of memory SQL Server can use, you won’t need to restart.  If you are lowering it and SQL Server already has more than the new limit in use, it will try to trim its usage down to the new cap, though you may need to work in a service restart to guarantee everything is released back to the OS and the instance starts under the new upper limit.
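
For reference, here’s a minimal sketch of making the change with sp_configure, using the 168,960 MB value from the 192GB example above (‘max server memory (MB)’ is an advanced option, so it has to be exposed first):

-- Sketch: apply the computed value (example: 165GB * 1024 = 168,960 MB)
EXEC sys.sp_configure N'show advanced options', 1
RECONFIGURE
EXEC sys.sp_configure N'max server memory (MB)', 168960
RECONFIGURE
GO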

Of course, if you need to do this in Production, I highly recommend that you do all of the appropriate testing required by your business and/or policies.

Resources & Permissions

Click on the individual links to see the Microsoft reference material.  As for permissions, note that querying a server-scoped DMV such as sys.dm_os_sys_info requires VIEW SERVER STATE on the server.

Summary

There are a lot of things that we check for in our validation script, but this portion was an interesting piece to build, and I think it will turn out to be pretty handy!

Enjoy!


Indexes & Snapshots

Several weeks ago at SQL Saturday #271 in Albuquerque, NM, someone asked me whether, if you took a snapshot of a database, any index fragmentation would go right along with it. I was fairly confident that the answer was “Yes, it will, since a snapshot is at the page level,” but I wasn’t quite sure, and to be honest, I had never looked into it for myself.

Thus a blog post is born! Basically what we’ll do is:

    1. Create a dummy database
    2. Insert a bunch of data
    3. Delete 1/2 of the data
    4. Take a snapshot
    5. Rebuild one of the indexes
    6. Restore from the snapshot

And if all goes well, we should see the index fragmentation from the deleted data come back when the database is restored from the snapshot.

Here we go!

First let’s create our dummy database and table.

USE tempdb
GO
IF DB_ID('SnapshotIndexDemo') IS NOT NULL
	BEGIN 
		DROP DATABASE SnapshotIndexDemo
	END	
GO
CREATE DATABASE SnapshotIndexDemo
GO
USE SnapshotIndexDemo
GO		
CREATE TABLE [dbo].[Sharknado](
	[CustomerID] [INT] IDENTITY(1,1) PRIMARY KEY,
	[CustomerName] [VARCHAR](8000) NULL
) ON [PRIMARY]
GO
-- Create us a nonclustered index, just because
CREATE NONCLUSTERED INDEX IX_Sharknado on Sharknado (CustomerID) include (CustomerName);
GO

Now I’m going to load some data.  We need to pump in enough data that we can easily see the fragmentation.  This should explode my database to 400+ MB in size. Give or take.

DECLARE @x INT = 1
WHILE @x < 50000
BEGIN
	INSERT Sharknado (CustomerName)
	SELECT REPLICATE('c', 5000)

	SET @x = @x + 1
END
GO

Just to check, you can run this code to ensure that the database has grown. The size column found in sys.master_files represents the number of 8KB pages in each file, so we multiply the sum by 8 (to get KB) and divide by 1,024 (KB per MB) to get the size in megabytes.

SELECT database_id
	, DB_NAME(database_id) AS 'Database Name'
	, (SUM(size)*8)/1024 AS 'Size in MB'
FROM sys.master_files
WHERE database_id = DB_ID('SnapshotIndexDemo')
GROUP BY database_id

You can see here that it pushed the database to a little over 400MB. Perfect!

[screenshot: database size query results]

Now that we have some data to play with, let’s see if we can get things fragmented before we take the snapshot.  First, this code returns the fragmentation levels for all of the indexes (in our case, just two), and it shows that there isn’t any fragmentation at this point.

SELECT index_type_Desc,fragment_count, avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(db_id('SnapshotIndexDemo')
	, object_id('Sharknado'), NULL, NULL, 'LIMITED')
GO

[screenshot: fragmentation levels before the delete]

Now that we’ve shown that things are nice and tidy, let’s create some chaos!!

-- This will delete every even row by using the modulo ("Mod") operator
-- Modulo - http://technet.microsoft.com/en-us/library/ms190279.aspx
USE SnapshotIndexDemo
GO
DELETE dbo.Sharknado
WHERE CustomerID % 2 = 0
GO

And then let’s check to make sure that we have fragmentation.

[screenshot: fragmentation levels after the delete]

Now let’s take a snapshot!! This code will create the snapshot and should work for you, assuming that you have a “C:\Temp” directory.  If you don’t, just adjust the script to a folder location that exists.  We’ll clean everything up at the end of this post.

USE tempdb
GO
CREATE DATABASE SnapShotIndexDemo_ss
ON
(
NAME = SnapShotIndexDemo,
FILENAME = 'c:\Temp\SnapShotIndexDemo.ss'
)
AS SNAPSHOT OF SnapShotIndexDemo
GO

Now that the snapshot is created, let’s rebuild the non-clustered index and check the fragmentation.  Once the rebuild is complete, we’ll restore the database using the snapshot we just took.  If our theory is true, we would expect to see the fragmentation reappear.

USE [SnapshotIndexDemo]
GO
ALTER INDEX [IX_Sharknado] ON [dbo].[Sharknado] REBUILD
GO

Let’s check the fragmentation.

[screenshot: fragmentation levels after the rebuild]

Since we rebuilt the non-clustered index, we would expect its fragmentation to be 0.  Finally, let’s restore the database using the snapshot that we took earlier.

USE tempdb
GO
RESTORE DATABASE SnapShotIndexDemo
FROM DATABASE_SNAPSHOT ='SnapShotIndexDemo_ss'
GO

And check the fragmentation again.

[screenshot: fragmentation levels after the restore]

Voila!  The fragmentation is back again, thus our theory is correct.

So, in conclusion: if you have fragmentation in your database and you take a snapshot, you’ll be taking the fragmentation right along with it.
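
As a side note, since a snapshot is just a read-only database, you should be able to point sys.dm_db_index_physical_stats at the snapshot itself and inspect its fragmentation without restoring anything.  A quick sketch, assuming the snapshot from above still exists:

-- Sketch: inspect fragmentation inside the snapshot directly
SELECT index_type_desc, fragment_count, avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID('SnapShotIndexDemo_ss')
	, OBJECT_ID('SnapShotIndexDemo_ss.dbo.Sharknado'), NULL, NULL, 'LIMITED')
GO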

Of course, we should leave the environment as we found it, so here is some code to clean things up if you want.

USE [tempdb]
GO
-- Clean up the snapshot
IF  EXISTS (SELECT name FROM sys.databases WHERE name = N'SnapShotIndexDemo_ss')
DROP DATABASE [SnapShotIndexDemo_ss]
GO
-- Clean up the database
IF  EXISTS (SELECT name FROM sys.databases WHERE name = N'SnapShotIndexDemo')
DROP DATABASE [SnapShotIndexDemo]
GO

Mistakes

Last week, SQL Server MVP Chris Shaw wrote a blog post about learning from mistakes, and he tagged me as well as Paul Randal (b|t) and Andy Leonard (b|t) to do a similar post.

You can find their responses here:

Paul Randal - http://www.sqlskills.com/blogs/paul/learning-listen/

Andy Leonard - http://sqlblog.com/blogs/andy_leonard/archive/2014/01/04/a-mistake-responding-well-and-a-public-call-for-a-public-apology.aspx

Tunnel Vision

A couple of weeks ago I happened to be on call, and of course the pager went off at 1:50 AM one morning. Yay!  I dragged myself out of the rack and headed to the kitchen, where I could work on things without disturbing my beautiful bride.  I logged into the VPN and started to dig into where the issue might be.  In my travels I determined that the job that paged me was missing a source file that is normally deposited onto the file system by another non-SQL Server process, so it was basically outside my control.  So I had to call someone else, waking them up to see if they could check on things.  At 3 o’clock in the morning, no less.

After talking to them for a bit and digging further into it, it turns out that I had tunnel vision.  If I had taken a few minutes to step back and look at the whole picture, I would have realized that I could have figured out the real issue with just a little more work.  At the time I was absolutely sure that it was something I couldn’t resolve without outside help, but I was wrong.

Tunnel vision can and will get you eventually.  It can be difficult to detect, but if you realize that you’re in the tunnel, just stop.  Stop everything, even for a few seconds, and take a step away from the issue.  That one look from outside the tunnel can mean the difference between waking someone else up and letting them sleep.  I’m pretty sure that my co-worker would have preferred to remain sleeping.

Admitting Your Mistakes

I can remember a time at a client site, early in my SQL Server days as a consultant, when I wanted to run Profiler to try to capture something.  So I did.  What I didn’t realize is that I had selected an event for Profiler that would bring the server to its knees.  Did I mention that it was a production server?  Yeah, it was.

Once I realized what had happened, I immediately contacted my manager (the client manager I was reporting to) and informed her of my mistake as well as what I was doing to resolve the issue.  She was very understanding, with no yelling or screaming, and she was appreciative that I contacted her right away and that I had a game plan for resolving the issue, which I had already enacted.

It’s OK to mess up.  Be honest about it.  Admit your mistakes, realize that you are human, and move on.  Dwelling on something in the past that you can’t change isn’t going to solve anything.  Yes, you might feel bad, but it happens.  A manager who’s worth their weight will appreciate you learning from your mistakes and moving on rather than dwelling on what happened.  They will also be thankful that you came to them with a solution.  Always try to approach management with a solution in hand, even if it’s one they don’t go for.

Always a Rookie

Very early in my volunteer firefighter/EMT career, I happened to have a conversation with one of the then-assistant fire chiefs (who later became the chief and still is) about my worry of making a mistake in the midst of a call.  He told me, “You’re always a rookie,” meaning that no call is exactly the same.  The people are different, the situation is different, the outcome can be different.  This advice really struck home for me, not only in my firefighting career but also in my career as an IT professional.  No issue is the same, and if you treat it that way, it will help you remember to stay calm in just about any situation, let your training kick in, and hopefully stay out of the tunnel.

Knowledge is Power

I don’t remember where I saw or read this, but the quote goes something like, “Once you have the knowledge, it’s yours to do with as you please.”  In other words, knowledge is something that NO ONE can take away from you.  Somewhere along the way in my DBA career, I got the bug to learn.  I read blogs.  I go to SQL Saturdays to speak, but also to learn new things.  I take training whenever I can.  I test & play around with things.  I try my best to continue to learn and expand my mind.

The moral is to not make the mistake of letting your knowledge & skills get stale.  The technology of this world is expanding at an incredible speed, and you should take advantage of as many learning opportunities as you can.  Go read a blog.  There are tons out there.

In the mighty words of G.I. Joe, “…knowing is half the battle!”  Keep learning.

Summary

Hopefully you can learn from a couple of my mistakes and try not to repeat them.  If you do, however, just remember: try to get out of the tunnel, you’re still human, and time will go on.


Update #4 – Prayers for Alyssa

This will most likely be my final post for Alyssa.  There has been a Facebook page set up to help convey information, and that’s a much better avenue for updates.

You can find the page here: http://www.facebook.com/pages/Alyssas-Army/481215798666644

Let me say that the outpouring of thoughts, prayers, and comments has been incredible.  I was able to share with Alyssa’s family a United States map showing all of the various locations the comments came from.  I also had comments from other parts of the world, and when I have time, I plan on putting together a world view of the comments.

Alyssa

With sadness, my last update…

It is with a heavy heart that I am posting today that our Alyssa has passed. We would like to thank everyone for the outpouring of support you have given. Alyssa truly was an amazing, strong, sweet, and loving young woman!

Please continue to pray for her soul and also for peace and comfort for her family.

RIP Alyssa! We love you so much and you will be forever in our hearts.

Alyssa passed away on Sunday, January 27th, 2014, in the loving arms of her mother in Houston, TX.  From what I understand, she went peacefully, and I know she went to be with the Good Lord in the heavens.

Her obituary:  http://fremonttribune.com/cass-news/obituaries/alyssa-johnson/article_f0705b16-fc4d-5b17-93ad-9f77b9e8b354.html

This post will go live at 10:30 AM on Monday, February 3rd, 2014, during Alyssa’s funeral.  Please take a moment and offer up a silent prayer and well wishes for Alyssa’s family as they go through this dark time.

Gifts

If you want, you can send monetary gifts to Alyssa & her family at alyssajohnsonbenefit@gmail.com.  I don’t know how much this ordeal is going to cost, but I can’t imagine that it’ll be cheap.  So if you have a spare dollar or two, please consider sending Alyssa’s family a gift.

Please check the Facebook page for further fundraising efforts.

Thank you again for following this story and offering up your thoughts & prayers!!
