Query Optimizer Wins Again

In a previous post, I mentioned that I had to use dynamic SQL to switch the context of the query in order to accomplish what I wanted.  I also mentioned that I would be putting out another blog post about that dynamic SQL, and here we are!

As a recap, in that post I was attempting to dynamically calculate the server memory requirements based on the physical memory installed on the machine.  I'll also mention again that Microsoft changed the structure of the DMV involved between SQL Server 2008 R2 and SQL Server 2012.  In order to run the query on all servers, I had to find a way to dynamically handle the appropriate structure for the various versions of SQL Server we have.

This is how I ended up doing it:

-- Setup the dynamic SQL
-- We need the physical memory values in GB since that's the scale we are working with.
IF @serverVersion > 10
	SET @sql = 'SELECT @memOut = (physical_memory_KB/1024/1024) FROM sys.dm_os_sys_info'
ELSE
	SET @sql = 'SELECT @memOut = (physical_memory_in_bytes/1024/1024/1024) FROM sys.dm_os_sys_info'

-- Get the amount of physical memory on the box
EXEC sp_executesql @sql, @paramdefs, @memOut = @mem OUTPUT

This is the way I originally tried, several times, to handle it:

-- We need the physical memory values in GB since that's the scale we are working with.
IF @serverVersion > 10
	SELECT @memOut = (physical_memory_KB/1024/1024) FROM sys.dm_os_sys_info
ELSE
	SELECT @memOut = (physical_memory_in_bytes/1024/1024/1024) FROM sys.dm_os_sys_info

Of course, I know now that this won’t work at all.  I’ll be honest and say that it took me more than a few minutes to figure out why my logic wasn’t right and why SQL Server kept yelling at me.

The Truth Shall Set You Free

Doh!  Of course, it was the query optimizer! Before a batch can be optimized, SQL Server first parses the syntax and then binds it, validating that every object and column referenced actually exists.

Knowing that this happens up front, look at the branches of the 'IF' statement in the wrong (bottom) example.  One branch referenced columns that were valid.  The other branch did not, because that column doesn't exist in the DMV on the particular version of SQL Server examining the code.

When it couldn’t find the column, the optimizer had to bow out simply because it couldn’t verify that the column existed.  This forced me to put the syntax of the query into a dynamic statement simply because the dynamic sql isn’t parsed until the query itself is called through the sp_executesql command.

When using dynamic SQL, the query optimizer treats the building of the statement as regular text and happily continues along its merry way.

Once the dynamic SQL was in place, SQL Server was able to validate the query and things all fell into place.
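To see the failure in isolation, here's a minimal repro (the dbo.BindDemo table and its column names are made up for the demo).  Both branches of the IF are compiled up front, so a reference to a column that doesn't exist kills the whole batch before a single statement runs:

```sql
-- Hypothetical demo table with a single column
CREATE TABLE dbo.BindDemo (GoodCol INT);
GO
-- This batch fails to compile with Msg 207 (Invalid column name 'MissingCol'),
-- even though the bad branch could never execute:
IF 1 = 1
	SELECT GoodCol FROM dbo.BindDemo;
ELSE
	SELECT MissingCol FROM dbo.BindDemo;
GO
-- Moving the bad branch into dynamic SQL defers its compilation until
-- sp_executesql actually runs it, so this version compiles and runs fine:
IF 1 = 1
	SELECT GoodCol FROM dbo.BindDemo;
ELSE
	EXEC sp_executesql N'SELECT MissingCol FROM dbo.BindDemo';
GO
DROP TABLE dbo.BindDemo;
```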

Some References

SQL Guru has a great post on how the optimizer works.  I highly recommend reading it if you're not familiar with the optimizer.  Heck, even if you are familiar with it, read it anyway as a refresher.

Another SQL Server Guru, Paul White, also has a deep dive into the query optimizer and you can find the first post in his series here.

Another Lesson Learned

This little experience, while probably trivial, was also a great reminder that we don't always know or remember everything about SQL Server.  SQL Server is such a massive platform that you won't know or remember everything about it.  At times you will forget even the simplest of things, and that's OK.  We are human, after all.

Summary

The next time you are trying to get some code to compile and it won't, take a moment to see whether the query optimizer can do its job.  If SQL Server can't parse and bind your code, then it won't even get off the ground.
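If you just want the compile-time check without running anything, SET NOEXEC is handy: it makes SQL Server compile (parse and bind) each batch without executing it, so binding errors such as an invalid column name surface immediately:

```sql
-- Compile only; nothing below actually executes until NOEXEC is turned off
SET NOEXEC ON;
SELECT cpu_count FROM sys.dm_os_sys_info;
SET NOEXEC OFF;
```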

Enjoy!

Posted in Internals, T-SQL

SQL Saturday #297 – Colorado Springs, CO

Yes, folks, it's that time of year again.  I'll be heading to Colorado Springs, CO this upcoming weekend to join the ranks of speakers as one of the chosen ones.  As usual, I'm honored to be speaking at their event.  They always have an outstanding list of speakers, and to be among them is truly a privilege.

I very much enjoy traveling to Colorado.  The scenery is breathtaking and the people there are incredible.  If you have never had the opportunity to visit, I would highly recommend it.

This year the Colorado Springs group is doing something a little different.  They are hosting a food drive in order to help the local community and I for one will be stopping at a local grocery store to load up on canned food before the event.  I think that this is a fantastic way to give back to the community. 

Microsoft MVP Chris Shaw did a blog post about the food drive here.    

SQL Server Databaseology 201 – The Study of Database Internals

This is probably one of my favorite presentations, as I get to talk about the internal structures of a Microsoft SQL Server database.  I'm not talking about tables, views, or procedures, but rather things such as how each row is structured, how SQL Server keeps track of your data, and various ways to go diving into the depths of the internals.  This presentation is packed with demos and fantastic #Sharknado references!  We'll also discuss how your data types matter and that, despite what you've been told, disk isn't always cheap.

If you haven’t checked out the schedule yet, you should.  There are some fantastic speakers going to this event.  Speakers like Mike Fal (B|T), Chris Shaw (B|T),  Jason Kassay (T), Jason “Cannonball” Horner (B|T), Steve Jones (B|T), Meagan Longoria (B|T), Thomas LaRock (B|T), Keith Tate (B|T), Glenn Berry (B|T), Jason Strate (B|T) and many others.  I assure you that this is a schedule you don’t want to miss!

Free Training

SQL Saturday’s is a great venue for FREE training and I highly encourage you to attend if you can.  It’s one Saturday out of the year for this particular event and I promise that it’ll be worth your time.  If you haven’t yet experienced the awesomeness that is SQL Saturday, you are missing out and now’s the time to change it!

If you are in my session or even just wandering about, please feel free to come up and introduce yourself!  I enjoy meeting new people and I promise I won’t bite!

Still here?  What are you waiting for?  Go.  Now.  Register.   Take the leap and move forward in your learning & career! DO IT!!

Posted in Community, Presenting, SQL Saturday

SQL Saturday #287 – Madison, WI

This upcoming weekend will mark a first for me.  I get to speak at SQL Saturday 287 in Madison, WI.  This will be my first time speaking at this particular event and I'm very much looking forward to it.

I’m also bringing my family along for the ride as my wife has relatives in the area so we are making a vacation out of the trip.  Nothing like a mini-vacation with some #sqlawesomesauce mixed in!  Whoo Hoo!!

So if you have nothing better to do on Saturday, March 29, 2014, come on out and get your learn on.  The schedule for this event is quite impressive and there will be a TON of things to learn.  Come join us and learn something.  As my wise father always told me, you should never stop learning.

Oh yeah, I’m speaking!

Social Networking Is Alive!

Social media is everywhere these days, and if you're not on the train, you could be missing out.  This session is about how you can use social media to improve not only your professional life, but your personal one as well.  The session isn't geared specifically to geeks (it's totally non-technical), but to anybody who wishes to improve their online presence through the use of social networking.  It's a fun session and I hope the attendees enjoy it!!

HINT: I’ll show you how to get free SQL Server help from one of the popular social networks.  How would you like to tell your management that you solved a real world enterprise level issue with the help of social media?  For free?

Better yet, if someone in the audience would have a SQL question that they want answered, we’ll see if we can get the #sqlfamily to help out.  I’m guessing that we will!

You might even see reference to a popular YouTube video.

Free Training

Can you really pass up on free training?  Ok, even with lunch it’s only $10!  So if you’re still on the fence on attending, get off the fence, register, come on out and learn something.  Meet some new people.  Grab some swag and a fantastic lunch.  I promise you that the speakers won’t bite.  Register now!!

Get your learn on!  I triple dog dare you!

Why are you still here?  Go.  Register.  Do it now! See you in Madison!

Posted in Community, Presenting, SQL Saturday

Configuration Validation – Max Memory

We are in the midst of rolling out SQL Server 2012 to our environment and, like most shops, we customize our SQL Server instances for our environment.  Since we don't just leave things straight out of the box, we have to verify everything is done in the manner which matches our deployment process.  We don't deploy new servers often, so the need to fully automate this process isn't here yet.

What we want is an easy way to validate an install and ensure that no settings were missed.  So I started to put together a script that I could easily run on a server to verify that certain settings were in place.  This script checks for things like certain trace flags, what max memory is set to, how many data files TempDB has, etc.  For this post, we're going to focus on the max memory setting.

Max Memory

Setting the max memory (and even the minimum memory) for a SQL Server instance is a very important thing to address when installing SQL Server.  SQL Server is a memory hog, and it will gladly take over all of the memory if you let it, starving the operating system or any other applications that you might have installed.

One of the tools that we use when configuring (and it serves us very well, I might add) is information from a blog post by Jonathan Kehayias (B|T), "How Much Memory Does My SQL Server Actually Need". This is a great post as it fits our environment pretty well.  We have dedicated SQL Servers, and they aren't virtualized, so it's a great place to start.

Essentially, Jonathan recommends this as a starting point:

  • Reserve 1GB of RAM for the Operating System
  • Reserve 1GB of RAM for each 4GB of RAM installed from 4-16GB
  • Reserve 1GB for every 8GB of RAM installed above 16GB of RAM

For example, for a server with 192GB of total physical memory installed:

  • 1GB = Initial OS Reserve
  • 4GB = Reserve for the 4-16GB range if you have more than 16GB of physical memory; otherwise (Total / 4)
  • 22GB = The total amount of memory minus 16GB, divided by 8, for anything greater than 16GB of physical memory.  If you don't have 16+ GB of memory, this is zero.
  • 27GB = 22GB + 4GB + 1GB (total amount to reserve for the OS).
  • 165GB = 192GB – 27GB.  This is the total amount to give to SQL Server
    • NOTE: To set this value, you would take 165*1024 = 168,960 (in MB)
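The arithmetic above can also be sketched directly in T-SQL; the 192GB figure is just the example value:

```sql
-- Worked example: OS reserve and max server memory for a 192GB box
DECLARE @totalGB INT = 192;
DECLARE @osReserveGB INT = 1 + 4 + (@totalGB - 16) / 8;	-- 1 + 4 + 22 = 27
SELECT @osReserveGB AS [OS Reserve (GB)],			-- 27
	@totalGB - @osReserveGB AS [SQL Server (GB)],		-- 165
	(@totalGB - @osReserveGB) * 1024 AS [max server memory (MB)];	-- 168960
```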

Setting Max Memory

Armed with this information, a simple spreadsheet was born to handle the math.  This worked great, right up until we wanted to script things out.  Once that milestone was reached, it was obvious that the real need was to handle this dynamically in a script.  So, I started looking into how to accomplish that.

The Script

To start with, I had to find out where SQL Server keeps the total amount of physical memory installed on the server.  I'll be honest and say that I had to look for it.  I thought that I might have to read the registry, but thankfully I found a DMV, sys.dm_os_sys_info, that gives you that information.  You can read more about this particular DMV on TechNet.

If you look at the information from TechNet, you'll see that 'physical_memory_KB' is the column that we're interested in, as it represents the amount of physical memory available.  However, one thing of note is that the column name was changed between SQL 2008 R2 and SQL 2012.  In the older versions (2005 - 2008 R2) the column name is 'physical_memory_in_bytes'.

Along with that, the units the number is expressed in are also different.

In 2012 it's 'physical_memory_KB', in kilobytes.

In 2008 R2 (and lower) it's 'physical_memory_in_bytes', in bytes.

I had to account for not only the name difference but also the different units (KB versus bytes) in the script, since we have a mixture of instances running and I thought it would be handy to be able to run this across the enterprise.

Once I knew where the information was kept, it was relatively easy to do the math that I needed and spit back the results.  Since we have a mixture of SQL Server versions, I used ‘SERVERPROPERTY‘ to determine what version of SQL Server the script was running against, thus allowing me to dynamically adjust for the differences in the DMV mentioned above.

You’ll also probably notice that I am using the sp_executesql command to dynamically handle the DMV differences between SQL Server versions.  This will be another blog post.

I tested the script on SQL 2005, 2008 and 2012.  Anything older than that and you're on your own, since SQL Server 2000 didn't have these DMVs at our disposal.

DECLARE @mem INT
DECLARE @totalOSReserve INT
DECLARE @4_16 INT
DECLARE @8Above16 INT
DECLARE @serverVersion INT
DECLARE @sql NVARCHAR(MAX)
DECLARE @paramdefs NVARCHAR(500)
-- Put this into a parameter so it's configurable
DECLARE @osMem INT

-- Done this way to handle older versions of SQL Server
SET @osMem = 1 -- leave 1GB for the OS
SET @8Above16 = 0 -- leave it 0 by default if you don't have more than 8GB of memory
SET @serverVersion = LEFT(CAST(SERVERPROPERTY('productversion') AS VARCHAR(100)),CHARINDEX('.',CAST(SERVERPROPERTY('productversion') AS VARCHAR(100)),1)-1)
SET @paramdefs = N'@memOut INT OUTPUT'

-- Setup the dynamic SQL
-- We need the physical memory values in GB since that's the scale we are working with.
IF @serverVersion > 10
	SET @sql = 'SELECT @memOut = (physical_memory_KB/1024/1024) FROM sys.dm_os_sys_info'
ELSE
	SET @sql = 'SELECT @memOut = (physical_memory_in_bytes/1024/1024/1024) FROM sys.dm_os_sys_info'

-- Get the amount of physical memory on the box
EXEC sp_executesql @sql, @paramdefs, @memOut = @mem OUTPUT

-- Start the Math
IF @mem > 16
	BEGIN
		SET @4_16 = 4
		SET @8Above16 = (@mem-16)/8
	END
 ELSE
	BEGIN
		SET @4_16 = @mem/4
	END

-- Total amount of memory reserved for the OS
SET @totalOSReserve = @osMem + @4_16 + @8Above16
SET @mem = (@mem-@totalOSReserve)*1024

-- Use sys.configurations to find the current value
SELECT (@mem/1024)+@totalOSReserve AS 'Total Physical Memory'
	, @totalOSReserve AS 'Total OS Reserve'
	, @mem AS 'Expected SQL Server Memory'
	, value_in_use AS 'Current Configured Value'
FROM sys.configurations
WHERE name = 'max server memory (MB)'
GO

Running this query against our Central Management Server was a great way to validate that all of our installs (both new & old) were set correctly.

Any server that isn’t set correctly can be dynamically adjusted to the appropriate value.  Keep in mind if you are increasing the amount of memory for SQL Server to utilize, you won’t need to reboot.  If you are lowering it and it already has more than the limit in use, you’ll have to work in a reboot to release it back to the OS.  Once you reboot the server then it’ll be capped at the new upper limit.

Of course, if you need to do this in Production, I highly recommend that you do all of the appropriate testing required by your business and/or policies.

Resources & Permissions

Click on the individual links to see the Microsoft reference material.  I’ve also listed the permissions that you’ll need to have in order to run this:

Summary

There are a lot of things that we check for in our validation script but this portion was an interesting piece to build and I think will turn out to be pretty handy!

Enjoy!

Posted in Administration, T-SQL

Indexes & Snapshots

Several weeks ago while at SQL Saturday #271 in Albuquerque, NM, I had someone ask me: if you took a snapshot of a database, would any index fragmentation go right along with it? I was fairly confident that the answer was "Yes, it will, since a snapshot is at the page level," but I wasn't quite sure and, to be honest, I had never looked into it for myself.

Thus a blog post is born! Basically what we’ll do is:

    1. Create a dummy database
    2. Insert a bunch of data
    3. Delete 1/2 of the data
    4. Take a snapshot
    5. Rebuild one of the indexes
    6. Restore from the snapshot

And if all goes well, we should see that the index fragmentation from the deleted data is restored when the database is restored from the snapshot.

Here we go!

First let’s create our dummy database and table.

USE tempdb
GO
IF DB_ID('SnapshotIndexDemo') IS NOT NULL
	BEGIN 
		DROP DATABASE SnapshotIndexDemo
	END	
GO
CREATE DATABASE SnapshotIndexDemo
GO
USE SnapshotIndexDemo
GO		
CREATE TABLE [dbo].[Sharknado](
	[CustomerID] [INT] IDENTITY(1,1) PRIMARY KEY,
	[CustomerName] [VARCHAR](8000) NULL
) ON [PRIMARY]
GO
-- Create us a NCL just because
CREATE NONCLUSTERED INDEX IX_Sharknado on Sharknado (CustomerID) include (CustomerName);
GO

Now I’m going to load some data.  We need to pump in some data so that we can easily see things are fragmented.  This should explode my database to 400+ MB in size. Give or take.

DECLARE @x INT = 1
WHILE @x < 50000
BEGIN
	INSERT Sharknado (CustomerName)
	SELECT REPLICATE('c', 5000)

	SET @x = @x + 1
END
GO

Just to check, you can run this code to ensure that the database has grown. The size column found in sys.master_files represents the number of 8KB pages in each file, so we multiply that sum by 8192 (the size of each page in bytes) and then divide by 1,048,576 (the number of bytes in a megabyte).

SELECT Database_ID,
	db_name(database_id) 
	, (SUM(size)*8192)/1048576 as 'Size in MB'
FROM sys.master_files
WHERE database_id = db_id('SnapshotIndexDemo')
GROUP BY Database_ID

You can see here that it pushed the database to a little over 400MB. Perfect!


Now that we have some data to play with, let's check the current fragmentation before we stir things up.  This code returns the fragmentation levels for all of the table's indexes, which in our case is just two, and shows that there isn't any fragmentation at this point.

SELECT index_type_Desc,fragment_count, avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(db_id('SnapshotIndexDemo')
	, object_id('Sharknado'), NULL, NULL, 'LIMITED')
GO


Now that we’ve shown that things are nice and tidy, let’s create some chaos!!

-- This will delete every even row by using the modulo ("Mod") operator
-- Modulo - http://technet.microsoft.com/en-us/library/ms190279.aspx
USE SnapshotIndexDemo
GO
DELETE dbo.Sharknado
WHERE CustomerID % 2 = 0
GO

And then let’s check to make sure that we have fragmentation.


Now let’s take a snapshot!! This code will create the snapshot and should work for you, assuming that you have a “C:\Temp” directory.  If you don’t, just adjust the script to a folder location.  We’ll clean stuff up at the end of this.

USE tempdb
GO
CREATE DATABASE SnapShotIndexDemo_ss
ON
(
NAME = SnapShotIndexDemo,
FILENAME = 'c:\Temp\SnapShotIndexDemo.ss'
)
AS SNAPSHOT OF SnapShotIndexDemo
GO

Now that the snapshot is created, let’s rebuild the non-clustered index and check the fragmentation.  Once the rebuild is complete, we’ll restore the database using the snapshot that we just took.  If our theory is true, we would expect to see the fragmentation reappear.

USE [SnapshotIndexDemo]
GO
ALTER INDEX [IX_Sharknado] ON [dbo].[Sharknado] REBUILD
GO

Let’s check the fragmentation.


Since we rebuilt the non-clustered index, we would expect the fragmentation to be 0.  Finally, let's restore the database using the snapshot that we took earlier.

USE tempdb
GO
RESTORE DATABASE SnapShotIndexDemo
FROM DATABASE_SNAPSHOT ='SnapShotIndexDemo_ss'
GO

And check the fragmentation again.


Voila!  The fragmentation is back again, thus our theory is correct.

So, in conclusion, if you have fragmentation in your database and you take a snapshot, you’ll be taking the fragmentation right along with it.

Of course, we should leave the environment as we found it, so here is some code to clean things up if you want.

USE [tempdb]
GO
-- Clean up the snapshot
IF  EXISTS (SELECT name FROM sys.databases WHERE name = N'SnapShotIndexDemo_ss')
DROP DATABASE [SnapShotIndexDemo_ss]
GO
-- Clean up the database
IF  EXISTS (SELECT name FROM sys.databases WHERE name = N'SnapShotIndexDemo')
DROP DATABASE [SnapShotIndexDemo]
GO
Posted in Indexes, Snapshots

Mistakes

Last week, SQL Server MVP Chris Shaw wrote a blog post about learning from mistakes, and he tagged me along with Paul Randal (b|t) and Andy Leonard (b|t) to do a similar post.

You can find their responses here:

Paul Randal - http://www.sqlskills.com/blogs/paul/learning-listen/

Andy Leonard - http://sqlblog.com/blogs/andy_leonard/archive/2014/01/04/a-mistake-responding-well-and-a-public-call-for-a-public-apology.aspx

Tunnel Vision

A couple of weeks ago I happened to be on-call and, of course, the pager goes off at 1:50AM one morning. Yay!  I drag myself out of the rack and head to the kitchen, where I can work on things without disturbing my beautiful bride.  I log into the VPN and start to dive into where the issue might be.  In my travels I determined that the job that paged me was missing a source file that is normally deposited onto the file system by another non-SQL Server process, so it was basically outside my control.  So I had to call someone else and wake them up to see if they could check on things.  At 3 o'clock in the morning, no less.

After talking to them for a bit and digging further into it, it turns out that I had tunnel vision.  If I had taken a few minutes to step back and look at the whole picture, I would have realized that I could have figured out the real issue with just a little more work.  At the time I was absolutely sure that it was something I couldn't resolve without outside help, but I was wrong.

Tunnel vision can and will get you eventually.  It can be difficult to detect, but if you realize that you're in the tunnel, just stop.  Stop everything, even for a few seconds, and take a step away from the issue.  That one look from outside the tunnel can mean the difference between waking someone else up or letting them sleep.  I'm pretty sure that my co-worker would have preferred to remain sleeping.

Admitting Your Mistakes

I can remember a time at a client site earlier in my SQL Server days as a consultant when I wanted to run Profiler to try to capture something.  So I did.  What I didn't realize is that I had selected an event for Profiler that would bring the server to its knees.  Did I mention that it was a production server?  Yeah, it was.

Once I realized what had happened, I immediately contacted my manager (the client manager I was reporting to) and informed her of my mistake as well as what I was doing to resolve the issue.  She was very understanding, with no yelling or screaming, and she was appreciative of the fact that I contacted her right away and that I had a game plan for resolving the issue, which I had already enacted.

It’s OK to mess up.  Be honest about it.  Admit your mistakes, realize that you are human and move on.  Dwelling on something that you can’t change in the past isn’t going to solve anything.  Yes, you might feel bad, but it happens.  However, if you have a manager that’s worth their weight, they will be more appreciative of you learning from your mistakes and moving on than dwelling on what happened.  They will also be thankful that you came to them with a solution.  Always try to approach management with a solution in hand, even if it’s one that they don’t go for.

Always a Rookie

Very early into my volunteer fire fighter/EMT career, I happened to have a conversation with one of the then assistant fire chiefs (who later became the chief and still is) about my worry of making a mistake in the midst of a call.  He told me “you’re always a rookie”, meaning that no call is exactly the same.  The people are different, the situation is different, the outcome can be different.  This advice really struck home for me not only in my fire fighting career but also my career as an IT professional.  No issue is the same and if you treat it that way, it will help you to remember to stay calm in just about any situation, let your training kick in and hopefully keep you out of the tunnel.

Knowledge is Power

I don’t remember where I saw or read this, but the quote goes something like “Once you have the knowledge, it’s your’s to do with as you please”.  In other words, knowledge is something that NO ONE can take a away from you.  Somewhere along the way in my DBA career, I got the bug to learn.  I read blogs, I go to SQL Saturday’s to speak, but also to learn new things.  I take training whenever I can.  I test & play around with things.  I try my best to continue to learn and expand my mind.

The moral is to not make the mistake of letting your knowledge & skills get stale.  The technology of this world is expanding at an incredible speed and you should take advantage of as many learning opportunities as you can. Go read a blog.  There are tons out there.

In the mighty words of GI Joe, “.. Knowing is half the battle!”   Keep learning.

Summary

Hopefully you can learn from a couple of my mistakes and try not to repeat them.  If you do, however, just remember: get out of the tunnel, you're still human, and time will go on.

Posted in Community, Personal

Update #4 – Prayers for Alyssa

This will most likely be my final post for Alyssa.  There has been a Facebook page set up to help convey information, and that's a much better avenue for updates.

You can find the page here: http://www.facebook.com/pages/Alyssas-Army/481215798666644

Let me say that the outpouring of thoughts, prayers and comments has been incredible.  I was able to share with Alyssa's family a United States map showing the various locations the comments came from. I did have some comments from other parts of the world, and when I have time, I plan on putting together a world view of the comments.

Alyssa

With sadness, my last update…

It is with a heavy heart that I am posting today that our Alyssa has passed. We would like to thank everyone for the outpouring of support you have given. Alyssa truly was an amazing, strong, sweet, and loving young woman!

Please continue to pray for her soul and also for peace and comfort for her family.

RIP Alyssa! We love you so much and you will be forever in our hearts.

Alyssa passed away on Sunday January 27th, 2014 in the loving arms of her mother in Houston, TX.   From what I understand, she went peacefully and I know she went to be with the Good Lord in the heavens.

Her obituary:  http://fremonttribune.com/cass-news/obituaries/alyssa-johnson/article_f0705b16-fc4d-5b17-93ad-9f77b9e8b354.html

This post will go live at 10:30AM on Monday February 3rd, 2014, during Alyssa's funeral.  Please take a moment to offer up a silent prayer and well wishes for Alyssa's family as they go through this dark time.

Gifts

If you want, you can send monetary gifts to Alyssa & her family by sending them to alyssajohnsonbenefit@gmail.com.   I don’t know how much this ordeal is going to cost, but I can’t imagine that it’ll be cheap.  So if you have a spare dollar or two, please consider sending Alyssa’s family a gift.

Please check the Facebook page for further fundraising efforts.

Thank you again for following this story and offering up your thoughts & prayers!!

Posted in Community, General Conversations, Personal

SQL Saturday #271 – Albuquerque, New Mexico

Another fine SQL Saturday is headed to Albuquerque, New Mexico on Saturday January 25th, 2014, and I am honored to have been selected to speak!

This one holds a special place in my heart, as I was supposed to speak at the Albuquerque SQL Saturday in February 2013.  However, if you've read my "2013 In the Rearview Mirror" post, you would have discovered that I was at the hospital with my second son, Simon.  While I did miss the event, I wouldn't trade being there for my wife & son for the world.  The organizers of the event were fantastic to work with in finding me a replacement, and I'll always thank them for that.

So now, a year later, I finally get to make the trip down to sunny New Mexico and do something that I absolutely love to do, which is talk about SQL Server.

SQL Server Databaseology 201 – The Study of Database Internals

This is probably one of my favorite presentations, as I get to talk about the internal structures of a Microsoft SQL Server database.  I'm not talking about tables, views, or procedures, but rather things such as how each row is structured, how SQL Server keeps track of your data, and various ways to go diving into the depths of the internals.  This presentation is packed with demos and fantastic #Sharknado references!

If you haven’t checked out the schedule yet, you should.  There are some fantastic speakers going to this event.  Speakers like Mike Fal (B|T), Chris Shaw (B|T),  Jason Kassay (T), Jason “Cannonball” Horner (B|T), Denny Cherry (B|T), Steve Jones (B|T), Meagan Longoria (B|T) and many others.

SQL Saturday’s is a great venue for FREE training and I highly encourage you to attend if you can.  It’s one Saturday out of the year and I promise that it’ll be worth your time.

Feel free to come up and introduce yourself!  I enjoy meeting new people and I promise I won’t bite!

What are you waiting for?  Go.  Now.  Register.   Take the leap and move forward in your learning & career!

See you there!

Posted in General Conversations, Presenting, SQL Saturday

2013 In the Rearview Mirror

So I’ve been seeing a lot of posts about 2013 in “Review”.  So I thought that I should do one as well.  Might as well join the crowd right? Here we go!

Family

Simon

Mighty Simon!

For those that might remember, just under a year ago our little family got to ride a little roller-coaster.  My beautiful bride, Cari, was pregnant with our second child, a little boy.   In late January, Cari's water broke early at 29 weeks and we rushed to the hospital.  The doctors told us that approximately 95% (don't quote me on that number, but I think it's close) of women whose water breaks early will deliver within 48 hours.  Fast forward 3 weeks.  Yup, we were in the minority (thank the Lord!) in this regard, and Cari had to stay in the hospital for 3 additional weeks while still pregnant. Every minute in the hospital meant another minute for Simon to stay in the womb, growing and getting stronger.

Simon’s (as he was later named) heart rate dropped one mid-Sunday morning, February 10th actually,  and the doctor’s decided that it was time for him to make his appearance.  So emergency c-section was on the menu and Simon made his debut.   The next 5 weeks were spent in the NICU at a fantastic hospital (Nebraska Medical Center) and he is currently a happy, healthy baby boy.

During this rough time I had a lot of people thinking of & praying for my family and me.  I cannot tell you how much it means to me to have people I haven't even met praying for the well-being of my little boy.  Simply incredible, and I thank each and every one of you for that.

Professional

In April 2013 I took on a new adventure with my employer, Farm Credit Services of America.  I had been working on a development team as a Database Developer when an opportunity opened up to move to the production support team.  So after some thought, I applied, interviewed, and was offered the position.  And with that, I've gone back to the dark side of being on-call and providing production support for a fantastic company.

#SQLFamily

A couple of things happened within the SQL Family for me in 2013.  I was able to speak at several SQL Saturdays, including some great new locations such as New York, New York and Orlando, Florida.  I was also able to visit some of my favorite spots such as Denver, Colorado and Sacramento, California.  It was a great year for speaking and I look forward to continuing that in 2014.

1.  In August, I was honored to be named as a PASS Outstanding Volunteer.  A colleague of mine, David Klee, was kind enough to nominate me for the award, and I thank him for that.  I love the SQL Server community and I'm completely honored to join the ranks of these outstanding individuals.

2.  In December, our regional mentor, Jason Strate (B | T), decided to step down from being a regional mentor for the North Central Region of PASS.  In a previous conversation months prior, I had mentioned that I'd be interested in being a regional mentor and/or helping out with whatever I could.  Even with that conversation, I was still surprised when I got a phone call from the PASS organization asking if I wanted to replace Jason.  Well, no one can really replace Jason, but they wanted me to give it a whirl.  So I said yes and became a Regional Mentor.  I get to join Ted Krueger (B | T) in helping local chapters with whatever issues they might have.  Ted and I get along great and I look forward to working with him as a Regional Mentor in 2014.


3.  I was also able to help organize and deliver another successful SQL Saturday in April for the Omaha area.  This was our second event, and I think we knocked it out of the ballpark!  We had about 165 (give or take) people through the door and some absolutely fantastic speakers.  Watch for notification of another SQL Saturday in the Omaha area, probably in 2015!

Anyway, I think all in all, 2013 turned out to be a fantastic year for me, both personally as well as professionally.

I look forward to what 2014 may have in store for me, both in my professional life as well as in my personal one.

 

Posted in General Conversations, Personal

Update #3: Prayers for Alyssa

Previous blog posts:
Prayers for Alyssa
Update #1: Prayers for Alyssa
Update #2: Prayers for Alyssa

1/14/14 From Tami:

Alyssa is in Houston now and has undergone tests all day. Her heart is worse than the Dr's originally thought. Not sure if it was missed or has just gotten worse. They are increasing the first medicine to 40ml. She is currently at 18ml. They will increase 2ml every 8 hours. Alyssa doesn't tolerate this med very well. It is the med that causes her to vomit. She is currently vomiting frequently and it is not expected to change anytime soon. She is currently fluctuating between vomiting and sleeping.

The next 3 days will consist of extensive testing to see if she even qualifies for transplantation. There is 1 other pediatric patient currently waiting for lungs. Depending on how the next 3 days go she may not be strong enough for a transplant.

She is running a fever of 101°. They think there may be an infection in her central line. Her heart has now also started to have an irregular rhythm.

They are working with Alyssa to try to get her strong enough to have surgery. They are doing physical therapy on her now.

Please, I cannot ask for enough prayers so that we can get this sweet girl healthy again! Please also give Tina and Jeff the strength she needs to help her daughter get through this!

Posted in General Conversations