Configuration Validation – Max Memory

We are in the midst of rolling out SQL Server 2012 and, like most shops, we customize our SQL Server instances to fit our environment.  Since we don’t just let things go straight out of the box, we have to verify that everything is configured in a manner that matches our deployment process.  We don’t deploy new servers often, so the need to fully automate this process isn’t here yet.

What we want is an easy way to validate an install and ensure that no settings were missed.  So I started to put together a script that I could easily run on a server to confirm certain settings were in place.  This script checks for things like specific trace flags, what max memory is set to, how many data files TempDB has, and so on. For this post, we’re going to focus on the max memory setting.

Max Memory

Setting the max memory (and even the minimum memory) for a SQL Server instance is a very important thing to address when installing SQL Server.  SQL Server is a memory hog, and it will gladly take over all of the memory if you let it, starving the operating system and any other applications you might have installed.

One of the tools that we use when configuring, and it works very well I might add, is information from a blog post by Jonathan Kehayias (B|T), “How Much Memory Does My SQL Server Actually Need“. This is a great post and it fits our environment well: we have dedicated SQL Servers that aren’t virtualized, so it’s a great place to start.

Essentially, Jonathan recommends this as a starting point:

  • Reserve 1GB of RAM for the Operating System
  • Reserve 1GB of RAM for each 4GB of RAM installed from 4-16GB
  • Reserve 1GB for every 8GB of RAM installed above 16GB of RAM

For example, for a server with 192GB of total physical memory installed:

  • 1GB = Initial OS reserve
  • 4GB = 16GB / 4, covering the 4-16GB range (if you have less than 16GB of physical memory, use Total / 4 instead)
  • 22GB = (192GB – 16GB) / 8, covering everything above 16GB of physical memory (if you don’t have 16+ GB of memory, this is zero)
  • 27GB = 1GB + 4GB + 22GB (total amount to reserve for the OS)
  • 165GB = 192GB – 27GB.  This is the total amount to give to SQL Server
    • NOTE: To set this value, you would use 165 * 1024 = 168,960 (in MB)
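The math above can be sketched in T-SQL.  This is just an illustration of the arithmetic, with the total memory hard-coded and the variable names my own:

```sql
-- Sketch of the reservation math, assuming a 192GB server.
DECLARE @TotalGB   INT = 192;
DECLARE @ReserveGB INT;

SET @ReserveGB = 1                                                 -- initial OS reserve
    + CASE WHEN @TotalGB > 16 THEN 4 ELSE @TotalGB / 4 END         -- 1GB per 4GB from 4-16GB
    + CASE WHEN @TotalGB > 16 THEN (@TotalGB - 16) / 8 ELSE 0 END; -- 1GB per 8GB above 16GB

SELECT @ReserveGB                     AS ReserveForOS_GB,     -- 27
       @TotalGB - @ReserveGB          AS MaxServerMemory_GB,  -- 165
       (@TotalGB - @ReserveGB) * 1024 AS MaxServerMemory_MB;  -- 168960
```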
Setting Max Memory

Armed with this information, a simple spreadsheet was born to handle the math.  This worked great! Of course, only until we wanted to script things out. Once that milestone was reached, it was obvious that the real need was to handle this dynamically in a script.  So, I started looking into how to accomplish this.

The Script

To start with, I had to find out where SQL Server keeps the total amount of physical memory installed in the server.  I’ll be honest and say that I had to look for it.  I thought that I might have to read the registry, but thankfully I found a DMV, sys.dm_os_sys_info, that gives you that information.  You can read more about this particular DMV on TechNet.

If you look at the information from TechNet, you’ll see that ‘physical_memory_kb‘ is the column that we’re interested in, as it represents the amount of physical memory on the machine.  However, one thing of note is that the column name was changed between SQL 2008 R2 and SQL 2012.  In the older versions (2005 – 2008 R2) the column name is ‘physical_memory_in_bytes‘.

Along with the name, the units that the number represents are also different.

In 2012 it’s ‘physical_memory_kb’, reported in kilobytes.

In 2008 R2 (and lower) it’s ‘physical_memory_in_bytes’, reported in bytes.
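For reference, the two flavors of the query look like this.  Note that each will only run on the matching version, since the other column name won’t exist there:

```sql
-- SQL Server 2012 and later: value is in kilobytes.
SELECT physical_memory_kb / 1024 / 1024 AS TotalMemoryGB
FROM sys.dm_os_sys_info;

-- SQL Server 2005 - 2008 R2: value is in bytes.
SELECT physical_memory_in_bytes / 1024 / 1024 / 1024 AS TotalMemoryGB
FROM sys.dm_os_sys_info;
```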

I had to account for not only the name difference but also the unit difference (KB versus bytes) in the script, since we have a mixture of instances running and I thought that it would be handy to be able to run this across the Enterprise.

Once I knew where the information was kept, it was relatively easy to do the math that I needed and spit back the results.  Since we have a mixture of SQL Server versions, I used ‘SERVERPROPERTY‘ to determine which version of SQL Server the script was running against, allowing me to dynamically adjust for the differences in the DMV mentioned above.

You’ll also probably notice that I am using the sp_executesql command to dynamically handle the DMV differences between SQL Server versions.  This will be another blog post.
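To show the approach, here is a minimal sketch of the version branching.  This is not my exact script, just the SERVERPROPERTY/sp_executesql pattern described above:

```sql
-- Determine the major version (e.g. 11 = SQL Server 2012).
DECLARE @Version INT =
    CAST(PARSENAME(CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(20)), 4) AS INT);
DECLARE @Sql     NVARCHAR(MAX);
DECLARE @TotalMB BIGINT;

IF @Version >= 11   -- 2012+: physical_memory_kb, in kilobytes
    SET @Sql = N'SELECT @TotalMB = physical_memory_kb / 1024
                 FROM sys.dm_os_sys_info;';
ELSE                -- 2005 - 2008 R2: physical_memory_in_bytes, in bytes
    SET @Sql = N'SELECT @TotalMB = physical_memory_in_bytes / 1024 / 1024
                 FROM sys.dm_os_sys_info;';

-- Dynamic SQL lets one script compile on either version.
EXEC sp_executesql @Sql, N'@TotalMB BIGINT OUTPUT', @TotalMB = @TotalMB OUTPUT;

SELECT @TotalMB AS TotalPhysicalMemoryMB;
```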

I tested the script on SQL 2005, 2008, and 2012.  Anything older than that and you’re on your own, since SQL 2000 didn’t have the DMVs at our disposal.

Running this query against our Central Management Server was a great way to validate that all of our installs (both new & old) were set correctly.

Any server that isn’t set correctly can then be adjusted to the appropriate value.  Keep in mind that if you are increasing the amount of memory SQL Server can utilize, you won’t need to reboot.  If you are lowering it and SQL Server already has more than the new limit in use, you may have to work in a restart to release the memory back to the OS.  Once restarted, the instance will be capped at the new upper limit.
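Applying the value is done with sp_configure.  As an example, here is how you would set max server memory to the 168,960 MB (165GB) computed earlier; ‘show advanced options’ has to be enabled before the setting is visible:

```sql
-- Expose advanced options so 'max server memory (MB)' can be set.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- 165GB * 1024 = 168960 MB, from the example calculation above.
EXEC sp_configure 'max server memory (MB)', 168960;
RECONFIGURE;
```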

Of course, if you need to do this in Production, I highly recommend that you do all of the appropriate testing required by your business and/or policies.

Resources & Permissions

Click on the individual links to see the Microsoft reference material.  As for permissions, you’ll need VIEW SERVER STATE to query sys.dm_os_sys_info, and ALTER SETTINGS (held by the sysadmin and serveradmin fixed server roles) to change max memory via sp_configure.


There are a lot of things that we check for in our validation script, but this portion was an interesting piece to build and I think it will turn out to be pretty handy!


© 2014, John Morehouse. All rights reserved.

6 Replies to “Configuration Validation – Max Memory”

  1. I have found that I can lower the max memory setting and SQL Server will give that memory back to the OS. It can take a few minutes depending on how busy your box is. I have not had to reboot. I could not find a definite answer in technet.

    I did find this – It has a statement that you can make changes to min and max memory without restarting the instance.

    Thanks for the nice article.

