RAM Matters: How Much Do You Need for Gaming? 4GB, 8GB, 16GB or 32GB

Techspot's next challenge: when will you folks test how much memory is useful for real-world non-gaming scenarios, including heavy hitters like Adobe CS, large Access or SQL databases, and browsers with 40 or 50 open windows? My rule of thumb, gaming or otherwise, is to almost never mess around with 4GB or 6GB systems, so 8GB is my minimum. And I really like the responsiveness of my elderly Dell Precision T3500 with its hex-core Xeon and 24GB of memory.
 
12GB is not a configuration you should use in any dual-channel system. That is why you don't see that configuration here; we don't encourage it for modern dual- and quad-channel platforms.
Please explain why four modules shouldn't be used in a 2x4GB plus 2x2GB configuration. As long as both channels are paired, I see no reason why the second pair has to be the same size as the first. It's the frequency that needs to match across all modules, not the size. The size only needs to match within each pair.
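The pairing rule being described can be written out directly. A minimal sketch in Python (the function name and the two-pair slot layout are our own, purely illustrative; it encodes the commenter's rule rather than any particular board's behaviour):

```python
# Encodes the rule above: within each dual-channel pair the module sizes
# must match, and the frequency must match across ALL modules -- but the
# two pairs may differ in size.

def valid_mixed_config(pairs, frequencies_mhz):
    """pairs: one (size_gb, size_gb) tuple per dual-channel pair.
    frequencies_mhz: the rated speed of every installed module."""
    paired_sizes_match = all(a == b for a, b in pairs)
    all_freqs_match = len(set(frequencies_mhz)) == 1
    return paired_sizes_match and all_freqs_match

# 2x4GB + 2x2GB = 12GB: fine by this rule, as long as every stick runs
# at the same speed.
print(valid_mixed_config([(4, 4), (2, 2)], [1600] * 4))  # True
# A pair with mismatched sizes violates it.
print(valid_mixed_config([(4, 2), (4, 2)], [1600] * 4))  # False
```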
 
Not sure why you insist on considering only 4, 8, 16 and 32. Some people will have spare memory slots that can be used to upgrade rather than throwing away memory modules, so 12GB and 24GB are also options (so long as you obey the channel rules). The conclusion should really be that 12GB is the minimum.

How do you get 12GB with dual channel, though? An 8GB kit and a 4GB kit? I don't think you can get 2GB sticks.

Going from 8GB to 16GB is probably my last upgrade on my 2600K-based rig; it's held up surprisingly well, all things considered.

Also make sure it's 2133MHz; it makes a huge difference in CPU-bound games like GTA V. I found that out first hand with my 2600K and 980 Ti combo.
 
I have an Ivy Bridge system that runs games just fine with 8GB + 4GB for 12GB total. More RAM is always better than less RAM, and dual channel is way, way overrated. Do some research...
 
Going from 8GB to 16GB is probably my last upgrade on my 2600K-based rig; it's held up surprisingly well, all things considered.
Indeed. I still have a 2500K overclocked to 4.8GHz since day one, and I have the money to upgrade but I don't think it's really worth it. I have since upgraded the rest: 8GB of RAM, a 256GB 960 Evo SSD and an Nvidia 1070. But the CPU is still holding on strong for everything, to be honest, and some articles and videos out there compare it to current CPUs and the difference is not that great, especially when OCed.
 
So you either spend on a Galax EXOC GTX 1080 8GB @ $699 with a single Crucial 8GB 2400 DDR4 stick @ $125, or a Galax GTX 1060 OC 3GB @ $285 plus a Corsair Vengeance Black 16GB (2x8GB) 3000 DDR4 kit @ $276, for similar gameplay results? Prices from MSY at time of posting.
 
I prefer to have as much as possible. Basically as much as my motherboard will support. For my last build I had 32GB of DDR3.
This time, 64GB of DDR4 which I may increase to 128GB.
You lost gaming performance then. Countless articles have shown that in games, from time to time, having way too much RAM actually performed worse than machines with 8GB-16GB. It wasn't much of a loss, 3-10 FPS, give or take.
 
Based on how I use Chrome, 16GB is going to have to be the minimum for my next build. I'm an absolute tab monster.
 
I find this a useless article, unfortunately. There are plenty of people still out here with previous-gen video cards from Nvidia and AMD, yet they aren't represented. My son, God love him for playing games with the PC elite snobs he does, has been convinced that his system, with his OC'd 4690K, 480GB HyperX SSD, liquid cooling kit, Nvidia 960 and 8GB of 2400 memory, is too slow now, requiring at least 16GB of RAM and a new CPU to play his modern games. Yes, he earned his machine at the time; no, I will not cater to whims he's picked up from asshats who apparently don't live in the real world, where bills to pay and so on outweigh the "need" for a 1080 Ti, 32GB of memory and the latest Intel or AMD CPU. I was hoping to see an article that I could present to him to help prove that 1) if a current game requires at least 16GB of RAM, then the devs suck at their job, and 2) he isn't going to notice much of a difference in actual performance just because the FPS counter may change by 10-20 FPS when he is already showing well over 60 FPS.


I also agree with many other commenters on their points. The article is trying to tackle an "issue" that is very tough to quantify without getting trolled.
 
Interested to know what app was used to display RAM usage and FPS in-game. Oh, and was that Steve playing with those sticks? Seriously, just hand over a working 16GB kit and I couldn't care less what happens to the rest, lol.
 
For gaming, sure, 8GB can do the trick, as can 16GB, 24GB and 32GB. Most games can use all the RAM, but the key is the GPU having enough memory to share so you get max shader detail, 3D effects and so on. One of my sticks is bad, so I'm down to 24GB from 32GB. I get lifetime protection; I just have to send in the defective stick. In the meantime 24GB seems to work well, though the system is only using part of it. Always 1920 x 1080, no issues, even over a WiFi connection.
 
I find this a useless article, unfortunately. There are plenty of people still out here with previous-gen video cards from Nvidia and AMD, yet they aren't represented. My son, God love him for playing games with the PC elite snobs he does, has been convinced that his system, with his OC'd 4690K, 480GB HyperX SSD, liquid cooling kit, Nvidia 960 and 8GB of 2400 memory, is too slow now, requiring at least 16GB of RAM and a new CPU to play his modern games. Yes, he earned his machine at the time; no, I will not cater to whims he's picked up from asshats who apparently don't live in the real world, where bills to pay and so on outweigh the "need" for a 1080 Ti, 32GB of memory and the latest Intel or AMD CPU. I was hoping to see an article that I could present to him to help prove that 1) if a current game requires at least 16GB of RAM, then the devs suck at their job, and 2) he isn't going to notice much of a difference in actual performance just because the FPS counter may change by 10-20 FPS when he is already showing well over 60 FPS.

I also agree with many other commenters on their points. The article is trying to tackle an "issue" that is very tough to quantify without getting trolled.

If you find this article useless, especially based on the information provided in your own comment, then it's your inability to comprehend the information in front of you that is the point of failure.

Your son is 100% correct. Most modern multiplayer games such as Battlefield 1 stutter quite a lot with 8GB of memory, and this is amplified on older systems such as his Haswell quad-core due to the lower memory speed. If he were to upgrade to 16GB, the experience would be much better.

You were hoping to find an article that would support your argument; unfortunately you're wrong, so you didn't find what you were after.

Interested to know what app was used to display RAM usage and FPS in-game. Oh, and was that Steve playing with those sticks? Seriously, just hand over a working 16GB kit and I couldn't care less what happens to the rest, lol.

MSI Afterburner with RivaTuner, and yes, that was me playing with those memory sticks ;)
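For anyone who would rather log the numbers than overlay them in-game, here is a minimal sketch using Python's psutil library (our own illustration, not the method used for the article, which was Afterburner/RivaTuner as stated above):

```python
# Logs total/used system RAM once per second for a minute.
# Requires: pip install psutil
import time
import psutil

def log_ram_usage(interval_s=1.0, duration_s=60):
    end = time.time() + duration_s
    while time.time() < end:
        mem = psutil.virtual_memory()
        print(f"{time.strftime('%H:%M:%S')}  "
              f"used {mem.used / 2**30:.1f} GiB / "
              f"{mem.total / 2**30:.1f} GiB ({mem.percent}%)")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_ram_usage()
```

Run it in the background while playing and you get a timestamped trace you can line up against any FPS log.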

Please explain why four modules shouldn't be used in a 2x4GB plus 2x2GB configuration. As long as both channels are paired, I see no reason why the second pair has to be the same size as the first. It's the frequency that needs to match across all modules, not the size. The size only needs to match within each pair.

What ZackL04 said.
 
There are still good DDR3 systems that do support 2GB modules. I do understand where you are coming from now, but that is only true if you disregard DDR3.

I'm disregarding DDR3 ;) I mean, method 1 showed allocation and method 2 showed the performance impact. It's pretty clear that for the latest and greatest titles you'll be right on the edge with 12GB. It's certainly better than 8GB, but for some headroom you'll want 16GB.

If you have an older system and you're playing older games then it's really a non-issue.
 
I'm disregarding DDR3 ;) I mean, method 1 showed allocation and method 2 showed the performance impact.
Ohh, I wasn't questioning the article; it was the comment that had me confused. You cleared that up when you said DDR3 was off the table. I probably should have realized it was in reference to the article being DDR4-only, but I didn't. Actually, I didn't know DDR4 doesn't come in 2GB modules either, so that was part of the confusion.
 
I thought it odd you used the 1060 in lieu of the 1070, which to me would be the one most people would buy. My system still goes OK after 12 months: Skylake i7-6700K, Asus ROG Maximus VIII Hero, 16GB G.Skill Ripjaws V DDR4-3400, 2 x Gigabyte G1 Gaming GTX 1070 in SLI, Antec Edge 80+ Gold 750W PSU, Samsung 960 EVO M.2 512GB NVMe, Thermaltake Water 3.0 Ring RGB, all wrapped up in a Cooler Master MasterCase Pro 5 case and connected to a Razer Ouroboros, Corsair K95 Platinum RGB and a Samsung 49" 4K TV with a Samsung 27" on the side.
The biggest issue for me so far is game support. For instance, Tom Clancy's Ghost Recon Wildlands does not support SLI, so I can only play in 1080p with only 8GB of VRAM. GTA V recognises SLI and can be played in 4K.
 

There are multiple articles already stating that memory frequency fails to yield anything more than a 1-5% increase in frame rate, aside from Ryzen, due to its newer architecture.

If you're going to rip on someone for speaking their mind in a comment, then you should at least have the capacity to put out a proper article with relevant information, not this nonsense. The article's benchmarks show VRAM comparisons, not system RAM comparisons: you use dynamic VRAM testing samples instead of a static card while altering system memory limits, which is what would actually correlate with the title. The author, who I assume is also you, fails to understand how games allocate resources, and the tests themselves show useless results, as the different VRAM buffers in Call of Duty WWII show the 1080 Ti and 1060 with nearly identical results. Well ****, where did the other 5GB of information go? You know, all those textures the Ti had cached in VRAM. Yet you state "similar usage".

Hint: whenever you're trying to quantify some "theory", you need analytical data that isn't dynamic. That means not playing another map and saying "whatever, it's close enough, I saw the numbers were similar". Yet what you posted is nothing close, there being a 5GB difference.

So let's break this down for you, and hopefully you take the time to re-run the testing methodology if you care at all and would like to provide relevant information to your viewers.

1. Get a static setup of either VRAM or system RAM and a static benchmark, then run the tests, then invert the configuration, making the other variable the static one.

2. Realize that unless shaders are forcefully stored in memory, your test is skewed.

3. The only accurate piece of information is that when VRAM allocation is exhausted, system RAM and then the page file/HDD are utilized, in that order, for resources. I'm not going to bother speculating on specifics; given the system is running 32GB, the testing methodology also needs to show the game's specific page file utilization, as I highly doubt the game is using that over system RAM. I'm not even sure why that value is included unless the system is running 16GB or less on a card with 4GB of VRAM or less.

The TL;DR is: don't tell someone they can't comprehend your article when it's utterly awful, provides next to nothing in the way of statistics and is poorly articulated.
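Point 3 above is at least easy to sanity-check from the OS side: whether the page file is actually being touched can be read off directly. A minimal sketch using Python's psutil (our own illustration, not something either poster used; on Windows the swap figures reflect the page file):

```python
# Snapshot physical RAM vs page file / swap usage.
# Requires: pip install psutil
import psutil

mem = psutil.virtual_memory()
swap = psutil.swap_memory()

print(f"RAM:  {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB used")
print(f"Swap: {swap.used / 2**30:.1f} / {swap.total / 2**30:.1f} GiB used")

# If swap usage barely moves during a session on a 32GB system, the
# page-file figure in a benchmark table is indeed mostly noise.
```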
 

They speak their mind and I speak mine; why can't it go both ways? Using Call of Duty WWII to show memory allocation was actually a mistake because the game allocates all available memory. That said, the performance results are accurate and speak for themselves.

Also you've completely misunderstood the memory frequency comment.

1. Get a static setup of either VRAM or system RAM and a static benchmark, then run the tests, then invert the configuration, making the other variable the static one.

Did you miss the second page of the article? I have to assume you did based on this comment. I don't need to address anything beyond this, it makes no sense...
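The allocation-versus-usage distinction made in the reply above is worth a concrete demonstration: a process can reserve a large block of memory that never becomes resident in RAM until it is actually written to. A minimal sketch (our own illustration, assuming Linux, where anonymous mappings are committed lazily):

```python
# Shows that "allocated" memory is not "used" memory: RSS stays flat
# after reserving 1 GiB and only climbs once the pages are touched.
# Requires: pip install psutil
import mmap
import os
import psutil

proc = psutil.Process(os.getpid())
GIB = 2**30

print(f"baseline: rss = {proc.memory_info().rss / GIB:.2f} GiB")

buf = mmap.mmap(-1, 1 * GIB)  # reserve 1 GiB, untouched so far
print(f"reserved: rss = {proc.memory_info().rss / GIB:.2f} GiB")  # ~unchanged

for offset in range(0, len(buf), mmap.PAGESIZE):
    buf[offset] = 1  # write one byte per page, forcing it into RAM
print(f"touched:  rss = {proc.memory_info().rss / GIB:.2f} GiB")  # up ~1 GiB
```

A game that "allocates all available memory" in the first sense will make a memory overlay read high regardless of how much it actually needs, which is why the performance numbers matter more than the allocation numbers.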
 
I thought it odd you used the 1060 in lieu of the 1070, which to me would be the one most people would buy. My system still goes OK after 12 months: Skylake i7-6700K, Asus ROG Maximus VIII Hero, 16GB G.Skill Ripjaws V DDR4-3400, 2 x Gigabyte G1 Gaming GTX 1070 in SLI, Antec Edge 80+ Gold 750W PSU, Samsung 960 EVO M.2 512GB NVMe, Thermaltake Water 3.0 Ring RGB, all wrapped up in a Cooler Master MasterCase Pro 5 case and connected to a Razer Ouroboros, Corsair K95 Platinum RGB and a Samsung 49" 4K TV with a Samsung 27" on the side.
The biggest issue for me so far is game support. For instance, Tom Clancy's Ghost Recon Wildlands does not support SLI, so I can only play in 1080p with only 8GB of VRAM. GTA V recognises SLI and can be played in 4K.
First off, you forgot to mention the brand name/model of your computer chair (hint hint). Secondly, haven't you heard that SLI is on its way out? A lot of new games don't and won't support it going forward.

The sad part is, you could have gotten a 1080 Ti for the same cost as two 1070s, with similar performance (in games with nearly 100% scaling), and you'd never have had to worry about SLI support again.
 
What about Civ 6, Total War: Warhammer 2 or The Witcher 3? These must need a lot of RAM. The problem is that you probably need a campaign game in the late stages, with thousands of units, etc.
 
I prefer to have as much as possible. Basically as much as my motherboard will support. For my last build I had 32GB of DDR3.

This time, 64GB of DDR4 which I may increase to 128GB.

More memory means that if the system's usage requirements ever peak, you won't even notice.

As the article states, some data gets offloaded to the SSD. I'm fairly certain most gamers use SSDs rather than slower HDDs by now.
Honestly, I have 64GB of RAM in my workstation, and with a 40-million-poly model in 3ds Max, plus ZBrush 64, Photoshop and Premiere open, I haven't run out of memory once... 128GB could be good only if you are in scientific research or AI and you use huge datasets, like calculating fluids or the weather...
 
So what should he be testing, then? World of Warcraft? Dark Age of Camelot? Diablo II? StarCraft Brood War? Age of Empires I? Combat Flight Simulator 2? Plants vs. Zombies?

With the exception of games like Crysis, by & large there's little point in testing older games because we already know that a) they're going to perform "better" on new hardware because it already performed well on the old hardware, b) we're not going to see any improvement in performance on new hardware because the game didn't come close to maxing out the resources available on the old hardware, or c) the new hardware is so overpowered you have to use extra software/applications on new hardware to slow it down enough to match the old hardware's capabilities.
 

I deleted Slappy McPhee's post; that was probably the most unintelligent thing he could come back with. He has to be trolling at this point, so let's not waste any more time.
 