Modder doubles memory on GeForce RTX 3070 to 16GB

Shawn Knight

Impressive: One of the quickest and easiest ways to boost your computer’s overall performance is to upgrade the memory subsystem. In many instances, it can take less than five minutes to remove your old memory and swap in a new kit that is faster and/or offers more capacity. It’s no wonder, then, that this is often among the first upgrades to consider when sprucing up an older system. What Russian modder VIK-on has accomplished in his latest video series, however, goes many levels beyond a basic memory upgrade.

The hardware tweaker took a Palit-branded GeForce RTX 3070, removed the 1GB GDDR6 memory modules (K4Z803256C-HC14) and replaced them with 2GB GDDR6 modules (K4ZAF3258M-HC14), effectively doubling the card's memory to 16GB.

If that weren’t challenging enough, VIK-on also had to change some resistor settings in order to trick the card into recognizing that it now has twice as much memory on tap.

Unfortunately, issues still persisted. Similar to what happened when the mod was performed on a GeForce RTX 2070, the card would crash after running an intense workload. This is where the YouTuber’s subscribers chimed in with a potential solution.

As it turns out, using the EVGA Precision X1 tuning software to lock the card’s frequency rectified the issue, allowing the modder to complete benchmarking runs in 3DMark Time Spy and Unigine Superposition – 8K Optimized. The results from those tests are in line with what you’d expect, proving that the upgrade was indeed a success.
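For anyone attempting something similar, the quickest sanity check that the card actually reports the doubled capacity is to query the driver with nvidia-smi. A minimal sketch (assuming the NVIDIA driver and the `nvidia-smi` utility are installed; the helper function names here are my own):

```python
import subprocess

def parse_vram_mib(csv_output):
    """Parse the output of nvidia-smi's CSV query mode:
    one integer (total VRAM in MiB) per line, one line per GPU."""
    return [int(line) for line in csv_output.splitlines() if line.strip()]

def query_vram_mib():
    """Ask the driver for total VRAM per installed GPU, in MiB."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_mib(out)

# A successfully modded 16GB card should report roughly 16384 MiB:
print(parse_vram_mib("16384\n"))  # [16384]
```

If the strap resistors are set incorrectly, the card will typically still report the original 8GB (or fail to initialize at all), so this is a useful first check before running any heavy benchmarks.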


 
Ton8 in bottom gear
I put eigh gigs in a 1050Ti
Sure you did.

Also "eigh gigs" on a 1050Ti is completely useless. 16GB on a 3070 are not, even if not all of them are used yet, you can easily use more than 10GB in 4k in some games now.
 
Sure you did.

Also "eigh gigs" on a 1050Ti is completely useless. 16GB on a 3070 are not, even if not all of them are used yet, you can easily use more than 10GB in 4k in some games now.

Well, yes and no. You're right, some 4K games can already push past 10GB on a 3070, but the issue is that the 3070 isn't really delivering good performance in the 4K games that push that much VRAM. I'm not sure many gamers want 4K games with gorgeous visual fidelity that run at a 30FPS average. There's a limited number of games in which a 3070 is both strong enough to handle 4K at higher details AND go over the 8GB.

The issue is future games, as in thinking 2 or 3 years in advance, when even 1440p could start pushing the 3070 beyond its VRAM limits but not beyond its usable performance limit. But again, given the way things develop for the most part, it would be unlikely that most games start pushing a ton of VRAM without also pushing the 3070 beyond its usefulness as a 4K or 1440p-high gaming card.

If given the choice (and it isn't something insane like a 30 or 40% price premium, of course), sure, go for more VRAM rather than less. But for the most part, day to day, most users will either not encounter the limit or will be able to very slightly downgrade some in-game settings to stay within the VRAM constraints while also staying within acceptable performance constraints. It's an issue, but IMO not really a deal breaker.
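To put rough numbers on those VRAM constraints, here is a back-of-the-envelope sketch of how texture budgets eat into 8GB. The sizes are illustrative assumptions (uncompressed RGBA8 4K textures with full mip chains), not measurements from any particular game:

```python
def texture_bytes(width, height, bytes_per_pixel, mipmaps=True):
    """Approximate GPU memory for one texture.
    A full mip chain adds roughly 1/3 on top of the base level."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4096x4096 RGBA8 texture with mips:
one_tex = texture_bytes(4096, 4096, 4)
print(round(one_tex / 2**20, 1), "MiB per texture")  # ~85.3 MiB

# How many such textures fit in an 8 GiB card (ignoring
# framebuffers, geometry, and everything else)?
budget = 8 * 2**30
print(budget // one_tex, "textures")  # ~96 textures
```

In practice games use block compression (BC7 at 1 byte per pixel cuts that figure by 4x) and stream textures in and out, which is exactly why the limit shows up only in the heaviest titles today but becomes more pressing as asset quality climbs.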
 
Well, yes and no. You're right, some 4K games can already push past 10GB on a 3070, but the issue is that the 3070 isn't really delivering good performance in the 4K games that push that much VRAM. I'm not sure many gamers want 4K games with gorgeous visual fidelity that run at a 30FPS average. There's a limited number of games in which a 3070 is both strong enough to handle 4K at higher details AND go over the 8GB.

The issue is future games, as in thinking 2 or 3 years in advance, when even 1440p could start pushing the 3070 beyond its VRAM limits but not beyond its usable performance limit. But again, given the way things develop for the most part, it would be unlikely that most games start pushing a ton of VRAM without also pushing the 3070 beyond its usefulness as a 4K or 1440p-high gaming card.

If given the choice (and it isn't something insane like a 30 or 40% price premium, of course), sure, go for more VRAM rather than less. But for the most part, day to day, most users will either not encounter the limit or will be able to very slightly downgrade some in-game settings to stay within the VRAM constraints while also staying within acceptable performance constraints. It's an issue, but IMO not really a deal breaker.
4K on a 3070 is indeed stretching it, but even at 1440p with RT/texture packs/mods it can push over 8GB of VRAM, so any number above that (10GB, 12GB, etc.) is to be desired. Does it need 16GB? No, but since this was the example with 16GB, that's what I'm using for explaining.

Now if Nvidia releases the 3070 Ti with 10GB as a cut-down 3080, that would make more sense and would be enough for a decent number of years up to 1440p.
 
1987??? Nice time machine you have there.

I had a VGA Wonder in 88 in one of my work machines, don't remember if it was the first run of those. So it wasn't that much of a time loop. heh.

Might have been an EGA Wonder? Pretty sure those were out in 87
 
I'm reading this on my 4K monitor, using GTX780 (with 3GB of RAM), and wondering what's the fuss all about :)

The upgrade from 1MB to 2MB VRAM was so huge on my first PC in the 90s :)
And then came Matrox, with 4MB, and blew them all away. I watched it while in dorm. Good years.
 
I'm reading this on my 4K monitor, using GTX780 (with 3GB of RAM), and wondering what's the fuss all about :)

And then came Matrox, with 4MB, and blew them all away. I watched it while in dorm. Good years.

It's about gaming and being able to do things quicker than your opponent and see things they can't. A 780 with 3GB of RAM will work fine for forums, Facebook, etc., but once you get into the higher gaming levels where people compete against each other, faster and more IS better.
 
I totally forgot that upgrading VRAM was a thing back in the day. I remember my dad upgrading our old trident video card's memory from 1 to 2MB. It did nothing performance wise, but it was exciting nonetheless.
 
"Looks at impossible to get RTX 3080 sitting inside PC, thinks about smoke coming from it, decides not to try"
 
IMHO, 16GB is what the 3070 should have had at launch (or at least 10). The 3070's processing ability is 4K capable depending on your needs, but that 8GB is going to make it age prematurely. Stop and think about it - in 2016, you could buy a 1060 6GB for $250 or a 1070 8GB for $400, and almost 5 years later you're paying $500 MSRP (yeah right) for a card with... wait for it... 8GB.
 
Modded from a cheapo GPU to a legit GPU. It is ridiculous how Nvidia can offer such a low amount of VRAM on such an expensive and powerful card. When the first real next-gen games come to PC, 8GB cards will already be limited. On max settings, Resident Evil 8 is one game that will most likely demand more than 8GB. Fortunately, the RTX 3070 Ti will be the proper version of this card. Nvidia is so lame sometimes.
 
We have reached a time where people have to fix their GPUs themselves. The current cards are made just for the miners, and no one cared about the gamers.
 
The RTX 3070 is mostly a 1440p card, so it needs at least 10GB. 4K is possible for some games, but most of them are not... In 2 years' time it will definitely be a 1440p-only card, and in 4 years a 1080p-only card (even with DLSS) - maybe sooner. Unless you like to play on Low settings + high res, which is really silly IMO.

Higher-res textures, denser and wider worlds in games, RTX and other stuff will make 8GB of VRAM look like a very bad choice, even at 1440p.

Does it need 16GB? No, it's overkill. It's a mid-tier card, so it can't push the FPS high enough at 4K to fill those 16GB. The same way 12GB for the 3060 is overkill, but between a VRAM-starved card and one with "too much" VRAM, I'd take the latter every time.

There is a reason AMD went with 16GB of VRAM for the RX 6800 and up, and with 12GB for the 6700 XT; that's the right amount of VRAM for the power of the cards in their respective tiers - for this generation. Now if only they would hurry up with that FidelityFX SR, that would be great.
 
I had a VGA Wonder in 88 in one of my work machines, don't remember if it was the first run of those. So it wasn't that much of a time loop. heh.

Might have been an EGA Wonder? Pretty sure those were out in 87


Ok you are correct. I was still in the C64/C128 scene at that time.
 
Well, if they put loads of VRAM on these cards, they won't sell Quadros to the pros. It's all artificial... Game devs will utilise every little bit of a 16GB pool within a week: better textures, more high-res textures, no loading at all, less optimisation, etc. But at the current pace it won't happen for years.
 