RTX 3070 mod includes physical switch for VRAM capacity

nanoguy

Posts: 1,355   +27
Staff member
Why it matters: An RTX 3070 with 16 gigabytes of VRAM will likely only ever exist in the hands of a few hardware enthusiasts, thanks to wild mods like this one with a physical switch for memory capacity. A Taiwanese company has created a specialized 16GB RTX 3070 model for digital signage providers, but it's not something gamers will be able to buy from a retail store.

VRAM capacity on modern video cards is one of the hottest topics of discussion among PC enthusiasts, and for good reason – Nvidia has made spacious frame buffers a luxury with the RTX 40 series. For instance, an RTX 4070 with 12 gigabytes of GDDR6X memory is priced at around $600, and Team Green only recently bowed to public pressure with the announcement of an RTX 4060 Ti with 16 gigabytes of VRAM for a suggested price of $499.

The modding community has recently come up with interesting projects that show Nvidia could have easily equipped some RTX 30 series cards with more memory. In the case of the RTX 3070, which normally comes with 8 gigabytes of VRAM, playing newer AAA titles with Ultra settings and high-resolution textures is an exercise in frustration. That said, an experiment around doubling the VRAM capacity showed the card is certainly capable of respectable frame rates even as it inches towards its third anniversary.

A more recent project reveals a similar alteration for the RTX 3070. YouTuber Paulo Gomes teamed up with Casual Gamers to create what is likely the only Nvidia card with a physical switch for memory capacity. The switch makes no sense for gamers who simply want the higher VRAM capacity, but the mod itself is an idea that could prove useful to reviewers looking to limit-test a graphics card. It would certainly be less expensive than, say, testing the 16GB RTX A4000 against the 8GB RTX 3070 like our own Steven Walton did earlier this year.

Otherwise, the new mod once again confirms that an RTX 3070 with 16 gigabytes of GDDR6 is perfectly capable of running resource-intensive games like The Last of Us Part I at 1440p using the Ultra quality preset. As the graph above shows, the card achieves a slightly higher average frame rate as well as significantly better 1 percent and 0.1 percent lows.

Nvidia might never officially release a 16-gigabyte RTX 3070, but that won't stop some custom board vendors from trying. An interesting tidbit from Computex this year is that a small Taiwanese company called Gxore seems to have done just that.

That said, Gxore isn't listed as an official Nvidia partner, so this is likely a low-volume product targeting Chinese digital signage providers. A dead giveaway is that this unusual RTX 3070 model has no fewer than eight mini-DisplayPort outputs, which should come in handy for driving video walls.

H/T: Paulo Gomes | Casual Gamers


 
You know, it has occurred to me that it might be cheaper to send a card off to a modder to add VRAM than to upgrade to a faster card, especially if Nvidia is going to keep pulling this "midrange cards always get 8 gigs of VRAM" routine. You could get better 4K performance out of an older card with a larger memory bus than things like the 4060 Ti. I don't think the 4060 Ti 16GB will offer any better performance except in games where VRAM is already a limiting factor. Why pay $500 for a 4060 Ti when you could pay someone $100-150 to put 16 gigs on your 3070?
 
But, but, but, with this, Nvidia can no longer milk their sheep. ;)

This reminds me of back in the old days, there was a similar hardware mod for, I think, GeForce 2 cards. Move two 0-ohm resistors on the card to a different location, and Voila - Instant Quadro. I did it, and it worked for a while until my card died about a year or more later. That was a pretty simple mod, though. I cannot imagine easily doing this mod since it sounds like you would have to physically remove the VRAM and replace it. Surface mount electronics is great - if you have the capability to do it, but with SM ICs of that size, it can get pretty tricky. Someone patient enough might be able to do it with some solder-wick - but all I can say is potential modders beware.
 
But, but, but, with this, Nvidia can no longer milk their sheep. ;)

This reminds me of back in the old days, there was a similar hardware mod for, I think, GeForce 2 cards. Move two 0-ohm resistors on the card to a different location, and Voila - Instant Quadro. I did it, and it worked for a while until my card died about a year or more later. That was a pretty simple mod, though. I cannot imagine easily doing this mod since it sounds like you would have to physically remove the VRAM and replace it. Surface mount electronics is great - if you have the capability to do it, but with SM ICs of that size, it can get pretty tricky. Someone patient enough might be able to do it with some solder-wick - but all I can say is potential modders beware.
For the cost, it'd probably be cheaper and safer to send it off to an electronics repair shop rather than buy a heat gun to remove surface mount components. Someone should email Louis Rossmann about this.
 
For the cost, it'd probably be cheaper and safer to send it off to an electronics repair shop rather than buy a heat gun to remove surface mount components. Someone should email Louis Rossmann about this.
Safer, no doubt. For those of us who have HASL stations, maybe not cheaper. Honestly, though I am arguably very close to being equipped, I just want a card that works and have more important things to do, so I'm not inclined to give it a try.
 
This reminds me of back in the old days, there was a similar hardware mod for, I think, GeForce 2 cards. Move two 0-ohm resistors on the card to a different location, and Voila - Instant Quadro. I did it, and it worked for a while until my card died about a year or more later.

And that mod didn't have anything to do with your card dying, right?
 
This mod makes no sense, other than as an interesting science project. All you need for "reviewers" is a driver that allows you to cap VRAM use. Then you can play around with any and all values while testing.
 
This mod makes no sense, other than as an interesting science project. All you need for "reviewers" is a driver that allows you to cap VRAM use. Then you can play around with any and all values while testing.
You write a lot of video drivers in your spare time?

Because we have different definitions of easy.
 
A few years ago I saw a few places that offered "storage upgrades" for iPhones, you know, back when 16GB iPhones were a thing. I mean, theoretically, if those people could do that, they could also mod these graphics cards. The problem now is where to source the VRAM chips for the modding.
 
A few years ago I saw a few places that offered "storage upgrades" for iPhones, you know, back when 16GB iPhones were a thing. I mean, theoretically, if those people could do that, they could also mod these graphics cards. The problem now is where to source the VRAM chips for the modding.
If there is a market for it, someone will set up a reselling company to provide that service. 4GB GDDR5 chips go for around $8 each when you can find them online, so $32 for 16 gigs and maybe $100 for an electronics shop to do the mod. I'd say $150 for changing a card from 8 gigs to 16 gigs is reasonable.
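Just to sanity-check the figures in the comment above (the per-module price and labor cost are the commenter's own estimates, not real quotes, and an actual 3070 mod swaps GDDR6 modules, so real part prices would differ):

```python
# Back-of-the-envelope cost for an 8 GB -> 16 GB VRAM mod,
# using the commenter's estimates: ~$8 per 4 GB module, ~$100 shop labor.
def mod_cost(target_gb=16, module_gb=4, price_per_module=8, labor=100):
    modules = target_gb // module_gb    # how many memory modules to buy
    parts = modules * price_per_module  # total parts cost
    return parts + labor                # parts plus shop labor

print(mod_cost())  # 4 modules * $8 + $100 labor = $132
```

So even with a margin on top of $132, a $150 all-in price is in the right ballpark.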
 
The simple fact that Nvidia is not pushing 12-24GB cards is stupid - future games will all require it. For them it's about money, not the customer. Look at AMD: for them it's about being able to play games in the future. Why Nvidia has so much of the market is beyond me. Intel and AMD need to become more competitive so that the customer becomes the No. 1 priority.
 
You write a lot of video drivers in your spare time?

Because we have different definitions of easy.
I don't code any more. But there are plenty of people who do. I'm suggesting that the manufacturer could write such a driver if they wanted to. In other words, writing a driver is far easier than modding hardware. How good are you with a soldering iron?

And it seems someone has already thought about this.
https://www.phoronix.com/news/Mesa-20.3-override_vram_size
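For what it's worth, the Mesa option linked above is normally set through driconf rather than a driver rebuild. A minimal sketch of a user-level `~/.drirc` that caps the VRAM size reported to applications (assuming the `override_vram_size` key takes a value in MiB, and noting this applies to Mesa's open-source drivers such as radeonsi, not Nvidia's proprietary driver):

```xml
<!-- ~/.drirc: ask Mesa to advertise only 4 GiB of VRAM to applications -->
<driconf>
  <device driver="radeonsi">
    <application name="all">
      <!-- value is in MiB; delete the file to restore the real size -->
      <option name="override_vram_size" value="4096"/>
    </application>
  </device>
</driconf>
```

That would let a reviewer benchmark the same card at several apparent memory capacities without touching a soldering iron.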
 
And that mod didn't have anything to do with your card dying, right?
If it did, I would have thought that it would have died very soon after I did the mod. However, it was at least a year before the card died, so I highly doubt it did.
 
I don't code any more. But there are plenty of people who do. I'm suggesting that the manufacturer could write such a driver if they wanted to. In other words, writing a driver is far easier than modding hardware. How good are you with a soldering iron?
As I mentioned, this is surface mount stuff. It's not the old days, where using a soldering iron would have been much easier. Given that it's surface mount, desoldering is nowhere near as easy. Even though I have a hot air solder station, I would not attempt it because the desoldering equipment is much more specialized. With hot air, you would have to heat all the pads and pins at the same time, which is difficult to do with a large chip. Hot air soldering is very easy even with large chips, but generally a specialized desoldering tool is needed to desolder them. Such tools have heated bars that heat all the leads on all sides of the chip at the same time and grasp the chip to pull it free - and I have no idea what you would use for BGAs.
And it seems someone has already thought about this.
https://www.phoronix.com/news/Mesa-20.3-override_vram_size
Your point about drivers is very valid, IMO. There's no way that Nvidia or any chip manufacturer would produce different GPU chips that correspond to different memory sizes - it is simply too expensive. It has to be built into the chip and controlled by the instruction set of the chip. However, the larger memory has to be available on the board - obviously.
The simple Fact that Nvidia is not pushing 12-24GB cards is stupid, the future games will all require it, for them its about money, not the customer, look at AMD, for them its about being able to play games in the future, why Nvidia has so much is stupid, Intel and AMD need to become more competitive so that the customer becomes the No1 priority.
Nvidia has created a brand/performance mystique, and they know there are those out there who cannot resist having their latest hardware. Though I won't mention any names as he hasn't been around in quite a while, we used to have a TS participant who was, IMO, that type. People like that drive the market. If he were around, IMO, he would have a 4090, or perhaps two or three, and would be bragging about it even though it's been universally panned as a card that is not worth the money. IMO, this all started back in the Titan days.

Finally, neither AMD nor any other GPU manufacturer has been giving Nvidia any real competition for a long time. And if they were, you can bet their prices would be just as astronomical as Nvidia's are now, and I also speculate that they would adopt business practices similar to Nvidia's because Nvidia has said "This is the Way" and has literally paved the way for such business practices.

IMO, there needs to be real competition in the market place unless consumers keep the current marketplace pressure up and refuse to buy the outrageously expensive crap the manufacturers are producing - just like what is going on now in both the CPU and GPU markets.
 