Nvidia might have killed off the RTX 3080 12GB

midian182

Rumor mill: Only months after releasing it in January this year, Nvidia is rumored to have ceased production of the RTX 3080 12GB GPU. The move comes as the card's price falls to that of the 10GB variant while the more powerful RTX 3080 Ti gets a $200 price cut.

Twitter user @Zed_Wang (via PC Gamer) tweeted that Nvidia has decided to stop manufacturing the RTX 3080 12GB now that graphics card prices are falling and it needs to clear stock. A quick look on Newegg shows several of these 12GB variants priced the same as or, in some cases, less than the 10GB model.

It seems Nvidia is getting rid of the card with the extra VRAM and wider bus to shift the remaining excess of RTX 3080 10GB cards before its RTX 4000 (Ada Lovelace) series lands later this year.

Another factor in Nvidia's decision is the RTX 3080 Ti seeing its MSRP drop by $200 (the cheapest on Newegg is $1,089), which leaves less room, price-wise, for the RTX 3080 12GB to sit between the Ti and 10GB models. From Nvidia's point of view, it also means consumers might be tempted to pay extra and buy an RTX 3080 Ti instead of an RTX 3080 10GB when there's no 12GB variant.

We weren't overly impressed with the RTX 3080 12GB. It was given a score of 75 in our review, where we noted that it allowed Nvidia to significantly boost profit margins on silicon that would have otherwise been sold as the original 10GB model. It also allowed board partners and/or distributors to cash in on the incredible demand the market was experiencing at the time. But finding one at the same price as the RTX 3080 10GB makes the newer card a much more appealing prospect, so you might want to grab one while you can.


 
I assume the price drop on the 3080 Ti is for AIB cards only, because I'm not seeing any price drop on the FE models, which are in stock and have been for weeks now on the Nvidia store.
 
Fairly wide implications here: I know this might seem counterintuitive given the LHR on those cards, but we know a bypass for it got out not too long ago. That means it's something Nvidia knows how to defeat, so while the cards intended for consumers (even if they're runs of the exact same model) were meant to stay constrained by LHR, we can now more reasonably guess that Nvidia knew AIB partners were selling directly to larger mining operations.

So while they probably didn't intend for the average Joe to buy one to five separate 3080 12GB cards, hence the hash limiter, they probably absolutely *DID* intend for AIB partners to sell 200-card pallets of the exact same product to a larger mining operation, with the AIB partner saying, "Don't worry, we can unlock the limiter for you, but you have to sign this NDA and nobody can know Nvidia told us how to bypass it, ok?"

I know, it seems fairly unlikely, but let's get back to the news item: they decided gamers don't need the 12GB model and that 10GB is enough. That's true, but it has been true since launch, and there was never a pressing need to release a 12GB card; that demand didn't come from gamers, it came from miners.

This means that now that Eth has been dancing around the $1,000 mark for a while, versus close to $5,000 just seven months or so ago, Nvidia and their AIB partners have talked to enough miners to know it isn't going to recover fast enough to justify making the 12GB version, which we know can be easily bypassed. So it's safe to halt production and simply ramp it back up in later months if Eth starts creeping back up in profitability and miner demand justifies putting the 12GB back in stock.

So even if you think I'm assuming too much, that last paragraph is the most plausible explanation for why they're quietly doing this. The implication is that Nvidia knew LHR was easily defeated, to the point that they designed a product around the limiter being disabled when needed, yet enabled when sold to regular gamers.

It confirms that they're extremely dishonest and were catering directly to miners. Nobody would mind if they were honest about it, but it's the lying that gets people angry with them, and for good reason.
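
To put the Eth figures in that reasoning in rough perspective, here is a minimal sketch of why miner demand tracks the coin price; it assumes a roughly constant network difficulty and uses the approximate prices cited in the comment above, not market data.

```python
# Minimal sketch: at a roughly fixed network difficulty, mining revenue
# scales about linearly with the coin price, so a card's earning power
# falls in step with Eth. Prices below are the rough figures from the
# comment above, not market data.

eth_price_then = 5000   # approx. price ~7 months earlier (USD)
eth_price_now = 1000    # approx. recent price cited above (USD)

revenue_drop = 1 - eth_price_now / eth_price_then
print(f"The same hashrate now earns roughly {revenue_drop:.0%} less")
# => ~80% less, before electricity costs, which is why demand for a
#    mining-oriented SKU would dry up quickly.
```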
 
The 12GB card was silly. Frankly, I'm far more interested in a new display standard that supports 8K/120 than I am in any current graphics card.

I game on a 65" 4K TV and it only does 60Hz, which is fine, but I don't want "just fine." However, I don't want to drop ~$10,000 on a new TV if it isn't future-proof. As soon as a display standard is released that supports 8K120, I will do a "money is no object" build. Until then, I don't really see a point in upgrading my system or display, as all the games I play run at 4K60 just fine.

Some of my parts are getting dated and I'd like to do a minor upgrade, but the current graphics card market is silly even though we are getting close to MSRP. MSRP for the performance we're getting is stupid. I mostly play EVE and ESO, with frequent replays of Fallout 3, New Vegas, and Oblivion (Skyrim sucks, fight me).

The last new game I played was Cyberpunk and, yes, my system dogged on that, but I think it was more my CPU than my GPU because dropping the resolution below 1080p didn't get me any notable FPS improvement. I'm excited for Starfield and all the Bethesda bugs that are likely to come with it.

I just don't feel that, even at MSRP, the value is there. And it's not that I'm unwilling to spend money; I'm unwilling to spend the money when specific features just don't exist yet. If we're going to make these super-powerful graphics cards, why don't we have display standards that can carry the full raster performance? I know ray tracing is pretty, but I've always felt it was more fluff than anything, just something to eat up performance to sell more graphics cards.

Give me a card and TV that can do 8K120 and I'll trade my wife's car in for it.
 
Give me a card and TV that can do 8K120 and I'll trade my wife's car in for it.


just make sure she approves of that car sale lol
 
just make sure she approves of that car sale lol
It's my name on the title and I paid for it, so it is technically my car. My weekend canyon carver turned into her daily driver >.>
I have a Honda CR-V that I haul all my tools around in and beat on for work. My Acura TLX Type-S was my reward to myself. She didn't even drive before I got that car; she literally got her license three months after I got it. She took her driver's test in that car.
 
The 12GB card was silly. Frankly, I'm far more interested in a new display standard that supports 8K/120 than I am in any current graphics card.
Only now are we seeing cards capable of 4K@60fps at kind of affordable prices, so I doubt we'll see 8K@120fps anytime soon; maybe in three or more graphics card generations. Meanwhile, you could just upgrade to an OLED TV with 120Hz and VRR capabilities.
 
10GB vs. 12GB: honestly, you're not really going to see a difference in any game just because of slightly more VRAM. Between the two models, the 12GB was a whole 5% faster, but at the time it came out it cost upwards of $400 more than the 10GB version.

I've noticed at my local Micro Center the 10GB and 12GB models are all still priced at $1k+.

As much as I hated the 12GB model, it was an ingenious move by Nvidia. They listened to all the crybabies complaining about how 10GB isn't enough on the 3080 and how 12GB should have been the minimum... so they pumped out a 12GB card that's slightly faster than the 10GB and priced the crap out of it.

Then you had stupid people gobbling them up to satisfy their idea that the 12GB would open up a whole new world and make things magically better.
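
For a rough sense of the value argument above, here's a quick back-of-the-envelope price/performance check. The ~5% uplift and ~$400 premium come from the comments; the $1,000 baseline price is an illustrative placeholder, not a quoted price.

```python
# Quick illustration of the value math above: a card that is ~5% faster
# but ~$400 more expensive. The baseline price is an assumed placeholder.

base_price = 1000        # assumed street price of the 10GB model at the time
premium = 400            # extra cost of the 12GB model, per the comment
perf_uplift = 0.05       # ~5% faster, per the comment

price_increase = premium / base_price
perf_per_dollar = (1 + perf_uplift) / (1 + price_increase)
print(f"Price up {price_increase:.0%}, performance up {perf_uplift:.0%}")
print(f"Performance per dollar vs. the 10GB card: {perf_per_dollar:.2f}x")
# => ~0.75x: you pay ~40% more for ~5% more performance.
```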
 
Why not just cut the price of the 10GB RTX 3080 to the original $699 MSRP? They probably wouldn't lose money there.
 
Lol. They can sell whatever cards they want as long as they have cards in the usual price brackets at a fair market price. Higher profits are how you get better future products; it's how you grow a business.

AMD will do it too, any chance they get. Any company would, and does. Money is king, not your emotions. Shareholders have more influence than consumers. This is not news. Get over it.
 
Only now are we seeing cards capable of 4K@60fps at kind of affordable prices, so I doubt we'll see 8K@120fps anytime soon; maybe in three or more graphics card generations. Meanwhile, you could just upgrade to an OLED TV with 120Hz and VRR capabilities.
You can get an 8K TV with a 120Hz panel, but there isn't a display connector that carries an 8K120 signal, which is the issue. I REALLY want to upgrade my TV, but I'm holding off until HDMI has an 8K120 standard. I'm currently looking at the 75" Samsung QN900B, and the only reason I haven't pulled the trigger on it is that I want to have it for close to a decade, and we are SOOOO CLOSE to an 8K120 standard that it actually makes me mad. Keep in mind, I use these TVs as computer monitors; my home office is a hybrid home theater. Now, we're talking about a decent bit of money here. I can justify that cost because I can spread it over 10 years. I make big purchases here and there, but when I do, I expect that big purchase to pay for itself over several years.

If the QN900B accepted an 8K120Hz input, it would be on my wall right now.
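
For context on why the connector is the sticking point, here is a rough estimate of the raw bandwidth an uncompressed 8K120 signal would need compared with HDMI 2.1's 48 Gbps link. The 10-bit color depth and ~20% blanking overhead are illustrative assumptions rather than official spec math.

```python
# Rough estimate of the raw bandwidth an uncompressed 8K @ 120 Hz signal
# would need, versus HDMI 2.1's 48 Gbps link. Bit depth and blanking
# overhead below are illustrative assumptions, not spec values.

width, height, refresh_hz = 7680, 4320, 120
bits_per_pixel = 3 * 10          # RGB, 10 bits per channel (assumed)
overhead = 1.20                  # assumed blanking/encoding overhead

required_gbps = width * height * refresh_hz * bits_per_pixel * overhead / 1e9
print(f"~{required_gbps:.0f} Gbps needed vs. 48 Gbps on HDMI 2.1")
# => well over 100 Gbps uncompressed, which is why 8K120 today relies on
#    compression (DSC) or chroma subsampling rather than a raw signal.
```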
 
This company is always thinking more about its profits than about the happiness of gamers. Killjoy!

lol you joking?

These are publicly traded companies.

Shareholders' and executives' bonuses come before consumers.

They never cared about the happiness of gamers; they just want your money.
 
Probably the biggest rip-off Nvidia has done. At best it was worth a $100 premium over the 10GB model, and they wanted $500. Arrogance at its finest. They should have just made a 20GB version of the regular 3080 with GDDR6X for $1,099.
 
The fact is that nobody really asked for an RTX 3080 12GB or an RTX 3080 Ti, because these "in between" cards clearly don't make a huge difference in performance compared to the RTX 3080 10GB. But I suspect Nvidia did not want to officially increase MSRP, so they happily released these two models as a marginal step up from the RTX 3080 10GB. What some people were anticipating was a 20GB model, which Nvidia chose not to deliver.

The narrative is that GDDR6X was in short supply, but that was only true at the start. By the time the RTX 3080 Ti and 3080 12GB were released, GDDR6X was also being used on the 3070 Ti. If it was in short supply, why add another product that uses it? So I believe these cards were meant for the desperate miners who would soak up any card that runs faster thanks to the extra memory bandwidth, particularly the 3080 12GB. That also explains why Nvidia may kill the 12GB version now.

In fact, with next-gen cards expected in the later part of this year, I think it also makes sense for Nvidia to reduce or stop high-end Ampere production, since such complex chips are not cheap to make. The same goes for AMD. The mid and lower ranges of cards may still exist for a while, until they get the next-gen treatment, likely early next year.
 
After everything that has happened, I don't see this going anywhere, especially with used GPU prices having fallen like a tractor-trailer going over a cliff. :laughing:
 