Nvidia doesn't need us anymore: How is the GeForce RTX 3080 12GB launching to zero reviews?

You should expand upon your "And then you realize the 6900 XT is 2 generations behind" comment.

1) Do you mean in terms of relative performance in just CP2077 or overall?
2) Do you mean in terms of VRAM?
3) Do you mean because of proprietary software such as DLSS?
4) Do you mean because of RTRT capabilities?

Feel free to expand from there and list other thoughts about why the 6900 XT is 2 generations behind. I'm curious as to why you think that, and where you'd get your supporting info from.
It's simple. The new Ultra setting is RT + DLSS Quality. I don't have to expand on the RT part, but regarding DLSS, it improves image quality when games use TAA, especially poor implementations of it.

So the 6900 XT is a high-end card that... can't play at high-end settings.

And don't get me started on FSR. God almighty, have you tried that thing? The amount of sharpening makes me want to vomit after 15 minutes.
 
After all this abuse, the sad part is gamers are still willing to buy whatever they can get their hands on even with ridiculous prices and are still having bitter debates on which company is "better"...
 
It pains me when I read people who are like "Nvidia is doing a great job of maximizing profits, that's great news, good for them, they're so smart!". Is that some sort of a fetish? Do they pay girls online to make fun of them as well?
 
The only thing Nvidia has been good for since 2017 is mining; the cost of everything else is too much. Buy a card, play games when you want, mine when you're not, and never pay again for an upgrade. That is all they are good for... they're too full of themselves.
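For what it's worth, here's a rough back-of-the-envelope sketch of that "mine when you're not" math (Python, with hypothetical placeholder numbers; actual rates swing wildly with coin prices and your power bill):

```python
# Break-even sketch for "buy a card, game when you want, mine when you're not".
# Every number below is a hypothetical placeholder, not a real market rate.

card_price = 1200.00       # USD paid for the GPU
revenue_per_hour = 0.25    # USD/hour of mining revenue (varies with coin price)
power_draw_kw = 0.30       # GPU power draw while mining, in kilowatts
electricity_rate = 0.12    # USD per kWh
mining_hours_per_day = 16  # hours per day the card mines instead of gaming

hourly_profit = revenue_per_hour - power_draw_kw * electricity_rate
daily_profit = hourly_profit * mining_hours_per_day

if daily_profit > 0:
    print(f"Card pays for itself in ~{card_price / daily_profit:.0f} days")
else:
    print("Mining loses money at these rates")
```

With these made-up numbers the card pays for itself in about a year; the whole argument lives or dies on the revenue-per-hour figure.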
Remember when AMD released the Vega series during the previous mining boom, promising they were trying to get the new cards into the hands of gamers, while the very first driver update was solely about improving mining? Don't think for a moment that NVIDIA is alone in trying to maximize profits.
 
The TUF 3080 12 GB will be available on Newegg Shuffle today for $1499!!! Because Nvidia did not set an MSRP or bother to release a FE of the card, there is not even a lower launch price for these cards. The TUF 3080 Ti is $1999.
 
It's funny, you know. I had been saying that the RTX 3080 would be a short-lived card because Far Cry 6 requires a minimum of 11GB of VRAM to use the HD textures. I can't use them because the RX 5700 XT only has 8GB, but I spent less than $500 (before tax) on it.

I have to wonder how all of the people who spent >$1,000 on the 10GB model feel right now. Probably like all the people who spent a crap-tonne of money on the RTX 2080 Ti felt after Ampere launched.

I swear, nVidia keeps screwing consumers and they keep coming back like battered spouses. What nVidia is showing us is the low opinion that THEY have of their own customers. When nVidia does something like this, they're basically saying "Our customers are dumb enough to take whatever abuse we throw at them and they keep coming back so let's soak them for every cent that they have!" which they've proven time and again doesn't hurt them one bit.

It's obvious that nVidia believes that their customers are robots who will pay ANY amount to get a video card in a green box. History has shown, very clearly, that they've never been wrong about this. Even now, with the offerings from ATi being essentially equal, people are willing to spend more for an RTX 3080 than an RX 6900 XT, despite the 6GB VRAM deficiency. That's just insane.
I have owned a 3080 since around the launch window; it is a great GPU. From Tom's Hardware regarding Far Cry 6: "Far Cry 6 says you need at least 11GB VRAM for 4K ultra with DXR, and it's mostly correct, though 10GB apparently will suffice."

With 4K Ultra textures and DXR, the RTX 3080 scales about the same as it does across the board in this game: better than the RX 6800 XT but slightly behind the RX 6900 XT, even though it has a better minimum frame rate than both, especially the 6800 XT, which falls into the low 40s while the 3080 maintains a minimum in the 50s. Yes, it's likely that some upcoming games will need more than 10GB, but right now 10GB is just fine. I play at 1440p, so I'm good for a while.
 
Actually, FC6 dynamically adjusts textures according to your VRAM, so you may think you are playing with ultra textures when you are not. The problem occurs even on a 3080 Ti, and it's even worse on a 3080.
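For anyone wondering how that happens, here's a generic sketch of VRAM-driven texture streaming (a common engine heuristic, not Ubisoft's actual code): the engine compares the requested texture set against its VRAM budget and silently drops mip levels until it fits, so the menu still says "Ultra" while lower-resolution textures are actually in use.

```python
# Generic sketch of VRAM-driven texture streaming (not Far Cry 6's actual code).

def pick_mip_bias(texture_set_gb: float, vram_gb: float, reserved_gb: float = 2.0) -> int:
    """Return how many mip levels to drop so the texture set fits in VRAM."""
    budget = vram_gb - reserved_gb  # VRAM left after framebuffers, geometry, etc.
    bias = 0
    needed = texture_set_gb
    while needed > budget and bias < 4:
        needed /= 4  # dropping one mip level quarters the texture memory
        bias += 1
    return bias

# With these assumed numbers, an 11GB "Ultra" texture set quietly degrades
# on 10GB and 12GB cards while a 16GB card runs it untouched:
for vram in (10, 12, 16):
    print(f"{vram}GB card -> mip bias {pick_mip_bias(11.0, vram)}")
```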
 
I love how so many people are tied up with how FC6 plays when you want to completely max it out at 4K with the high-def texture pack and you don't have a GPU with 12GB or more.

Wow. It's impressive how folks will just cling to one little thing and make that their driving point as to why any other GPU is just a gawd awful piece of crap.

The fault may not lie with the GPUs that have less than 12GB; perhaps it lies with Ubisoft, and how poorly they've built the game to function with the high-res textures?
 
Thank you for this article and for bringing light to the situation. Most gamers have just given up on the 3xxx series. The reality is, until Ethereum and other cryptocurrencies stop being profitable to mine on GPUs, we might never see a return to MSRP GPU pricing, or even the ability to get a midrange GPU.
 
Nvidia and AMD obviously don't need retail consumers, but if or when cryptocurrency crashes again and millions of their used graphics cards sell for 80% off on Amazon and eBay, I'm sure they will be extremely ticked off that they lost their grip on gamers and others who won't buy a new GPU for years when they can get used RTX 3090 and RX 6900 XT graphics cards for less than $300.
 
Glad you mentioned that.

This is the real reason why neither Nvidia nor AMD will increase production to satisfy the world's demand.

They know that the bubble can burst at any moment, and they can't simply cancel future orders on a whim, yet used GPUs can flood the market in the blink of an eye.

As much as I hate corporations, I do understand why they are doing this.
 
A duopoly is not a free market. Let's discuss free markets when the US govt. stops banning competing products.
The problem with capitalism is that it's a competition and all competitions result in winners and losers. A perfect analogy of the capitalistic market was the Highlander franchise. Two immortals would meet, be compelled to swordfight until one lost their head. The winner would absorb the loser and move on. That's exactly what corporations do in the "free market" of capitalism. The saying was "There can be only one!" and eventually, that's what there was.

In the "free market" of capitalism, eventually you have one, two or three massive corporations left in an industry and instead of bending to the will of the market, they're powerful enough to now bend the market to their will. The computer industry is a perfect case in point:

GPU/Video Card Manufacturers in the 90s:
3dfx, Array Technologies inc. (ATi), CirrusLogic, Diamond, Intel, Matrox, nVidia, Oak Technologies, Orchid, S3, Silicon Integrated Systems (SiS)

GPU/Video Card Manufacturers now:
ATi Technologies, Intel, nVidia

Hard Drive Manufacturers in the 90s:
Fujitsu, IBM, Kalok, Maxtor, Micropolis, Seagate, Western Digital

Hard Drive Manufacturers now:
Hitachi, Seagate, Western Digital, Samsung

'Nuff Sed.
 
GPU/Video Card Manufacturers in the 90s:
3dfx, Array Technologies inc. (ATi), CirrusLogic, Diamond, Intel, Matrox, nVidia, Oak Technologies, Orchid, S3, Silicon Integrated Systems (SiS)

Imagination (PowerVR) should be there, and also, I think that some of those companies were OEMs that sold GPUs based on someone else's chips.

GPU/Video Card Manufacturers now:
ATi Technologies, Intel, nVidia

If only Imagination decided to rejoin the desktop market.

Hard Drive Manufacturers now:
Hitachi, Seagate, Western Digital, Samsung

Isn't Hitachi part of WD?
 
I agree. It hasn't been a "free" market in a long time; it's a market dominated by overly-huge multinational corporations. Venture capitalism is impossible at this point. Beware those who defend Neoliberalism, because Neoliberalism is only "good" for those who are already wealthy. I write "good" in quotes because, more often than not, the profit increase doesn't really benefit them one bit; there's no way that your lifestyle really changes once you have $20,000,000 in the bank simply by accumulating more.
A reply replete with wisdom. (y) (Y)
Keynesian Economics is what we used to have. It's a far more effective way to implement capitalism than Neoliberalism ever was. It's what the Nordic countries still use and look at how great they are. The only metric in which they don't score highly is corruption. We could learn so much from them.
Sounds like a great metric not to score high in, at least in my little mind. ;)
I agree with you but it makes me wonder why nVidia scrapped Quadro.

Yeah, I was completely floored when they said that it was better to buy the RTX 3080 (starting at $1,630) with 10GB than the RX 6800 XT (starting at $1,500) with 16GB. I just looked at those and could easily tell that the life of the RTX 3080 was severely limited by its 10GB framebuffer.
Buying what you think is best for you rather than buying what you think will make you look better than everyone else is a great tactic. At least, you end up satisfied with what you have.
Narcissists always think that just because they like something that it's "obviously" the best, eh? Narcissists also try to impress people by telling them that they have two RTX 3090s. :laughing:
In some respects, I feel sorry for them because when the next latest, greatest model comes along, there's something compelling them to get that, otherwise, they won't be satisfied. And that, not ever being satisfied, must be difficult.
I can't agree. The difference between nVidia and Compaq is that Compaq was surrounded by competitors like the aforementioned Hewlett-Packard, Acer, Toshiba, Lenovo, Packard-Bell, Dell, Everex and Gateway. The only competitor to nVidia is ATi (and, to some marginal degree, Intel), and when there are only two, no matter how badly one behaves, there are still fools who will only buy their products. Look at how successful Apple is no matter how many d!ck moves they pull.
The situation is different, I agree. However, my bet is that consumers get fed up with the market situation and find alternative means, such as buying from the used market or, as you did, simply buying what is best for them, whether that comes from the used market or a competing brand (even if it is not the "best" in the eyes of reviewers). Either way, it takes business away from what manufacturers are shoving down the throat of the marketplace, and ultimately, it takes power away from the manufacturers.

People got fed up with paying "Compaq" prices and found alternatives wherever they could, and thus, "Compaq" is now a ghost of Christmas Past. ;)
Sounds a lot like General Motors. :laughing:
Good guess, but it is not. I prefer not to say who it is because I would prefer to remain as anonymous as possible. :cold_sweat:
Yup, they've become so powerful that instead of being forced to bend to the will of the market, they're able to bend the market to their will. That completely defeats the purpose of free enterprise to begin with. They also snuff out any attempts by upstarts so there's no more Venture Capitalism either.
Yes, and yet somehow, we have those that see this situation as the "Free Market". :rolleyes:
I know how it operates and I know that it's completely different from how it's supposed to operate.
Or at least how we would like the market to operate.
Cut and paste: The lazy man's way to "forum". :laughing:

You're probably giving too much credit there. You used the word "think". :laughing:
No comment. My post would probably get pulled - again. ;)
What would we ever do without you Cap? 🤣
TS would be completely different without him! We are endlessly enriched by his insight. :D
 
After all this abuse, the sad part is gamers are still willing to buy whatever they can get their hands on even with ridiculous prices and are still having bitter debates on which company is "better"...
Well, at least some gamers anyway. While I do not consider myself a hard-core gamer, I am holding off two builds until prices return to some semblance of reasonability.
 
There's something of an ocean of a comment section here, but I'll drop my 2-cent needle into the haystack anyway: would it be feasible to slowly upload a review test-by-test as it's made, instead of waiting a few days for the full thing?
 
First I'm hearing of this. And I'm suspicious, because that pack is available for the consoles, neither of which have (practically usable) 11GB VRAM.
Suspicious or not, it's nevertheless true. When grabbing evidence, I came across something confusing though. I was sure that the game said 11GB was the minimum when I was downloading it but Ubisoft themselves have stated that a 12GB frame buffer is the minimum:
"Please note that the minimum requirement for running the HD Texture Pack for Far Cry 6 is 12GB of VRAM. For 4K configurations, you will need 16GB of VRAM. If you download and run the HD Texture Pack with lower VRAM capabilities, you will encounter performance issues while playing."
Far Cry 6 HD Texture Pack Info
And even just looking at PC, why would a developer choose a texture budget that is just over available VRAM on a large chunk of the (still overall tiny) installed GPU base that could use more demanding textures? Could it be the developer has a deal with AMD?
I seriously doubt it because games like this take way more than 1¼ years to make. When this decision was made by Ubisoft, it's possible that AMD was aware of it while nVidia wasn't, so AMD made sure that the RX 6800 and up would be able to use them. There's no way that anyone could have foreseen that nVidia would gimp their card with only 10GB of VRAM. If AMD had known that, I'm sure they wouldn't have put 16GB in the RX 6800 XT when 12GB would have been enough to use the HD textures, have more than the RTX 3080, and save money.
I've got a 3080 and I'm not going to sweat this one. I'm pretty sure I'll be able to dial in a good looking, high performing version to my tastes anyway if I ever decide to play this game.
Oh, there's no doubt. I'm just saying that, to me, if I'm spending 4 figures on a goddamn video card, it had better be able to max out every game on the market at the time and for the foreseeable future, especially when newegg has it priced higher than a card that outperforms it (RX 6900 XT). I have no doubt that you're happy with the card; why wouldn't you be? The question is, how long will you be happy with that card considering what you paid, and would you have been happier for longer if your card had an extra 6GB of VRAM on it?

I am of course assuming that you aren't one of the lucky ones who managed to get one for the real MSRP at the beginning of Q4 of 2020.
 
There's something of an ocean of a comment section here, but I'll drop my 2-cent needle into the haystack anyway: would it be feasible to slowly upload a review test-by-test as it's made, instead of waiting a few days for the full thing?
That's a good point. Another question could be "Why don't the drivers for the existing RTX 3080 10GB or RTX 3090 work with the RTX 3080 12GB?"

That question is the one that I've been trying to answer, because if they're both RTX 3000 cards, the same drivers should work. Mind you, since I don't use nVidia, I don't know how GeForce drivers work. Radeon's Crimson packages cover several cards; in fact, even though the R9 Fury got moved to legacy last June, I ran one of my R9 Furies for a couple of weeks on RX 5700 XT drivers that weren't supposed to be for it, and there was no problem. I only found out when I went to tweak something in the Crimson settings and saw the message that "this driver is not compatible with this card," and I thought to myself, "Well, Far Cry 6 doesn't seem to agree." :laughing:
 
I have owned a 3080 since around the launch window; it is a great GPU. From Tom's Hardware regarding Far Cry 6: "Far Cry 6 says you need at least 11GB VRAM for 4K ultra with DXR, and it's mostly correct, though 10GB apparently will suffice."
I'll stop you right there because my post was about the people who are paying exorbitant amounts of money right now. I can't criticise the price you paid if you paid the real MSRP and I never said that it's not a great card because it certainly is.

My whole point was that people seem to be willing to pay more for an RTX 3080 with 10GB of VRAM vs the RX 6800 XT (and even RX 6900 XT) with comparable gaming performance and a mammoth 60% increase in VRAM. If I were forced to pay what people are now paying, I'd be getting the card that I thought would last the longest and that's the card with 16GB. Since you only paid (you lucky devil) the real MSRP for your RTX 3080, you're not who I'm talking about.

Paying what you did is both reasonable and sane. My post was about the fact that newegg is selling the RX 6900 XT 16GB for $1,610USD while the RTX 3080 10GB starts at $1,630. When you bought yours, the RTX 3080 was a good buy, no question. Looking at those prices now though, do you really think that it still is? I sure don't.
 
Imagination (PowerVR) should be there, and also, I think that some of those companies were OEMs that sold GPUs based on someone else's chips.
I didn't even know that they existed back then but sure, they could be added.
Isn't Hitachi part of WD?
I don't think so, but I could be wrong. I do know that Hitachi bought IBM's hard drive division (Deskstar) and sells the drives under the HGST brand now.
 
Got a Sapphire 6800 (non-XT) for $719.99 + tax at launch, sold it for $1,200 three months ago, and picked up a Red Devil 6900 XT Ultimate for $1,599. Sometimes the market works out in your favor.
 