Nvidia GeForce RTX 3090 Review: One Massive Graphics Card

I was lucky to get my 3080, but it was an EVGA XC3.

I placed my order for a 3090 FE, and now I have to wait for it.

 
How the hell does this get a score of 80%?

- Worst power draw ever
- WORST VALUE of this generation
- Heavy and badly designed; could damage the motherboard
- Premium power supply required
- Doesn't fit in a lot of cases
- 12-pin connector gimmick

FERMI 2.0... Steve, you are way too generous. Someone needs to wake up and call out Nvidia's BS!

Nvidia is the first to release cards this generation, so reviews are comparing them to the 20xx series and not whatever AMD may come out with. There's a reason Nvidia rushed these cards to market: comparing Ampere to the wildly overpriced and disappointing Turing essentially guarantees good review scores. This card is the most disappointing of the bunch, but even the 3080 at $700, with only an average performance uplift of 30%, should not have gotten a 90, especially considering the power draw. An 80 or 85 at best. The 3090 should be a 65.

I can't stress enough how manipulative a move launching first is for Nvidia. In addition to the benefits above, it also means that AMD's cards will be compared to the new Ampere cards, even if by that point Nvidia has no stock.

There are no surprises here, right? Minor performance uplift for a significantly higher price. Exactly as the Titan series has always been (and, other than a name change, this is a Titan).

You are missing a critical detail: the Titans offered a minor performance lift over the xx80 Ti model. In this case we are comparing the 3090 to the non-Ti xx80 model.

ex: https://hothardware.com/reviews/nvidia-titan-rtx-review-benchmarks-and-overclocking?page=6

The Titan RTX had a large lead over the 2080, as did every prior Titan from the 10xx and 900 series.

It looks like Nvidia pretty much gave everything it had to the 3080, at least as far as gaming goes. I don't really see the point of a 3080 Ti this time; there simply isn't enough of a performance gap between the 3080 and the Titan-class cards.

Wow, this is a horrible result for Nvidia. This card is more than twice the price of the 3080 and it only offers about 10% more performance? It would be interesting to see if any of these results were due to bottlenecks; on paper this card should be at least 20-30% more powerful. Anyway, with these being 320-watt cards, I am waiting it out (like I have a choice, haha) to see what AMD has to offer against the 3070. I don't want to have to shell out another $125 for a decent 750-watt power supply. I suspect that AMD will not beat the 3080, but will have a card that is highly competitive with the 3070, maybe even one that beats it a little at the same price point.

The bottleneck for these new Nvidia cards may very well be that Nvidia "doubled" the number of CUDA cores by enabling the INT32 datapath to also process FP32. The problem with that is it can only process INT32 or FP32 in a given cycle, not both at the same time, and Nvidia didn't double the supporting resources either, so it's not a true doubling. This is why you see such inconsistent performance on Ampere, nowhere near the level that this many true CUDA cores would bring.
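A rough way to see why the shared datapath can't deliver a true 2x is to model it with toy numbers. This is a simplified sketch, not Nvidia's actual scheduler: assume each partition has one FP32-only path and one path that executes either an FP32 or an INT32 op per cycle (Turing, by contrast, had a dedicated path for each):

```python
# Toy throughput model: why Ampere's "doubled" FP32 rarely means 2x in practice.
# Path A: 1 FP32 op/cycle. Path B: 1 FP32 OR 1 INT32 op/cycle (shared).

def ampere_cycles(n_ops, int_frac):
    """Cycles to retire n_ops with the given INT32 fraction on the shared design."""
    int_ops = int_frac * n_ops
    # All INT work must use path B (C >= int_ops); FP work fills path A plus
    # whatever capacity B has left, so 2*C - int_ops >= fp_ops, i.e. C >= n_ops/2.
    return max(n_ops / 2, int_ops)

def turing_cycles(n_ops, int_frac):
    """Dedicated paths: each op type is bound by its own pipe."""
    return max((1 - int_frac) * n_ops, int_frac * n_ops)

for f in (0.0, 0.3, 0.5):
    speedup = turing_cycles(1000, f) / ampere_cycles(1000, f)
    print(f"INT32 fraction {f:.0%}: speedup over dedicated paths = {speedup:.2f}x")
```

With zero integer work the model gives the full 2.00x, but at a more game-like mix of ~30% INT32 it drops to 1.40x, and at 50/50 the advantage disappears entirely, which lines up with the inconsistent uplift described above.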
 

Calling it Fermi 2.0 is too generous as well. Ampere so far is more like NV30/FX 2.0.
 
Another conclusion that can be drawn from the figures in this review is that there is no place for a Ti variant of the 3080, presumably a model with 20GB of VRAM.
 
Here in Gougeland (New Zealand) I can only see people buying this if they have specific work uses for this GPU, because at $3,500 NZD I can't see many gamers buying it. The 3080 is only half that price, though still expensive at around $1,795 NZD. I'm picking most here will go with the 3070, if not a cheaper Big Navi GPU once they're out.
 
AMD boys are still shocked that their favourite performance-per-watt king is three gens behind now.



If you can't afford something because you're poor, then just don't buy it, and don't insult the company that actually innovates instead of gluing dies together.

Also, stop benchmarking with the 3950X. That CPU is pathetic for gaming: slower than the 3800X, often just on par with the 3600X due to the glued design and its latency penalties, and IT IS A MASSIVE BOTTLENECK at 1080p and 1440p and, depending on the game engine, even at 4K.

You only make those cards look bad.

lol glued design.

Your post sounds like it's straight from 2007. And then you insult people by calling them poor. Maybe they just have more important things to do with $1,500, like spending it on their family or paying bills. We are still in a pandemic, with people losing jobs.

I wouldn't be surprised if the people you are calling poor actually make more money than you. My money says you still live at home with your parents.

By this time next month there will be plenty of options to choose from that will hit different price points.
 
This card is such a rip-off that I can't understand its reason for existence. I do have a rather cynical theory, though (cynical thanks to nVidia's past actions): I think it's a marketing-research social experiment. Just how many people are willing to spend another $700 USD to be able to proudly say "I GOT DA BESTEST!", even if it is ludicrously more expensive for a trivial boost in performance? It's a wealth survey and an IQ test of the consumer base at the same time.

If people actually buy this card for gaming, you can expect another nVidia profit centre card like this one in every generation that leaves us scratching our heads.

The RTX 3090, a card that makes me remember the nineties:
"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should!"
- Jeff Goldblum, Jurassic Park (the good one), 1993
 
After three more days you'll come up with i9 results and draw the same conclusion: "In hindsight, maybe that's something we should have done first..."
 
This card is such a rip-off that I can't understand its reason for existence...[it] makes me remember the nineties: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should!"
- Jeff Goldblum, Jurassic Park
Ah, you believe the 3090 is a technological advance dangerous enough to destroy civilization as we know it? Your hypothesis ignores two points. First, these cards have uses besides gaming (gasp!). The big boost in FP32 performance and the massive 24GB of memory are meant to position the card for content creation, machine learning, etc. In those environments, it's going to provide far more than an "incremental" advantage over the 3080.

Secondly, even among gamers, there are plenty of people who are more than willing to pay double the price for an extra 10% performance. I'm not one of them, but I don't envy and despise them like you seem to. Why not allow people to spend their money how they wish?
 
Would this card perform better when faster CPUs come out, like the upcoming AMD Zen 3 Ryzen 4000 chips due in October? I'm thinking yes. And since this card runs quiet and cool, you don't need a fancy water cooler (that's a big savings right there). I like the idea of binned chips that are more stable than what I'm hearing users complain about on their non-FE RTX 3080 cards. The 24GB of VRAM sounds like overkill until you hear every board partner announcing 20GB cards (probably because AMD's Big Navi is supposed to have 20GB of VRAM). I don't need 8K gaming, but what about two or three 2560x1440 144/165Hz panels? I can see that working well. MS Flight Simulator, anyone? I hate paying the high price, but I think it's going to be worth it, and this could be a card we can use for a number of years.
 
From the reviews I've seen, they got similar margins with Intel CPUs. There is no CPU bottleneck, especially at 4K.

There is: the 10900K gives roughly 10% more all around. I don't know which test you looked at. What I looked at: here, ~160 fps at 1440p; on Intel, ~190 fps. That's not "similar". At 4K, ~100 fps here versus ~120 fps on Intel.
Guru3D: open it up and compare the games tested here and there directly (you'll find most of them). Don't just blindly trust me, or anyone; always check.
 
Ah, you believe the 3090 is a technological advance dangerous enough to destroy civilization as we know it? Your hypothesis ignores two points. First, these cards have uses besides gaming (gasp!). The big boost in FP32 performance and massive 24GB memory is meant to position the card for content creation, machine learning, etc. In this environment, it's going to provide far more than an "incremental" advantage over the 3080.

Secondly, even among gamers, there are plenty of people who are more than willing to pay double the price for an extra 10% performance. I'm not one of them, but I don't envy and despise them like you seem to. Why not allow people to spend their money how they wish?
Well, there are two things:
  1. I was referring to people who would buy it for gaming and I'm sure you know the type.
  2. I was deliberately going a bit overboard for comedic effect.

Of course Titan prosumer customers would buy it for other things. I'm sorry if my sense of humour didn't come through; I admit it is a bit twisted. And of course it's not the end of the world. I'm actually loving every second of it, because it's like watching an old silent movie of the Keystone Cops: a comedy of errors.
 
People saying it's a rip-off miss three fundamental issues.

-The card isn't really marketed to you anyway. nVidia never claimed this as the next gaming king; it is a halo product that doubles as a prosumer card.
-If these sell out on release day, then clearly there are people who don't think it is a rip-off and have more cash than you. The only reason for a gamer to get this card is insecurity, needing the best of the best to feel good. Get over yourself.
-Lastly: it is supposed to be a rip-off, Sherlock.
 
People are only upset when it's AMD with crazy power draws. Remember when people actually used the cost of electricity as a knock against AMD GPUs?

 
To think: we used to get a dual-GPU card for $999 (referring to the GTX 690 and the other dual-GPU cards before it). Sure, this thing has a whopping 24GB of memory, but for the love of god, why are all the reviewers around the web bowing to (add an L after the B and dismiss the "to") Nvidia? Seriously, AMD really needs a winner to put Nvidia back into the realm of reality. Just can't wait for the 3080 Ti. A 3090 Ti? Oh, maybe another Titan? Heck, let's throw in a 3070 Ti as well. My criticism aside, I do almost love what the 3080 offers, but it's still far more expensive than it should be; then again, the same can be said for the auto industry and the cost of a new vehicle.
 
Do people still think that 10GB is not enough ??

Benchmarks prove that you don't need more than that.
The goalposts have shifted; now the argument is "well, it won't be enough for future titles and is therefore terrible."

Which is also pants-on-head ret@rd3d, because by the time those games come out, the 3080 will have been beaten by the 4070 and 5060.

But that won't stop the brigade demanding to be allowed to buy twice the VRAM for another $300.
 
It seems 3080 and 3090 stock will be limited for a long time.
Producing a 628mm², 28-billion-transistor die on a non-EUV 8nm process is definitely a hard reach.
I expect a 3070 Super will use more heavily damaged GA102 dies.
 
The goalposts have shifted; now the argument is "well, it won't be enough for future titles and is therefore terrible."

Which is also pants-on-head ret@rd3d, because by the time those games come out, the 3080 will have been beaten by the 4070 and 5060.

But that won't stop the brigade demanding to be allowed to buy twice the VRAM for another $300.

Whether or not games need the additional VRAM, Nvidia is releasing higher-memory cards regardless. Clearly there is demand for them, and games tend to use what they have available. We have been stagnant on memory capacity for too long, IMO. My 1080 Ti has more VRAM than Nvidia's "flagship" 3080, as they described it.
 
There is: the 10900K gives roughly 10% more all around. I don't know which test you looked at. What I looked at: here, ~160 fps at 1440p; on Intel, ~190 fps. That's not "similar". At 4K, ~100 fps here versus ~120 fps on Intel.
Guru3D: open it up and compare the games tested here and there directly (you'll find most of them). Don't just blindly trust me, or anyone; always check.
Stop looking at absolute values; those don't matter for the comparison we are doing, just the percentages. Absolute values are always all over the place depending on how games are tested.

Even the best and most positive reviews give the 3090 just 15% more FPS than the 3080 at 4K with a 10900K, and that's only because they tested just a few games. If they tested more games, they would certainly get closer to the 10% Steve found.

TL;DR: it's not worth buying unless you do rendering of complex 3D scenes that eat up a ton of VRAM*. It's a fake Titan GPU with all of the restrictions of a gaming card in the drivers: a vanity card.

*And even there, we don't know if Nvidia will release the rumoured 20GB 3080.
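The "compare percentages, not absolutes" point is easy to check for yourself. A minimal sketch, with made-up fps figures standing in for two different reviews (not numbers from either site):

```python
# Two reviews can report very different absolute fps for the same cards
# (different scenes, settings, CPUs) yet still agree on the relative margin.
# All numbers below are illustrative, not real review data.

def uplift_pct(new_fps, old_fps):
    """Percentage uplift of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

review_a = {"3080": 100, "3090": 110}   # hypothetical 4K averages, site A
review_b = {"3080": 120, "3090": 133}   # hypothetical 4K averages, site B

for name, r in (("Review A", review_a), ("Review B", review_b)):
    margin = uplift_pct(r["3090"], r["3080"])
    print(f"{name}: 3090 is {margin:.1f}% faster than the 3080")
```

Here site B's absolute numbers are ~20% higher across the board, yet both land on roughly the same ~10% 3090-over-3080 margin, which is the only figure worth comparing across reviews.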
 
People saying it's a rip-off miss three fundamental issues.

-The card isn't really marketed to you anyway. nVidia never claimed this as the next gaming king; it is a halo product that doubles as a prosumer card.
-If these sell out on release day, then clearly there are people who don't think it is a rip-off and have more cash than you. The only reason for a gamer to get this card is insecurity, needing the best of the best to feel good. Get over yourself.
-Lastly: it is supposed to be a rip-off, Sherlock.
When you market the card as the BFGPU and say it does 8K gaming, then yes, it is targeted at gamers :)
 
Given the specs, the size, the price and the power consumption I really expected more from this thing...
 