Nvidia GeForce RTX 3080 Review: Ampere Arrives!

A half truth is still a half lie, but...

While I could argue that Nvidia didn't deliver on its promises of 1.9x energy efficiency, an 80%+ performance increase and silent cooling, the damn thing is fast.

I'd go as far as saying it's the current best bang for the buck. Probably short-lived until the RTX 3070 or RX 6000 cards are out, but still, no small feat...

Also, I think the 8nm Samsung process does not do well with big chips (AFAIK it wasn't really designed for that), seeing that the RTX 3070 consumes much less power and still performs on par with an RTX 2080 Ti, but we'll cross that bridge when we come to it.

I'm waiting for the RTX 3070 or RX 6000, but right now the 3080 is at the top of my list.





 
1- The average performance gain is 68% at 4K. 1440p is clearly CPU limited. Also, the power consumption test was done at 4K, so 1440p is irrelevant here.

2- The RTX 3080 consumes 27.8% more power than the RTX 2080:
https://static.techspot.com/articles-info/2099/bench/Power_PCAT.png

All while performing 68% faster. In other words, the RTX 3080 is much more efficient than the RTX 2080.

However, the gap is much smaller if you compare it to the RTX 2080 Ti, but the RTX 3080 replaces the RTX 2080, not the RTX 2080 Ti.

It's a small bump in power efficiency: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/35.html
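To put a rough number on that, here's a quick back-of-envelope using only the two figures quoted above (68% faster, 27.8% more power versus the plain RTX 2080). It's illustrative only, not exact review data:

```python
# Perf-per-watt back-of-envelope from the figures quoted above.
perf_ratio = 1.68     # RTX 3080 is ~68% faster than the RTX 2080 at 4K
power_ratio = 1.278   # while drawing ~27.8% more power

perf_per_watt_gain = perf_ratio / power_ratio - 1
print(f"Perf/W improvement vs. RTX 2080: {perf_per_watt_gain:.1%}")
# -> roughly 31%. That's why the 3080 looks efficient next to the 2080 even
#    though its absolute power draw is much higher; against the 2080 Ti the
#    same math gives a much smaller improvement, as the chart above shows.
```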



I don't think you can directly compare an AMD architecture on TSMC 7nm with an Nvidia one on a different process. We do have analysis showing that Ampere is power efficient, even though it has a high TDP.

AMD's architectures have not been as power efficient as Nvidia's since forever. Performance per watt has been all Nvidia for five or six generations at this stage, and by a significant margin for the past four, to the point that a 12nm Nvidia design can use roughly the same power as a 7nm AMD one with close gaming performance, e.g. the 2070 Super vs. the 5700 XT, or the 2060 Super vs. the 5700. Gaming is emphasised because GPU designs these days can vary considerably in their intended usage scenarios.

What we can ascertain about this Samsung 'LPU' node they call 8nm is that it's derived from their 10nm node. Nvidia could easily put some consumer Ampere parts into production on TSMC 7nm, which would give us an interesting point of comparison; they've confirmed they are still partners.

What we do know for sure is that Samsung bid really low for this contract. They had plenty of capacity and motivation to take market share off TSMC, and Nvidia has an eye for stronger margins. It's a marriage of convenience.

The bump in efficiency is anywhere from 8 to 15%: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/35.html

AMD has claimed a 50% improvement in performance per watt, so don't be surprised if AMD takes the efficiency crown this time around. Power consumption clearly wasn't the focus for Nvidia, and RDNA1 was already close to begin with.
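To put rough numbers on that speculation, here's a toy comparison built only from the percentages mentioned in this thread, under the purely hypothetical assumption that RDNA1 was at rough parity with Turing in gaming perf/W (per the "already close" remark). Real products will obviously differ:

```python
# Toy what-if using only percentages quoted in this thread.
# Hypothetical assumption: RDNA1 roughly at parity with Turing in gaming perf/W.
turing = 1.00
rdna1 = 1.00            # "already close" treated as parity here
ampere = turing * 1.12  # midpoint of the 8-15% bump linked above
rdna2 = rdna1 * 1.50    # AMD's claimed +50% perf/W for RDNA2

print(f"Ampere vs. Turing perf/W:        {ampere / turing:.2f}x")
print(f"Claimed RDNA2 vs. Ampere perf/W: {rdna2 / ampere:.2f}x")
# If AMD's claim holds and the parity assumption is anywhere near right,
# RDNA2 would land roughly 1.3x ahead of Ampere in performance per watt.
```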


Overall a good launch for Nvidia, but it now feels completely deflating after they hyped up much larger performance gains than we are actually seeing. Can't say I'd recommend that people who have a 2080 Ti upgrade.
 
I'm kinda confused and this is why:
"A big 70% performance jump over the RTX 2080 at 4K is impressive, and a huge improvement in cost per frame, so that's a job well done by Nvidia."
Steve, I respect you and Tim to death and watch Hardware Unboxed religiously, but this statement is unintentionally misleading. I say this because I seriously doubt anyone can picture the performance of the 2080 the way we can with the 2080 Ti. There are two cards here that people are looking at to gauge the generational performance difference, and the 2080 isn't one of them.

What I see here is an extremely modest increase of 23% at 1440p...
Proper context, right after that sentence you quote, we say this:

Now, that big jump won't be experienced across the board, or more specifically, we saw less consistent improvements at 1440p where the RTX 3080 could beat the 2080 by as much as 70%, but also by as little as 25% (for a ~50% performance gain on average). That 50% boost is still great to see even though in terms of raw performance at the $700 price point we can look two years back to the RTX 2080 release or 3.5 years since the GTX 1080 Ti was released.

So we don't appear to be in disagreement. We never liked the 20 series pricing (you can look back at those reviews), but then again, with no pressure from AMD at the high end, Nvidia charged what they charged. They have corrected course this generation, hopefully because of incoming competition (which remains to be seen). At the present time, however, the 3080 is good value compared to what you could buy a month ago, or two years ago.
 
So this is pretty clearly faster than my 1080 (usual rig) and 780 (backup.) I don't think I'm going to need a lot more detail on that point.

What I could use some help understanding in follow up articles is how adequate or not the memory capacity is. Is it reasonable to think that the 10 GB will be enough for all games released in say at least the next 4 years, or is it possible the VRAM will become a limiting factor before the rest of the card's processing power will?
 
So this is pretty clearly faster than my 1080 (usual rig) and 780 (backup.) I don't think I'm going to need a lot more detail on that point.

What I could use some help understanding in follow up articles is how adequate or not the memory capacity is. Is it reasonable to think that the 10 GB will be enough for all games released in say at least the next 4 years, or is it possible the VRAM will become a limiting factor before the rest of the card's processing power will?
For 4K, 10GB should be OK for at least another 2 years, and you can OC the memory to get a few more FPS in memory-intensive games. It's the 8GB on the 3070, with its lower bandwidth too, that people are a bit more concerned about.
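If it helps, here's a rough, illustrative 4K VRAM budget. The buffer counts and pool sizes below are assumed round numbers for the sake of the sketch, not measurements from any particular game:

```python
# Rough, illustrative 4K VRAM budget (assumed numbers, not measured from any game).
WIDTH, HEIGHT = 3840, 2160

def screen_buffers_mb(count, bytes_per_pixel=4):
    """Memory used by `count` full-screen render targets at the given pixel size."""
    return WIDTH * HEIGHT * bytes_per_pixel * count / (1024 ** 2)

render_targets_mb = screen_buffers_mb(8)   # assumed G-buffer + post-process chain
depth_misc_mb = screen_buffers_mb(2)       # assumed depth, velocity, etc.
texture_pool_gb = 6.0                      # assumed streaming pool for a heavy AAA title
geometry_misc_gb = 1.0                     # assumed meshes, BVH, driver overhead

total_gb = (render_targets_mb + depth_misc_mb) / 1024 + texture_pool_gb + geometry_misc_gb
print(f"Screen-sized buffers: ~{render_targets_mb + depth_misc_mb:.0f} MB")
print(f"Rough total: ~{total_gb:.1f} GB of the 10 GB available")
# Even at 4K the screen-sized buffers are small; it's the texture streaming
# pool that decides whether 10 GB starts to feel tight in future titles.
```

The takeaway from this kind of estimate is that resolution alone doesn't blow the budget; texture quality settings and streaming pool sizes are what push VRAM use toward the limit.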
 
I was excited about this, but that TDP is a bit much for me to deal with in the summer in a house with no AC. Probably would be great in the winter, but... I guess I'll just have to consider the 3060 or maybe 3070.
 
I get your point, but the fact is that the 3080 is not a Ti version and it costs the same as the 2080 did when it launched, so the comparison is fair. Even if you compare it against the 2080 Ti, the 3080 costs hundreds less and offers 30% more performance. In fact, I don't think there's going to be a 3080 Ti in the future, and the 3090 is all Nvidia will offer at the highest end, for almost the same price as the 2080 Ti.
I'm not sure that you do get my point because you didn't address the clear pricing progression. I find it interesting how you seem to hold the mighty "Ti" designation as gospel but don't mind the fact that the cost of the top-tier Ti model has gone from $699 (GTX 1080 Ti) to $1199 (RTX 2080 Ti) in TWO GENERATIONS. You only care if it says Ti when nVidia says to. Here we have nVidia playing a tune and you're dancing to it, which is exactly what the tune is designed to do and is exactly what they want.

The fact that you say "but, but, but, it's not a Ti!" means that you're looking at the branding and ignoring the performance and pricing metrics. This is a bad idea because performance metrics don't lie, branding does. Ask yourself what the performance difference is between the RTX 2080 and the RTX 2080 Ti. Then ask yourself what the performance difference is between the RTX 3080 and the RTX 3090... You'll see that the performance delta between the 2080 and 2080 Ti is about the same as the projected difference between the 3080 and 3090 which means that there isn't really enough room between them to fit another card without cannibalising sales from the two other cards.

This is why nVidia is referring to the RTX 3090 as a "gaming card", because it is. It's not a Titan because it uses too much power to be one. Titans are supposed to use around 250W which is important to the pro-sumers that use them. There is no Titan this time around, so they called what would have been the Ti version of the 3080, the 3090. Don't let them make you into another sheep. Stop being lazy and do the numbers, you'll be better off for it.
 
Let's look at the pricing progression, shall we?

MSRP of GTX 280 $649
MSRP of GTX 480 $499
MSRP of GTX 580 $499
MSRP of GTX 680 $499
MSRP of GTX 780 $649
MSRP of GTX 780 Ti $699 <- Top-tier Ti line at $699
MSRP of GTX 980 Ti $699 <- Top-tier Ti line at $699
MSRP of GTX 1080 Ti $699 <- Top-tier Ti line at $699
MSRP of RTX 2080 $699 <- Top-tier NON-Ti line at $699
MSRP of RTX 2080 Ti $1199 <- Top-tier Ti line at $1200?!?!?!
MSRP of RTX 3080 $699 <- Top-tier NON-Ti line at $699 (just like RTX 20)
MSRP of RTX 3080 Ti $1199 <- Top-tier Ti line at $1199 (just like RTX 20)
MSRP of RTX 3090 $1499 <- Still way more expensive than the Ti

So, nVidia has conditioned the sheep to suddenly accept a $500 jump in price for the top-tier Ti line from $699 to $1199, and the non-Ti from $499 to $649. Don't tell me it's inflation, because the cost of the top-tier card was between $500 and $649 for FIVE GENERATIONS (at LEAST ten years). Don't tell me that it's inflation, because the IBM PC model 5150 was $2000 in 1984 and we're not paying $12,000 for entire PCs that aren't even in the same performance universe as that old IBM. Tech is supposed to get cheaper over time, with new tech being the same price as the old was (or LESS, not more). Here we have nVidia finding a way around that to charge people more and more money when they shouldn't be (they don't have to, after all), and people are CELEBRATING THIS? Seriously?

I think the celebration is about the CURRENT price/performance ratio, and the monstrous performance in general. Prices are not following nice mathematical patterns. Tech IS getting cheaper, maybe not at the rate you'd like, but as you pointed out yourself, you don't pay multiple months of salary for a capable PC today (like you did in 1984, or I did in 1992).

Prices are creeping up across the board. Just look at the consoles: the PS5 vs. the PS2 launch price, or the same for Xbox (and I still believe that the new Xbox specs are awesome, and I couldn't build anything near that performance for the price they are going to charge).

Or look at CPUs: I bought my 2500K back in 2011 for £150, and the top-of-the-line, most expensive, "beast of a CPU" 2600K was £249. Today, top-of-the-line CPUs cost multiple times that, from both manufacturers (and I still feel OK with that, as I think their performance grew even faster than their price, so weirdly enough, I feel I get more bang for my buck today than 10 years ago).

You can't just take past data, draw a (projection) line from it, and expect prices to adhere to that. At least in my experience, markets simply do not work that way.
 
The tests actually show you the difference between last-gen GPUs and a 3080, and they are using the same CPU in all the tests. And why would anybody use 1080p in 2020-2021? Hello... the damn phone has a higher resolution than that. 1080p is for budget builds or people who play CS:GO or other non-taxing games that are smoother at high refresh rates. Why anyone would pick a CPU-bound resolution to test a GPU is beyond me, and I wouldn't even dream of playing Cyberpunk 2077 or other new games at anything lower than 1440p.
 
These are some nice gains compared to the last generation. I'm curious how AMD is going to respond. Hopefully they at least match the 3080 performance-wise.
 
The tests actually show you the difference between last-gen GPUs and a 3080, and they are using the same CPU in all the tests. And why would anybody use 1080p in 2020-2021? Hello... the damn phone has a higher resolution than that. 1080p is for budget builds or people who play CS:GO or other non-taxing games that are smoother at high refresh rates. Why anyone would pick a CPU-bound resolution to test a GPU is beyond me, and I wouldn't even dream of playing Cyberpunk 2077 or other new games at anything lower than 1440p.

It really depends on the game whether it's CPU- or GPU-bound at 1080p.

Plenty of people still game at that resolution, and monitor manufacturers are still pushing cutting-edge tech there as well. 1080p is the only resolution you can get 360 Hz at.
 
A few things:
1. This is worth upgrading my 1080 Ti for, though I am not sure if I will. I think I might be able to wait for the 4x series. I think Cyberpunk will determine whether I make the jump now or in 12-18 months when the 4x series cards come out. My PS5 will probably tide me over well until then.
2. I was thinking we would start to see the 4K GPU-bound limitation begin to lift in CPU benchmarks, but not yet.
The 4x series won't be out for another 24 months. They refresh roughly every two years, not sooner.

I have a 1080 Ti as well, and I guarantee you that unless you're gaming at 1080p, this card will be somewhat challenged by CP2077. I'm at 3440x1440 and Metro Exodus was dipping into the 50s at times. Cyberpunk will be much more demanding.
 
I'm not sure that you do get my point because you didn't address the clear pricing progression. I find it interesting how you seem to hold the mighty "Ti" designation as gospel but don't mind the fact that the cost of the top-tier Ti model has gone from $699 (GTX 1080 Ti) to $1199 (RTX 2080 Ti) in TWO GENERATIONS. You only care if it says Ti when nVidia says to. Here we have nVidia playing a tune and you're dancing to it, which is exactly what the tune is designed to do and is exactly what they want.

The fact that you say "but, but, but, it's not a Ti!" means that you're looking at the branding and ignoring the performance and pricing metrics. This is a bad idea because performance metrics don't lie, branding does. Ask yourself what the performance difference is between the RTX 2080 and the RTX 2080 Ti. Then ask yourself what the performance difference is between the RTX 3080 and the RTX 3090... You'll see that the performance delta between the 2080 and 2080 Ti is about the same as the projected difference between the 3080 and 3090 which means that there isn't really enough room between them to fit another card without cannibalising sales from the two other cards.

This is why nVidia is referring to the RTX 3090 as a "gaming card", because it is. It's not a Titan because it uses too much power to be one. Titans are supposed to use around 250W which is important to the pro-sumers that use them. There is no Titan this time around, so they called what would have been the Ti version of the 3080, the 3090. Don't let them make you into another sheep. Stop being lazy and do the numbers, you'll be better off for it.
Well, NOW you can get much better than 2080Ti performance for $699. What's the problem?

"Titans are supposed to use around 250W which is important to the pro-sumers that use them. There is no Titan this time around, so they called what would have been the Ti version of the 3080, the 3090."

Sources, please?

"Here we have nVidia playing a tune and you're dancing to it"

Ahem... what "tune", exactly? Buy it- or don't.

"You'll see that the performance delta between the 2080 and 2080 Ti is about the same as the projected difference between the 3080 and 3090 which means that there isn't really enough room between them to fit another card without cannibalising sales from the two other cards."

You're missing the fact that there is an $800 difference in price ($700 vs $1500) between the 3080 and 3090. They could easily release a 3080 Super/Ti with 16/20GB VRAM for $1000-1100. Plenty of space for an in-between card here.
 
320W TDP? Thank you, but no, thank you.

Unless your case is a box of tempered glass, it's a non-issue. My factory-OCed 2080 Ti draws 320W at a 110% power limit. Even at an ambient temperature of 32C during the summer, the triple-fan cooling solution is able to keep it below 79C without any obnoxious fan whine; I just hear a whoosh. When the ambient temperature is more moderate and the game doesn't peg the GPU at 95%+, it's virtually silent (the case is under my desk).

The only thing that sucks is the hot air exhaust when you have no AC, it's like 30C outside and the sun is shining straight through your windows. When that happens, I cut the power limit slider and cap frame rates, or play something like RimWorld, Factorio or Oxygen Not Included. When it's that hot, I won't be in the mood for anything other than chill games anyway.
 
Huge disappointment, especially with that TDP and pricing. Based on previous news, specs and hype, the 3080's performance is where I was expecting the 3070 would be. Maybe we should wait for more reviews of other models, but it looks like RTX 3xxx was overhyped.

If these gains hold up across the rest of the lineup, I see little reason for people with an RTX 2xxx who game at 1080p to upgrade. I wonder if the pessimists who were expecting GeForce FX 2.0 will end up vindicated. It will be a great opportunity for AMD and Intel.
 
Huge disappointment, especially with that TDP and pricing. Based on previous news, specs and hype, the 3080's performance is where I was expecting the 3070 would be. Maybe we should wait for more reviews of other models, but it looks like RTX 3xxx was overhyped.

If these gains hold up across the rest of the lineup, I see little reason for people with an RTX 2xxx who game at 1080p to upgrade. I wonder if the pessimists who were expecting GeForce FX 2.0 will end up vindicated. It will be a great opportunity for AMD and Intel.
The ($499) 3070 is going to be slightly better than last gen's $1200 flagship, for god's sake. I don't get your point... 2018 is gone.

And at 1080p, your CPU matters more anyway.
 
Proper context, right after that sentence you quote, we say this:

Now, that big jump won't be experienced across the board, or more specifically, we saw less consistent improvements at 1440p where the RTX 3080 could beat the 2080 by as much as 70%, but also by as little as 25% (for a ~50% performance gain on average). That 50% boost is still great to see even though in terms of raw performance at the $700 price point we can look two years back to the RTX 2080 release or 3.5 years since the GTX 1080 Ti was released.

So we don't appear to be in disagreement. We never liked the 20 series pricing (you can look back at those reviews), but then again, with no pressure from AMD at the high end, Nvidia charged what they charged. They have corrected course this generation, hopefully because of incoming competition (which remains to be seen). At the present time, however, the 3080 is good value compared to what you could buy a month ago, or two years ago.
Oh, I didn't say that we disagree; I took that sentence because it was put alone in BIG letters, and someone skimming through the article might miss the context. You're right of course and I don't disagree with you. It was just that it was highlighted without the context attached. I'm a big fan of "Harbour On Box" and I would never think that Steve would be intentionally misleading. That's why I said "unintentionally". Steve's one of the most honest and most knowledgeable people on the internet, and I know that's not saying much, but I'd never accuse him of anything untoward. I wanted to make that clear.
 
You clearly can't read; there is a 0% difference between the 3950X and the 10900K at 4K. Look at the benchmarks above! Trolls.

TechSpot, one of the most popular games in the world is COD Warzone and you guys don't include benchmarks for it. It's literally the only game a lot of us play, and you ignore it completely. I also don't understand the lack of 1080p testing. You reviewed a 360 Hz 1080p monitor yesterday. I have a 280 Hz 1080p monitor and desperately want to see benches! I get that some of your editors don't care about high-refresh gaming, but a LOT of us do!
The 3080 is not meant for 1080p. At that resolution, the CPU is doing most of the work.

That is why there are no tests at that resolution.
 
Well, NOW you can get much better than 2080Ti performance for $699. What's the problem?

"Titans are supposed to use around 250W which is important to the pro-sumers that use them. There is no Titan this time around, so they called what would have been the Ti version of the 3080, the 3090."

Sources, please?

"Here we have nVidia playing a tune and you're dancing to it"

Ahem... what "tune", exactly? Buy it- or don't.

"You'll see that the performance delta between the 2080 and 2080 Ti is about the same as the projected difference between the 3080 and 3090 which means that there isn't really enough room between them to fit another card without cannibalising sales from the two other cards."

You're missing the fact that there is an $800 difference in price ($700 vs $1500) between the 3080 and 3090. They could easily release a 3080 Super/Ti with 16/20GB VRAM for $1000-1100. Plenty of space for an in-between card here.
Um, it seems I triggered you and that wasn't my intent, so I apologise for that. My intent was to show you that it's not quite as good as it seems. I honestly don't know how they'll wedge a "Ti" version into this mix, but looking at the metrics of price and performance, they already have: the 3090 (which is pie-in-the-sky expensive, just like the 2080 Ti). Is this better than before? Sure it is, but that's not exactly setting the bar high, eh?

I wanted to point out that this isn't as good as nVidia makes it out to be because a lot of people have short memories. Is this good? Yes and no because good is relative. Pascal was a much better launch for its time and it's not even close. That GTX 1080 Ti was perhaps the best high-end video card value that I've ever seen. That thing is still a monster today and Pascal should be the benchmark against which all nVidia launches are compared, not Turing.

I just want people to temper their emotional response of euphoria (and believe me, I get it) and remember that this isn't really all that great. Sure, it's great compared to how horrible Turing was, but it's still pretty bad compared to how great Pascal was. Pascal was a GIGANTIC LEAP from Maxwell and the prices remained THE SAME, as they should have.

How long did we suffer with Intel sandbagging because AMD's FX wasn't up to scratch, while we were too busy being happy about the little incremental improvements that Intel made? We were so preoccupied with how much better Devil's Canyon was compared to Ivy Bridge that we didn't realise what Intel was doing to us until after Ryzen launched. We needed AMD to wake us up, because we couldn't see the forest for the trees. We need to be a little more cynical and a little less naive, even though that's not always fun (actually, it never really is). Yes, Ampere is better than Turing, and nVidia wants us to focus on that because it's to their advantage if we do. However, if we forget how good it could be, or fail to notice the bigger picture and the path that we seem to be on, do you know what that makes us?

Boiled frogs, my friend, boiled frogs. I don't want that for any of us, even if I have to be a bit of a killjoy to point it out.

As for my source about the prosumers and the 250W Titans, it's AdoredTV. Love him or hate him (I personally think Jim's awesome), nobody can ever say that he's wrong, because he backs up everything he says. I checked, and he was 100% right about the 250W Titans. I didn't even know they were 250W cards before Jim pointed it out, but it makes sense.
 
The tests actually show you the difference between last-gen GPUs and a 3080, and they are using the same CPU in all the tests. And why would anybody use 1080p in 2020-2021? Hello... the damn phone has a higher resolution than that. 1080p is for budget builds or people who play CS:GO or other non-taxing games that are smoother at high refresh rates. Why anyone would pick a CPU-bound resolution to test a GPU is beyond me, and I wouldn't even dream of playing Cyberpunk 2077 or other new games at anything lower than 1440p.
I agree with you, but since the vast majority of people still game at 1080p (I don't, but I know I'm not the whole world), Steve would have been remiss to ignore it. The latest Steam survey shows that over 65% of Steam users game at 1080p. Even if it's not relevant to us, it's still the most common gaming resolution in the world today. People want to know how the card performs at the resolution they use, and rightfully so.
 
Some people seem to be worried about the amount of VRAM (10GB). I am not sure I follow the concern here.

The overwhelming majority of cards on the market now have between 4 and 8 GB of VRAM and are doing quite well with most games. Is there a study or a serious prediction that there will be games in the very near future where 10 GB is not enough?

Just because 1-2 games MIGHT be that VRAM hungry more than 4 years from now doesn't warrant the panic over "only" 10 GB of VRAM!
 
The ($499) 3070 is going to be slightly better than last gen's $1200 flagship, for god's sake. I don't get your point... 2018 is gone.

And at 1080p, your CPU matters more anyway.

With this glass-half-full mentality it might not sound terrible, but pragmatically speaking it's still pretty bad considering the hype it generated, especially with that TDP. And like many, I'm frustrated with Nvidia getting away with imposing a new baseline in prices (though it's not solely their fault - AMD is also complacent), so I still feel the bang for buck isn't good enough.

Think about it. Quite a few people who wish to upgrade from a 2080 to a 3080 (and maybe a 2070 to a 3070, or a 2060 to a 3060, in case those show similar increases in power draw) will have to shop not only for a new card, but also for a new PSU. Not a great deal with these gains. Maybe even a new PC case for some, unless Nvidia or third parties pull off some miracle with the cooling.

The fellow before me couldn't have said it better: This is sooo AMD.

I have to disagree on 1080p, especially for people who play poorly optimized indie titles made in Unity and UE4. But let's forget about 1080p. Based on raw specs and hype, I was expecting the 3080 to be at the very least 40% ahead of the 2080 Ti at 1440p and 4K - not an absurd expectation with that core count + transistor count + TDP. And 40% would still be sort of meh; 50%+ would be good.
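For what it's worth, a naive spec-scaling estimate does point well above 40%, which is exactly why the result disappoints. The sketch below uses commonly published core counts and boost clocks (treat them as approximate) against the ~30% 4K gain over the 2080 Ti cited earlier in this thread:

```python
# Naive spec scaling vs. what reviews measured (approximate published numbers).
cores_3080, boost_3080 = 8704, 1710       # RTX 3080: CUDA cores, boost clock in MHz
cores_2080ti, boost_2080ti = 4352, 1545   # RTX 2080 Ti: CUDA cores, boost clock in MHz

naive_ratio = (cores_3080 * boost_3080) / (cores_2080ti * boost_2080ti)
observed_ratio = 1.30  # ~30% faster at 4K, the figure cited earlier in the thread

print(f"Naive FP32 throughput ratio: {naive_ratio:.2f}x")
print(f"Observed 4K gaming ratio:   ~{observed_ratio:.2f}x")
# Ampere's second FP32 datapath per SM is shared with INT32 work, and memory
# bandwidth / ROP / geometry throughput did not double, so games cannot keep
# all the extra FP32 units busy - hence the gap between the two numbers.
```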
 