Nvidia GeForce RTX 3080 Review: Ampere Arrives!

I'm kinda confused and this is why:
"A big 70% performance jump over the RTX 2080 at 4K is impressive, and a huge improvement in cost per frame, so that's a job well done by Nvidia."
Steve, I respect you and Tim to death and watch Hardware Unboxed religiously, but this statement is unintentionally misleading. I say this because I seriously doubt that anyone can picture the performance of the 2080 the way we can with the 2080 Ti. There are two cards here that people are looking at to see what the generational performance difference is, and the 2080 isn't one of them.

What I see here is an extremely modest increase of 23% at 1440p:
[attached chart: CPU_1440p.png]

and a less modest increase of 31% at 4K:
[attached chart: CPU_4K.png]
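
For anyone trying to reconcile the review's 70% figure with these charts, here's a quick sanity check using only the percentages quoted above (a rough sketch; the exact figures depend on the game sample):

```python
# Quick sanity check: the review's "+70% over the RTX 2080 at 4K" and the
# "+31% over the RTX 2080 Ti at 4K" shown above use different baselines,
# so both can be true at once.
gain_over_2080 = 1.70     # 3080 vs 2080 at 4K (the review's figure)
gain_over_2080_ti = 1.31  # 3080 vs 2080 Ti at 4K (the chart above)

implied_ti_over_2080 = gain_over_2080 / gain_over_2080_ti
print(f"Implied 2080 Ti vs 2080 at 4K: +{(implied_ti_over_2080 - 1) * 100:.0f}%")
# ~+30%, which is roughly the usual 2080 Ti vs 2080 gap, so the 70% figure
# doesn't contradict the ~31% uplift over the 2080 Ti -- it's just a
# different baseline card.
```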

Now, some could argue that the RTX 2080 Ti costs $1,200 USD while the RTX 3080 costs only $700, and that's true, but just because the RTX 2080 Ti cost that much doesn't mean that it should have; the top-tier Ti card was the $700 card as recently as the GTX 1080 Ti. It was a terrible price to pay, and I'm amazed at just how many people were stupid enough to pay $1,200 for that card. Now nVidia is using their past bad behaviour to make a normal price look amazing.

It amazes me just how successfully nVidia has manipulated their customer base. Their first step was to release the 2080 at $700, which was barely a performance jump over the GTX 1080 Ti (which was also $700). Step two was to price the 2080 Ti at $1,200, getting their customers so used to being screwed over that pricing the 3080 at $700 (where the 2080 Ti should have been in the first place, and where a 3080 Ti should sit instead of the 3080) now makes that customer base extremely happy to get screwed over.

Let's look at the pricing progression, shall we?

MSRP of GTX 280 $649
MSRP of GTX 480 $499
MSRP of GTX 580 $499
MSRP of GTX 680 $499
MSRP of GTX 780 $649
MSRP of GTX 780 Ti $699 <- Top-tier Ti line at $699
MSRP of GTX 980 Ti $649 <- Top-tier Ti line at $649
MSRP of GTX 1080 Ti $699 <- Top-tier Ti line at $699
MSRP of RTX 2080 $699 <- Top-tier NON-Ti line at $699
MSRP of RTX 2080 Ti $1199 <- Top-tier Ti line at $1200?!?!?!
MSRP of RTX 3080 $699 <- Top-tier NON-Ti line at $699 (just like RTX 20)
MSRP of RTX 3080 Ti $1199 (expected) <- Top-tier Ti line at $1199 again (just like RTX 20)
MSRP of RTX 3090 $1499 <- Still way more expensive than the Ti
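
If it helps to see those jumps as percentages, here's a minimal sketch that just walks the Ti-class MSRPs listed above (the RTX 3080 Ti figure is the expected price, as noted):

```python
# Generation-over-generation change in the top-tier Ti MSRPs listed above.
ti_msrp = [
    ("GTX 780 Ti", 699),
    ("GTX 980 Ti", 649),
    ("GTX 1080 Ti", 699),
    ("RTX 2080 Ti", 1199),
    ("RTX 3080 Ti", 1199),  # expected, not yet announced
]
for (prev_name, prev_price), (name, price) in zip(ti_msrp, ti_msrp[1:]):
    change = (price - prev_price) / prev_price * 100
    print(f"{prev_name} -> {name}: {change:+.0f}%")
# The 1080 Ti -> 2080 Ti step (+72%) is the outlier the rest of this post
# is complaining about; every other step is within single digits.
```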

So, nVidia has conditioned the sheep to suddenly accept a $500 jump in price for the top-tier Ti line, from $699 to $1199, and the top non-Ti from the $499-$649 range to $699. Don't tell me it's inflation, because the cost of the top-tier card was between $499 and $649 for FIVE GENERATIONS, and the flagship stayed at $699 or less for nearly a decade. Don't tell me that it's inflation, because the IBM PC model 5150 was about $2,000 back in 1981 and we're not paying $12,000 for entire PCs that aren't even in the same performance universe as that old IBM. Tech is supposed to get cheaper over time, with new tech costing the same as the old did (or LESS, not more). Here we have nVidia finding a way around that to charge people more and more money when they shouldn't (they don't have to, after all), and people are CELEBRATING THIS? Seriously?

Nvidia is offering a new level of performance where it has no competition so far. At the end of the day, the price is related to supply and demand. There is a $1,000+ tier now and consumers are willing to pay.
 
With this glass-half-full mentality it might not sound terrible, but pragmatically speaking it's still pretty bad, considering the hype it generated. Especially with that TDP. And like many, I'm frustrated with Nvidia getting away with imposing a new baseline in prices (though it's not solely their fault - AMD is also complacent), so I still feel the bang for buck isn't good enough.

Think about it. Quite a few people who wish to upgrade from a 2080 to a 3080 (and maybe a 2070 to a 3070, or a 2060 to a 3060, if those present similar increases in power draw) will have to shop not only for a new card, but also a new PSU. Not a great deal with these gains. Maybe even a new PC case for some, unless Nvidia or third parties pull off some miracle with the cooling.

The fellow before me couldn't have said it better: This is sooo AMD.

I have to disagree on 1080p, especially for people who play poorly optimized indie titles made in Unity and UE4. But let's forget about 1080p. Based on raw specs and hype, I was expecting the 3080 to be at the very least 40% ahead of the 2080 Ti at 1440p and 4K - not an absurd expectation with that core count + transistor count + TDP. And 40% would still be sort of meh; 50%+ would be good.
Read.

I still think that the performance/price ratio is impressive. We're talking about a large gain in performance at roughly 40% lower price than the 2080 Ti ($699 vs. $1,199). What's not to celebrate?

Upgrading PSUs is inevitable. It's progress, and newer tech need not apologize, imo.
 

You certainly seemed to care on August 15th, 2017 and on February 7th, 2019, when Vega 56 and the Radeon VII released respectively, with comments like:

"Steve, would you review the glass stand that AMD provided? I'm just curious if it too runs hotter, louder, and is more power hungry than the competition..."

Double standards aside, power consumption does matter. Upgrading the power supply is not progress and it's not sustainable to continue to increase power requirements every generation.

Nvidia have done a good job keeping temps in check. That said, performance per watt should improve more than this per generation.
 
Some people seem to be worried about the amount of VRAM (10GB). I am not sure I follow the concern here.

The overwhelming majority of cards on the market now have between 4 and 8 GB of VRAM and are doing quite well with most games. Is there a study or a serious prediction that there will be games in the very near future where 10 GB is not enough?
I'm one who asked, so I'll answer. I asked because my understanding of what drives VRAM usage is hazy. With no actual knowledge, my intuition is that if this card is making the leap from 1440p to 4K (so ~2.25x pixels processed), might the VRAM usage not also require the same 2.25x adjustment? And yes, I understand the current (= soon to be last) generation of games ran fine on 4-8 GB. But the next generation is upon us, and I have no information about what its games will require. I see the consoles have 16 GB of RAM total, some of which has to be for the OS and the app, but which I believe could, in theory, allocate more than 10 GB to VRAM purposes. So I'm asking.

As to a study or serious prediction, that's the whole point. If there's one out there that says Yeah or Nay either way, I haven't seen it, but I'd like to before making my purchase. I think the people who would know for sure are probably the people with design oversight of games targeted to launch in the next 2-4 years. I'd love to know what their design assumptions are for available VRAM.
 
if this card is making the leap from 1440p to 4K (so ~2.25x pixels processed), might the VRAM usage not also require the same 2.25x adjustment?

I don't know for sure either. I've read a couple of other sites saying the increased speed from storage to the card to output more than makes up the difference. Steve, the benchmarker, did say one game claimed 9 GB, so he made an adjustment in the game's settings and the committed VRAM dropped. A web browser is the only consumer software I've used that hogged 9 GB of RAM. That's insane; I don't think I would cater to a game that needs to load more into memory than a $700 video card has.
 
Although this is still a very impressive video card for the price, the fact that it's only a little more powerful in benchmarks than the (twice as expensive) 2080 Ti means that the impression I had from Jensen Huang's video, that these new cards were so surprisingly powerful that there was a risk of AMD being left far behind (although, as I then noted, there was reason to hope that it wouldn't be too far behind)... is completely incorrect. This is not quite a routine release, but the level of the performance uplift is not sufficient to lead to any doubts about AMD's new video cards being comparable.
I wonder if, in the time between the video and these benchmarks, AMD drew up plans which they could still implement for some last-minute performance boosts that will even embarrass Nvidia. (But apparently if so, Nvidia can fight back by adding more memory to these cards... yay for competition.)

By the way, though, I know what I would really like AMD (or, even better, Nvidia, since they have the better software stack) to do: make a mediocre budget video card on an obsolete process node like 14nm or even 22nm...with one special feature.
Totally unlocked FP64 performance.
So it's made on an obsolete process node so that there's enough capacity to satiate the demand from the cryptominers, so that those of us who would like to do serious number-crunching at home could get our hands on, say, 1 to 4 teraflops at FP64 at an affordable price.
 

Entirely possible we could see some really nice competition this gen, which should benefit the consumer.

I'll be waiting for 3070 benchmarks and whatever AMD has up its sleeve. Looking for a card up to around 280 W. A 3070 Ti would likely be right in that range.
 
The fellow before me couldn't have said it better: This is sooo AMD.

I have to disagree on 1080p, especially for people who play poorly optimized indie titles made in Unity and UE4. But let's forget about 1080p. Based on raw specs and hype, I was expecting the 3080 to be at the very least 40% ahead of the 2080 Ti at 1440p and 4K - not an absurd expectation with that core count + transistor count + TDP. And 40% would still be sort of meh; 50%+ would be good.

This does remind me a lot of past AMD cards. On paper, specs of the 3080 vs the 2080 Ti have almost doubled in many categories (comparing the two because they both use the 102-class die), but actual gaming performance hasn't increased anywhere near that, though compute performance has increased greatly in some apps.

Yes, performance per dollar has increased very much, but it only looks that good due to the 20-series' highly increased pricing over its predecessor.

Tbh, while it's good performance, it is not anywhere near what it was hyped to be (including by nVidia). So maybe selling your 2080 Ti, if you are playing at 1440p, is not really necessary.

Here's a good quote from Yuri Bubliy on Twitter:

RTX 2080TI vs RTX 3080. Minute of math.

12 vs 8 nm
14.2 vs 29.7 TFLOPS
4352 vs 8704 Cores
1545 vs 1710 MHz Boost Clock

Amazing specs

BUT

TDP RTX 2080TI OC ~= TDP RTX 3080
And performance gain only 15%...
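
For what it's worth, those TFLOPS figures aren't magic numbers; FP32 throughput is just cores × boost clock × 2 (a fused multiply-add counts as two FLOPs). A minimal sketch using the figures in the tweet (the 14.2 TFLOPS number corresponds to a slightly higher boost clock than the 1545 MHz reference spec):

```python
# Paper FP32 throughput: CUDA cores * boost clock (GHz) * 2 ops per FMA.
def fp32_tflops(cores: int, boost_ghz: float) -> float:
    return cores * boost_ghz * 2 / 1000

print(f"RTX 2080 Ti: {fp32_tflops(4352, 1.545):.1f} TFLOPS")  # ~13.4 at reference boost
print(f"RTX 3080:    {fp32_tflops(8704, 1.710):.1f} TFLOPS")  # ~29.8
# Paper throughput more than doubles, which is exactly why a ~15-30%
# real-world gain over the 2080 Ti feels underwhelming: games don't
# scale purely with FP32 throughput.
```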


 
Seriously, it falls short of all the hype. It's what we've been seeing generation after generation: 20-30% faster than the previous gen.

And Microsoft FS 2020 punishes it into the ground. It can't even hold a 60 fps minimum at 1440p, let alone 4K.
If the upcoming 3090 can't hold a 60 fps minimum at 4K in MFS 2020 for its absurd asking price, I don't know what to say about the impatient money wasters.

The best thing, though, is that the price tag of the 3080 puts the 2080 Ti to shame. Good thing I never got the 2080 Ti.
 
Do you guys have anything against Nvidia???

1- You should have used the best CPU for gaming, the i9-10900K, because we do see a lot of CPU bottlenecking, especially at 1440p.
Also, both Intel and AMD will have better CPUs with higher IPC in the next few months (Rocket Lake and Zen 3). The 3950X results are irrelevant for the foreseeable future.

2- The average 4K comparison graph with the RTX 2080 should have used Doom Eternal at Ultra Nightmare textures, not High textures. The RTX 3080 showed a 115% boost over the RTX 2080 at 4K on Ultra Nightmare.
The 2 GB of extra memory that the RTX 3080 has over the RTX 2080 is one of the RTX 3080's advantages. So reducing settings to make the RTX 3080 only 70% faster is stupid and does not represent what consumers will get in the real world, because people will play at max settings on these cards. We are not comparing an 8GB RTX 3080 to an 8GB RTX 2080. We are comparing a 10GB RTX 3080 to an 8GB RTX 2080, and the extra video memory is part of the advantage that the RTX 3080 has over the RTX 2080.

Just imagine comparing the GTX 1060 3GB to the RX 480 8GB and lowering textures and settings to reduce VRAM usage to 3 GB before comparing both cards. That's so stupid, and I doubt that you would do that.

Your website shows that you have something against nvidia. Don't deny it.
 
Quote from article:
"...as Nvidia paired it with 10GB of VRAM which might prove insufficient in a year or two, though PCIe 4.0 will be a lot more useful in mitigating those performance losses in modern platforms."

Even 8 GB seems to be sufficient in all current games at 4K, as long as your GPU is fast enough. It makes no sense to say that 10 GB won't be sufficient any time soon.

Doom Eternal can use over 8 GB, BUT the 8GB high-end RTX 2080 still gets over 60 fps all the time. So while having over 8 GB has benefits and can improve your fps, you can't say that 8 GB is not sufficient.

Also, if you notice, the RTX 2060 6GB performed better (both avg and min fps) than the GTX 1070 8GB at the same settings that can use 9 GB. So while extra video memory is helpful, I'd say the final result (frame rate) is what matters, and the RTX 2060 6GB is still better than the 1070 8GB anyway.
 
What are the actual downsides to the high power usage, other than the obvious one that it uses too much power?
 
As a 1080 Ti owner playing at 1440p, I take this article as proof that my card will do just fine for now and that I can wait for the next series of cards for even more improvements. Another card that I am extremely impressed with is the AMD RX 5700 XT, which is a $400 card that is right there with these other cards. I am playing Assassin's Creed Odyssey, and it's amazing how GPUs just don't run that game well.
 
Nvidia is offering a new level of performance where it has no competition so far. At the end of the day, the price is related to supply and demand. There is a $1,000+ tier now and consumers are willing to pay.
I agree with you 100%; there is a $1,000 USD tier that some people are willing to pay. However, I won't be saying good things about it and I'll be damned if I ever celebrate this horrid fact. I point things like this out to people because I can see that nVidia's marketing is specifically designed to distract people from what is actually happening by dangling (admittedly) very pretty things in front of their noses. I suppose that I just care enough to give them the slap in the face that brings them back down to Earth. LOL
 
I agree with you but since the vast majority of people still game at 1080p (I don't but I know that I'm not the whole world), Steve would have been remiss to ignore it. The latest Steam survey shows that over 65% of Steam users game at 1080p. Even if it's not relevant to us, it's still the most relevant resolution for gaming in the world today. People want to know how it performs at the resolution that they use and rightfully so.

Those people should not care about 1080p metrics on a $700 card, but maybe on a $500 card, and definitely on an RTX 3060 class card?

I am guessing the eSports crowd that would want this card and run it at 1080p is still very niche, but I could be wrong.
 
This does remind me a lot of past AMD cards. On paper, specs of the 3080 vs the 2080 Ti have almost doubled in many categories (comparing the two because they both use the 102-class die), but actual gaming performance hasn't increased anywhere near that, though compute performance has increased greatly in some apps.

Yes, performance per dollar has increased very much, but it only looks that good due to the 20-series' highly increased pricing over its predecessor.

Tbh, while it's good performance, it is not anywhere near what it was hyped to be (including by nVidia). So maybe selling your 2080 Ti, if you are playing at 1440p, is not really necessary.

Here's a good quote from Yuri Bubliy on Twitter (quoted above):

LOL that post from Yuri is so cherrypicked. First he ignores the huge ROP count difference between the 2080Ti and the 3080 (88 vs 64) which explains a lot of the performance difference. And then he goes full fanboy by comparing the OC 2080Ti to the non-OC 3080 in his 15% performance gain numbers. I get that he's a big AMD supporter but for someone who knows his numbers so damn well, he should be embarrassed by his choice of what to omit and compare.
 
What are the actual downsides to the high power usage, other than the obvious one that it uses too much power?

IMO, just that you need to run the A/C a bit more to cool your room, and thus it also costs a bit more in electricity each month to run and cool. For most gamers it really doesn't matter, as long as your power supply is up to the task.
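
To put a very rough number on the electricity part (a minimal sketch; the wattage delta, hours, and rate are all assumptions and will vary):

```python
# Rough extra running cost of a higher-power card, with assumed values.
extra_watts = 100        # e.g. ~320 W card vs ~220 W card (assumption)
hours_per_day = 4        # gaming hours per day (assumption)
price_per_kwh = 0.13     # USD per kWh (assumption, varies by region)

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year, about ${extra_cost_per_year:.0f}/year")
# ~146 kWh and ~$19 a year under these assumptions, plus a little extra
# A/C load in the summer, as noted above.
```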
 
I asked because my understanding of what drives VRAM usage is hazy. With no actual knowledge, my intuition is that if this card is making the leap from 1440p to 4K (so ~2.25x pixels processed), might the VRAM usage not also require the same 2.25x adjustment?
When a graphics card is doing all of the work to render a frame, VRAM will be filled with various blocks of data - some of it is the raw building blocks of the 3D scene itself (i.e. information about shapes, textures to cover them with, the list of instructions of what to do) and some of it is 'working data.'

In the case of the former, this doesn't change with resolution: it's fixed in size. However, working data does vary and by quite a lot. For example, a modern game like Doom Eternal or Shadow of the Tomb Raider will create several versions of the frame, in different formats, to use in the creation of the visual effects seen on the monitor. These might be the same size as the final, completed frame, but they will always scale with the resolution.

So going from 1440p to 4K doesn't involve a full 2.25 times more memory, as a good chunk of it will be the assets for the frame. Only temporary stuff will get scaled, some by 2.25 times, others will be a set fraction of this.
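
As a rough illustration of that split between fixed assets and resolution-scaled buffers (a minimal sketch; the asset size, number of render targets, and bytes per pixel are made-up assumptions, and real engines vary a lot):

```python
# Toy model: VRAM = fixed assets + render targets that scale with resolution.
def working_set_gb(width, height, assets_gb=5.0, render_targets=10, bytes_per_pixel=16):
    # assets_gb: textures/geometry, roughly fixed regardless of resolution (assumption)
    # render_targets * bytes_per_pixel: G-buffers, history buffers, etc. (assumption)
    scaled_bytes = width * height * render_targets * bytes_per_pixel
    return assets_gb + scaled_bytes / 1024**3

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{working_set_gb(w, h):.1f} GB")
# With these made-up numbers the 4K footprint is only ~1.1x the 1440p one,
# not 2.25x, because only the render targets scale with pixel count.
```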
 
Although this is still a very impressive video card for the price, the fact that it's only a little more powerful in benchmarks than the (twice as expensive) 2080 Ti means that the impression I had from Jensen Huang's video, that these new cards were so surprisingly powerful that there was a risk of AMD being left far behind (although, as I then noted, there was reason to hope that it wouldn't be too far behind)... is completely incorrect. This is not quite a routine release, but the level of the performance uplift is not sufficient to lead to any doubts about AMD's new video cards being comparable.
I'll be honest with you, I have no doubt that ATi cooked up something comparable for AMD to offer. It was never accurate to say that Radeon cards weren't comparable to GeForce cards. It's just that nVidia has a tendency to make some pie-in-the-sky halo products to improve their image while AMD has a tendency to only make cards at price and performance levels that will maximise sales.

Now, of course, having the halo product is extremely valuable because it means that it's the card reviewers are going to use when testing CPUs, to eliminate any possibility of a GPU bottleneck. People see that and think "Steve and Tim use nVidia cards. Since they could use anything they wanted, GeForce cards must be better than Radeon cards." which is an extremely powerful form of exposure.

However, that doesn't mean that Radeon cards aren't comparable because the RX 5xxx series has been amazingly competitive. While the RX 5500 XT isn't the least bit impressive to me, the others have been:

The GTX 1660 effectively made the RX 5500 XT DOA
The RX 5600 XT effectively deleted the GTX 1660 Ti
The RX 5700 effectively deleted the RTX 2060
The RX 5700 XT effectively deleted the RTX 2060 Super and the RTX 2070, and nips at the heels of the much more expensive RTX 2070 Super.

Better late than never I guess.
I wonder if, in the time between the video and these benchmarks, AMD drew up plans which they could still implement for some last-minute performance boosts that will even embarrass Nvidia. (But apparently if so, Nvidia can fight back by adding more memory to these cards... yay for competition.)
This is why I want nVidia to stumble a bit, just enough for ATi to catch up. If ATi catches up, it gets even better still. I have always wanted them to be on even terms so that no shenanigans can occur. That's what's best for us consumers.
By the way, though, I know what I would really like AMD (or, even better, Nvidia, since they have the better software stack) to do: make a mediocre budget video card on an obsolete process node like 14nm or even 22nm...with one special feature.
Totally unlocked FP64 performance.
So it's made on an obsolete process node so that there's enough capacity to satiate the demand from the cryptominers, so that those of us who would like to do serious number-crunching at home could get our hands on, say, 1 to 4 teraflops at FP64 at an affordable price.
I absolutely LOVE this idea! :heart_eyes: My friend, this is PURE GENIUS! BRAVO! (y) (Y)
I'm sure that Samsung and Global Foundries would have the available capacity.
 
LOL that post from Yuri is so cherrypicked. First he ignores the huge ROP count difference between the 2080Ti and the 3080 (88 vs 64) which explains a lot of the performance difference. And then he goes full fanboy by comparing the OC 2080Ti to the non-OC 3080 in his 15% performance gain numbers. I get that he's a big AMD supporter but for someone who knows his numbers so damn well, he should be embarrassed by his choice of what to omit and compare.
In order to see what the generation-over-generation technical advance is, his approach is imho correct.
- You need to compare the 2080 Ti to the 3080, as both use the same die class (102).
- The 2080 Ti OC has the same power consumption as the stock 3080 FE. This allows you to see the performance delta at the same power consumption.

Take the reviews and imagine just two non-technical things being different:
- The 2080 Ti was sold at the previous Ti price point of $699
- The 3080 was instead called the 3080 Ti

What do you think the reviews' conclusion would have been in that case?

Yes, the bang for buck has improved noticeably going from Turing to Ampere, but let's not forget that pricing made it very poor going from the 1080 Ti to the 2080 Ti. And nVidia calling this card the 3080 is just naming - it does not change the fact that the 3080 and 3090 use the same die class as the 2080 Ti and the Titan, while the 2080 and 2080 Super were on the 104-class die that the 3070 is using now.
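
To make that equal-power framing concrete, here's a minimal sketch; the +31% figure is from the review's 4K results, the ~15% figure is from the quoted tweet, and the board-power numbers are assumptions (roughly the FE-class TDPs):

```python
# Two ways to frame 3080 vs 2080 Ti efficiency, per the discussion above.
perf_gain_stock = 1.31                    # 3080 vs 2080 Ti at stock, 4K (review)
power_3080_w, power_2080ti_w = 320, 260   # assumed board power, watts

perf_per_watt_gain = perf_gain_stock / (power_3080_w / power_2080ti_w)
print(f"Perf/W at stock settings: {(perf_per_watt_gain - 1) * 100:+.0f}%")  # ~+6%

# At matched power (an OC'd 2080 Ti drawing roughly what a stock 3080 does),
# the tweet's ~15% performance gain is itself the perf/W improvement.
print("Perf/W at matched power: ~+15% (per the quoted tweet)")
```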
 
Those people should not care about 1080p metrics on a $700 card, but maybe on a $500 card, and definitely on an RTX 3060 class card?
You'd think so, but people are curious about things, even when they're irrelevant.
I am guessing the eSports crowd that would want this card and run it at 1080p is still very niche, but I could be wrong.
You know, you make a great point about the eSports crowd. I hadn't thought of that (I rarely, if ever, think of eSports) but they'd probably only care in games like Rocket League and (as Steve Walton calls it) Counter Strike: Potato Offensive. :laughing:
 