Nvidia GeForce RTX 4080 Review: Fast, Expensive & 4K Capable All the Way

Thanks for the review, it confirms that for us simple mortals (or intelligent people) the 6900 XT / 6950 XT are the best GPUs out there
That's actually not true. The best out there right now is the RX 6800 XT. The RX 6900 XT is 27% more expensive but only 9% faster. The RX 6950 XT is even worse because it's 61% more expensive but only 15% faster. To choose either the 6900 XT or 6950 XT is about as reasonable as choosing an RTX 4080 over the RX 7900 XTX.

Remember that Steve referred to both the RX 6900 XT and RTX 3090 as "stupid" because their value propositions were both terrible. Sure, the 3090 was worse than the 6900 XT but that never made the 6900 XT (or the 6950 XT for that matter) anywhere near a good deal.
 
After the 4090 and 4080 were announced, and then the 7900 XT and XTX, I decided that nothing would approach the 6800 XT's value at current pricing for quite a while, especially for 1440p high-refresh-rate gaming. So I bought one for $550.

Looking at these value graphs, all I gotta say is:

EFF YEAH!
You're 100% correct about that (I myself also have a 6800 XT).

You know, I've seen people talking about what a great deal the 6900 or 6950 XT are, completely ignoring the card that has been the best deal for its entire generation (6800 XT) and it leaves me stunned. With the 6900 XT you pay 27% more for 9% more performance and with the 6950 XT you pay 61% more for 15% more performance compared to the 6800 XT.

The same people who rightly recognise that the GeForce cards are a rip-off somehow can't tell that the RX 6900 XT and RX 6950 XT are also rip-offs. It's like there's some kind of cognitive dissonance there. Just because a card says "Radeon" on it doesn't automatically make it a great deal, it just automatically makes it a better deal than cards that say "GeForce" on them. :laughing:
 
My local Micro Center still has inventory on hand for the 4080. They show 55 cards in stock as of this morning on their website.

Hopefully this is a sign that people aren't really looking for these cards. I think the downside to this, though, is that not everyone has a local brick-and-mortar store they can visit to pick one of these cards up, so they have to rely on the online retailers.

None of the online retailers I looked at have any of the 4080s for sale at MSRP. You can certainly find third parties (on Newegg) listing cards for upwards of $200 over MSRP... so those people are stuck waiting and waiting to see if they can find one online at MSRP, or they take the dive and pay a scalper for one.
Here's hoping those scalpers can't move them either, and end up having to take a loss.
 
Yeah, it'll get a lower score.

It could beat the 4080 in raster, but if it loses in ray tracing and AI/driver features then it makes sense.
I don't think that you understand something. EVERYTHING is raster, including ray-tracing. The effects of the ray-tracing calculations are still rasterised on to the screen. Raster performance is GPU performance, period. Ray-Tracing is just a calculation that is done by the card to know where to rasterise shadows and reflections. They are not two different ways of putting something on the screen, putting anything on the screen is rasterisation. RT is just a modification of the instructions that the rasteriser follows. It is not, in itself, an indication of video card performance any more than having a great sound system is an indication of a car's performance. It is a frill that the device could run perfectly fine without.

The performance of a card is how many frames it can rasterise per second. RT has absolutely nothing to do with the actual image drawing. If one card rasterises faster than the other, it is a faster card, end of story. Sure, some noobs care about RT (that doesn't bother me much) and that's fine. You'll notice that most people who have been gaming for at least 10 years find RT completely underwhelming because we've seen REAL game-changers like hardware tessellation.

What blows my mind though is when I read people talking about halo-level cards and they bring up things like FSR and/or DLSS. Upscaling technologies like FSR and DLSS are completely irrelevant to these cards because they don't yet need a performance uplift of any kind. By the time they do (I expect at least four years down the road), there's no way to know what the landscape will look like. Maybe by then FSR will be better than DLSS, maybe Intel's XeSS will be better than both or maybe things will be exactly as they are right now. The point is that with absolutely no way of knowing, it's beyond stupid to use them as criteria for which video card is worth paying for and which isn't.
 
Sure, some noobs care about RT (that doesn't bother me much) and that's fine. You'll notice that most people who have been gaming for at least 10 years find RT completely underwhelming because we've seen REAL game-changers like hardware tessellation.
You're right. I have been playing for the last 28 years and right now I can't understand why RT is so important. I saw it in action and it didn't blow my mind. Just a gimmick.
 
Let's say the RTX 4080 comes out to be exactly 20% faster than the RX 7900 XTX in standard 4K testing
And why make that assumption?
Because, again, Nvidia is simply that almighty?
Might as well say that a 1650 beats the 7900 XTX at 8K, because at this point statements like that wouldn't surprise me anymore.

Without reviews we can't do much, but it's sad to already err in favor of Nvidia when preliminary numbers point to the 7900 XTX actually being faster. But no, AMD always has to get the short end of the stick...
 
The effects of the ray-tracing calculations are still rasterised on to the screen. Raster performance is GPU performance, period. Ray-Tracing is just a calculation that is done by the card to know where to rasterise shadows and reflections.
That's not quite how rendering works. Rasterization is the process of converting the 3D volumetric world of vertices into a 2D array of pixels. That's it, there's nothing else involved. No part of rasterization involves where pixels go or even what color they have. That's all done after the rasterization has taken place.

That process is done per primitive in the world view and each pixel is colored on the basis of results from pixel shaders, compute shaders, and ray shaders; sometimes just one of them, sometimes all three. Unfortunately, the term rasterization has become synonymous with just pixel+compute shading, as a means to somehow separate it from ray tracing.
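If it helps to see that ordering, here's a toy software rasterizer in Python (entirely my own illustrative example, nowhere near how real hardware does it): coverage is worked out first, and colour is decided in a separate shading step afterwards.

```python
# Toy software rasterizer, purely illustrative (real GPUs do this in
# parallel fixed-function hardware). Rasterization only answers "which
# pixels does this triangle cover?" -- colour is decided in a later step.

def edge(a, b, p):
    """Signed area test: which side of edge a->b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(tri, width, height):
    """Yield the (x, y) pixels covered by triangle tri. No colour here."""
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)  # sample at the pixel centre
            w0 = edge(tri[1], tri[2], p)
            w1 = edge(tri[2], tri[0], p)
            w2 = edge(tri[0], tri[1], p)
            inside = (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
                     (w0 <= 0 and w1 <= 0 and w2 <= 0)
            if inside:
                yield x, y

def shade(x, y):
    """Colouring happens after coverage is known. In a real renderer this is
    where pixel/compute/ray shader results come in; here it's a gradient."""
    return (x * 8 % 256, y * 8 % 256, 128)

if __name__ == "__main__":
    triangle = [(2.0, 2.0), (28.0, 6.0), (10.0, 28.0)]
    framebuffer = {(x, y): shade(x, y) for (x, y) in rasterize(triangle, 32, 32)}
    print(f"{len(framebuffer)} pixels covered")
```

The `shade` step here is a dumb gradient, but that's the slot where pixel, compute, and ray shader results feed in on a real GPU.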
 
You're right. I have been playing for the last 28 years and right now I can't understand why RT is so important. I saw it in action and it didn't blow my mind. Just a gimmick.
Same here, first game I played was Pong!

Have experienced pretty much everything since, and here I am, still trying to figure out why on Earth everyone is going bananas over RT.

I simply don't see it, and it doesn't justify the ridiculous performance hit it imposes on these GPUs.

The only sample I can say "impressed" me was the Quake demo, but again, it doesn't justify the hardware requirements and performance hit.
 
Sadly, it looks like we lost Steven in the same way that we lost Tim and Digital Foundry.

Pour one for our fallen homies.

That said, it's very interesting how the 6900 and the 6950 are always either close to or faster than the 3090 and the 3090 Ti, yet you never hear them mentioned or used in any of the videos and reviews.
Except of course when the gimmick of RT comes up; then AMD exists, and it's just to push the Nvidia agenda.
You know what else is funny about that? Those two cards are the WORST value Radeon cards out there. Remember that the 6900 XT is only 9% faster than the 6800 XT and the 6950 XT is only 15% faster than the 6800 XT. Meanwhile the 6900 XT is 27% more expensive than the 6800 XT and the 6950 XT is even worse as it's 61% more expensive! I think that sites ignore the 6800 XT because nVidia tells them to. They don't want people to be reminded that there's a card out there that rivals the RTX 3080 for only $515USD! They don't want people to see just how well the 6800 XT performs at its price because they know very well that it's the best-value video card of its generation! If people see that the 6800 XT is perfectly fine for their needs and then see the price, nVidia's sales numbers would drop like a semi going over a cliff. Just check this out (credit to Guru3D for having the most comprehensive video card comparison lists):

RTX 4090 = $1600 (if you're lucky)
RTX 4080 = $1200 (if you're DAMN lucky)
RX 6950 XT = $890
RX 6900 XT = $655
RX 6800 XT = $515

Do you see what I mean? The 6900 XT and 6950 XT are NOT good deals, they just LOOK good compared to nVidia. Compared to the 6800 XT, they're just plain bad (and nVidia is just plain terrible).
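If you want to check the math yourself, here's some quick back-of-the-envelope Python using the street prices above and the rough performance deltas I quoted (approximations from this discussion, not benchmark data):

```python
# Back-of-the-envelope value math. Prices are the street prices listed above;
# the relative performance numbers are just the rough deltas quoted in this
# thread (6900 XT ~9%, 6950 XT ~15% faster than a 6800 XT), not benchmark data.

cards = {
    "RX 6800 XT": {"price": 515, "relative_perf": 1.00},
    "RX 6900 XT": {"price": 655, "relative_perf": 1.09},
    "RX 6950 XT": {"price": 890, "relative_perf": 1.15},
}

baseline = cards["RX 6800 XT"]["price"] / cards["RX 6800 XT"]["relative_perf"]

for name, card in cards.items():
    cost_per_perf = card["price"] / card["relative_perf"]
    print(f"{name}: ${cost_per_perf:.0f} per unit of 6800 XT-level performance "
          f"({cost_per_perf / baseline - 1:+.0%} vs the 6800 XT)")
```

However you cut it, the 6800 XT ends up the cheapest way to buy each frame.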
And yes, 90 out of 100 for this overpriced fire-risk heater...

Yeah, giving this card a 90% is an absolute joke. My level of respect for this site's "impartiality" has plummeted. I always saw Steve Walton as a properly impartial reviewer but after his omission that nVidia was involved in the creation of the Spider-Man game and now this...

There is no denying the reality of the situation. It's very clear that nVidia put the fear of God into Steve Walton. It brings me no joy to say that. ☹️
 
That's not quite how rendering works. Rasterization is the process of converting the 3D volumetric world of vertices into a 2D array of pixels. That's it, there's nothing else involved. No part of rasterization involves where pixels go or even what color they have. That's all done after the rasterization has taken place.

That process is done per primitive in the world view and each pixel is colored on the basis of results from pixel shaders, compute shaders, and ray shaders; sometimes just one of them, sometimes all three. Unfortunately, the term rasterization has become synonymous with just pixel+compute shading, as a means to somehow separate it from ray tracing.
Nevertheless, raster performance is video card performance. RT performance is not.
 
Take pricing out of the picture for your scoring of the card and the card does deliver solid performance.
And just HOW does one take pricing out of the picture? The very suggestion of it is a suggestion to completely ignore one of the most relevant parts of reality. I'm sorry but once you said that, the rest of the post just fell apart.
 
Nevertheless, raster performance is video card performance. RT performance is not.
Apart from the fact that the ray shaders themselves are processed by the same units that process pixel and compute shaders. The dedicated RT units in AMD, Intel, and Nvidia graphics cards do two primary things - handle ray-triangle intersection calculations and the traversal algorithms for hunting through the bounding volume hierarchies to figure out which primitive the ray is intersecting with. All the rest is done by the 'normal' parts of the GPU.
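For anyone curious what those two jobs actually look like, here's a deliberately simplified CPU-side sketch in Python, with toy data structures of my own invention (nothing vendor-specific): a slab test for the bounding boxes, Möller-Trumbore for the ray/triangle intersection, and a simple stack-based walk through the hierarchy.

```python
# Deliberately simplified, CPU-side sketch of the two jobs described above:
# ray/box tests for walking a bounding volume hierarchy, and ray/triangle
# intersection (Möller-Trumbore) at the leaves. Toy data structures of my
# own -- real hardware does this in dedicated units, in parallel.

from dataclasses import dataclass

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ray_aabb(origin, inv_dir, lo, hi):
    """Slab test: does the ray hit the axis-aligned box [lo, hi]?"""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (lo[axis] - origin[axis]) * inv_dir[axis]
        t2 = (hi[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Möller-Trumbore ray/triangle test; returns hit distance or None."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None          # ray parallel to the triangle plane
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None

@dataclass
class Node:
    lo: tuple
    hi: tuple
    children: list   # inner nodes
    triangles: list  # leaf geometry: list of (v0, v1, v2)

def trace(root, origin, direction):
    """Walk the BVH, skipping boxes the ray misses; test triangles at leaves."""
    inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    best, stack = None, [root]
    while stack:
        node = stack.pop()
        if not ray_aabb(origin, inv_dir, node.lo, node.hi):
            continue
        for tri in node.triangles:
            t = ray_triangle(origin, direction, *tri)
            if t is not None and (best is None or t < best):
                best = t
        stack.extend(node.children)
    return best

if __name__ == "__main__":
    leaf = Node((0, 0, 4), (2, 2, 6), [], [((0, 0, 5), (2, 0, 5), (0, 2, 5))])
    root = Node((0, 0, 0), (2, 2, 6), [leaf], [])
    print(trace(root, (0.5, 0.5, 0.0), (0.0, 0.0, 1.0)))  # prints ~5.0
```

Everything that happens with the hit after that point (shading, denoising, compositing) runs on the same shader cores as everything else, which is the point being made above.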
 
I always saw Steve Walton as a properly impartial reviewer but after his omission that nVidia was involved in the creation of the Spider-Man game and now this...

There is no denying the reality of the situation. It's very clear that nVidia put the fear of God into Steve Walton. It brings me no joy to say that. ☹️
From the latest reviews I feel that not only has HU lost it, but LTT, JTC and GN did a while back too.
It looks like all the YouTubers have been on Nvidia's payroll for a while now. There are a few left that remain balanced, but I can count them on one hand.
 
That's no excuse. New generations always brought new features at little or no added cost. SLI, 3D Vision, Surround, PhysX, G-Sync, Reflex, DLSS, ShadowPlay, ray tracing, among others, were all features Nvidia brought in with new architectures without charging ridiculous amounts of money for them. Every generation also made higher resolutions more viable and improved efficiency/thermals too. New features and better efficiency aren't something the 4000 series invented.



No, it wasn't. There was nothing incredible about the price of the 3080. It was merely what was expected. In fact, it's already an increase over the ~$600 or so that 80-tier cards would launch at previously.

The 3080 wasn't "incredible", it was merely a return to the norm. Which highlights how atrocious the 4080 is now.
I've only tried to explain my reasoning for purchasing the card, not whether NVIDIA's business and pricing model is fair to everyone. And you may not remember very well, but when the 3080 pricing was announced we FPS hungry geeks rejoiced. We expected pricing much higher than we got. It didn't last long, of course, but it was incredible at the moment.
 
What blows my mind though is when I read people talking about halo-level cards and they bring up things like FSR and/or DLSS. Upscaling technologies like FSR and DLSS are completely irrelevant to these cards because they don't yet need a performance uplift of any kind.
I disagree with you on DLSS and high end cards. I have never had a halo level card, but I've had a 3080 for a while. DLSS has been critical in many games to reach the 165 fps at 1440p. There is no way around that fact.
 
And you may not remember very well, but when the 3080 pricing was announced we FPS hungry geeks rejoiced.

They rejoiced because it was following the RTX 2000 series, which was notoriously disappointing. Not because it was some "incredible, never-before-seen" value. It was simply a return to the value that was expected from the Pascal, Maxwell, Kepler days.

We expected pricing much higher than we got.

This is straight up wrong. The pricing ($500 70-tier card, $700 80-tier card) was pretty much in line with what was expected. Again, by that point in time the $800 RTX 2080 and $530 RTX 2070 were the highest priced cards we had ever gotten at launch for those respective tiers, and Nvidia caught a lot of flak for it. Many reviews straight up told readers to not bother with the RTX 2080 and just get a GTX 1080 Ti instead. Nobody had reason to believe Nvidia would push prices further up at that time, and they predictably didn't.

What nobody expected was the performance jump we got. That's what people were being cynical about, not the price. People expected a disappointing increase in performance at the same price points, as had been the case with the 2000 series. So everyone was surprised by the $500 3070 matching the $1200 2080 Ti. That's the part that was rejoiced, not the fact the price was what it was.

It seems you're the one who doesn't remember it very well.
 
It is not, in itself, an indication of video card performance any more than having a great sound system is an indication of a car's performance. It is a frill that the device could run perfectly fine without.
If the feature is used in games and makes them look better, then it's not a frill or a pointless feature.

You're missing my point: if I'm going to drop a grand or more on a graphics card, one of the top-tier options, then I want the best one possible, the one that will handle whatever a game can throw at it to the best of its abilities. A person could offer me a truck with 1,000 hp or a sports car with less power, but I'd pick the car because I want to be able to do more than go in a straight line.

If my goal is to run games at ultra, which it is, I don't care that one card/brand does great at one thing and then falters at all the others; if its competitor has the ability to run all those frills instead, then in my eyes it's the winner. Sure, to some those features are pointless, but I'd rather be able to utilize them than have to talk myself out of regretting a purchase later.

Now, if we were talking about lower-budget cards then I could agree with you, because those buyers probably don't care about all the bells and whistles; it's why we have options. But at the top of the hill space is limited, so I want the best.
 
To be clear, Steve does not score products (he doesn't on HUB) but historically we've always scored products in TechSpot reviews (Steve's and others, for years), so that's an (always subjective) layer of editing after Steve wraps up his testing and before we publish the review.

With that said, I will add this...

* We usually score a product based on features, performance, value, competitors, innovation, etc.

* A 90/100 score doesn't mean everyone should buy it, but rather where we believe the product slots among its direct competitors.
I agree that under normal circumstances this logic is flawless. However in this case, it doesn't yet have any direct competitors to which it can be properly compared. I realise that this is a serious handicap when trying to score an item in an incomplete marketplace and I don't envy the task of scoring an item with that much guesswork involved.

Now, if the score on the article changes when it actually does have direct competitors, that would mitigate the problem. I must admit that I don't know if you do that already and if you do, I apologise for my ignorance of it.

The reason this is important is that in the future, someone could read this review and see that 90% (let's be honest, many people just read headlines or conclusions) which would mislead them into thinking that purchasing it would be a good decision. That would be bad for the reader and if they discover that it was a bad decision, they might not trust a techspot review going forward. I like techspot and I don't want that to happen.
* In the case of the RTX 4080, it's a very fast GPU, it's just too expensive for most. In terms of value it's not horrible, but it's not great either. As of writing, there is nothing else that delivers that level of performance (from the competition).
Sure, but being too expensive for most is a pretty big drawback, isn't it? To be fair, I can understand ignoring the price of a halo product (because those are supposed to be pie-in-the-sky) but this is not a halo product. The RTX 4090 is the halo product and this is nowhere near that level of performance. Thus, the value of the item should be addressed in how it's scored. That's just my opinion but I don't think that my logic is flawed.
* It's up to the consumer to decide if they want it/can pay for it. If not, there are alternatives. You will see us scoring other products (GPUs or otherwise) lower if we think they don't perform where they should within their segment/intended market, if they are not well built, are buggy, etc.
I agree that those are all important criteria but it's just as important to assess whether or not they perform where they should at their price point vis a vis other products of the same type in the marketplace. If a consumer sees a 90% score, they're more than likely going to believe that the asking price is justified. Then if they read the article and see that it's not, they're going to be confused. It's just a fact that people are still far more ignorant about computers than they are knowledgeable, on average.
* Needless to say, we try to write fair reviews and don't play favorites with any company. If you don't like company A or company B, that's fine, but we won't judge a product based on that kind of sentiment.
I completely agree with this. This is exactly how it should be done. I may have a hate-on for Intel and nVidia but if I were a reviewer, I would judge products completely on their objective merits and nothing else. This is something that I've always respected about techspot. This is something that I believe everyone here has always respected about techspot. The backlash here I think is because the optics of giving the RTX 4080 a 90% score doesn't exactly give the appearance of impartiality when the RTX 4080 has been almost universally panned by other reviewers.
One last comment not related to the review but to GPUs in general (current and next generation)... GPU makers got spoiled by mining and scalper pricing.
Yes they did, but not equally. One is definitely more spoiled than the other.
My hope is that kind of distortion won't return for the foreseeable future, and if that happens the pricing and lifecycle of these products will have to change and possibly go back to where it was 3+ years ago. In other words, we'd like nothing better than for these 4080/4090s and $1000+ GPUs to become a thing of the past, and to go back to the days where a mainstream GPU cost $200-250 and a high-end one would set you back no more than $500-600 (and less than that months after launch). But that's not true today.
Well said! And we don't know how long that will stay untrue, because if AMD continues the trend of cards being $100 less than their last-gen counterparts (7900 XTX = 6950 XT, 7900 XT = 6900 XT), then perhaps we'll see the RX 7800 XT at $550, which would pretty much guarantee that AMD wins this generation and prices would be almost back to normal (the Radeon HD 7870 was $412, so not too far off).

However, if they **** around and shoot themselves in the foot, that would suck for everyone. The RX 6000-series was their best chance but their pricing only served to snatch defeat from the jaws of victory. This time however, nVidia has really left themselves vulnerable and open to an AMD counterattack with this pricing structure. AMD hasn't had an opportunity like this since Fermi was delayed. Let's hope that they make the most of it, for all our sakes.
 
DLSS has been critical in many games to reach the 165 fps at 1440p
Do you know how that and FSR work?
They are "cheating" by taking a lower-resolution image, upscaling it, and then displaying it at what you think is native resolution.
Why? Simple: look at your own answer. It's to increase FPS, and then use that metric (the way RT is being abused now) to boast about some weird superiority beyond what the owner can perhaps even make use of (the FPS the monitor can actually display).
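To put rough numbers on it, here's a quick Python illustration using the commonly cited internal scale factors for the usual quality modes (treat it as an approximation; the exact ratios and the reconstruction differ between DLSS, FSR and XeSS):

```python
# Rough illustration of the point above: upscalers render internally at a
# lower resolution and then reconstruct the output. The scale factors are the
# commonly cited quality/balanced/performance ratios; the exact numbers and
# the reconstruction itself differ between DLSS, FSR and XeSS.

OUTPUT = (2560, 1440)  # 1440p target
native_pixels = OUTPUT[0] * OUTPUT[1]

modes = {
    "Native": 1.0,
    "Quality": 1 / 1.5,       # ~67% of the output resolution per axis
    "Balanced": 1 / 1.7,
    "Performance": 1 / 2.0,   # 50% per axis
}

for name, scale in modes.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    share = (w * h) / native_pixels
    print(f"{name:>12}: renders {w}x{h} internally "
          f"({share:.0%} of the output pixels), then upscales to {OUTPUT[0]}x{OUTPUT[1]}")
```

Rendering only 25-44% of the pixels is where the "free" FPS comes from; whether the reconstructed image counts as the real thing is exactly the argument.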
 
I disagree with you on DLSS and high end cards. I have never had a halo level card, but I've had a 3080 for a while. DLSS has been critical in many games to reach the 165 fps at 1440p. There is no way around that fact.
And how many games do you need to have 165fps in? Not everyone plays CS:GO or games like it. Most AAA titles are like Assassin's Creed, Tomb Raider, Uncharted or Far Cry. They're not PvP games and 165fps offers no real advantage. In your (very uncommon) position, sure, it makes sense for you. However, I am curious as to what game you're playing because at 1440p, Ampere was at a disadvantage when compared to RDNA2.
 
If the feature is used in games and makes them look better, then it's not a frill or a pointless feature.
But the problem with RT (in my opinion) is that the required hardware doesn't justify the results.

But at the top of the hill space is limited, so I want the best.
If you can afford it, go wild.

But not everything that is ridiculously expensive is really worth it. For example, under normal circumstances, buying a GPU for $2K would never really be a sensible investment (unless you make some money with it, and even then it has limits).
If you do have FU money and don't value it, then again, go wild.
 
I've only tried to explain my reasoning for purchasing the card, not whether NVIDIA's business and pricing model is fair to everyone. And you may not remember very well, but when the 3080 pricing was announced we FPS hungry geeks rejoiced. We expected pricing much higher than we got. It didn't last long, of course, but it was incredible at the moment.
While I was happy with the price of the 3080 (the short... very short period it was at MSRP), we weren't really "rejoicing" because of that. Everybody was just happy that they could finally skip the 2000 series.
 