Nvidia GeForce RTX 4080 Review: Fast, Expensive & 4K Capable All the Way

tl;dr:
1. the chip isn't 2x more expensive to make
That first estimate was a minimum. AMD's graph is a little too indistinct to draw any really accurate figures from, but within the margin of error the cost scaling could be as high as 2.5 times. If that were the case, then AD103 chips would be 74% more expensive to fabricate than TU104s. The biggest unknown factor is just how much TSMC is charging for N4 production compared to N5 and N6.
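As a rough sanity check of that 74% figure, here's a simple per-die cost model. It assumes cost scales linearly with wafer price and die area and ignores yield differences; the die sizes (~379 mm² for AD103, ~545 mm² for TU104) are approximate published figures, and the 2.5x wafer-cost scaling is just the upper-bound estimate above, not a known number:

```python
# Per-die fabrication cost ratio, assuming cost is proportional to
# (wafer price) x (die area) and ignoring yield differences.
AD103_AREA_MM2 = 379  # approximate published die size
TU104_AREA_MM2 = 545  # approximate published die size

def die_cost_ratio(wafer_cost_scale: float) -> float:
    """Relative cost of an AD103 die vs a TU104 die, given how much
    more an N4-class wafer costs than a 12nm-class wafer."""
    return wafer_cost_scale * (AD103_AREA_MM2 / TU104_AREA_MM2)

print(f"{die_cost_ratio(2.5):.2f}")  # ~1.74 -> roughly 74% more per die
```

With a 2.5x wafer cost the smaller AD103 die comes out ~74% more expensive per die, which is where that number lands; a 2.0x wafer cost would give only ~39%.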
2. the RAM isn't 2x more expensive (let's not forget the price the 3090 with 24GB GDDR6X is currently selling at now)
No, it's not twice the cost, obviously. The 4080 and 3090 Ti do use different density GDDR6X modules than the 3090 (16Gb vs 8Gb) though, and the former uses faster modules (24Gbps vs 21 Gbps), so they will be more expensive. How much more so is another unknown.
That leaves the cooler, PCB and other components price which I also doubt that they're 2x more expensive (especially with the fairly decent power draw the card has).
I'm assuming that you're going with the "twice as expensive" comment because the MSRP of the 4080 is 72% more than that of the 3080 ($1199 vs $699) and you've estimated that production/purchasing costs can't account for that increase. But R&D, staffing, buildings, marketing, advertising, software development, packaging, storage, distribution, import fees, licensing fees, etc. are all part of that price tag too. For all we know, Nvidia could well be experiencing much higher costs in all of those areas, but equally, they may not.

No matter how outraged one is about the price of the new GeForce cards, there's no real need to get worked up about it or expect some kind of protest by reviewers and the like. There's a far simpler solution: just don't buy it. And if enough people feel the same way about the price tags, then Nvidia will get the message in the clearest possible way -- lower revenues and smaller margins, regardless of how much of a market leader they are or how good their products are at what they do.
 
* We usually score a product based on features, performance, value, competitors, innovation, etc.

* A 90/100 score doesn't mean everyone should buy it, but rather where we believe the product slots among its direct competitors.


* It's up to the consumer to decide if they want it/can pay for it. If not, there are alternatives. You will see us scoring other products (GPUs or otherwise) lower if we think they don't perform where they should within their segment/intended market, if they are not well built, are buggy, etc.

* Looking forward to the score you give to the $1,000 RX 7900 XTX that, according to AMD, is a 4080 competitor.

Right now we have no idea how it will perform, but the review should show that.
 
Relax dude. Lamborghinis aren't necessary, or a good value, either. But that doesn't mean they "shouldn't exist".

Stop projecting your frustrations with personal income on the rest of the world. If you can't afford it, sorry. But it doesn't mean a product is some sort of an abomination that should be banned. It just means it's not in your future for the moment. DEAL

LMAO

Lamborghini? What are you smoking? This is a graphics card, not a luxury product. Nobody will admire you or give you "social clout" or whatever for having an RTX 4000 card.

And yes, it literally shouldn't exist. When put into the context where every GPU generation before this one both increased the maximum amount of performance available AND increased performance/$ in every segment at the same time (even the RTX 2000 series did it to some degree), the 4000 series, so far, is a huge failure. New GPUs shouldn't just infinitely increase in price every generation, they should replace the older GPUs in the same-ish price brackets, which is how the GPU market has operated in the past two decades.

Also, about the pathetic quip about personal income: The 4080 itself is irrelevant. $1000+ GPUs are a minuscule, insignificant portion of the GPU market. This product is inconsequential to the vast, overwhelming majority of the market. The issue highlighted in my comment is not about this product in particular, it's that the whole reason Nvidia is launching this particular product at a price that everyone, including themselves, knows is ridiculous is because this is their attempt to keep the RTX 3000 series at artificially inflated prices. That affects the market segments that are not irrelevant, like the RTX 3060 Nvidia is asking almost $400 for, or the RTX 3050 that Nvidia is asking a hilarious $300 for. Nvidia is launching 4000 series cards at ridiculous prices in an attempt to make leftover 3000 stock appear to be a better deal than it is. So even though the 4080 is an irrelevant product as far as actual market presence goes, it not receiving a proper $700/$800 or so price means it won't push the 3080 down, which won't push the 3070 down, which won't push the 3060 down, and so on. And THAT affects the products people actually buy.
 
No matter how outraged one is about the price of the new GeForce cards, there's no real need to get worked up about it or expect some kind of protest by reviewers and the like. There's a far simpler solution: just don't buy it. And if enough people feel the same way about the price tags, then Nvidia will get the message in the clearest possible way -- lower revenues and smaller margins, regardless of how much of a market leader they are or how good their products are at what they do.

Your "just don't buy it" idea has a problem. It assumes that 1) consumers are always well informed, and 2) consumers always act rationally and in their best interest. But that's not true; most consumers are casual shoppers who aren't (nor should they be obligated to be) expertly informed about the market conditions they're buying in. That's why market regulations and regulatory agencies, consumer protection laws, and even product reviewers came to exist. Consumers need protections, as unregulated/unprotected markets always lead to abusive, anti-consumer behavior by corporations. This idea that "the free market self-regulates" is not compatible with reality; it's the delusion of libertarian teenagers.

And yes, as this isn't a tightly regulated market, we need reviewers to talk about this. Nvidia is using absurd launches like this, coupled with their equally ridiculous "Moore's law is dead, cheap GPUs are a thing of the past" line, in a clear attempt to use their dominant position to fix prices and hold them at artificially inflated levels. Tech news outlets whined for years about Intel's stagnation, a result of Intel using its dominant position to sabotage the competition and then inflate the prices of mediocre products ("would anyone like another $350 4-core flagship CPU this year?"). But now that Nvidia has started going down the same path, it's all fine and dandy? The market will "self-regulate and stop the abusive practices", like it did with Intel?
 
That first estimate was a minimum. AMD's graph is a little too indistinct to draw any really accurate figures from, but within the margin of error the cost scaling could be as high as 2.5 times. If that were the case, then AD103 chips would be 74% more expensive to fabricate than TU104s. The biggest unknown factor is just how much TSMC is charging for N4 production compared to N5 and N6.

No, it's not twice the cost, obviously. The 4080 and 3090 Ti do use different density GDDR6X modules than the 3090 (16Gb vs 8Gb) though, and the former uses faster modules (24Gbps vs 21 Gbps), so they will be more expensive. How much more so is another unknown.

I'm assuming that you're going with the "twice as expensive" comment because the MSRP of the 4080 is 72% more than that of the 3080 ($1199 vs $699) and you've estimated that production/purchasing costs can't account for that increase. But R&D, staffing, buildings, marketing, advertising, software development, packaging, storage, distribution, import fees, licensing fees, etc. are all part of that price tag too. For all we know, Nvidia could well be experiencing much higher costs in all of those areas, but equally, they may not.

No matter how outraged one is about the price of the new GeForce cards, there's no real need to get worked up about it or expect some kind of protest by reviewers and the like. There's a far simpler solution: just don't buy it. And if enough people feel the same way about the price tags, then Nvidia will get the message in the clearest possible way -- lower revenues and smaller margins, regardless of how much of a market leader they are or how good their products are at what they do.
All of that was already included in the price of the previous gen. Why would things like R&D suddenly make a huge difference in pricing this gen? (And the server side is usually the one with the highest profit margins and where most of the R&D goes.)

And it doesn't matter that the ram chips are more expensive, it still has 8GB less GDDR6X RAM. Nvidia is most likely actually saving a few bucks here, especially since they are using a smaller bus size. Is 16GB @ 22.4Gbps on a 256bit bus really more expensive than 24GB @ 19.5Gbps on a 384bit bus?
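For what it's worth, the raw memory bandwidth of those two configurations can be compared directly (peak bandwidth = bus width in bytes x per-pin data rate); the 3090's wider bus still comes out ahead despite its slower modules:

```python
def bandwidth_gbps(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak GDDR memory bandwidth in GB/s: bus width in bytes times data rate."""
    return (bus_bits / 8) * data_rate_gbps

rtx_4080 = bandwidth_gbps(256, 22.4)  # ~716.8 GB/s
rtx_3090 = bandwidth_gbps(384, 19.5)  # ~936.0 GB/s
print(rtx_4080, rtx_3090)
```

So the 4080's memory subsystem has fewer chips, a narrower bus, and less total bandwidth; whether that makes it cheaper overall is the open question.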

As for the BOM, it's generally just the larger cooler that is more expensive. Power delivery (which is usually a good chunk of the BOM) can't be that much more expensive since it's not a monster like the 4090.

In fact, everything here is based on the process node being much more expensive. According to TSMC, the N5 process node has fewer defects than N7 had at the same point in its life. So yields should, in theory, not be bad.

With the 4080's die size being almost half that of the 3080/3090, the excuse of double the cost of a wafer also falls apart.


We are, by the numbers, looking at a huge increase in price that isn't justified by the BOM and other production costs. Nvidia knows it can sell all GPUs at these high prices, and they also don't want to cut the price of the 3000 series any more.
 
* In the case of the RTX 4080, it's a very fast GPU, it's just too expensive for most. In terms of value it's not horrible, but it's not great either. As of writing, there is nothing else that delivers that level of performance (from the competition).
Thank you for at least coming right out and saying that the first cards of each generation will enjoy a permanent buff to their review score, because you won't come back and re-evaluate. First to market wins forever. The honesty is refreshing.
 
All of that was already included in the price of the previous gen. Why would things like R&D suddenly make a huge difference in pricing this gen?
The 'why' is anyone's guess, but Nvidia's financial statements show R&D costs in 2022 being double that of 2021. Their long-term debts are significantly higher too.
And it doesn't matter that the ram chips are more expensive, it's still 8GB less GDDR6X RAM. Nvidia is most likely actually saving a few bucks here, especially since they are using a smaller bus size. Is 16GB @ 22.4Gbps on a 256bit bus really more expensive than 24GB @ 19.5Gbps on a 384bit bus?
At face value, one wouldn't expect 24 chips of slower GDDR6X to be cheaper than 8 chips of faster GDDR6X, but the chips aren't the same - the latter is double the density of the former. It could be harder/more expensive for Micron to fabricate them, but it might not be. The point is we simply don't know.
In fact, everything here is based on the process node being much more expensive. According to TSMC, the N5 process node has fewer defects than N7 had at the same point in its life. So yields should, in theory, not be bad.
Yes, in theory, and TSMC themselves have said that the N4 process has the same yields as N5. However, that's a same-die comparison. We don't know anything about what differences Nvidia has requested for the fabrication (older architectures, such as Pascal, used an NV-specific node) nor do we know how sensitive the die design is to defects. All we do know is that a full AD103 chip has 84 SMs, whereas those used in the 4080 have 76; it also has the full amount of L2 cache (64MB), so we're looking at dies that are coming out at the top end of the binning process (unlike the first 3080 which used fairly middling GA102 dies).
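On the yield point, the standard first-order way to reason about this is a Poisson defect model, where the fraction of fully working dies falls exponentially with die area. The defect density below is purely an illustrative assumption, not a disclosed TSMC figure; the AD103 area (~3.79 cm²) is from public spec listings:

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Fraction of fully functional dies under a simple Poisson defect model."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Hypothetical defect density of 0.07/cm^2; AD103 is ~3.79 cm^2.
print(f"{poisson_yield(0.07, 3.79):.2f}")  # fraction of defect-free AD103 dies
```

This is also why harvested parts (76 of 84 SMs enabled) exist at all: dies with a defect in one SM can still be sold as a cut-down SKU, so the effective yield is better than the defect-free fraction alone.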
With the 4080's die size being almost half that of the 3080/3090, the excuse of double the cost of a wafer also falls apart.
I never said that the AD103 was double the cost of the GA102. Read again: I said that it was 40% to 70% more than the cost of the TU104. The GA102 was made on Samsung's old 8nm node, for which we have no cost comparisons.
We are, by the numbers, looking at a huge increase in price that isn't justified by the BOM and other production costs.
What numbers? We don't have any actual figures for any of these things - just guesses, estimates, and assumptions. But no matter: you and many other people believe that the price of the 4080/4090 isn't justifiable, and that's absolutely fine. I have no problem with such views, and if millions of people around the world don't buy any RTX 40 series card because of the price, then good on them.

I'm simply looking at this from a professional perspective. There's enough evidence around to strongly suggest that the design and manufacturing of graphics cards are notably more expensive than they were, say, four years ago. AMD made a point of highlighting this in the RX 7000 launch, and many slides in their presentations were dedicated to the point of designing everything to keep costs down. Nvidia has taken a different approach and has decided to pass those costs on to the buyer.

Time, and financial reports, will tell us which approach is better for the companies and consumers.
 
The 'why' is anyone's guess, but Nvidia's financial statements show R&D costs in 2022 being double that of 2021. Their long-term debts are significantly higher too.

At face value, one wouldn't expect 24 chips of slower GDDR6X to be cheaper than 8 chips of faster GDDR6X, but the chips aren't the same - the latter is double the density of the former. It could be harder/more expensive for Micron to fabricate them, but it might not be. The point is we simply don't know.

Yes, in theory, and TSMC themselves have said that the N4 process has the same yields as N5. However, that's a same-die comparison. We don't know anything about what differences Nvidia has requested for the fabrication (older architectures, such as Pascal, used an NV-specific node) nor do we know how sensitive the die design is to defects. All we do know is that a full AD103 chip has 84 SMs, whereas those used in the 4080 have 76; it also has the full amount of L2 cache (64MB), so we're looking at dies that are coming out at the top end of the binning process (unlike the first 3080 which used fairly middling GA102 dies).

I never said that the AD103 was double the cost of the GA102. Read again: I said that it was 40% to 70% more than the cost of the TU104. The GA102 was made on Samsung's old 8nm node, for which we have no cost comparisons.

What numbers? We don't have any actual figures for any of these things - just guesses, estimates, and assumptions. But no matter: you and many other people believe that the price of the 4080/4090 isn't justifiable, and that's absolutely fine. I have no problem with such views, and if millions of people around the world don't buy any RTX 40 series card because of the price, then good on them.

I'm simply looking at this from a professional perspective. There's enough evidence around to strongly suggest that the design and manufacturing of graphics cards are notably more expensive than they were, say, four years ago. AMD made a point of highlighting this in the RX 7000 launch, and many slides in their presentations were dedicated to the point of designing everything to keep costs down. Nvidia has taken a different approach and has decided to pass those costs on to the buyer.

Time, and financial reports, will tell us which approach is better for the companies and consumers.
"I'm simply looking at this from a professional perspective."

And I gave exactly such a perspective, looking at the numbers we have available. Saying that a GPU die half the size of the older one costs Nvidia 40 to 70% more is just not realistic.

We also know that the full AD103 die has at least 10752 shader units while the 4080 has 9728. So it's not the full die. Nvidia has plenty of room for a future 4080ti (a more than 10% increase in cores).

As for the RAM: it doesn't matter that we don't know the exact figures. We can easily infer that the cost per GB didn't double (I doubt it grew even close to 50%). It's the same technology, and I've never seen higher-density memory chips carry exorbitant price premiums over others (in terms of $/GB). We haven't yet switched to GDDR7, and the 4080 still has a lot less VRAM than the 3090.

Quick math: even if we assume a huge 50% price increase per GB, that would just put the 4080 at parity in terms of memory cost with the 3090 (not taking into account the simplification of the PCB, as they are using fewer chips and a smaller bus).
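That quick math, spelled out (cost per GB for the 3090's memory normalized to 1.0; the 50% premium is the deliberately generous assumption from above, not a known figure):

```python
def vram_cost(capacity_gb: float, cost_per_gb: float) -> float:
    """Total memory cost in arbitrary units (baseline cost/GB = 1.0)."""
    return capacity_gb * cost_per_gb

cost_3090 = vram_cost(24, 1.0)  # 24 GB at the baseline cost per GB
cost_4080 = vram_cost(16, 1.5)  # 16 GB even at an assumed 50% premium
print(cost_3090, cost_4080)     # both come out equal -> parity at worst
```

In other words, even under a worst-case 50% per-GB premium, the 4080's 16 GB costs no more than the 3090's 24 GB did.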

It's much more realistic to consider the price to be high so that it allows them to sell the older generation at a high price too.

According to this, Nvidia didn't double their R&D:

In fact, the financial statement you gave, which was released in February, says that Nvidia had record revenue and insane profitability. The R&D increase is a natural result of their increased revenue, and it also seems that the R&D increase (percentage-wise) is nowhere near the revenue increase (YoY).

I'm seriously having a hard time understanding your take on the price of the 4080. It's not a halo product, it's not a full die, it's not a big die, it doesn't have a lot of VRAM, it has a 256-bit bus, and it doesn't need a big and expensive power delivery system on the PCB. It has nothing... We are looking at a zero increase in perf/$. We don't need to speculate; it's clearly priced to make the older gen still appealing to consumers.
 
My local Micro Center still has inventory on hand for the 4080. They show 55 cards in stock as of this morning on their website.

Hopefully this is a sign that people aren't really looking for these cards. I think the downside to this, though, is that not everyone has a local brick-and-mortar store they can visit to pick one of these cards up, so they have to rely on the online retailers.

None of the online retailers I looked at have any of the 4080s for sale at MSRP pricing. You can certainly find third parties (on Newegg) listing cards for upwards of $200+ over MSRP... so those people are stuck waiting and waiting to see if they can find one online at MSRP, or they take the dive and pay a scalper for one.
 
I think many people would say that they're features they don't personally care about enough to want to pay extra for. I currently have no intention of buying any 4xxx cards myself. But it seems unreasonable to cast the tech itself as irrelevant.

Upscaling is one of the best technologies/techniques to come out in a long time to help drive higher frame rates without having to sacrifice too much in the way of perceived image quality. If you're running on a low res monitor, sure it won't matter to you. For anyone running a high-refresh 4k monitor, it's fantastic and absolutely could be a deciding factor in which card you buy.

Ray-tracing *was* "fluffy junk" when the performance hits weren't worth the visual trade off. When (some) cards are now able to combine technologies to allow ray tracing and still produce great frame rates, why wouldn't you turn it on if you could? (assuming you place enough value on how pretty things look).

So yeh. I wouldn't write-off technologies that *both* companies are actively pursuing and improving upon. Just don't spend outside of your own needs/means/wants when weighed against what's available.
I know I'm in the extreme minority. I'll be in the sub-1% group of gamers who purchase a 4090 or 4080. And that's something I'm perfectly cool with. But your point is absolutely the reason why. I am willing to spend that on a GPU because I want every single feature turned on. The performance of the 4080 is finally making RT at "Ultra" graphics settings acceptable on most games for my needs and wants. The 3080 did a decent job, but I still played with RT off in nearly every scenario. I absolutely hate the price of the 4080, but the massive performance uplift was enough for me to spend the dough.
 
Can't believe the score this GPU has gotten. So TechSpot is actually recommending to its readers that paying 70% more for the same kind of product (RTX 3080) is the way to go? As of today, PC gaming is a joke, and PC-oriented websites should be trying to do something about it.
I mean, the thing is they are not the same product. The 3080 (which I currently have) is barely comparable at all to the 4080. They are the same type of product, but not even close when it comes to performance.
 
In my country the RTX4080 is almost 1.9-2k Euro VAT included.
But miners have started to dump big on the second-hand market. An RTX 3080 is 500-550 now.
 
Isn‘t that the case with almost every new GPU gen though ? The 3080 was much faster than the 2080 it replaced but at the same msrp.
You're not wrong. But the power consumption, thermals, and lack of true effectiveness at running RT at any resolution, never mind bleeding-edge resolutions, put the 3080 in a sub-par category. This is just my opinion. The MSRP of the original 3080 was incredible, and I only hoped that would continue. Alas, it does not seem like that will ever be the case moving forward. Honestly, though, Nvidia has simply replaced the product slot of what used to be the xx80 series with the xx70 series. I don't think Nvidia really intends to compare the xx80 series with previous xx80 series anymore. The comparisons with the 4080 now are with the 3090 and 3090 Ti. The 4070 model pricing will reflect that again, as it did with the 3080.
 
LMAO

Lamborghini? What are you smoking? This is a graphics card, not a luxury product. Nobody will admire you or give you "social clout" or whatever for having an RTX 4000 card.

And yes, it literally shouldn't exist. When put into the context where every GPU generation before this one both increased the maximum amount of performance available AND increased performance/$ in every segment at the same time (even the RTX 2000 series did it to some degree), the 4000 series, so far, is a huge failure. New GPUs shouldn't just infinitely increase in price every generation, they should replace the older GPUs in the same-ish price brackets, which is how the GPU market has operated in the past two decades.

Also, about the pathetic quip about personal income: The 4080 itself is irrelevant. $1000+ GPUs are a minuscule, insignificant portion of the GPU market. This product is inconsequential to the vast, overwhelming majority of the market. The issue highlighted in my comment is not about this product in particular, it's that the whole reason Nvidia is launching this particular product at a price that everyone, including themselves, knows is ridiculous is because this is their attempt to keep the RTX 3000 series at artificially inflated prices. That affects the market segments that are not irrelevant, like the RTX 3060 Nvidia is asking almost $400 for, or the RTX 3050 that Nvidia is asking a hilarious $300 for. Nvidia is launching 4000 series cards at ridiculous prices in an attempt to make leftover 3000 stock appear to be a better deal than it is. So even though the 4080 is an irrelevant product as far as actual market presence goes, it not receiving a proper $700/$800 or so price means it won't push the 3080 down, which won't push the 3070 down, which won't push the 3060 down, and so on. And THAT affects the products people actually buy.
For me, this is absolutely a luxury product. It's my hobby, and my only hobby. There is no logical reason for me to own a 4080 no matter how much I game. But I can afford it, and I want the best that I am willing to pay for. I'll admit, this pricing is more than I wanted to pay, and I am definitely at the ceiling for a "luxury" graphics card.
 
I mean, the thing is they are not the same product. The 3080 (which I currently have) is barely comparable at all to the 4080. They are the same type of product, but not even close when it comes to performance.

The RTX 3080 was 66% faster than the RTX 2080 for 12.5% LESS money ($700 vs $800 MSRP at launch respectively).

The RTX 4080 is 49% faster than the RTX 3080 for 71% more money ($1200 vs $700).

Do you see where the problem is?

It goes even further. The 2080 was 37% faster than the 1080 for 33% more money ($800 vs $600), and that was widely regarded as a disappointing launch. The 1080, in turn, was 66% faster than the 980 for 10% more money ($600 vs $550). The 980 was 30% faster than the GTX 780 for 15% less money ($550 vs $650).

The 4000 series is the first time ever a new generation of GPUs has regressed in performance/$. Not even the 2000 series was this bad.
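Those generational comparisons all reduce to a single perf/$ ratio: the speedup divided by the price ratio (speedups and MSRPs are the ones quoted above):

```python
def perf_per_dollar_change(speedup: float, new_price: float, old_price: float) -> float:
    """Generational change in performance per dollar.
    > 1.0 means perf/$ improved; < 1.0 means it regressed."""
    return speedup / (new_price / old_price)

print(f"{perf_per_dollar_change(1.66, 700, 800):.2f}")   # 3080 vs 2080: ~1.90
print(f"{perf_per_dollar_change(1.37, 800, 600):.2f}")   # 2080 vs 1080: ~1.03
print(f"{perf_per_dollar_change(1.49, 1200, 700):.2f}")  # 4080 vs 3080: ~0.87
```

By this measure the 3080 nearly doubled perf/$, the much-maligned 2080 still eked out a small gain, and the 4080 is the only one of the three that lands below 1.0.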
 
I know I'm in the extreme minority. I'll be in the sub-1% group of gamers who purchase a 4090 or 4080. And that's something I'm perfectly cool with. But your point is absolutely the reason why. I am willing to spend that on a GPU because I want every single feature turned on. The performance of the 4080 is finally making RT at "Ultra" graphics settings acceptable on most games for my needs and wants. The 3080 did a decent job, but I still played with RT off in nearly every scenario. I absolutely hate the price of the 4080, but the massive performance uplift was enough for me to spend the dough.

Does it bother you that Nvidia hasn't actually improved RT performance over Ampere?

When you put RT on for Ampere, you get around a 40-45% hit to performance.
When you put RT on for Ada, you get around a 40-45% hit to performance.

Nvidia made no strides in performance improvement with RT over Ampere. The only reason it appears to be better is simply because of the improved rasterization performance.
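A minimal illustration of the point: if the fractional RT hit is unchanged, the RT frame rate gain is exactly the raster gain. The 49% raster uplift is the figure from this thread; the 42.5% hit is just the midpoint of the 40-45% range:

```python
def rt_fps(raster_fps: float, rt_hit: float) -> float:
    """Frame rate with ray tracing on, given a fractional performance hit."""
    return raster_fps * (1 - rt_hit)

ampere_rt = rt_fps(100, 0.425)      # e.g. 100 fps raster -> 57.5 fps with RT
ada_rt = rt_fps(149, 0.425)         # 49% faster raster, same 42.5% RT hit
print(f"{ada_rt / ampere_rt:.2f}")  # 1.49 -> the RT gain equals the raster gain
```

The (1 - rt_hit) factor cancels out of the ratio, so a constant hit means RT performance only scales with rasterization, which is the claim being made.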

I'm just curious to see what you think about it.
 
Does it bother you that Nvidia hasn't actually improved RT performance over Ampere?

When you put RT on for Ampere, you get around a 40-45% hit to performance.
When you put RT on for Ada, you get around a 40-45% hit to performance.

Nvidia made no strides in performance improvement with RT over Ampere. The only reason it appears to be better is simply because of the improved rasterization performance.

I'm just curious to see what you think about it.
Well, my initial thought would be that even if the hit is 40-45%, the overall performance now allows for those RT'd games to be played at a resolution and framerate I would want in order to turn those features on. I'd say that my interest in these products really stops at the performance I can get with the games I play at the settings I want to play them. Sure, it would be phenomenal if the RT hit in this new gen was, say, 25%. I'd find that super interesting and impressive. But the price would be the same if not more. If my interest in these products centered more on technological advancement in the industry and less on actual gaming, I'd have a bigger issue with the 40-45% RT hit not improving.
 
Well, my initial thought would be that even if the hit is 40-45%, the overall performance now allows for those RT'd games to be played at a resolution and framerate I would want in order to turn those features on. I'd say that my interest in these products really stops at the performance I can get with the games I play at the settings I want to play them. Sure, it would be phenomenal if the RT hit in this new gen was, say, 25%. I'd find that super interesting and impressive. But the price would be the same if not more. If my interest in these products centered more on technological advancement in the industry and less on actual gaming, I'd have a bigger issue with the 40-45% RT hit not improving.
One more bit about why I was willing to pay for a 4080 at this admittedly high price: When I built my first gaming PC about eight years ago, I purchased a used 1050 Ti 2 GB. I traded that in for a 1060 3 GB. I traded that in for a 2070 Ti (with a discount from a friend). Then I sold that for a 2080 Ti (with a discount from a friend). Then I sold that for the incredibly priced 3080 and made a profit. I will sell my 3080 to a friend for a reasonable price, most likely $500. So in my mind, I'm paying a more reasonable price to upgrade to the 4080. If I did not have an investment with the 3080, I would absolutely not buy the 4080. No way.
 
I mean, the thing is they are not the same product. The 3080 (which I currently have) is barely comparable at all to the 4080. They are the same type of product, but not even close when it comes to performance.

It was Nvidia's mistake not to call the GPUs differently...

They could have gone with the $1600 RTX 4000 Titan
$1200 RTX 4090
$900 RTX 4080 ... just to satisfy the AMD fanboys.
I'm being sarcastic, of course.

Coming off my previous comment, we can vote with our wallets.
I hope the next-gen Radeons do extremely well, so they become the better buys for less (and eventually Nvidia will be forced to reduce pricing, just as it used to be for a decade before mining was ever part of the equation).
 
The RTX 3080 was 66% faster than the RTX 2080 for 12.5% LESS money ($700 vs $800 MSRP at launch respectively).

The RTX 4080 is 49% faster than the RTX 3080 for 71% more money ($1200 vs $700).

Do you see where the problem is?

It goes even further. The 2080 was 37% faster than the 1080 for 33% more money ($800 vs $600), and that was widely regarded as a disappointing launch. The 1080, in turn, was 66% faster than the 980 for 10% more money ($600 vs $550). The 980 was 30% faster than the GTX 780 for 15% less money ($550 vs $650).

The 4000 series is the first time ever a new generation of GPUs has regressed in performance/$. Not even the 2000 series was this bad.
I understand what you're saying. But what I'm pointing to is not pure speed but rather all of the other features possible with the 4080 over the 3080 vs the 3080 over the 2080. While the "pure" speed is somewhere around 50% faster, in many cases more, the ability to handle RT at 1440p and 4K, along with improved DLSS performance, power efficiency, and essentially the same or better thermal management, are definitely factors in the purchase decision for me.

I will never tell you the 4080 price is a better value than the 3080. The original 3080 MSRP was incredible at the time.
 
It was Nvidia's mistake not to call the GPUs differently...

They could have gone with the $1600 RTX 4000 Titan
$1200 RTX 4090
$900 RTX 4080 ... just to satisfy the AMD fanboys.
I'm being sarcastic, of course.

Coming off my previous comment, we can vote with our wallets.
I hope the next-gen Radeons do extremely well, so they become the better buys for less (and eventually Nvidia will be forced to reduce pricing, just as it used to be for a decade before mining was ever part of the equation).
I completely agree. This 4080 is not intended to be compared with the 3080. That's just a fact. This is the new xx90 series. I also agree with voting with your wallet. And it will happen. The 4080 is not going to sell well. Period.
 
I understand what you're saying. But what I'm pointing to is not pure speed but rather all of the other features possible with the 4080 over the 3080 vs the 3080 over the 2080. While the "pure" speed is somewhere around 50% faster, in many cases more, the ability to handle RT at 1440p and 4K, along with improved DLSS performance, power efficiency, and essentially the same or better thermal management, are definitely factors in the purchase decision for me.

That's no excuse. New generations always brought new features at little or no added cost. SLI, 3D Vision, Surround, PhysX, G-Sync, Reflex, DLSS, ShadowPlay, ray tracing, among others, were all features Nvidia brought in with new architectures without charging ridiculous amounts of money for them. Every generation also made higher resolutions more viable and improved efficiency/thermals too. New features and better efficiency aren't something the 4000 series invented.

I will never tell you the 4080 price is a better value than the 3080. The original 3080 MSRP was incredible at the time.

No, it wasn't. There was nothing incredible about the price of the 3080. It was merely what was expected. In fact, it was already an increase over the ~$600 or so that 80-tier cards would launch at previously.

The 3080 wasn't "incredible", it was merely a return to the norm. Which highlights how atrocious the 4080 is now.
 