AMD Radeon RX 7900 XT vs. Radeon RX 6800 XT

At $800 I still feel the 7900XT has some room to come down in price; at $750 we'd have a pretty solid winner. The 6800XT is still a great card, so I would have a hard time recommending upgrading from it in general. I have a 6700XT and would very much like to get a 7900XT in the $700-750 price range. I have a native 4K display, so the 6700XT is workable with some settings tweaks, but it's far from an ideal experience.
 
$750, really? More like $600. Seriously. I wish there were a cost breakdown per card of what it costs AMD to produce one. You would be utterly shocked.

I know there is a lot of overhead involved with that: staff, R&D, everything. But the more cards they sell, the less it costs per card.
 
$750, really? More like $600. Seriously. I wish there were a cost breakdown per card of what it costs AMD to produce one. You would be utterly shocked.

I know there is a lot of overhead involved with that: staff, R&D, everything. But the more cards they sell, the less it costs per card.
I agree. I think if the market were the one we had pre-pandemic, it would be $500 or less, but this is the market we're in, and these companies are not our friends. Their sole purpose is to make money, and you have the scalpers to thank for showing them that people were willing to pay such high prices.
 
At $800 I still feel the 7900XT has some room to come down in price; at $750 we'd have a pretty solid winner.
AMD will be reluctant to do this -- the operating margin in its Gaming sector isn't great (16% in Q4 2022, 14% for the whole year). The bulk of that sector's revenue comes from PS5, Steam Deck, and Xbox Series X/S SoC sales, but given that these are already low-margin products, significantly dropping the prices on dGPUs will only hurt that sector even more.

That said, at least it made some operating income in Q4 -- the Client sector (i.e. Ryzen) actually made a loss.

The polls make for interesting reading and I feel that a reasonable portion of people have unrealistic expectations concerning performance improvements with new GPUs. An increase of 50 to 60% would require substantial changes in architecture design and chip size.
 
AMD will be reluctant to do this -- the operating margin in its Gaming sector isn't great (16% in Q4 2022, 14% for the whole year). The bulk of that sector's revenue comes from PS5, Steam Deck, and Xbox Series X/S SoC sales, but given that these are already low-margin products, significantly dropping the prices on dGPUs will only hurt that sector even more.

That said, at least it made some operating income in Q4 -- the Client sector (i.e. Ryzen) actually made a loss.

The polls make for interesting reading and I feel that a reasonable portion of people have unrealistic expectations concerning performance improvements with new GPUs. An increase of 50 to 60% would require substantial changes in architecture design and chip size.
There is a lot of really cool tech in these cards, so I'm not surprised that their margins are low. Hopefully, since they've already invested in the R&D for their chiplet GPU designs, we'll see higher margins and lower prices in later generations.
 
AMD will be reluctant to do this -- the operating margin in its Gaming sector isn't great (16% in Q4 2022, 14% for the whole year). The bulk of that sector's revenue comes from PS5, Steam Deck, and Xbox Series X/S SoC sales, but given that these are already low-margin products, significantly dropping the prices on dGPUs will only hurt that sector even more.

That said, at least it made some operating income in Q4 -- the Client sector (i.e. Ryzen) actually made a loss.

The polls make for interesting reading and I feel that a reasonable portion of people have unrealistic expectations concerning performance improvements with new GPUs. An increase of 50 to 60% would require substantial changes in architecture design and chip size.
I would think that AMD's bread and butter would come from the HPC market. At one point, compute operations on AMD cards were far faster than anything Nvidia had on the market.
 
Not sure why you guys keep testing $800+ cards at medium settings in CS:GO. The 7900XT/3080 Ti can average over 200 FPS at 4K max settings. Medium settings should target far less expensive and/or less powerful cards.
 
The 6800XT, 7900XT, and 7900XTX are the only cards worth buying right now at the high end, unless you really want RT -- at <$600, $800, and $1000 respectively. Thanks for the comparison, TS.
There's nothing at all wrong with the 4070Ti, especially if you are going to be gaming mostly at 1440p high FPS. It has a very similar level of performance to the 7900XT, has far better RT performance, has DLSS 3, and can often be found for around $50 less. The only downsides to it are the 12GB of VRAM versus the 16GB on the 7900XT and its lower memory bandwidth (which very rarely comes into play, anyway).
Hell, even Hardware Unboxed recommended the 4070Ti over the 7900XT in a recent showdown, because it offers slightly better bang for your buck.
I guess you could also say the 4090 is worth buying if money isn't an issue and you simply want the best GPU there is for gaming.
I'm also guessing the 6800XT won't look that great a deal either when the 4070 comes along with a very similar price tag but slightly better all-round performance. That's unless the 6800XT drops in price.
 
There's nothing at all wrong with the 4070Ti, especially if you are going to be gaming mostly at 1440p high FPS. It has a very similar level of performance to the 7900XT, has far better RT performance, has DLSS 3, and can often be found for around $50 less. The only downsides to it are the 12GB of VRAM versus the 16GB on the 7900XT and its lower memory bandwidth (which very rarely comes into play, anyway).
Hell, even Hardware Unboxed recommended the 4070Ti over the 7900XT in a recent showdown, because it offers slightly better bang for your buck.
I guess you could also say the 4090 is worth buying if money isn't an issue and you simply want the best GPU there is for gaming.
I'm also guessing the 6800XT won't look that great a deal either when the 4070 comes along with a very similar price tag but slightly better all-round performance. That's unless the 6800XT drops in price.

The 7900XT has 20GB of VRAM, not 16.
 
There's nothing at all wrong with the 4070Ti, especially if you are going to be gaming mostly at 1440p high FPS. It has a very similar level of performance to the 7900XT, has far better RT performance, has DLSS 3, and can often be found for around $50 less. The only downsides to it are the 12GB of VRAM versus the 16GB on the 7900XT and its lower memory bandwidth (which very rarely comes into play, anyway).
Hell, even Hardware Unboxed recommended the 4070Ti over the 7900XT in a recent showdown, because it offers slightly better bang for your buck.
I guess you could also say the 4090 is worth buying if money isn't an issue and you simply want the best GPU there is for gaming.
I'm also guessing the 6800XT won't look that great a deal either when the 4070 comes along with a very similar price tag but slightly better all-round performance. That's unless the 6800XT drops in price.
The 4070Ti already gets bottlenecked in newer games due to its 12GB of VRAM. That is entirely unacceptable for a card that costs $800. Also, barely anyone uses ray tracing; it's nothing more than a talking point. Now we're on to path tracing, which the 4090 can barely do, so that card is already obsolete. The 7900XT is going to age much better than the 4070Ti simply due to its 20GB of VRAM.
 
There's nothing at all wrong with the 4070Ti, especially if you are going to be gaming mostly at 1440p high FPS. It has a very similar level of performance to the 7900XT, has far better RT performance, has DLSS 3, and can often be found for around $50 less. The only downsides to it are the 12GB of VRAM versus the 16GB on the 7900XT and its lower memory bandwidth (which very rarely comes into play, anyway).
Hell, even Hardware Unboxed recommended the 4070Ti over the 7900XT in a recent showdown, because it offers slightly better bang for your buck.
I guess you could also say the 4090 is worth buying if money isn't an issue and you simply want the best GPU there is for gaming.
I'm also guessing the 6800XT won't look that great a deal either when the 4070 comes along with a very similar price tag but slightly better all-round performance. That's unless the 6800XT drops in price.

1. DLSS 3 is nothing but a gimmick right now, and so will FSR 3 be for at least a year. SAM and RSR are the tech people should really be looking forward to, as well as the upscaling tech UE5 will use, if done well.
2. Maybe in the US, but in most of the world the 7900 XT is a little cheaper or the same price; it depends on what model you find in stock at a given time and whether there is any sale running.
3. The 4080 is the card with a very similar performance level to the 7900 XT at 1440p, and even at 4K in some cases. As another comment mentioned, the 7900 XT is 20GB, so it's much better future-proofed. In the end it all comes down to the games you play the most, but there is no denying that 20GB of VRAM will help with whatever comes next.
4. Yeah, RT performance is better -- again, not in all cases, and it varies by how much -- but looking at the most played games on Steam, for example, I rarely see any game with a good implementation of RT, and in the case of CoD, I doubt anyone keeps RT on. With the top sellers it's the same story. A quick look through the top sellers and most played right now finds Resident Evil 4 as the top game with RT, and it performs better on AMD than on Nvidia for whatever reason.
5. I don't see any competition for AMD right now, in both the 3000 and 4000 series versus the 6000 and 7000 series, other than for the rabid fans of benchmarks and RT, or those who really need the power of the 4090 for work.
 
The problem for AMD is that they called it a 7900XT when it's clearly the 7800XT -- but then they couldn't have gouged on the price, which has worked a charm for them.

The sad thing is they'll be repeating this when the 7700XT, erm, I mean 7800XT, is released at $699, and so on down the food chain.

I would buy the 7900XT at $699 tops, but frankly RDNA3 is underwhelming, and they blew the chance to decimate Nvidia this time round with sub-par improvements and pricing. RDNA4 needs to do a lot better; Blackwell will be no joke and is going to be a bigger upgrade than Lovelace, being an all-new architecture.
 
The problem for AMD is that they called it a 7900XT when it's clearly the 7800XT -- but then they couldn't have gouged on the price, which has worked a charm for them.

The sad thing is they'll be repeating this when the 7700XT, erm, I mean 7800XT, is released at $699, and so on down the food chain.

I would buy the 7900XT at $699 tops, but frankly RDNA3 is underwhelming, and they blew the chance to decimate Nvidia this time round with sub-par improvements and pricing. RDNA4 needs to do a lot better; Blackwell will be no joke and is going to be a bigger upgrade than Lovelace, being an all-new architecture.
The 4090 was really a 4080, but then they couldn't gouge on price. I mean that literally. Going on die size and chip naming, the 4090 has more in common with an 80 series card than a 90 series card. The naming scheme of the die itself suggests it was intended as an 80 series card. The X102 dies have always been 80 series cards and the X103 dies have been 70 series cards. The "4080 12GB" should have been a 4070, with the 4080 16GB being the Ti.

They're fantastic cards, but let's call a spade a spade here. Yeah, the 7900s should have been 7800s, but Nvidia did the exact same thing. The 7900XT should have been a 7800XT and the 7900XTX should have been the 7900XT. They're both guilty here. The thing is, no one would have paid the prices they're asking if they had named them the other way. At least the backlash from the 4080 12GB twisted Nvidia's arm into "unlaunching" it.

But something I'd like you to keep in mind is that RDNA3 uses a chiplet design in a graphics card, something that has never been done before. I'm certain we'll see significant performance increases in later generations as they tweak the tech and make improvements. Hopefully their prices will come down as they improve their manufacturing techniques.
 
The 4090 was really a 4080, but then they couldn't gouge on price. I mean that literally. Going on die size and chip naming, the 4090 has more in common with an 80 series card than a 90 series card. The X102 dies have always been 80 series cards and the X103 dies have been 70 series cards. The "4080 12GB" should have been a 4070, with the 4080 16GB being the Ti.
In terms of die size and chip naming, the AD102 is virtually the same as the GA102 -- the only difference being a 3% smaller die in the newer model. Sure, Nvidia used the GA102 in both a 90 and an 80 series card, but go back further and the TU102 was only ever used in the 2080 Ti and the Titan RTX, while the GP102 was only used in the 1080 Ti and Titan X/Xp cards -- in other words, the X102 chip hasn't always been an 80 series chip at all.

The use of X103 only started with Ampere -- prior to that, Nvidia used X104 for the next tier:

TU104 -- 2080 Super, 2080, 2070 Super
GP104 -- 1080, 1070 Ti, 1070, 1060

If one compares the GA102 used in the 3080 [10GB] to the full die, it's effectively 80% of the full chip (68 vs 84 SMs, 5 vs 6 MB of L2 cache). An 80% version of the AD102, which has 144 SMs in full, would be around 114 or so SMs -- given that the 4090 has 128, that would be too close in terms of performance to really separate the 4090 from the 4080.
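To make that arithmetic explicit, here's a quick back-of-the-envelope check -- the SM counts are the publicly listed figures, and the "80% chip" cut is just the rule of thumb taken from the 3080 above, so treat it as a sketch rather than anything official:

```python
# Back-of-the-envelope check of the cut-down ratios discussed above.
ga102_full_sms = 84    # full GA102 die
rtx3080_sms = 68       # GA102 as configured in the 3080 10GB
ad102_full_sms = 144   # full AD102 die
rtx4090_sms = 128      # AD102 as configured in the 4090

cut_ratio = rtx3080_sms / ga102_full_sms               # ~0.81, i.e. roughly an "80% chip"
ampere_style_cut = round(ad102_full_sms * cut_ratio)   # what an Ampere-style 80-class cut of AD102 would be

print(f"3080 uses {cut_ratio:.0%} of the full GA102")
print(f"An equivalent AD102 cut would be ~{ampere_style_cut} SMs, vs {rtx4090_sms} in the 4090")
```

Either way it lands in the mid-110s -- the same ballpark as the "around 114 or so" above, and uncomfortably close to the 4090's 128.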

But something I'd like you to keep in mind is that RDNA3 uses a chiplet design in a graphics card, something that has never been done before. I'm certain we'll see significant performance increases in later generations as they tweak the tech and make improvements. Hopefully their prices will come down as they improve their manufacturing techniques.
The use of chiplets isn't strictly about performance; it's more about improving cost-effectiveness for AMD. But by leaving the cache and memory controllers on a larger, cheaper node (SRAM and IO circuitry isn't going to scale down much more now), it leaves scope for AMD to go with increasingly larger GCDs.

The one used in Navi 31 is quite small at just 326 mm2, so there's definitely room to have something larger. However, TSMC's N5 is already expensive and the next nodes are going to be even more so.

Which raises a problem: how does one improve performance and lower prices? Sticking with N5 for RDNA 4 would be cheaper than using N3, but the die sizes would be notably larger, thus reducing yields and, in turn, forcing the prices to remain high. If one goes for the newer nodes, to improve the GCD density, performance, and yields, then the manufacturing costs are going to remain high.
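To put rough numbers on that yield argument, here's a minimal sketch using the standard dies-per-wafer approximation and a simple Poisson yield model -- the wafer cost and defect density are placeholder assumptions for illustration, not TSMC figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross dies per wafer (classic approximation, ignoring scribe lines and edge exclusion)."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: bigger dies are more likely to catch a killer defect."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

wafer_cost_usd = 17000      # placeholder cost for an N5-class wafer
defect_density = 0.0007     # placeholder, roughly 0.07 defects per cm^2

for gcd_area in (326, 450):  # current GCD size vs a notionally larger next-gen GCD
    gross = dies_per_wafer(gcd_area)
    good = gross * die_yield(gcd_area, defect_density)
    print(f"{gcd_area} mm^2 GCD: ~{gross} gross, ~{good:.0f} good, ~${wafer_cost_usd / good:.0f} per good die")
```

Even with made-up inputs, the shape of the trade-off is clear: grow the GCD and both the die count per wafer and the yield fall, so the cost per good die climbs quickly unless the node gets denser or the wafers get cheaper.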

The way to reduce the latter is by pre-ordering a shed load of wafers from TSMC, but to do this, one needs to have a market that's going to buy all those chips -- something that the likes of Apple and Nvidia have easily, as AMD does for its CPUs, but just not for its GPUs.
 
The problem for AMD is that they called it a 7900XT when it's clearly the 7800XT -- but then they couldn't have gouged on the price, which has worked a charm for them.

The sad thing is they'll be repeating this when the 7700XT, erm, I mean 7800XT, is released at $699, and so on down the food chain.

I would buy the 7900XT at $699 tops, but frankly RDNA3 is underwhelming, and they blew the chance to decimate Nvidia this time round with sub-par improvements and pricing. RDNA4 needs to do a lot better; Blackwell will be no joke and is going to be a bigger upgrade than Lovelace, being an all-new architecture.
This. The reality is the x800 tier really does cost $800 now for a new card in this market, as people are paying it. NV is still way overpriced, charging $1200 for an x800-level card. Don't even get me started on houses.
 
$750, really? More like $600. Seriously. I wish there were a cost breakdown per card of what it costs AMD to produce one. You would be utterly shocked.

I know there is a lot of overhead involved with that: staff, R&D, everything. But the more cards they sell, the less it costs per card.
Cost breakdown of what? Dies cost cents to make. Millions are produced and whatever is good in the batch gets used. However, more dies come out bad than good, so if you want to break down the cost of a card, it's probably around $25-50 to produce and ship a single card; but what you don't see is all the dies that were bad, the employees on the line doing the testing who need to be paid, or the engineers who develop the cooling and the look of the card. So yes, card prices need to be increased to make up for those losses and still turn a profit. You can't look at just the cost of the card, but at what actually goes into making it!
 
At $800 I still feel the 7900XT has some room to come down in price; at $750 we'd have a pretty solid winner. The 6800XT is still a great card, so I would have a hard time recommending upgrading from it in general. I have a 6700XT and would very much like to get a 7900XT in the $700-750 price range. I have a native 4K display, so the 6700XT is workable with some settings tweaks, but it's far from an ideal experience.

Yeah, but those aren't the same dollars (or euros) as before. What we have now is Covid-lockdown-era mass-printed and digitally created money, not backed by gold or anything real. So those $800 are actually $500 in pre-money-printing terms.
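For what it's worth, that kind of adjustment is just deflating a nominal price by a price index -- a minimal sketch below, with placeholder index values rather than actual CPI data (whether the answer is $500 or something closer to $800 depends entirely on which index you plug in):

```python
def deflate(nominal_price: float, index_then: float, index_now: float) -> float:
    """Express a current nominal price in 'then-year' money using a price index."""
    return nominal_price * index_then / index_now

# Placeholder index values purely for illustration; substitute real CPI figures for an actual answer.
print(f"${deflate(800, index_then=100.0, index_now=118.0):.0f} in 'then' money")  # -> $678
```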
 
Yeah, but those aren't the same dollars (or euros) as before. What we have now is Covid-lockdown-era mass-printed and digitally created money, not backed by gold or anything real. So those $800 are actually $500 in pre-money-printing terms.
Until my paycheck gets adjusted for inflation, those are still $800 real dollars.
 
Until my paycheck gets adjusted for inflation, those are still $800 real dollars.

You're right. From your viewpoint it's the same $800, and from theirs it isn't.
Special Relativity in action.
 
And the 6800XT isn't much faster than the 6800 either, and it's under $400 on the used market.

Overall a great article; I like these types of comparisons for gamers. Thanks!
 
The 4090 was really a 4080, but then they couldn't gouge on price. I mean that literally. Going on die size and chip naming, the 4090 has more in common with an 80 series card than a 90 series card. The naming scheme of the die itself suggests it was intended as an 80 series card. The X102 dies have always been 80 series cards and the X103 dies have been 70 series cards. The "4080 12GB" should have been a 4070, with the 4080 16GB being the Ti.

They're fantastic cards, but let's call a spade a spade here. Yeah, the 7900s should have been 7800s, but Nvidia did the exact same thing. The 7900XT should have been a 7800XT and the 7900XTX should have been the 7900XT. They're both guilty here. The thing is, no one would have paid the prices they're asking if they had named them the other way. At least the backlash from the 4080 12GB twisted Nvidia's arm into "unlaunching" it.

But something I'd like you to keep in mind is that RDNA3 uses a chiplet design in a graphics card, something that has never been done before. I'm certain we'll see significant performance increases in later generations as they tweak the tech and make improvements. Hopefully their prices will come down as they improve their manufacturing techniques.

You are mixing up the 4090 laptop and the 4090 desktop. The 4090 laptop is indeed a lower-powered 4080 desktop. How is the 4090 desktop more like a 4080? That makes no sense; it's AD102 vs AD103.

Since this article was about the 7900XT, I didn't want to talk about Nvidia's price gouging -- currently I wouldn't touch their cards, but that horse has been beaten to death; less so with AMD's pricing and labelling.
 
Coming from a (watercooled in a full custom loop) Vega 64 and having a 1440p display, the 7900XT is the perfect GPU for me. Fast, lots of VRAM which will be very useful in the future (I change GPUs every 5-6 years), good availability, and there are waterblocks for it!
Tried RT on CP2077 and Hogwarts Legacy in a friend’s rig; not worth the performance impact IMHO, so I really don’t care that much about RT performance.
 