AMD Radeon RX 7900 XTX and 7900 XT launched at $999 and $899

So doing the math, the AMD card is quite a bit faster than the 4090, since we can see the Overwatch 2 benchmark for both. It's also much cheaper; it probably does worse in ray tracing, but who cares?

4090 = 500 max fps
AMD = 600 max fps (per the figures in this post)

So we're looking at a 100+ fps difference in one game?
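For what it's worth, here is that gap spelled out as a quick sketch, using only the two max-fps figures quoted above (slide numbers, not independent benchmarks):

```python
# Back-of-the-envelope comparison of the two Overwatch 2 "max fps" figures
# quoted above. These are marketing-slide numbers, not measured results.
rtx_4090_max_fps = 500
rx_7900_xtx_max_fps = 600

absolute_gap = rx_7900_xtx_max_fps - rtx_4090_max_fps   # 100 fps
relative_gap = absolute_gap / rtx_4090_max_fps          # 0.20 -> 20%

print(f"Gap: {absolute_gap} fps ({relative_gap:.0%} higher on the AMD slide)")
```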
 
Yes, because we should always trust AMD's and Nvidia's slides to be 100% accurate and transparent. Certainly these companies don't cherry-pick, nor have they ever. It's like taking the word of a politician. That being said, your "math" is still only speculation given the nature of initial testing. We can only hope driver improvements and game patches between now and the official release hold up these values. Notice the words "up to"? We're both in the same corner, man; regardless of exact performance numbers, it will most definitely offer better bang for the buck than Nvidia's offerings.


Because AMD didn't lie with their RDNA 2 slides... they actually understated their performance. Which, again, will be far greater once AIB RDNA 3 cards are released and running higher clocks, etc.

What was announced today were just reference cards... wait until you see something like a liquid-cooled Red Devil, etc.
 
Where do I start?

Tim, or whoever writes for him, is a master at using words and sentences to nudge people down their desired path, which of course is to push consumers towards Nvidia.

Every possible plus from AMD is dismissed, especially if it paints Nvidia in a bad light.

Every claim made by AMD was doubted, even though AMD has proven over and over that they don't lie or over-promise in their presentations, unlike Intel and Nvidia.

He only referred to the AMD GPUs by model, yet used every possible descriptor (when convenient, of course) for Nvidia (GPU chip name, codename, brand and model, etc.).

He threw a couple of bones here and there, but in the end, Nvidia is king and perfect, and this announcement is another failure on AMD's part.

Anyway, my take:

RT is simply stupid, period.
It requires too many resources for a damned shadow or a reflection in a puddle.

I personally believe we are at least two more generations away from decent hardware, and the same goes for games, of which fewer than 50 currently exist.

Speaking of games, why is so much effort put into hyping the very few existing RT games when there are thousands of other PC games many of us haven't played yet?

I know: it paints Nvidia as superior and this launch as another failure for AMD.

The estimated power consumption is great relative to the performance increase, and the cards will produce less heat.

DisplayPort 2.1 support is good to have (but of course dismissed), and that USB-C port is intriguing.

AV1 support and other little things are great.

And of course, the prices and names.

The naming is stupid: these are simply 7800 and 7900 GPUs; there's no need for the moronic XT and XTX suffixes.

Pricing would've been great if they were 150 bucks cheaper, but since the media, writers, and YouTubers are OK with $1,700 and $2,000 GPUs, we shouldn't complain, I guess.

Well, hopefully Steve's review will be fair and unbiased as always, and then we can see how they really perform.

*Edited a bit.
 
For me, the 8K Samsung Neo G9 ultrawide with DP 2.1 was the most impressive thing in the whole show.

https://www.techpowerup.com/img/PXyv3fznZUC75Ucj.jpg
and the list of vendors coming with DP 2.1 monitors in 2023:
https://www.techpowerup.com/img/bKcGqqrE17E9vQNt.jpg
I guess time will tell whether gamers want crazy rasterization with decent RT performance (7000 series), good RT performance with less rasterization performance (4080 16 GB), or both with the best RT performance (4090). Unfortunately, with the performance crown I don't see the 4090 moving down in price, but the 4080 at $1,200 might drop when Ampere supplies dry up.
I was lucky enough to sell my 3090 for $800 last week before it fell even further in value.
Still a shame the 4000 series lacks DP 2.1, although HDMI 2.1 is still plenty good on my 48-inch CX OLED for now.
 
Expect corpo-drones praising a $1,000 card in 3, 2, 1...

Better than nothing anyway; at least it's not an insane $1,600 and $1,200 for an x80 card, especially if these two beat the 4080 in raster performance, or are even with Ampere in RT. I'll be glad to see Ngreedia's mug fall into the dirt and 2020's cards drop significantly in price.
I feel you, but it's a pipe dream.
 
Navi 31 (4K gaming):
7900 XTX = $999
7900 XT = $899

Navi 32 (1440p gaming):
7800 XTX = $799?
7800 XT = $699?

Navi 33 (1080p gaming):
7700 XTX = $599?
7700 XT = $499?


You will not see a 7800 card until CES, once most of the 6900, 6800, and 6700 cards have been sold off in a fire sale.
 
Those interested in AMD's GPU chiplet strategy should check out AdoredTV on YouTube. Jim's latest take there is (as always) detailed and insightful.

He's the one who AFAIK first leaked and explained AMD's chiplet strategy for CPUs.
 
Those interested in AMD's GPU chiplet strategy should check out AdoredTV on YouTube. Jim's latest take there is (as always) detailed and insightful.

He's the one who AFAIK first leaked AND explained AMD's chiplet strategy for CPUs.
I suspect chiplets are going to be a design trend with GPUs going forward, with the direction already heading towards multiple functions on what used to be a fairly specialised peripheral. Rasterisation, compute, tensor/matrix, encode/decode, and so on - a lot of scaling to be done at lithography & fab level if taking the SoC approach. Chiplets theoretically offer the opportunity for more flexible ‘mix-and-match’ with economies of scale for a broader product spread.

Essentially Apple only has 3 primary M1 SoCs that are artificially tiered according to various states of broken-ness and cache; same with Nvidia’s GA102/104/106.
 
To me, the most exciting thing about these new AMD graphics cards is that they support DisplayPort 2.1. If one is going to get a monster graphics card, one ought to be able to actually enjoy the high frame rates it makes possible!
Also: while conventional shader rendering is easy to split, as it was even split between multiple video cards, ray tracing involves the whole scene at once. So AMD, with their "unified" design, could only split the cache off to chiplets. A design like Nvidia's could have had the ray tracing on one chiplet, and the shaders split up into multiple chiplets, benefiting much more. So the big feature, chiplets, is a bit of a disappointment to me.
Conventional rendering is easy to split? Could you name even one "split rendering" solution that had no big problems? SLI and CrossFire both had their own problems.

Splitting ray tracing onto a separate chiplet could cause huge latency problems. Splitting off anything other than cache would have been a very risky move. Remember that skipping RDNA 3 is not an option, so it must work. Therefore, taking the easy path now makes sense.
 
While many are comparing the 4090 with this launch, most of us don't really care. Just because we can afford something doesn't mean we'll buy it, as value for money is a major factor for most of us.
Most of us buy xx80 or xx70 cards, and there Nvidia seems to have been totally checkmated.

Seriously, the 4080's price seems like a horrible value proposition right now. Either the 4080 will be unlaunched or a substantial price cut will be made before launch.
 
Conventional rendering is easy to split? Could you name even one "split rendering" solution that had no big problems? SLI and CrossFire both had their own problems.

Splitting ray tracing onto a separate chiplet could cause huge latency problems. Splitting off anything other than cache would have been a very risky move. Remember that skipping RDNA 3 is not an option, so it must work. Therefore, taking the easy path now makes sense.
The biggest problem with those solutions was that latency made it incredibly hard to sync the two GPU dies. With Infinity Fabric and the large shared cache, it could work much better than expected. But unless they make the GPU dies much smaller, it won't make sense.
 
$500+ cards are out of my league; I'm waiting for the rest of the SKUs. I would give this MCM design a few months after launch to see the issues before jumping in. I'm not a beta tester.

 
These prices and specs seem broadly in line with what AMD has done previously with RDNA 2. I was a little surprised by the prices, which I expected to be a bit higher. However, I also expected RT to be much better this generation.

The "8K" marketing was questionable, as GamersNexus has already covered. I think they pushed DP 2.1 too hard for what appears to be a niche feature (few people need more than 4K 120 Hz or 240 Hz at 1440p). However, DP 2.1 might be a killer feature for enthusiasts who want super-high frame rates (see the bandwidth sketch below).

I don't think this launch has changed much. AMD is still behind Nvidia in a couple of key features, but is cost-efficient for raster.

Edit: As always, take these marketing presentations with a grain of salt. We need to wait for reviews to get a clear understanding of the value these cards provide.
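To put rough numbers on why DP 2.1 matters at those refresh rates, here's a quick uncompressed-bandwidth sketch. It ignores blanking overhead and DSC, and the link rates are the commonly cited effective payload figures, so treat it as an approximation rather than a spec-exact calculation:

```python
# Rough uncompressed video bandwidth vs. DisplayPort link capacity.
# Ignores blanking overhead and DSC; link rates are commonly cited
# effective payload figures (assumption, not a full VESA calculation).

def video_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Raw pixel data rate in Gbit/s (10-bit RGB, no blanking, no compression)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

links = {
    "DP 1.4a (HBR3 x4)": 25.92,    # 8b/10b coding
    "DP 2.1 (UHBR20 x4)": 77.37,   # 128b/132b coding
}

modes = {
    "4K 120 Hz": video_gbps(3840, 2160, 120),     # ~29.9 Gbit/s
    "4K 240 Hz": video_gbps(3840, 2160, 240),     # ~59.7 Gbit/s
    "1440p 240 Hz": video_gbps(2560, 1440, 240),  # ~26.5 Gbit/s
}

for mode, need in modes.items():
    verdicts = ", ".join(
        f"{name}: {'OK' if cap >= need else 'needs DSC'}" for name, cap in links.items()
    )
    print(f"{mode} (~{need:.1f} Gbit/s) -> {verdicts}")
```

With DSC most of these modes are already reachable on DP 1.4a today, so the uncompressed numbers above mainly show where DP 2.1's extra headroom starts to matter.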
 
The biggest problem with those solutions was that latency made it incredibly hard to sync the two GPU dies. With Infinity Fabric and the large shared cache, it could work much better than expected. But unless they make the GPU dies much smaller, it won't make sense.
It should work better, yes, but "should work" and "does work" are two different things.
AMD just cannot risk any major problems with this one; it must work.

I think AMD will develop MCM further later, but since the release cycle must be continuous, each development step must also be small enough.

$500+ cards are out of my league; I'm waiting for the rest of the SKUs. I would give this MCM design a few months after launch to see the issues before jumping in. I'm not a beta tester.
Um. Instead of putting the cache on the GPU, AMD puts it on a separate chip. By your logic, you would never buy an MCM Ryzen, where the other die is much more complex ;)
 
Um. Instead of putting the cache on the GPU, AMD puts it on a separate chip. By your logic, you would never buy an MCM Ryzen, where the other die is much more complex ;)
I'm running a Zen 3 5600X right now, but I got it 5 months ago. I don't buy new stuff in the first 6-12 months after release; I let others do the testing work.
 
The RTX 4090 at native 4K Ultra RT in Metro Exodus Enhanced Edition can average over 100 fps. The claim for the 7900 XTX is that it is 1.5x the 6950 XT, which means at best it will average around 50 fps with the same settings. How relevant is that for a GPU that costs $600 more and requires an upgrade if you don't have the proper case and PSU? I don't know, but it is pretty clear that AMD is not going to be competitive with the high-end Nvidia cards in RT. To put this in perspective, according to Overclock3D the 3090 Ti can average over 60 fps and the humble 3080 FE can average 45 fps at native 4K max settings in MEE. If that holds, the 7900 XTX's RT capability is about that of a 3080 FE. The RTX 4080 will handily beat the 7900 XTX in RT tasks. My hope was that the AMD flagship could at least compete in RT with Nvidia's 4080; I think it needed at least 2x the RT performance for that to happen, probably a little more actually. I did not realize that the CU count only increased by 16 over the previous generation. It seems the shader-count doubling is actually similar to what Nvidia did from RTX 20 to RTX 30. With 1 RT core per CU, the 7900 XTX's RT uplift comes primarily from the enhanced cores, plus maybe a little more from the extra 16 cores.

I'm just going by the charts, not giving an opinion; you can decide for yourselves whether $200 more for a 4080 is worth it, considering it will likely have at least 50% faster RT performance than the 7900 XTX, given that the 4090 has roughly 100% better RT performance. The 7900 XTX is a raster monster, and I imagine a lot of people will opt for it even knowing there will be little point in turning on RT in most titles. I was thinking about upgrading from my 3080 to the 7900 XTX if the price was right, but I don't think I will unless the final benchmarks with RT surprise me. If I were upgrading from something like a 2080 or older, I would definitely still give the 7900 XTX a look.
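For transparency, here is the arithmetic behind that estimate as a small sketch. The ~33 fps 6950 XT baseline is simply what the "at best ~50 fps" figure implies from the 1.5x claim, and the comparison numbers are the ones quoted above, so treat everything as rough, assumed inputs rather than measurements:

```python
# Projecting 7900 XTX ray-tracing fps in Metro Exodus Enhanced Edition
# (4K native, max RT) from AMD's "up to 1.5x the 6950 XT" claim.
# All inputs are the rough figures quoted in the post above, not measurements.
claimed_uplift = 1.5
projected_7900xtx_fps = 50                                     # "at best ~50 fps" estimate
implied_6950xt_fps = projected_7900xtx_fps / claimed_uplift    # ~33 fps baseline

reference_fps = {
    "RTX 4090": 100,     # "over 100 fps average"
    "RTX 3090 Ti": 60,   # per Overclock3D
    "RTX 3080 FE": 45,   # per Overclock3D
}

print(f"Implied 6950 XT baseline: ~{implied_6950xt_fps:.0f} fps")
for card, fps in reference_fps.items():
    print(f"{card}: {fps} fps -> {fps / projected_7900xtx_fps:.0%} "
          f"of the projected 7900 XTX figure")
```

On those assumed inputs, the projected RT figure lands between the 3080 FE and the 3090 Ti, which is what the comparison above is based on.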
 
So, $999; that's 1,500 euros in ... Europe. Tough sell. Of course, peasants like me are still waiting for the middle tier. With the current economy, and AMD and Nvidia sitting on a massive pile of old cards, I've got a feeling affordable new-gen stuff won't show up for a year or so.
 
The problem for AMD is that in a hypothetical situation where the 7900 XTX is the same price as the RTX 4090, 99% of the world will choose to buy Nvidia.

This means Nvidia can completely drive them out of the market just by lowering the price of their card; AMD has no defense. And there is plenty of room for price reduction. That's why AMD cards must have at least 48 GB of VRAM.
 
The problem for AMD is that in a hypothetical situation where the 7900 XTX is the same price as the RTX 4090, 99% of the world will choose to buy Nvidia.

This means Nvidia can completely drive them out of the market just by lowering the price of their card; AMD has no defense. And there is plenty of room for price reduction. That's why AMD cards must have at least 48 GB of VRAM.
The same applies to Intel; they can do the same because, like Nvidia, they are giants in the industry. So when a Goliath tries to fight two giants, it has to take all of those risks.
 
The problem for AMD is that in a hypothetical situation where the 7900 XTX is the same price as the RTX 4090, 99% of the world will choose to buy Nvidia.

This means Nvidia can completely drive them out of the market just by lowering the price of their card; AMD has no defense. And there is plenty of room for price reduction. That's why AMD cards must have at least 48 GB of VRAM.
I doubt Nvidia has enough leeway to lower prices far enough to hurt AMD, because they have a more expensive process node, GDDR6X, bigger and more complex coolers, and lower yields.

Let's not forget that AMD can lower prices too :)
 
Expect corpo-drones praising a $1,000 card in 3, 2, 1...

Better than nothing anyway; at least it's not an insane $1,600 and $1,200 for an x80 card, especially if these two beat the 4080 in raster performance, or are even with Ampere in RT. I'll be glad to see Ngreedia's mug fall into the dirt and 2020's cards drop significantly in price.

You sound like you're complaining about a $1,000 graphics card in this statement. In your more recent statement you backtracked and made it sound like you're okay with a $1,000 halo card. Yeah, sounds like a lawyer to me, lol.
 
I'm running a Zen 3 5600X right now, but I got it 5 months ago. I don't buy new stuff in the first 6-12 months after release; I let others do the testing work.
I understand, but if you do that, MCM is about the last thing you have to worry about. Drivers, cooling, power connectors, etc. are much more likely to have issues.
 
Do you know how inflation works? Genuine question.

Why, yes actually...

Inflation is the dilution of the spending power of your money.
This dilution happens because of people taking out loans, or governments simply printing money or devaluing their currency.
Inflation is a direct result of using fiat currencies.

(Fun fact: when you save money or pay off a loan or credit card, you also destroy money created by inflation.)

Inflation is not rising prices; rising prices are the symptom of inflation.
The current dramatic rise in inflation is because governments gave everyone free money and people spent it.
 