AMD Radeon RX 7900 XTX and 7900 XT launched at $999 and $899

While that's certainly true, one also has to account for the increased cost of the packaging of the chiplets. AMD is using their Elevated Fanout Bridge system, which isn't anywhere near as straightforward as previous Navi packages -- it's necessary, of course, and does bring additional advantages, but cost reduction isn't one of them.

There's also the fact that manufacturing on N5/N6 isn't as cheap as it is for N7, something that AMD has pointed out themselves:

[AMD chart: cost per yielded mm² rising with newer process nodes]


Given that Navi 21 is 520 mm² and the Navi 31 GCD is 308 mm² (ignoring the six 38 mm² MCDs), a 40% reduction in die area for about a 20% increase in associated yield cost (estimated from the chart above) might seem to be an absolute win. Well, in real terms it is, but how much of that cost reduction then gets taken up by MCD fabrication and final packaging is anyone's guess.

Obviously, AMD wouldn't be doing any of this and setting the prices at the level they have, if there wasn't a significant enough margin to make it all worthwhile.
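To put very rough numbers on that trade-off, here's a back-of-the-envelope sketch in Python. The ~1.2x N5-over-N7 cost premium per yielded mm² is just a reading of the chart above, not an official figure, so treat the output as illustrative only.

```python
# Back-of-the-envelope comparison of Navi 21 vs the Navi 31 GCD.
# The ~1.2x N5-vs-N7 cost premium per yielded mm^2 is a rough estimate
# read off AMD's chart above, not an official number.

navi21_area = 520          # mm^2, monolithic Navi 21 on N7
navi31_gcd_area = 308      # mm^2, Navi 31 GCD on N5
navi31_mcd_area = 6 * 38   # mm^2 total across the six MCDs on N6
n5_cost_premium = 1.2      # assumed cost per yielded mm^2 relative to N7

area_reduction = 1 - navi31_gcd_area / navi21_area
relative_gcd_cost = (navi31_gcd_area / navi21_area) * n5_cost_premium

print(f"GCD die area reduction vs Navi 21:    {area_reduction:.0%}")       # ~41%
print(f"Relative GCD silicon cost vs Navi 21: {relative_gcd_cost:.0%}")    # ~71%
print(f"Total Navi 31 silicon: {navi31_gcd_area + navi31_mcd_area} mm^2")  # 536 mm^2
```

Even under those assumptions the GCD alone comes out to roughly 70% of the Navi 21 silicon cost, before the MCDs and the packaging claw some of that saving back.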
If you visit that web page ( https://web.archive.org/web/20220327163600/https://caly-technologies.com/die-yield-calculator/ ), which has a die-per-wafer calculator, and enter, let's say, the Navi 31 at 308 mm², with die dimensions of 18 mm x 18 mm, a 300 mm wafer diameter and a defect density of 0.07 (the value for the TSMC 5 nm node), it will show you that every wafer produces 131 good Navi 31 dies. With a wafer cost of about $15,000 (I think with the discounts TSMC gives to its big clients it's around $10k, but let's say $15k), that means every Navi 31 die has a manufacturing cost of $15,000 / 131 good dies per wafer = ~$115.

Is a $115 manufacturing cost per Navi 31 die really a "huge" and difficult-to-control amount of cost?
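For anyone who wants to sanity-check that figure without the calculator, here's a rough sketch. The gross-die and Poisson yield formulas are generic textbook ones, not necessarily what the linked calculator uses, so the good-die count comes out slightly different from the 131 quoted above.

```python
import math

# Rough dies-per-wafer and cost-per-die estimate for the Navi 31 GCD,
# using the same inputs as the post above. The gross-die formula and the
# Poisson yield model are generic textbook versions -- the linked calculator
# may use different ones, so its 131 good dies won't be matched exactly.

wafer_diameter = 300.0   # mm
die_w = die_h = 18.0     # mm, as entered into the calculator
defect_density = 0.07    # defects per cm^2 (the TSMC 5 nm value cited above)
wafer_cost = 15_000      # USD, assumed; big customers likely pay less

die_area_mm2 = die_w * die_h
die_area_cm2 = die_area_mm2 / 100.0

# Classic gross dies-per-wafer approximation.
gross = (math.pi * (wafer_diameter / 2) ** 2 / die_area_mm2
         - math.pi * wafer_diameter / math.sqrt(2 * die_area_mm2))

# Poisson yield: fraction of dies expected to have zero defects.
yield_fraction = math.exp(-die_area_cm2 * defect_density)
good_dies = gross * yield_fraction

print(f"Gross dies/wafer: {gross:.0f}, yield: {yield_fraction:.0%}, good dies: {good_dies:.0f}")
print(f"Cost per good die: ${wafer_cost / good_dies:.0f}")
print(f"Cost per good die using the calculator's 131: ${wafer_cost / 131:.0f}")
```

Either way, the answer lands in the same low-hundreds-of-dollars-per-GCD ballpark.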
 
Is a $115 manufacturing cost per Navi 31 die really a "huge" and difficult-to-control amount of cost?
If, and it’s a big if, all those figures used for determining the per-die cost are accurate, then $115 is obviously not very much at all. I suspect the wafer cost is higher than that, though, given that N5 is in huge demand and TSMC has recently increased prices. I also suspect that the defect density is higher than 0.07, but that’s more of a suspicion than anything based on figures I’ve seen.

The MCDs are fortunately very small, which should help offset the fact that any defect reduces the usable amount of cache (and it has to be the full 16MB; nothing less can be used for the 7900 XTX/XT models), though I do wonder whether AMD designed in any redundancy or is just happy to stick with traditional binning.

The hardest cost to quantify is the use of the EFB packaging. It’s only been used on the Instinct cards so far and, given how highly priced they are, it’s too hard to judge how it compares to normal GPU packaging methods.

But as I said, AMD will have worked this all out and almost certainly won’t be taking a significantly smaller margin than they do with the 6900s.
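Since both the $15,000 wafer price and the 0.07 defect density are being questioned here, a quick sensitivity sweep (using the same assumed formulas as the earlier sketch) shows how much the per-die figure actually moves — all numbers illustrative, not AMD's real costs.

```python
import math

# How sensitive the per-die estimate is to wafer price and defect density.
# Same assumed gross-die and Poisson-yield formulas as the earlier sketch;
# all numbers are illustrative, not AMD's actual costs.

def cost_per_good_die(wafer_cost, defect_density, die_mm=18.0, wafer_mm=300.0):
    area_mm2 = die_mm * die_mm
    gross = (math.pi * (wafer_mm / 2) ** 2 / area_mm2
             - math.pi * wafer_mm / math.sqrt(2 * area_mm2))
    good = gross * math.exp(-(area_mm2 / 100.0) * defect_density)
    return wafer_cost / good

for wafer_cost in (10_000, 15_000, 17_000):
    for d0 in (0.07, 0.10, 0.15):
        print(f"${wafer_cost:>6} wafer, D0={d0:.2f}/cm^2 -> "
              f"${cost_per_good_die(wafer_cost, d0):.0f} per good die")
```

Even at the pessimistic end of that sweep the GCD stays in the low hundreds of dollars, which is why the MCD and packaging costs are the bigger unknowns.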
 
So, doing the math, the AMD card is quite a bit faster than the 4090, since we can see the Overwatch 2 benchmark for both. It's also much cheaper; it probably does worse in ray tracing, but who cares?

4090 = 500 max fps
AMD = 600 max fps (in this post)

So we're looking at a 100+fps difference in one game?
It would be nice but I seriously doubt that. If it were faster than the RTX 4090, I'm sure that the people on stage wouldn't have been able to shut up about it. Instead, they were extremely coy and only showed graphs with FSR enabled. Quite frankly, I thought that this was the absolute worst AMD product reveal that I've ever seen. It reminded me of just how slimy Intel product releases are. It wasn't nearly as bad, but it was bad nonetheless.
Where do I start?

Tim, or whoever writes for him, is a master at using words and sentences to nudge people down their desired path, which of course is to push consumers towards nvidia.
Yep, I definitely picked up on his serious lack of enthusiasm.
Every possible plus from AMD is dismissed, especially if it paints nvidia in a bad light.

Every claim made by AMD was doubted, even though AMD has proven over and over that they don't lie or over-promise in their presentations, unlike intel and nvidia.
You're not wrong.
He only referred to the AMD GPUs by model, yet used every possible descriptor (when convenient, of course) when mentioning nvidia (GPU chip name, GPU codename, GPU brand and model, etc.).

He threw a couple of bones here and there, but in the end, nvidia is king and it's perfect, and this announcement is another failure on AMD's part.
Yeah, I was rather astonished by Tim's whole attitude. It was like he didn't even want to be doing this. On one hand, I can't blame him for being tired considering the time of night it was in Australia but on the other hand, I've seen many reactions to reveals by him that occurred at exactly the same time of night but he was actually enthused about it. It was definitely a bad look on him.
Anyways, my take.

RT is simply stupid, period.
It requires too many resources for a damned shadow or a reflection in a puddle.
Agreed. I couldn't tell you what the shadows are like in ANY game that I've ever played because I don't look at them. There's always something way more important on screen to keep my focus.
I personally believe that we are at least two more gens away from decent hardware, and the same goes for games, of which there are currently fewer than 50 in existence.
I'm honestly not sure if it will ever make that much difference to the enjoyment of games. It's basically a case of "Let's improve everything that gamers don't look at or care about!"
Talking about games, why is so much effort placed on hyping the very few RT games in existence when we have thousands of other PC games that many of us haven't played yet?
Because nVidia has convinced the noobs that this is what they want and once convinced, no amount of intelligent discourse can change their feeble minds. Have you ever tried to use logic to show a religious person just how insane their beliefs are? It's the same thing.
I know, it paints nvidia in a superior spot and as another failure for AMD.
Only to clueless people and people pretending to have a clue. Those who have a clue know better.
The estimated power consumption is great, considering the performance increase and the lower heat it will produce.
They only cared about heat and power consumption when AMD products were less efficient.
DisplayPort 2.1 support is good to have (but of course dismissed), and that USB-C port is intriguing.
Well, I'm kind of dismissive of that DisplayPort as well because if these cards are unable to out-perform the RTX 4090, then the DisplayPort version is completely irrelevant. This would mean that the RTX 4090 can out-perform the RX 7900 XTX despite having the inferior DisplayPort.
The naming is stupid, these are simply 7800 and 7900 gpus, no need for the moronic use of XT and XTX.
Nope. The RX 7800 XT is a different card entirely and will be A LOT less expensive. Trust me, I know ATi nomenclature.
Pricing would’ve been great if they were 150 bucks cheaper, but since the media, writers and Tubers are OK with $1,700 and $2,000 GPUs, we shouldn’t complain, I guess.
I think that you should read my first post in this thread. Those are halo products and both are priced $100 less than the last generation. If this is their method, the RX 7800 XT will only cost $550.
Well, hopefully Steve's review will be fair and unbiased as always, and we can then see how they really perform.
Steve's usually good that way but after Tim's reaction, I'm starting to wonder what's going on.
Those interested in AMD's GPU chiplet strategy should check out AdoredTV on Youtube. Jim's latest take there is (as always) detailed and insightful.

He's the one who AFAIK first leaked and explained AMD's chiplet strategy for CPUs.
Jim's the greatest investigative tech journalist since Charlie Demerjian and is hands-down the greatest investigative TechTuber that I've ever seen. I actually posted his video in a few places already but here's the vid again:
The "I'm a Mac" channel is actually pretty good too.
But, but, but... Ray Tracing!

Another detailed and informative article, but you don't need to be so neutral. Nvidia must drop prices - that's the big takeaway from this.
If you saw his video, he was far from neutral. He was definitely annoyed and had a strong anti-AMD attitude the whole time. I'd never seen Tim be negative like this before and it made me wonder what else is going on. Funnily enough, Hardware Canucks (who usually have a strong Intel and nVidia bias) may have been AMD's biggest cheerleader this time around:
$500+ cards are out of my league; waiting for the rest of the SKUs. I would give this MCM design a few months after launch to see the issues before jumping in. I'm not a beta tester.
You'll be looking at an RX 7700 XT or RX 7800 if I've figured out AMD's pricing structure correctly. It appears they're trying to return prices to Earth.
 
The RTX 4090 @ 4K Native Ultra RT in Metro Exodus Enhanced Edition can get over 100 fps average.
Ok, and?
The claim for the 7900 XTX is that it is 1.5X the 6950 XT, which means at best it will get an average of 50 fps with the same settings. How relevant is that for a GPU that costs $600 more and requires you to upgrade if you don't have the proper case and PSU?
If that's not satisfactory for you, then perhaps you made a real bone-headed error paying all that money for your RTX 3080, because according to overclock3d.net, the RX 6950 XT gets only 5.3 fewer fps than the RTX 3080 (33.7 vs 39), and a 50% increase would make it handily faster than your RTX 3080. If all you care about is ray tracing, you're now deriding the RX 7900 XTX for doing BETTER than the RTX 3080 that YOU BOUGHT! Here's the chart for everyone to see:
[overclock3d.net chart: Metro Exodus Enhanced Edition, 4K max settings with ray tracing, average fps by GPU]

I don't know, but it is pretty clear that AMD is not going to be competitive with the high-end Nvidia cards for RT. To put this in perspective, according to overclock3d, the 3090 Ti can achieve an average of over 60 fps and the humble 3080 FE can achieve an average of 45 fps at native 4K max settings in MEE.
Again, if you're complaining about the RT performance of the RX 7900 XTX when it's better than your "humble" RTX 3080 (humble? are you kidding me?) then why aren't you whining about how BAD your RTX 3080 is? Shouldn't you be pissed off that nVidia charged what they did for the card that gave you such "substandard" RT performance?
If this truly is the case, the 7900 XTX RT capabilities are about that of a 3080 FE.
You need to learn math, because 33.7 x 1.5 = 50.55, which is about 12% faster than the RTX 3080 FE's 45 fps. Again, if you're not satisfied with the RT performance of the RTX 3080, then you're the one who was foolish enough to buy that overpriced thing to begin with. That's not AMD's fault, it's yours!
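For anyone wanting to check the arithmetic being thrown around here, a quick sketch using only the numbers quoted in this thread (33.7 fps for the 6950 XT from the overclock3d chart, the two RTX 3080 figures of 39 and 45 fps, and AMD's claimed 1.5x uplift) — a projection, not a measurement:

```python
# Quick check of the RT numbers quoted in this thread (Metro Exodus EE, 4K).
# 33.7 fps is the RX 6950 XT figure from the overclock3d chart above, 39 and
# 45 fps are the two RTX 3080 figures cited here, and 1.5x is AMD's claimed
# uplift -- so this is a projection, not a measurement.

rx6950xt_fps = 33.7
amd_claimed_uplift = 1.5
rtx3080_chart_fps = 39.0   # RTX 3080 in the overclock3d chart
rtx3080_fe_fps = 45.0      # RTX 3080 FE figure quoted above

projected_7900xtx = rx6950xt_fps * amd_claimed_uplift

print(f"Projected RX 7900 XTX: {projected_7900xtx:.2f} fps")               # 50.55
print(f"vs RTX 3080 ({rtx3080_chart_fps:.0f} fps): "
      f"{projected_7900xtx / rtx3080_chart_fps - 1:+.0%}")                 # ~+30%
print(f"vs RTX 3080 FE ({rtx3080_fe_fps:.0f} fps): "
      f"{projected_7900xtx / rtx3080_fe_fps - 1:+.0%}")                    # ~+12%
```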
The RTX 4080 will handily defeat the 7900 XTX in RT tasks.
Probably, but since 99% of games don't employ RT, this obsession of yours over it seems pretty absurd. What if the 7900 XTX handily defeats the 4080 in all other tasks? What then?
My hope was that the AMD flagship could at least compete in RT with Nvidia's 4080. I think it needed at least 2X the RT performance for that to happen, probably a little more actually. I did not realize that the CU count only increased by 16 from the previous gen. It seems that the shader count doubling is actually similar to what Nvidia did from RTX 20 to RTX 30. With 1 RT core per CU, the 7900 XTX RT uplift comes primarily from the enhanced cores, plus maybe a little more from the extra 16 cores.
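As a rough sanity check on that last point, here's how the claimed uplift would split between the extra CUs and per-CU improvements, using 80 CUs for the 6950 XT and 96 for the 7900 XTX and assuming RT throughput scales linearly with CU count (a simplification that ignores clock and memory differences):

```python
# Rough split of AMD's claimed ~1.5x RT uplift between "more CUs" and
# "better CUs", using 80 CUs for the 6950 XT and 96 for the 7900 XTX
# (one RT accelerator per CU). Assumes RT throughput scales linearly
# with CU count, which ignores clock and memory differences.

cu_6950xt = 80
cu_7900xtx = 96            # +16 CUs, as noted above
claimed_rt_uplift = 1.5    # AMD's claim vs the 6950 XT

uplift_from_extra_cus = cu_7900xtx / cu_6950xt
implied_per_cu_gain = claimed_rt_uplift / uplift_from_extra_cus

print(f"Uplift from the extra CUs alone: {uplift_from_extra_cus:.2f}x")   # 1.20x
print(f"Implied per-CU RT improvement:   {implied_per_cu_gain:.2f}x")     # 1.25x
```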

Just going based on the charts, not giving an opinion; you guys can decide for yourselves whether $200 more for a 4080 is worth it, considering it will have at least 50% faster RT performance than the 7900 XTX, given that the 4090 has 100% better RT performance.
Yeah, it's really not. Even if I did care about RT, playing at 1440p to get great frame rates is not that much of a sacrifice to most people, especially since most people still game at 1080p. I personally game on a 55" 4K display but I really struggle to tell the difference between 1440p and 2160p. They both look amazing and I'm happy to game at either resolution.
The 7900 XTX is a raster monster and I imagine a lot of people will opt for it even knowing that there will be little point in turning on RT in most titles.
It will be fine if you play at 1440p because even if we just do the 4K uplift ratio (which is probably lower than the 1440p uplift ratio) the RT numbers jump to 117fps according to the chart above. (78.1x1.5=117.15)

You really just seem to be looking for reasons to complain at this point.
I was thinking about upgrading from my 3080 if the price was right to the 7900 XTX, but I don't think I will unless the final benches with RT surprise me. If I was upgrading from something like a 2080 or older, I would definitely still give the 7900 XTX a look.
Why would you upgrade from your RTX 3080? You must have spent a lot on it and it's a beast of a card. Maybe you should have just bought an RX 6800 XT like I did if you couldn't get the RT performance that you seem to crave so badly because it would have saved you a few hundred dollars that you could put towards the 4080 or 4090.

I honestly don't understand what you're going on and on about. You're complaining about a card that would still be faster in RT than the one you bought (I still don't get that), you've made objectively wrong claims (according to simple arithmetic) and for some reason expect that ATi is going to fall all over themselves over something that very few people care about instead of producing a card that everyone can afford and live with.

I've tried to make sense of your post but I just can't. You come across as an entitled child who is mad that the grape bubble gum they were given was Bubble Yum instead of Bubblicious. If you ever do actually get a Radeon card (I'm assuming that you've never had one by the way you talk), you're going to feel awfully stupid when you realise that there's no real difference between Radeons and GeForces when actually gaming. They both do the exact same thing and your obsession with this frill called RT doesn't paint you in the best of lights when it comes to your level of actual expertise.
 
Inflation is not rising prices; rising prices are the symptom of inflation.
The current dramatic rise in inflation is because government gave everyone free money and people spent it.
Why, no, actually. What we're referring to as "inflation" isn't inflation at all. Everything has become more expensive because OIL has become more expensive as a result of Russia invading Ukraine. The cost of oil affects all products because the vehicles that ship those products around the world depend on oil for fuel.

Your money isn't less valuable, it's just that the cost of logistics has increased dramatically on everything. There's also the fact that the private sector is overstating the problem because their profit margins are percentages, so the more they have to pay, the more profit they make from you as a result. It's not a dollar-for-dollar increase, it's a dollar-for-(dollar x profit) increase.

That is NOT inflation no matter what the media says. That is extortion.
 
Stop talking nonsense, look at US inflation in 2021 dude. Where was Putin then? 😂 No Putin in sight, but Fed with "unlimited supply" and demented Brandon, oh yes.
 