AMD Radeon RX 7900 XTX and 7900 XT launched at $999 and $899

I doubt Nvidia has any decent leeway to lower prices by enough to affect AMD because they have a more expensive process node, GDDR6X, bigger and more complex coolers and lower yields.

Let's not forget that AMD can lower prices too :)
AMD is definitely positioning itself as the value card here. But honestly, things are not much different than they were last gen, performance-wise. The 6900 XT could compete with the 3090 in raster, but in RT it lagged behind. The notable differences this time: the price gap is a little wider ($600 vs. $500), the RT gap also appears to be a little wider, AMD has FSR 2.0 with the promise of FSR 3.0 coming, and we're not in the middle of a GPU shortage (that's probably the biggest difference). Oh, and the 4080 is no longer roughly half the price of the 4090 while being only ~15% slower (also a big difference).

That leaves the 4080 as the question mark in my mind. Yes, it will still outperform the 7900 XTX in RT, but it will be significantly behind it in raster for $200 more. I feel like Nvidia needs to drop the 4080 to $900 for it to be competitive, and even then the 7900 XT will be the better value for raster performance. We'll have to wait and see how it all pans out, but my guess is that a 4080 Ti will eventually replace the 4080 at $1200 and that will ultimately be the card to go for.

This is my opinion, of course; I'm agnostic when it comes to AMD and Nvidia. But I really think AMD needed at least 2X the RT performance of RDNA 2. 1.6X sounds good until you consider that they were already so far behind: they did not close the gap at all, and I think it actually got a little wider.
 
I understand, but if you do that, MCM is about the last thing you have to worry about. Drivers, cooling, power connectors etc. are much more likely to have issues.
I was thinking the same thing about driver stability for MCM vs. monolithic; hopefully they execute on stability.
 
I still remember the latency issues of Zen 1, Zen+ and Zen 2. This is why I got a Zen 3 chip many months after its initial release. I was in "pain" with the i7-4790 but not desperate enough to be a beta tester for AMD's chiplet design.
Also, after a few months you get better prices across the board.
 
I still remember the latency issues of Zen 1, Zen+ and Zen 2. This is why I got a Zen 3 chip many months after its initial release.
Latency is less of an issue for GPUs than it is for CPUs, because accessing GDDR has latencies in the order of 100s of nanoseconds. This is why game and GPU driver developers try to optimize the shader core occupancy to be as high as possible -- keeping the cores loaded up helps to hide the DRAM latency.
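To put rough numbers on that, here's a quick back-of-envelope sketch; the latency, clock, and per-wavefront work figures below are assumptions for illustration, not measured RDNA numbers:

```python
# Illustrative only: how many wavefronts a SIMD might need resident to hide one
# GDDR access. Every figure here is an assumption, not an AMD spec.
dram_latency_ns = 300        # assumed GDDR6 round-trip latency
shader_clock_ghz = 2.3       # assumed shader clock (GHz = cycles per ns)
indep_cycles_per_wave = 40   # assumed independent ALU cycles a wavefront can issue
                             # before it stalls waiting on the memory result

latency_cycles = dram_latency_ns * shader_clock_ghz        # ~690 cycles to cover
waves_needed = latency_cycles / indep_cycles_per_wave      # ~17 wavefronts in flight

print(f"~{latency_cycles:.0f} stall cycles -> ~{waves_needed:.0f} resident wavefronts to stay busy")
```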

Cache helps too, of course, which is why big GPUs have so much of it. Intel and Nvidia use a two tier system to AMD's four, but the likes of the AD102's L1 and L2 bandwidth and latency are extremely good (Intel's is...umm...less so). Despite its obvious complexity, AMD's system in RDNA 2 was really good: very low latency, throughout all of the levels.

Shifting the L3 into separate dies does increase the latency for the final tier, but AMD could be offsetting that by having larger caches, with better latencies, for the lower tiers.
 
Shifting the L3 into separate dies does increase the latency for the final tier, but AMD could be offsetting that by having larger caches, with better latencies, for the lower tiers.
I also remember the external cache from the Slot A Athlon days and what an issue it was to keep it in sync with the CPU when overclocking. Remember this?


After all, external cache is external... and adding the GDDR link on top also adds latency. I'm waiting for reviews from both testers and users.
 
The RTX 4090 at 4K native with Ultra RT in Metro Exodus Enhanced Edition can average over 100 fps. The claim for the 7900 XTX is that it is 1.5X the 6950 XT, which means at best it will average around 50 fps with the same settings. How relevant is that for a GPU that costs $600 more and may also require a case and PSU upgrade? I don't know, but it is pretty clear that AMD is not going to be competitive with the high-end Nvidia cards in RT. To put this in perspective, according to overclock3d the 3090 Ti can average over 60 fps and the humble 3080 FE can average 45 fps at native 4K max settings in MEE. If this truly is the case, the 7900 XTX's RT capabilities are about those of a 3080 FE, and the RTX 4080 will handily defeat the 7900 XTX in RT tasks.

My hope was that the AMD flagship could at least compete in RT with Nvidia's 4080; I think it needed at least 2X the RT performance for that to happen, probably a little more actually. I did not realize that the CU count only increased by 16 over the previous gen. It seems the shader-count doubling is similar to what Nvidia did from RTX 20 to RTX 30. With 1 RT core per CU, the 7900 XTX's RT uplift comes primarily from the enhanced cores, plus a little more from the extra 16 CUs.
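For what it's worth, the arithmetic behind that ~50 fps figure is just the claimed uplift applied to an assumed baseline; the ~33 fps 6950 XT number below is implied by the figures above, not a measured result:

```python
# Rough estimate only: apply the claimed RT uplift to an assumed baseline.
baseline_6950xt_fps = 33    # assumed 6950 XT average in MEE, 4K native, Ultra RT
claimed_uplift = 1.5        # the 1.5x figure cited above for the 7900 XTX

estimated_7900xtx_fps = baseline_6950xt_fps * claimed_uplift
print(f"Estimated 7900 XTX: ~{estimated_7900xtx_fps:.0f} fps "
      f"(vs the 100+ fps cited for the RTX 4090)")
```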

Just giving this based on the charts, not giving an opinion; you guys can decide for yourselves whether $200 more for a 4080 is worth it, considering it will have at least 50% faster RT performance than the 7900 XTX, given the 4090 has roughly 100% better RT performance. The 7900 XTX is a raster monster and I imagine a lot of people will opt for it even knowing there will be little point in turning on RT in most titles. I was thinking about upgrading from my 3080 to the 7900 XTX if the price was right, but I don't think I will unless the final RT benches surprise me. If I were upgrading from something like a 2080 or older, I would definitely still give the 7900 XTX a look.

I have a 6800 XT and turn RT on in its lowest mode, and it looks just fine in all my games that support RT at 3440x1440 at 100 Hz+.
You're making a mountain out of a molehill.

This new generation does 50% more rays per CU and there are double the number of CUs.

While it won't match a 4090 I bet it will definitely match a 3090 in RT.
 
Very humble pricing for a flagship card. I'm both skeptical about why they kept the price of the 7900 XTX lower than the 4090...

It is better to be the king of the $1000 tier than the garbage of the $1600 tier.

AMD knows that if they ask for $1500 they won't stand a chance against a 4090, because the Nvidia name and RT performance win easily; thanks to savings on the hardware and so on, they can post big earnings even while selling cheaper. It's also VERY important to attract a future PS6 and Xbox contract.
 
I have a 6800 XT and turn RT on in its lowest mode, and it looks just fine in all my games that support RT at 3440x1440 at 100 Hz+.
You're making a mountain out of a molehill.

This new generation does 50% more rays per CU and there are double the number of CUs.

While it won't match a 4090 I bet it will definitely match a 3090 in RT.
No, I'm stating exactly what seems to be the case given AMD's own numbers here. If RT is not that important to you, that's okay. I'm just saying for me that was one of the things I was looking for to upgrade from a 3080. I wanted to see RT performance at least competitive with the 4080. Given the 1.5X in Metro Exodus, I think you can expect it to be between the 3080 and 3090 in terms of RT performance.

No, there are not double the CUs. The 6950 XT had 80 CUs; the 7900 XTX has 96 CUs, only 16 more, nowhere close to double. That means only 96 RT cores. RT is at most 1.6X that of the 6950 XT. The shader count is doubled because each CU can now dual-issue FP32, broadly similar to what Nvidia did when it moved from RTX 20 to 30.
 
I also remember the external cache from the Slot A Athlon days and what an issue it was to keep it in sync with the CPU when overclocking. Remember this?


After all, external cache is external... and adding the GDDR link on top also adds latency. I'm waiting for reviews from both testers and users.
The old K7's L2 cache actually had pretty good latency, given that it was SRAM and not DRAM. The biggest problem was the combination of clock speed and bus width, and then the physical distance from the on-die cache controller.

Putting the MCs on separate dies isn't an issue in itself, given the inherent latencies with GDDR. Nvidia's chips and AMD's for RDNA 2 and older all used a big crossbar between the L2 cache partitions (one per MC) and the rest of the die. For RDNA 3, that crossbar is now an external Infinity Fabric-based system, and seeing as AMD has now had several years of experience working with it, its use in the 7000-series GPUs shouldn't immediately cause concern.

Time (and benchmarks) will, of course, tell.
 
This is interesting:

Overall, not bad!
This has to be wrong: the 4090 in Metro Exodus at 4K native with RT High can average 117 fps, while the chart here says 60. No doubt the 7900 XTX will approach the 4090 in raster, but that figure is completely wrong for ME, and even in the Enhanced Edition the 4090 can average over 100 fps. The 7900 XTX at 1.6X the RT of the 6950 XT will almost certainly be under 60 fps in MEE.

 
Gotta admit, I'm pretty disappointed Nvidia doesn't have DisplayPort 2.1 on their new cards for the following reasons:

1. DisplayPort 2 to HDMI 2 adapters won't introduce latency and will give people an option to run multiple HDMI 2 devices.

2. 360 Hz 1440p monitors will have DisplayPort 2.1 (needed for uncompressed 1440p at 360 Hz) and/or HDMI 2.1; DisplayPort 1.4 is too slow for these monitors without compression. I very likely will buy one. I have the Alienware QD-OLED monitor now and it's incredible, but I miss high refresh rates.
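As a rough sanity check on that bandwidth claim, here's an illustrative calculation; the 8 bits per channel, the ~20% blanking overhead, and the approximate link payload figures are assumptions, and DSC changes the picture entirely:

```python
# Rough, illustrative math only: uncompressed bandwidth needed for 1440p @ 360 Hz.
width, height, refresh_hz = 2560, 1440, 360
bits_per_pixel = 24          # assumed 8 bpc RGB, no HDR
blanking_overhead = 1.20     # assumed reduced-blanking overhead

needed_gbps = width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

dp14_payload_gbps = 25.92    # DP 1.4 HBR3 payload after 8b/10b encoding
dp21_uhbr20_gbps = 77.4      # DP 2.1 UHBR20, approximate payload

print(f"Needed (uncompressed): ~{needed_gbps:.1f} Gbps")
print(f"DP 1.4 payload: {dp14_payload_gbps} Gbps -> not enough without DSC")
print(f"DP 2.1 UHBR20 payload: ~{dp21_uhbr20_gbps} Gbps -> plenty of headroom")
```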
 
The value proposition is enticing but we all need some independent benchmarking against nVidia. After that we'll have a full picture.
 
I'm taking a wait and see attitude. The XTX is still a bit steeply priced for me; however, I might be able to get the WAF in line with buying one - for just one of the builds I intend to do in the next year or so.

I'll also be looking forward to a low or mid-range card in this line for my HTPC rebuild.
 
The changes mentioned are neat. Is it just me, or does everyone feel like this is a stopgap?

There's a sense of something much more brewing in the AMD labs; they are successfully tweaking and adding features they didn't have before. It's busy over there... Great work if you ask me.
 
If this doesn't get Nvidia to change their price strategy, nothing will, and they'll lose decent chunks of market share.

I'm one of the lucky few who could afford a 4090 but I ain't paying that ridiculous price especially when it's missing things like DisplayPort 2.1.

Honestly very tempted by the 7900 XTX. As usual I'll wait for reviews, but it probably performs better than anything else other than the 4090, which it's probably nipping at the heels of anyway. For near enough half the price (in the UK you can't really get hold of a 4090 for less than £2k), I might actually pick one up for my new build.
 
So, that presentation was terrible. The only things that we've learnt from it are what the cards look like, how much they cost and that they're faster than the previous generation.

It looks like Jim at AdoredTV was right, though, because manufacturing with chiplets is far more economical and so the Halo Radeons are priced WAY below the Halo GeForces. While I wouldn't be the least bit interested in a card with an MSRP of US$999 or US$899, that's not the point because these are just halo products.

ATi nomenclature has always been pretty straightforward in how it has been applied. The letter prefix identifies the family (X, HD, R9, RX) and the first numerical digit identifies the generation (but doesn't always start with 1).

The second digit is where things get more interesting because it identifies which tier of that generation a specific card represents and here they are:
9 = Halo product* (X1900 XTX, HD 5970, R9 390X, RX 7900 XT/XTX)
8 = Enthusiast* (X1800 XT, HD 4870, RX 580, RX 6800 XT)
7 = High-End/Specialty* (HD 4770, RX 470, RX 5700 XT)
6 = Mainstream* (HD 6670, R9 260X, RX 5600 XT, RX 6600 XT)
5 = Entry-Level* (X1550, HD 7570, RX 550, RX 5500 XT)
4 = HTPC (HD 2400 XT, HD 3450, HD 5450, HD 6450, RX 6400)
3 = Office Discrete/High-End IGP (X1300, HD 2350, HD 3300, HD 4350)
2 = Mid-Range IGP (Xpress 1200, HD 3200, HD 4200)
1 = Basic IGP (Xpress 2100, HD 3100)
* = Intended for 3D Gaming

Most people buy Radeon cards with 6, 7 or 8 as the second digit and those cards cost considerably less than the halo cards with a 9. AMD's new pricing is exactly US$100 lower than the last generation. I'll use the RX 6950 XT as the comparison card for the RX 7900 XTX:

RX 6950 XT MSRP: US$1099
RX 7900 XTX MSRP: US$999 <- US$100 lower
RX 6900 XT MSRP: US$999
RX 7900 XT MSRP: US$899 <- US$100 lower

If AMD continues this trend of $100 lower than last gen, we can expect to see this:

RX 7800 XT = US$549
RX 7800 = US$479
RX 7700 XT = US$379
RX 7600 XT = US$279
RX 7600 = US$230
RX 7500 (XT?) = US$99

Now, I don't know for sure if AMD will actually be doing this but remember that these prices were more or less the norm for over a decade and AMD was perfectly willing to continue the trend until nVidia's greed ruined it for everyone. I think that AMD has done a good job of resisting the temptation of following nVidia's pricing this generation, going back to what was both profitable and more importantly, sustainable.

If AMD prices their cards as shown, there is no question whatsoever that AMD will win this generation. They'll still make good coin (because, as we saw with Zen, chiplets are more economical to produce than monolithic dice) but more importantly they'll win marketshare and mindshare. To AMD, that's more important than a few extra dollars per card because that ensures long-term growth rather than maintaining the status quo.

If AMD does this correctly, nVidia will get hit hard. If AMD chickens out, nVidia will win again because people are dumb creatures of habit. Unless they're given a compelling reason to try something new, they won't.
 
This has to be wrong: the 4090 in Metro Exodus at 4K native with RT High can average 117 fps, while the chart here says 60. No doubt the 7900 XTX will approach the 4090 in raster, but that figure is completely wrong for ME, and even in the Enhanced Edition the 4090 can average over 100 fps. The 7900 XTX at 1.6X the RT of the 6950 XT will almost certainly be under 60 fps in MEE.

As always wait for the reviews.

I would also like to point out that what was announced are 350-watt cards.
You will see "OC" cards with three 8-pin connectors that can use 450 watts, just like Nvidia's. I'm sure someone will release an OC card in the 425-watt range.

You may have to pay attention to card reviews to see if they are "stock" or OC'ed.

One rumor is that you will see 3 GHz on overclocked cards.
 
As always wait for the reviews.

I would also like to point out that what was announced are 350-watt cards.
You will see "OC" cards with three 8-pin connectors that can use 450 watts, just like Nvidia's. I'm sure someone will release an OC card in the 425-watt range.

You may have to pay attention to card reviews to see if they are "stock" or OC'ed.

One rumor is that you will see 3 GHz on overclocked cards.
AMD is almost certainly near the top of the efficiency curve at 2.3 GHz and 350 watts. Don't expect more than ~10% gains from AIBs pushing these cards well beyond the efficiency sweet spot. Another way to look at this: you can limit the 4090's power quite a bit and still get 90-95% of its performance. Nvidia went all out with the 4090 and it shows in both the performance and the price. I'm not putting AMD down here; I'm pointing out that this slide is wrong, and AMD's RT performance is still lagging behind quite a bit. Even AMD has admitted it's targeting the 4080, though raster performance isn't far behind the 4090.
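To illustrate why pushing past the sweet spot buys so little, here's a toy model; the voltage-versus-clock slope is a made-up assumption, not measured 7900 XTX or 4090 behaviour:

```python
# Toy model only: dynamic power scales roughly with clock * voltage^2, and voltage
# has to climb with clock near the top of the curve. Numbers are illustrative.
def relative_power(rel_clock, v_slope=0.5):
    # assume voltage rises linearly with clock beyond the stock operating point
    rel_voltage = 1.0 + v_slope * (rel_clock - 1.0)
    return rel_clock * rel_voltage ** 2

for clk in (1.00, 1.05, 1.10):
    pwr = relative_power(clk)
    print(f"+{(clk - 1) * 100:.0f}% clock -> +{(pwr - 1) * 100:.0f}% power")
```

Under those assumptions, a 10% clock bump costs over 20% more power, which is why AIB cards pulling 425-450 W are unlikely to deliver more than modest gains.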
 
They'll still make good coin (because, as we saw with Zen, chiplets are more economical to produce than monolithic dice)
While that's certainly true, one also has to account for the increased cost of the packaging of the chiplets. AMD is using their Elevated Fanout Bridge system, which isn't anywhere near as straightforward as previous Navi packages -- it's necessary, of course, and does bring additional advantages, but cost reduction isn't one of them.

There's also the fact that manufacturing on N5/N6 isn't as cheap as it is for N7, something that AMD has pointed out themselves:

[AMD slide: cost per yielded mm² rising on newer process nodes]


Given that the Navi 21 is 520 mm2 and the Navi 31 GCD is 308 mm2 (ignoring the six 38 mm2 MCDs), a 40% reduction in die area for about a 20% increase in associated yield cost (estimated from the chart above) might seem to be an absolute win. Well, in real terms, it is but how much of that cost reduction is then taken up by the MCD fabrication and final packaging is anyone's guess.
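Using only the figures quoted in this post (the die areas plus the ~20% cost-per-mm² premium eyeballed from the slide), a rough sketch of the GCD silicon cost relative to Navi 21 looks like this; MCD silicon and packaging costs are deliberately left out because we don't know them:

```python
# Back-of-envelope using the figures quoted above; MCD and packaging costs omitted.
navi21_area_mm2 = 520           # monolithic N7 die
navi31_gcd_area_mm2 = 308       # N5 graphics die, MCDs excluded
n5_cost_per_mm2_premium = 1.20  # ~20% higher yielded cost per mm2, eyeballed from the slide

relative_gcd_cost = (navi31_gcd_area_mm2 / navi21_area_mm2) * n5_cost_per_mm2_premium
print(f"GCD silicon cost vs Navi 21: ~{relative_gcd_cost:.0%}")
# ~71% -> roughly a 29% saving on the big die, before six MCDs and packaging eat into it
```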

Obviously, AMD wouldn't be doing any of this and setting the prices at the level they have, if there wasn't a significant enough margin to make it all worthwhile.
 
Expect corpodrones praising a $1000 card in 3, 2, 1...
I understand your sentiment but your view is too narrow. Any ATi product with a 9 as the second digit in its model number is a halo product that people tend to not buy anyway. What AMD has shown is that this generation's cards should be $100 less expensive than last-gen. The RX 7900 XTX is $100 less than the RX 6950 XT and the RX 7900 XT is $100 less than the RX 6900 XT.

This means that the RX 7800 XT could be $549. Is that more palatable to you? I'll tell ya, it sure is more palatable to me! :laughing:
Better than nothing anyway; not an insane $1600 and $1200 for an x80 card, especially if these two beat the 4080 in raster perf, or even match Ampere in RT. I'll be glad to see ngreedia's mug fall into the dirt and 2020 cards drop significantly in price.
Check my previous post for what could be the pricing structure of this generation and a little instruction manual on how to read Radeon model #'s.
Do you know how inflation works? Genuine question.
Inflation doesn't affect tech the way it affects everything else, because tech is unusual in that it gets less expensive over time to both produce and purchase. That's why the cost of CPUs and GPUs hasn't increased all that much in the last 20 years. Sure, the 2017 mining craze caused GPU prices to jump temporarily, but they did settle down. I think that in this case, nVidia wants to keep prices in the stratosphere created by the market manipulation of the latest mining craze, but AMD can tell that prices like that aren't sustainable and would ultimately result in people just not buying any more. AMD recognises that sure, people can buy things they can't afford, but only for so long before...
"BACK TO REALITY, WHOOPS! THERE GOES GRAVITY!"
On the other hand, nVidia doesn't care and never did.
It will be interesting to see how important RT is in the long run. I've used it on the PS5 and seen others use it on PC. It looks awesome but no more than, say, going from "high quality" to "highest quality". It seems to be something that Nvidia is pushing hard but only the enthusiast crowd is genuinely excited for.
It's only the young enthusiasts who are excited for RT because veteran gamers like me recognise RT for what it is, a gimmick. We saw real game-changers like hardware tessellation engines so we don't find RT to be all that impressive.
Wow, I thought they would be $999 and $1199 respectively. I think $899 is too close to XTX but maybe performance isn’t that different. Looks like this is gonna be a winner regardless.
Remember that these are halo products. It's the cards with the second digits of 6, 7 and 8 that most people buy and it's looking like they'll be $100 less expensive. That's all it'll take for AMD to win this generation and win BIG. By dropping the prices like this, they're also being strategic because nVidia's GPU is far more expensive to produce than ATi's GPU just based on die size alone. Add the economic advantage of using chiplets and it's obvious that AMD's pricing strategy is not only to increase their market and mind share but to also damage nVidia because now, nVidia can't make GPUs as cheaply as ATi can.
There is no question the RTX 4090 is a monster of a card; it has no rival in performance or price, but it's also a card that will end up in only a minuscule number of people's hands.
Keep in mind that the same thing could be said about the 7900 XT/XTX cards because those are level 9 halo products. Most people buy cards at levels 6-8 and those cards will be significantly less expensive than the level 9 halo cards.

It's those cards that matter most because marketshare = mindshare. I had a co-worker who had been gaming for well over a decade and only after getting to know me did he finally decide to try a Radeon card. In the beginning, he was like "I only like nVidia" (even though he'd never had an ATi card in his life). I asked him how he's liking his RX 6600 XT and he admitted "It's like you said, there's no real difference between them except that nVidia costs more." which means that he's no longer averse to buying red.

If Radeons are significantly less expensive for the same (or better) performance (which they almost always are), more and more people will make the same discovery he did and that will do far more to increase profitability than an extra $100 per card in a single generation. It looks like AMD has finally figured this simple fact out.
Very humble pricing for a flagship card. I'm both skeptical about why they kept the price of the 7900 XTX lower than the 4090 and, at the same time, excited to see how it will turn out.
The way I see it, the reasons the RX 7900 XTX is priced the way it is are two-fold.
Reason #1 - From what we've seen, the RTX 4090 is way out on a level that most people don't care about (in both performance AND price), so why would AMD bother incurring the cost of getting ATi to produce a GPU like that? Developing extreme models like that is disproportionately expensive and requires things like huge power draw and exotic cooling solutions.

Remember that people are so dumb that they actually paid MORE for RTX 3080s with only 10GB of VRAM than for the RX 6900 XT, despite the latter being faster and having 6GB more VRAM. I'm sure that AMD noticed this and said to themselves "If they're THAT brainwashed then we're wasting our time." and so didn't bother getting ATi to make anything that would cost more than it would be worth.

It's a far better business decision to instead save that development money and use it to offer lower prices on cards in market segments that people will actually buy. This fixation on halo products is just plain stupid because they have absolutely no impact on the cards that most people will actually buy.
Looking at how the 6900 XT and 6950 XT gave the 3090s a run for their money, trading blows in performance, AMD should be keeping up the tempo this gen too.
The problem is, as I said, too many fools wanted to pay more for a slower 10GB 3080 than less for a faster 16GB 6900 XT. Hell, even I was astonished by this because I thought for sure people would say "Oh hell, good enough!" but they didn't.

You know, you can't fix stupid and so AMD isn't even going to try. They're just going to leverage the cost savings of using chiplets to flood the market and gain mindshare. Then when the efficiency of chiplets causes Radeon performance to exceed that of GeForce, they'll be in a much better position to actually get sales at the halo level. Remember that ATi's principal source of revenue isn't RDNA, it's CDNA and Radeon Instinct is making them an absolute killing. They have no need to kill themselves trying to chase the retail halo crown. As things are now, it would be a fool's errand anyway.
At the risk of disappointing all those awaiting these 7-series Radeons...
I don't get why anyone would be disappointed. Having great cards that people can actually afford to buy and use is, to me, a much better outcome than having two halo-level products with halo-level pricing that few, if any, can afford. That would severely damage PC gaming in general. I truly believe that Radeon will eventually surpass GeForce, just like Zen did with Core.

Remember that AMD was much further behind Intel than ATi was behind nVidia but still came back from the brink to slap Intel down. Now, with AMD being a much stronger company than they were then (EPYC completely OWNS the server space these days), there's no reason why they can't do the same to nVidia.
Chiplet design certainly seems more elegant on paper than the old-school monolithic approach. Whether this translates to more reasonable prices in the mid-tier (via improved wafer yields) remains to be seen.
Well, it worked amazingly with Zen so it should work with both CDNA and RDNA.
Yes, their chiplet design will give them higher yields, too, so it will be easier for them to keep supply up and costs down versus Nvidia in this generation.
Exactly. This is a huge opportunity for them.
Whether the supply will be enough, whether the rasterization performance will be high enough, and whether RT makes a huge impact will be the deciding factors for most people.
Increased yields mean increased supply and the rasterisation performance of even the previous generation was overkill for most people. There's no way that it won't be good enough now that it's better than even that.
These cards are quite affordable compared to Nvidia, so I expect from a sales side they will be a strong winner for AMD. So long as they can beat the 4080 slightly in performance, these cards may also become the go-to for the content creator crowd: lots of memory for a lot cheaper, and still enough compute to get the job done, without also costing as much money on the energy side.
I think what AMD wants is to get as many Radeon cards into as many gamer hands as possible to increase their mindshare as much as possible. Pricing like this will do that.
Yes because we should always trust AMD and Nvidia's slides to show 100% accuracy and transparency.
I know, eh? That presentation was GAWD-AWFUL! I was like "Why are they only showing frame rates with FSR? I don't care about FSR, I want to know how good the HARDWARE is!". It was just plain stupid, but these presentations usually are. All I wanted from this presentation was to actually see the cards, learn the pricing and learn the release date. I have no intention of buying one because my RX 6800 XT will do me fine for many years to come but I do enjoy discussions like this and so I wanted to be informed. I also like to see the bigger picture and to do that, one must always be paying attention. :laughing:
Nah, I'm complaining mostly about the 80-tier cards, which could've been around the $500-700 mark. I don't really care what they want for the XTX 9999 Super Ti Halo Edition; the only reason I worry about halo prices is that through halo products the corps dictate the price tags of the more "reasonable" products. If that weren't the case and the 4080 had the top-tier chip and cost $599-699, I couldn't care less if they priced the 4090 north of 10 grand.
Anyway, I'm fine seeing AMD's flagship going for $999 and not $1.5K, so that should bring prices down. Too bad for the 7900 XT though: $100 off is insignificant for that performance gap and still guarantees AIBs will go north of $1000 for a non-top-tier card.
It seems that their plan is to price new cards for $100 less than the previous generation. That would put the RX 7800 XT at $549, right where the RX 6800 XT should have been to begin with.
I've been a lawyer and a .NET dev since the start of my careers, and despite all the raises in my gross income, every year I've ended up earning less. But maybe it's just because I was born in Ukraine, so I was ****ed up in life from the start, and things are fairer in the "americas and europes".
No, it's more a matter of "In the USA, if you're already rich, you get richer but if you're not already rich, you get poorer.". Where the private sector is involved, the people at the very top make a mint while everyone else gets squeezed to death.

It's even worse in the USA than in most places in the developed world because people get hit with medical bills that cause them to declare bankruptcy. The thing is that Americans tend to be very short-sighted and say things like "Why should I pay someone else's medical bills?" because they have no concept of the fact that one day, someone else will be paying theirs. They seem to have this idea that they'll never get old and sick. Now, elderly people in the USA have become impoverished because of this same attitude that they had when they were young. I guess it's a kind of poetic justice because they're being punished for their own greedy actions in the past.
To me, the most exciting thing about these new AMD graphics cards is that they support DisplayPort 2.1. If one is going to get a monster graphics card, one ought to be able to actually enjoy the high frame rates it makes possible!
Yeah but looking at the performance of the RTX 4090, is it really that hampered when it's still on another level despite having only DP 1.4 and despite the Radeons having DP2.1?
So the big feature, chiplets, is a bit of a disappointment to me.
What were you expecting? You should remember that when Zen first came out, people were also "disappointed" that it didn't completely mop the floor with Intel Core on the first try. Today though, AM4 outsells AM5 and Intel's 13th gen COMBINED.
Also, I'm not surprised their ray tracing performance is likely to be way behind Nvidia's. Sure, that doesn't sound great from a marketing perspective. But they've managed to make their ray tracing and upscaling at least adequate, and for serious competitive gamers, frame rates matter, not being able to turn settings up to make the game look prettier.
Nobody complained about the RT performance of the RTX 3000-series so if the Radeons are capable of that, nobody should be complaining about them either.
So AMD is giving people the choice of not paying for ray tracing they don't need.
That's a good way of looking at it. (y) (Y)
A big reason, though, that will lead many people to choose Nvidia over AMD is that Nvidia Broadcast is a good free solution to replacing the background for streamers.
I don't agree with that assessment because 99% of gamers are not streamers. Sure, it's a great advantage for streamers but the percentage of the market is so small that it's like talking about extreme overclockers. Sure, they exist but they're so few as to be irrelevant.
AMD doesn't have anything like that; there's one deficiency in their software suite that they should be addressing.
Again, it's because there aren't enough streamers out there for it to be a priority.
Of course they don't have tensor cores, but since there are non-Nvidia solutions (either paid software, or awkward kludges) at present, AMD could still provide something that was nearly as good.
Sure, they could, but the difference made by RT is still so small that I really don't care about it and I'm thinking that most people wouldn't care if they actually thought logically about it instead of just thinking "I WANT IT, I WANT IT!".
The changes mentioned are neat. Is it just me, or does everyone feel like this is a stopgap?

There's a sense of something much more brewing in the AMD labs; they are successfully tweaking and adding features they didn't have before. It's busy over there... Great work if you ask me.
You're right, it IS a stopgap. Zen was a stopgap before Zen 2 and Zen 3 came along to topple Intel. The first chiplet Zen models weren't better than Intel Core in most cases, but Zen had room to improve. RDNA has that same room to improve.
Honestly very tempted by the 7900 XTX. As usual I'll wait for reviews, but it probably performs better than anything else other than the 4090, which it's probably nipping at the heels of anyway. For near enough half the price (in the UK you can't really get hold of a 4090 for less than £2k), I might actually pick one up for my new build.
If more people could think with their brain like you do, I think that this world would be a better place because you're absolutely right. (y) (Y)
 
Obviously, AMD wouldn't be doing any of this and setting the prices at the level they have, if there wasn't a significant enough margin to make it all worthwhile.
Agreed. It's also the best time for AMD to accept lower margins because they currently own the server CPU market with EPYC and so they're already raking in money hand over fist. They've never been in a better position to really try to make a difference on the ATi side.
 
It is better to be the king of the $1000 tier than the garbage of the $1600 tier.

AMD knows that if they ask for $1500 they won't stand a chance against a 4090, because the Nvidia name and RT performance win easily; thanks to savings on the hardware and so on, they can post big earnings even while selling cheaper. It's also VERY important to attract a future PS6 and Xbox contract.
They could have priced it at 4080 prices. It seems like they don't have a slam dunk on performance, so they are going for the value play. Not a bad idea, and the cards do look interesting, but we need to see comparisons. And we really need to see the 4080, which won't be here until December from what I hear.
 
RDNA 3 is more potent at gaming than AD102. What AMD announced were their stock cards, while AIBs will be able to push 3,000 MHz.

Giving AMD's partners the ability to overclock and carve out their own niche in the market, which is the exact opposite of what Nvidia did.

Expect the 7900 XTX to outperform the RTX 4090 by 10% or more.
 