AMD's Radeon 7800 XT and 7700 XT could get announced around Computex next month

nanoguy

Why it matters: There are hints that AMD is prepping mid-range RDNA 3 GPUs, but the company may take its sweet time bringing them to market. The GPU market definitely needs more competition in this segment, though with gamers rejecting the RTX 4070, it should surprise no one that Team Red isn't interested in falling into a price trap should Nvidia decide to discount its Ada offerings.

AMD's Radeon RX 7900 XT is the second-best-selling high-end graphics card on Amazon right now, and the XFX Speedster Merc 310 model is currently $797.30, or 16 percent off its list price. As noted by Steven in his extensive analysis of the card against the similarly priced RTX 4070 Ti, the RX 7900 XT is faster in rasterization and offers more VRAM, so it's an easy choice for people who don't care much about ray-tracing performance.

However, many gamers have been waiting for Team Red to come up with mid-range offerings at a more palatable price point. So far, the company has been relatively quiet on that front, though we know it isn't busy making an RDNA 3-based RTX 4090 competitor, as doing so would go against AMD's general philosophy of balancing price and competitive performance.

Looking at the latest Steam survey, Nvidia GPUs clearly dominate, with the RTX 3060 the most popular card overall. AMD needs to build strength in the value segment, where Intel is also looking to gain ground with its Alchemist GPUs, all of which have received price cuts in the past several weeks.

The rumor mill says we'll see the RX 7800 XT, RX 7700 XT, and RX 7600 break cover around Computex this year, though it's possible the first two cards will be kept under wraps for longer due to yield issues with the Navi 31 and Navi 32 dies.

There's even word that AMD has been working on an RX 7800 XTX to sway more gamers away from the RTX 4070 Ti, but the company has postponed release plans indefinitely until it can solve the high rate of manufacturing defects seen during trial production runs. The purported specs for the new card include a cut-down Navi 31 GPU with 70 compute units (8,960 shader cores), 64 megabytes of Infinity Cache spread across four memory cache dies (MCDs), and 16 gigabytes of 21 Gbps GDDR6 memory connected over a 256-bit bus, all at a TBP of 300 watts.

Moving down the stack, the RX 7800 XT is expected to slot in between the RX 6900 XT and the RTX 4070 Ti in terms of rasterization performance. This card supposedly utilizes the full Navi 32 die, so it would pack a total of 60 compute units (7,680 shader units). The MCD count, Infinity Cache size, and GDDR6 memory configuration are the same as the XTX's, but the TBP is expected to fall between 250 and 285 watts.

A cut-down version of the Navi 32 GPU will be used for the RX 7700 XT, which is reportedly set to go up against Nvidia's upcoming RTX 4060 Ti. The latter card is all but confirmed to feature a modest eight gigabytes of VRAM, while the Team Red counterpart will have 12 gigabytes of GDDR6. The rest of the specs include 54 compute units, 48 megabytes of Infinity Cache, and a TBP of 225 watts.
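
All of the shader, cache, and bus figures in the rumored specs above follow from simple RDNA 3 arithmetic: compute units are counted as 128 shaders each, and every MCD contributes 16 MB of Infinity Cache plus a 64-bit GDDR6 channel. Here's a minimal sketch of that math; the per-CU and per-MCD constants are the publicly known RDNA 3 figures, while the CU and MCD counts are the rumored, unconfirmed specs:

```python
# Illustrative RDNA 3 spec arithmetic. Constants are the known per-CU and
# per-MCD figures; the CU/MCD counts below are rumored specs, not confirmed.
SHADERS_PER_CU = 128    # RDNA 3 dual-issue CUs are counted as 128 shaders
CACHE_PER_MCD_MB = 16   # each MCD carries 16 MB of Infinity Cache
BUS_PER_MCD_BITS = 64   # each MCD provides a 64-bit GDDR6 channel

def derive_specs(name: str, compute_units: int, mcds: int) -> None:
    """Derive shader count, cache size, and bus width from CU/MCD counts."""
    print(f"{name}: {compute_units * SHADERS_PER_CU} shaders, "
          f"{mcds * CACHE_PER_MCD_MB} MB Infinity Cache, "
          f"{mcds * BUS_PER_MCD_BITS}-bit bus")

derive_specs("RX 7800 XTX (rumored)", compute_units=70, mcds=4)  # 8,960 / 64 MB / 256-bit
derive_specs("RX 7800 XT (rumored)", compute_units=60, mcds=4)   # 7,680 / 64 MB / 256-bit
derive_specs("RX 7700 XT (rumored)", compute_units=54, mcds=3)   # 6,912 / 48 MB / 192-bit
```

Note how the 7700 XT's three MCDs also imply the 192-bit bus that its rumored 12 GB memory configuration requires.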

Also read: GPU Pricing Update April 2023 – Is the Nvidia RTX 4070 Another Flop?

Understandably, AMD isn't sure how to price these GPUs without eating too much into its margins while still making them worthwhile for its AIB partners to build into custom cards. Gamers have so far been rejecting Nvidia's mid-range Ada offering, and the RTX 4070 Ti is seeing even weaker demand despite discounts down to $815. At the same time, retailers have yet to clear RX 6000 series stock, even after AMD slashed prices on cards like the RX 6800 XT and RX 6950 XT.

Then there's the RX 7600, which may also be revealed at Computex and could be great value for many people who are still gaming at 1080p. The upcoming card has the potential to offer RX 6750 XT levels of rasterization performance, but the limited VRAM capacity means it could have trouble selling even at $300.


 
The 7900XT is #20 on the Amazon best seller list at the time of this post. Where is the author seeing that it’s #2?
 
Just put the effing VRAM on the card; it's not like it's expensive or in short supply. Is it really going to cost more than an extra $20-30 to create a product with some life in it?
 
Just put the effing VRAM on the card; it's not like it's expensive or in short supply. Is it really going to cost more than an extra $20-30 to create a product with some life in it?
That's the thing. They don't want you to keep your GPU too long.. You have to think about these poor companies.. /s
 
That's the thing. They don't want you to keep your GPU too long.. You have to think about these poor companies.. /s
Look, AMD has been a lot better about this than Nvidia has. They're a business; they're entitled to make money. 8 GB for ~$250 is fine. Memory is so cheap that they should just put the extra $20 of chips on the board and blend it into the MSRP. A mid-range card will always be a mid-range card; they aren't going to get more FPS out of the GPU just by adding some VRAM, so those GPUs are going to "expire" in a performance sense regardless. Just give them the extra VRAM so that expiration date gets pushed down the road. We have two options, so which is worse? Either you give the card enough VRAM, and the user upgrades when the frames aren't high enough, or you limit the VRAM and the card gets artificially limited.
 
If AMD wants to take some market share

7800 XT @ $525 with 16GB and performance of ~6950 XT
7700 XT @ $375 with 12GB (16 preferred but not likely) and performance of ~6800 XT
7600 XT @ $300 with 12GB and performance of ~6800
7600 @ $225 with 8GB and performance ~6700 XT

Otherwise, these will not sell any better than what's already on the market. You can get a 6950 XT for $600 now; these are going to need to be a better deal than that to move the needle.
 
Look, AMD has been a lot better about this than Nvidia has. They're a business; they're entitled to make money. 8 GB for ~$250 is fine. Memory is so cheap that they should just put the extra $20 of chips on the board and blend it into the MSRP. A mid-range card will always be a mid-range card; they aren't going to get more FPS out of the GPU just by adding some VRAM, so those GPUs are going to "expire" in a performance sense regardless. Just give them the extra VRAM so that expiration date gets pushed down the road. We have two options, so which is worse? Either you give the card enough VRAM, and the user upgrades when the frames aren't high enough, or you limit the VRAM and the card gets artificially limited.
Oh, I agree with you that AMD has been a lot more generous than Nvidia with VRAM. And yes, even if you give 20GB of VRAM to a budget card, it will never use it. But right now they have 10% of the gaming market, so they need to be generous with VRAM and price lower at equivalent performance tiers.
 
Steve said:

"However, many gamers have been waiting for Team Red to come up with mid-range offerings at a more palatable price point. So far, the company has been relatively quiet on that front, though we know it's not busy making an RTX 4090 competitor based on the RDNA 3 architecture due to it being against AMD's general philosophy of balancing price and competitive performance."

Well, that and the fact that they can't make a 4090 competitor with this generation. :p
 
One of the attractions of the 4090 is the high VRAM and CUDA. If you were going to spend 4090 money on an Nvidia card with CUDA or an AMD card without, which would you choose? I helped a friend out with some ML computations (complex weather prediction models) for a degree course, and the recommended software was open source and CUDA-based. A 3060 Ti churned through it very quickly.

We are beyond the point where GPU companies are "discovering" amazing ways to boost their cards' performance at amazingly lower cost. AMD isn't about to present RDNA 4 as a light-bulb moment; no one is discovering night-and-day tech that suddenly makes everything cheaper, so the price points current cards sit at aren't coming down. There's no benefit for AMD in bringing out a card for less money with the performance of an old card that is still in stores. They aren't about to "win" the GPU wars and grab all the market share.

It's a real shame this point has been reached off the back of the crypto rush and the pandemic with their inflated prices, because right now we are f*cked for at least two generations if we expect any cards with even 3080 performance to be released at the original 2060 MSRP within four years (and if anyone does release one, I suspect it will ship with minimal VRAM to keep the higher-priced cards in demand).

The only thing AMD have done that stands out for me is the VRAM. Don’t know who was drinking heavily the day Nvidia put all that VRAM on the 3060. I suspect they have now been fired.
 
It's a shame AMD can't design their GPUs using chiplets like they've done for their CPUs (or do they?). That would mean better yields, which lead to better prices. It should also make it fairly easy to create a range of products by varying the number of chiplets, VRAM, and speeds. Personally, I don't want to game at 4K, but I do want an affordable mid-range product. Here's hoping for the 7600.
 
Steve said:

"However, many gamers have been waiting for Team Red to come up with mid-range offerings at a more palatable price point. So far, the company has been relatively quiet on that front, though we know it's not busy making an RTX 4090 competitor based on the RDNA 3 architecture due to it being against AMD's general philosophy of balancing price and competitive performance."

Well, that and the fact that they can't make a 4090 competitor with this generation. :p

Can't...?
Dr. Lisa Su announced that they could... it just wasn't necessary, because Navi 31 beats Nvidia's $1,200 chip and sometimes their $1,600 chip!

Before Navi 31:

[attached chart]

After Navi 31:

[attached chart]
 
It's a shame AMD can't design their GPUs using chiplets like they've done for their CPUs (or do they?). That would mean better yields, which lead to better prices. It should also make it fairly easy to create a range of products by varying the number of chiplets, VRAM, and speeds. Personally, I don't want to game at 4K, but I do want an affordable mid-range product. Here's hoping for the 7600.
The RDNA 3 chiplet design was just a first try, made in a low-risk, low-reward style. Making too many changes at the same time usually means trouble.
 
It's a shame AMD can't design their GPUs using chiplets like they've done for their CPUs (or do they?). That would mean better yields, which lead to better prices. It should also make it fairly easy to create a range of products by varying the number of chiplets, VRAM, and speeds. Personally, I don't want to game at 4K, but I do want an affordable mid-range product. Here's hoping for the 7600.
The workloads of CPUs and GPUs are vastly different, and splitting up a GPU into a stack of Zen 4-sized chiplets would create an expensive interconnect/data management/packaging problem. The cost of resolving that would completely negate any benefits from using lots of little chips.

While Navi 31/32 isn't very chiplet-y (only the memory controllers and the L3 cache slices attached to them sit in separate dies), input/output circuitry and SRAM just don't shrink very well on newer fabrication nodes. So if AMD had stuck with a monolithic design, making a 530-ish mm2 die on TSMC's N5 would have resulted in yields no better than the Navi 21's on N7 while being more expensive to produce.

As it happens, the Navi 31 GCD is only 300 or so mm2 in size, roughly the same as the Navi 22, so each wafer churns out more dies. The MCDs are tiny, so even if the N6 yield for them were awful, AMD would still have plenty of them to field across various products. Some of the cost benefit will be eaten up by increased packaging costs, but evidently AMD felt it would still get a better return on the investment in the top-end and upper mid-range sectors (the bottom end will remain monolithic).
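
To put rough numbers on the yield argument, here's a minimal sketch using the classic dies-per-wafer approximation and a Poisson yield model. The defect density is a purely illustrative assumption (not a TSMC figure), and the die areas are the approximate ones mentioned above plus a roughly 37 mm2 MCD; the takeaway is just how quickly the good-die count climbs as die area shrinks:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic dies-per-wafer approximation with an edge-loss correction."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: fraction of dies that land zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)  # mm^2 -> cm^2

D0 = 0.08  # assumed defect density in defects/cm^2 -- purely illustrative
for name, area in [("monolithic ~530 mm2", 530),
                   ("Navi 31 GCD ~300 mm2", 300),
                   ("MCD ~37 mm2", 37)]:
    n = dies_per_wafer(area)
    y = poisson_yield(area, D0)
    print(f"{name}: {n} candidates/wafer, ~{y:.0%} yield, ~{n * y:.0f} good dies")
```

With these assumed inputs, the hypothetical 530 mm2 monolithic die nets around 70 good dies per wafer versus roughly 150 for the 300 mm2 GCD, while the tiny MCDs yield in the high 90s percent, which is the whole point of pushing the cache and memory controllers onto them.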

But it's important to note something here -- AMD never went down the chiplet route for its CPUs and GPUs to offer dozens of products, at low cost to the end user. It's all about improving its operating margins, as actual revenues are pretty good.
 
AMD really needs to step things up now, as Nvidia hasn't released anything that interesting. If they again trail Nvidia overall and keep delivering lacklustre ray tracing performance, nothing will be achieved.
 
AMD just needs to ship compelling 7000 series mid-range cards. I was hoping the 7700 XT would be a direct competitor to the 4070 (performance, power and packaging), but sadly, if the rumors are true, it is going to fall well short.
 
AMD could make a competitor to the 4090, and it would cost the same if not more, so why bother? Now, if you're interested in spending lots of money, look at the Pro workstation cards, such as the Pro W6800 with 32GB of memory. It costs roughly as much as the 4090 (about $100 more, in fact), yet according to Passmark it scores around 50% lower. Not a good place for AMD to be. Right now they've had to concede the Ultra/Titan market, as they simply don't have a GPU that scales as well as the 4090.
 
AMD could make a competitor to the 4090, and it would cost the same if not more, so why bother? Now, if you're interested in spending lots of money, look at the Pro workstation cards, such as the Pro W6800 with 32GB of memory. It costs roughly as much as the 4090 (about $100 more, in fact), yet according to Passmark it scores around 50% lower. Not a good place for AMD to be. Right now they've had to concede the Ultra/Titan market, as they simply don't have a GPU that scales as well as the 4090.

What are you talking about...?

[attached chart]
 
Unfortunately, I suspect this will be overpriced like all GPUs of late and end up as just another irrelevant AMD GPU.
I hope that's not the case, but recent history isn't good.
 
Wow, talk about cherry-picking! Let's look at the overall results, shall we?

Well, if Dr. Su said it, it must be true!! 😏 Nahhh, the RX 7900 XTX is a 4080 competitor, not the 4090.

And...?
I proved and showed you that in reality it's not true, and that AMD could make a larger dGPU but just didn't need to...

Again, the RTX 4080 NEVER beats the RTX 4090 in any game, yet the XTX does beat the venerable 4090 in a few games... and the 4090 costs 50% more money. How are those defeats possible? So perhaps Dr. Lisa Su is right: AMD didn't need to make a larger chip... which is why RTX 4090s are on sale now (because gamers are not getting 50% more performance for their 50% higher price).


Ask yourself this... do more people upgrade their video card for Call of Duty and multiplayer games, or do they upgrade it for RT and glorified graphics in single-player games? In all my years of gaming, I have never seen nor heard of a player upgrading their GPU for a single-player game...


Competitive games are what fuel GPU sales... this is the current reality:

[attached chart]
 
Well, posting a graphic image created by AMD really doesn't prove much LOL; I'm sure Nvidia has similar charts showing just the opposite. Let's stick with independent results to keep things fair.

Again, the 7900 XTX and the 4080 are clearly direct competitors. Here's proof:

[attached benchmark charts]

This is off-topic, so we may as well agree to disagree before a moderator steps in.
 