AMD's upcoming RX 7900 XT rumored to have 24 GB of VRAM, 384-bit memory bus

Tudor Cibean

Staff
The big picture: Graphics cards based on AMD's new RDNA3 architecture are rumored to debut later this year. The flagship Radeon RX 7900 XT will reportedly feature a multi-chip module (MCM) GPU design with 384 MB of Infinity Cache and a staggering 24 GB of GDDR6 memory.

Both AMD and Nvidia are expected to launch their next-gen graphics cards later this year. We've already had plenty of leaks involving Nvidia's GPUs, so today, we're focusing on news involving AMD's upcoming RDNA3-based cards.

Judging by past releases, AMD is probably going to announce the higher-end models first, which are going to be based on the Navi 31 GPU. Rumors are that it will feature a 384-bit memory bus, and it will ship with up to 24 GB of VRAM. Assuming that the company will use 18 Gbps GDDR6 modules again, this would result in a memory bandwidth of 864 GB/s.
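That bandwidth figure follows directly from the rumored specs. As a quick sanity check of the arithmetic (the bus width and data rate are the rumored values from above, not confirmed numbers):

```python
# Memory bandwidth = (bus width in bytes) x (per-pin data rate)
bus_width_bits = 384      # rumored Navi 31 memory bus
data_rate_gbps = 18       # 18 Gbps GDDR6, same modules as Navi 21
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # 864 GB/s
```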

We already knew that the GPU might feature an MCM design, but instead of featuring multiple GCDs (Graphics Compute Dies) as previously thought, AMD will reportedly use six MCDs (Memory Cache Dies) alongside one big GCD.

What this means is that each MCD will feature 64 MB of L3 cache, combining for a total of 384 MB of what AMD calls Infinity Cache, three times as much as the 128 MB on the monolithic Navi 21 GPU. Splitting the GPU up into multiple dies could also improve yields, resulting in lower production costs.
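The cache math from the rumor works out as a simple multiplication; a small sketch (the 128 MB figure is Navi 21's known Infinity Cache size, the rest are the rumored RDNA3 numbers):

```python
# Rumored RDNA3 Infinity Cache: six MCDs, each carrying 64 MB of L3
mcd_count = 6
l3_per_mcd_mb = 64
infinity_cache_mb = mcd_count * l3_per_mcd_mb
print(infinity_cache_mb)                     # 384 MB total

navi21_cache_mb = 128                        # monolithic Navi 21 Infinity Cache
print(infinity_cache_mb // navi21_cache_mb)  # 3x Navi 21
```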

The GCD will be fabricated on one of TSMC's 5nm process nodes, with the MCDs relegated to a 6nm node. AMD's Zen 3 CPUs use a similar design, as the I/O dies use a less-advanced GlobalFoundries process.

The Radeon RX 7900 XT is probably going to be the highest-end SKU based on Navi 31 (at least until a refresh arrives). It'll reportedly arrive in the third or fourth quarter, with rumors putting its gaming performance at over twice that of the 6900 XT.


 
"The flagship Radeon RX 7900 XT will reportedly feature ... a staggering 24 GB of GDDR6 memory."

Why is this "staggering"?? The Nvidia 3090 already has that...
 
They were competitive at resolutions below 4K; I think they lacked memory bandwidth.
Ray tracing is stupid; even the 3090 Ti can't do it properly. Hopefully next-gen GPUs are powerful enough.

I agree that RT is just pathetic from both parties.

I don't know whether I should feel worse for AMD for being so far behind Nvidia with RT, or for Nvidia having dedicated cores for it and still sucking at it.

My guess is we'll be on the cusp of seeing both parties handle RT at an acceptable level this upcoming gen, but it won't be until the following generation that RT becomes the norm.
 
Personally, I have seen maybe two games where RT makes enough of a difference that I would care for it; the rest is just the same hype for no significant result.

That said, I have mentioned this before: proper RT-capable GPUs are at least two gens away, and who knows when a game that properly uses RT, beyond "let's pause the game and admire this puddle," comes out.

I personally care for a GPU that can push 120 FPS@4K without stupid scaling with all settings on ultra.

I simply hate jaggies.
 
When Ray Tracing isn't a factor, AMD is OK. When Ray Tracing is a factor, they are a whole generation behind Nvidia and show no signs of catching up.
#1 Personally, I played through Quake II RTX and I saw NOTHING about Ray Tracing that couldn't have been accomplished using the same technology I've seen in newer games (Quake is really, really old).

#2 Ray Tracing is basically Nvidia moving the goal post so that they can advertise technology that AMD couldn't catch up with immediately - but it really doesn't make a difference in actual gameplay and barely makes much difference in visuals.

#3 With my 3090 FTW3 turned way up playing Cyberpunk, I fail to truly see how much "better" RT makes the game. It's easily ignorable.
:joy: It's too easy with you.
 
When Ray Tracing isn't a factor, AMD is OK. When Ray Tracing is a factor, they are a whole generation behind Nvidia and show no signs of catching up.

How can you say they show no signs of catching up when RDNA3 isn't even released? Surely we can wait until we have actual Lovelace and RDNA3 cards to compare. RDNA3 will certainly beat Ampere for RT, and from what I've read I think they'll narrow the gap considerably, even if they're still behind. But with more games supporting FSR, I'm sure RDNA3 RT performance will be good.
 
Hopefully AMD can be more competitive this time around; last gen's RT performance was a letdown compared to Nvidia's.
So was 1st gen nvidia RT = Turing.

See the similarity? Maybe less fanboying then... maybe...:rolleyes:
Ray traced SOUND is way more important than ray traced light....
Do you play with your eyes closed?

Since when is sound more important than vision...
 
RX 7970 XT When??
They would be mad not to do a special edition with that name. The 7970 was legendary.
One thing's for sure: the RX 7900 XT is gonna cost a lot of money. I honestly wouldn't be surprised if we get a $1,500 MSRP.
 
So was 1st gen nvidia RT = Turing.

See the similarity? Maybe less fanboying then... maybe...:rolleyes:

Do you play with your eyes closed?

Since when is sound more important than vision...

We understand you like marketing and don't actually PLAY games, or compete.

We understand that you are the only person (in the world) who buys a $1k+ video card to get slower frames in single-player walk-through games, for ray tracing & reflective puddles, right...?


We get that you do not play competitively and don't need as precise a sound environment as possible... accurate sound stage doesn't matter to you, accurate placement of footsteps doesn't matter to you, 'cuz you are a casual player.

Some people play games, some people compete... ray tracing is for the little people who need shiny things.
 
Honest question. If flagship cards can push 4k @ say 200fps, what exactly would RT do for you anyway? Not coming from the AMD side with this question, but it seems to me it would be like a guitar amp that has a volume knob that goes to 11.
 
I couldnt care less. I have had too many bad experiences with Radeons, I'm never ever buying one again.
 
We understand you like marketing and don't actually PLAY games, or compete.

We understand that you are the only person (in the world) who buys a $1k+ video card to get slower frames in single-player walk-through games, for ray tracing & reflective puddles, right...?


We get that you do not play competitively and don't need as precise a sound environment as possible... accurate sound stage doesn't matter to you, accurate placement of footsteps doesn't matter to you, 'cuz you are a casual player.

Some people play games, some people compete... ray tracing is for the little people who need shiny things.
Who is "we"? Are you talking about yourself in plural?

You make too many assumptions, you "big person" that you think you are, about "little people" - I suppose that includes me, rofl.

Yes, I actually play a lot of games (you want a list for 2021-2022? last 10 or 20 years maybe?), and I couldn't give a flying **** about $1,000+ GPUs; I'll never pay more than $600-700.

That being said my point still stands: 1st gen nvidia RT, as in Turing, was and still is as abysmal in RT as 1st gen RT from AMD, as in, RDNA2. Those are facts, not feelings.

Secondly, vision is more important than sound, in every metric, in every tier and class, and even across the social spectrum. Again, facts, not feelings. Whether you're a competitive player or just an audio snob, I don't care which, you are still a minority with your wishful dreaming of RT sound. Again, facts, not feelings.

Careful not to fall off your high horse, "big person" - you might wake up to reality. In the meantime, go cry to Nvidia to make RT sound for you and your minority; see how that goes...
 
People just turned it off anyway, so it matters not. RT is just different, and it makes developers lazy.
No it doesn't; it allows more realistic lighting/reflections. The only reason you'd turn it off is if you don't have an RTX card.
 
Remind me where AMD was during Turing... oh yeah, nowhere...
So what's your point? That AMD is late? Sure, you won... the internet debate. Not.

My point still stands: 1st gen vs 1st gen is the same, horrible RT perf for both.

Don't you dare claim you would have expected AMD to leapfrog Nvidia in one generation and, what, beat them outright in raster and RT with RDNA2, when they jumped to 3090-level performance from the 5700 XT? Really? Pffft.

You should take those horse blinders off... or you might fall off your high horse too, like the other guy.
 