AMD's upcoming RDNA 5 flagship could target RTX 5080-level performance with better RT

AMD is using a new node too, as both are on TSMC.

The 6080 can be whatever Nvidia wants it to be (see the 4080 and 5080). They now have their GPUs spread very evenly to encourage upselling to the next level, so it is highly likely they'll do a consistent performance increase across the line to maintain that.

I believe they increased things less recently since AMD announced it was getting out of the high end, which now makes it easier for AMD to get back into the high end. They of course have ample room to increase the 6080 closer to the 6090 - but will they? My guess is they will shoot for beating AMD's 80 by 5% to still claim victory while keeping their halo distance high.
My point was simply that if you’re at par with the 5080, you’re losing to the 6080.
 
Could you tell me in which games it was?
https://www.techpowerup.com/review/sapphire-radeon-rx-9070-xt-nitro/ shows not a single game where the 3090 Ti wasn't slower by at least 5%. Mostly even more.


I am not a tech novice.
I just haven't forgotten how to count. Yet.
Well, you can start here, and then do your own homework.
And yes, I'm familiar with AMD fans, so tell us how that doesn't count because
__________INSERT EXCUSE HERE__________

The games they used are Red Dead Redemption 2, Dying Light 2, Horizon Zero Dawn, Just Cause 4 and Expedition 33.

3090 Ti MSRP: 1999 dollars
9070 XT MSRP: 599 dollars
I'm pretty sure those prices are quite different today.
The 3090 Ti is almost 3½ years old, and the 9070 XT is AMD's best in 2025.
That's what struck me the most.
 
Sooo, a midtier card for $1,000 again, but from AMD?
So by that logic Nvidia's second-best card this generation, the RTX 5080, is midtier at $1,100+.
Great try to match a last generation product when the opposition has moved 1 or 2 jumps ahead of you.
What two jumps ahead? The 5080 is only about 15% faster. Someone playing at 4K 50fps gets to 4K 57fps with an extra 15%, so it's a pretty meaningless gap. The 5080 does not even have 24GB of VRAM.
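The 15% figure in the post above is easy to sanity-check. A throwaway calculation (using the post's own 50 fps example, not measured data) shows why the gap feels small in practice:

```python
# Sanity check of the "15% faster" framing. The inputs (50 fps baseline,
# +15% uplift) are the post's own example numbers, not benchmarks.
base_fps = 50.0   # 4K frame rate on the slower card
uplift = 0.15     # claimed 5080 advantage

faster_fps = base_fps * (1 + uplift)
print(f"{base_fps:.0f} fps -> {faster_fps:.1f} fps")  # 50 fps -> 57.5 fps

# Frame-time view: how much sooner each frame actually arrives.
base_ms = 1000 / base_fps
faster_ms = 1000 / faster_fps
print(f"{base_ms:.1f} ms -> {faster_ms:.1f} ms per frame")
```

Seven and a half extra frames per second, or about 2.6 ms shaved off each frame time.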
Will AMD finally catch up to nVidia and Intel in terms of Ray and Path tracing?
They already have in RT. Only in PT titles does Nvidia still have a clear edge, and those are few and far between.
I'm not sure the main article really maths on this one.

A 96CU 384-bit RDNA4 GPU (so 50% more of everything, but current gen IPC) would already be comfortably between the 4090 and 5090 (so quite a bit faster than the 5080, which is still slower than the 4090).

Assuming RDNA5/UDNA whatever it's called comes with some IPC uplift as well, you're already looking at a part that's competitive with the 5090 if not faster.
Exactly. And people here are wondering if it will beat the 5080, not realizing that if AMD made a 96CU RDNA4 with zero alterations to clock speeds etc. it would already beat the 5080 comfortably. And they're wondering whether a 96CU UDNA on a smaller node with clock-speed and IPC advancements will do that...
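As a rough illustration of the scaling argument above, here is a hypothetical back-of-envelope in Python. The CU counts and the ~15% deficit are the thread's own numbers; the linear-scaling assumption is deliberately naive, since real GPUs scale sub-linearly with CU count (memory bandwidth and front-end limits), so treat the result as an optimistic upper bound, not a prediction:

```python
# Naive CU-scaling estimate using the thread's own figures.
# Assumes perfectly linear scaling with CU count, which real
# GPUs do NOT achieve -- this is an upper bound, not a forecast.
rdna4_cus = 64      # 9070 XT (current part)
rumored_cus = 96    # hypothetical bigger RDNA4/UDNA part

scale = rumored_cus / rdna4_cus
print(f"CU scaling factor: {scale:.2f}x")  # 1.50x

# Thread's claim: the 9070 XT is ~15% behind the 5080.
relative_to_5080 = 1 / 1.15        # 9070 XT as a fraction of 5080 perf
big_part = relative_to_5080 * scale
print(f"Naive 96CU estimate vs 5080: {big_part:.2f}x")  # 1.30x
```

Even with heavy discounting for sub-linear scaling, the naive ~1.3x headroom over the 5080 is why the "can it match the 5080" framing reads as pessimistic in this thread.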
"With better RT." And by the time it releases, the RTX 6080 will be out.
If the 6080 is as underwhelming as the 5080, it will only beat the 5080 and still fall way short of the 5090.
But AMD's losing big time this gen. DLSS 4 is awesome, and NV has two good to great high end options in the 5070 and 5070 Ti.
Losing at what? There's nothing to lose at supposed 9% market share. 5070 is only 12GB.
That 9070 XT needed to be no higher than $650 and it's just now getting to $700-750. At that price the 5070 Ti's a no brainer (just bought one for $750).
That's not the case in most countries. Here the 9070 XT is 687€ while the 5070 Ti is 850€. That's a 163€ difference. Nothing the 5070 Ti has makes it worth the extra in my eyes. This gap has actually grown; last month when I checked, it was 130€.
Of course, if the 9070s didn't exist, the 5070s would be priced as ridiculously as the 5080 and 5090, so, thank you AMD. AMD's the only reason we're now getting 4080-level cards (5070 Ti) for $750.
The age old meme about people who want AMD to be cheaper only to turn around and buy Nvidia anyway. You deserve to be ripped off by Nvidia with that mentality. AMD is not a charity for Nvidia buyers, nor should they be.
Except AMD is admitting it WON'T be a flagship - since they're targeting the 5080...
AMD has admitted nothing. This is a rumor. We don't even know for sure if it will be called RDNA5 or UDNA.
My nephew helped me put the AMD GPUs' performance in perspective. He has a 3090 Ti he bought in April of 2022. A friend of his bought a 9070 XT a few months ago. They ran head-to-head in one of their systems, and the 3090 Ti is faster than the 9070 XT. Sometimes by a lot.
Someone can be a tech novice such as myself and still know that is just pitiful.
Do they both have the same CPU and other hardware?
The 9070 XT is about 11% faster than the 3090 Ti, much more efficient, and faster even in RT, along with access to FG that Nvidia themselves blocked on the 30 series and earlier.
Wait, so you are telling me both a 96CU video card and a stripped-down 80CU console can both match or even beat the RTX 5080?

Does not compute.
Clearly it does not compute to you that when a 64CU RDNA4 is already only 15% behind the 5080, a 96CU UDNA design on a new node with higher clocks will easily beat the 5080 and likely the 6080. Hell, even a 96CU RDNA4 would already be able to do that.
3090 Ti MSRP: 1999 dollars
9070 XT MSRP: 599 dollars

" (y) (Y)"
Not to mention that 3090 Ti is a 450W+ card. 9070 XT is a 300W card. It also supports FG and is faster in RT.

Even buying a used 3090 Ti makes little sense.
 
A new RTX 3090 Ti still costs nearly double an RX 9070 XT.
 
The games they used are Red Dead Redemption 2, Dying Light 2, Horizon Zero Dawn, Just Cause 4 and Expedition 33.
Firstly, these were run at 4K. The 9070 XT is not a 4K card; it's a 1440p card. It can of course play at 4K, but that's not what it's meant for.

Secondly, this is launch-driver performance. As we've seen from the latest testing, 9070 XT performance has further improved through various driver updates.

Thirdly, these faceless benchmark channels are not reliable as a comparison. Who knows if their numbers are even real, or how they configured their systems.

Major reviewers across the board have found the 9070 XT about 10% better than the 3090 Ti across many more games, even if we exclude outliers where either AMD or Nvidia may be unusually slow or fast. Plus the advantages the 9070 XT has that I mentioned earlier: lower power consumption, the ability to use FG, and many more software features coming that the 3090 Ti will not receive. Better RT performance too. Not to mention not having to worry about the power cable melting.
 
They already have in RT. Only in PT titles does Nvidia still have a clear edge, and those are few and far between.

Not quite. Looking at the numbers, the 9070 XT still loses more performance in RT than the 5070 Ti. Going from slightly faster than a 5070 Ti in raster to slightly faster than a 5070 in RT is indicative of lower RT efficiency.
 
And on that note, yes, I do also realize that a different collection of games may have produced a different result. But I posted the YT video I did because he is known as a straight shooter in his benchmarking.

A new RTX 3090 Ti still costs nearly double an RX 9070 XT.
😲 I would never have thought they could still be had new.

Do they both have the same CPU and other hardware?
As I said in my first post, both GPUs were tested in the same PC.
They restored a disk image when it came time to test the 2nd GPU.
 
I do also realize that a different collection of games may have produced a different result. But I posted the YT video I did because he is known as a straight shooter in his benchmarking.
There is something rotten in that test.
Let's find out why he got different results than most sites.

I do not have fancy videos, so this ought to be enough as a starting point:
built-in benchmark results (eliminates human error) for the 7900 XT/9070 XT combined with various CPUs.

Could somebody please post results for the 3090 (Ti)?
 
I do not have fancy videos.
Honestly, that statement itself tells your story quite clearly.

Built-in benchmark results (eliminates human error) for the 7900 XT/9070 XT combined with various CPUs.
How the hell can you use that for actual comparative benchmark tests?
Plus there are plenty of others with similar results.
 