AMD rumored to be working on a Radeon RX 6900 XTX to challenge the RTX 3090


It’s laughable because AMD wanted twice the money for that 6900XT compared to what Nvidia wanted for the 3070. It’s also poor at ray tracing and doesn’t have DLSS. Any game that does have those things (and there are quite a lot now) will run better on the Nvidia part that costs half as much.

Really the 3070 wiped the floor with most of the market if you account for price and features.
But Radeons have FSR already, and it performs quite well.

Are you sure about prices? Here in Moscow, a Palit 3070 costs 1560 dollars, while an ASRock PG 6900XT costs 2000 dollars. The 3070 is not half the price "in the real world". That was only true at MSRP.
 
But Radeons have FSR already, and it performs quite well.

Are you sure about prices? Here in Moscow, a Palit 3070 costs 1560 dollars, while an ASRock PG 6900XT costs 2000 dollars. The 3070 is not half the price "in the real world". That was only true at MSRP.
FSR isn’t as widely supported as DLSS. Right now DLSS is clearly the preferable tech.

And both of those prices are ridiculous. Sure, at inflated street prices the 6900XT is better value than it is at MSRP, but it’s still an absolutely ridiculous amount of money to pay for what it is.
 
Nvidia was wrong releasing a technology (RT) for the RTX 2000 series
Actually, it was perfect for creating an army of mindless drones that won't stop repeating RT, even though it was useless on the RTX 20 series and is barely usable even with the DLSS gimmick on the RTX 30 series.

Just look at how much time and how many posts in here repeat the same thing, yet only one post said that, in his experience, RT did provide a difference in eye candy, and that was on an AMD GPU and only in 2 games.

The rest are simply repeating the same lines from the same brochure.
 
So do you genuinely believe an 8GB card will be limited within 12 months? Because that’s not going to happen lol.

I won’t buy a 3070 because they are well overpriced. But it would certainly do much better for much longer than any Radeon card. It has DLSS.

Maybe not in a year, but the 3070 is not a card that will last 5 years... You already have games that need that full 8GB of memory now, and things will only get worse for it. nVidia played us all with these shitty buffer sizes...
 
Maybe not in a year, but the 3070 is not a card that will last 5 years... You already have games that need that full 8GB of memory now, and things will only get worse for it. nVidia played us all with these shitty buffer sizes...
And here I am, still using a GTX 660. If I had a 3070, it would certainly last me five years. Riding the bleeding edge of technology is, for the majority, a choice, not a necessity.
 
And here I am, still using a GTX 660. If I had a 3070, it would certainly last me five years. Riding the bleeding edge of technology is, for the majority, a choice, not a necessity.
:joy: :joy: You obviously don't play modern games. I have an R9 290X that I use as my spare card; it's probably a good 3 times faster than that 660, and even at 1080p it wasn't the best for many games just last year :p
 
I guess I’m stupid then ¯\_(ツ)_/¯

But at least I got my 3090 at MSRP at launch.

What may be coming out next year is of no use to me as I live in the present.

You may not feel that way now (although there are a few games that don't play at 4K 60fps with Ultra + RT even today). And don't tell me the foolish argument that you can lower settings... you don't pay over $1k, in this case $1.5k, on the "best GPU" to play at Medium settings; that's really stupid (and I've seen people saying that...).
But next year we will get more next-gen games that are not cross-gen like the ones we have now, and those games will push the 3090/6900XT so hard that they will be crushed at 4K even with DLSS/FSR. That's why we will have such a huge jump with Lovelace and RDNA3: we need that power to actually play the next-gen games that are coming soon.

So when you see that happening to your 3090, you will think differently, if you have an ounce of self-reflection, that is. Some people will be pretty much oblivious to this change and will keep living in their own cognitively dissonant reality, pretending it's not true.

And here I am, still using a GTX 660. If I had a 3070, it would certainly last me five years. Riding the bleeding edge of technology is, for the majority, a choice, not a necessity.
If you buy the 3070 for 1080p (like I did with the 6700 XT), it will last you 4-5 years (without RT). But if you buy it because it can do 1440p today, you are wrong, like so many others who live and buy for "today".

People nowadays take the wrong advice from so many YT channels and sites: "buy only for now, what is good enough for you now", and that is wrong. With that mentality you will need to constantly upgrade, possibly as frequently as every year, so you will lose a lot more money in the long run than if you plan a little with your purchase and do a proper price/performance/longevity analysis to buy a GPU for more than 2 years.

There's only one caveat: you cannot do that for 4K. There is no GPU today that can last you 4 years (without a major downgrade in IQ settings); like I said above, as soon as next year those ludicrously expensive, so-called 4K GPUs of today will be mocked by next-gen games and next-gen GPUs alike.
You can still apply this to 1440p to some degree, but mostly to 1080p.

The biggest issue here is the 4K dangling carrot (another one, like RT). It's a stupid resolution that people have been hooked on for so many years now, yet for so many years we still can't do proper 4K 60fps in all games without lowering the settings in some. So what's the use of 4K Medium when you can do 1440p Ultra? Unless you are nuts and play by pausing every 2 minutes and doing 600% zoom like DF, you will not notice the difference, but you will notice the difference between Ultra and Medium settings from a mile away.

And now they push 4K 120fps and 8K 60fps. Hahaha, that's beyond stupid, an even bigger dangling carrot.

News flash: even with a 2x performance upgrade from Lovelace and RDNA3, we will not do 4K 120fps Ultra + RT or 8K 60fps Ultra + RT. So what's the use of those crazy resolutions if you play at Medium settings on a $5000 or $10000 PC?

There will be these kinds of "stupids" who pay that silly money thinking they are on top of the world, and they'll either play at 8K 40fps maxed out or at 60fps on Medium settings.

Riding the bleeding edge of technology is, for the majority, a choice, not a necessity.
Yes, it's not a necessity, but for the majority it's not even a choice; it's too expensive to be one.
 
Actually, it was perfect for creating an army of mindless drones that won't stop repeating RT, even though it was useless on the RTX 20 series and is barely usable even with the DLSS gimmick on the RTX 30 series.

Just look at how much time and how many posts in here repeat the same thing, yet only one post said that, in his experience, RT did provide a difference in eye candy, and that was on an AMD GPU and only in 2 games.

The rest are simply repeating the same lines from the same brochure.

Definitely agree with this observation. It's pretty fascinating how many people are now salesmen for the company. NVIDIA marketing has absolutely won.

Ray tracing for me is still in the gimmicky stage. It's just not worth cutting your frame rate in half, in my opinion. I can only speak for Cyberpunk and the Crysis remaster, as those are the only two games I play with RT. Turns out that while RT in Cyberpunk looks really good, I prefer the sparkly grain that the non-RT reflections have. Maybe that's just me though, haha.

The RT in Crysis Remastered is literally a -50% performance button without any meaningful difference. Plus it has lag spikes with RT on, just like Cyberpunk, which is strange.
 
Maybe not in a year, but the 3070 is not a card that will last 5 years... You already have games that need that full 8GB of memory now, and things will only get worse for it. nVidia played us all with these shitty buffer sizes...
You are incorrect. There are no games that require 8GB of memory. I think you don’t understand how memory works on a graphics card. It’s not about total capacity but more about memory bandwidth. There is a hard limit for some games, but we are way off 8GB.
 
You are incorrect. There are no games that require 8GB of memory. I think you don’t understand how memory works on a graphics card. It’s not about total capacity but more about memory bandwidth. There is a hard limit for some games, but we are way off 8GB.
What? Are you for real?
Hardware Unboxed has shown so many times by now how 8GB of VRAM fails at 4K in some games, like Doom Eternal (without RT), which is widely known by now unless someone lives under a rock or is a blind simp or a shill.

Here you go, seems you're the one who needs to understand (at 10:47 in the video):

This is not the only game. Tom from MLiD said the same thing, and he found out in one of the RE games, I think, forgot which one, that the fps tanks even at 1440p on the 3070 because of the 8GB of VRAM. Heavily modded games also all exceed 8GB of VRAM.

8GB of VRAM on the 3070 and 3070 Ti is beyond stupid for a $500-$600 (lying) MSRP, which in reality is $1000+ (real price). It's a mockery to get 8GB of VRAM at that price (I mean the MSRP) in 2021.

P.S. Just wait and see how the Ampere Super refresh will "fix" all the VRAM stupidities on Ampere and all will be well in the world, except for the fools who paid the RTX premium tax only to get less VRAM and be screwed (again, remember Turing?) by the Super refresh... I will really ROFL when this happens.
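
If anyone would rather check this on their own machine than take a video's word for it, a rough way is to just poll nvidia-smi while a game is running. A minimal sketch in Python, assuming an NVIDIA card with the usual driver tools on PATH (and keep in mind it reports allocated VRAM, which can be higher than what a game strictly needs):

import subprocess, time

# Print used/total VRAM of the first GPU every few seconds while a game runs.
while True:
    line = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    used, total = (int(x) for x in line.split(","))
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(5)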
 
What? Are you for real?
Hardware Unboxed has shown so many times by now how 8GB of VRAM fails at 4K in some games, like Doom Eternal (without RT), which is widely known by now unless someone lives under a rock or is a blind simp or a shill.

Here you go, seems you're the one who needs to understand (at 10:47 in the video):

This is not the only game. Tom from MLiD said the same thing, and he found out in one of the RE games, I think, forgot which one, that the fps tanks even at 1440p on the 3070 because of the 8GB of VRAM. Heavily modded games also all exceed 8GB of VRAM.

8GB of VRAM on the 3070 and 3070 Ti is beyond stupid for a $500-$600 (lying) MSRP, which in reality is $1000+ (real price). It's a mockery to get 8GB of VRAM at that price (I mean the MSRP) in 2021.

P.S. Just wait and see how the Ampere Super refresh will "fix" all the VRAM stupidities on Ampere and all will be well in the world, except for the fools who paid the RTX premium tax only to get less VRAM and be screwed (again, remember Turing?) by the Super refresh... I will really ROFL when this happens.
I am for real and I have a far better understanding than you do. Memory bandwidth is more important than total capacity. There are limits for memory but 8GB is clearly enough for now if you’re gaming at 1440p.

Buying a 3070 is far less stupid than buying any Radeon product. Radeon is absolutely trash these days, no ray tracing or DLSS, which is laughable. They also have dreadful drivers. I’m sorry if you don’t agree but there is a good reason Radeon is failing in sales, the product line has clearly been neglected by AMD in favour of consoles and CPUs. Absolutely no one with any sense should be buying Radeon today at MSRPs.

If you’re interested PM me, I’d be happy to educate you on how graphics cards work.
 
I am for real and I have a far better understanding than you do. Memory bandwidth is more important than total capacity.
Remember the R9 Fury X? It had 50% more bandwidth than the 980 Ti... The argument from everyone was that 4GB is not enough and the 6GB of the 980 Ti was much more important...
Fast forward to today and suddenly things have flipped. This is yet another example of how nVidia fanboys constantly shift the goalposts to accommodate nVidia, so that it always has the advantage. It's confirmation bias and selective thinking.
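
(For anyone who wants to check that number, and going by the spec sheets as I remember them: the Fury X's HBM was rated around 512 GB/s versus roughly 336 GB/s on the 980 Ti, and 512 / 336 ≈ 1.52, so about 50% more.)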

I am very sure that if it were AMD that had introduced RT, everyone would be calling it a gimmick. But because it's nVidia, suddenly it's the best thing ever. They can't complain about AMD's rasterization performance or power consumption, so they find the next best thing. Technically, they can't really complain about drivers either, but the lies never die...

Buying a 3070 is far less stupid than buying any Radeon product. Radeon is absolutely trash these days, no ray tracing or DLSS, which is laughable. They also have dreadful drivers. I’m sorry if you don’t agree but there is a good reason Radeon is failing in sales, the product line has clearly been neglected by AMD in favour of consoles and CPUs. Absolutely no one with any sense should be buying Radeon today at MSRPs.
I wonder how much nVidia is paying you. They better be, otherwise this is simply very sad.

I am for real and I have a far better understanding than you do.
If you’re interested PM me, I’d be happy to educate you on how graphics cards work.
No one wants to deal with you if you have such an attitude.
 
I am for real and I have a far better understanding than you do. Memory bandwidth is more important than total capacity. There are limits for memory but 8GB is clearly enough for now if you’re gaming at 1440p.

Buying a 3070 is far less stupid than buying any Radeon product. Radeon is absolutely trash these days, no ray tracing or DLSS, which is laughable. They also have dreadful drivers. I’m sorry if you don’t agree but there is a good reason Radeon is failing in sales, the product line has clearly been neglected by AMD in favour of consoles and CPUs. Absolutely no one with any sense should be buying Radeon today at MSRPs.

If you’re interested PM me, I’d be happy to educate you on how graphics cards work

I actually own a 3070 and a 6800 XT, and they are both good cards. In Horizon Zero Dawn, Cyberpunk, and Red Dead, the 3070 has actually had noticeably more driver crashes. The 3070 also had that weird stutter in Red Dead when you pan the camera too fast, which I could never get rid of. The 6800 never had that problem, so it was smoother in that respect.

That's only my experience on my own PC but the horrible drivers thing is definitely an exaggeration. I'd happily take a 6800 XT at MSRP.
 
You are incorrect. There are no games that require 8GB of memory. I think you don’t understand how memory works on a graphics card. It’s not about total capacity but more about memory bandwidth. There is a hard limit for some games, but we are way off 8GB.

Omg, I can't believe you just said that... I guess the new Doom must be lying when it asks for 9GB of memory for the Ultra Nightmare texture setting? Or
Remember the R9 Fury X? It had 50% more bandwidth than the 980 Ti... The argument from everyone was that 4GB is not enough and the 6GB of the 980 Ti was much more important...
Fast forward to today and suddenly things have flipped. This is yet another example of how nVidia fanboys constantly shift the goalposts to accommodate nVidia, so that it always has the advantage. It's confirmation bias and selective thinking.

I am very sure that if it were AMD that had introduced RT, everyone would be calling it a gimmick. But because it's nVidia, suddenly it's the best thing ever. They can't complain about AMD's rasterization performance or power consumption, so they find the next best thing. Technically, they can't really complain about drivers either, but the lies never die...


I wonder how much nVidia is paying you. They better be, otherwise this is simply very sad.



No one wants to deal with you if you have such an attitude.
The Fury X is a prime example of what will happen to the 3070 in the near future.
 
You may not feel that way now (although there are a few games that don't play at 4K 60fps with Ultra + RT even today). And don't tell me the foolish argument that you can lower settings... you don't pay over $1k, in this case $1.5k, on the "best GPU" to play at Medium settings; that's really stupid (and I've seen people saying that...).
But next year we will get more next-gen games that are not cross-gen like the ones we have now, and those games will push the 3090/6900XT so hard that they will be crushed at 4K even with DLSS/FSR. That's why we will have such a huge jump with Lovelace and RDNA3: we need that power to actually play the next-gen games that are coming soon.

So when you see that happening to your 3090, you will think differently, if you have an ounce of self-reflection, that is. Some people will be pretty much oblivious to this change and will keep living in their own cognitively dissonant reality, pretending it's not true.
None of the games I play right now support RT or DLSS. They do run well at 5120x1440 which is what matters to me.

I’ve stopped caring about next gen games. At least where AAA games are concerned. It’s all a milking fest of flashy effects with no substance. I’m looking forward to Final Fantasy 14 Endwalker, Total War: Warhammer 3 and maybe Age of Empires 4. If something changes and they actually release something with next-gen graphics AND gameplay at around the Lovelace/RDNA3 launch, then I’ll have a good excuse to get a new exciting toy and my wife can take over the 3090. I’ve been doing this since the ZX81, so I think I know the drill.

Keep in mind that I bought my 3090 at non-scalped price at launch. I did feel a bit of buyers remorse when I first got it, but it was the only option because all the 3080s were GONE before you could even think about clicking on the order button. Now, I feel like I won the lottery. While everyone is stressing over lack of GPUs and things looking bleak in general, I’m just chilling with my 3090 mini reactor. No, this won’t be state of the art when next gen comes out, but that’s always the case and what if the chip shortage is still going on at that point? Besides, when I got my 3090, next gen was just a distant rumor with zero reason to believe anything. I can’t base my dwindling life span on questionable Internet rumors. They have to be pretty solid before I base decisions on them.
 
None of the games I play right now support RT or DLSS. They do run well at 5120x1440 which is what matters to me.

I’ve stopped caring about next gen games. At least where AAA games are concerned. It’s all a milking fest of flashy effects with no substance. I’m looking forward to Final Fantasy 14 Endwalker, Total War: Warhammer 3 and maybe Age of Empires 4. If something changes and they actually release something with next-gen graphics AND gameplay at around the Lovelace/RDNA3 launch, then I’ll have a good excuse to get a new exciting toy and my wife can take over the 3090. I’ve been doing this since the ZX81, so I think I know the drill.

Keep in mind that I bought my 3090 at non-scalped price at launch. I did feel a bit of buyers remorse when I first got it, but it was the only option because all the 3080s were GONE before you could even think about clicking on the order button. Now, I feel like I won the lottery. While everyone is stressing over lack of GPUs and things looking bleak in general, I’m just chilling with my 3090 mini reactor. No, this won’t be state of the art when next gen comes out, but that’s always the case and what if the chip shortage is still going on at that point? Besides, when I got my 3090, next gen was just a distant rumor with zero reason to believe anything. I can’t base my dwindling life span on questionable Internet rumors. They have to be pretty solid before I base decisions on them.
I pretty much had the same experience (wanted a 6800 or 6800 XT, but only the 6900 XT was available, and it was going fast at MSRP). But I also got a Series X at MSRP, and like you, the games I care about are either old games or non-murder simulators.

Here's the kicker: except for some small details here and there, my old and tired eyes can't really see much of a difference between the PC and the Xbox games.
If anything, the set-and-forget operation of the Xbox seems to be winning over the PC.
Then there's the cost: Game Pass Ultimate is great value that's hard to ignore.

Let's see how this goes.
 
None of the games I play right now support RT or DLSS. They do run well at 5120x1440 which is what matters to me.

I’ve stopped caring about next gen games. At least where AAA games are concerned. It’s all a milking fest of flashy effects with no substance. I’m looking forward to Final Fantasy 14 Endwalker, Total War: Warhammer 3 and maybe Age of Empires 4. If something changes and they actually release something with next-gen graphics AND gameplay at around the Lovelace/RDNA3 launch, then I’ll have a good excuse to get a new exciting toy and my wife can take over the 3090. I’ve been doing this since the ZX81, so I think I know the drill.

Keep in mind that I bought my 3090 at non-scalped price at launch. I did feel a bit of buyers remorse when I first got it, but it was the only option because all the 3080s were GONE before you could even think about clicking on the order button. Now, I feel like I won the lottery. While everyone is stressing over lack of GPUs and things looking bleak in general, I’m just chilling with my 3090 mini reactor. No, this won’t be state of the art when next gen comes out, but that’s always the case and what if the chip shortage is still going on at that point? Besides, when I got my 3090, next gen was just a distant rumour with zero reason to believe anything. I can’t base my dwindling life span on questionable Internet rumours. They have to be pretty solid before I base decisions on them.

I used to be one of those guys who would call others stupid for doing what they want with their hard-earned cash, till I grew up a little and realised it's their money, not mine, so who am I to judge? 😅 I myself almost bought a 3090 because I really wanted to play Cyberpunk day 1, but they were going for £1600 at the time, and I ended up scoring a Palit RTX 3080 GameRock OC for £875, which I'm happy with. Maybe next time I will get the 4090 / 7900 XT :joy:
 
None of the games I play right now support RT or DLSS. They do run well at 5120x1440 which is what matters to me.

I’ve stopped caring about next gen games. At least where AAA games are concerned. It’s all a milking fest of flashy effects with no substance. I’m looking forward to Final Fantasy 14 Endwalker, Total War: Warhammer 3 and maybe Age of Empires 4. If something changes and they actually release something with next-gen graphics AND gameplay at around the Lovelace/RDNA3 launch, then I’ll have a good excuse to get a new exciting toy and my wife can take over the 3090. I’ve been doing this since the ZX81, so I think I know the drill.

Keep in mind that I bought my 3090 at non-scalped price at launch. I did feel a bit of buyers remorse when I first got it, but it was the only option because all the 3080s were GONE before you could even think about clicking on the order button. Now, I feel like I won the lottery. While everyone is stressing over lack of GPUs and things looking bleak in general, I’m just chilling with my 3090 mini reactor. No, this won’t be state of the art when next gen comes out, but that’s always the case and what if the chip shortage is still going on at that point? Besides, when I got my 3090, next gen was just a distant rumor with zero reason to believe anything. I can’t base my dwindling life span on questionable Internet rumors. They have to be pretty solid before I base decisions on them.
If that's the case, I agree with you: you should be fine with your 3090 for much more than just 1 or 2 years, since you don't play the most demanding/next-gen games and don't care about that.

I still won't buy a GPU that expensive just for gaming. I still think it's a stupid investment. Yes, a 3080 is a good purchase (so is a 6800 XT), at or close to MSRP, of course. Those are the actual high-end GPUs that are good value for their performance. I understand your circumstances, but I still don't agree with paying more than double the price of a 3080 for about 10% more performance from a 3090. Even if you gave me the money for a 3090 and I could buy it at MSRP, I would not do it.

That being said, you are one of the few exceptions, a minority among 3090 gamers. Most of them want 4K, max settings, Ultra and RT. And for those, what I said in my previous posts applies 100%.
 