AMD rumored to be working on a Radeon RX 6900 XTX to challenge the RTX 3090

Lmao, how to say you're an AMD fanboy without saying you're an AMD fanboy: "I have issues with nvidia anti-consumer practice".

Yeah, I don't care about any of that. I only care about what I can get for my money. And right now AMD's product set is laughable.
My 6900XT can wipe the floor with that 3070 at 4K. Incredible performer. So, how is it laughable?
 
My 6900XT can wipe the floor with that 3070 at 4K. Incredible performer. So, how is it laughable?
Well, you know the rabid fanbois; in their world, a 1650 is way superior to a 6900XT in everything, at all resolutions, etc., because everything from any company that is not nvidia is trash.
 
It can't be cutting-edge without proper ray tracing; it's always gonna be brought up to make it look worse next to the RTX 3090. Even though I think it's still early to care about ray tracing, the fact is many people bought into it and won't spend over a grand on a GPU with missing features.

I wish them success though; it's more interesting when there's competition.

What do you mean?
AMD RDNA 2 DOES have 'proper ray tracing'... it's just that its performance is not as good as NV's (on par with or slightly better than the RTX 2xxx series).
So people aren't going to miss out on anything, really... even though I personally think RT is not that big of a deal from a graphics point of view (the differences right now are too small to notice 99% of the time, and the feature incurs a massive performance loss).
 
Man, AMD is sure grasping at straws with this. I get the idea that these will be snapped up by miners because the lack of gaming features will make this card unpalatable to gamers.
To a point, I don't blame them. I mean, the 6900 XT was always a halo product for the rabid ones to fight over which company does what better, etc.

Yes, it's a bit hypocritical of me, since I have one, but it was either that one at MSRP or something less performant for the same price.

But lets not kid ourselves, they are here mostly to show off.

I can see it already: if this is as good as or better than the 3090 Super in rasterization, you can count on Jensen releasing either a Titan or a new Super Ultra model.

:)
 
What do you mean?
AMD RDNA 2 DOES have 'proper ray tracing'... it's just that its performance is not as good as NV's (on par with or slightly better than the RTX 2xxx series).
So people aren't going to miss out on anything, really... even though I personally think RT is not that big of a deal from a graphics point of view (the differences right now are too small to notice 99% of the time, and the feature incurs a massive performance loss).
Shhh! Don't tell the nvidia fools with more money than sense that there is also green grass on the other side. NO! Let them sink into their ignorance... Jensen needs that RTX tax.

Since I got my RX 6700 XT I've been playing with RT:
- Resident Evil Village: over 60 fps, maxed-out RT, no problem.
- Metro Exodus EE: everything maxed, RT on Medium; I can barely tell it looks worse than higher settings. Again above 60 fps.

And even though in some games RT is a gimmick right now, I can say in these 2 games it looks really good. The type of good that makes it look "just right". Like if you disable RT, you realize the image/lighting/reflections were done wrong until now, but we got so used to that that we didn't care or didn't know it was wrong, LOL.

The only game I can't play at Ultra + RT is CP 77. So it's just on Ultra, no RT. But even though it's an nvidia-sponsored title, that game has horrible overall optimization even on nvidia GPUs. A 3070 with DLSS can't hold a smooth 60fps at Ultra + RT at 1080p in that game. It constantly drops below 60fps into the low 50s, which you can tell is bad even with VRR/FreeSync/G-Sync. So it's not really better at all.

The trick is very simple: the RX 6700 XT is a 1440p raster GPU and a 1080p RT GPU. That's why I bought it for 1080p, and this way I don't downgrade resolution or settings, and I use all its features, which include RT and FSR.
So I can easily say now I don't give a flying **** about RTX or DLSS! :cool:
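To put rough numbers on the "1440p raster GPU / 1080p RT GPU" point above, here's a quick pixel-count sketch (the ~2x cost of enabling RT is my own ballpark assumption for illustration, not a measured figure):

```python
# Pixel counts for standard 16:9 render targets.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_1440p = 2560 * 1440   # 3,686,400 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels

print(res_1440p / res_1080p)  # ~1.78x the pixels of 1080p
print(res_4k / res_1440p)     # 2.25x the pixels of 1440p

# If enabling RT roughly halves the frame rate (ballpark assumption),
# dropping from 1440p to 1080p sheds ~44% of the pixels and claws most
# of that performance back - hence "1440p raster, 1080p RT".
```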
 
What do you mean?
AMD RDNA 2 DOES have 'proper ray tracing'... it's just that its performance is not as good as NV's (on par with or slightly better than the RTX 2xxx series).
So people aren't going to miss out on anything, really... even though I personally think RT is not that big of a deal from a graphics point of view (the differences right now are too small to notice 99% of the time, and the feature incurs a massive performance loss).
AMD's ray tracing exists, but it's not competitive with Nvidia's. I agree that it's not worth it, as I mentioned in the post you quoted, but it's not me who cares; it's the people buying this stuff. I would never pay that much for a GPU regardless.
 
Shhh! Don't tell the nvidia fools with more money than sense that there is also green grass on the other side. NO! Let them sink into their ignorance... Jensen needs that RTX tax.

Since I got my RX 6700 XT I've been playing with RT:
- Resident Evil Village: over 60 fps, maxed-out RT, no problem.
- Metro Exodus EE: everything maxed, RT on Medium; I can barely tell it looks worse than higher settings. Again above 60 fps.

And even though in some games RT is a gimmick right now, I can say in these 2 games it looks really good. The type of good that makes it look "just right". Like if you disable RT, you realize the image/lighting/reflections were done wrong until now, but we got so used to that that we didn't care or didn't know it was wrong, LOL.
I didn't know or care if it was wrong because I literally pay zero attention to things like shadows and reflections. I'm always looking for the next thing to kill or the next thing trying to kill me! :laughing:
The only game I can't play at Ultra + RT is CP 77. So it's just on Ultra, no RT. But even though it's an nvidia-sponsored title, that game has horrible overall optimization even on nvidia GPUs. A 3070 with DLSS can't hold a smooth 60fps at Ultra + RT at 1080p in that game. It constantly drops below 60fps into the low 50s, which you can tell is bad even with VRR/FreeSync/G-Sync. So it's not really better at all.

The trick is very simple: the RX 6700 XT is a 1440p raster GPU and a 1080p RT GPU. That's why I bought it for 1080p, and this way I don't downgrade resolution or settings, and I use all its features, which include RT and FSR.
So I can easily say now I don't give a flying **** about RTX or DLSS! :cool:
My 4K TV seems to have a hardware version of DLSS because try as I might, I cannot tell the difference between 720p, 1080p, 1440p or 2160p. I guess that when a TV is designed to upscale an image that was originally an analogue SD image to look good on a 55" panel, a digital 720p image is a piece of cake for it. I also don't give a flying FiretrUCK about RTX or DLSS.
 
If you play at 4K like me, the perf hit is terrible. So you have to use DLSS to compensate, but this is just a trick. At this time no card is able to keep a consistent framerate at 4K with RT enabled without res tricks. RT is for the future generations of graphics cards, or for people gaming at low resolutions. I have played "only" 4K for the last 3 years, and my plan is to move to 8K in a few years.
Everyone is all about RT now, left, right and center (pro or against), and in X years, when we achieve that jump in performance to be able to do 4K 120fps with RT, they'll hit us with the next NewsFlashHammer: Real-Time Path Tracing!

Ray tracing is kid's play compared to path tracing, both in IQ realism and in how hard it is to run in real time.

So the cycle begins again, chasing that next-gen GPU architecture that can do 4K 120fps with real-time path tracing. So that's what, 2030? 2035? How long are we gonna keep chasing this forever-dangling carrot?
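To make that gap concrete, here's a crude rays-per-frame sketch of hybrid RT (as shipped in current games) versus full path tracing; every number below is an illustrative assumption, not a benchmark:

```python
# Back-of-the-envelope ray counts at 4K. All figures are assumptions.
pixels_4k = 3840 * 2160

# Hybrid RT as in today's games: roughly a ray per pixel for a couple
# of effects (e.g. reflections + shadows), usually a single bounce.
hybrid_rays = pixels_4k * 2

# Real-time path tracing: every pixel path-traced, with multiple
# samples per pixel and multiple bounces per sample.
pt_rays = pixels_4k * 4 * 3  # 4 samples/pixel, ~3 bounces each

print(f"hybrid RT:    {hybrid_rays / 1e6:.0f}M rays/frame")
print(f"path tracing: {pt_rays / 1e6:.0f}M rays/frame "
      f"(~{pt_rays / hybrid_rays:.0f}x more)")

# And "4K 120fps" multiplies whichever number you pick by 120 every
# second - which is why that carrot keeps moving.
```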
 
My 6900XT can wipe the floor with that 3070 at 4K. Incredible performer. So, how is it laughable?
It's laughable because AMD wanted twice as much money for that 6900 XT as Nvidia wanted for the 3070. And it's also poor at ray tracing and doesn't have DLSS. Any game that does have those things (and there are quite a lot now) will run better on the Nvidia part that costs half as much.

Really the 3070 wiped the floor with most of the market if you account for price and features.
 
If you play at 4K like me, the perf hit is terrible. So you have to use DLSS to compensate, but this is just a trick. At this time no card is able to keep a consistent framerate at 4K with RT enabled without res tricks. RT is for the future generations of graphics cards, or for people gaming at low resolutions. I have played "only" 4K for the last 3 years, and my plan is to move to 8K in a few years.
Yeah I’d probably get a 1440p monitor to use ray tracing if I were able to buy an RTX part today. And the performance is clearly good enough to ray trace at 1440p. Even the 20 series was fine.

I’ve seen the difference between RT on and off, a few of my friends have RTX cards. And I find it bigger than the difference between low and high settings. It depends on the game of course but in something like Control I was shocked at how much better it looked turned on.
 
Yeah I’d probably get a 1440p monitor to use ray tracing if I were able to buy an RTX part today. And the performance is clearly good enough to ray trace at 1440p. Even the 20 series was fine.

I’ve seen the difference between RT on and off, a few of my friends have RTX cards. And I find it bigger than the difference between low and high settings. It depends on the game of course but in something like Control I was shocked at how much better it looked turned on.

In Control? ...I think Cyberpunk or Metro Exodus is a better showcase for RT.
 
In Control? ...I think Cyberpunk or Metro Exodus is a better showcase for RT.
I have played all three with RT on, and yeah, I thought they all showcased RT very well, but for me turning it on made the biggest difference in Control compared to without it. It feels so much more cartoony with RT off. I didn't get to see what it looks like without it on Metro, as I tried the Enhanced Edition; it's been nearly two years since I played the original game without ray tracing. Also, the Rift Breaker demo thing looks great with RT on.

For me it's all about Minecraft. Me and my friends are big into it; we have several custom servers, and that game is absolutely transformed with RT. I'm so bloody jealous! They keep pumping these screenshots into the Discord chat, and it makes me feel like I'm playing an Amiga game by comparison.

I also noticed a visual improvement in Death Stranding with DLSS. It just looks cleaner, and it's running at a higher frame rate. Can't understand why anyone would turn DLSS off in that game, but they test with it off for some reason.
 
It's laughable because AMD wanted twice as much money for that 6900 XT as Nvidia wanted for the 3070. And it's also poor at ray tracing and doesn't have DLSS. Any game that does have those things (and there are quite a lot now) will run better on the Nvidia part that costs half as much.

Really the 3070 wiped the floor with most of the market if you account for price and features.

3070 blah blah blah... That card was gimped before it even came out. Enjoy running out of memory in a year if you ever get one; DLSS won't save you, and nVidia will be laughing at you when you need to upgrade. You should probably stick to 1080p if you want to keep it for longer...
 
I have played all three with RT on, and yeah, I thought they all showcased RT very well, but for me turning it on made the biggest difference in Control compared to without it. I didn't get to see what it looks like without it on Metro, as I tried the Enhanced Edition; it's been nearly two years since I played the original game without ray tracing. Also, the Rift Breaker demo thing looks great with RT on.

For me it's all about Minecraft. Me and my friends are big into it; we have several custom servers, and that game is absolutely transformed with RT. I'm so bloody jealous! They keep pumping these screenshots into the Discord chat, and it makes me feel like I'm playing an Amiga game by comparison.

I only played the Enhanced Edition of Metro, but I thought the game looked really good, although the fps was kinda meh for a £900 GPU. Same goes for Cyberpunk: without DLSS these games would be unplayable, since the fps was like 40 at 1440p.
 
3070 blah blah blah... That card was gimped before it even came out. Enjoy running out of memory in a year if you ever get one; DLSS won't save you, and nVidia will be laughing at you when you need to upgrade. You should probably stick to 1080p if you want to keep it for longer...
So do you genuinely believe an 8GB card will be limited within 12 months? Because that’s not going to happen lol.

I won’t buy a 3070 because they are well overpriced. But they would certainly do much better for much longer than any Radeon card. It has DLSS.
 
I only played the Enhanced Edition of Metro, but I thought the game looked really good, although the fps was kinda meh for a £900 GPU. Same goes for Cyberpunk: without DLSS these games would be unplayable, since the fps was like 40 at 1440p.
When I tried Cyberpunk on a 2080 Super, turning on DLSS cancelled out the performance hit of ray tracing. It was running at 60 just fine; without DLSS and RT it was going about 80. It wasn't my rig, so I can't comment on the exact settings, but it looked great, and I'd take 60 with RT over 80 without. This was at 1440p.
 
Simply for nostalgia, I love how they're using the XT and XTX branding. It brings me back to my first real high-end graphics card, the X1950 XTX.
I had a bunch of those in an F@H farm. Awesome cards for the money, and they ruled the folding game until Nvidia ruined the party.
 
The fanatics are at it again I see... Let's get some things out of the way.

Saying that AMD doesn't have Ray Tracing is like saying that Toyota doesn't have wheels, simply because a Bugatti is faster.

AMD cards might not have DLSS, but in all honesty, the majority of cards, whether from nVidia or AMD, don't need DLSS, and the majority of users don't need it either, because of all the artifacts it introduces in so many games. DLSS is mainly useful for improving image quality when you already have to lower the resolution to get good performance. And there's FSR as well. So yeah.
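For context, here's a small sketch of what an upscaler actually renders internally at 4K; the per-axis scale factors below are FSR 1.0's published quality modes, and I'm assuming DLSS's modes are roughly comparable:

```python
# Internal render resolution for a 4K output target, per quality mode.
# Scale factors are FSR 1.0's per-axis ratios; treat as approximate.
modes = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

target_w, target_h = 3840, 2160  # 4K output
for name, scale in modes.items():
    w, h = round(target_w / scale), round(target_h / scale)
    print(f"{name:>13}: renders {w}x{h}, upscaled to {target_w}x{target_h}")

# Quality mode at 4K renders 2560x1440 internally - you are effectively
# playing at 1440p and letting the upscaler reconstruct the rest, which
# is why it matters most when you'd have to drop resolution anyway.
```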

RT will mainly become relevant at the point where it starts influencing game mechanics. Before that it is simply shinier graphics, and graphics don't make a game. If they did, the Switch wouldn't have sold nearly as much as it did.
 
Let's get one thing straight: any, I repeat, any GPU today above $1000 (MSRP or scalped) is stupid. Period.
I don't care how much money you have... you can have a Mount Everest-sized pile of money and it's still stupid. You get +10% more performance for that special kind of stupid.

The second reason all GPUs above $1000 today are stupid, and are gonna look even more stupid, is that around this time next year (Q2/Q3 2022) we will have at least one if not both new generations of GPUs from AMD and nvidia, and those GPUs will make the current gen look like a joke.

I guess I’m stupid then ¯\_(ツ)_/¯

But at least I got my 3090 at msrp at launch.

What may be coming out next year is of no use to me as I live in the present.
 
Everyone is all about RT now, left, right and center (pro or against), and in X years, when we achieve that jump in performance to be able to do 4K 120fps with RT, they'll hit us with the next NewsFlashHammer: Real-Time Path Tracing!

Ray tracing is kid's play compared to path tracing, both in IQ realism and in how hard it is to run in real time.

So the cycle begins again, chasing that next-gen GPU architecture that can do 4K 120fps with real-time path tracing. So that's what, 2030? 2035? How long are we gonna keep chasing this forever-dangling carrot?
Nvidia was wrong to release a technology (RT) with the RTX 2000 series when the market was not ready for it. They charged you more for a technology their cards couldn't cope with back in the day. Now, with the current generation, their cards can cope with it (without res tricks) only at 1080p and 1440p, I guess, but no more than that. RT is really for the next generation of cards, at minimum. And yes, it is a never-ending story, you know. But companies make mistakes too, and that just makes things even worse.
 