AMD Radeon RX 6700 XT Review: Better than RTX 3070?

Well, aside from Ampere basically dominating performance in those. But you want neither, which leaves AMD-sponsored titles, then?

How about Dirt 5? DF tested a 6800 XT against an RTX 3080 in ray tracing games, calculating the additional cost to render time in the form of a milliseconds penalty that the RT effects add. In Dirt 5, although the 6800 XT has higher overall performance, the RT effects still incur a bigger penalty relative to the 3080.
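For reference, the milliseconds penalty DF computes is just the difference in frame time with and without RT, which is why a card can be faster overall and still pay a bigger RT cost. A quick sketch (the FPS numbers here are made up for illustration, not DF's actual results):

```python
def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

def rt_penalty_ms(fps_rt_off: float, fps_rt_on: float) -> float:
    """Extra milliseconds per frame that the RT effects cost."""
    return frame_time_ms(fps_rt_on) - frame_time_ms(fps_rt_off)

# Hypothetical numbers: card A is faster overall with RT on (90 vs 88 fps),
# yet its RT effects cost more render time per frame.
card_a = rt_penalty_ms(fps_rt_off=120.0, fps_rt_on=90.0)  # ~2.78 ms penalty
card_b = rt_penalty_ms(fps_rt_off=110.0, fps_rt_on=88.0)  # ~2.27 ms penalty
```

So "higher FPS with RT on" and "smaller RT penalty" are two different measurements, which is the whole point of the DF comparison.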


Or sponsored by neither AMD nor Nvidia. Not much there, agreed.

But despite the bigger RT penalty, AMD still has higher overall performance. Who cares about the RT penalty if performance is still better :confused:

Dirt 5 launched 6.11.2020. The GeForce RTX 3080 launched 17.9.2020. The 6800 XT launched 18.11.2020. It's simply not possible that Dirt 5 was made for RT from the beginning. All current "RT" games are games without RT where RT support was added at a very late stage of development. We will see real AAA RT titles around 2023 at the earliest. By that time, current cards will have been replaced by newer ones...

"Which one has better RT" is a question we cannot answer. And by the time we can, we will have newer cards.

Seriously, absolutely no one denies this, not even AMD.
Seriously, yes. It's not about who has "more RT processing power" or "better theoretical RT performance". It's about who gets better performance in real-life applications using a realistic amount of RT. And that still remains to be seen, for a long time yet, like I said above.
 
Yes, I think it's about time to highlight the card in question in the benchmarks. This has gone on too long unchecked on TechSpot.
 
Yes, I think it's about time to highlight the card in question in the benchmarks. This has gone on too long unchecked on TechSpot.
You pick an issue and stick with it. I like that. ;)
I'm like that with Gaming/Desktop Replacement Laptop reviews, but
Tim Schiesser has really been taking care of that.
 
But despite the bigger RT penalty, AMD still has higher overall performance. Who cares about the RT penalty if performance is still better :confused:
This is precisely the point though: a bigger RT penalty in their own sponsored title speaks volumes. I do agree we need to see a lot more fleshed out over time, but the evidence right now is already reasonably strong.
 
Seriously, yes. It's not about who has "more RT processing power" or "better theoretical RT performance". It's about who gets better performance in real-life applications using a realistic amount of RT. And that still remains to be seen, for a long time yet, like I said above.
It's not really possible to take you seriously anymore. There are proven facts, and those are always better than opinions. The RT gap between the 30 series and the RDNA2 series has been well documented and tested; AMD hasn't even come close yet. So, if you are going to proclaim untenable positions because you like one manufacturer more than the other, there is no point in discussion.
 
Funny how the RT hype continues, when in reality there is so little to show for it.

Personally, only two games really provided a real upgrade when using RT: Quake and Minecraft.

The others fall into the same hole ("oh, look at this puddle, RT is marvelous there!").

But fine, you have your RT. The problem is that except for maybe the 3090, and in some cases a 3080, all cards are simply taking a performance hit that not even DLSS helps with.

Which tells us that the hardware is still not there.

And of course, regardless of AMD's lower-performing RT, they don't have their DLSS equivalent out yet, so we don't know what it can really do.

In the end, the drones need to understand that this is still too new and too taxing on what is out there right now.

So why keep wasting your time and energy just to prop up your favorite company over the other?

As I said in another post, this sounds like the slaves back in the day, when one would tell the other "my master has a bigger plantation than yours!"

Instead of thinking about how the heck to get their freedom....
 
I think this is a video that every 'fanatic' should watch.

Always consider the possibility that you're wrong and "fighting" for something that is actually detrimental to yourself in the long term.
 
It's not really possible to take you seriously anymore. There are proven facts, and those are always better than opinions. The RT gap between the 30 series and the RDNA2 series has been well documented and tested; AMD hasn't even come close yet. So, if you are going to proclaim untenable positions because you like one manufacturer more than the other, there is no point in discussion.
Some Nvidia-sponsored titles are not "proven facts" or "well documented and tested". Nothing has so far been proven on code fairly optimized for both AMD and Nvidia.

Feel free to prove me wrong (I know you cannot do it).
 
Funny how the RT hype continues, when in reality there is so little to show for it.

Personally, only two games really provided a real upgrade when using RT: Quake and Minecraft.

The others fall into the same hole ("oh, look at this puddle, RT is marvelous there!").

But fine, you have your RT. The problem is that except for maybe the 3090, and in some cases a 3080, all cards are simply taking a performance hit that not even DLSS helps with.

Which tells us that the hardware is still not there.

And of course, regardless of AMD's lower-performing RT, they don't have their DLSS equivalent out yet, so we don't know what it can really do.

In the end, the drones need to understand that this is still too new and too taxing on what is out there right now.

So why keep wasting your time and energy just to prop up your favorite company over the other?

As I said in another post, this sounds like the slaves back in the day, when one would tell the other "my master has a bigger plantation than yours!"

Instead of thinking about how the heck to get liberated....
This is not really true; there are games that run well with RT enabled. Often, though, it's a choice between higher resolution or RT, and DLSS makes it possible to run the game with a high resolution and RT. Control is a great example. The game runs well at 1440p with RT on with both the 3080 and the 3090, but you can use DLSS to get there at 4K (1440p equivalent). The game also runs quite well on a 3070 if you turn RT down to a lower setting, or play it at 1440p with DLSS (1080p equivalent). This is not the only game, but it does show that it's possible. It's also one of the prettiest RT games, and it's a demanding game even if you don't have RT. I played the game on an RX 5700 XT, and there were times where the frame rate would drop below 60fps @ 1440p at the highest settings, even on that card.

The Medium is another game where RT can be run while still achieving pretty high framerates with DLSS on; Metro Exodus is another. No, RT should not be the primary reason you pick up a GPU. DLSS is supported in even more games than RT, and there it can be a big boost to frame rates, but even that should not be the only reason you pick up a certain GPU.

If this was all Nvidia had going for it, and AMD was clearly blowing the RTX 3080 away with the RX 6800 XT, then I would say, sure, AMD is the way to go. But at 4K the $699 3080 often outperforms the $649 6800 XT. Without DLSS/RT, the cards are essentially equal in terms of the experience the user will get, but when games do have RT or DLSS capabilities the 3080 is the superior card. If you plan on only running 1440p, never RT, and don't care about DLSS, then sure, the 6800 XT is a great card and is most often superior at sub-4K resolutions. It's not one size fits all; like most things you can buy, there are pros and cons to each. AMD obviously thought RT was important enough to include it in RDNA2, and they are also working on a competitive feature to DLSS, so even AMD isn't buying your defense of them.
 
This is precisely the point though, a bigger RT penalty in their own sponsored title, speaks volumes. I do agree we need to see a lot more flesh out over time, but the evidence right now is already reasonably strong.
Evidence right now is very weak. AMD can use Infinity Cache to store many of the assets ray tracing needs. While Nvidia cards have higher memory bandwidth with GDDR6X, Infinity Cache has simply superior latency and effective bandwidth compared to it.

Now, how many GPU architectures before RDNA2 had this kind of big cache? None. How many games are optimized from the start for this kind of big cache right now? None.

Do the math.
 
This is not really true; there are games that run well with RT enabled. Often, though, it's a choice between higher resolution or RT, and DLSS makes it possible to run the game with a high resolution and RT. Control is a great example. The game runs well at 1440p with RT on with both the 3080 and the 3090, but you can use DLSS to get there at 4K (1440p equivalent). The game also runs quite well on a 3070 if you turn RT down to a lower setting, or play it at 1440p with DLSS (1080p equivalent). This is not the only game, but it does show that it's possible. It's also one of the prettiest RT games, and it's a demanding game even if you don't have RT. I played the game on an RX 5700 XT, and there were times where the frame rate would drop below 60fps @ 1440p at the highest settings, even on that card. The Medium is another game where RT can be run while still achieving pretty high framerates with DLSS on; Metro Exodus is another. No, RT should not be the primary reason you pick up a GPU. DLSS is supported in even more games than RT, and there it can be a big boost to frame rates, but even that should not be the only reason you pick up a certain GPU. If this was all Nvidia had going for it, and AMD was clearly blowing the RTX 3080 away with the RX 6800 XT, then I would say, sure, AMD is the way to go. But at 4K the $699 3080 often outperforms the $649 6800 XT. Without DLSS/RT, the cards are essentially equal in terms of the experience the user will get, but when games do have RT or DLSS capabilities the 3080 is the superior card. If you plan on only running 1440p, never RT, and don't care about DLSS, then sure, the 6800 XT is a great card and is most often superior at sub-4K resolutions. It's not one size fits all; like most things you can buy, there are pros and cons to each. AMD obviously thought RT was important enough to include it in RDNA2, and they are also working on a competitive feature to DLSS,
How many games did you list in that long-@ss paragraph?

so even AMD isn't buying your defense of them.
Not defending anyone, so in the end you missed the whole point of the post because you need to defend Nvidia.
 
Evidence right now is very weak. AMD can use Infinity Cache to store many of the assets ray tracing needs. While Nvidia cards have higher memory bandwidth with GDDR6X, Infinity Cache has simply superior latency and effective bandwidth compared to it.

Now, how many GPU architectures before RDNA2 had this kind of big cache? None. How many games are optimized from the start for this kind of big cache right now? None.

Do the math.
You forgot PCIe Resizable BAR, or whatever that is called.
 
Smart Access Memory (AMD) or Resizable BAR (the generic name).

That's another thing nobody in gaming has much experience with, since it was mostly used on professional/server parts.

Yeah, not much "evidence" available...
thanks for the info.

In a couple of places that tested this, it gave some games around a 10% improvement in performance, pretty much for free.

So assuming they can do something close to DLSS, they might have a nice surprise.
 
One weakness of DLSS is that it can't be used with VRS... If rumors are correct, AMD's solution will rely on VRS... It's going to be interesting to see how that plays out.
 
How many games did you list in that long-@ss paragraph?


Not defending anyone, so in the end you missed the whole point of the post because you need to defend Nvidia.
I like AMD; I thought my RX 5700 XT was a top-notch card for the price. But I'm not down on Nvidia for offering new tech like DLSS and RT in videogames. You spend so much time trashing DLSS and RT. If they are really so unimportant, why do you spend so much time and effort on them?
 
One weakness of DLSS is that it can't be used with VRS... If rumors are correct, AMD's solution will rely on VRS... It's going to be interesting to see how that plays out.
DLSS does not need VRS. DLSS renders the entire frame at a lower resolution; VRS renders only parts of the frame at a lower shading rate. It doesn't make sense for DLSS to use VRS as well.
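To make the difference concrete, here is a rough pixel-counting sketch. The resolutions, scale factor, and "half the screen shaded at 2x2" fraction are made-up illustrative assumptions, not any vendor's actual numbers:

```python
def upscaled_shaded_pixels(out_w: int, out_h: int, scale: float) -> int:
    """DLSS-style upscaling: the whole frame is rendered at a lower
    internal resolution, then reconstructed to the output resolution."""
    return round(out_w * scale) * round(out_h * scale)

def vrs_shaded_pixels(out_w: int, out_h: int, coarse_fraction: float) -> float:
    """VRS-style shading: the full resolution is rendered, but some
    screen regions are shaded at a coarser rate (one shade per 2x2 block)."""
    full = out_w * out_h
    coarse = full * coarse_fraction / 4      # 2x2 blocks share one shade
    fine = full * (1 - coarse_fraction)      # rest is shaded per pixel
    return fine + coarse

# 4K output: a 2/3-per-axis internal scale (1440p internal) vs.
# coarse-shading half the screen at 2x2.
dlss_px = upscaled_shaded_pixels(3840, 2160, 2 / 3)          # 2560x1440
vrs_px = vrs_shaded_pixels(3840, 2160, coarse_fraction=0.5)
```

The two techniques cut shading work in structurally different places (whole frame vs. selected regions), which is why stacking DLSS on top of VRS is redundant rather than complementary.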
 
But I'm not down on Nvidia for offering new tech like DLSS and RT in videogames.
I'm sorry, but for someone who seems to be smart, you really failed at reading comprehension.
You spend so much time trashing DLSS and RT. If they are really so unimportant, why do you spend so much time and effort on them?
At this point, I do wonder why this bothers you so much that you are still responding and moving the goalposts on something that honestly has already been discussed to death.

But I will repeat it one last time: RT and DLSS are not that important AT THE MOMENT, since the game selection is super small. And one more thing: it's simply another Nvidia technique to keep the prisoners in their cells.

And since you have to have the last word, and since it's clear that you are missing your days at the high school debate club, by all means, keep replying.
 
I like AMD; I thought my RX 5700 XT was a top-notch card for the price. But I'm not down on Nvidia for offering new tech like DLSS and RT in videogames. You spend so much time trashing DLSS and RT. If they are really so unimportant, why do you spend so much time and effort on them?
There is a double standard when it comes to nVidia tech and AMD tech. nVidia has mind share, where pretty much anything they put out is seen as gospel. Even the first DLSS, which was seen as not that great by reviewers, was being propagated by many users. Now that the reviewers agree that DLSS is good tech, they have more fuel to pretty much shove it in everyone's face all the time. Not only is it annoying, but it stifles any objective discussion about the tech, because one is pretty much only allowed to be positive about it.

If this was AMD's tech, everyone would be downplaying it. Radeon Boost has been around for quite a while, but barely anyone talks about it. Same goes for Radeon Chill. And Radeon Anti-Lag has been around for a while, and barely anyone talks about it, but now that nVidia comes out with their version, suddenly everyone is interested.

nVidia's tech generally ends up either superseded by an open alternative, or simply dead. HairWorks is pretty much dead, for example. Which game uses it...? Meanwhile, G-Sync has been superseded by both FreeSync and HDMI 2.1 VRR. Yet many people are willing to jump on the bandwagon just because nVidia put it out, even if in practice its use is limited. And I will never understand that.
 
I'm sorry, but for someone who seems to be smart, you really failed at reading comprehension.

At this point, I do wonder why this bothers you so much that you are still responding and moving the goalposts on something that honestly has already been discussed to death.

But I will repeat it one last time: RT and DLSS are not that important AT THE MOMENT, since the game selection is super small. And one more thing: it's simply another Nvidia technique to keep the prisoners in their cells.

And since you have to have the last word, and since it's clear that you are missing your days at the high school debate club, by all means, keep replying.
Honestly, because I like to see how far you'll go. And you just keep on going and going.
 
thanks for the info.

In a couple of places that tested this, it gave some games around a 10% improvement in performance, pretty much for free.

So assuming they can do something close to DLSS, they might have a nice surprise.
Yes, but my point was that Resizable BAR is new technology and no current AAA game was made with it in mind from the beginning.
 
Yes, but my point was that Resizable BAR is new technology and no current AAA game was made with it in mind from the beginning.
Oh, I know. But remember, so far it has shown improvements in games without the games needing any tweaks.

So yes, assuming that games can be tweaked to take advantage of this, plus their DLSS alternative, it could end up with some serious gains in performance.
 
Oh, I know. But remember, so far it has shown improvements in games without the games needing any tweaks.

So yes, assuming that games can be tweaked to take advantage of this, plus their DLSS alternative, it could end up with some serious gains in performance.
Exactly, so with optimizations the gains may be even larger.
 