GeForce RTX 3070 vs. Radeon RX 6800: 41 Game Benchmark

I absolutely love ray tracing and am very excited about the future of this tech now that we are starting to get decent implementations (well, four games so far, but we know there are more coming). So I can’t imagine buying a graphics card for this sort of money that doesn’t have this capability.

I find that in most RTX games (not BFV or SOTTR), turning RT on has a more noticeable visual impact than going from low to high settings, especially in Cyberpunk, where turning RT on changed the feel of the entire city and game.

I’m going to get accused of fanboying for this post, and it’s true. I’m a fanboy of cutting-edge graphics in games, and today, and especially in the future, that means ray tracing.

Also, DLSS is pretty awesome; ray tracing would be near unusable without it. It’s just so much better a solution than the janky lower-res-plus-sharpening-filter trick we used to fall back on when we couldn’t run a game at native resolution.

I've got a 3080 and I've tried the whole DLSS and RT thing; I play with it enabled in Cyberpunk, BUT at 1440p I still can't always maintain 60 fps with DLSS set to Quality, and I wouldn't use any other setting because you can absolutely tell the difference, so for me RT is a bust at the moment. I paid £875 for my GPU, and with the Quality setting in DLSS, which basically renders the game at something like 1100p, I don't always get 60 fps! So if anyone here is buying a GPU that is not a 3080, don't bother with RT, because even DLSS won't save you. This is why I think 6800 > 3070.
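For context on what those DLSS presets actually render at, here is a minimal sketch using the commonly cited per-axis scale factors (roughly 0.67 for Quality, 0.58 for Balanced, 0.50 for Performance). The exact factors can vary by title and DLSS version, and the `PRESETS` table and `internal_resolution` helper below are purely illustrative.

```python
# Rough sketch of DLSS internal render resolution per preset, using the
# commonly cited per-axis scale factors. Exact factors can vary by title
# and DLSS version.
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_resolution(out_w, out_h, preset):
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

for name in PRESETS:
    w, h = internal_resolution(2560, 1440, name)
    print(f"1440p output, DLSS {name}: internal render ~{w}x{h}")
# Under these factors, Quality at 1440p works out to roughly 1707x960,
# i.e. nearer 960p than 1100p.
```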
 
So I get 40-60 fps at 1440p with DLSS Balanced, RT on Ultra, RT reflections on and RT shadows off, and that's just on a 2080. However, I did achieve that by following Digital Foundry's settings guide; turning off SSR and dropping AO to low gained me about 15 fps for a start. Check it out, because you should definitely be able to do better. A friend of mine has a 3080 and he gets 80 fps on his 1440p ultrawide.

However, I would also say don't blame the card on this one. I have a 2080 at 1440p and Cyberpunk is the only game where I can't max every slider and get 60+ fps. Maybe it's badly optimised, or maybe it's just such a complex game that it needs that much performance.

Also note which CPU you have, as the game seems to be very CPU intensive. I have a 4C/8T part (until my 5800X arrives next week 😁) and it doesn't do so well in Cyberpunk, often limiting the game to well below 60. Another friend of mine's 8700K is definitely better but also struggled, so this is a monster of a game on your CPU.
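As a rough summary of the settings mix described in that post, something like the following hypothetical dictionary captures it. This is not an actual Cyberpunk 2077 config file, and the in-game option names differ; it is just a readable restatement of the recommendations above.

```python
# Hypothetical summary of the settings mix described above; NOT an actual
# Cyberpunk 2077 config file (in-game option names differ).
cp2077_1440p_on_rtx2080 = {
    "dlss": "Balanced",
    "ray_traced_lighting": "Ultra",
    "ray_traced_reflections": "On",
    "ray_traced_shadows": "Off",
    "screen_space_reflections": "Off",  # per the post, SSR off plus...
    "ambient_occlusion": "Low",         # ...AO on low gained roughly 15 fps
}

for option, value in cp2077_1440p_on_rtx2080.items():
    print(f"{option}: {value}")
```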
 

The way I see it, I didn't pay almost £900 for a GPU just to start dropping details, so I have everything maxed out in the game. As for the CPU, at first I played with a Ryzen 7 2700X overclocked to 4250 MHz all-core and with the SMT mod, and now that I have the 5800X my fps is mostly 60, but it can still drop to the mid-40s :facepalm:

PS. Not sure if it's just my 5800X, but its temperatures are quite high; even though I have a 360 mm AIO and six intake fans in my case, I get around 75C while playing games at stock settings.
 
So I’m with you, the game performs like a joke. But I still played 60+ hours and the experience was very good; in fact, for me it’s one of the best open-world games I’ve played in years. There are some games worth putting up with bad performance for, and this is definitely one of them. Luckily it isn’t a competitive first-person shooter but a slower-paced story game.

I should point out as well that if you had purchased from a competitor that does not have ray tracing or DLSS, your experience would have been significantly worse.

But I wouldn’t hold it against Nvidia, it’s one game. I have a 2080, literally half the performance of your card, and in every other game I’ve tried I can max it out at 60 fps or more. (Well, except flight sim, if you can call that a game.)
 

I still like the game, and the poorer-than-expected performance is not a big deal; the card is great. My wife has a 5700 XT and she also plays at 1440p, and the game still looks quite good without RT in my opinion, so I wouldn't mind playing it like that. The only reason I got a GeForce card is because I sold my Radeon VII first in hopes of getting a 3080, as I did not believe AMD could deliver a 3080 class of performance. After the presentation I really wanted the Red Devil 6800 XT, but since those are harder to find than gold dust I just settled on the GeForce, which I do actually like; it's really quiet for such a power-hungry card and it looks great. It's the Palit GameRock OC :joy:
 
Sounds like someone who doesn't use DLSS. What does "artificially upscaled" even mean? Isn't all upscaling artificial? I've done side-by-side comparisons and it's impressive the quality they manage to give you for the performance you gain. Most commonly, the people that rag on DLSS are the very same people that will praise FidelityFX Super Resolution.

All the upscaling tech (DLSS or FidelityFX CAS) is still upscaling. I still see artifacts from the upscale. A 1440p or 1800p upscale is still not 4K. Visually, an upscale can never give you the fidelity of rendering at native resolution.

As for ray tracing, yeah, I'll pass on a tech that requires a static image to appreciate. You're never gonna notice those reflections when you're running and gunning.
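To put numbers on the "an upscale is still not 4K" point, here is a quick pixel-count comparison using standard 16:9 resolutions (1800p taken as 3200x1800); the arithmetic is straightforward, only the resolution choices are assumptions.

```python
# Pixel-count comparison behind the "an upscale is still not 4K" point,
# using standard 16:9 resolutions.
resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "4K":    (3840, 2160),
}

native_4k = 3840 * 2160  # ~8.3 million pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / native_4k:.0%} of native 4K)")
# 1440p carries about 44% and 1800p about 69% of 4K's pixels; whatever
# the upscaler adds on top is reconstructed rather than rendered.
```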
 
Never underestimate the power of imagination, brother. Or delusion. Not that there is anything wrong with it; it only bugs me when it is passed off as absolute fact. Even as far as world lighting goes, whether it is actually better or not is subjective. I have seen plenty of RT lighting effects that I didn't think looked better. The only real point of agreement is that it is different.
 
Transistor counts on the two cards: 17.4 billion vs 26.8 billion.

Meanwhile, the Xbox Series X has "15,300 million" transistors.

We can expect the consoles to deliver quite the image quality, since they can be optimised to the bone to use all those little switches inside, almost as many as the 3070 (a similar performance level).

I am optimistic about ray tracing on consoles, since it was implemented quite well in Spider-Man.

I am not so keen on DLSS, since I have not seen any deep dive into the supposed tech and how it connects your card's algorithms to those of a supercomputer. Sounds like nice marketing to me, meant to make you believe the sharpening, upscaling and contrast boost is done by some mega-powerful tech inside your driver, when it is probably just 0.05% (give or take) of your GPU's computational power really.
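As a quick sanity check on those transistor figures, the ratios work out as follows (counts as quoted above, in billions).

```python
# Quick ratio check on the transistor counts quoted above (in billions).
counts = {"RTX 3070": 17.4, "RX 6800": 26.8, "Xbox Series X SoC": 15.3}

baseline = counts["RTX 3070"]
for name, transistors in counts.items():
    print(f"{name}: {transistors:.1f}B ({transistors / baseline:.2f}x the 3070)")
# The Series X SoC lands at roughly 88% of the 3070's transistor budget,
# and that figure also has to cover its 8-core Zen 2 CPU on the same die.
```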
 
Honestly mate, if you can’t see the difference RT makes then you straight up need an optician. It makes a bigger difference to the visuals of a game than any other setting outside of resolution.
 
Maybe you need better reading comprehension. You won't notice ray tracing when you're running and gunning.
 
Fair enough. So if you were playing a game like Cyberpunk, you couldn’t tell whether RT is on for about 10% of the time you play it? Because that’s about all the “running and gunning” you do.

Furthermore, if you don’t notice RT when running and gunning, then there is simply no way you would notice things like texture quality, ambient occlusion, anti-aliasing, etc. either, as these have a far smaller effect on visual quality than RT does.

This is all very amusing. In 10 years’ time all games will be using real-time ray-traced lighting, as it saves devs huge amounts of time and creates significantly better and more realistic visuals. However, I can’t guarantee that you in particular would notice this.
 

Having turned RT on and off in BFV and CP2077, I can say it doesn't do much.

As for CP2077, plenty of the time I'm inside a dark corridor or an interior with no reflective surfaces. When I'm outside, unless I specifically look for the reflection in the glass or puddles, I'm not going to notice it.

For BFV, the RT is on the barrel of the gun, and unless I'm playing on a map with plenty of water I'm not going to notice it. Even on maps with water, the RT only shows near the edges.

Anti-aliasing and high-def textures are technologies that add ten times the image quality improvement that RT ever will.
 
Yup, you definitely need an optician. Claiming anti-aliasing adds more visual quality than RT confirms that your eyesight is non-existent. Also, claiming that BFV's implementation of RT is comparable to Cyberpunk's is a joke; BFV's RT was awful, and the reviewers said it was awful. But the reviewers (and anyone who has actually used RT) will tell you that it's a good implementation in Cyberpunk. Makes me wonder if you even have an RTX card.

Bury your head in the sand as much as you want, mate. RT has gone mainstream in AAA gaming, no matter how much you claim you can't tell the difference.
 

I guarantee you the average gamer would notice jagged edges more than reflections that require reflective surfaces. Edges are always there, reflective surfaces not so much.
 
I agree; however, RT reflections are just the icing on the RT cake. The best RT effect is lighting, allowing dark areas of 3D-generated worlds to be dark and light areas to be light, as they would be in real life. Currently, in most rasterised games you can see in the dark. You also have RT shadows, which can be diffused and are far more accurate than the cartoony rasterised shadows we use today.

But I know I’m wasting my time. You want to be stubborn about progress. I’m old enough to remember people (mostly in magazines and old-school web forums) slamming the OG PlayStation for not being powerful enough to run 3D games when it launched.

If you can’t see the difference with RT on, then you can continue to keep it turned off for now and the next few years. But at some point games will stop giving you that option. It’s called progress.
 

So did you buy into the 8K gaming claim from a certain company too? I mean, progress, right?
 
You aren’t suggesting that the state of ray tracing today is comparable to the state of 8K gaming today, are you?

Because that would be very amusing.
 
Sure, they both gimp your performance.
I have a card released in 2018 that lets me enable ray tracing in every game that has it so far and still get 60 fps at 1440p. That same card cannot even get 20 fps in any game at 8K.

This should highlight how big the gap is between the two levels of “gimping” you are referring to, and hopefully give you an idea of the discrepancy between the state of real-time ray tracing and the state of 8K performance.
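For a sense of scale, these are the pixel-count ratios involved (standard 16:9 resolutions assumed); shading cost only scales roughly with pixel count, so treat this as a rough comparison rather than a benchmark.

```python
# Pixel-count ratios behind the RT-vs-8K comparison above (16:9 resolutions).
px_1440p = 2560 * 1440   # ~3.7 MP
px_4k    = 3840 * 2160   # ~8.3 MP
px_8k    = 7680 * 4320   # ~33.2 MP

print(f"4K has {px_4k / px_1440p:.1f}x the pixels of 1440p")   # ~2.3x
print(f"8K has {px_8k / px_1440p:.1f}x the pixels of 1440p")   # ~9.0x
print(f"8K has {px_8k / px_4k:.1f}x the pixels of 4K")         # ~4.0x
# Shading work scales roughly with pixel count, so native 8K is a far
# larger jump than the cost of today's hybrid RT effects at 1440p.
```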
 

Sounds like you need a new card lol 😂
 
DLSS is even worse. It's just a checkbox to lower your resolution when your ego won't allow you to do it manually. Want better performance at 4K? Then just run the game at 1440p. That's DLSS logic: deception 101. DLSS is the biggest deception scam I've ever seen in PC tech. How they constantly get away with flat-out lying to customers is beyond me. 4K with DLSS is not 4K and should not be advertised or reviewed as such. It should be "1440p with DLSS to artificially resemble 4K," and any performance comparison to it should be done at the same resolution, not the BS artificially upscaled resolution.
lol Gamers.....

 
In general, all the same points made here could be made about the RTX 3080 vs the RX 6900 XT. The more expensive 6900 XT may have a rasterization performance lead, but when you factor in RT and DLSS, it's pretty clear the RTX 3080 is still the card to get. Then there's the RX 6800 XT, which realistically cannot be found anywhere close to its retail price of $649, so same thing: it's about on par with the 3080, but overall the 3080 is still the best value out there in high-end GPUs at the moment.

Where AMD is really bringing the pressure, though, is the console market. The XSX and PS5 are impressive and have closed the gap considerably between the high-end PC and console markets, at least in the short term. When the PS4 and XB1 released, there were already 'affordable by today's standards' GPUs available that were around 2.75X faster than those consoles. The RTX 3080 might be rated at 30 TFLOPs, but in reality, if we go by benchmarks, it's only about 20% faster than the 2080 Ti, putting its 'effective for gaming' TFLOPs right around 19-20: less than 2X as powerful as the XSX and right around 2X as powerful as the PS5.
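As a back-of-the-envelope check on that "effective TFLOPs" reasoning: the ~16 TFLOPs figure for a 2080 Ti at typical in-game boost clocks is an assumption rather than an official rating, while the Series X and PS5 numbers are their rated ~12.2 and ~10.3 TFLOPs.

```python
# Back-of-the-envelope for the "effective gaming TFLOPs" argument above.
# Assumed figures (commonly quoted, not official box ratings):
#   RTX 2080 Ti at typical in-game boost clocks ~ 16 TFLOPs FP32
#   Xbox Series X GPU                           ~ 12.2 TFLOPs
#   PS5 GPU                                     ~ 10.3 TFLOPs
tflops_2080ti = 16.0
uplift_3080 = 1.20  # ~20% faster than the 2080 Ti in benchmarks, per the post

effective_3080 = tflops_2080ti * uplift_3080
print(f"'Effective' RTX 3080: ~{effective_3080:.1f} TFLOPs")  # ~19.2
print(f"vs Series X: {effective_3080 / 12.2:.2f}x")           # ~1.57x
print(f"vs PS5:      {effective_3080 / 10.3:.2f}x")           # ~1.86x
```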
 