GeForce RTX 3070 vs. Radeon RX 6700 XT: 50 Game Benchmark, 2022 Update

I've always thought the 3070 or 3070 Ti is a good choice for 1080p with RTX, with no worries about upgrading for 3 to 4 years.
Almost exactly spot on. I have a 3070 FE. 85% of games run at 1440p without issue, but for a faultless experience, stick to 1080p and use DSR (Dynamic Super Resolution) where you can - you render at a higher resolution and it works beautifully.
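For anyone wondering what the DSR factors actually translate to, here's a rough sketch (the helper name is just for illustration; DSR factors scale total pixel count, so each axis scales by the square root of the factor):

```python
# Rough sketch: what an Nvidia DSR factor means in terms of render resolution.
# The factor scales total pixel count, so each axis scales by sqrt(factor).
def dsr_render_resolution(native_w, native_h, factor):
    scale = factor ** 0.5
    return round(native_w * scale), round(native_h * scale)

print(dsr_render_resolution(1920, 1080, 1.78))  # ~(2562, 1441), i.e. 1440p-class
print(dsr_render_resolution(1920, 1080, 4.00))  # (3840, 2160), i.e. 4K
```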
 
Well, here in AU, or at least in Brisbane, my local store's cheapest 6700 XT is A$1099. The cheapest 3070 is A$999, and a 3070 Ti can be had for A$1099.

Maybe the fact that we have some of the most expensive electricity in the world has something to do with it?
PCCG has the PowerColor 6700 XT Red Devil OC for $999.
I have the card and I am very happy with it.
 
I thought the 6700 XT's main competitor was the 3060 Ti.
Performance-wise, I think that's the case - but they wouldn't have known that until testing. They based this comparison on MSRP, which puts the 6700 XT within $20 of the 3070 - making it a fair starting point for comparison. Had the market been anywhere near "normal" over the past year or so, I would have expected the 6700 XT to drop in price closer to the 3060 Ti (based on performance/demand).

I got lucky and snagged a 3060 Ti at MSRP, which is a fantastic card for the money (and the most I've ever spent on a GPU). But it definitely seems like the 3070 needs a bit more memory to really be somewhat "future" proof. At $400, the 3060 Ti is fantastic - but I'm not sure you get a ton of extra value with the 3070 - it's probably better to hold out for a 3080 or wait for the next-gen cards. I hope to see Nvidia's 4000 series launch with a bit more RAM in the upper mid-range of the product line. If AMD can make up some ground in ray-tracing performance, it's going to be very exciting to see how some fierce competition affects pricing as the supply chain settles down.
 
That does bug me a little. 8GB on a 3070 does seem a bit silly given many people use the same card for years. You could get a "mid-range" 8GB RX 580 five years ago. Even the 3080 should have been 12GB from the beginning, not 10GB. When I swapped the god-awful, chewing-gum-like thermal pads on my FE card for some Gelid pads, there were two blank spaces on the board where two more 1GB modules would fit. And sure, the updated ones and the Ti have 12GB, but the cost of these things... when a 6700 XT has 12GB out of the box.
 
Do you choose your gear based on the UI, or the performance?

I'll take framerates over a dated looking and rarely accessed UI any and every time.

It appears - at least in this case - that AMD's "fine wine" is aging like milk.
I choose my gear based on my needs and the truth. Not baseless bragging rights, mind share and lies.
 
I can't believe people are excited at these GPU prices. And these are the same people who say Apple customers are sheep.

The 3060Ti is the equivalent of a 6700XT, not a 3070.

That being said, they are all still too high, but I digress; foolish people giving these retailers what they want on pricing is why we are here.

Humanity destroyed by humans.
 
When my friend posted in two enthusiast forums telling gamers that the only way they're going to stop being exploited is by only supporting a corporation that makes its primary mission creating and supplying gamers with hardware (rather than table scraps at ridiculous prices) — he was met with all sorts of excuses.

They included claims so wild, so nuts... One claimed that it's easier to create the equivalent of Google from scratch than to create a gamer-oriented GPU firm. It was all nonsense but that's how it is. Either the people are sheep or they're astroturfers from AMD and Nvidia pretending to be enthusiasts.

As Einstein said... doing the same thing again and again and expecting a different outcome is insane. Yet, that's what gamers do every time there is one of these crypto cycles. They continue to waste their energy hoping that AMD and Nvidia will become their friends.

Intel chose to pile on, after being part of the problem for decades by refusing to offer anything. It is going to contribute to the TSMC crunch, which won't help pricing. So, Intel is not going to be the solution. That company is far too concerned with fat margins. What gamers need is a leaner hungrier firm that won't try to milk them so much. It can be done. It takes effort.

AMD competes directly against PC gamers by propping up the parasitic console grift. But instead of being willing to recognize that, people express outrage when it's pointed out. I suppose they invested in AMD stock. Meanwhile, it has become fashionable to condemn Nvidia. All of this sort of thing is counterproductive. People should put their energy into actually solving their problems.
 
Does anyone else think maybe Metro should be removed from the figures as an outlier?

No, because more and more games will have RT built in going forward. Metro is just the first of many games to require it. AMD fans talk about future-proofing and VRAM as important, but forget about RT performance.
 
VRAM does not save you when the GPU is too weak. The 6700 XT got 12GB because it uses a 192-bit bus, just like the 3060. The alternative was 6GB, which can be a problem in some games - not many, though.
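Rough sketch of the maths, assuming one GDDR6 chip per 32-bit channel and the common 1GB (8Gb) or 2GB (16Gb) chip densities (the helper name is just for illustration):

```python
# How memory bus width constrains total VRAM: one GDDR6 chip per 32-bit
# channel, with common chip densities of 1 GB (8 Gb) or 2 GB (16 Gb).
def vram_options(bus_width_bits, chip_sizes_gb=(1, 2)):
    channels = bus_width_bits // 32
    return [channels * size for size in chip_sizes_gb]

print(vram_options(192))  # 6700 XT / 3060 class -> [6, 12] GB
print(vram_options(256))  # 3070 class           -> [8, 16] GB
```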

8GB is plenty for mid-range cards like these. 12GB on the 3060 and 6700 XT is pretty much useless because the GPU is too weak.

Not a single game needs more than 8GB for 1440p maxed out. Even 4K/UHD gaming barely uses more than 8GB, which is why the 3080 pretty much beats the 6800 XT in every game at 4K - and it has DLSS on top.

You talk future-proofing and VRAM amount but don't mention RT performance? The 6700 XT has miserable RT performance, like all AMD cards. In the future you will see more games like Metro Exodus EE that REQUIRE RT - they simply won't run without it enabled - and performance will tank on cards with bad RT perf.
 
From what I have seen, the HD texture pack in Far Cry 6 barely makes any visual difference but eats a lot of memory. It's not even worth using.

Just like back in the day with the Shadow of Mordor HD texture pack... it made zero difference visually but used a lot more VRAM. The textures were the same, just slightly less compressed.

A lot of VRAM is never going to save a GPU in the long run. When the GPU is too slow, you will need to lower settings anyway, and when you lower settings... VRAM usage drops. *magic*

Pretty much NO games use more than 6GB at 1440p maxed out. A lot of new games only use around 4-5GB maxed out at 1440p. 12GB on cards like the 3060 and 6700 XT is pretty much pointless; they got 12GB because of the 192-bit bus.
 
I think it'll be best to revisit these two in a few years' time, when 8GB of VRAM definitely is the limiting factor for 4K gaming.

None of these cards are meant for 4K gaming (in demanding games) without DLSS or FSR. However, the 3070 beats the 6700 XT in 4K gaming today by around 10% in AAA games on high settings.

Pretty much no game needs more than 8GB at 4K, and the ones that do won't run at ultra settings on these cards anyway. If they don't run at ultra settings, VRAM usage is below 8GB.

A lot of VRAM on a weak GPU doesn't really make sense, which is why 12GB on the 6700 XT and 3060 is pointless; however, they use a 192-bit bus, so the options were 6GB or 12GB. 6GB is still decent for 1440p gaming, since most games use 4-5GB at max settings there.

Tons of games don't really use much VRAM - Elden Ring maxed out at 4K uses around 4.5GB.

Always remember: more RAM means higher allocation. This is true for system RAM and VRAM. If a 3090 uses 10GB in some games at 4K, a 3080 10GB might only use 7-8GB with the exact same settings. If you don't have enough VRAM - YOU WILL KNOW - because the game stutters.
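If you want to see where you actually sit rather than guess, here's a minimal sketch using the pynvml package (assuming it's installed and an NVIDIA driver is present). Keep in mind it reports memory allocated across all processes, not what a game strictly needs:

```python
# Minimal sketch: report VRAM currently allocated on GPU 0.
# Requires the pynvml package and an NVIDIA driver.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)
mem = nvmlDeviceGetMemoryInfo(handle)  # values are reported in bytes
print(f"VRAM in use: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
nvmlShutdown()
```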

The 6700 XT and 3070 are mainly 1440p solutions, which they handle really well, even at high fps.
 