GeForce RTX 3070 vs. Radeon RX 6700 XT: 50 Game Benchmark, 2022 Update

Unfortunately, the 3070's 40% premium is due to it being better at mining, not better gaming performance.
 
Unfortunately, the 3070's 40% premium is due to it being better at mining, not better gaming performance.

Well here in AU, or at least in Brisbane, my local store has the cheapest 6700 XT for A$1099. The cheapest 3070 is A$999 and a 3070 Ti can be had for A$1099.

Maybe the fact that we have some of the most expensive electricity in the world has something to do with it?
 
I thought the 6700 XT's main competitor was the 3060 Ti.
Going by TechSpot's own numbers that would be the correct assumption. The comparison appears to be based on the naming (the '7') and MSRP; however, since the start the 3060 Ti and 6700 XT have been the direct competitors, just like the 3060 and 6600 XT, as well as the 3050 and 6600. Then again, what else should reviewers base their comparison on?

That said, I agree with the review's conclusion that with a $100-ish price difference the 3070 would be the better buy, but if the price difference stays large, then not.

Ideally, I feel the 6700 XT should cost only slightly more than the 3060 Ti, i.e. if both reached MSRP, the 6700 XT's would need to come down.
 
Well here in AU, or at least in Brisbane, my local store has the cheapest 6700 XT for A$1099. The cheapest 3070 is A$999 and a 3070 Ti can be had for A$1099.

Maybe the fact that we have some of the most expensive electricity in the world has something to do with it?
We can get a 6700 XT for ~$850, an RTX 3070 for ~$750, and a 3070 Ti for ~$800 in Malaysia, so I don't think it has anything to do with electricity rates.
 
Does anyone else think maybe Metro should be removed from the figures as an outlier?
I think that extreme outliers, particularly if they are sponsored titles, should be removed or trimmed to the nearest in-line data unless there is a logical reason.

That should of course be done in both directions.

Although tbh, I don't think doing so would change the overall results in a noticeable way.
 
I think it'll be best to revisit these two in a few years' time, when 8GB of VRAM is definitely the limiting factor for 4K gaming.
 
I always think the 3070 or 3070 Ti is a good choice for 1080p with RTX, with no more upgrade worries for 3 to 4 years.
 
Well here in AU, or at least in Brisbane, my local store has the cheapest 6700 XT for A$1099. The cheapest 3070 is A$999 and a 3070 Ti can be had for A$1099.

Maybe the fact that we have some of the most expensive electricity in the world has something to do with it?
If you check prices online (staticice.com.au), both can be had for the same price of around AU$990 in Oz.
 
'However, it all goes horribly wrong for the RTX 3070 at 4K as we run out of VRAM and frame stuttering becomes a major issue, resulting in 1% lows of just 9 fps. Although the Radeon 6700 XT is seen to be 43% faster when comparing the average frame rate, for Nvidia it's a complete fail under these conditions. We're starting to see a few scenarios where 8GB of VRAM isn't enough for driving the latest titles, at least without compromising on visuals such as high-res textures.'

And for what? The reviews I've read of the game say the graphics aren't good. So, what are we getting for all of that VRAM demand?

Crysis established the idea of a game getting lots of mindshare via press coverage for being 'demanding' on hardware. So, it seems that companies learned from that and have been competing to see which can release the least-optimized game possible, in order for reviewers to focus on the game as if it's somehow demanding because it's more complex.

Perhaps it's time to see if the emperor's clothes are worth the price.
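
For a sense of scale, here's a back-of-the-envelope sketch in Python of how an HD texture pack can blow past an 8GB card at 4K. Every size below is my own rough assumption for illustration, not a measurement from Far Cry 6 or any other title:

```python
# Back-of-the-envelope sketch of where 8 GB can go at 4K. Every size
# here is a representative assumption, not a measurement from any game.
MiB = 1024**2

# ~8 full-resolution 8-byte-per-pixel render targets (G-buffer, post FX).
framebuffers = 8 * (3840 * 2160 * 8) / MiB        # ~506 MiB

# An "HD texture pack": say 300 textures at 4096x4096, BC7-compressed
# (1 byte per texel), plus ~33% extra for mipmaps.
textures = 300 * (4096 * 4096 / MiB) * 4 / 3      # ~6,400 MiB

# Meshes, constant/vertex buffers, driver and OS overhead.
other = 1536                                       # ~1.5 GiB, assumed

total = framebuffers + textures + other
print(f"~{total / 1024:.1f} GiB needed")           # ~8.2 GiB: over budget
```

The point isn't the exact number; it's that textures dominate the budget, which is why dropping the texture pack is usually the first fix on an 8GB card.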
 
I think that extreme outliers, particularly if they are sponsored titles, should be removed or trimmed to the nearest in-line data unless there is a logical reason.

That should of course be done in both directions.

Although tbh, I don't think doing so would change the overall results in a noticeable way.
The term outlier doesn't really apply well in this context.

If a developer makes the effort to optimize a game for Nvidia or AMD (not both) rather than releasing a game that's not optimized for either platform, the latter developer doesn't deserve to be rewarded with greater praise. It's better to be half-optimized than not optimized at all.

The main problem is when games are intentionally sandbagging performance by doing things like tessellating invisible water. That doesn't just involve trying to harm a competitor (e.g. AMD's cards which weren't as performant in tessellation at the time). It can also involve the mindshare gimmick I mentioned in my prior post. Simply releasing a 'demanding' title is often enough to get a lot of press focus. Many consumers assume that a 'demanding' title is better, more complex and therefore more worthy of interest/purchase.

DOOM was an 'outlier' because it was optimized so well for AMD hardware. Does that mean reviewers should have eliminated it from roundups? Quite the opposite. DOOM showed how a customized engine, running on AMD, could extract a lot of performance that others were leaving on the table.

Another interesting example is Deserts of Kharak, which actually ran extremely well on AMD's Piledriver CPUs, chips that trailed Intel badly in basically every other game. As an 'outlier' it was actually more worthy of journalistic focus, because it suggested there was performance being left on the table and that more optimization might have helped Piledriver elsewhere. That doesn't mean it should have been done, but it is very interesting for tech enthusiasts to learn.

Optimization doesn't have to involve sandbagging one platform. It is possible to optimize for both, switching the code to suit the platform being used by each gamer. It takes more effort. Many companies coast along on extremely inefficient design (like Civilization) because they aren't challenged by competition and by having journalists hold them accountable for their laziness. It's also very arguable, in the case of a game like Civilization, that the inefficiency of the language/coding style used is fine because it enables more content to be released. That's the basic fight between low-level optimizers and high-level content spewers. It's hardly an easy argument to resolve. Many times there is a middle road taken.
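
As a trivial illustration of that "switching the code to suit the platform" idea, here's a minimal Python sketch. The PCI vendor IDs are real and stable; the path names and the dispatch itself are hypothetical placeholders, not any engine's actual API:

```python
# Minimal sketch: pick a vendor-tuned render path at startup instead of
# shipping one untuned path for everyone. PCI vendor IDs are real;
# the path names are made up for illustration.
VENDOR_PATHS = {
    0x10DE: "nvidia_tuned",  # NVIDIA's PCI vendor ID
    0x1002: "amd_tuned",     # AMD's PCI vendor ID
}

def pick_render_path(pci_vendor_id: int) -> str:
    """Fall back to a generic path for hardware we haven't tuned for."""
    return VENDOR_PATHS.get(pci_vendor_id, "generic")

print(pick_render_path(0x1002))  # -> amd_tuned
```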

The market currently appears to favor churning out content quickly rather than producing polished optimized products. 'Eternal beta testing' as the consumer experience has been a concern for many years now. Proponents of that, though, argue that consumers benefit by getting more content. Getting more less-reliable content is better than getting less content that's more reliable? Many consumers seem to have been indoctrinated into living in the conditional future, rather than demanding that the present and near past be more satisfactory. They're always looking for the promise of great products and quickly abandon recent ones for their flaws. How much of that is indoctrination and how much of it is innate temperament I can't say.
 
Not surprised at the one-sided results, but what the hell is going on with Metro Exodus? They gimped the bus width and expect Infinity Cache (IC) to make up for it beyond 1080p, so it falls further behind as the resolution increases. They'd better be quadrupling the IC on RDNA 3 if they're continuing with this bandwidth-starvation approach.
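
For anyone curious, the basic math behind that trade is simple: only cache misses hit the narrow bus, so effective bandwidth scales with 1/(1 - hit rate). The 384 GB/s figure is the 6700 XT's real DRAM bandwidth (192-bit @ 16 Gbps); the hit rates below are illustrative guesses I picked to show the trend, not AMD's published numbers:

```python
# Sketch of why a cache-for-bus-width trade fades at higher resolutions.
# 384 GB/s is the 6700 XT's real DRAM bandwidth (192-bit @ 16 Gbps);
# the hit rates are illustrative assumptions, not AMD's figures.
DRAM_BW_GBS = 384

def effective_bandwidth(hit_rate: float) -> float:
    # Only misses touch DRAM, so serviceable traffic is bw / miss_rate.
    return DRAM_BW_GBS / (1.0 - hit_rate)

for res, hit in [("1080p", 0.65), ("1440p", 0.55), ("4K", 0.40)]:
    print(f"{res}: hit {hit:.0%} -> ~{effective_bandwidth(hit):.0f} GB/s")
    # 1080p ~1097 GB/s, 1440p ~853 GB/s, 4K ~640 GB/s: the advantage
    # shrinks exactly where the bigger workload needs it most.
```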
 
So both have shortcomings that any owner of either would optimize around (smartly choosing appropriate settings). Lord knows an 8GB card owner wouldn't run the pointless HD texture pack in FC6, and 6700 XT owners would probably avoid games with heavy (read: meaningful) ray tracing.

Certainly an interesting point of comparison when it comes to the touted 'fine wine' term. One won't be able to run the absolute highest texture settings/packs at 4K, and perhaps 1440p, in coming years; the other basically isn't well suited to running RT at all.

This Nvidia advantage is all down to the gimping.
Yawn.
 
From what I have seen, the HD texture pack in Far Cry 6 barely makes any visual difference but eats a lot of memory. It's not even worth using.
 
I thought the 6700 XT's main competitor was the 3060 Ti.
You thought wrong. AMD tried to target and undercut the 3070 with its pricing, not target and overprice against a 3060 Ti. Granted, a 6700 XT is more in line with a 3060 Ti.
At this time, LHR is almost completely bypassed by several mining tools.
No, it's not. Fact check required. LHR can be bypassed to an extent, but not fully.
I think that extreme outliers, particularly if they are sponsored titles, should be removed or trimmed to the nearest in-line data unless there is a logical reason.

That should of course be done in both directions.

Although tbh, I don't think doing so would change the overall results in a noticeable way.
An adjustment should not be made either way. This represents a real-world experience and is 100% logical because it's REAL, not a fictional one massaged for the sake of comparison fairness, LMAO!
 