GeForce RTX 3070 vs. Radeon RX 6700 XT: 45 Game Benchmark

Sorry: the 4K performance is far too weak in my favorite games.
Both GPUs are unsuitable for 4K at max settings, unfortunately.

AMD is getting greedier. Nice. They chose to use only a 40 CU GPU in the 6700 XT, compared to 72 in the 6800 XT, and just overclocked the heck out of it. Why? To sell a much smaller chip for more money.
Not really a direct comparison. The two GPUs use different RDNA2 silicon (Navi 21 vs. Navi 22), and the 6700 XT leans more on clock speed than raw CU count. Hitting over 2.5 GHz boost clocks is nothing to scoff at.
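For anyone curious about the clocks-vs-CUs trade-off, theoretical FP32 throughput for RDNA2 is just CUs × 64 shaders × 2 FLOPs per clock × clock speed. A quick back-of-envelope sketch (using the official boost clocks; sustained game clocks are lower and vary per board):

```python
# Rough theoretical FP32 throughput for RDNA2 parts.
# Each RDNA2 CU has 64 stream processors, each doing 2 FLOPs per clock (FMA).
def fp32_tflops(cus, boost_ghz, shaders_per_cu=64, flops_per_clock=2):
    return cus * shaders_per_cu * flops_per_clock * boost_ghz / 1000

# Official boost clocks; real sustained clocks vary with load and board.
print(f"RX 6700 XT: {fp32_tflops(40, 2.58):.1f} TFLOPS")  # ~13.2
print(f"RX 6800 XT: {fp32_tflops(72, 2.25):.1f} TFLOPS")  # ~20.7
```

So even at 2.5 GHz+, 40 CUs can't close the gap on 72; the higher clocks just keep the smaller chip competitive in its own tier.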

AMD really did manage to get 50% more perf per watt without changing the process node. While Nvidia is still arguably better on many fronts, at least we now have real competition at the high end. RDNA2 reminds me of Zen 2 at launch, and RDNA3 could be a monster if the rumors are true and they implement chiplets.
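For what it's worth, perf per watt is just average fps over board power, so the claim is easy to sanity-check against any review's numbers. A minimal sketch; the fps figures below are placeholders for illustration, not measured results (only the 225 W / 230 W board powers are official specs):

```python
# Perf-per-watt is average fps divided by board power draw.
def perf_per_watt(avg_fps, board_power_w):
    return avg_fps / board_power_w

# Placeholder fps numbers for illustration only, not measured results.
rdna1 = perf_per_watt(60, 225)   # e.g. an RX 5700 XT-class card, 225 W board power
rdna2 = perf_per_watt(92, 230)   # e.g. an RX 6700 XT-class card, 230 W board power
print(f"Uplift: {rdna2 / rdna1 - 1:.0%}")  # 50% with these made-up figures
```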
 
AMD is getting greedier. Nice. They chose to use only a 40 CU GPU in the 6700 XT, compared to 72 in the 6800 XT, and just overclocked the heck out of it. Why? To sell a much smaller chip for more money.
What? First, the 6700 XT uses a different chip than the 6800 XT. Second, everyone was expecting a 40 CU config for the 6700 XT, the same as the last-gen 5700 XT. And yes, every company will do whatever it can to get more money for its products; fair or not, that's the world consumerism and capitalism have created.
 
After more than a decade (since I first read his HD 4870 review back in the day), I suspect I may never tire of Steve's hardware analyses. He's simply second to none; I've never seen his equal. The time Steve puts in and his dedication to his craft make him a tech artisan in my book.

Having said that, I had a bit of trouble staying interested in this article. Not because of Steve's work (in fact, that's what kept me reading the whole thing) but because it really doesn't matter. It really sucks to go from the excitement and craziness of 2020's second quarter to... nothing, especially when it happened over a span of mere weeks.

Every day, I thank my lucky stars that I got my RX 5700 XT last August. I was worried that there would be a new $400 card with double the performance but as it turns out... there are no cards at all.

My R9 Fury was just fine for playing Assassin's Creed Odyssey and Godfall, but I don't know how well it would handle Far Cry 6. I do know that it's somehow handicapped in FS2020, based on numbers I've seen (even the RX 470 is faster, WTF?!), and that could be a thing going forward.

Incidentally, I DID pre-order the remastered Rome: Total War but from what I can tell, that game would run fine even on a potato PC. Maybe I'll build a PC with spare parts using my FX-8350 and one of my R9 Furies for the older games in my library like Bioshock: Infinite, Rome: Total War, etc.
 
No power usage results...
This is a performance test, not an efficiency test. Standard power usage figures as always reported by TechSpot are available in the RX 6700 XT review, which is linked at the beginning of this article: https://www.techspot.com/review/2216-amd-radeon-6700-xt/
Adding a per-game power usage metric would require a ton of extra work and it wouldn't bring much extra info over what is already available.
 
Good article, Steve.

Pavlov was right. My mouth waters every time the GPU bell rings.

That being the case, I wonder if anyone might offer an opinion about the root causes of the relative performance differences (this being TechSpot, there are sure to be a few opinions): the game engine's software, in-game optimization, driver software and its optimization, VRAM, other hardware, the GPU itself? Or are devs constrained to focus on one maker or the other (like Cyberpunk/Witcher 3 vs. Outer Worlds, almost a 20% spread)?

Shortly, we may see new hardware from Intel and possibly others. What should we look for? Will there be some real competition?
 
Sorry: the 4K performance is far too weak in my favorite games.

You are obviously trolling at this point. Everyone and their grandmother knows these aren't 4K GPUs.

And on topic: the 6700 XT seems to me more of a competitor to the 3060 Ti, unless AMD can boost performance a bit more with drivers.
 
In Australia the cheapest in-stock 6700 XT I can find is $899, while the cheapest in-stock 3070 is $1349. Pricing is crazy everywhere, but it seems crazier with NVIDIA hardware at the moment.
I believe the 30 series (the 3060 being the exception) are far better crypto cards; that is probably the reason for the price differential.
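One way to cut through the pricing mess is cost per frame: street price divided by average fps across the benchmark suite. A quick sketch using the AU prices quoted above; the fps values are placeholders for illustration, not figures from this article:

```python
# Cost per frame: street price divided by average fps across the benchmark suite.
def cost_per_frame(price, avg_fps):
    return price / avg_fps

# AU street prices quoted above; fps values are placeholders, not real results.
radeon = cost_per_frame(899, 100)
geforce = cost_per_frame(1349, 105)
print(f"6700 XT: ${radeon:.2f}/frame, 3070: ${geforce:.2f}/frame")
```

At these street prices the 3070 would need a much bigger fps lead than it actually has to come out ahead on value.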
 
In my country both GPUs sell for almost the same price, so the 3070 is the clear winner, albeit with less RAM. I've seen a YouTube review of these two running the same games at the same time, same resolution, etc., and somehow the RTX managed to deliver better fps while using less VRAM, so I don't think the Radeon's extra 4 GB is going to be a relevant factor at 1440p.
 
Agreed, but at that price delta the 6700 XT is arguably the better buy unless you are a Control fanatic. At MSRP, the 3070 makes more sense.
I just started Control. Even with SAM enabled and all the power of my card and rig, I cannot push it beyond 60 fps at 4K. So heavy! Obviously, all settings maxed out. I like the game so far, but I notice the AI is just s***. The enemies reminded me of Lara's first enemies back in the '90s. Let's see if they improve later.
 
Frankly, who cares? You can't buy any of these for less than three times MSRP, and you won't be able to for at least 12 months. Please review something I can buy. Maybe if you stop reviewing vaporware, Nvidia and AMD will stop selling in bulk to miners and scalpers. We are not going to forget their greed. Hopefully Intel will start to challenge them in the near future. AMD and Nvidia have had it too good for too long.
 