AMD Radeon RX 6700 XT vs. 5700 XT: Clock-for-Clock Performance

Agreed.

I love AMD but man, they really screwed up the pricing this round.

Then again, with the shortage and who's actually buying, we don't matter; AMD can charge F U money for it and the current buyers have that kind of money to spend.
Right now, MSRPs are purely theoretical values. It's still questionable whether e.g. nVidia's are realistic in the first place.

Even if nVidia and AMD charge low prices, AIBs, distributors or retailers can pocket the difference.
 
Regarding your ~45% p/w improvement estimate, Steve... I compared your 5700 XT and 6700 XT reviews, zooming in on 5700 XT gaming power consumption in both reviews. ~65 watts difference! 186 W vs. 250 W! Different numbers for the same card! Needless to say, your gaming power consumption testing is worthless and needs drastic changes. And that's also why you got it wrong with your ~45% estimate.
 
Right now, MSRPs are purely theoretical values. It's still questionable whether e.g. nVidia's are realistic in the first place.

Even if nVidia and AMD charge low prices, AIBs, distributors or retailers can pocket the difference.
Indeed.

Hell, the first one to pull this was Nvidia with their RTX 20 series launch.

They knew AMD had nothing to compete with, so they increased prices like crazy.
 
Regarding your ~45% p/w improvement estimate, Steve... I compared your 5700 XT and 6700 XT reviews, zooming in on 5700 XT gaming power consumption in both reviews. ~65 watts difference! 186 W vs. 250 W! Different numbers for the same card! Needless to say, your gaming power consumption testing is worthless and needs drastic changes. And that's also why you got it wrong with your ~45% estimate.
Apart from the fact that Steve hasn't mentioned a 45% performance-per-watt improvement in this article, the two power measurements you mention come from two different tests. For the original RX 5700 XT review, Far Cry New Dawn at 1440p was used, whereas in the RX 6700 XT review, the test ran Doom (2016) at 4K. The loads on the GPUs and local memory in the two tests simply aren't directly comparable; only relative differences within a given test matter.
 
This article just confirms that AMD probably cut too much out of Navi 22. While they still managed to keep up with the likes of an RTX 3060 Ti by pushing very high clock speeds, I feel it's become quite a power-inefficient card. I do hope we see some decent IPC improvement with RDNA 3, instead of relying so heavily on clock speed.

Anyway, it does seem like the Infinity Cache helped the RX 6700 series quite a bit, considering it's got a narrower memory bus compared to the RX 5700 series.
 
I went with the RX 570 in the initial build. It seems the RX 5700 offers a worthy incremental upgrade if and when I upgrade, and if I stick with AMD.
 
The GPU averages on TechSpot need to be normalized, otherwise they suffer from what in statistics is known as high-leverage points: the arithmetic mean is dominated by the high-fps games and downweights the games with low fps, and I'd argue it should be exactly the opposite. For example, GPU A achieves 25 and 300 fps in two games we test; GPU B achieves 50 and 250 fps. By the plain average (162.5 vs. 150 fps), GPU A is better, but I'd argue that this is misleading and that the average user will see a bigger improvement from 25 to 50 fps than from 250 to 300. One way to improve the statistic is to normalize to the fastest card. Then we would have GPU A at 50% and 100% and GPU B at 100% and 83%. In this case B, with an average of 91%, is better than A with 75%.
I don't know why you made this comment given there is no average in this content (oh, you meant the graph from the 6700 XT review). We use a geomean, so your assumptions are incorrect. (It also makes stuff all difference, as your extreme example isn't relevant, but whatever; as I said, we use a geomean anyway.)
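For anyone curious how the approaches compare, here's a minimal sketch in Python using the made-up fps figures from the comment above; the "normalized" method is the commenter's proposal, not how TechSpot computes its averages:

```python
# Hypothetical two-game example: GPU A hits 25 and 300 fps, GPU B hits 50 and 250 fps.
from statistics import geometric_mean

gpu_a = [25, 300]
gpu_b = [50, 250]

def arithmetic_mean(fps):
    return sum(fps) / len(fps)

def normalized_mean(fps, other):
    # Express each game's result as a fraction of the fastest card in that game, then average.
    fastest = [max(a, b) for a, b in zip(fps, other)]
    return sum(f / m for f, m in zip(fps, fastest)) / len(fps)

print(arithmetic_mean(gpu_a), arithmetic_mean(gpu_b))                # 162.5 vs 150.0  -> A looks faster
print(geometric_mean(gpu_a), geometric_mean(gpu_b))                  # ~86.6 vs ~111.8 -> B looks faster
print(normalized_mean(gpu_a, gpu_b), normalized_mean(gpu_b, gpu_a))  # 0.75 vs ~0.92   -> B looks faster
```

Both the geomean and the normalized average keep one huge fps result from dominating the score, which is why the extreme example above doesn't really apply to a geomean-based chart.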

Regarding your ~45% p/w improvement estimate, Steve... I compared your 5700 XT and 6700 XT reviews, zooming in on 5700 XT gaming power consumption in both reviews. ~65 watts difference! 186 W vs. 250 W! Different numbers for the same card! Needless to say, your gaming power consumption testing is worthless and needs drastic changes. And that's also why you got it wrong with your ~45% estimate.

No idea what you're on about there mate.

This comparison of underclocking a 6700 XT to compare it to a 5700 XT all seems a bit silly...

Amazing insight, thank you.

Thanks for the review - it's very interesting. For me, personally, it's missing one interesting set of values - power consumption for each card at that clock speed.

Did you by any chance measure that?

Power consumption figures would have been misleading and pretty useless, given that neither GPU would have been using the optimal voltage for 1.8 GHz, and in order to sustain that frequency the power targets were maxed out.

So it’s an overclock for $80 more?

This card is a festering heap of garbage. It's not reported here because, as is now public knowledge, Techspot is quite strongly biased against Nvidia and for AMD because of the whole Editorialgate. I mean, what were the temps of both cards at the same clock speeds? I'd imagine quite similar. But these 6700 XT parts are hitting 100°C according to other reviewers. I guess that's what happens if all you do is overclock it over the last generation, oh, and add some more memory chips. So the card can't ray trace, it has no DLSS or equivalent, it loses to a 3060 Ti in a lot of cases (without DLSS), which is cheaper (and cooler), and on top of all of that there are major driver issues surrounding the 6000 series reported by several other tech outlets (AC Valhalla is very unstable, I hear, for example).

You're an ***** if you buy this part (you know, if things were normal). Nvidia may be a bag of dicks, but that bag of dicks makes much better graphics cards than AMD right now.

No, go to your room right now!
 
Power consumption figures would have been misleading and pretty useless, given that neither GPU would have been using the optimal voltage for 1.8 GHz, and in order to sustain that frequency the power targets were maxed out.

I understand your point regarding power targets (the 6700 XT sits at a much better spot on its power curve), but considering that both have about the same performance at this clock speed, IMO this still would have given a good perf/W value.

 
I understand your point regarding power targets (the 6700 XT sits at a much better spot on its power curve), but considering that both have about the same performance at this clock speed, IMO this still would have given a good perf/W value.
It still is! Same performance as a 5700 XT at a third less power?!
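(Back-of-the-envelope, with placeholder wattages rather than the article's measured numbers, just to show how "same fps at a third less power" translates into perf/W:)

```python
# Same fps at matched clocks, but the newer card draws roughly a third less power.
# These wattages are illustrative placeholders, not measurements from the review.
fps_matched = 100          # assume both cards land on ~the same fps at 1.8 GHz
power_5700xt_w = 210       # placeholder board power for the 5700 XT
power_6700xt_w = 140       # placeholder: about a third less

perf_per_watt_old = fps_matched / power_5700xt_w
perf_per_watt_new = fps_matched / power_6700xt_w
print(f"perf/W gain: {perf_per_watt_new / perf_per_watt_old - 1:.0%}")  # -> 50%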

Next, this is perfect for laptops if you use it as an onboard chip sharing system RAM, with that huge cache! Once they finish integrating RDNA 2 with Ryzen 3, you're going to see a REALLY good laptop chip.
 
I understand your point regarding power targets (the 6700 XT sits at a much better spot on its power curve), but considering that both have about the same performance at this clock speed, IMO this still would have given a good perf/W value.
If anything, it's the other way round - at 1.8 GHz, the 6700 XT is 500 MHz under its Base clock, whereas the 5700 XT is sitting between its Game and Boost clock. There's a chance that the GPU/drivers will ignore any voltage settings anyway - for example, my 2080 Super will run at 1.043 to 1.05 volts, when under a heavy 3D load, almost irrespective of what the clocks are set to (it's the lower one between 1.44 and 1.95 GHz, and the higher one above 2 GHz). The voltage-frequency curve in Afterburner states that it should be at 0.7 volts at 1.44 GHz!
 
If anything, it's the other way round - at 1.8 GHz, the 6700 XT is 500 MHz under its Base clock, whereas the 5700 XT is sitting between its Game and Boost clock. There's a chance that the GPU/drivers will ignore any voltage settings anyway - for example, my 2080 Super will run at 1.043 to 1.05 volts, when under a heavy 3D load, almost irrespective of what the clocks are set to (it's the lower one between 1.44 and 1.95 GHz, and the higher one above 2 GHz). The voltage-frequency curve in Afterburner states that it should be at 0.7 volts at 1.44 GHz!

Thanks for clarifying this - I thought that the 6700XT was in a better / more favorable spot on the voltage curve vs the 5700XT.
It's always good to learn something :)
 
The 6700 XT isn't a bad card, but the fact remains that the price increase wipes out most of its value; it ends up as little more than a 5700 XT with a huge OC. Just like the 2080 Super was just a 1080 Ti.

At $385 this card would be nice; at $480 it's completely DOA IMO. Not that it matters, because LMAO STOCKS.
 
The 6700 XT isn't a bad card, but the fact remains that the price increase wipes out most of its value; it ends up as little more than a 5700 XT with a huge OC. Just like the 2080 Super was just a 1080 Ti.

At $385 this card would be nice; at $480 it's completely DOA IMO. Not that it matters, because LMAO STOCKS.
Completely agreed. I don't understand why AMD should be praised for an overclock and a price increase. Minimal effort and maximum profits. No DLSS, no ray tracing, no innovation. They don't deserve buyers in a normal market.

From a neutral perspective this is a bad product. And yes, that is based on price. If it were the same price or cheaper than the 5700 XT it would be a good product. And this isn't like a 3090, which is obscenely expensive but offers you something no one else can. But a 6700 XT isn't that dissimilar to a lot of already existing solutions, some of which run cooler and have things like DLSS.

You really are much better off buying a 3060 Ti or a 3070.
 
I don't think that's a neutral perspective.

First off, I think this is an excellent article. TechSpot regularly investigates odd nooks and crannies of computer tech, and I always come away with new insights. It's why I frequent the site.

The 6700 XT is an excellent card. There's literally nothing wrong with it. People need to separate technical excellence from 'value'. They are two entirely different things. Value is entirely subjective, and based on many things. Cost is just one aspect. Style is very important to some. As are quietness, energy usage, size, reliability, brand loyalty, sales incentives, packaging, and usage modes. It's not realistic to reduce everything to financial value.
 
It's an appalling rip-off IMO; that performance increase should have come at the same price as the 5700 XT, and a 20% price increase is a total joke given it's now only 192-bit. So where's the IPC uplift for RDNA 2? It should have still been faster at the same clocks. A 16 GB 3070 Ti will destroy this, and I am no NVidia fan; I wanted to buy, but they have turned out to be an even bigger joke with GPU supply. It's not that hard to find a 3070, even if overpriced; it's much, much harder to find a 6800.
 
Completely agreed. I don't understand why AMD should be praised for an overclock and a price increase. Minimal effort and maximum profits. No DLSS, no ray tracing, no innovation. They don't deserve buyers in a normal market.

Well, ahem. Nvidia got 100/100 for just an overclock and a price increase. Minimal effort and maximum profits. No DLSS, no ray tracing, no innovation, not anything. Not to mention there are only around 20 games right now that support DLSS 2.0.

According to you, then, Nvidia doesn't deserve buyers in a normal market either. More information: https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/

Also, I don't understand what dimension you are living in. The AMD card has support for ray tracing, and as for innovation, Infinity Cache is something Nvidia will use later. Under a different name and design, of course, but it will come.

From a neutral perspective this is a bad product. And yes, that is based on price. If it were the same price or cheaper than the 5700 XT it would be a good product. And this isn't like a 3090, which is obscenely expensive but offers you something no one else can. But a 6700 XT isn't that dissimilar to a lot of already existing solutions, some of which run cooler and have things like DLSS.

You really are much better off buying a 3060 Ti or a 3070.
I hope you maintain the same points in the future too. When AMD gets their own DLSS, you will then remind everyone that DLSS is only supported in a handful of games and is therefore useless.

Also, when Nvidia publishes its next card, be sure to remind everyone that it's overpriced compared to the older card and what it cost a year ago.

It's an appalling rip-off IMO; that performance increase should have come at the same price as the 5700 XT, and a 20% price increase is a total joke given it's now only 192-bit. So where's the IPC uplift for RDNA 2? It should have still been faster at the same clocks. A 16 GB 3070 Ti will destroy this, and I am no NVidia fan; I wanted to buy, but they have turned out to be an even bigger joke with GPU supply. It's not that hard to find a 3070, even if overpriced; it's much, much harder to find a 6800.
Why? Overall performance at the clocks the card ships with is what matters, not performance per clock, since clocks aren't the same on retail products. And where is that IPC uplift? Who cares? Performance is roughly IPC * clock speed. No one cares about IPC or clock speed on their own; overall performance is what matters. And once again, an Nvidia card got 100/100 for no IPC uplift (https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/), so I really don't understand wtf you are trying to say.
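To put toy numbers on that (the relative IPC and clock values below are made up for illustration, not measured data):

```python
# Performance is roughly IPC * clock speed, so a big clock bump with zero IPC gain
# still yields a real uplift on the shipped product.
def relative_performance(ipc, clock_ghz):
    return ipc * clock_ghz

old_card = relative_performance(ipc=1.0, clock_ghz=1.8)  # baseline architecture
new_card = relative_performance(ipc=1.0, clock_ghz=2.4)  # same IPC, higher shipped clock

print(f"uplift: {new_card / old_card - 1:.0%}")  # -> 33%
```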

There is no 16 GB 3070 Ti yet. Also, if there is an RTX 3070 Ti 16 GB with GDDR6X, then it will make the RTX 3080 obsolete. Then good luck to those morons who bought the RTX 3080 :joy:
 
Completely agreed. I don't understand why AMD should be praised for an overclock and a price increase. Minimal effort and maximum profits. No DLSS, no ray tracing, no innovation. They don't deserve buyers in a normal market.

From a neutral perspective this is a bad product. And yes, that is based on price. If it were the same price or cheaper than the 5700 XT it would be a good product. And this isn't like a 3090, which is obscenely expensive but offers you something no one else can. But a 6700 XT isn't that dissimilar to a lot of already existing solutions, some of which run cooler and have things like DLSS.

You really are much better off buying a 3060 Ti or a 3070.
I have a 6900 XT and it runs cool. Never more than 75 degrees. Not interested in the RT hype. Not interested in DLSS to fake the resolution (the Radeon has some settings to manage dynamic resolution too). I always want native resolution, with maximum image quality.

I have been testing many games (e.g. Control, AC Origins, RDR2, The Witcher 3, the last Tomb Raider games, etc.) and have yet to find driver issues. Why don't you buy a Radeon yourself and check, instead of going on hearsay all the time? You might be surprised. Many things you express are just myths around the Radeon name. Get some direct experience instead.
 