AMD Radeon RX 6700 XT vs. 5700 XT: Clock-for-Clock Performance

QuantumPhysics

Posts: 5,056   +5,664
I went looking for 4K performance charts.

Did you do any testing of this card’s 4K performance vs. the RTX 3090?
 

Shadowboxer

Posts: 1,575   +1,139
So it’s an overclock for $80 more?

This card is a festering heap of garbage. It’s not reported here because, as is now public knowledge, TechSpot is quite strongly biased against Nvidia and for AMD because of the whole Editorialgate. I mean, what were the temps of both cards at the same clock speeds? I’d imagine quite similar. But these 6700 XT parts are hitting 100C according to other reviewers. I guess that’s what happens if all you do is overclock the last generation, oh, and add some more memory chips. So the card can’t ray trace, it has no DLSS or equivalent, it loses to a 3060 Ti in a lot of cases (without DLSS), which is cheaper (and cooler), and on top of all of that there are major driver issues surrounding the 6000 series reported by several other tech outlets (AC Valhalla is very unstable, I hear, for example).

You’re an ***** if you buy this part (you know, if things were normal). Nvidia may be a bag of dicks, but that bag of dicks makes much better graphics cards than AMD right now.
 
The GPU averages on TechSpot need to be normalized; otherwise they suffer from what in statistics is known as a high-leverage point problem. The plain average downweights the games with low fps, and I’d argue it should be exactly the opposite. For example, GPU A achieves 25 and 300 fps in two games we test. GPU B achieves 50 and 250 fps. By average, GPU A is better, but I’d argue that this is misleading and that the average user will see a bigger improvement from 25 to 50 fps than from 250 to 300. One way to improve the statistics is to normalize to the fastest card. Then we would have GPU A at 50% and 100% and GPU B at 100% and 83%. In this case B, with an average of about 92%, is better than A with 75%.
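Here's a quick sketch of that in Python, using the numbers from the example above. The geometric mean at the end isn't from the post; it's the usual statistical fix for scale-sensitive averages, added for comparison:

```python
import math

# fps results from the example: two games, two GPUs
fps = {"GPU A": [25, 300], "GPU B": [50, 250]}

# Plain arithmetic mean: the 300 fps result dominates, so GPU A "wins"
plain = {gpu: sum(v) / len(v) for gpu, v in fps.items()}

# Normalize each game to the fastest card first, then average
best = [max(game) for game in zip(*fps.values())]
norm = {gpu: sum(f / b for f, b in zip(v, best)) / len(v)
        for gpu, v in fps.items()}

# Geometric mean: the standard alternative, also insensitive to fps scale
geo = {gpu: math.prod(v) ** (1 / len(v)) for gpu, v in fps.items()}

print(plain)  # {'GPU A': 162.5, 'GPU B': 150.0}  -> A "wins"
print(norm)   # GPU A: 0.75, GPU B: ~0.92         -> B wins
print(geo)    # GPU A: ~86.6, GPU B: ~111.8       -> B wins
```

Both the normalize-to-fastest average and the geometric mean flip the ranking, which is exactly the point being made.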
 

Aranarth

Posts: 113   +99
I went looking for 4K performance charts.

Did you do any testing of this card’s 4K performance vs. the RTX 3090?

Uh... did you READ the article?!

How about going back and reading just the first TWO paragraphs!?

After you have done that feel free to circle back and apologize!

(I have no problem with free speech, I DO expect you to take responsibility for exercising the right.)
 

Aranarth

Posts: 113   +99
So it’s an overclock for $80 more?

This card is a festering heap of garbage. It’s not reported here because as is now public knowledge, Techspot is quite strongly biased against Nvidia and for AMD because of the whole Editorialgate. I mean what were the temps of both cards at the same clock speeds? I’d imagine quite similar. But these 6700XT parts are hitting 100C according to other reviewers. I guess that’s what happens if all you do is overclock it over the last generation oh and add some more memory chips. So the card can’t ray trace, it has no DLSS or equivalent, loses to a 3060 ti in a lot of cases (without DLSS), which is cheaper (and cooler) and on top of all of that there are major driver issues surrounding the 6000 series reported by several other tech outlets (AC Valhalla is very unstable I hear for example).

You’re an ***** if you buy this part (you know, if things were normal). Nvidia may be a bag of dicks but that bag of dicks make much better graphics cards than AMD right now.

Sort of.

This is the important note:
"...as far as we can tell they’ve done it with the 6700 XT, delivering 30% more performance than the 5700 XT, while reducing power usage by roughly 15%."

So yes, the card clocks much higher but also uses less power on the SAME NODE.

Usually, if it were just an overclock (say, comparing the RX 480 8GB with the RX 580 8GB), performance would be up along with power usage being WAY UP.
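For what it's worth, here's the perf-per-watt those two figures imply. This is my own back-of-envelope arithmetic, not a number from the review:

```python
# Relative efficiency implied by "+30% performance at ~15% less power"
perf = 1.30    # 6700 XT performance relative to the 5700 XT
power = 0.85   # 6700 XT power draw relative to the 5700 XT

perf_per_watt = perf / power
print(f"{(perf_per_watt - 1) * 100:.0f}% better perf/W")  # 53% better perf/W
```

Roughly 53% better performance per watt on the same node is why this is more than "just an overclock."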
 

NeoMorpheus

Posts: 681   +1,268
People, stop feeding the troll, please.

On the subject, this is very interesting.

At the same clock speed it's the same performance, but as pointed out above, what is the power consumption at that speed?

Also, is this simply a higher-clocked 5700, or is it really an RDNA 2 card?

So confusing.

Great review though.
 

neeyik

Posts: 1,877   +2,191
Staff member
Also, is this simply a higher clocked 5700 or it is really a RDNA2 card?

So confusing.
The changes AMD implemented in RDNA 2 were mostly about streamlining the pipeline to enable higher clocks to be achieved; the rest were about improving data transportation (necessary due to the clock increase), additional data format support, and ray tracing acceleration. In terms of everything else (CU structure, for example) very little changed from RDNA.

Given that both models tested here are 40 CU chips, the results are in line with expectations. Or rather, it shows that the Infinity Cache ties in with AMD's claims that it helps alleviate the load on the local memory system - if it didn't, then the 5700 XT would be ahead in every test due to its wider memory bus.
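For reference, the raw VRAM bandwidth gap behind that last point can be sketched like this. The bus widths and data rates below are the cards' public GDDR6 specs, not figures from the article:

```python
# Peak VRAM bandwidth = (bus width in bits / 8) * effective data rate in Gbps
def vram_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps  # GB/s

print(vram_gbs(256, 14))  # RX 5700 XT: 256-bit, 14 Gbps GDDR6 -> 448.0 GB/s
print(vram_gbs(192, 16))  # RX 6700 XT: 192-bit, 16 Gbps GDDR6 -> 384.0 GB/s
```

Despite having roughly 64 GB/s less raw VRAM bandwidth, the 6700 XT keeps pace clock-for-clock, which is the Infinity Cache doing its job.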
 

Lionvibez

Posts: 2,341   +1,884
I went looking for 4K performance charts.

Did you do any testing of this card’s 4K performance vs. the RTX 3090?
???
This card is not meant for 4K.
And what would be the point of comparing it to a 3090, which is like 3x the cost?
This request makes no sense at all.

So it’s an overclock for $80 more?

This card is a festering heap of garbage. It’s not reported here because as is now public knowledge, Techspot is quite strongly biased against Nvidia and for AMD because of the whole Editorialgate. I mean what were the temps of both cards at the same clock speeds? I’d imagine quite similar. But these 6700XT parts are hitting 100C according to other reviewers. I guess that’s what happens if all you do is overclock it over the last generation oh and add some more memory chips. So the card can’t ray trace, it has no DLSS or equivalent, loses to a 3060 ti in a lot of cases (without DLSS), which is cheaper (and cooler) and on top of all of that there are major driver issues surrounding the 6000 series reported by several other tech outlets (AC Valhalla is very unstable I hear for example).

You’re an ***** if you buy this part (you know, if things were normal). Nvidia may be a bag of dicks but that bag of dicks make much better graphics cards than AMD right now.
If the site is as biased as you claim, why do you keep coming back?
 

Irata

Posts: 1,539   +2,519
People, stop feeding the troll, please.

On the subject, this is very interesting.

At the same clock speed, its the same performance, but as pointed above, what is the power consumption at that speed?

Also, is this simply a higher clocked 5700 or it is really a RDNA2 card?

So confusing.

Great review though.

Found this from ExtremeTech:

[chart]

This is total system power consumption, but at the same clock speed and performance the 6700 XT uses 100 W less than the 5700 XT.

Another thing: if all the new DX12 Ultimate features (excluding RT) are used, we should most likely see even better performance from the 6700 XT.
 
This is why I'm super curious to see the performance in laptops. I expect it to be a very different fight against Nvidia in this space.


Mjswooosh

Posts: 30   +51
Performance is OK vs. the 5700 XT but underwhelming vs. the 3070. In any case, it's all academic. No stock means no real point, as nobody but bot-fueled scalpers will be able to buy them at MSRP. These articles mainly serve as a **** tease, nothing more.
 

Adi6293

Posts: 831   +1,121
So it’s an overclock for $80 more?

This card is a festering heap of garbage. It’s not reported here because as is now public knowledge, Techspot is quite strongly biased against Nvidia and for AMD because of the whole Editorialgate. I mean what were the temps of both cards at the same clock speeds? I’d imagine quite similar. But these 6700XT parts are hitting 100C according to other reviewers. I guess that’s what happens if all you do is overclock it over the last generation oh and add some more memory chips. So the card can’t ray trace, it has no DLSS or equivalent, loses to a 3060 ti in a lot of cases (without DLSS), which is cheaper (and cooler) and on top of all of that there are major driver issues surrounding the 6000 series reported by several other tech outlets (AC Valhalla is very unstable I hear for example).

You’re an ***** if you buy this part (you know, if things were normal). Nvidia may be a bag of dicks but that bag of dicks make much better graphics cards than AMD right now.

Jesus man chill out
 

NeoMorpheus

Posts: 681   +1,268
The changes AMD implemented in RDNA 2 were mostly about streamlining the pipeline to enable higher clocks to be achieved; the rest were about improving data transportation (necessary due to the clock increase), additional data format support, and ray tracing acceleration. In terms of everything else (CU structure, for example) very little changed from RDNA.

Given that both models tested here are 40 CU chips, the results are in line with expectations. Or more rather, it shows that the use of the Infinity Cache ties in with AMD's claims that it helps to alleviate the load on the local memory system - if it didn't, then the 5700 XT would be ahead in every test due to its wider memory bus.
Thanks for the info.

So, speaking as someone who knows very little about that part, don't GPUs have something equivalent to IPC like CPUs do?

For example, a 4-core CPU with higher IPC than another 4-core CPU will be faster, so does the same thing apply in the GPU realm?
 

NeoMorpheus

Posts: 681   +1,268
Found this from Extremetech:


This is total system power consumption, but at the same clock speed and performance the 6700XT uses 100W less than the 5700XT.

Another thing: if all new DX12 Ultra features (excluding RT) are used, we should most likely see even better performance for the 6700XT.
Thanks, that's very informative.

So in the end, it might be worth the extra cost for the new card. Of course, that's if it were available at MSRP.
 

neeyik

Posts: 1,877   +2,191
Staff member
So speaking as someone that knows very little on that part, GPUs' dont have something equivalent to IPC as CPUs does?

Example, a 4 core CPU with higher IPC than a 4 core CPU will be faster , so does the same thing applies on the GPU realm?
To a certain degree, yes. CPUs have to deal with a lot of highly variable, branching threads, whereas for the most part, GPUs don't (unless it's complex compute stuff).

The latter typically handles threads (sequences of operations) in batches of 32, all executing the same instruction at any given moment on 32 pieces of data. There's relatively little difference between today's GPUs in how quickly they issue and process those instructions - for example, an FP32 multiply can be issued in 1 cycle and processed in around 4; an FP64 multiply takes about ten times longer.

But those figures are heavily dependent on the data being ready to hand. As GPUs have to read and write vast amounts of data all the time, the flow of bits throughout the chip is critical to its IPC. In the case of the GPUs tested in this article, both have 40 Compute Units - each of which contains two 32-thread units. So, depending on the instruction, the chip could be trying to read or write 40 x 2 x 32 = 2560 32-bit data values at once. At a clock speed of 2 GHz, that would require a peak total internal bandwidth of 2 TB/s.

It sounds like a lot but in the case of the RX 6700 XT, it has a total peak theoretical bandwidth of 5.29 TB/s between the Level 1 and Level 2 caches. That sounds great and the combined peak bandwidth is even higher in some areas, but in reality, the actual bandwidth is a little less than this (and a lot less for the VRAM).

In short, a GPU's IPC is complex :)
 

Irata

Posts: 1,539   +2,519
Thanks for the info.

So speaking as someone that knows very little on that part, GPUs' dont have something equivalent to IPC as CPUs does?

Example, a 4 core CPU with higher IPC than a 4 core CPU will be faster , so does the same thing applies on the GPU realm?
It's a bit like with engines, where horsepower is torque x RPM. So an engine with more torque will have more hp at the same RPM. But if the lower-torque engine can do twice the RPM, it may end up having more hp.
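The analogy even works numerically. The engine figures below are made up for illustration; 5252 is the standard constant for converting lb-ft and RPM to horsepower:

```python
# hp = torque (lb-ft) * RPM / 5252
# (5252 = 33,000 ft-lbf/min per horsepower divided by 2*pi)
def horsepower(torque_lbft: float, rpm: float) -> float:
    return torque_lbft * rpm / 5252

print(round(horsepower(300, 4000)))  # big-torque engine: 228 hp
print(round(horsepower(150, 9000)))  # half the torque, higher revs: 257 hp
```

Half the "per-rev work" (torque, or IPC in the GPU analogy) but much higher revs (clock speed) ends up ahead, which is essentially RDNA 2's trade.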

Thanks, thats very informative.

so in the end, it might be worth the extra cost for the new card. Of course, if it was available at MSRP.
The 6700 XT is definitely a card that interests me, but as a $400 card. We'll see what prices are like if the market gets back to normal (hopefully).
 

NeoMorpheus

Posts: 681   +1,268
It's a bit like with engines where horsepower is torque x RPM. So an engine with more torque will have more hp at the same RPM. But if the lower torque engine can do twice the RPM, it may end up having more hp.


The 6700XT is definitely a card that interests me, but as a $400 card. We'll see what prices are if the market gets back to normal (hopefully).
Agreed.

I love AMD, but man, they really screwed up the pricing this round.

Then again, with the shortage and who's buying, we don't matter, since AMD can charge F U money for it and the current buyers have that kind of money to spend.