2070S vs 5700XT Cost per Frame Analysis @ 1440p

amoeba00

I don't know if these are typos or bad math, but I found some discrepancies in how the cost per frame (cpf) is computed at 1440p in two TechSpot articles, and I'm hoping someone can provide some clarification.

(and before anyone comments - yeah, I had some extra time and am looking at either getting the 2070S or the 5700XT to upgrade my RX480 4GB)

The two articles in question are here:
https://www.techspot.com/review/1870-amd-radeon-rx-5700/
https://www.techspot.com/review/1902-geforce-rtx-2070-super-vs-radeon-5700-xt/

In the first one, it lists the following (card / avg frame rate / cpf):
2070S / 102 / $4.90
5700XT / 100 / $4.00

However, if you actually look at the charts in the article and crunch the numbers, they are a bit different:
2070S / 102.75 / $4.89
5700XT / 99.42 / $4.02

Looks like the 2070S was rounded down and the 5700XT rounded up (and it should have been the other way around).
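
For anyone following along, here's roughly what my spreadsheet is doing - a minimal Python sketch, where the fps values are placeholders rather than the article's actual chart data, and $500 is the 2070S reference MSRP I assumed:

```python
# Minimal sketch of the cost-per-frame (cpf) math: cpf = MSRP / mean fps.
def cost_per_frame(msrp, fps_results):
    mean_fps = sum(fps_results) / len(fps_results)
    return msrp / mean_fps, mean_fps

# Placeholder per-game 1440p averages, just to show the shape of the calc:
fps_2070s = [100, 95, 110, 105]

cpf, mean_fps = cost_per_frame(500, fps_2070s)
print(f"2070S: {mean_fps:.2f} fps avg -> ${cpf:.2f} per frame")
```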

Of additional interest: in the second article, the charts list different avg frame rates for the 2070S in a few games, while the 5700XT mostly stays the same (BF5 101 vs 103, Forza 104 vs 134, and WWZ 124 vs 192; in WWZ the 5700XT also changes, 116 vs 194). Taking those 13 listed games, you get the following:
2070S / 146.38 / $3.69
5700XT / 141.23 / $3.12

Now, the second article doesn't list the avg frame rates for all 37 games, but you can still back them out by dividing the card cost by the listed cpf (see the quick sketch below). You wind up with very different average frame rates, but the difference in cpf should be reasonably close - otherwise, why list those particular 13 games?
2070S / 116.13 / $4.65
5700XT / 110 / $4.00
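
That back-calculation is just the cpf formula inverted (mean fps = MSRP / cpf). Using the $540/$440 3rd-party MSRPs:

```python
# Back out the 37-game average fps from the listed cost per frame.
print(f"2070S:  {540 / 4.65:.2f} fps")   # -> 116.13
print(f"5700XT: {440 / 4.00:.2f} fps")   # -> 110.00
```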

So, if you re-calculate the first article's cpf using the different results from the second article, the cpf changes again:
2070S / 110.58 / $4.52
5700XT / 105.92 / $3.78

What all this means is that the biggest argument against getting the 2070S is bang for buck, but depending on which numbers you look at, the cpf difference is anywhere from $0.90 down to $0.57 - and at the low end of that range, the 2070S looks like a far better deal than the headline figures suggest.

Am I missing something?

Thanks!
 
I checked the first article's data and got the following statistics:

5700 XT, mean average fps = 100.0833; msrp $/fps = 3.99669
2070S, mean average fps = 101.5833; msrp $/fps = 4.922067

Rounding to 3 significant figures gives:

5700 XT, mean average fps = 100; msrp $/fps = 4.00
2070S, mean average fps = 102; msrp $/fps = 4.92
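
(As a quick sanity check, this is just MSRP divided by the mean, e.g.:)

```python
# Recompute $/fps from the means above, using the reference MSRPs.
cards = {"5700 XT": (400, 100.0833), "2070S": (500, 101.5833)}
for name, (msrp, mean_fps) in cards.items():
    print(f"{name}: ${msrp / mean_fps:.2f} per fps")
# -> 5700 XT: $4.00 per fps
# -> 2070S: $4.92 per fps
```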

So the article's figures are essentially correct - the 2070S's $/fps is just 2c out. For the second article, there isn't any kind of evil masterplan going on; it's just that if one takes a sample of 13 from a set of 37, there are 3,562,467,300 possible sample combinations, and each sample will produce a different estimate of the full set's mean.
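
(That combination count is just 37-choose-13, if anyone wants to verify it:)

```python
import math

# Number of distinct 13-game samples from a 37-game set: C(37, 13)
print(math.comb(37, 13))  # -> 3562467300
```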

The 13-game sample shown in the article gives the following statistics (rounded to 3 significant figures):

5700 XT, mean average fps = 141; msrp $/fps = 2.83
2070S, mean average fps = 146; msrp $/fps = 3.42

However, the standard deviations of the data in the two articles are vastly different: between 16.5 and 19.3 fps for the first one, and between 73.4 and 80.7 fps for the second. This is due to the enormous frame rates achieved in CS:GO; it's essentially an outlier, so it really should be removed for any kind of statistical analysis. If you do, this is what you get:

5700 XT, mean average fps = 120; msrp $/fps = 3.35
2070S, mean average fps = 127; msrp $/fps = 3.94

Without CS:GO included in the 13-game sample, the standard deviation drops to 29.4-30.2 fps; this is 'better' but still significantly larger than that seen in the first article. I've not had the time to look at the full 37-game data set, but I don't think it really matters, as any sample that can realistically be tested by a small team in a limited amount of time is always going to be an estimate of the population (i.e. all games that the two graphics cards could possibly run). There will always be some variation in the statistics generated by a relatively small sample, but even this brief analysis concurs with the articles' comments pertaining to 'bang for buck'.
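
If anyone wants to replicate the outlier check, it only takes a few lines - note that the fps values below are made up for illustration, not the article's actual data:

```python
import statistics

def summarize(fps_results):
    """Mean and sample standard deviation of per-game average fps."""
    return statistics.mean(fps_results), statistics.stdev(fps_results)

# Made-up per-game 1440p averages, with one CS:GO-like outlier at the end:
fps = [95, 110, 124, 131, 102, 88, 144, 117, 380]

print("with outlier:    mean = %.1f fps, stdev = %.1f fps" % summarize(fps))
print("without outlier: mean = %.1f fps, stdev = %.1f fps" % summarize(fps[:-1]))
```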
 
I certainly didn't mean to imply there was anything nefarious going on - I just threw those values into a spreadsheet and let it do the simple math. The only difference with the second article's cpf was that I used the article's MSRPs of $440/$540 instead of $400/$500, because the tests would have been run with the 3rd-party cards instead of the original reference ones.

Of course, maybe that's the other point with the discrepancies - the 3rd-party 2070S scored better than the reference Nvidia one, while there isn't much of a difference between the 3rd-party 5700XT and the original blower-style AMD one.

Thanks for the detailed response. Much appreciated.
 
I really like it when readers properly delve into the articles we all produce here, and I'm just as appreciative of the time you've afforded the analysis. :)

I'd missed that the msrp was different in the 2nd article, so my figures are a bit messed up; I suspect that the 5700 XT would still just have the edge on the bang-for-buck calculations, but either way, you're right in pointing out that it is very close.
 
Of course it would be a much easier decision if there were more non-reference 5700XT cards available, this wasn't v1 of Navi, and I wasn't itchin' to make a purchase right now.

I'm guessing if I find a great deal on a 2070 Super - I'll go that route, but if the AMD inventory gets better - then I still may go with the 5700XT. Either way, at the end of the day - I'm sure I'll be happy with either. Thanks again!
 
Wound up going with the Gigabyte 2070 Super Gaming OC 3x since it was $490 (including tax) after rebate. Given that the 5700XT cards I wanted were going for $450-470 (not including tax), it made more sense to go green, even though I wanted to stick with team red.
 