Radeon VII Mega Benchmark: 33 Games vs. RTX 2080, GTX 1080 Ti & Vega 64

Vega 64 running at 150 watts.
All I lost was 10 fps from stock clocks.


Thanks for the link. Looks like the card is running on quite low voltage and quite low clocks. Of course, if the power limit is -31% then that could be expected. The numbers still don't quite add up, though. If the power usage is 150W with that power limit, that would imply that the normal power usage is ~220W, whereas the reference TDP of a Vega 64 card is 295W. Depending on which Vega 64 card you have, the TDP may be different, but the ballpark should still be the same unless you have some sort of power saving BIOS enabled. Just out of curiosity, does that program you use show the card using ~300W at stock settings?
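For what it's worth, here's the arithmetic behind that implication as a rough Python sketch (the -31% target and 150 W reading are from the posts above; the 295 W figure is the reference Vega 64 board power, and actual draw varies by card and BIOS):

```python
# Back-of-the-envelope check of the power-limit math from the posts above.
power_limit = -0.31          # -31% power target
observed_draw_w = 150        # reported GPU power at that limit

implied_stock_w = observed_draw_w / (1 + power_limit)   # ~217 W
reference_tdp_w = 295                                    # reference Vega 64 board power

print(f"Implied stock draw: {implied_stock_w:.0f} W vs. reference TDP: {reference_tdp_w} W")
```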
 
Well, for example, I drop my power target to -31% and I only lose 12% performance after I overclock the HBM to 1100 MHz from 945. Honestly, the extra 8-pin on my Vega just isn't needed; I would have been happy losing 10% performance and gaining double the efficiency.
I don't touch the voltage, I leave it on auto, because the ~50 watt gains there are too much bother with the random crashing, so I just let it do its own thing.
So I can undervolt this card, still use more power than a 2080, and perform even slower? And it still costs the same?!?! Yeah, sounds like a winner to me...
 
As a gamer card, this is a loser compared to the RTX 2080.

However, people will scoop these up for good reason. Resale will be excellent: first for the great FP64 performance, and now for the capability to run professional graphics applications that were previously reserved for $3,000 workstation cards.
 
It seems that most here are even more short-sighted than those that bought the GTX 680 over the HD 7970, all in the name of 5% more fps. Three years later, those GTX 680 owners had to pay full price for their GPU upgrade, while the HD 7970 owners all sold their cards for enough to cover half of their upgrade.
 
Very good article. The inescapable conclusion is that the VII fits in between the RTX 2080 and 2070. But is that so bad? Does it justify granting the VII a 60% rating?

I just bought a Radeon VII from amd.com for $699 with free shipping. It's still not available on Newegg.

Here's my thinking. The REAL cost of the Radeon VII is $550, because the included game bundle is worth at least $150, maybe much more. That puts it between the RTX 2070 and RTX 2080 if you add in their respective game bundles. Radeon VII performance also fits between the 2070 and 2080. The VII obliterates the 2070 for only $75 more. The VII's power draw is a bit more than the RTX 2080's for slightly less performance, but it costs at least $100 less. Ergo, it's a decent value.

My only concern is noise due to fan ramp up. As I mentioned in a previous post, the IGN review claimed they could use Wattman to lower the fan curve so it's not a problem - "One other thing to note – this GPU ships with a really aggressive fan curve, making it super loud when running at full load. Again, echoes of Vega once again. However, the good news is it's totally unnecessary as the card has a lot of headroom and the cooling seems over-engineered. The card never got higher than 66C with the default fan curve in the Radeon Wattman software. I lowered the curve to the point where I could barely hear the GPU at all, and even then it never got hotter than 70C. I have no idea why the default fan speed is so high, but it's not necessary."

I'm keeping my fingers crossed! If the fans are not tamable, then I'll just have to get a Morpheus or go with water. Sure, it will increase the cost, but the card still performs almost on par with the RTX 2080, so who cares.

But here's the real lesson. When you buy computer equipment, you're really buying entertainment. The ability to overclock or play with cooling is a huge part of what you get. People will spend WAY more than they need for a Threadripper, play with aftermarket cooling, overclocking, mobos, etc... Admit it, people love their hobby computers.

The Radeon VII is a tinkerer's dream. Gamers Nexus pointed out that the Radeon VII is easy to open up, in contrast to the RTX series, which are really horrible to take apart.

It provides *almost* RTX 2080 performance, which is totally sufficient for my gaming needs. Ray tracing has not matured yet, so it's not an issue. It's a damn good-looking card. It's a super Bitcoin miner, should I want to try my hand at that. With the 16GB frame buffer and 1TB/s memory bandwidth, it will let me play with various video editing and workstation programs. Just think of the many hours of fun I'll be getting. I can hardly wait.

And it allows me to support AMD, a tiny company that has valiantly faced down its two giant competitors, Intel and Nvidia.

My rating for the Radeon VII is 100%, (subject to future modification) hehehehehe.
 
It seems that most here are even more short-sighted than those that bought the GTX 680 over the HD 7970, all in the name of 5% more fps. Three years later, those GTX 680 owners had to pay full price for their GPU upgrade, while the HD 7970 owners all sold their cards for enough to cover half of their upgrade.
Actually, they sell for virtually the same amount on ebay nowadays... but what proof can you offer that will assure me that this card will be worth more than the 2080 in 3 years?

No one can predict the future - yes, the 7970 was a great card that aged quite well... but that was also a LONG time ago! After seeing what AMD has offered on the GPU front over the past few years, I'd be very hesitant to purchase one - especially one that is slower, noisier and more power-hungry than its Nvidia counterpart.

Very good article. The inescapable conclusion is that the VII fits in between the RTX 2080 and 2070. But is that so bad? Does it justify granting the VII a 60% rating?

Yes, it is! Because I'm paying the same amount of money for an inferior card!

Here's my thinking. The REAL cost of the Radeon VII is $550, because the included game bundle is worth at least $150, maybe much more. That puts it between the RTX 2070 and RTX 2080 if you add in their respective game bundles. Radeon VII performance also fits between the 2070 and 2080. The VII obliterates the 2070 for only $75 more. The VII's power draw is a bit more than the RTX 2080's for slightly less performance, but it costs at least $100 less. Ergo, it's a decent value.

No, bundles do NOT make your card cheaper... the 2080 and 2070 come with their own bundles... but cost is cost! Both cards are $700 and one is slower than the other... WHY would you buy the slower one?!?!


My rating for the Radeon VII is 100%, (subject to future modification) hehehehehe.

Fortunately, you don't review video cards for a living, because you'd be fired within a day or 2!

Releasing an inferior product at the same price as its competitor does not rate 100% in any sane person's judgement.
 
The real cost of the Radeon VII is $700. Bundling 3 games I don't want or already have doesn't change the cost. You can't even easily resell the games, because AMD uses that silly hardware verification tool before they will give you the games, meaning you can only "sell" them to someone else who bought similar AMD hardware but didn't get the bundle. $700... no way to bend that to fit a smaller number.

Rooting for the underdog is all fine and dandy - as I mentioned I wish AMD was doing a better job competing. They've got some decent parts in the CPU market now after many years of not being a viable option but they still have some work to do in the GPU market when it comes to gaming. Good luck trying your hand at content creation for the first time and seeing a difference in performance between a Radeon VII and a 2080.
 
Actually, they sell for virtually the same amount on ebay nowadays... but what proof can you offer that will assure me that this card will be worth more than the 2080 in 3 years?

No one can predict the future - yes, the 7970 was a great card that aged quite well... but that was also a LONG time ago! After seeing what AMD has offered on the GPU front over the past few years, I'd be very hesitant to purchase one - especially one that is slower, noisier and more power-hungry than its Nvidia counterpart.

Well, if you upgrade every 8 years, then yeah, it won't matter. But I specifically gave the example of 3 years, which is closer to when high-end users upgrade.

AMD is slower than Nvidia? Oh, you don't say. I wonder how many are still happy with their R9 290/390 purchase compared to the GTX 970 3.5 GB owners. After that was Polaris, and despite the shortcomings, plenty are still happy with the performance they get. Same with Fiji owners. That thing is still kicking ***.

But yeah, AMD had nothing to compete with the GTX 1080 Ti and RTX 2080 Ti, despite only a small percentage of people being able to buy them, so screw AMD, right?
 
The last time I bought a card with a 3-game bundle, I believe I got like $70 for it on eBay. It can make a difference; it was a major tipping point for my $200-or-so card. BTW, the 2070 and 2080 come with a game.
 
Well, if you upgrade every 8 years, then yeah, it won't matter. But I specifically gave the example of 3 years, which is closer to when high-end users upgrade.

AMD is slower than Nvidia? Oh, you don't say. I wonder how many are still happy with their R9 290/390 purchase compared to the GTX 970 3.5 GB owners. After that was Polaris, and despite the shortcomings, plenty are still happy with the performance they get. Same with Fiji owners. That thing is still kicking ***.

But yeah, AMD had nothing to compete with the GTX 1080 Ti and RTX 2080 Ti, despite only a small percentage of people being able to buy them, so screw AMD, right?
Not my point.... I agree that the 7970 ended up being the better long term card.... but that was many years ago... the jury is still out on the 970 vs 390..... but you can’t base the longevity of the Radeon VII on either of those examples... there is absolutely 0 evidence that it will outperform the 2080 in 3 years! It might... it also might not... when buying a card, especially one for $700, you can’t buy based on what you think might be the case in 3 years!

RIGHT NOW, this card, as a gaming card, is a foolish purchase.
 
Vega 64 running at 150 watts.
All I lost was 10 fps from stock clocks.


Thanks for the link. Looks like the card is running on quite low voltage and quite low clocks. Of course, if the power limit is -31% then that could be expected. The numbers still don't quite add up, though. If the power usage is 150W with that power limit, that would imply that the normal power usage is ~220W, whereas the reference TDP of a Vega 64 card is 295W. Depending on which Vega 64 card you have, the TDP may be different, but the ballpark should still be the same unless you have some sort of power saving BIOS enabled. Just out of curiosity, does that program you use show the card using ~300W at stock settings?

No, it shows me 265 watts on the GPU and around 570 W at the wall at stock speeds, with 130 W at idle.
That's a high idle figure due to my 6 hard drives not sleeping.
It's 400 watts at the wall with 150 W of usage on the GPU.
This is with -43% power.

At -43% power vs. stock:
88 fps to 103 fps
150 watts to 265 watts
400 watts at the wall to 570 watts at the wall

Hope that helps.
I lose 15 frames and save 170 watts at the wall, but in Afterburner I'm only dropping 115 watts.
I'm saving 55 watts more at the wall than the software shows... odd.
 
Lowering the power target to -43%:
Why does the hardware info tool show the graphics card dropping by 115 watts, when in real life the actual saving is 170 watts at the wall with this power meter, which is pretty accurate?
I tested several electrical devices with it and the results were within margin of error.
What's that all about?
 
Lowering the power target to -43%:
Why does the hardware info tool show the graphics card dropping by 115 watts, when in real life the actual saving is 170 watts at the wall with this power meter, which is pretty accurate?
I tested several electrical devices with it and the results were within margin of error.
What's that all about?

AMD cards can only measure the power consumption of the GPU core alone, not the HBM and VRM losses (around 50-60 W); that's why. So, for example, if the TDP of the entire board is 350 W, reduced by 43% that's 200 W; take away 50 W for the HBM and VRM, and the GPU consumption as reported by Afterburner is 150 W. Another 20 W probably comes from the CPU, because it doesn't have to work as hard as before.
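In numbers, that accounting would look roughly like this (a sketch only; the 350 W board figure and the 50 W HBM/VRM share are the estimates from the post above, not measured values):

```python
# Rough breakdown of why the software-reported GPU power can be ~150 W
# while the board as a whole is limited much higher.
board_tdp_w = 350            # estimated total board power (poster's figure)
power_target = -0.43         # -43% power target

board_after_limit_w = board_tdp_w * (1 + power_target)      # ~200 W
hbm_vrm_share_w = 50         # estimated HBM + VRM power the GPU sensor doesn't count
gpu_core_reported_w = board_after_limit_w - hbm_vrm_share_w  # ~150 W, as Afterburner shows

print(f"Board after limit: ~{board_after_limit_w:.0f} W, "
      f"GPU core as reported: ~{gpu_core_reported_w:.0f} W")
```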
 
I am really disappointed in Steve.

This review is either not complete, or a complete farce. Why doesn't he do a set of graphs with statistics using MINIMUMS, and chart those games that way? Nobody cares how high their frames go; they only care how low they go, and how often they dip.

Again, there were no frame times, or testing/reviewing of the things that gamers hate in games... like tearing, artifacts, hindered gameplay, etc. The total/absolute frame rate doesn't mean a single thing. How your frames bounce around (i.e. consistency) is more important than anything to gamers.

And having consistent frames and fewer dips to bare minimums is the single biggest goal of a gamer looking for performance. Dips are what cause us gamers problems, so dips are the problem.

You can't recommend a $699 FreeSync 2 gaming card..? With 1 TB/s of bandwidth and 16 GB of memory..? It's an absolutely perfect card for any high-end gamer looking to stay at 2K gaming on a 144 Hz FreeSync monitor. (The money you save on something like a Samsung QLED gaming monitor (32", 34" & 49") is substantial.)


Steve, you are reviewing a gaming GPU in a vacuum, unrelated to gamers. It is absurd, and this review comes off cluttered with bias. (BTW, how does the 2080 work on newer/cheaper FreeSync monitors?)

I'm not disappointed in you, I came by looking for my daily dose of crazy and I bloody well got it, cheers mate.

BTW all the graphs feature 1% low performance, feel free to work out the average across the 33 games and get back to me ;)

You can download all the graphs here mate: https://www.patreon.com/posts/amd-radeon-vii-24597821
 
I'm not disappointed in you, I came by looking for my daily dose of crazy and I bloody well got it, cheers mate.

BTW all the graphs feature 1% low performance, feel free to work out the average across the 33 games and get back to me ;)

You can download all the graphs here mate: https://www.patreon.com/posts/amd-radeon-vii-24597821

Well Steve, since we all agree that you are the laziest benchmarker ever, I have done the work for 4k ultra results for those 33 games, which would have given every advantage to the Radeon 7.

Radeon 7: avg low - 50.5 fps ; avg.... avg? - 63.7 fps
RTX 2080: avg low - 56.0 fps ; avg avg - 67.2 fps
Advantage for the RTX: 10.9% on lows, 5.5% on averages. ** ok 8% and 8%. Adding these 4 columns of 33 numbers each was more than I could handle.

Both cards had 30 fps as the lowest low.
*Radeon 7 had 7 games with lows in the 30s. The RTX 2080 had only 5.

*edit made
** more edit
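For anyone who wants to redo that averaging themselves, here's a throwaway sketch (the per-game numbers below are placeholders, not the actual data read off the 33 graphs):

```python
# Hypothetical per-game results; substitute the 1% lows and averages from the graphs.
radeon_vii = {"lows": [55, 64, 42], "avgs": [70, 81, 55]}
rtx_2080   = {"lows": [67, 74, 44], "avgs": [76, 85, 58]}

def mean(values):
    return sum(values) / len(values)

for metric in ("lows", "avgs"):
    amd = mean(radeon_vii[metric])
    nvidia = mean(rtx_2080[metric])
    advantage = 100 * (nvidia / amd - 1)
    print(f"{metric}: Radeon VII {amd:.1f} fps, RTX 2080 {nvidia:.1f} fps, "
          f"RTX advantage {advantage:.1f}%")
```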
 
As for high-refresh-rate 1080p gaming, the toughest games were Assassin's Creed and Just Cause with respect to the lowest lows. The R7 hit 55 fps and 64 fps, respectively, while the RTX 2080 was at 67 fps and 74 fps.

Conclusion: Steve was clearly biased by testing a measly 33 games.
 
Well Steve, since we all agree that you are the laziest benchmarker ever, I have done the work for 4k ultra results for those 33 games, which would have given every advantage to the Radeon 7.

Radeon 7: avg low - 50.5 fps ; avg.... avg? - 63.7 fps
RTX 2080: avg low - 56.0 fps ; avg avg - 67.2 fps
Advantage for the RTX: 10.9% on lows, 5.5% on averages.

Both cards had 30 fps as the lowest low.
*Radeon 7 had 7 games with lows in the 30s. The RTX 2080 had only 5.

*edit made

Mate you have to make these fanboys work for it, you can't just show your workings out. Since you've spoilt my fun here are the results. ;)

[attached image: 33-game average results graph]


RTX 2080 is 8% faster on average for the average frame rate and 8% faster for the 1% low, MIND BLOWN! Sorry for hiding these results so I could set the narrative, shame on me!
 
Lowering the power target to -43%:
Why does the hardware info tool show the graphics card dropping by 115 watts, when in real life the actual saving is 170 watts at the wall with this power meter, which is pretty accurate?
I tested several electrical devices with it and the results were within margin of error.
What's that all about?

AMD cards can only measure the power consumption of the GPU core alone, not the HBM and VRM losses (around 50-60 W); that's why. So, for example, if the TDP of the entire board is 350 W, reduced by 43% that's 200 W; take away 50 W for the HBM and VRM, and the GPU consumption as reported by Afterburner is 150 W. Another 20 W probably comes from the CPU, because it doesn't have to work as hard as before.

CPU power draw stays the same.
Just stop pulling rubbish out of the air.
 
Lowering the power target to -43%:
Why does the hardware info tool show the graphics card dropping by 115 watts, when in real life the actual saving is 170 watts at the wall with this power meter, which is pretty accurate?
I tested several electrical devices with it and the results were within margin of error.
What's that all about?

I suggest you trust the power meter. Here's how I measured the power usage of my Vega 56:

1. Start Prime95 blend test to get full CPU utilization.
2. Check the power meter reading.
3. Leave the blend test running and start Unigine Superposition. The default settings should be fine.
4. Check the power meter reading.

The power meter readings will fluctuate, so you won't end up with an exact number, but you should quickly see if the difference between #2 and #4 is ~150W.

Btw, I'm not saying you shouldn't be happy with the ratio between power savings and performance loss with your settings. The absolute numbers just seemed odd to me, that's all.
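If it helps, the delta between those two readings is all you need; just keep in mind that wall readings include PSU conversion losses, so the DC-side figure is a bit lower. A rough sketch with placeholder readings (the 90% efficiency is an assumption, not a measurement):

```python
# Delta method from the steps above, with placeholder wall readings.
wall_cpu_only_w = 250        # step 2: Prime95 blend only (hypothetical reading)
wall_cpu_plus_gpu_w = 430    # step 4: Prime95 + Superposition (hypothetical reading)
psu_efficiency = 0.90        # assumed; the real value depends on the PSU and load point

wall_delta_w = wall_cpu_plus_gpu_w - wall_cpu_only_w   # extra draw the GPU load adds at the wall
dc_delta_w = wall_delta_w * psu_efficiency             # rough DC-side power behind that delta

print(f"GPU load adds ~{wall_delta_w} W at the wall, roughly {dc_delta_w:.0f} W on the DC side")
```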
 
Well, if you upgrade every 8 years, then yeah, it won't matter. But I specifically gave the example of 3 years, which is closer to when high-end users upgrade.

AMD is slower than Nvidia? Oh, you don't say. I wonder how many are still happy with their R9 290/390 purchase compared to the GTX 970 3.5 GB owners. After that was Polaris, and despite the shortcomings, plenty are still happy with the performance they get. Same with Fiji owners. That thing is still kicking ***.

But yeah, AMD had nothing to compete with the GTX 1080 Ti and RTX 2080 Ti, despite only a small percentage of people being able to buy them, so screw AMD, right?
I had a 290 but I upgraded to a 1070. Don't forget the power and heat of those Fiji cards were horrendous. Anyway, I think the limited VRAM was the big issue behind the 970 aging faster. The 980 and 980 Ti fared pretty well in the long run against the R9 Fury and R9 Fury X. I don't believe in FineWine, if that's what you were implying.
 
Only reason to purchase this card is if you care about supporting the underdog.
If you don't ever support the underdog, you will only get more of what you are getting now, such as the 2080 Ti prices.
In 10 years from today I see Nvidia cards costing double what they cost now, maybe even more.
Who will be to blame?
That'll be you, the customer.
 
Only reason to purchase this card is if you care .....

You only support the underdog when they behave like one. When AMD charges prices like Intel and Nvidia while underperforming and being 3 years late to market, that is not underdog behavior; that is just the latest me-too, screw-the-gamers marketing plan. There is no point in supporting another monster like that.
 
Cherry picking two graphs when so many are available isn't a good way to argue a point. Picking just two graphs, I could argue that Radeon VII competes with the RTX 2080 Ti (see Steve's benchmark results for Dirt 4 and Battlefield V), which is clearly a ridiculous proposition to anyone who's read the reviews. Let's just look at the averages over a large number of games, shall we?

The 33 game average shows that the Radeon VII is 7% slower than the RTX 2080. On the other hand the RTX 2070 review's 20 game average shows that the RTX 2070 is 18% slower than the RTX 2080. While the data set is not the same, it's pretty safe to say that on average the Radeon VII slots pretty neatly between the RTX 2070 and RTX 2080, in which case it's not exactly true that the Radeon VII has trouble beating the RTX 2070. Sure, it's still poor value in comparison - assuming you aren't playing only Dirt 4 and BF5 - but if we're talking purely about performance, the Radeon VII is the faster GPU.
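Putting those two averages on a common scale (a quick back-of-the-envelope; it mixes two different game sets, as noted above):

```python
# Normalise to RTX 2080 = 100 and see where the other two cards land.
rtx_2080 = 100.0
radeon_vii = rtx_2080 * (1 - 0.07)   # 7% slower per the 33-game average  -> 93
rtx_2070 = rtx_2080 * (1 - 0.18)     # 18% slower per the RTX 2070 review -> 82

print(f"RTX 2080: {rtx_2080:.0f}, Radeon VII: {radeon_vii:.0f}, RTX 2070: {rtx_2070:.0f}")
```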
 