Radeon VII Mega Benchmark: 33 Games vs. RTX 2080, GTX 1080 Ti & Vega 64

Well, I'm happy with my Vega 64; I'm getting plenty of frames too. I run my GPU at 150 watts.
I lose maybe 10% performance but slash power in half.
AMD should have done that with this GPU but didn't.
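If you want to sanity-check that, the perf-per-watt math is straightforward; a minimal sketch, assuming the Vega 64's 295 W reference board power as the stock baseline and taking the numbers above at face value:

```python
# Quick perf-per-watt sanity check of the claim above.
# Assumption: 295 W reference board power for a stock RX Vega 64; the 150 W
# and ~10% figures are the ones quoted in the comment, not measured values.
stock_power_w = 295
tuned_power_w = 150
perf_loss = 0.10

stock_eff = 1.0 / stock_power_w                 # relative performance per watt, stock
tuned_eff = (1.0 - perf_loss) / tuned_power_w   # relative performance per watt, tuned

print(f"Perf/W gain: {tuned_eff / stock_eff:.2f}x")  # ~1.77x if the claim holds
```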
 
How come you didn't benchmark games that will come out next year? And the year after? Because THOSE are the games in which the Radeon will crush the 2080... Don't believe me? Ask the AMD fanboys who will be rushing to defend this :)
I won't defend them on the new card's specs or benchmarking for future usage, but I will say that I would give up 20% in FPS and some power efficiency if it is substantially less in $$$. I and many others look at what I am getting for the least amount of $$. Better to be slow and steady than broke and alone!!
 
I am really disappointed in Steve.

This review is either not complete or a complete farce. Why doesn't he do a set of graphs with statistics using MINIMUMS, and chart those games that way? Nobody cares how high their frames go; they only care how low they go, or whether they keep dipping.

Again, there were no frame times, and no testing/reviewing of the things that GAMERS hate in games... like tearing, or artifacts, or hindered gameplay, etc. Total absolute frame rate doesn't mean a single thing. How your frames bounce around (i.e., consistency) is more important than anything to gamers.

And having consistent frames and higher minimums is the single biggest goal of a gamer looking for performance. Dips are what cause us gamers problems, so dips are the problem.

You can't recommend a $699 FreeSync 2 gaming card..? With 1 TB/s of bandwidth and 16 GB of memory..? Absolutely perfect card for any high-end gamer looking to stay at 2K gaming on a 144 Hz FreeSync monitor. (The money you save on something like a Samsung QLED gaming monitor (32", 34" & 49") is substantial.)


Steve, you are reviewing a gaming GPU in a vacuum, unrelated to gamers. It is absurd, and this review comes off as cluttered with bias. (BTW, how does the 2080 work on newer/cheaper FreeSync monitors?)
 
This late in the day, it's "new arch or go home". AMD need the GPU equivalent of Ryzen, but so far they have done little more than the equivalent of holding Ryzen back and instead churning out the likes of an FX-10590 or FX-11950 - die shrinks of the FX-9590 - as late as 2018, 2019, etc. As for "well, it doesn't look TOO bad if you undervolt", are people seriously forgetting you can undervolt the competition too? E.g., a 120 W GTX 1060 runs at just 60 W (50% power) whilst losing only 12% FPS in Metro Last Light. If Nvidia fanboys then said "now let's compare that 60 W GTX 1060 to a stock RX 480 pulling 150-170 W, without attempting to undervolt the latter too", AMD fanboys would be howling at the moon over 'outrageous apples & oranges biased comparisons'...

"Undervolting so it looks half normal vs competition's stock even with a process advantage" is little more than an excuse to avoid admitting this is simply the wrong product at the wrong time, and AMD need to drag their core architectural design teams out from hiding under the bed, move onto a new arch already and start releasing the right products again...
 
Your point is based on an apples-and-oranges comparison; the two cards don't respond the same way.

You do not gain a massive thermal advantage when you undervolt a GTX 1060 like you do with a Radeon Vega. As such, I would gain essentially nothing by undervolting my RTX 2080. The Radeon VII is in fact a Radeon Instinct MI50, so we all know how this goes.
 
I am really disappointed in Steve.

This review is either not complete or a complete farce. Why doesn't he do a set of graphs with statistics using MINIMUMS, and chart those games that way? Nobody cares how high their frames go; they only care how low they go, or whether they keep dipping.

Again, there were no frame times, and no testing/reviewing of the things that GAMERS hate in games... like tearing, or artifacts, or hindered gameplay, etc. Total absolute frame rate doesn't mean a single thing. How your frames bounce around (i.e., consistency) is more important than anything to gamers.

And having consistent frames and higher minimums is the single biggest goal of a gamer looking for performance. Dips are what cause us gamers problems, so dips are the problem.

You can't recommend a $699 FreeSync 2 gaming card..? With 1 TB/s of bandwidth and 16 GB of memory..? Absolutely perfect card for any high-end gamer looking to stay at 2K gaming on a 144 Hz FreeSync monitor. (The money you save on something like a Samsung QLED gaming monitor (32", 34" & 49") is substantial.)


Steve, you are reviewing a gaming GPU in a vacuum, unrelated to gamers. It is absurd, and this review comes off as cluttered with bias. (BTW, how does the 2080 work on newer/cheaper FreeSync monitors?)

If you want a proper review, go over to techpowerup.com. Also, the Radeon VII is even more impressive than previous Radeons at mining: 103 MH/s for Vertcoin at 140 watts. I was doing 86.5 MH/s at an estimated 205 watts on a Radeon Vega Frontier Edition.

https://old.reddit.com/r/vertcoin/comments/apagvo/radeon_vii_140watts_at_the_wall_btw/
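In hashrate-per-watt terms that works out to roughly 1.7x the mining efficiency; a quick sketch using the figures above (the Vega FE wattage was an estimate rather than a wall measurement):

```python
# Hashrate-per-watt for the two cards, using the figures quoted above.
radeon_vii = {"mh_s": 103.0, "watts": 140}
vega_fe    = {"mh_s": 86.5,  "watts": 205}   # estimated wattage, not measured at the wall

for name, card in (("Radeon VII", radeon_vii), ("Vega Frontier Edition", vega_fe)):
    print(f"{name}: {card['mh_s'] / card['watts']:.3f} MH/s per watt")
# ~0.74 vs ~0.42 MH/s per watt, i.e. roughly 1.7x the mining efficiency.
```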
 
I am really disappointed in Steve.

This review is either not complete or a complete farce. Why doesn't he do a set of graphs with statistics using MINIMUMS, and chart those games that way? Nobody cares how high their frames go; they only care how low they go, or whether they keep dipping.

When I first started reading this I truly thought it was someone joking around... I was waiting for the punchline. These statements... it's hard for me to come up with the words to describe them. Is there anything anyone can put in front of you to show you how things currently are? I think everyone here is wishing for AMD to get back in the game and compete with Nvidia - it's the only way pricing will ever become reasonable again, but it's not happening with this type of release. If AMD is catering to the content creation market, then fine - but I'm pretty sure they are selling this as a gaming card, where it falls short for a new product in this price range.

I won't defend them on the new card's specs or benchmarking for future usage, but I will say that I would give up 20% in FPS and some power efficiency if it is substantially less in $$$. I and many others look at what I am getting for the least amount of $$. Better to be slow and steady than broke and alone!!

This is a technology enthusiast forum. I don't think slow and steady is the mantra for any enthusiast forum except maybe Stabilitysp0t. Users want to know what they can get at several price points (mainly theirs), what the alternatives are, and how they compare. If this is too pricey for the desired performance level, then you're either looking at the wrong price point or the wrong performance level. A 2080 Ti would be great at a budget price point, but that's just not how things are.
 
I might have missed an explanation somewhere, but why exactly are there no benchmarks at 4K with a card of this caliber?
 
I am really disappointed in Steve.
It sounds like you're disappointed in his conclusion.

Dips are what cause us gamers problems, so dips are the problem.
In the past 4-5 generations, AMD cards have suffered from dipping more than Nvidia cards.

You can't recommend a $699 FreeSync 2 gaming card..? With 1 TB/s of bandwidth and 16 GB of memory..?
For $700 there are better-performing cards, with (IMO) better drivers, more overclocking headroom and less noise/heat.

Absolutely perfect card for any high-end gamer looking to stay at 2K gaming on a 144 Hz FreeSync monitor. (The money you save on something like a Samsung QLED gaming monitor (32", 34" & 49") is substantial.)
Not anymore.
A good G-Sync monitor costs half of what it used to and can be had for $400 any day of the week.

It is absurd, and this review comes off as cluttered with bias.
I would say you're biased towards AMD and your opinion is cluttered with bias.
This $700 GPU is having trouble beating a $500 GPU (RTX 2070); the results are very clear for those who truly care to see them. It's not worth $700 when you compare it to the competition. AMD needed to factor in noise, heat and driver stability, and mostly, this GPU is getting whooped. It validated its asking price in ONE SINGLE GAME. ONE SINGLE GAME.
This is a $600-$650 GPU at best. Heck, even if they priced it $50 cheaper, at $650, it would still not be the best value.
You AMD guys are something else.
STOP BEING LOYAL TO A LOGO.
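A back-of-the-envelope cost-per-performance comparison makes the same point. Only the $699 and $500 price points come from this thread; the 10% performance lead granted to the Radeon VII here is an illustrative assumption, not review data:

```python
# Back-of-the-envelope value comparison for the pricing argument above.
# The relative performance figures are illustrative assumptions, not review data;
# only the list prices come from the thread.
gpus = {
    "Radeon VII": {"price_usd": 699, "rel_perf": 1.10},  # granted a ~10% lead for argument's sake
    "RTX 2070":   {"price_usd": 500, "rel_perf": 1.00},
}

for name, g in gpus.items():
    print(f"{name}: ${g['price_usd'] / g['rel_perf']:.0f} per unit of RTX 2070 performance")
# Even with a 10% lead granted, that is ~$635 vs ~$500 per unit of performance.
```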
 
How often was it claimed that Nvidia only got away with charging what they did for the 1080 Ti and onwards because there was no competition? I've said it before and I'll say it again: get used to these prices, they aren't coming down. Both manufacturers are releasing parts for top dollar now. At least Nvidia launched its new high-priced stuff with ray tracing, DLSS, etc. We don't really get anything from AMD. We can ***** and moan all we like; I don't think it will influence manufacturers to lower prices.

I remember the Radeon HD 7970 GHz Edition coming out and teaching Nvidia a lesson; things have changed an awful lot since then. AMD have let down the Radeon brand - time for it to be sold to a more competent company.
 
Well, I'm happy with my Vega 64; I'm getting plenty of frames too. I run my GPU at 150 watts.
I lose maybe 10% performance but slash power in half.
AMD should have done that with this GPU but didn't.

Those numbers don't really add up. Vega cards are normally power-starved (among other things), so undervolting doesn't directly lead to power savings. Instead, the card just has more room to "stretch its legs" and you'll see a performance increase while power draw remains the same (up to a point). In other words, you're either underutilizing the Vega 64, gimping it considerably through underclocking, or getting your wattage from software that's giving you incorrect readings. My undervolted (down to 1050 mV from 1200 mV) Vega 56 with a 10% power limit increase and a +150 MHz memory overclock increases system draw at the wall by ~300 W when fully utilized - and this number is the difference between 100% CPU load and 100% CPU load + 100% GPU load, so it's not because the GPU stress test also stresses the CPU. GPU-Z reports power usage as ~200 W (the normal 180 W TDP + the 10% power limit increase) under full load, but that's clearly either only for the GPU die itself or not based on actual measurement. I'm not seeing those numbers when gaming, but that's because I play at 1080p with Vsync (60 Hz), which means the card is underutilized practically all of the time.

Basically, AMD's choices were either the current Radeon VII or a Radeon VII with Vega 64-like performance and lower power consumption. Of those, the latter would have been utterly pointless. Now the Radeon VII at least provides an option for those who want something faster than a Vega 64 but do not wish to buy Nvidia's products. Sure, the value isn't great, but the price is pretty much the best AMD could do with the specs it decided on.
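As a hedged sketch of how the ~300 W wall-draw delta and the ~200 W GPU-Z figure described above can both be "right" (the PSU efficiency here is an assumed value for illustration, not a measurement):

```python
# Reconciling the ~300 W wall-draw increase with the ~200 W GPU-Z reading.
# The 90% PSU efficiency is an assumed figure; the other two numbers are the
# ones quoted above.
wall_delta_w   = 300   # extra draw at the wall with the GPU fully loaded
gpuz_power_w   = 200   # GPU-Z "GPU power" reading (die/package only)
psu_efficiency = 0.90  # assumed PSU efficiency at that load

dc_side_delta = wall_delta_w * psu_efficiency   # power actually delivered inside the case
unaccounted   = dc_side_delta - gpuz_power_w    # VRM losses, HBM, fans, extra board/CPU load

print(f"DC-side increase: ~{dc_side_delta:.0f} W")
print(f"Not captured by the GPU-Z reading: ~{unaccounted:.0f} W")
```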
 
Personally, I would have preferred if they had gone with GDDR6 instead of the expensive HBM2; it would have been cheaper and would look better overall. *Price vs Performance* I'm not really sure why they are staying with HBM if it is so damn expensive, because in the end we the consumers have to pay extra for it. :(
 
As I've noted on the other thread for this card, m3tavision is clearly an AMD fanboy - I wonder if he's HardReset in disguise. Bottom line here: as a GAMING card, which is what AMD has marketed it as, it fails at its current price point.

There is no way, RIGHT NOW, that anyone in their right mind can recommend a card that is slightly slower, louder and less energy efficient, yet costs the same amount of money.
 
Personally, I would have preferred if they had gone with GDDR6 instead of the expensive HBM2; it would have been cheaper and would look better overall. *Price vs Performance* I'm not really sure why they are staying with HBM if it is so damn expensive, because in the end we the consumers have to pay extra for it. :(

GDDR6 itself would have been cheaper than HBM2, but redesigning the GPU to use GDDR6 would not have been cheap. There are other reasons as well, but if you're interested, I'd suggest you check out Buildzoid's video on the subject. He rambles a bit, but it's still worth watching IMO:

https://www.youtube.com/watch?v=bXTSv50UBq4
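For context on why HBM2 is hard to replace on this card, the bandwidth arithmetic is simple. The HBM2 line below uses the Radeon VII's published 4096-bit / ~2 Gbps configuration; the GDDR6 line assumes an RTX 2080-style 256-bit bus at 14 Gbps purely as a point of comparison:

```python
# Simple memory bandwidth arithmetic for the HBM2 vs GDDR6 discussion above.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps  # bytes per effective transfer cycle

print(f"HBM2,  4096-bit @  2 Gbps: {bandwidth_gb_s(4096, 2.0):.0f} GB/s")   # ~1024 GB/s
print(f"GDDR6,  256-bit @ 14 Gbps: {bandwidth_gb_s(256, 14.0):.0f} GB/s")   # ~448 GB/s
```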
 
Well, I'm happy with my Vega 64; I'm getting plenty of frames too. I run my GPU at 150 watts.
I lose maybe 10% performance but slash power in half.
AMD should have done that with this GPU but didn't.

Those numbers don't really add up. Vega cards are normally power-starved (among other things), so undervolting doesn't directly lead to power savings. Instead, the card just has more room to "stretch its legs" and you'll see a performance increase while power draw remains the same (up to a point). In other words, you're either underutilizing the Vega 64, gimping it considerably through underclocking, or getting your wattage from software that's giving you incorrect readings. My undervolted (down to 1050 mV from 1200 mV) Vega 56 with a 10% power limit increase and a +150 MHz memory overclock increases system draw at the wall by ~300 W when fully utilized - and this number is the difference between 100% CPU load and 100% CPU load + 100% GPU load, so it's not because the GPU stress test also stresses the CPU. GPU-Z reports power usage as ~200 W (the normal 180 W TDP + the 10% power limit increase) under full load, but that's clearly either only for the GPU die itself or not based on actual measurement. I'm not seeing those numbers when gaming, but that's because I play at 1080p with Vsync (60 Hz), which means the card is underutilized practically all of the time.

Basically, AMD's choices were either the current Radeon VII or a Radeon VII with Vega 64-like performance and lower power consumption. Of those, the latter would have been utterly pointless. Now the Radeon VII at least provides an option for those who want something faster than a Vega 64 but do not wish to buy Nvidia's products. Sure, the value isn't great, but the price is pretty much the best AMD could do with the specs it decided on.
Well, for example, I drop my power target to -31% and only lose 12% performance after I overclock the HBM to 1100 MHz from 945. Honestly, the extra 8-pin on my Vega just isn't needed; I would have been happy losing 10% performance and gaining double the efficiency.
I don't touch the voltage, I leave it on auto, because the ~50 W gains there are too much bother with the random crashing, so I just let it do its own thing.
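Those percentages are easy to sanity-check. A small sketch, assuming a ~220 W stock GPU-only power limit for the Vega 64 (an assumed value, not a measurement); the percentages and memory clocks are the ones above:

```python
# Sanity check of the tuning described above.
stock_gpu_power_w = 220                           # assumed stock GPU-only power limit
tuned_power_w = stock_gpu_power_w * (1 - 0.31)    # -31% power target -> ~152 W
rel_perf = 1 - 0.12                               # claimed ~12% performance loss

print(f"GPU power limit: ~{tuned_power_w:.0f} W")
print(f"Perf/W vs stock: {rel_perf / (1 - 0.31):.2f}x")
print(f"HBM2 bandwidth uplift from 945 -> 1100 MHz: {(1100 / 945 - 1) * 100:.0f}%")
```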
 
I am really disappointed in Steve.

This review is either not complete or a complete farce. Why doesn't he do a set of graphs with statistics using MINIMUMS, and chart those games that way? Nobody cares how high their frames go; they only care how low they go, or whether they keep dipping.

Again, there were no frame times, and no testing/reviewing of the things that GAMERS hate in games... like tearing, or artifacts, or hindered gameplay, etc. Total absolute frame rate doesn't mean a single thing. How your frames bounce around (i.e., consistency) is more important than anything to gamers.

And having consistent frames and higher minimums is the single biggest goal of a gamer looking for performance. Dips are what cause us gamers problems, so dips are the problem.

You can't recommend a $699 FreeSync 2 gaming card..? With 1 TB/s of bandwidth and 16 GB of memory..? Absolutely perfect card for any high-end gamer looking to stay at 2K gaming on a 144 Hz FreeSync monitor. (The money you save on something like a Samsung QLED gaming monitor (32", 34" & 49") is substantial.)


Steve, you are reviewing a gaming GPU in a vacuum, unrelated to gamers. It is absurd, and this review comes off as cluttered with bias. (BTW, how does the 2080 work on newer/cheaper FreeSync monitors?)
I'm surprised you claim that you own an RTX 2080 and yet you're part of the AMD damage control team. Would you trade it in for a Radeon VII?
 