AMD Radeon RX Vega 56 Review

So at stock, the 56 and the 1070 look to be nearly equal. Too bad AMD was 15 months late to market, and far too expensive. The MSRP of the 1070 is currently $379 (correction, $349), and we have seen what the miners have done. Finding one of these Vegas under $550 will be impossible.

Even without the miners, the price is too high. The GPU is too late to market IMO, and it shouldn't be more expensive than its 1070 competitor.

I assume that under an overclock, Vega's efficiency flies out the window compared to the 1070? Also, any word on temperatures? Did AMD finally unscrew their cooler?
 
I kinda disagree with using an AMD card on an Intel Core i7. I think it's well known that Nvidia cards are optimized for Intel better than AMD cards are.

Everyone's raving about Threadripper, but this Vega isn't impressive to me. My build is a Core i9 with a Titan Xp. Vega isn't even as good as a 1080 Ti or 1080, or even the 1070 (in power consumption).

It's late to market and not as good a value as the Nvidia cards.

I'd only consider a Vega if I built a Threadripper system.
 
Damn, this looks bad. The only thing that makes me still consider Vega is the FreeSync vs G-Sync price difference. If you buy both a screen and a GPU, you can actually get some performance advantage for the same price with an AMD setup, but otherwise it's just bad.
 
I kinda disagree with using an AMD card on an Intel Core i7. I think it's well known that Nvidia cards are optimized for Intel better than AMD cards are.

Everyone's raving about Threadripper, but this Vega isn't impressive to me. My build is a Core i9 with a Titan Xp. Vega isn't even as good as a 1080 Ti or 1080, or even the 1070 (in power consumption).

It's late to market and not as good a value as the Nvidia cards.

I'd only consider a Vega if I built a Threadripper system.
If AMD chooses not to optimize their drivers for a CPU that commands 3/4 of the gaming PC market or more, then that is AMD's choice.

That doesn't mean we should misrepresent how AMD will perform on the majority of systems.
 
It's certainly good that we finally have some competition at these higher price points, but we feel like if you’re going to come to the party an entire product cycle late, you kind of have to hit it out of the park. That’s not what AMD has done here.
That's something I agree with. As gamers get older and have more money to spend, time becomes as important as value/price.
For example, we’ve seen previously the RX 480 lagging around 15% behind the GTX 1060 upon release. Months later that gap was closed to nothing and AMD's offering became considerably more attractive. RX Vega 56 is currently matching the more mature (and entirely awesome for the past 14 months) GTX 1070, but we can easily imagine it becoming some 10% faster before too long.
Does this put the "fine wine" debate to rest? I feel it does.
 
If AMD chooses not to optimize their drivers for a CPU that commands 3/4 of the gaming PC market or more, then that is AMD's choice.

That doesn't mean we should misrepresent how AMD will perform on the majority of systems.
The i7 does, or Intel in general?
 
If AMD chooses not to optimize their drivers for a CPU that commands 3/4 of the gaming PC market or more, then that is AMD's choice.

That doesn't mean we should misrepresent how AMD will perform on the majority of systems.


Yeah, but who says the "majority of systems" are Core i7?

I assure you - THEY AREN'T.
 
The i7 does, or Intel in general?
Intel in general. Ryzen is the first competitive thing AMD has had since 2011. Most gaming PCs are running Intel, and QP saying that TechSpot shouldn't test AMD on Intel systems because AMD is too lazy to optimize is ridiculous.
Yeah, but who says the "majority of systems" are Core i7?

I assure you - THEY AREN'T.
No, Intel controls 75% of the enthusiast (read: gaming PC) market, not just the Core i7.

I should have said "Intel" instead of "CPU".
 
Actually, the graphs show something more like a 50%, not 100%, power consumption increase, at least if we look at the whole picture (because comparing extreme peaks is pointless), which... still really sucks.
Depends on the reference point of your comparison: Vega 56 consumes around 100% more than the 1070, while the 1070 consumes around 50% less power than the 56.
 
Intel in general. Ryzen is the first competitive thing AMD has had since 2011. Most gaming PCs are running Intel, and QP saying that TechSpot shouldn't test AMD on Intel systems because AMD is too lazy to optimize is ridiculous.
That was the clarification I was expecting. Most surveys and estimates I see have the i3 and i5 as the main Intel CPUs, with Intel overall in the 80% range.
 
I kinda disagree with using an AMD card on an Intel Core i7. I think it's well known that Nvidia cards are optimized for Intel better than AMD cards are.

I don't think this is true, to be honest. Having looked at lots of footage from AMD GPU launch events over the years, they have been using Intel CPUs in their test systems, and why would they do that if it hindered the performance of the product they are trying to promote?

I would be interested to see if this is true though :)
 
With Nvidia's new GTX 2000 series just around the corner, I'm not so sure these cards should be taken seriously right now, especially given the pricing. Right at this very second I can get my paws on a brand new GTX 1070 for +/- $300 and a 1080 for about $450. Sure, that's a special price and it won't last forever, just until stocks are sold out.
 
Let's not argue about the CPU used. It's not going to make a big difference in terms of percentages by going with a Ryzen CPU.

TL;DR just go with whatever GPU is cheaper in your region and if the prices are similar just pick based on the games you want to play.

From what we could clearly see, the drivers are not yet mature. I expect a few games will get some performance uplifts after updates (7-10% is not unreasonable considering past AMD GPUs).

I would also love to see some compute benchmarks for Vega (OpenCL vs CUDA?). It seems to me that AMD invested big in the professional market this generation.
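
On the compute point, even a quick timed OpenCL kernel alongside a CUDA equivalent would be interesting. Here's a minimal sketch of what I mean, assuming PyOpenCL and numpy are installed and an OpenCL runtime for the card is present; the SAXPY kernel, problem size, and crude timing are just placeholders, not a proper benchmark:

```python
# Minimal OpenCL micro-benchmark sketch (assumes PyOpenCL + numpy and a
# working OpenCL runtime for the GPU; nothing here is vendor-specific).
import time
import numpy as np
import pyopencl as cl
import pyopencl.array as cl_array

ctx = cl.create_some_context()        # picks an OpenCL device, e.g. a Vega 56
queue = cl.CommandQueue(ctx)

n = 16_000_000
a = cl_array.to_device(queue, np.random.rand(n).astype(np.float32))
b = cl_array.to_device(queue, np.random.rand(n).astype(np.float32))
out = cl_array.empty_like(a)

# Simple SAXPY kernel: out = alpha * a + b
prg = cl.Program(ctx, """
__kernel void saxpy(__global const float *a, __global const float *b,
                    __global float *out, const float alpha)
{
    int i = get_global_id(0);
    out[i] = alpha * a[i] + b[i];
}
""").build()

queue.finish()
t0 = time.time()
prg.saxpy(queue, (n,), None, a.data, b.data, out.data, np.float32(2.0))
queue.finish()
print(f"SAXPY on {ctx.devices[0].name.strip()}: {(time.time() - t0) * 1e3:.2f} ms")
```

The CUDA side could run the same SAXPY so the two numbers are at least comparable.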
 
I think it's well known that Nvidia cards are optimized for Intel better than AMD cards are.
Never have I seen anyone claim such a thing. Can you provide evidence?
If there's one thing I believe I've read, though, it's that AMD cards are favored by CPUs with higher core counts.
 
Uhm... the 1070 is around 130-140 W, while the V56 uses around 210 W, or am I missing something important?
Sorry, I should have said I was speaking in general terms about which reference point one was using, and assumed both could be true (they can, but yours is a different assertion). Those links wouldn't pull up for me (filtered), but I've seen about the same: 130-140 W for the 1070 and 220-240 W for the 56.

Right now power and temp readings are all over the board across different sites. I like TechPowerUp's charts, which put them at 145 W and 229 W: the Vega 56 draws 57.93% more than the 1070, or equivalently the 1070 draws 36.68% less than the Vega 56.
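
For anyone checking the math, here's a quick sanity check of that reference-point arithmetic, using only the two TechPowerUp wattages quoted above (the percentages are just the standard formula applied both ways; nothing else is assumed):

```python
# Same power gap, two reference points (wattages as quoted above).
gtx_1070_w = 145.0   # GTX 1070 draw, watts
vega_56_w = 229.0    # RX Vega 56 draw, watts

more_than_1070 = (vega_56_w - gtx_1070_w) / gtx_1070_w * 100  # ~57.9%
less_than_vega = (vega_56_w - gtx_1070_w) / vega_56_w * 100   # ~36.7%

print(f"Vega 56 draws {more_than_1070:.2f}% more than the GTX 1070")
print(f"The GTX 1070 draws {less_than_vega:.2f}% less than the Vega 56")
```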
 
Great performance from AMD, really impressive. What would have been truly impressive is if it had come out 12 months ago to compete with the 1070 like it's doing now, 15 months later... Wait, I take back that first "impressive" comment: after 15 months you only just match your competitor and then don't even bother trying to undercut the price. Yeah, why did anyone think Vega was going to knock our socks off? This is only newsworthy because it's AMD doing something, and certain people seem to need that in their lives or they lose their minds and start hating everything good from the competition.
 
Erm... Vega 56 isn't very good, but it does stand up OK to the GTX 1070. It is probably a bit faster overall, and with a few driver improvements it'll likely have a small lead.

Really it comes down to accepting that it is a bit power hungry and, according to other tests, a bit noisy. After that it comes down more to price and availability. I wouldn't buy one until partners have better cooling solutions.

Having bought a Pascal card myself over a year ago, I find it hard to see the attraction for gamers today, and I don't think too many will be impressed. Miners will obviously love it, though. Until the bubble bursts, that is.
 
Like Ryzen, AMD shows up late to the market as the second option, yet they still have the gall to overprice their wares. Really, AMD?!? This is how you win friends? Any goodwill left over from the Socket 939 era has now completely evaporated.

AMD, if you are just going to be the me-too product with me-too, on-par performance, and that is really being generous (stuff like Ryzen bottlenecking the GPU, slower single-core/gaming performance, and the higher power consumption of Vega really qualifies as 90% me-too), then the prices will have to be significantly lower. Sure, price it high at the beginning and see how many gullible fanboy donations you can get, but market reality will soon assert itself. Just like the Ryzen 1800X, which in less than 6 months has dropped $150 from its $500 release price, see:
http://www.microcenter.com/product/476003/Ryzen_7_1800X_36_GHz_8_Core_AM4_Boxed_Processor

Which is still $100 too expensive, considering the R7-1700 offers basically the same performance as the 1800X and goes for $270 now:
http://www.microcenter.com/product/..._AM4_Boxed_Processor_with_Wraith_Spire_Cooler

What kind of deceit and bait-and-switch is AMD trying to pull on their customers? AMD used to deliver value, and it was clear and obvious to see. But now it's all marketing games. What the heck happened?
 
I kinda disagree with using an AMD card on an Intel Core i7. I think it's well known that Nvidia cards are optimized for Intel better than AMD cards are.

Actually, it's not well known because it's not true at all. That's an old wives' tale which is absolute BS. ATi wouldn't DARE leave Radeons unoptimised for Intel CPUs when Intel CPUs have recently held over 80% of the market. I think you might be alluding to how nVidia cards were having problems with Ryzen CPUs, their drivers showing worse gaming numbers than Radeons in the same performance bracket. There has NEVER been any evidence that Radeons run better with FX or Ryzen CPUs than they do on Intel Core CPUs.

It's late to market and not as good a value as the Nvidia cards.

Now THIS, I totally agree with. They're a year behind, they use more juice and they run hotter with the same performance as a GTX 1070. Regardless of why this is, it's just not a very compelling product.

I'd only consider a Vega if I built a Threadripper system.

It would only be worth it if you wanted it for its compute capabilities. As a gaming graphics card, it's not priced very well, which is odd for an ATi product. I'm sure that prices won't stay where they are though.
 