Let's unbox AMD's Radeon VII graphics card

Steve


AMD unveiled the Radeon VII during its CES 2019 keynote last month, claiming the new 7nm second-gen Vega GPU can boost performance 27 to 62 percent depending on the task. The card boasts 1 terabyte/s of memory bandwidth and 60 compute units running at up to 1.8GHz, and thanks to the process shrink, the chip can squeeze 25% more performance out of the same power envelope.
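As a sanity check on the headline number, the 1 terabyte/s figure falls straight out of the memory configuration. Note the 4,096-bit bus width and 2.0 Gbps per-pin rate below are the widely reported HBM2 specs for this card, not figures from this article:

```python
# Back-of-the-envelope check of the quoted 1 TB/s memory bandwidth.
# Assumed (widely reported) HBM2 config: 4 stacks x 1024-bit = 4096-bit bus,
# 2.0 Gbps effective transfer rate per pin.
bus_width_bits = 4 * 1024                            # four HBM2 stacks
per_pin_gbps = 2.0                                   # effective rate per pin
bandwidth_gb_s = bus_width_bits * per_pin_gbps / 8   # bits -> bytes
print(bandwidth_gb_s)                                # 1024.0 GB/s, ~1 TB/s
```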

With an expected retail price of $699, AMD is looking to directly compete with the GeForce RTX 2080 that has the same MSRP. Radeon VII is expected to perform between 25 percent and 35 percent faster than Vega 64 (on paper).

At launch, AMD will offer an attractive game bundle: free copies of the Resident Evil 2 remake, The Division 2, and Devil May Cry 5 ship with every Radeon VII card.

This may be one of the last things you'll hear about the Radeon VII before we get to the benchmarks later this week. The official release of the Radeon VII is slated for this Thursday, February 7, when you can expect our full review and benchmarks. For now you'll have to make do with Steve's unboxing and color commentary. It's an entertaining watch, so we'll leave you to it.


 
What is it with AMD and large acrylic packaging? Seeing overdone packaging makes me think of it as a way of compensating for the product itself. I hope I'm wrong with this one, and surely AMD fans are crying while watching this. I'm gonna leave my emotions till the benchmarks are out.
 
HBM should have been left aside for HPC applications. It costs a lot of money, takes up a lot of space on the package, and it is not energy efficient at all. They need to add a GDDR6 memory controller to Navi.
 
What is it with AMD and large acrylic packaging? Seeing overdone packaging makes me think of it as a way of compensating for the product itself. I hope I'm wrong with this one, and surely AMD fans are crying while watching this. I'm gonna leave my emotions till the benchmarks are out.

Wait, you actually think they ship the chip in an acrylic package separate from the PCB? It's not a DIY graphics card from AMD.

That's called a display unit.
 
HBM should have been left aside for HPC applications. It costs a lot of money, takes up a lot of space on the package, and it is not energy efficient at all. They need to add a GDDR6 memory controller to Navi.
You realize it actually takes up less space? There is literally no memory on the PCB at all; instead it sits around the die on the package. That is the reason for its huge bus bandwidth. One of the biggest draws of HBM is that it uses less power. GDDR5/X would have pushed Vega's power requirements and power envelope even higher.
 
HBM should have been left aside for HPC applications. It costs a lot of money, takes up a lot of space on the package, and it is not energy efficient at all. They need to add a GDDR6 memory controller to Navi.

AMD's HBCC can use any type of memory (Vega can theoretically use GDDR6). It is not an issue; it is a packaging and marketing decision. Navi will be marketed at the mainstream.

Secondly, the reason you use HBM2 is energy efficiency! And the cost (of HBM2) doesn't matter to MI50 & MI60 buyers, only the performance.
 
HBM2 is 20% more efficient than GDDR memory and takes up much less space on the PCB, which will eventually blossom and pay dividends in the future.
 
What is it with AMD and large acrylic packaging? Seeing overdone packaging makes me think of it as a way of compensating for the product itself. I hope I'm wrong with this one, and surely AMD fans are crying while watching this. I'm gonna leave my emotions till the benchmarks are out.

I believe this package is called a "press kit"; normal users will get only the GPU in the normal packaging.
 
I bought an RX 590 just to try it and returned it later. It was a bit faster than my old 1060 6GB but consumed up to 360W, about two times more than the 1060.

AMD is just desperate. You would have to be insane to buy their GPU: up to exactly 368W, which is more than a 2080 Ti, while offering 1060 performance plus a few percent.

Just LoL. Plus Wattman crashes once or twice a day. Terrible card.

This abomination won't be any different.
 
I bought an RX 590 just to try it and returned it later. It was a bit faster than my old 1060 6GB but consumed up to 360W, about two times more than the 1060.

AMD is just desperate. You would have to be insane to buy their GPU: up to exactly 368W, which is more than a 2080 Ti, while offering 1060 performance plus a few percent.

Just LoL. Plus Wattman crashes once or twice a day. Terrible card.

This abomination won't be any different.
http://www.sapphiretech.com/productdetial.asp?pid=653DD044-C784-46AC-AFF4-84881431E725&lang=eng

Sapphire is saying that it is a 300W card, higher than the 2080 and 2080 Ti's 250W.
I still think it will be a good product though. At least there is now competition for Nvidia's cards. If you have a need for the HBM2 or the terabyte/second bandwidth, it will be a great product for compute or GPU programming or the like. One thing I can say I didn't like is the price. It comes with some games, but I think it would be better competition if it were $100 cheaper. At the same time, HBM is expensive as well...
 
I bought an RX 590 just to try it and returned it later. It was a bit faster than my old 1060 6GB but consumed up to 360W, about two times more than the 1060.

AMD is just desperate. You would have to be insane to buy their GPU: up to exactly 368W, which is more than a 2080 Ti, while offering 1060 performance plus a few percent.

Just LoL. Plus Wattman crashes once or twice a day. Terrible card.

This abomination won't be any different.

Hey superstar, the Radeon Seven is 7nm...
Though, your comments are laughable and leave you with stained fingers. Thanks for the chuckles.
 
If the leaked power consumption figures are true, then this GPU is terrible. Normally when switching to a smaller process node you get more efficiency. But instead it looks like AMD's flagship 7nm part will perform worse and run hotter and hungrier than Nvidia's flagship 14nm parts. This isn't the competition we want.
 
They seem to have an efficiency problem, because at some point they had quite good efficiency (the HD 4000 to HD 7000 era). They need to improve their power-saving techniques, and maybe get inspiration from the Zen team.
 
If the leaked power consumption figures are true, then this GPU is terrible. Normally when switching to a smaller process node you get more efficiency. But instead it looks like AMD's flagship 7nm part will perform worse and run hotter and hungrier than Nvidia's flagship 14nm parts. This isn't the competition we want.

I understand that you are new to computer hardware, and perhaps you live in a country where electricity is costly. But please do understand that 90% of the people on TechSpot have ample electricity and are not worried about the electrical bill because of their GPU/computer.

Hence it is not an issue, or even a concern, for 90% of the people who shop for GPUs. Most higher-end GPUs require twin 8-pin power cables. GPUs typically consume lots of power; it has been that way for 10+ years.

Also, why even mention 50 watts and not what is gained with those 50 watts? (16GB of memory and 1TB/s of bandwidth.)
 
I understand that you are new to computer hardware, and perhaps you live in a country where electricity is costly. But please do understand that 90% of the people on TechSpot have ample electricity and are not worried about the electrical bill because of their GPU/computer.

Hence it is not an issue, or even a concern, for 90% of the people who shop for GPUs. Most higher-end GPUs require twin 8-pin power cables. GPUs typically consume lots of power; it has been that way for 10+ years.

Also, why even mention 50 watts and not what is gained with those 50 watts? (16GB of memory and 1TB/s of bandwidth.)
Hahaha, I've been building for about 25 years, mate, and judging from your ruthlessly biased comments I have a much better idea of what's going on than you do.

Energy consumption is hugely important for almost any product in the entire industry; it does have an effect on the environment, after all. It's also a good indicator of how well engineered a product is. Performance per watt is often a good way of comparing hardware.

But I guess my point with this particular card is this: if you are choosing between two cards and they are otherwise the same, yet one uses less power and produces less heat, then you would pick that one.

Maybe it's your lack of understanding, but most of the time when graphics cards use more power they also produce more heat, which leads to more cooling being required, which in turn often leads to more noise. Also, die shrinks usually improve efficiency significantly, but here we have 7nm drawing more power and running slower than the competitor's 14nm. It's a bit rubbish.
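The performance-per-watt comparison being argued in this exchange can be sketched in a few lines. The FPS and wattage numbers below are placeholders for illustration, not measurements of any card:

```python
# Hypothetical illustration of comparing cards by performance per watt.
# All numbers below are made up for the example.
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

efficient_card = perf_per_watt(avg_fps=100, board_power_w=250)  # 0.40 fps/W
hungry_card = perf_per_watt(avg_fps=100, board_power_w=300)     # ~0.33 fps/W
print(efficient_card > hungry_card)  # True: same FPS, less power wins
```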
 
Energy consumption is hugely important for almost any product in the entire industry; it does have an effect on the environment, after all. It's also a good indicator of how well engineered a product is. Performance per watt is often a good way of comparing hardware.

But I guess my point with this particular card is this: if you are choosing between two cards and they are otherwise the same, yet one uses less power and produces less heat, then you would pick that one.

Maybe it's your lack of understanding, but most of the time when graphics cards use more power they also produce more heat, which leads to more cooling being required, which in turn often leads to more noise. Also, die shrinks usually improve efficiency significantly, but here we have 7nm drawing more power and running slower than the competitor's 14nm. It's a bit rubbish.

Is it really slower though? In gaming, it may be, but what about for video editing and rendering? The 16GB of memory and 1TB/s of bandwidth can give it an edge in certain tasks. And at $0.10 per kilowatt-hour where I live, you are talking about only a $3.60 per month cost increase if you are running 24 hours a day versus a 2080 or 2080 Ti. If you can do a job in a shorter amount of time, then you are still saving money.
We will see the results tomorrow in reviews. I don't think the power consumption is really a big deal here.
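The $3.60/month figure in this comment checks out; here is the arithmetic spelled out, using the commenter's own assumptions (a constant 50 W extra draw, a 30-day month at 24 hours a day, and $0.10 per kWh):

```python
# Monthly electricity cost of an extra 50 W of draw, running 24 hours a day.
extra_draw_w = 50
hours_per_month = 24 * 30                            # 30-day month
price_per_kwh = 0.10                                 # USD, as quoted
extra_kwh = extra_draw_w * hours_per_month / 1000    # 36.0 kWh
monthly_cost = extra_kwh * price_per_kwh
print(round(monthly_cost, 2))                        # 3.6 -> $3.60 per month
```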
 
Some irony I enjoy: AMD, to me, is the better company. More progressive, works well with others, an excellent partner, ethical. Nvidia makes better products, but falls short in the aforementioned categories that are inherent to AMD's entire business. I must be part of the problem, because I won't buy inferior products no matter the price disparity.
 
What is it with AMD and large acrylic packaging? Seeing overdone packaging makes me think of it as a way of compensating for the product itself. I hope I'm wrong with this one, and surely AMD fans are crying while watching this. I'm gonna leave my emotions till the benchmarks are out.

It is a Review/Tester Media kit... not retail!

Secondly, I think my earlier prediction may come true: that AMD has nothing to lose and will offer up a cheaper version of the Radeon VII @ $599.
 
I understand that you are new to computer hardware, and perhaps you live in a country where electricity is costly. But please do understand that 90% of the people on TechSpot have ample electricity and are not worried about the electrical bill because of their GPU/computer.

Hence it is not an issue, or even a concern, for 90% of the people who shop for GPUs. Most higher-end GPUs require twin 8-pin power cables. GPUs typically consume lots of power; it has been that way for 10+ years.

Also, why even mention 50 watts and not what is gained with those 50 watts? (16GB of memory and 1TB/s of bandwidth.)
Hahaha, I've been building for about 25 years, mate, and judging from your ruthlessly biased comments I have a much better idea of what's going on than you do.

Energy consumption is hugely important for almost any product in the entire industry; it does have an effect on the environment, after all. It's also a good indicator of how well engineered a product is. Performance per watt is often a good way of comparing hardware.

But I guess my point with this particular card is this: if you are choosing between two cards and they are otherwise the same, yet one uses less power and produces less heat, then you would pick that one.

Maybe it's your lack of understanding, but most of the time when graphics cards use more power they also produce more heat, which leads to more cooling being required, which in turn often leads to more noise. Also, die shrinks usually improve efficiency significantly, but here we have 7nm drawing more power and running slower than the competitor's 14nm. It's a bit rubbish.

Ha hahah ahhhaaa…

I've been building systems since the MS-DOS days. And if you've been building rigs as long as you say you have, then why do you pretend not to know certain things, or skew them? Subsequently, now that you've revealed yourself in such a way (for what you are), I have no reason to play electricity games with you.

It is more than laughable that your main concern is not performance or capability (1TB/s of bandwidth)... your main concern, always and every time, is power draw? You hang all your posts on power draw and perhaps noise...

Which is utterly laughable, because you are trying to SELL us a marketing ploy (that heat and watts matter when it comes to performance). I own an RTX 2080 and will own a Radeon VII too.

You are not an end user, or a gamer, who cares about cost & frames.

All you are trying to say in every post is that you can't afford a Radeon 7nm, because it uses 50 watts more while having twice the memory and twice the bandwidth of the RTX 2080. The power usage argument is for plebs who can't afford a $700+ video card & PSU. Stop pretending that power draw matters when we are talking about high-end...
 
Ha hahah ahhhaaa…

I've been building systems since the MS-DOS days. And if you've been building rigs as long as you say you have, then why do you pretend not to know certain things, or skew them? Subsequently, now that you've revealed yourself in such a way (for what you are), I have no reason to play electricity games with you.

It is more than laughable that your main concern is not performance or capability (1TB/s of bandwidth)... your main concern, always and every time, is power draw? You hang all your posts on power draw and perhaps noise...

Which is utterly laughable, because you are trying to SELL us a marketing ploy (that heat and watts matter when it comes to performance). I own an RTX 2080 and will own a Radeon VII too.

You are not an end user, or a gamer, who cares about cost & frames.

All you are trying to say in every post is that you can't afford a Radeon 7nm, because it uses 50 watts more while having twice the memory and twice the bandwidth of the RTX 2080. The power usage argument is for plebs who can't afford a $700+ video card & PSU. Stop pretending that power draw matters when we are talking about high-end...
I firmly believe you haven’t been building for more than a few months at best. You don’t have a clue.
 
I firmly believe you haven’t been building for more than a few months at best. You don’t have a clue.

Really?

Bro, I have tons of old rigs. I was moving stuff around over the holidays and found a dusty ole DFI LANParty rig sitting in the basement closet. I have 25+ legit retail copies of everything from MS-DOS through Windows NT to Windows 10 Pro.

You are trying to kill the messenger because you don't like the message. Do you actually think wattage matters to someone who buys a $700 GPU? Do you think everybody buys cheap $80 PSUs? Or is that just your Nvidia talking point?

As if 250 watts is a buy, but 300 watts means no buy... lol! How do you justify such a rationale?
 