AMD Vega packs next-gen compute units, HBM2 support

Scorpus


AMD's Vega GPU is just around the corner, and it's a much larger architectural upgrade than we expected. AMD divulged a bunch of new details about Vega at their Tech Summit last month, and while we don't have any graphics card information or specifications at this stage, AMD fans have a lot to be excited about in the coming months.

For much of 2016, it was assumed that Vega was simply a larger version of Polaris with more compute units, complete with High Bandwidth Memory. It's now clear that Vega contains significant architectural improvements that will make AMD's next-generation graphics cards more powerful and more capable. In fact, it now seems that Polaris was a mere stepping stone as AMD transitioned their flagship GPUs from Fiji to Vega.

While Vega reportedly contains more than 200 new features, AMD spent time at their Tech Summit detailing the four most important changes. The biggest of these is a complete overhaul of their memory controller and structure, including the introduction of a High-Bandwidth Cache Controller (HBCC) and High-Bandwidth Cache. 

One of the problems AMD identified with their last-generation GPUs is that some applications need to access more data than fits in VRAM. Compute applications in particular, along with professional rendering tools, are the most susceptible to these slowdowns. Why? Traditionally, if a GPU wants to access data outside its VRAM, it must first pause and transfer that data from system RAM or SSD/HDD storage into VRAM before any processing can take place.
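
To make the traditional model concrete, here is a minimal CUDA-style sketch of the copy-then-compute pattern described above. (AMD GPUs are programmed through OpenCL and similar APIs rather than CUDA, and the function and buffer names here are illustrative; the point is only that the working set must be staged into VRAM up front.)

```cuda
#include <cuda_runtime.h>

// Traditional model: the entire working set is copied into VRAM before
// the GPU can touch it. If the data doesn't fit, the application has to
// stall, stream it in chunks, and shuffle results back and forth.
void process_on_gpu(const float *host_data, size_t n)
{
    float *dev_data;
    cudaMalloc(&dev_data, n * sizeof(float));

    // The GPU sits idle while this transfer completes.
    cudaMemcpy(dev_data, host_data, n * sizeof(float),
               cudaMemcpyHostToDevice);

    // ... launch kernels against dev_data ...

    cudaFree(dev_data);
}
```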

Vega changes this through the inclusion of a High-Bandwidth Cache Controller. AMD hasn't gone into detail on exactly how this controller works, but it appears to give the GPU easier access to off-card storage such as system DRAM, non-volatile memory like SSDs, and even network storage. Vega supports a 512 TB virtual memory address space (2^49 bytes, i.e. a 49-bit address space), which is far larger than any on-board VRAM solution, making it well suited to big data applications.
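
Nvidia's unified memory on Pascal exposes a broadly similar idea in software, which makes for a useful analogy: allocate a buffer larger than the card's physical VRAM, and pages migrate to the GPU on demand as kernels touch them. A minimal CUDA sketch under that assumption (Vega's HBCC would do this transparently in hardware, so this is an analogy, not AMD's API):

```cuda
#include <cuda_runtime.h>

__global__ void touch(float *data, size_t n)
{
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n)
        data[i] += 1.0f;  // faulting pages migrate to the GPU on access
}

int main()
{
    // Oversubscribe: ask for more memory than the card physically has.
    // With demand paging, the driver moves pages in as they're touched
    // instead of failing or forcing an up-front copy of everything.
    size_t n = (size_t)16 << 30;  // ~64 GB of floats, illustrative
    float *data;
    cudaMallocManaged(&data, n * sizeof(float));

    unsigned int blocks = (unsigned int)((n + 255) / 256);
    touch<<<blocks, 256>>>(data, n);
    cudaDeviceSynchronize();

    cudaFree(data);
    return 0;
}
```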

To demonstrate how useful the HBCC is, AMD showed off a scene being rendered on a Vega card using Radeon ProRender. The scene used hundreds of gigabytes of data and was rendered in real time; even though the demonstration ran at only a few frames per second, AMD claimed that previous cards would have taken upwards of an hour to render each frame. Presumably these speed improvements come from a drastic reduction in memory transfers per frame.

The HBCC also supports adaptive, fine-grained data movement from external memory into Vega's on-package memory, which AMD is calling the High-Bandwidth Cache. To be clear, High-Bandwidth Cache is simply a new name for VRAM, which in Vega will be HBM2 stacked alongside the GPU on the same package. To speed up processing, the HBCC will dynamically shift the most scene- or compute-relevant data into Vega's HBM2.
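
In software terms this is like prefetching hot pages into fast memory before they are needed. CUDA 8's unified memory API offers a manual version of the idea; the sketch below is again only an analogy (the HBCC decides what to move on its own, at page granularity, with no application involvement), and `managed_buf`/`hot_bytes` are illustrative names:

```cuda
#include <cuda_runtime.h>

// Manually stage the data the next frame is expected to touch into
// device memory. The HBCC automates this decision in hardware.
void stage_hot_data(float *managed_buf, size_t hot_bytes, cudaStream_t stream)
{
    int device;
    cudaGetDevice(&device);

    // Asynchronously migrate the hot pages toward the GPU.
    cudaMemPrefetchAsync(managed_buf, hot_bytes, device, stream);
}
```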

While the HBCC will have the biggest implications for compute workloads, AMD did suggest it will have some use in games. AMD showed a graph illustrating that games allocate a lot of memory for any scene, but access only a fraction of it to render a frame. In the future, we could see games make use of HBCC-like technology to access far more data per frame, allocating more memory than the cache/VRAM can physically hold and relying on the controller to page in what each frame actually touches.

Vega comes with a next-generation compute unit called the Vega NCU. It supports 128 32-bit, 256 16-bit, or 512 8-bit operations per clock; the latter two are particularly important for deep learning, and this is where Vega will boast significantly improved performance over older architectures. The NCU also supports flexible mixed-precision operation and a configurable double-precision rate.
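
The arithmetic behind those figures is straightforward: a GCN compute unit contains 64 ALUs, and a fused multiply-add counts as two operations, giving 64 x 2 = 128 32-bit ops per clock. Packing two FP16 values into each 32-bit lane doubles that to 256, and four INT8 values quadruple it to 512. Vega's packed math is exposed through AMD's own compiler stack, but CUDA's half2 intrinsics illustrate the same pack-two-values-per-register idea:

```cuda
#include <cuda_fp16.h>

// Each 32-bit register lane holds two FP16 values; a single __hfma2
// issues two fused multiply-adds, doubling throughput per lane
// compared with one FP32 FMA.
__global__ void packed_fp16_fma(const __half2 *a, const __half2 *b,
                                const __half2 *c, __half2 *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = __hfma2(a[i], b[i], c[i]);  // two (a*b + c) ops at once
}
```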

Aside from the improvements to half- and quarter-precision performance, AMD claims the NCU is optimized for higher clock speeds and higher IPC, with a larger instruction buffer and improved scheduling.

Vega includes a new programmable geometry pipeline that offers over twice the peak throughput per clock. At its core is a new primitive shader, which sits alongside existing stages like the vertex shader and compute shader. It can launch threads at the same rate as the compute shader, and the pipeline in general offers improved load balancing, an issue console developers highlighted during their time working with previous GCN architectures.
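
AMD hasn't published the primitive shader's programming model, but a classic job it could take over is discarding triangles that can never be seen before the rest of the pipeline wastes work on them. A hypothetical compute-style sketch of that idea (the struct layout and winding convention are assumptions for illustration):

```cuda
#include <cuda_runtime.h>

// Cull back-facing triangles in a compute-style pass so later pipeline
// stages never see them. Positions are in screen space, and we assume
// counter-clockwise winding means front-facing.
struct Tri { float2 v0, v1, v2; };

__global__ void cull_backfaces(const Tri *in, Tri *out, int *out_count, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    Tri t = in[i];
    // Twice the signed area; <= 0 means back-facing or degenerate.
    float area2 = (t.v1.x - t.v0.x) * (t.v2.y - t.v0.y)
                - (t.v2.x - t.v0.x) * (t.v1.y - t.v0.y);
    if (area2 > 0.0f)
        out[atomicAdd(out_count, 1)] = t;  // compact the survivors
}
```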

Developers will have to specifically target this new shader to get the most out of the Vega architecture, so it's not necessarily something we'll see utilized often, particularly for gaming workloads.

AMD has revamped the pixel engine in Vega as well, introducing a new rasterizer called the Draw Stream Binning Rasterizer. This rasterizer improves performance and power consumption by fetching overlapping primitives only once and shading only the pixels actually visible on screen. The pixel engine also now has access to the L2 cache, which improves performance in deferred shading applications.
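
AMD hasn't detailed the DSBR's internals, but the general idea of a binning rasterizer is easy to sketch: sort primitives into screen-space tiles first, so fetching and shading only happen for the tiles a primitive can actually touch. A hypothetical sketch of the binning step (tile size and data layout are assumptions for illustration):

```cuda
#include <cuda_runtime.h>

// Bin each triangle's screen-space bounding box into 32x32-pixel tiles.
// A real binning rasterizer would then process one tile at a time,
// fetching each overlapping primitive once and shading only visible pixels.
#define TILE 32

__global__ void bin_triangles(const float2 *verts, int ntris,
                              unsigned int *tile_counts,
                              int tiles_x, int tiles_y)
{
    int t = blockIdx.x * blockDim.x + threadIdx.x;
    if (t >= ntris) return;

    float2 a = verts[3 * t], b = verts[3 * t + 1], c = verts[3 * t + 2];
    int x0 = max((int)(fminf(fminf(a.x, b.x), c.x)) / TILE, 0);
    int y0 = max((int)(fminf(fminf(a.y, b.y), c.y)) / TILE, 0);
    int x1 = min((int)(fmaxf(fmaxf(a.x, b.x), c.x)) / TILE, tiles_x - 1);
    int y1 = min((int)(fmaxf(fmaxf(a.y, b.y), c.y)) / TILE, tiles_y - 1);

    for (int y = y0; y <= y1; ++y)
        for (int x = x0; x <= x1; ++x)
            atomicAdd(&tile_counts[y * tiles_x + x], 1u);  // triangles per tile
}
```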

Basically, the pixel engine's improvements let the GPU spend its time on work that actually contributes to the final frame, while skipping work that doesn't.

It could still be a little while before we see Vega graphics cards on the market, but AMD did have a working sample at their event, where they showed off Doom gameplay at 4K with ultra settings. In their demonstration, which was put together in a few weeks using beta drivers with little optimization, Vega achieved around 60 to 70 FPS, placing it in the same performance bracket as Nvidia's GeForce GTX 1080.

Vega sounds like a promising upgrade over AMD's past architectures, both for compute and gaming workloads. We'll have to wait a little while longer for AMD to reveal their consumer graphics card line-up and the corresponding specifications, but it's clear that AMD will have a high-performing flagship card in just a few months' time.


 
AMD fans have a lot to be excited about in the coming months
AMD is really building up hype? It's not like that has blown up in their faces before or anything...
They delivered on Polaris: they gave exactly what they promised. Also, given some statements they've been making (such as 'poor Volta' in their After The Uprising YouTube video), they should really be embarrassed if they got this wrong. It wouldn't surprise me if the flagship card matched or was 5-10% faster than the 1080 Ti.
 
Actually, knowing AMD, it would not surprise me if their flagship card were 5-10% slower than the 1080 (non-Ti), but then again it's just speculation at this point.
 
This very much reminds me of the movie Contact.

[attached image: vega.jpg]


Is AMD's advertising campaign any better, though?

Judging by the products they have been releasing over the years, the promise always surpasses the reality.

We'll see what happens this time; it could add another Tera-Flop (minus the Tera part) to their list :)

[attached image: bust.jpg]
 
They had a live demo running Doom @ 4k 60-70fps. So similar performance to the 1080.

That's absolutely fine. If it uses the same amount of power or slightly more, that's also fine. Just keep the price below Nvidia's and you'll make a killing!
 
Impressive. Good for gamers and VR. Probably good for Photoshop aficionados, too. Pricing, pricing, pricing of the cards? Oh, wait! I forgot. This is CES, where the product tease is what matters most. Just like Comdex, which CES replaced.
 
I'll wait for the reviews to be sure, but right now it looks like I may go red for the first time. I'm ready to ditch nVidia's shoddy drivers.
It is funny because just a few years back AMD had the (more) unreliable drivers. Funny how things change.
 
What exactly makes a driver "shoddy"? I've been through several dozen since I built my original rig in 2012 and haven't had a problem at all. I download the latest, do a clean install, and everything works fine. What problems are you having, and are you sure the drivers are the issue? Why am I not having issues using the same drivers you are?
 
Personally, I've never been that much of an nVidia fanboy (nV*****?) so I'm in your corner. I've always felt that AMD/ATi cards offered greater value/$.
 
I guess "value" is subjective; to me a card is of no value- regardless of price- if it doesn't perform the way I want it to. A Mustang is a better value than a Lamborghini, but it's not a Lamborghini.

Nvidia's top end cards have been outperforming AMD's top cards for several years, and as someone who want's the best performance I can get my hands on (at least, for under $1000), that only leaves me the green team option. We'll see how Vega stacks up against the 1080 Ti - if it's better, I'll happily go to the red side.
 

Lots of game crashes and freezes. I do clean installs, don't install what I don't need (3D Vision software in this case), keep everything up to date - general 'good idea' stuff when it comes to drivers. But I have trouble with Overwatch, Civ 6 and Civ 5 (Civ 5 didn't used to be an issue), Offworld Trading Company (didn't used to be an issue), BF3 (didn't used to be an issue), and Elite: Dangerous (always been an issue). I use a GTX 760 with a 1080p/60 Hz monitor, 16GB of RAM and an SSD. I keep the settings modest; whatever gets me ~60fps, no more, no less. I've been having issues since around late October, which is around when NVidia driver problems started becoming more common among the user base as a whole (judging by their forum posts).

It didn't used to be this way. Even being a 'mid-range' buyer, I used to go with NVidia over AMD/ATI because of their drivers. NVidia had their act together and you knew that you didn't have to worry about things like game crashes and freezes. As far as I'm concerned, you can be buying a 'sports car' or a 'family sedan', but it's just a hunk of metal if the engine doesn't work flawlessly 99.99% of the time.

Personally, I suspect it's something they are doing for the Pascal architecture that is causing issues with both older cards and newer cards, but for different reasons. Let's call it 'incompatibility' for old cards and 'buggy' for new ones, because things at least seem to be getting better for anyone with a 10xx series card.
 
Ah, okay.

I do get the occasional crash/freeze but always assumed it was either a buggy game or an issue with my overclocked CPU and GPUs, or my SLI 670 setup. I just got around to playing the remastered Bioshock and Bioshock 2 games since they were free to owners of the originals. The first Bioshock was freezing on me about every hour - but there were posts all over the internet saying everyone was having problems with that title. Bioshock 2 ran nearly flawlessly, only crashing twice in 20-some hours of gaming.

Constant crashes can also be caused by troubled motherboards, unstable OCs or overworked PSUs, but it's certainly plausible that it could be a driver issue. Being that I'm flogging Keplers from 2012 (and still pretty happy with them, but waiting for the 1080ti), you may be right that there are just too many architectures being covered by one blanket.
 

No overclock to speak of, CPU cooled with an AIO, GPU is EVGA with stock (dust-free) cooler, using an 800W Seasonic PSU (I used to have a massive 6x4TB HDD RAID in this thing for my photography work - it needed the juice), and it handles intensive GPU-free Matlab workloads just fine. Everything was solid until just after the Pascal launch.

I had similar issues when I was running a GTX 260. After a while, nVidia drivers just stop being happy on older silicon. That's perfectly understandable after a few generations; I usually take it as my cue to upgrade. But this time, there are also a lot of people complaining about instability on Pascal chips too. So now I'm curious about what AMD is about to bring to the table.
 
Interesting- I guess I never paid much attention to drivers and focused mainly on hardware.

I'm waiting for AMD's cards too, along with the 1080ti. I'll consider either 'team' if performance is similar on the high-end cards, and if it is, my decision may come down to power consumption alone - I have a Seasonic X650 Gold and it has been rock solid for over four years; I'd hate to part with it. Each of my 670s is rated at 170W - so 340W total - and it never broke a sweat. The Titan (Pascal) only uses 250W and I'm assuming the 1080ti will come in below that, so I'm already ahead of the game. AMD is getting more efficient and with any luck Vega will follow that trend.
 
AMD delivered with Ryzen for sure. I have a 1700 and it is much, much better than the old FX chip I had. My Firestrike score with 970 SLI is 19000, compared to the 12000 I had with my FX chip. THAT'S HUGE. Not only synthetic benches but real games have improved by huge amounts for me. I hope that with future updates and optimizations Ryzen will lead the consumer market, showing that more cores for the money is a win-win. 16 powerful threads for $300. Intel handed that over to them.
 