AMD's Kaveri solves age-old issue, promises faster on-die GPUs

April 30, 2013, 3:00 PM

AMD has revealed more information regarding Kaveri, its upcoming 28nm Steamroller-based APU. The company is touting Kaveri's heterogeneous system architecture -- a design which aims to greatly increase the performance of on-die graphics cores. This technological achievement likely marks the future direction of AMD's APUs.

Heterogeneous system architecture: that's the buzz term AMD is using to describe what sets Kaveri apart from previous APUs. In a nutshell, HSA solves a long-standing issue with existing CPU + GPU implementations by allowing both units to directly access each other's memory pools. Ars Technica published an interesting (and detailed) write-up yesterday which explains why HSA makes Kaveri special.

In more detail, communication between CPUs and GPUs has remained somewhat inefficient over the years. Each unit has always enjoyed its own dedicated pool of memory and as a result -- for CPUs and GPUs to perform their duties -- they've traditionally had to copy data back and forth between each other's memory pools. Even when CPUs and GPUs share the same physical die, this logical separation remains.

Naturally, all that copying comes at the cost of performance. The solution? Create a shared addressing space for CPUs and GPUs.

This is where HSA comes in. Although it falls just short of creating a single, unified memory pool, HSA does give separate processing units direct access to each other's memory pools. This eliminates the need for copying, which substantially decreases the time and work required. The result should be a nice bump in GPU performance.
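To make the copy overhead concrete, here is a toy Python sketch of the two models described above. This is purely illustrative: the classes `CopyModel` and `SharedModel` are invented for this example and have nothing to do with real HSA, driver, or GPU APIs -- they only model why a shared address space removes transfers.

```python
# Toy model of the copy overhead HSA aims to remove.

class CopyModel:
    """Traditional split: CPU and GPU each own a memory pool,
    so data must be copied across before the other unit can use it."""
    def __init__(self):
        self.cpu_pool = {}
        self.gpu_pool = {}
        self.copies = 0  # count of cross-pool transfers

    def gpu_compute(self, key):
        # GPU can't see CPU memory: copy in, compute, copy back.
        self.gpu_pool[key] = list(self.cpu_pool[key]); self.copies += 1
        self.gpu_pool[key] = [x * 2 for x in self.gpu_pool[key]]
        self.cpu_pool[key] = list(self.gpu_pool[key]); self.copies += 1
        return self.cpu_pool[key]

class SharedModel:
    """HSA-style sharing: both units address the same pool,
    so no transfer ever happens."""
    def __init__(self):
        self.pool = {}   # one shared address space
        self.copies = 0

    def gpu_compute(self, key):
        # GPU works on the same addresses the CPU uses.
        self.pool[key] = [x * 2 for x in self.pool[key]]
        return self.pool[key]

copy = CopyModel()
copy.cpu_pool["frame"] = [1, 2, 3]
shared = SharedModel()
shared.pool["frame"] = [1, 2, 3]

# Same result either way, but the copy model pays two transfers per job.
assert copy.gpu_compute("frame") == shared.gpu_compute("frame") == [2, 4, 6]
print(copy.copies, shared.copies)  # the shared model needs zero copies
```

On real hardware each of those transfers crosses a bus and costs time proportional to the data size, which is why eliminating them helps most for large working sets.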

Kaveri will most likely hit the scene toward the tail-end of 2013, possibly just before Intel's Broadwell arrives. Although Kaveri's computational units will be based on a somewhat aging 28nm process, the APU will pack a Graphics Core Next GPU (i.e. Radeon 7000-series). It'll be interesting to see just how well Kaveri performs.

Interestingly, both Microsoft and Sony will be rolling out their next-gen consoles based on AMD technology. Could Kaveri be one of the reasons console makers are making a shift from RISC to x86?




User Comments: 23

Guest said:

CUDA would be nice to have. Is there a reason why they can't implement this in their cards?

JC713 said:

Wow! I give AMD props for working on something so revolutionary (programming-wise). This will give Intel's L4 cache in the upcoming Haswell line a run for its money.

1 person liked this | bluebob951 said:

CUDA would be nice to have. Is there a reason why they can't implement this in their cards?

CUDA is an Nvidia technology; why on earth would they ever put it in their cards... AMD uses their own technology...

Guest said:

Kaveri's CPU core is not Bulldozer but Steamroller, which is currently still in development and has about as much to do with Bulldozer as Intel's Haswell with Nehalem

Rick, TechSpot Staff, said:

Kaveri's CPU core is not Bulldozer but Steamroller, which is currently still in development and has about as much to do with Bulldozer as Intel's Haswell with Nehalem

My apologies. Updated for accuracy.

JC713 said:

AMD has stream processors and nVidia has Cuda cores.

GeforcerFX said:

AMD has stream processors and nVidia has Cuda cores.

Back in the day the two architectures used to be quite different; nowadays they're getting more similar to each other. As far as what the different cores can do, both are about the same technologically.

2 people like this | cliffordcooley, TechSpot Paladin, said:

For some reason this line seems to be partially cropped. I'm using Opera 12.15 if that's any indication as to the issue. Has anyone else noticed this?

Interestingly, both Microsoft and Sony will be rolling out their next-gen consoles based on AMD technology. Could Kaveri be one of the reasons console makers are making a shift from RISC to x86?

1 person liked this | gamoniac said:

For some reason this line seems to be partially cropped. I'm using Opera 12.15 if that's any indication as to the issue. Has anyone else noticed this?

Interestingly, both Microsoft and Sony will be rolling out their next-gen consoles based on AMD technology. Could Kaveri be one of the reasons console makers are making a shift from RISC to x86?

Yep. On my IE 10 as well.

p3ngwin said:

CUDA would be nice to have. Is there a reason why they can't implement this in their cards?

You're asking why Nvidia's proprietary CUDA isn't available in AMD's products?

1 person liked this | p3ngwin said:

For some reason this line seems to be partially cropped. I'm using Opera 12.15 if that's any indication as to the issue. Has anyone else noticed this?

Interestingly, both Microsoft and Sony will be rolling out their next-gen consoles based on AMD technology. Could Kaveri be one of the reasons console makers are making a shift from RISC to x86?

same here on the latest Firefox.

TS-56336 said:

Simply wow...

Something like this has to happen in order to push the already super fast AMD APUs to performance comparable to mainstream discrete GPUs. If this happens, Nvidia will lose the mainstream GPU segment as well, and their only market would be the top high end.

Guest said:

No, we will keep needing more power. You think these crappy on die gpu's will push 2160P or 4k monitors? NOPE. It will be a while before you can hit 30fps in anything at 2560x1440+.

There are still some games that hit below 30fps at 1080p with all the crap on with a GTX680.

The next card I buy won't be able to do what I want (27in 1440p) unless I spend over $300 most likely. That won't come on your $100-200 amd cpu for quite a few more revs if they even live that long with 2B in debt, 1B in cash burning 1.18B/year. They are on track to do it again with their lost last week of 310mil (you have to count the 164mil sale/rental of austin land on top of the 146mil they lost - without that sale they'd show 310mil loss x 4=1.2B loss again). This is why in that call the CFO said they'll need a loan to make it past the end of the year. Simple math says this is not a lie. Who wants to keep throwing money at them to flush down the toilet forever? Loans will get tougher to get and at even higher interest than now. NOT GOOD for R&D. They cut 30% of their engineers and their drivers last year shows this. Enduro and runts/fcat issues, stutter etc...They don't have the money to do all this at once. It shows.

Guest said:

Why would a video editor use an AMD card if the top editing software doesn't support these cards or treat them as an advantage in video editing? Every Nvidia card has CUDA support. Is AMD's stream processing supported by this editing software? I don't think so.

Vrmithrax, TechSpot Paladin, said:

No, we will keep needing more power. You think these crappy on die gpu's will push 2160P or 4k monitors? NOPE. It will be a while before you can hit 30fps in anything at 2560x1440+.

There are still some games that hit below 30fps at 1080p with all the crap on with a GTX680.

The next card I buy won't be able to do what I want (27in 1440p) unless I spend over $300 most likely. That won't come on your $100-200 amd cpu for quite a few more revs if they even live that long with 2B in debt, 1B in cash burning 1.18B/year. They are on track to do it again with their lost last week of 310mil (you have to count the 164mil sale/rental of austin land on top of the 146mil they lost - without that sale they'd show 310mil loss x 4=1.2B loss again). This is why in that call the CFO said they'll need a loan to make it past the end of the year. Simple math says this is not a lie. Who wants to keep throwing money at them to flush down the toilet forever? Loans will get tougher to get and at even higher interest than now. NOT GOOD for R&D. They cut 30% of their engineers and their drivers last year shows this. Enduro and runts/fcat issues, stutter etc...They don't have the money to do all this at once. It shows.

Hate to be blunt here... But you are comparing oranges to bowling shoes. The APU is intended to fill the same slot as Intel's offerings with integrated graphics, to fill the low-end "bang for the buck" slot. And for increased efficiency / low power situations.

If you need the graphics resolutions you are talking about... Well, that's why AMD has an entire line of discrete GPUs. It's fairly ridiculous to expect a modern APU to push those extreme resolutions, and even more ridiculous to use the fact that they can't against them. Considering their financials and all of those other numbers thrown out there, I'd say the fact that they were still able to come up with such a revolutionary approach to improving on-die efficiency says more about the company than the budget rant. This is the kind of technology AMD needs to keep them in the running, and maybe help push them into some positive revenues, which is nothing but good for the entire industry (by keeping a level of competition for Intel alive).

But hey, if you wait long enough, I'm sure that eventually the on-die GPUs will be able to push those resolutions you mention. Of course, by then you'll probably be complaining because it can't push 16k resolutions at 120 fps... Welcome to the world of computers, where nothing is ever good enough.

Guest said:

Wow would love to buy this :)

1 person liked this | VitalyT said:

To TechSpot web team -

Once again the last lines of the article are cut off. Your web developer must have messed up something with DIV heights. Please fix it. It manifests in the latest Firefox and Chrome.

havok585 said:

To TechSpot web team -

Once again the last lines of the article are cut off. Your web developer must have messed up something with DIV heights. Please fix it. It manifests in the latest Firefox and Chrome.

I can confirm this. Firefox/Chrome/IE, latest versions.

1 person liked this | cliffordcooley, TechSpot Paladin, said:

To TechSpot web team.
No response, why do we even bother trying to help?

1 person liked this | VitalyT said:

No response, why do we even bother trying to help?

Why do we even bother writing here at all? But then I look at my own Avatar and I remember why - I don't really care.

GeforcerFX said:

No, we will keep needing more power. You think these crappy on die gpu's will push 2160P or 4k monitors? NOPE. It will be a while before you can hit 30fps in anything at 2560x1440+.

There are still some games that hit below 30fps at 1080p with all the crap on with a GTX680.

The next card I buy won't be able to do what I want (27in 1440p) unless I spend over $300 most likely. That won't come on your $100-200 amd cpu for quite a few more revs if they even live that long with 2B in debt, 1B in cash burning 1.18B/year. They are on track to do it again with their lost last week of 310mil (you have to count the 164mil sale/rental of austin land on top of the 146mil they lost - without that sale they'd show 310mil loss x 4=1.2B loss again). This is why in that call the CFO said they'll need a loan to make it past the end of the year. Simple math says this is not a lie. Who wants to keep throwing money at them to flush down the toilet forever? Loans will get tougher to get and at even higher interest than now. NOT GOOD for R&D. They cut 30% of their engineers and their drivers last year shows this. Enduro and runts/fcat issues, stutter etc...They don't have the money to do all this at once. It shows.

So counting on their PC graphics and CPU sales probably staying the same, possibly increasing on the laptop side of things (though laptops are fading pretty fast), then add in that they are the major chip manufacturer for both major new consoles being released at the end of the year. I would say if AMD can stay afloat for the rest of 2013, which I see them being able to do with some decent management, then 2014 is looking like a good year for them, especially if Steamroller really pushes the FX CPUs up in performance; they might compete better in the enthusiast market. Right now though it's the simple fact that over the next five years somewhere along the lines of 100 million game consoles (or more) with AMD CPUs and GPUs will be sold, and they're making money off each one.

Rick, TechSpot Staff, said:

To TechSpot web team -

Once again the last lines of the article are cut off. Your web developer must have messed up something with DIV heights. Please fix it. It manifests in the latest Firefox and Chrome.

Better late than never? The problem should be addressed now. :-)

Guest said:

Basically I don't care if Kaveri isn't as powerful as Intel's processors at the time of release, because I have paid a small fortune for an Intel-based PC and it barely performs that much better than my old Intel PC did when using Blender 3D and other programs like 3D-Coat.

Plus I have fallen out with Intel and their processors because they are just not that powerful for the price you are paying. The only Intel processors that actually are powerful enough are the Extreme processors, and they are £1000, which is ridiculous; plus they haven't gotten much better than generations ago either. So basically, when you are buying an Intel processor, all you are paying your money for is a rebranded chip with one or two minor changes, which isn't fair and isn't worth the money. Oh, and before you say it, I'm actually an Intel fanboy, but due to financial difficulties I have had to look elsewhere because I just can't afford them any more, plus the fact that I'm just not happy that their products are simply rebranded.

And one more thing: AMD's processors have improved dramatically with each release and are better at multitasking, so using programs like Blender you are better off with an AMD processor, and if everything keeps going the way it is they will definitely catch up with Intel, because AMD is gaining on Intel due to Intel's arrogance in thinking they're the best.

Although I won't be buying the Kaveri processor, I am waiting for the successor to the FX brand, which I hope AMD brings out next year; that is, of course, if there is a successor to the FX brand.

However, I just want to make it clear that if I had the money I would buy an Intel processor; it's just that they are too expensive.
