Testing AMD Kaveri's Dual Graphics Performance

February 14, 2014, 1:08 AM

Although the Steamroller cores in AMD's Kaveri-based A8-7600 APU brought a notable boost in CPU efficiency, it felt like the company was mostly focused on gaming performance with last month's update. The A8-7600 wasn't much faster than last year's A10-6800K, but it was quick enough to power modern titles such as BioShock Infinite and Tomb Raider without help from a discrete graphics card.

At the same time that AMD is beginning to deliver on its years-long promise of single-chip PC gaming, its effort toward Crossfiring integrated graphics with discrete graphics is finally maturing. Currently, Kaveri APUs can only be paired with one of two discrete GPUs: the Radeon R7 240 and R7 250. Both are sub-$100 cards that we wouldn't typically recommend gamers invest in, but when combined with the A10-7850K's on-die GPU, their performance could have bigger implications for value-oriented builders.

Read the complete review.




User Comments: 25

mosu said:

Please elaborate on drivers used. I also would like to see how Mantle works in crossfire setups.

WangDangDoodle said:

I've been looking forward to this review, as I've been very curious about how the 7850K would perform with dual graphics. I'm planning to build myself a little Mini-ITX Steam Machine. Unfortunately, the performance of this APU, even with dual graphics, is a bit disappointing. Regardless, it's a good read. Thanks!

Guest said:

Too bad that you didn't test the A10-7850K with a 260X as you did with the i5-4430, just for a performance comparison. It would be nice to have the A8-7600 vs. i3-4130 both tested with the IGP and then with a 260X, so we can see how big the difference in CPU performance is.

1 person liked this | hahahanoobs said:

Great review Steven.

hahahanoobs said:

Please elaborate on drivers used. I also would like to see how Mantle works in crossfire setups.

Mantle is still a work in progress. Look at the known issues in the 14.1 release notes.

1 person liked this | JC713 said:

Great review as always.

theBest11778 said:

I had an HP laptop with a discrete 6750 and an AMD A8 in Dual Graphics... the results matched the benchmarks here pretty closely. With Dual Graphics disabled it ran faster 90% of the time. Unless the APU's GPU and the discrete card are clocked the same, with the exact same memory speed (meaning you'd need to get the DDR3 version), I'm guessing the system has to wait to sync the two.

fps4ever said:

Wow... the inconsistency in AMD solutions is just mind-boggling. You get an instant performance boost just by using an Intel CPU in the same price range. I really didn't want that to be the case. And Mantle is just a band-aid on a universal performance problem for AMD. At current prices, the AMD combo isn't a good buy.

GhostRyder said:

I saw this coming, because the R7 250 only has 384 shader units, fewer than the 512 in the 7850K's integrated GPU. I knew the performance boost wouldn't be enough to warrant buying it over a 260X. Really, the only major advantages of the A10-7850K are HSA, when it's utilized, and the fact that it's an unlocked processor. I've been playing with the 7850K; I got the one I built for a friend up to 4.5GHz without much effort. I'd love to see a comparison between the locked chips and that one at max clock speeds, just for the heck of it.

The sad fact is, AMD didn't release a better option to pair up with this round; both cards are lower-end than last generation's partners, which results in poor performance in general. It's not an ideal setup unless space is an extreme issue, but a 260X is small to begin with, so I doubt that would be a concern.

Great Review @Steve!

Adhmuz, TechSpot Paladin, said:

Is it just me, or is this a totally pointless technology for now? When enabled, it actually slowed down the discrete graphics card... Sad, really. Maybe when Mantle matures it could use the APU and the GPU separately so they complement each other; otherwise combining the two just seems like a waste. For example, run physics on the integrated GPU, saving the CPU for what it's good at and the GPU for graphics alone.

mosu said:

Great Review @Steve!

Check this: [link]

and this: [link]

Guest said:

It's questionable why someone would opt for an AMD A10-7850K + R7 250 instead of something like an Intel i3-4130 with an R7 260X. Faster CPU, faster graphics card, about the same total price.

I went Trinity for my living room HTPC and really regret it now.

HSA has a ton of potential, as shown in some jaw-dropping benchmark results, but until software is designed to take advantage of it, it's not a great value for customers.

Guest said:

Good to see AMD finally picking up their game again lately.

Here's hoping for more to come from them :)

Staff
Steve said:

Check this: [link]

and this: [link]

What is it that we are checking out exactly?

1 person liked this | Guest said:

No matter what, AMD is really doing well in development. I've always liked AMD; I'd say I've owned only one non-AMD (CPU) system since 1995.

When AMD bought ATI, it was a blessing for the market, keeping things fair and prices low by giving us choice. I stopped using Nvidia cards circa 2001/2002. I've been with AMD ever since. I love underdogs.

Guest said:

I've been with AMD ever since. I love underdogs.

I do too, but I love performance more. Ever since multicore arrived I've been with Intel. That said, I'm rooting for them too, and hope their next offerings are good enough or better so I can go with them again. They also need to work with Adobe to get hardware acceleration working the way CUDA cores do with Nvidia. That would put them back on the table for people like me who use their machines for both work and play.

Guest said:

Crossfire doesn't work?! Unless I'm blind, every benchmark showed a single GPU with a Core i5 as superior. Yawn.

2 people like this |
Staff
Steve said:

Crossfire doesn't work?! Unless I'm blind, every benchmark showed a single GPU with a Core i5 as superior. Yawn.

You might be blind then. The Crossfire setup was faster in three of the six games tested as stated in the conclusion.

Guest said:

I agree that the value proposition isn't great for dual graphics with the 7850K (currently ~$185), but Kaveri was designed to shine at 35W, not 95W. Based on reviews of the A8-7600, those design goals were achieved, so dual graphics in entry- to mid-level laptops is what I have my sights on.

GhostRyder said:

You might be blind then. The Crossfire setup was faster in three of the six games tested as stated in the conclusion.

@Steve I've got an odd question for you: when you were testing, what were the core clocks on the R7 250 and the A10's R7? In CFX and SLI, the pairing normally runs at the lower of the two clock speeds, pulling both GPUs down to that level. Did you notice anything like that? I'm curious whether playing with the clock speeds in the BIOS like I did (I bumped my friend's up to 960MHz) would make a noticeable difference in CFX.

Guest said:

In your article you implied that you should choose the faster GDDR5 memory over the slower DDR3 memory. Did you do any performance comparisons between the R7 250 DDR3 and GDDR5 versions when in Crossfire with the Kaveri?

Thanks

Staff
Steve said:

In your article you implied that you should choose the faster GDDR5 memory over the slower DDR3 memory. Did you do any performance comparisons between the R7 250 DDR3 and GDDR5 versions when in Crossfire with the Kaveri?

Thanks

No need, it's way slower. For gaming, AVOID the DDR3 version at all costs, end of story. It has less than half the available memory bandwidth, which crushes performance.

@Steve I've got an odd question for you: when you were testing, what were the core clocks on the R7 250 and the A10's R7? In CFX and SLI, the pairing normally runs at the lower of the two clock speeds, pulling both GPUs down to that level. Did you notice anything like that? I'm curious whether playing with the clock speeds in the BIOS like I did (I bumped my friend's up to 960MHz) would make a noticeable difference in CFX.

No, that wasn't happening; both GPUs were running at full speed when Crossfire was working. I didn't try to match the clock speeds through overclocking, though.
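For context, Steve's "less than half the bandwidth" point above checks out with a quick back-of-the-envelope calculation. The sketch below assumes typical reference-spec R7 250 memory configurations (128-bit bus, ~4600MT/s effective GDDR5 vs. ~1800MT/s effective DDR3); those figures are illustrative assumptions, not numbers from the review:

```python
def peak_bandwidth_gbs(effective_mts: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    bytes_per_transfer = bus_width_bits / 8
    return effective_mts * 1e6 * bytes_per_transfer / 1e9

# Assumed reference-spec R7 250 cards, both on a 128-bit bus:
gddr5 = peak_bandwidth_gbs(4600, 128)  # GDDR5 at ~4600 MT/s effective
ddr3 = peak_bandwidth_gbs(1800, 128)   # DDR3 at ~1800 MT/s effective

print(f"GDDR5: {gddr5:.1f} GB/s, DDR3: {ddr3:.1f} GB/s, ratio: {ddr3 / gddr5:.2f}")
```

With these assumed clocks the GDDR5 card has roughly 73.6GB/s against the DDR3 card's 28.8GB/s, so the DDR3 version really does land well under half the bandwidth.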

GhostRyder said:

No, that wasn't happening; both GPUs were running at full speed when Crossfire was working. I didn't try to match the clock speeds through overclocking, though.

Ah, OK. I was curious whether the 7850K, while still improving performance in CFX, was holding back the discrete card by forcing a downclock. Good to know, thanks for the response.

Guest said:

Since the iGPU is Crossfired with the video card and the iGPU uses your onboard DDR3 memory, wouldn't this clock the video card's GDDR5 RAM down to DDR3 speeds?

What speed DDR3 was used in the benchmarks? These APUs perform best with speeds of 2133 or faster, from what I've read. I'm seriously considering an APU solution, but this is the one thing holding me back: a GDDR5 card or a DDR3 card. There must be a reason AMD offers an R7 with DDR3 at the same price point.

Guest said:

Hi Sir,

Can you repeat the benchmarks in your Kaveri review using Catalyst 14.2?

[link]

Based on the article at the link posted above, there is a ~30% performance improvement when going from Catalyst 13.12 to Catalyst 14.2.

Thanks
