TechSpot

Testing AMD Kaveri's Dual Graphics Performance

By Steve
Feb 14, 2014
  1. mosu

    mosu TS Enthusiast Posts: 307

    Please elaborate on the drivers used. I'd also like to see how Mantle works in Crossfire setups.
     
  2. WangDangDoodle

    WangDangDoodle TS Rookie Posts: 47   +13

    I've been looking forward to this review, as I've been very curious about how the 7850K would perform with dual graphics. I'm planning to build myself a little Mini-ITX Steam Machine. Unfortunately, the performance of this APU, even with dual graphics, is a bit disappointing. Regardless, it's a good read. Thanks!
     
  3. Too bad you didn't test the A10-7850K with a 260X as you did with the i5-4430, just for performance comparison. It would also be nice to see the A8-7600 vs. the i3-4130, both tested with the IGP and then with a 260X, so we could see how big the difference in CPU performance is.
     
  4. hahahanoobs

    hahahanoobs TS Booster Posts: 981   +97

    Great review Steven.
     
    Steve likes this.
  5. hahahanoobs

    hahahanoobs TS Booster Posts: 981   +97

    Mantle is still a work in progress. Look at the known issues in the 14.1 release notes.
     
    Last edited: Feb 15, 2014
  6. JC713

    JC713 TS Evangelist Posts: 7,018   +912

    Great review as always.
     
    Steve likes this.
  7. theBest11778

    theBest11778 TS Enthusiast Posts: 151   +32

    I had an HP laptop with a discrete 6750 and an AMD A8 with Dual Graphics... it performed pretty much like the benchmarks here. With Dual Graphics disabled it ran faster 90% of the time. Unless the APU's GPU and the discrete card are clocked the same, with the exact same memory speed (meaning you'd need to get the DDR3 version), I'm guessing the system needs to wait to sync the two.
     
  8. fps4ever

    fps4ever TS Rookie

    Wow... the inconsistency in AMD solutions is just mind-boggling. Instant performance boost just by using an Intel CPU in the same price range. I really didn't want that to be the case. And Mantle is just a band-aid on a universal performance problem for AMD. At current prices the AMD combo is not a good buy.
     
  9. GhostRyder

    GhostRyder TS Evangelist Posts: 2,262   +541

    In truth I saw this coming, since the 250 only has 384 shader units, fewer than the 512 in the 7850K's iGPU. I knew the performance boost wouldn't be enough to warrant buying it over a 260X. Realistically, the only major advantages of the A10-7850K are HSA, when software actually utilizes it, and the fact it's an unlocked processor. I've been playing with the 7850K; I got the one I built for a friend up to 4.5GHz without much effort, and I'd love to see a comparison between the locked chips and that one at max clock speeds, just for the heck of it.

    The sad fact is AMD didn't release a better option to pair up with; this round they put out two lower-end cards than last gen to match with the GPU on the 7850K, which results in poor performance in general. It's not an ideal setup unless space is at an extreme premium, but a 260X is small to begin with, so I doubt that would be a concern.

    Great Review @Steve!
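As a rough sanity check on the shader-count point, here's a back-of-envelope peak-throughput comparison. This is only a sketch: the ~1050MHz R7 250 and 720MHz 7850K iGPU clocks are assumed reference values (retail cards vary), and real-world gaming performance depends on far more than peak GFLOPS.

```python
# Rough peak single-precision throughput: 2 FLOPs (one FMA) per shader per clock.
def peak_gflops(shaders, clock_mhz):
    """Theoretical peak GFLOPS from shader count and core clock in MHz."""
    return 2 * shaders * clock_mhz / 1000.0

r7_250     = peak_gflops(384, 1050)  # R7 250: 384 shaders @ ~1050 MHz (assumed reference boost)
igpu_7850k = peak_gflops(512, 720)   # A10-7850K iGPU: 512 shaders @ 720 MHz

print(f"R7 250:     {r7_250:.0f} GFLOPS")      # -> 806
print(f"7850K iGPU: {igpu_7850k:.0f} GFLOPS")  # -> 737
```

The two are close enough on paper that the discrete card adds little headroom, which lines up with the modest Dual Graphics gains in the review.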
     
  10. Adhmuz

    Adhmuz TechSpot Paladin Posts: 927   +104

    Is it just me, or is this a totally pointless technology for now? When enabled it actually slowed down the discrete graphics card... Sad, really. Maybe once Mantle matures it could use the APU and the GPU separately to complement each other instead; as it stands, combining the two just seems like a waste. For example, run physics on the integrated GPU, saving the CPU for what it's good at and the discrete GPU for graphics alone.
     
     
  11. mosu

    mosu TS Enthusiast Posts: 307

    It's questionable why someone would opt for an AMD A10-7850K + R7 250 instead of something like an Intel i3-4130 with an R7 260X. Faster CPU, faster graphics card, about the same total price.

    I went Trinity for my living room HTPC and really regret it now.

    HSA has a ton of potential, as shown in some jaw-dropping benchmark results, but until software is designed to take advantage of it, it's not a great value for customers.
     
  13. Good to see AMD finally picking up their game again lately.
    Here's hoping for more to come from them :)
     
  14. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,384   +479 Staff Member

    No matter what, AMD is really pushing development. I've always liked AMD; I'd say I've owned only one non-AMD (CPU) system since 1995.

    When AMD bought ATI it was a blessing for the market, one that can keep competition fair and prices low by giving us choice. I stopped using Nvidia cards circa 2001/2002 and have been with AMD ever since. I love underdogs.
     
    penn919 likes this.
  16. I do too, but I love performance more. Ever since multicore arrived I've been with Intel. That said, I'm rooting for AMD too, and hope their next offerings are good enough or better so I can go with them again. They also need to work with Adobe to get hardware acceleration working the way CUDA cores do with Nvidia. That would put them back on the table for people like me who use their machines for both work and play.
     
  17. Crossfire doesn't work?! Unless I'm blind, every benchmark showed a single GPU with a Core i5 as superior. Yawn.
     
  18. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,384   +479 Staff Member

    You might be blind then. The Crossfire setup was faster in three of the six games tested, as stated in the conclusion.
     
    GhostRyder and mailpup like this.
  19. I agree that the value proposition isn't great for dual graphics with the 7850K (currently ~$185), but Kaveri was designed to shine at 35W, not 95W. Based on reviews of the A8-7600, the design goals were achieved, so dual graphics in entry-to-mid-level laptops is what I have my sights on.
     
  20. GhostRyder

    GhostRyder TS Evangelist Posts: 2,262   +541

    @Steve I've got an odd question for ya: when you were testing, I'm curious about the core clocks on the R7 250 and the A10's R7. In CFX and SLI the setup normally syncs to the lower clock speed, pulling the faster GPU down to that level. Just curious if you noticed anything like that, as I'm wondering whether playing with the clock speeds in the BIOS like I did (I bumped my friend's up to 960MHz) would make a difference in CFX (at least a noticeable one).
     
  21. In your article you implied that you should choose the faster GDDR5 memory over the slower DDR3. Did you do any performance comparisons between the R7 250 DDR3 and GDDR5 versions when in Crossfire with Kaveri?

    Thanks
     
  22. Steve

    Steve TechSpot Staff Topic Starter Posts: 1,384   +479 Staff Member

    No need, it's way slower. For gaming, AVOID DDR3 at all costs, end of story. The DDR3 card has less than half the available memory bandwidth, which crushes performance.

    No, that wasn't happening; both GPUs were running at full speed when Crossfire was working. I didn't try to match the clock speeds through overclocking, though.
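The "less than half the bandwidth" point is easy to check with back-of-envelope math: peak bandwidth is the effective transfer rate times the bus width in bytes. A minimal sketch, assuming typical retail R7 250 specs (DDR3-1800 effective and 4.6GT/s GDDR5, both on a 128-bit bus; individual cards vary by vendor):

```python
# Peak memory bandwidth in GB/s: transfers per second * bus width in bytes.
def peak_bandwidth_gbps(effective_mts, bus_width_bits):
    """Effective transfer rate in MT/s and bus width in bits -> GB/s."""
    return effective_mts * 1e6 * (bus_width_bits / 8) / 1e9

ddr3  = peak_bandwidth_gbps(1800, 128)  # assumed DDR3-1800 effective, 128-bit
gddr5 = peak_bandwidth_gbps(4600, 128)  # assumed 4.6 GT/s GDDR5, 128-bit

print(f"DDR3:  {ddr3:.1f} GB/s")   # -> 28.8
print(f"GDDR5: {gddr5:.1f} GB/s")  # -> 73.6
print(f"ratio: {ddr3 / gddr5:.2f}")  # well under half
```

With those numbers the DDR3 card has roughly 39% of the GDDR5 card's bandwidth, which matches Steve's "less than half" figure.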
     
  23. GhostRyder

    GhostRyder TS Evangelist Posts: 2,262   +541

    Ah, OK. I was curious whether the 7850K, while still improving performance in CFX, was holding the card back by forcing a downclock. Good to know, thanks for the response.
     
  24. Since the iGPU is crossfired with the video card and the iGPU uses your onboard DDR3 system memory, wouldn't this drag the video card's GDDR5 RAM down to DDR3 speeds?

    What speed was the DDR3 used in the benchmarks? From what I've read, these APUs perform best with DDR3-2133 or faster. I'm seriously considering an APU solution, but this is the one thing holding me back: a GDDR5 card or a DDR3 card. There must be a reason AMD offers an R7 with DDR3 at the same price point.
     

