AMD A8-7600 Review: 'Kaveri' APU Put to the Test

By Julio Franco
Jan 23, 2014
  1. cliffordcooley likes this.
  2. cliffordcooley

    cliffordcooley TechSpot Paladin Posts: 5,079   +1,182

    Long story short conclusion:
    Puiu and wastedkill like this.
  3. wastedkill

    wastedkill TechSpot Addict Posts: 770   +154

    I can see this APU going far if it's advertised as a sub-$300 budget gaming system, which in fact it is. APUs are gonna be the new consoles instead of actual consoles being developed xD
  4. GeforcerFX

    GeforcerFX TechSpot Booster Posts: 239   +27

    Good performance. The new full A10 chip is the one I really wanna see benchmarked; that chip packs a lot of graphics power into an overclockable CPU. The biggest thing I noticed in the benchmarks was that their single-core performance has gotten a lot better with Steamroller. But if AMD is going to kill off the FX series, or most of it at least (makes sense to me; just keep one or two really high-end, high core count chips for enthusiasts that work on FM2+), they are gonna need to get 6-8 core chips into APUs in the 95 watt range (for non-K variants). Otherwise all they're ever going to compete with is the i3, and the i5 will continue its nice little dominance of the middle ground.
  5. Puiu

    Puiu TechSpot Booster Posts: 991   +82

    I am extremely surprised by the power consumption numbers: 50% less under load and 80% less at idle compared to the previous generation. That's kinda insane. What did they do?
  6. It's a bit weird to compare the old A10 with the new A8 and then call the A8 slower because the old A10 is faster. Aren't you supposed to compare the same old Axx and new Axx, not old Axx and new Ayy? Other than that, great review, and thanks!
  7. yukka

    yukka TechSpot Paladin Posts: 668   +22

    I didn't read the whole review as I am at work, but it would be nice to see a comparison against some older discrete cards. How does this rate compared to a GTX 460? If I got one for the missus, would it put her at a disadvantage, and could she play Elder Scrolls Online on it? :)
  8. hUMA will be really good for integrated graphics; there are decent speed boosts to be gained, and with Mantle it will push AMD ahead of other integrated GPUs, so it's all good.
  9. AnilD

    AnilD Newcomer, in training Posts: 18

    So, where's the first Mantle game to put this APU to proper gaming use?
  10. I just finished a new PC build with an A10-7850K. At first it was slower than expected. It turns out the only display driver currently available is a half-gig AMD beta driver, which can be downloaded from the AMD website. With the driver installed, the machine is quite fast. Video is set to share 1 GB of RAM (8 GB installed, with Windows 8.1). Use the fastest RAM you can, of course, when you order your new gear.
  11. AMD still has a ways to go in single-thread/core processing. These still run at about 4/5 the single-core speed of much lower-end i3 processors.

    On the GPU side, they run circles around Intel. However, just to be competitive on the CPU side, they have to throw in twice as many cores, which means most software is still going to suffer.

    I am still crossing my fingers for AMD to be able to take on the i3/i5 processors and bring GPU performance that Intel can't currently touch.
  12. With Mantle and TrueAudio I'm not particularly worried, and the HSA app benchmarks look pretty serious too. They might not help the apps you have right this moment, but there's a lot to be said for them long term.
  13. GhostRyder

    GhostRyder TechSpot Guru Posts: 1,841   +395

    In short, yes: the A8-7600 is not as powerful as a GTX 460 on the GPU side. I do not know how well it will run Elder Scrolls Online; however, I'll bet that at lower settings it would do just fine. At the top tier, the A10-7850K has similar performance to the HD 7750, minus a notch in clock speed and without GDDR5 RAM.

    I guess the value of this chip is that it's pretty much on par with last gen's A10-6800K, despite a lower clock speed, while running at much lower voltage. That means it would be great in something really small, but it's definitely not going to be any G4M3R L33T chip.

    Great review as always
    Steve and Julio Franco like this.
     
  14. If you think slightly faster GPU access and offloaded audio are going to compensate for a 20% computational difference, someone is misleading you.

    If you look at gaming now, 90% of titles are CPU-bottlenecked up to 1080p, not GPU-bottlenecked. Thus TrueAudio or Mantle will do nothing to help them.

    As for threading work out to the GPU (HSA), Windows 8 is already doing more of this than people realize.

    (Even going back to Vista, OS-level software was using early DirectCompute on DX10 hardware; this has been extended in 7 and further in 8, where the OS takes older software code and shoves it through the GPU when it can.)

    A lot of WinRT is using, or is set up to use, the GPU/CPU agnostically, and this may help AMD, but they are still going to need to meet Intel on per-core processing.

    With Windows 9 (newer concepts brought over from XB1 development), more of this will happen, but by the time it and other software catch up to provide the necessary gains, Intel's GPU adaptations will be closer to what AMD is producing.

    The other problem facing AMD is that NVidia is retooling for mobile and for mobile GPU additions to Intel devices. This could mean discrete NVidia GPUs in tablets running close to the power range of Intel HD GPUs.

    When it comes down to base-level performance, single-thread speeds are crucial. Even if you have 64 cores, if executing a single thread on one core is 20% slower, that software is still going to run 20% faster on an i3/i5 CPU. Even when dealing with 4-8 threads, an i5-class processor is going to be 10-20% faster than the fastest 8-core AMD CPU.
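    To put numbers on that argument, here is a toy sketch (the workload size and speed figures are illustrative assumptions, not benchmark data) of why per-core speed dominates once software stops scaling across cores:

    ```python
    # Toy model of per-core speed vs. core count. All numbers are
    # illustrative assumptions, not measurements of real chips.

    def runtime(work, usable_cores, per_core_speed):
        """Time to finish `work` units on the cores the software can actually use."""
        return work / (usable_cores * per_core_speed)

    work = 100.0

    # A workload that only scales to 4 threads: the 8-core chip's extra
    # cores sit idle, so its 20%-slower cores set the pace.
    slow_8core = runtime(work, usable_cores=4, per_core_speed=0.8)  # 31.25
    fast_4core = runtime(work, usable_cores=4, per_core_speed=1.0)  # 25.0

    print(fast_4core < slow_8core)  # True: idle cores don't close the gap
    ```

    The extra cores only start paying off when the software can saturate them, which most desktop software at the time could not.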

    I'm not a fan of AMD or Intel on this subject. They have both done good and really disgusting things to the technology market.

    Part of Intel's laziness in past years has been waiting out the ATI/AMD integration and 'just' staying ahead of them in core speeds. It is why their lower-end mobile 'Atom' technology stood still while Microsoft literally begged them to move it forward over five years ago.

    AMD also let their CPU performance tank in favor of their GPU technologies. This has kept them a bit ahead of NVidia, but at the cost of remaining competitive with Intel. It shouldn't have needed to be a trade-off.

    AMD got a better SoC technology from Microsoft and started with a better multi-core technology, and they have fallen behind Intel with both. There is also no reason AMD should have let up on APU advances and allowed Intel to get its HD 4xxx GPUs running as fast as AMD's integrated GPUs.

    All the technologies you mention will help AMD, but they will also help Intel. I see people mention hUMA as a savior too, but they forget or missed that Microsoft created the software version of this technology back in 2005/2006, and it has been in Windows NT since Vista. (That is how the WDM/WDDM and full video subsystem work in Vista, and more specifically how their evolution works in Windows 8 today. Notice that when AMD talks about hUMA, they are very careful to talk about it in 'hardware', because engineers will quickly point out that NT is already doing it in software.)
  15. Steve

    Steve TechSpot Staff Posts: 1,135   +276 Staff Member

    That wasn't the case in the clock-to-clock testing.

    It's not weird at all; in fact it makes perfect sense. On the CPU side of things, the A10 and A8 series are identical with the exception of clock speeds. What differs is the GPU spec, and the A10-6800K and A8-7600 feature very similar GPUs, so it made sense to compare them. As it turns out, the R7 and 8670D are very much alike, as we suspected.

    None of the games we tested were CPU-bottlenecked, even at 1280x800; installing a powerful discrete GPU significantly boosted gaming performance for low-end CPUs such as the A8-7600.
    Last edited: Jan 23, 2014
    GhostRyder and cliffordcooley like this.
  16. JC713

    JC713 TechSpot Evangelist Posts: 6,114   +732

    I think if AMD had gone with the earlier plan of putting GDDR5 right on the motherboard, Kaveri would be an absolute budget gaming beast. Too bad DDR3 holds it back.
  17. Steve

    Steve TechSpot Staff Posts: 1,135   +276 Staff Member

    Probably not; it would be a lot more expensive, so while it would be beastlier, it wouldn't be a budget beast ;)

    Added cost would likely make it pointless, might as well just buy a discrete graphics card.
    JC713 likes this.
  18. JC713

    JC713 TechSpot Evangelist Posts: 6,114   +732

    True :D.
    Steve likes this.
  19. theBest11778

    theBest11778 Newcomer, in training Posts: 99   +16

    I was really hoping the rumors of 896 GPU cores were true. Kaveri is good, but only a step ahead. AMD needs to jump generations to catch up. Unfortunately die space was probably the main issue here, but that makes you wonder how they crammed 1024 GPU cores onto the PS4 APU...???
  20. Puiu

    Puiu TechSpot Booster Posts: 991   +82

    Think what this will mean for AMD once DDR4 arrives.
  21. JC713

    JC713 TechSpot Evangelist Posts: 6,114   +732

    Definitely
    Puiu likes this.
  22. AFAIK, the "CPU bottleneck" exists because of DX11's single-threaded dispatch approach to the render pipeline. With Mantle, there should be BIG differences in regard to this shortcoming!
    ViperSniper2 likes this.
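    The dispatch argument above can be sketched with a toy frame-time model (the per-call cost, draw-call count, and GPU time below are assumed round numbers, not measurements of any real driver):

    ```python
    # Toy frame-time model of single-threaded vs. multi-threaded draw-call
    # submission. All costs are assumed round numbers, not measurements.

    draw_calls = 5000
    dispatch_us = 4      # CPU microseconds per draw call on the submit thread (assumed)
    gpu_us = 8000        # GPU render time per frame in microseconds (assumed)

    # DX11-style: every draw call funnels through one submission thread,
    # so CPU submission time caps the frame rate.
    cpu_single = draw_calls * dispatch_us       # 20000 us of submission work
    frame_single = max(cpu_single, gpu_us)      # CPU-bound: 20000 us/frame

    # Mantle-style: command buffers built on 4 threads in parallel.
    cpu_multi = cpu_single // 4                 # 5000 us per thread
    frame_multi = max(cpu_multi, gpu_us)        # now GPU-bound: 8000 us/frame

    print(frame_single, frame_multi)  # 20000 8000
    ```

    In this sketch, nothing about the GPU changed; spreading submission across threads alone moved the bottleneck from CPU to GPU, which is the kind of gain Mantle's multi-threaded command buffers were pitched to deliver.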
  23. GhostRyder

    GhostRyder TechSpot Guru Posts: 1,841   +395

    They can do it, but it's going to take some time before they are willing to make the necessary changes. Remember that the Jaguar APUs inside the XB1 and PS4 are on completely custom boards, so they have more freedom there than with something interchangeable.

    Yeah, I mean DDR4 will start to make things very nice for the APU later on; however, at this performance point I don't think it's really a necessity. I just built a friend a machine using an A10-7850K (he's on a tight budget) and some DDR3-2400 RAM, and it performs really quite well in gaming for the money spent. It's not a bad little machine; I put it on an ASRock mITX A88X board in a BitFenix Prodigy and it chugs along. I can't wait to see this APU in a laptop. Imagine a laptop with this much graphical power; it may not be high end, but for people on a $500-600 budget, getting an A10 like this would blow my mind for budget gaming on a laptop. Let's throw some DDR4 on that :p
    JC713 likes this.
  24. WangDangDoodle

    WangDangDoodle Newcomer, in training Posts: 22   +6

    Will TechSpot do a review of the A10-7850K and dual-graphics mode? I'm wondering which discrete cards are compatible with dual-graphics mode, and how fast it can get.

    I'm planning to build myself a Mini-ITX Steam machine. It doesn't need to be top shelf, but I want to be able to play games at 1080p with medium/high detail if possible. I'm not sure why you guys chose to review the lower end of the APU lineup, but I'm hoping there's a high-end review in the making. :)
  25. GhostRyder

    GhostRyder TechSpot Guru Posts: 1,841   +395

    At the moment I have an A10-7850K mITX machine built for a friend (waiting on a replacement PSU), and from what I've seen it does a decent job, especially if you pair it with DDR3-2400 and overclock the GPU/CPU. The dual-graphics options, from what I've heard, are limited at the moment to the R7 240 and the R7 250, which are both sadly less powerful than the on-board GPU. But according to some leaked benchmarks and some posted by AMD, the performance can double (though a lot of this is just rumor and what AMD showcased, so who knows).
    WangDangDoodle likes this.

