Weekend tech reading: Llano GPU 325% faster than Sandy Bridge

Matthew DeCarlo


AMD - Llano offers 325% better graphics performance than Sandy Bridge AMD continues to ride the GPU performance wave ahead of the launch of its new mid-range Llano processor. Recently leaked documents from the company indicate that the Llano-based Fusion A series will offer up to 325% better graphics performance than equivalent Intel processors based on Sandy Bridge. Nordic Hardware

Jack Wolf, who did the math behind computers, dies at 76 Jack Keil Wolf, an engineer and computer theorist whose mathematical reasoning about how best to transmit and store information helped shape the digital innards of computers and other devices that power modern society, died on May 12 at his home in the La Jolla section of San Diego. He was 76. NYT

Firefox 5 beta arrives, quietly Mozilla officially activated its beta channel on Friday, providing the first beta version that comes out of its accelerated release cycle. Don’t expect revolutionary changes. Following a first “fake” beta build (5.0b1) that was posted on May 2, Mozilla has moved the second build (5.0b2) into the public beta channel. ConceivablyTech

Apple alumni don't fall far from the tree After selling mobile ad startup Quattro Wireless to Apple in late 2009, Lars Albright took a job helping the iPhone maker work with its community of mobile app developers. He noticed that programmers were having trouble keeping users glued to their apps. Voilà: business opportunity. Bloomberg

IT's future: Bring your own PC-tablet-phone to work CIOs should buckle up and brace themselves for a future of flexible IT as employees will be routinely bringing in their own machines and expecting the business to support them, says Tony Henderson, head of communications at UK tech sector trade body Intellect. Silicon.com

Guild Wars 2 interview We’ve already had a lengthy chat with Guild Wars 2 designers Jon Peters and Eric Flannum about how the game’s progressing but the ArenaNet devs were also kind enough to impart to us some new information on a brand new character class, the engineer. Strategy Informer

Editorial: Why Half-life 3 isn't coming soon Are you waiting for Half-life: Episode 3? Or maybe you're thinking Valve's ditched the episodic format altogether (and you'd probably be right). Regardless, don't hold your breath for a sign of life from Gordon Freeman any time soon. IGN

Google silently patches Android authentication flaw Google is implementing a server-side fix to address the authentication flaw that allows third-parties to access Android user data... eWeek

Q&A: How today's tech alienates the elderly On Silver Surfer's Day, a UK academic has blamed unnecessarily complicated user interfaces for putting older people off joining the Government-backed Race Online. PC Pro


 
Interesting claim from AMD. If so, that would mean death to a range of budget graphics cards, including their own.
 
Well, it's an APU: a CPU + GPU.

It really should be "Llano APU 325% faster..."

Looks like Sandy Bridge got owned

http://www.youtube.com/watch?v=mdPi4GPEI74
 
GPU is right, since it's the GPU component that is 325% faster and not the entire APU or CPU portion.
 
Interesting claim from AMD. If so, that would mean death to a range of budget graphics cards, including their own.
Yes, AMD's balance sheet is going to be very dependent upon how much business they pick up from OEMs which presently use Nvidia low-end graphics - especially the huge Asian (read: China) markets.
Can't help but think that revenue is likely to be up - if they gain a larger share in the low-end (Brazos) and lower-mainstream (Llano) markets - but margins and the profit line could take a hit. I'd assume that Zambezi will cannibalize/replace AMD's current lineup, so it would seem that they really need Interlagos to make a dent in Intel's stranglehold on the server market* to turn the corner substantially.

*[source]
 
Everyone who knows anything about Intel chipsets knows they perform like crap compared to AMD/NVIDIA solutions. Larrabee was always going to fail unless you like brilliant-looking graphics at 0.5fps. It took them several chipsets to get HDMI decoding right (they claimed G35 and G45 did it, but both had non-functioning or incomplete implementations).
It just needs more press like this so mainstream takes notice.
 
AMD has already proved to us that the GPU solution in Llano will leave anything else integrated today far behind, but we are starting to worry about the lack of material around the actual CPU portion of the chip.

^That's my thought. So the question is, are you really that excited that they mixed old ish with new ish? It's early on, and eventually they'll pair it with a CPU built on a new micro-architecture, but for now this doesn't interest me; then again, neither do laptops.
 
"AMD - Llano offers 325% better graphcis performance than Sandy Bridge "

not sure how much it matters, it should be "graphics"
 
Considering Llano is the mainstream option, AMD making little noise about the CPU portion does make sense. Joe Blow, who has a Core2 with a 5400RPM HD, wouldn't notice much of a difference when it comes to viewing email or YouTube.

But then again, he likely wouldn't push Llano's potential graphics power, either. It'll most likely come down to power consumption and platform cost. Not sure if AMD has nailed the latter.

Bulldozer is where raw processing power will be a factor.
 
It's about time for AMD to shine. The four-year Intel/AMD power struggle is just about on schedule for a switch.
 
Sorry...
I lost my exuberance for dealing with three (or four) different graphics driver sets on one system.

Oooh, almost missed this gem... though I smell a set-up.
It's about time for AMD to shine. The four-year Intel/AMD power struggle is just about on schedule for a switch.
Like your optimism.
With AMD, IBM et al now realising - belatedly - that gate-first is pretty much a dead end, they now need to re-jig for gate-last (where Intel seems to have a clear run to 16/14nm thanks to FinFETs). Given that Intel and TSMC have pretty much all the experience with gate-last, wouldn't it be just a little bit surprising if Intel didn't already have their IP and patents over the best ways through the process (and not just FinFETs)?
For AMD to get out from under is going to require some serious R&D investment. Intel will more than likely remain ahead in performance per die-size mm² and performance per watt (thanks in part to using a smaller process) for the foreseeable future, so AMD will need to beat Intel at the architecture level, because they certainly won't be doing it on clockspeed or TDP.

As for Llano, it does provide a real opportunity for AMD, although I suspect that beating SB by a whopping margin is somewhat of a hollow victory- Intel aren’t exactly known as the gold standard in graphics. While AMD tout the gaming qualities of Llano, I’m sure Intel will be expounding on the “good enough” principle of low-res gaming and pointing to the fact that Llano is still K10 architecture based. AMD will surely reply with "How much computing power do you need in a mainstream CPU?"

rinse, repeat
 
With AMD, IBM et al now realising - belatedly - that gate-first is pretty much a dead end, they now need to re-jig for gate-last (where Intel seems to have a clear run to 16/14nm thanks to FinFETs). Given that Intel and TSMC have pretty much all the experience with gate-last, wouldn't it be just a little bit surprising if Intel didn't already have their IP and patents over the best ways through the process (and not just FinFETs)?

So if I understand this correctly, gate-last avoids some undesired effects that the heating process has on the high-k hafnium transistors in the areas of leakage, conductivity etc. How much of an architectural re-write is this process of changing to gate-last exactly? (assuming AMD adopts a tri-gate of its own) It almost seems that Glo-Fo's changing from GF to GL may be a rising-'tide' type of scenario? ...or am I way off the mark here?
 
So if I understand this correctly, gate-last avoids some undesired effects that the heating process has on the high-k hafnium transistors in the areas of leakage, conductivity etc.
Aye. By quite a margin, it would seem.
[Dave Kanter] [Thomas Hoffmann]
How much of an architectural re-write is this process of changing to gate-last exactly?
Wouldn't have a clue, tbh. Haven't seen anyone switch from GF to GL on an architecture.
(assuming AMD adopts a tri-gate of its own) It almost seems that Glo-Fo's changing from GF to GL may be a rising-'tide' type of scenario? ...or am I way off the mark here?
Samsung had already jumped ship; IBM and the rest, I think, compared their ramp to 32nm with Intel's - who had basically launched the Nehalem architecture change, quickly morphed it into Lynnfield, whilst simultaneously introducing the same arch on 32nm with Clarkdale. Intel don't seem to have had any problems (manufacturing-wise) with Clarkdale, Gulftown or Sandy Bridge.
 
By the way, Ivy Bridge prototypes are being circulated for testing etc. right now, so the 'good idea' has in reality come to pass.

Anyway, on the GPU side of things, it is good to see AMD showing some progress; this should at least force Intel to improve its IGP enough for it to be thereabouts. I think entry-level GPUs are a total waste of money, as they don't offer compelling enough performance for a lot more power consumption; hence, elimination of this segment should be good for consumers (read: a lot less deception from OEMs).

Google being a bit 'deceptive' when it comes to patching up their own holes? ;)
 