In a sign it's getting serious about graphics, Intel rehires 'Larrabee' GPU architect

Julio Franco

Why it matters: Intel hasn't been competitive in the GPU market for more than two decades, and the closest thing to a discrete GPU it currently sells is the Xeon Phi compute card. But that's about to change.

Intel is expected to become a player in the consumer GPU market after hiring the head of AMD's Radeon Technologies Group last year. Raja Koduri's official title at Intel is senior vice president, overseeing the company's Core and Visual Computing Group. Koduri is not the only high-profile hire Intel has made of late, either.

Most recently, Intel confirmed that its first true dedicated graphics card (in a long, long time) will be launching in 2020. But this is not the first time Intel has worked on its own graphics processor. About ten years ago, Intel was talking up its x86-derived Larrabee GPGPU. On paper, it sounded like a real threat to Nvidia and ATI (now AMD), but after several delays and complications, plans to market a consumer version of Larrabee were put on hold indefinitely.

Today we're finding out that one of the "fathers" of Larrabee, Tom Forsyth, is rejoining the company. Forsyth is a self-described graphics programmer and chip designer with recent stints at Oculus and Valve. He was on the original Larrabee team, and now he's back at Intel...

The Larrabee project was subject to much debate over the years, often called a failure, but in Tom Forsyth's own words, it was actually a "pretty huge success." Larrabee never made it to market as a consumer product -- you know, the kind we'd have loved to see compete with Radeon and GeForce -- but out of that project Intel got Xeon Phi, an x86-based manycore processor used in supercomputers and high-end workstations.

Forsyth wrote in an older blog post that, in terms of engineering and focus, Larrabee was never primarily a graphics card. This time, however, Intel appears genuinely focused on developing a graphics product for consumers. According to the latest rumors, Intel has split its dedicated graphics project in two: one effort will focus on video streaming within data centers, and the other will target gaming.

Whatever Intel's ultimate motivation, be it gaming and consumer graphics or the other workloads modern GPUs excel at, such as machine learning, AI, and self-driving cars, we'll find out soon enough.


 
If nothing else, I want to see better iGPUs on Intel parts. Intel has tacitly admitted consumers want this and that it sees a market for it, hence the pretty stunning move of pairing several 8th-gen CPU models with Vega graphics.

If the discrete cards fail again, then maybe they can push through something better in their iGPU designs over the next couple of years.
 
"...Intel confirmed that its first true dedicated graphics card will be launching in 2020."

I had an Intel i740 AGP dedicated graphics card in my first computer in 1998.

https://en.wikipedia.org/wiki/Intel740

It wasn't very competitive with the RIVA TNTs, Radeons, and Voodoos of the day, but it was a separate card (that I eventually replaced with a GeForce 2 MX).
 
I had an Intel i740 AGP dedicated graphics card in my first computer in 1998.
True, I also had one of those Intel AGP graphics cards at some point. It was meant to be implied by the intro, but it wasn't factually correct on its own; now corrected :)
 
I had an Intel i740 AGP dedicated graphics card in my first computer in 1998.
I went with the TNT. Anyway, personally, IF Intel can get something compelling out the door, especially for notebooks, it would be much more interesting, because if done right, it has the potential to reduce power consumption while delivering competitive performance, at least in the mid to upper-midrange segment.
 
I'd like to start the rumor right now, that the upcoming Intel GPU will beat the upcoming GTX 1180. I back that prediction with nothing at all, just wishful thinking, because that would really cause upheaval and disruption in the high-end graphics market, a condition that's been needed for some time now. Right this minute, my prediction is as valid as anyone's, however far-fetched it may be. We have no verified info at all about either GPU, but that never stopped the rumor mill in the past. So feel free to speculate, before it's too late.
 