The Last Time Intel Tried to Make a Graphics Card

>> "The fixed function nature of [GPU] shader cores makes it difficult to run non-gaming workloads on them. This meant that game technology innovations were often held in check by graphics hardware capabilities."

Sentence 1 is strictly untrue, else Nvidia wouldn't be invading the scientific/engineering/AI and data-mining markets so heavily; and whether true or false, sentence 2 doesn't follow from sentence 1.

>> " One of the key drawbacks to the ring topology is that data needs to pass through each node on its way. The more cores you have, the greater the delay."

It wasn't really the ring topology so much as the exponential communication requirements of MIMD architectures. Intel made the same mistake nearly twenty years before Larrabee, in its "Personal Supercomputer" (iPSC) project. It used a hypercube-based topology, which avoided the delays of the ring structure used by Larrabee, but still failed due to the massive intercommunication bandwidth necessary to feed those processors.
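Just to put rough numbers on the topology point: here's a minimal sketch (my own illustration, not from the article) comparing the average number of hops a message needs between two random nodes in a bidirectional ring versus a hypercube with the same node count. Ring distance grows roughly linearly with core count, while hypercube distance grows only logarithmically; as noted above, though, neither topology solves the raw bandwidth problem.

```python
# Rough sketch: average hop counts for a bidirectional ring vs. a hypercube.

def ring_avg_hops(n: int) -> float:
    """Average shortest-path length between distinct nodes on a bidirectional ring."""
    return sum(min(d, n - d) for d in range(1, n)) / (n - 1)

def hypercube_avg_hops(dim: int) -> float:
    """Average Hamming distance between distinct nodes of a 2**dim-node hypercube."""
    n = 2 ** dim
    # Over all ordered pairs the mean Hamming distance is dim/2; excluding
    # identical pairs rescales that by n / (n - 1).
    return (dim / 2) * n / (n - 1)

for dim in (4, 5, 6):  # 16, 32 and 64 cores
    n = 2 ** dim
    print(f"{n:3d} cores: ring avg ~ {ring_avg_hops(n):5.2f} hops, "
          f"hypercube avg ~ {hypercube_avg_hops(dim):4.2f} hops")
```

At 64 cores that works out to roughly 16 hops on average for the ring versus about 3 for the hypercube, which is exactly why the ring looks worse and worse as core counts grow.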
 
It doesn't seem to be their speciality, but I'll keep an open mind and see what they come up with. Still, with so many graphics card makers in the marketplace, I think their investment would be better served elsewhere .....
 
It was true at the time of Larrabee's development, 12 or so years ago (e.g. the GeForce GTX 200 era).
CUDA was released a couple of years before that, and tools like libsh and Accelerator even earlier. But ignoring that, the point I was making was that the article uses the present tense, not the past.
 
It was true at the time of Larrabee's development, 12 or so years ago (e.g. the GeForce GTX 200 era). It's definitely not the case now, of course.

At the time of Larrabee's development, ATI and Nvidia were both shipping cards with pixel pipelines and vertex shader cores. Not unified shaders. And while ATI was clearly the first to really deep dive into unified shaders and put them in a consumer product, ATI's R400 was a unified shader design that was scrapped. In its place, ATI's R300 tech was quickly scaled up to become the X800 series. The R400 chip would later be upgraded to the R500, a version of which only ever saw use in the Xbox 360. Older DX9 games didn't favor the design change, and that was clearly shown when DX10 cards came to market.

It was easy to see that Intel clearly overlooked Unified Shader tech and instead went for an overly complex setup that they would no doubt mess up. Doomed from the start. Their plan may have worked if they had built a design from the ground up. The Pentium MMX arch, or P5, was a one-time-use uarch from Intel, replaced by the leagues-better Pentium Pro design, whose core basis is at the heart of all their future success. They left it for a while with their NetBurst experiment, but it lived on in laptops and then came back to the desktop with the Core series. Using Pentium MMX as the basis, rather than a design built for scaling, was their downfall.

IMO Intel's new desktop-class cards will not perform better than AMD's or Nvidia's. But at the same time, I don't think they need to. They would be perfect for OEM systems and people looking for cheap lower-end cards. But I'd expect their 2nd or 3rd gen to catch up.
 
IMO Intel's new desktop-class cards will not perform better than AMD's or Nvidia's. But at the same time, I don't think they need to. They would be perfect for OEM systems and people looking for cheap lower-end cards. But I'd expect their 2nd or 3rd gen to catch up.
Brilliantly said. I think this is the main reason why Intel decided to strengthen their GPU division: to keep their OEM customers. They were probably afraid of losing the OEM market to AMD.
 
Excellent as always. These long-form explanatory articles are why I prefer TechSpot over the constant reviews of identical cases and power supplies that now seem to be the staple of other well-known tech sites.
 
At the time of Larrabee's development, ATI and Nvidia were both shipping cards with pixel pipelines and vertex shader cores. Not unified shaders.
Intel released details on Larrabee at SIGGRAPH 2008 and informally stated that they were stopping development of a discrete graphics card in May 2010. So if one assumes that Intel started working on Larrabee in, say, 2005, then that first three-year period definitely had unified shader products on the market. ATi were already using a unified shader architecture in the Xbox 360 (2005). The first PC graphics cards to ship with unified shaders were Nvidia's GeForce 8 series in November 2006 and ATi's Radeon HD 2000 series in June 2007.

It was easy to see that Intel clearly overlooked Unified Shader tech and instead went for an overly complex setup that they would no doubt mess up. Doomed from the start.
Definitely doomed :) but I would argue that Intel hadn't so much overlooked it as simply thought they could do better.
 
I don't know how seriously to take this. I mean, Intel has been making iGPUs since before the Core 2 Duo days, and despite ~20 years of experience, they've never really figured out how to make their GPUs more useful than the GeForce 8400 GS (essentially a display adapter). They ignored a VERY lucrative GPU market despite having the most common GPUs on the planet for an incredibly long time and money coming out of their ears. Every time they tried something, it fell apart. It was like watching a toddler trying to build a sand castle with dry sand, deciding it was impossible, and going off to do something else with that same sand/silicon, like they did with Larrabee.

I was never able to understand why, so I don't know if high-performance GPUs are so complex that even Intel can't really get into them, or if it's something else. One thing's for sure: there was definitely some kind of barrier standing in Chipzilla's way.
 
I don't know if high-performance GPUs are so complex that even Intel can't really get into them, or if it's something else. One thing's for sure: there was definitely some kind of barrier standing in Chipzilla's way.
I'm sure they were simply attempting to avoid cannibalizing their CPU market. That may not have been as immediately ruinous a decision as Kodak's choice to shun digital cameras to protect film sales, but in the long run it may run a close second.

Incidentally, that also explains their decision to build Larrabee around a MIMD architecture, which makes it less of a GPU and more of a graphics-oriented CPU.
 
They ignored a VERY lucrative GPU market despite having the most common GPUs on the planet for an incredibly long time and money coming out of their ears.
Back then, though, it wasn't as lucrative as it is now - take Nvidia's financials from 2005 to 2010, where revenue ranged from $2b to $3b and net income was highly variable. That may have been a factor behind the lack of commitment to pushing out a discrete graphics card, especially compared to how much money they were raking in from their processor division.

I'm sure they were simply attempting to avoid cannibalizing their CPU market.
This is a good point, and perhaps ties in with the above - Core was clearly such a success, both architecturally and financially, that the risk of releasing any product that might damage that market or their image could well have ensured Larrabee never became anything more than an engineering experiment.
 
Back then, though, it wasn't as lucrative as it is now - take Nvidia's financials from 2005 to 2010, where revenue ranged from $2b to $3b and net income was highly variable. That may have been a factor behind the lack of commitment to pushing out a discrete graphics card, especially compared to how much money they were raking in from their processor division.
That's true, but they also threw money at things that were complete failures, like Ultrabooks. I remember Charlie Demerjian's less-than-complimentary assessment of Intel's "fiscal responsibility", so I still always found it odd.
 
Intel released details on Larrabee at SIGGRAPH 2008 and informally stated that they were stopping development of a discrete graphics card in May 2010. So if one assumes that Intel started working on Larrabee in, say, 2005, then that first three-year period definitely had unified shader products on the market. ATi were already using a unified shader architecture in the Xbox 360 (2005). The first PC graphics cards to ship with unified shaders were Nvidia's GeForce 8 series in November 2006 and ATi's Radeon HD 2000 series in June 2007.

IMO Intel had to choose a design path right away, and unified shader products, like you said, didn't really hit the market until the end of 2006. While we did have the 360 early on, from a performance perspective it wasn't much faster than PC hardware at the time. So I understand Intel's engineers' choice to go a different route from what was previously the norm, when they had so little experience.

Intel's success has largely been thanks to the Pentium Pro and its derivatives. That, and AMD's own fabs at the time (now GlobalFoundries), kept AMD behind schedule like clockwork. Thankfully, GF's inability to keep up pushed AMD to TSMC; they would not be in the position they are today with GF.

The tables sure have turned, with Intel no longer having the fab advantage, and it doesn't seem like that will change for years to come. TSMC is firing on all cylinders with all that Apple money.
 
Price-performance is the key metric here, even if Xe does not match the scalding performance of AMD and Nvidia at the high end. We are not all gamers. Nvidia has shown the way in applying its technology to whole classes of other problems; its high-end gaming cards are very good for machine learning, for example.
 
Brilliantly said, I think this is the main point why intel decided to strengthen their GPU division: To keep their OEM customers. Probably they were afraid to lose the OEM market to AMD.
Looking at what I'd say are the majority of OEM systems with a dGPU, they usually have a 16xx-series card (at best) or MX graphics in the case of laptops. If I were at Intel, I'd want that part of the BOM, particularly since being on par with that level of card should be a lot easier than going for the gaming crown.
That will also allow them to offer bundling deals to OEMs and hurt their HPC competitor Nvidia - every GPU Nvidia doesn't sell means a dent in its bottom line.
 
Intel is floundering badly and trying to stay relevant amid all the innovation from its competitors. I'm willing to bet there's still a fair chance this endeavor will never make it to market.
 
At the time of Larrabee's development, ATI and Nvidia were both shipping cards with pixel pipelines and vertex shader cores. Not unified shaders. And while ATI was clearly the first to really deep dive into unified shaders and put them in a consumer product, ATI's R400 was a unified shader design that was scrapped. In its place, ATI's R300 tech was quickly scaled up to become the X800 series. The R400 chip would later be upgraded to the R500, a version of which only ever saw use in the Xbox 360. Older DX9 games didn't favor the design change, and that was clearly shown when DX10 cards came to market.

It was easy to see that Intel clearly overlooked Unified Shader tech and instead went for an overly complex setup that they would no doubt mess up. Doomed from the start. Their plan may have worked if they had built a design from the ground up. The Pentium MMX arch, or P5, was a one-time-use uarch from Intel, replaced by the leagues-better Pentium Pro design, whose core basis is at the heart of all their future success. They left it for a while with their NetBurst experiment, but it lived on in laptops and then came back to the desktop with the Core series. Using Pentium MMX as the basis, rather than a design built for scaling, was their downfall.

IMO Intel's new desktop-class cards will not perform better than AMD's or Nvidia's. But at the same time, I don't think they need to. They would be perfect for OEM systems and people looking for cheap lower-end cards. But I'd expect their 2nd or 3rd gen to catch up.
Not if Raj is running the show; he could screw up beer sales in Ireland.
 
For a while they also experimented with motherboards sporting a chipset incorporating ATI graphics. That collaboration ended when AMD acquired ATI.

I remember the days when add-on cards were required for practically everything. Floppy and HDD controllers. Video cards. Ethernet cards. Serial and parallel ports. In those days there were several graphics chip manufacturers in the midrange. Intel could have acquired one of them instead of going it alone and making a mess of it.
 
For a while they also experimented with motherboards sporting a chipset incorporating ATI graphics. That collaboration ended when AMD acquired ATI.

I remember the days when add-on cards were required for practically everything. Floppy and HDD controllers. Video cards. Ethernet cards. Serial and parallel ports. In those days there were several graphics chip manufacturers in the midrange. Intel could have acquired one of them instead of going it alone and making a mess of it.
You've forgotten about Kaby Lake-G, haven't you? ;)
 
Great article, but that's not Quake 4 being ray traced, it's ETQW (Enemy Territory: Quake Wars), which Bethesda should totally remake.
 