Intel accidentally confirms four Xe discrete GPUs, hints at more upcoming products

mongeese

Staff
Something to look forward to: This year, Nvidia has continued to dominate the high-end GPU market while AMD has claimed a few victories at mainstream price points with Navi. That healthy competition has also forced Nvidia to lower pricing. Next year, Intel will join the fray with at least four powerful discrete GPUs featuring hardware-accelerated ray tracing.

As happens every so often, Intel accidentally published a developer version of its 26.20.16.9999 graphics driver containing the codenames for ten different categories of products. It has since been taken down, but Anandtech forum user ‘mikk’ republished them all.

The highlight, of course, is the set of references to Intel’s new Xe discrete GPUs, which are mentioned five times. Three mentions are thick with detail: “iDG2HP512,” “iDG2HP256,” and “iDG2HP128.” All Intel graphics processors carry the “i” prefix, “DG” is thought to mean discrete graphics, and the “2” is the variant. “HP” could mean high-power, implying these are fully fledged desktop GPUs, and the trailing number could be the count of execution units, Intel’s equivalent of CUDA cores.
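That reading of the codenames can be expressed as a small decoder. To be clear, every field meaning below (the “i” prefix, “DG” plus a variant digit, an “HP”/“LP” power class, a trailing EU count, and an optional “DEV” suffix) is the article’s speculation, not anything Intel has confirmed:

```python
import re

# Speculative decoder for the leaked discrete-GPU codenames.
# All field meanings are guesses based on the article's reading.
PATTERN = re.compile(r"i(DG)(\d)(HP|LP)(\d+)?(DEV)?$")

def decode(name):
    """Return a dict of guessed fields, or None if the name doesn't fit the pattern."""
    m = PATTERN.match(name)
    if not m:
        return None
    _, variant, power, eus, dev = m.groups()
    return {
        "variant": int(variant),                                  # DG1 vs DG2
        "power": "high-power" if power == "HP" else "low-power",  # HP/LP guess
        "eus": int(eus) if eus else None,                         # execution units, if listed
        "dev": dev is not None,                                   # early "dev" status
    }

print(decode("iDG2HP512"))  # {'variant': 2, 'power': 'high-power', 'eus': 512, 'dev': False}
print(decode("iDG1LPDEV"))  # {'variant': 1, 'power': 'low-power', 'eus': None, 'dev': True}
```

Names that don’t follow the discrete-GPU scheme, such as the CPU entries later in the list, simply fail to match.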

While 512 execution units sounds low compared to Nvidia’s shader counts, Intel’s graphics architecture is quite different. Intel’s Gen11, for example, provides 16 flops per execution unit per clock, such that a 512 EU graphics core operating at 1800 MHz would reach 14.7 TFLOPS, a little more than an RTX 2080 Ti. A 256 EU GPU would hit 7.4 TFLOPS, about the same as an RTX 2070, and a 128 EU GPU would manage 3.7 TFLOPS. Of course, we have no idea what clock speeds the GPUs will actually run at.
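The arithmetic behind those figures is straightforward: peak TFLOPS is EUs × flops per EU per clock × clock speed. Note the 16 flops/EU/clock figure is carried over from Gen11 on the assumption that Xe keeps it, and the 1.8 GHz clock is purely hypothetical:

```python
# Rough peak-TFLOPS estimate for the rumored Xe parts, assuming Gen11's
# 16 FP32 flops per EU per clock carries over (an assumption, not confirmed).
def xe_tflops(eus, clock_ghz, flops_per_eu_per_clock=16):
    """Peak single-precision TFLOPS = EUs * flops/EU/clock * clock (GHz) / 1000."""
    return eus * flops_per_eu_per_clock * clock_ghz / 1000

for eus in (512, 256, 128):
    print(f"{eus} EUs @ 1.8 GHz ~ {xe_tflops(eus, 1.8):.1f} TFLOPS")
```

Dropping the clock to, say, 1.5 GHz would pull the 512 EU part down to about 12.3 TFLOPS, which is why the clock-speed caveat matters so much.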

The last two mentions of discrete GPUs are vaguer. An “iDG1LPDEV” suggests that a “1” low-power variant is in the works and might arrive later, due to the “dev” status. It could be a separate GPU for laptops. An “iATSHPDEV” name references the Arctic Sound code-name, which Intel is using for its first-generation discrete GPUs. Code relating to it describes it as using Gen12 architecture in a high-power configuration. It might be initial testing of a second-gen component.

On the CPU side, the driver update lists five series’ worth. In order of soonest to be released, “LKF-R” is Lakefield, an unusual low-power processor series with five cores (one large core and four small ones on a 10nm compute die, stacked on a 22nm base die). Four models appear in the listing. “EHL,” or Elkhart Lake, is a group of ultra-low-power SoCs built on 10nm and expected to be released next year. It also appears with four variants.

“TGL” is the juicy Tiger Lake. The successor to the imminent Ice Lake, Tiger Lake will be the mainstream architecture for late 2020 using 10nm. The driver lists eight models employing Gen12 Xe graphics, though in a low-power configuration.

“RKL” is Rocket Lake, the replacement for Tiger Lake coming in 2021. We can see from the eight processors listed that it is still in the early stages of development. Breaking down one name, “iRKLLPGT1H32,” we can see that one Rocket Lake CPU will use the “GT1” graphics engine with 32 execution units, a solid showing. Others in the listing employ “GT0” and “GT0.5” GPUs, some of which appear to use 16 execution units.

Last but not least, we have “ADLS,” Alder Lake, which replaces Rocket Lake in 2021 but is still expected to rely on 10nm manufacturing. Only two “dev” models are listed, so we don’t know much at all, but it’s likely Intel doesn’t either; it’s still four generations away. There are also three listings yet to be deciphered, “iRYFGT2,” “iJSLSIM,” and “iGLVGT2,” each with only one model.

While interpreting the leaked information doubtless involves some error, the general gist of it, for both GPUs and CPUs, is promising. Future integrated GPUs look powerful, and Intel is well underway in developing many generations’ worth of processors. Its discrete GPUs, provided there are no ugly surprises on the clock-speed or software front, appear capable of battling Nvidia across the line-up. With a little luck, Intel will provide the competition the market deserves.


 
It'll be interesting if Intel competes with Nvidia at the high end while AMD continues to offer mid-range to low-end GPUs at a better value.
 
More vaporware from the big guys.

Doubt it.

From March 2019 Interview with Raja Koduri:

From the interview, it seems Koduri asked himself an important question about what he wanted to do, and how he could best apply his knowledge and skills, over the next 10 years. Only one company checked all the boxes with regard to the core technologies, assets, and people he sought in order to "do some beautiful, amazing things" with the deluge of data from the growth in computing, social media, mobile, IoT and so on: Intel was the only choice.

The next interesting topic covered by Barron's is how Koduri successfully headhunted Jim Keller for Intel within a couple of months of accepting employment there. Koduri said he managed to communicate his excitement about the possibilities available at Intel over the next 10 years, and so Keller gave up his position at Tesla to help Koduri marshal Intel's 20,000-plus design engineering team.

Of the above 20,000 folk, Koduri revealed that he has a 4,500-strong team working on Intel's future integrated and discrete graphics projects. Koduri emphasised the importance of balance in this design team between hardware and software expertise. Excitedly, the Intel graphics chief explained "What I'm doing is helping them figure out how to build products that scale up from the low power, mobile domain up to petaflops—the big data center GPUs. Both internally and externally, there's a lot of excitement from our customers, from enthusiasts, from the market about our entry into discrete graphics in 2020."
 
From what I've seen, Intel has eight shader cores per execution unit, which means 512 EUs would amount to 4,096 shader cores. Of course, Nvidia and AMD use different names and configurations for their compute units and shaders.
 
I really hope they do something about nVidia because AMD is doing nothing there...

AMD rivals Intel and Nvidia at the same time, but in different markets. So they can't just pull out something extra-powerful without harming the accompanying business. But they really do try.
And I really hope that someday the times of the Radeon 9700/GeForce 5900 rivalry will return, when the competition was fierce and there were also a lot of other players on the market (Matrox, S3, etc.).
 
If all these Intel "lakes" actually existed, water would likely cover 90% of the earth's surface. Of course with the pace of global warming, that may occur sooner rather than later anyway.

Come to think of it, perhaps the heat generated by Intel's new graphics processors may, in and of itself, be the tipping point...:eek:.

So Kidz, the moral of the story is, don't everybody rush out and buy Intel's new video cards all at once, it could herald our doom. Try and pace yourselves, show some restraint, use some discretion.
 
"Accidentally" ..... what a laugh, and a completely worn-out term. Why don't these companies simply release the information and ask, "What do you think about this?" They would get a lot more responses, most of which would be valuable to their efforts.
 
I work on an engineering team that designs compute solutions with clusters of servers with GPGPUs. I've worked with NVidia, AMD, and Intel Phi cards for years now. The one thing that I know for certain is that Intel's driver development is the very worst of them, by a long shot. AMD is second-worst, and NVidia's is passable after maybe two or three iterations of the same driver. All of it is unacceptable, though. All of it reeks of forced releases driven by stupid marketing schedules.
 
128, 256 and 512 would make more sense as memory bus widths than as the number of graphics cores.
 
I don't care about the hardware; can INTEL deliver drivers?
That's exactly what I said right above you. Intel drivers are THE WORST.
Well, at least from my personal experience, I beg to differ. I've never had a stitch of a problem with Intel's chipset or IGP drivers, either on the board (G31, G41) or in the CPU (i3 and up).

I have, ATM, nasty recurring problems with two of the much-vaunted Nvidia drivers. One just quits working (a GT 710 in an Intel P45), the other fails to wake if left on standby for too long (Z170, GTX 1050 Ti).

I'm not a gamer, so I can't speak to that, but for day-to-day imaging work or surfing the web, Intel video has never failed me, all the way back to the 915 board era.

@axiomatic I certainly can't speak to server applications either. But my home experience has been trouble-free.

Perhaps Intel's foray into discrete VGA will be problematic, as they won't have the cumulative experience with optimizing drivers for a diverse array of individual games, as both Nvidia and even AMD do.
 
Be interesting to see if Intel pulls up with some serious graphics power and targets the workstation environments. Most of what we see on these sites centers around gaming performance comparisons, but the corporate workstation is a potentially huge market as well.
 
I do not disagree on chipset drivers. But not much changes in the management engine, whereas GPUs have to be tailored regularly for all kinds of reasons. Intel's NIC drivers are a mess as well. Don't even get me started on combining Intel drivers with Linux kernel compilation: so many dependencies just to get CPU power management functional, and at the same time so limited in the kernels it supports. Peace, brother.
 