
Intel accidentally confirms four Xe discrete GPUs, hints at more upcoming products

By mongeese · 21 replies
Jul 27, 2019
  1. As happens every so often, Intel accidentally published a developer version of their 26.20.16.9999 graphics driver with the codenames for ten different categories of products. They’ve since taken it down, but Anandtech forum user ‘mikk’ republished them all.

    The highlight, of course, is the references to Intel’s new Xe discrete GPUs, which are mentioned five times. Three mentions are thick with detail: “iDG2HP512,” “iDG2HP256,” and “iDG2HP128.” All Intel graphics products carry the “i” prefix, “DG” is thought to mean discrete graphics, and the “2” is the variant. “HP” could mean high-power, implying these are fully fledged desktop GPUs, and the trailing three-digit number could be the number of execution units, Intel’s equivalent of CUDA cores.
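    If the naming scheme really does break down this way, the codenames can be decoded mechanically. The sketch below is purely illustrative – the field meanings are the speculation above, not anything Intel has documented:

    ```python
    import re

    # Speculative decoding of the leaked driver codenames. The fields
    # (prefix "i", "DG" = discrete graphics, generation digit, power tier,
    # trailing EU count or "DEV") are guesses inferred from the leak.
    PATTERN = re.compile(r"^i(DG)(\d)(HP|LP)(\d+|DEV)$")

    def decode(codename: str) -> dict:
        m = PATTERN.match(codename)
        if not m:
            raise ValueError(f"unrecognized codename: {codename}")
        _family, gen, power, tail = m.groups()
        return {
            "family": "discrete graphics",
            "generation": int(gen),
            "power_tier": "high-power" if power == "HP" else "low-power",
            "execution_units": None if tail == "DEV" else int(tail),
        }

    print(decode("iDG2HP512"))
    print(decode("iDG1LPDEV"))
    ```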

    While 512 execution units sounds low compared to Nvidia’s specs, Intel’s graphics architecture is quite different. Intel’s Gen11, for example, provides 16 flops per execution unit per clock, such that a 512 EU graphics core operating at 1800 MHz would reach 14.7 TFLOPS, a little more than an RTX 2080 Ti. A 256 EU GPU would have 7.4 TFLOPS, the same as an RTX 2070, and a 128 EU GPU would have 3.7 TFLOPS. Of course, we have no idea what speeds the GPUs will actually operate at.
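    The arithmetic above is easy to check. A minimal sketch, assuming Gen11’s 16 FLOPS per EU per clock and a purely hypothetical 1800 MHz clock:

    ```python
    # Back-of-the-envelope peak-throughput estimate from the article's
    # assumptions: 16 FLOPS per execution unit per clock (the Gen11 figure).
    # The 1800 MHz clock is hypothetical; real clocks are unknown.
    FLOPS_PER_EU_PER_CLOCK = 16

    def peak_tflops(execution_units: int, clock_mhz: float) -> float:
        # EUs * FLOPS/EU/clock * clocks per second, expressed in TFLOPS
        return execution_units * FLOPS_PER_EU_PER_CLOCK * clock_mhz * 1e6 / 1e12

    for eus in (512, 256, 128):
        print(f"{eus} EUs @ 1800 MHz ~ {peak_tflops(eus, 1800):.1f} TFLOPS")
    ```

    Which reproduces the article’s 14.7 / 7.4 / 3.7 TFLOPS figures for the three rumored configurations.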

    The last two mentions of discrete GPUs are vaguer. An “iDG1LPDEV” suggests that a “1” low-power variant is in the works and might arrive later, due to the “dev” status. It could be a separate GPU for laptops. An “iATSHPDEV” name references the Arctic Sound code-name, which Intel is using for its first-generation discrete GPUs. Code relating to it describes it as using Gen12 architecture in a high-power configuration. It might be initial testing of a second-gen component.

    On the CPU side, the driver update lists five series’ worth. In order of soonest to be released, “LKF-R” is Lakefield, a strange low-power processor series with five cores (one big, four small) that stacks a 10nm compute die on a 22nm base die. Four models appear in the listing. “EHL,” or Elkhart Lake, is a group of ultra-low-power SoCs built on 10nm expected to be released next year. It also appears with four variants.

    “TGL” is the juicy Tiger Lake. The successor to the imminent Ice Lake, Tiger Lake will be the mainstream architecture for late 2020 using 10nm. The driver lists eight models employing Gen12 Xe graphics, though in a low-power configuration.

    “RKL” is Rocket Lake, the replacement for Tiger Lake coming in 2021. We can see from the eight processors listed that it is still in the early stages of development. Breaking down one name, “iRKLLPGT1H32,” we can see that one Rocket Lake CPU will use the “GT1” graphics engine with 32 execution units, a solid showing. Others in the listing employ “GT0” and “GT0.5” GPUs, some of which appear to use 16 execution units.

    Last but not least we have “ADLS,” Alder Lake, which replaces Rocket Lake in 2021 but is still expected to rely on 10nm manufacturing. Only two “dev” models are listed, so we know next to nothing, but it’s likely Intel doesn’t know much either – it’s still four generations away. There are also three listings yet to be deciphered – “iRYFGT2,” “iJSLSIM,” and “iGLVGT2” – each with only one model.

    While interpreting the leaked information doubtless involves some error, the general gist for both GPUs and CPUs is promising. Future integrated GPUs look powerful, and Intel is well underway in developing many generations’ worth of processors. Their discrete GPUs, provided there are no ugly surprises on the clock speed or software front, appear capable of battling Nvidia everywhere in the line-up. With a little luck, Intel will provide the competition the market deserves.


     
  2. Puiu

    Puiu TS Evangelist Posts: 3,573   +2,055

    More competition is always welcomed.
     
  3. Nobina

    Nobina TS Evangelist Posts: 2,045   +1,562

    It'll be interesting if Intel competes with Nvidia at the high end while AMD continues to offer mid-range to low-end GPUs at a better value.
     
  4. XtremeHammond

    XtremeHammond TS Booster Posts: 101   +67

    I hope Intel will bring something top-tier like Nvidia's x080 Ti level.
     
  5. JaredTheDragon

    JaredTheDragon TS Guru Posts: 637   +411

    More vaporware from the big guys.
     
    Charles Olson likes this.
  6. Faplyboy

    Faplyboy TS Enthusiast Posts: 45   +12

    AMD > Intel
    reason: price
     
    Charles Olson likes this.
  7. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,619   +945

    Doubt it.

    From March 2019 Interview with Raja Koduri:

    From the interview it seems that Koduri asked himself an important question about what he wanted to do, how he could best apply his knowledge and skills, during the next 10 years. Only one company checked all the boxes with regard to core technologies, assets, and people that Koduri sought to "do some beautiful, amazing things," with the deluge of data from the growth in computing, social media, mobile, IoT and so on – Intel was the only choice.

    The next interesting topic covered by Barron's is how Koduri successfully headhunted Jim Keller for Intel within a couple of months of accepting employment there. Koduri said he managed to communicate his excitement about the possibilities available at Intel over the next 10 years, and so Keller gave up his position at Tesla to help Koduri marshal Intel's 20,000-plus design engineering team.

    Of the above 20,000 folk, Koduri revealed that he has a 4,500 strong team working on Intel's future integrated and discrete graphics projects. Koduri emphasised the importance of balance in this design team – between hardware and software expertise. Excitedly, the Intel graphics chief explained "What I'm doing is helping them figure out how to build products that scale up from the low power, mobile domain up to petaflops—the big data center GPUs. Both internally and externally, there's a lot of excitement from our customers, from enthusiasts, from the market about our entry into discrete graphics in 2020."
     
    Last edited: Jul 27, 2019
  8. TheHughMan

    TheHughMan TS Rookie

    From what I've seen Intel has eight shader cores per execution unit, which means 512 EUs would have 4,096 shader cores. Of course Nvidia and AMD have different names and configurations for their compute units and shaders.
     
  9. Gahl1k

    Gahl1k TS Member Posts: 24   +18

    Oh, man. Can't wait for Intel/Intel-Intel/Nvidia Internet feuds.
     
  10. DaveBG

    DaveBG TS Maniac Posts: 424   +157

    I really hope they do something about nVidia because AMD is doing nothing there...
     
  11. XtremeHammond

    XtremeHammond TS Booster Posts: 101   +67

    They rival Intel and Nvidia at the same time, but in different markets. So they can't just pull out something extra-powerful without harming the accompanying business. But they really do try.
    And I really hope that someday, the times of Radeon 9700/GeForce 5900 rivalry will return - when the competition was fierce and there were also a lot of other players on the market (Matrox, S3 etc.).
     
    Charles Olson and DaveBG like this.
  12. theruck

    theruck TS Booster Posts: 158   +46

    The article is pure speculation, hoping-to-be trending speculation
     
    Charles Olson likes this.
  13. captaincranky

    captaincranky TechSpot Addict Posts: 15,184   +4,130

    If all these Intel "lakes" actually existed, water would likely cover 90% of the earth's surface. Of course with the pace of global warming, that may occur sooner rather than later anyway.

    Come to think of it, perhaps the heat generated by Intel's new graphics processors may, in and of itself, be the tipping point...:eek:.

    So Kidz, the moral of the story is, don't everybody rush out and buy Intel's new video cards all at once, it could herald our doom. Try and pace yourselves, show some restraint, use some discretion.
     
    Last edited: Jul 28, 2019
    Puiu and mosu like this.
  14. Uncle Al

    Uncle Al TS Evangelist Posts: 5,716   +4,048

    "Accidentally" ..... what a laugh and a completely worn out term. Why don't these companies simply release the information and ask "what do you think about this?". They would get a lot more responses, most of which would be valuable to their efforts .....
     
    Charles Olson, MarkHughes and Godel like this.
  15. axiomatic13

    axiomatic13 TS Maniac Posts: 231   +162

    I work on an engineering team that designs compute solutions with clusters of servers with GPGPU's. I've worked with NVidia, AMD, and Intel Phi cards for years now. The one thing that I know for certain is that Intel's driver development is the very worst of them. By a long shot. AMD is second-worst and NVidia's is passable after maybe two or three iterations of the same driver. All of it is unacceptable though. All of it reeks of forced releases by stupid marketing schedules.
     
    mosu likes this.
  16. Dimitrios

    Dimitrios TS Guru Posts: 532   +393

    I don't care about the hardware; can Intel deliver drivers?
     
    axiomatic13 likes this.
  17. axiomatic13

    axiomatic13 TS Maniac Posts: 231   +162

    That's exactly what I said right above you. Intel drivers are THE WORST.
     
    Dimitrios likes this.
  18. FF222

    FF222 TS Addict Posts: 190   +122

    128, 256 and 512 would make more sense as a memory bus-width, than the number of graphics cores
     
  19. captaincranky

    captaincranky TechSpot Addict Posts: 15,184   +4,130

    Well, at least from my personal experience, I beg to differ. I've never had a stitch of a problem with Intel's chipset or IGP drivers, either on the board (G-31, G-41), or in the CPU (i3 and up).

    I have, ATM, nasty recurring problems with 2 of the much-vaunted Nvidia drivers. One just quits working (a GT-710 in an Intel P-45), the other fails to wake if left on standby for too long (Z170, GTX-1050 Ti).

    I'm not a gamer, so I can't speak to that, but for day to day imaging work or surfing the web, Intel video has never failed me, all the way back to the 915 board era.

    @axiomatic I certainly can't speak to server applications either. But my home experience has been trouble free.

    Perhaps Intel's foray into discrete VGA will be problematic, as they won't have the cumulative experience with optimizing drivers for a diverse array of individual games, as do both Nvidia and even AMD.
     
    Last edited: Jul 28, 2019
    axiomatic13 and tomkaten like this.
  20. kmo911

    kmo911 TS Booster Posts: 132   +14

    I hope they can compete with AMD and Nvidia. Prices are too high on some GPUs.
     
    Last edited: Jul 29, 2019
    Charles Olson and axiomatic13 like this.
  21. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,482   +521

    Be interesting to see if Intel pulls up with some serious graphics power and targets the workstation environments. Most of what we see on these sites centers around gaming performance comparisons, but the corporate workstation is a potentially huge market as well.
     
    axiomatic13 likes this.
  22. axiomatic13

    axiomatic13 TS Maniac Posts: 231   +162

    I do not disagree on chipset drivers. But not much changes in the management engine. GPUs have to be tailored regularly for all kinds of reasons. Intel's NIC drivers are a mess as well. Don't even get me started on combining Intel drivers with Linux kernel compilation. So many dependencies just to get CPU power management functional and, at the same time, so limited in the kernels it supports. Peace brother.
     
