Intel Xe Graphics Preview v2.0: What we know about Intel's upcoming GPU

This is going to go down in flames bright enough to light up the night sky.

I will bathe in the majesty of its glow.

Let's hope you are wrong, because as consumers we are in dire need of more strong players in the discrete GPU architecture market (more players in the x86-64 CPU and mechanical HDD markets would also be welcome but I'm not going there...).

Duopolies have never been good for consumers. Of course it would be much better if it were some other company rather than Intel (for example, a comeback of Matrox or S3), but beggars can't be choosers...
 
Realistically, they need to do two things: make good enough low- and mid-range GPUs, and convince OEMs to use their product rather than the competition's.

I think it has already been established that they are good at the second part.

Do they need a 2080 / 3080 beater? No, not really. Low- and mid-range OEM sales are where the volume is and where they could hurt nVidia the most. Dedicated server / data center GPUs would be a second attack vector.

As depriving their competitor of income is another tried and true (and successful) Intel strategy, my bet is that this is what they will go for first.

I'd say that yes, they do need a 2080 / 3080 beater, even if the volume and profit is in low and midrange. It's all about marketing and brand image.
 
I'd say that yes, they do need a 2080 / 3080 beater, even if the volume and profit is in low and midrange. It's all about marketing and brand image.
Intel already has a great brand image with the general public - after all, they spent years investing in the "PC = Intel" image. Is the average consumer as aware of nVidia? And would they care if their laptop / desktop came with an Xe card instead of a 1050 or 1650?

Long term, not offering a high-end gaming card is, imho, the more efficient option. Sure, the next-gen nVidia cards will most likely be great. But if you manage to cut them off from revenue, how will they fund R&D for the gen after that? The funding may stretch to one more gen, but after that... it would be that much easier to beat a cash-starved nVidia.
After all, this strategy worked fine against AMD back in the day.

-> If you want to destroy your enemy, destroy what keeps them alive.



That's almost like saying "Intel knows what they're doing". :)

In terms of selling their products by (almost) any means necessary... absolutely. I think Sun Tzu and Clausewitz are probably required reading at Intel HQ.

While their size and cash certainly help, they are great at exploiting others' short term thinking.

Having a strong alternative to Xeon would be in everyone's long-term interest in the server market, as you can be sure prices will go right back up to where they were before Epyc once that threat is taken care of. Same as back when it was up against Sun SPARC or Opteron.

That of course means that you would need to buy Epyc servers. On the other hand, getting a nice rebate on Intel gear - which is the risk free alternative since you are not doing anything differently - is more likely to result in a nice fat bonus check for whoever is responsible for the purchase. So in many cases Xeon it is.

Same for OEMs - having Intel and AMD compete for their orders on merit would be great for OEMs, but (just like back in the Athlon days) supporting this competition is a big risk. Could one OEM survive against the others after being hit hard by Intel's CPU shortages and having to pay list price (no more "meet the comp" funds)? Big risk to take.

This is a different kettle of fish; I don't know anything about the relationships between NV and OEMs.

Neither do I, but having a premium product that buyers want and others cannot offer is always great leverage.

Different industry, but I remember talking to someone who was responsible for aircraft purchases at a large airline. He said that they would much rather buy the smaller Airbus exclusively (vs. the 737), but if you don't buy Boeing's smaller planes, you end up way back on the waiting list for the 747, which they needed at the time and which had no alternative.

nVidia are tough and have great business smarts, so they may be tempted to try something similar with their higher-end cards (i.e. "not buying 1050 / 1650 GPUs for your standard desktops? Have fun waiting for the 3080 to arrive, since it's in limited supply").

But if there were an alternative to their high end card from AMD, they could not play that card.
 
I have a very good feeling about Intel's Xe GPU efforts due to the leaked benchmarks. The Rocket Lake Xe iGPU is indeed impressive. It seems to be using 1/4 of the power of a similarly performing Nvidia GPU. The power budget is the main reason why gaming laptops underperform. If Xe is as power efficient as the leaks are showing, it will revolutionize mobile gaming. Laptops won't need to be thick and heavy for gaming. Looking forward to it...
Like Apple's intention to make gaming Macs? Might be just a coincidence?
 
I think this is great progress for Intel. Looks like the competition is going to heat up.
 
Praying they can make a better or at least competitive product. We desperately need more competition in this two-pony race. If they actually surpass AMD and nVidia, this would be a GREAT thing... here come the plunging prices!
 
The current crop of integrated Intel GPUs is hobbled by a lack of TMUs, ROPs, and memory bandwidth. But the compute capability is pretty good, for what's there. For example, a quick run of the OpenCL compute tests in Geekbench 5 for a UHD 630 vs a Titan X (Pascal) gives the following:

[Attached image: computecomparison001.jpg - Geekbench 5 OpenCL compute scores, UHD 630 vs Titan X (Pascal)]

Now the Titan X has 3584 shaders vs the UHD 630's 24 - they're not the same (as explained in the article), as each shader unit in Intel's GPU can process up to 8 ALU operations per cycle, whereas each shader in the Titan just does the one. So if we say the 630 actually has 24 x 8 = 192 units, the Titan has almost 19 times more. And yet only 2 of those tests display that scale of difference.

There's obviously more to it than that (cache amounts, cache hierarchy, etc. play a big part), but for the size it is (roughly 60 square mm against the Titan's 470, with the shader units taking up about 1/3 of the area), it holds up pretty well. Of course, a 630 die nearly 20 times bigger would be a hulking monstrosity, so Intel will need to be making some serious architectural changes to pack it down to size (or not...).
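
For anyone who wants to play with the numbers, here's a minimal back-of-the-envelope sketch using only the figures quoted in this post (treat them as rough approximations, not official specs):

```python
# Rough comparison of the shader resources and die sizes mentioned above.
# All values are the approximate figures from this post, not official specs.

uhd630_eus = 24                  # execution units in the UHD 630
alu_ops_per_eu = 8               # each EU can issue up to 8 ALU operations per cycle
uhd630_effective = uhd630_eus * alu_ops_per_eu   # 24 * 8 = 192 "equivalent" units

titan_x_shaders = 3584           # CUDA cores in the Titan X (Pascal), one op per cycle each

shader_ratio = titan_x_shaders / uhd630_effective
area_ratio = 470 / 60            # approximate die areas in mm^2

print(f"UHD 630 effective units:    {uhd630_effective}")
print(f"Titan X shader advantage:   ~{shader_ratio:.1f}x")   # roughly 19x
print(f"Titan X die-area advantage: ~{area_ratio:.1f}x")     # roughly 8x
```

Nothing rigorous - just a quick way to see why it's notable that most of those compute results aren't ~19x apart despite the raw unit counts.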
 
It's interesting how the industry has painted all of this in bright colors.

Intel is in a good position anyway, despite the results. Some reasons would be:
They fortify their iGPU install base.
In laptops these GPUs are capable - not powerful enough to run Division 2, Fortnite or Tomb Raider ideally, but powerful enough to run the likes of Alien Isolation, CS:GO and LoL, which the masses dearly embrace.
They can take away Nvidia's chance of a new Nintendo deal (CPU+GPU), though the chances are slim and other competitors might arise.
These iGPUs are powerful for what they consume.

So if the next Intel GPU can run Division 2 and Tomb Raider at decent frame rates - and especially the beloved Fortnite, which pumped up Nvidia's discrete sales - then they are in for the ride.

Intel might try their luck in tablets again.

I think AMD mentioned that it costs somewhere in the range of 300 million dollars to make a processor (dunno if that's from scratch though). Intel can try many times, but they are already well established in the kind of low-spec games I've mentioned.
AMD should have released an APU with less CPU power but much higher GPU power.

Intel's position will also bring more light to AMD. Since Intel will steer mass adoption of iGPUs in portable devices, and AMD has some juice in those areas as well, AMD would be an alternative in the segment.

Let's just hope Intel did not jump on this boat just because releases from both Nvidia and AMD have been slower than usual, and Intel feels comfortable with the turtle's pace.
 
Intel already has a good thing going, since they make up the majority of the market's integrated graphics, which most people use without a discrete GPU card.

All they really need to do is target the low end, mid range and high end with cards that match or outperform the 1660 Ti, 2060 Super, 2080 Super and 2080 Ti.

Ultimately they have to beat the competition at a slightly lower price. The rest will take care of itself.
Waaaaaaaa? "Intel already has a good thing going ~ integrated graphics"? I've heard some pish recently, but that tops it. They make and sell the CPUs the integrated graphics come as part of - the point is to sell more CPUs for more money. They aren't making money from selling integrated graphics; they're making money from CPUs. You get that, right? Well, I guess not. Logic should tell you: how does having integrated GPUs count as "a good thing" in this context? We're talking discrete graphics here, as in EXTRA - as in the lucrative, most expensive part of many folks' machines. As in Intel's next target, because it's currently a two-brand war, so it makes massive sense to get involved. But according to you, they're already involved. No. The new Ryzen APUs destroy Intel's integrated graphics, by the way, so Intel is losing there - perhaps not in market share, yet. But obviously Intel needs new money-making avenues, hence the push for DISCRETE graphics cards, not integrated, which play little role in this thread, topic or conversation.
 
Software is one of my main concerns. AMD and Nvidia have large driver teams that can rely on decades of previous experience, with baked-in fixes and tweaks for thousands of legacy games included in every release. They also enjoy wide community support for obscure game fixes, especially Nvidia.

Intel have to assemble a huge software team and roll out rapid driver improvements to build a good reputation with any potential customers for these graphics cards. If the hardware is good but the drivers are immature and broken, all the effort to move into this market will be for nought.

Yer cos?? Again, madness. Intel: established 1968, revenue $71 billion, 100,000 employees. Nvidia: 1993, $12 billion, 13,000. AMD: 1969, $8 billion, 11,000. So Intel - driver king for millions of servers and the majority of machines (PCs/laptops) around the world, with full Windows OS integration (hence it working out of the box with Win 8 and 10; even AMD mobos have Intel parts on them a lot of the time, by the way) - and you want to compare Intel's drivers to AMD's, with their shoddy history? How do you think many PCIe slots work? Driverless? Or with an Intel driver communicating with an Intel CPU? Yer, I don't think Intel will have an issue with drivers. I mean, making their own drivers work with their own graphics card drivers? Really?
 
Yer cos?? Again, madness. Intel: established 1968, revenue $71 billion, 100,000 employees. Nvidia: 1993, $12 billion, 13,000. AMD: 1969, $8 billion, 11,000. So Intel - driver king for millions of servers and the majority of machines (PCs/laptops) around the world, with full Windows OS integration (hence it working out of the box with Win 8 and 10; even AMD mobos have Intel parts on them a lot of the time, by the way) - and you want to compare Intel's drivers to AMD's, with their shoddy history? How do you think many PCIe slots work? Driverless? Or with an Intel driver communicating with an Intel CPU? Yer, I don't think Intel will have an issue with drivers. I mean, making their own drivers work with their own graphics card drivers? Really?

It's one thing creating chipset drivers and another creating graphics drivers for games. Especially ones that work well a long way into games of the past, not just the present.

That's why your chipset drivers are a few dozen megabytes, with maybe half a dozen revisions over 5 years of their support. Yet modern graphics driver packages are getting on for half a gigabyte and get updated every few weeks.

Nvidia employs a disproportionately large number of software engineers due to the need to rapidly update and maintain a complex, always-evolving software environment.

As the article points out, Intel's graphics drivers aren't updated nearly as quickly as those of the other two main vendors. They are not used to dealing with the demands of gamers, who expect immediate support and rapid fixes for new games. You have to work with a vast array of game developers as well as hardware vendors, and develop that close network.

That's why it's an issue that Intel need to get on top of.
 
It's one thing creating chipset drivers and another creating graphics drivers for games. Especially ones that work well a long way into games of the past, not just the present.

That's why your chipset drivers are a few dozen megabytes, with maybe half a dozen revisions over 5 years of their support. Yet modern graphics driver packages are getting on for half a gigabyte and get updated every few weeks.

Nvidia employs a disproportionately large number of software engineers due to the need to rapidly update and maintain a complex, always-evolving software environment.

As the article points out, Intel's graphics drivers aren't updated nearly as quickly as those of the other two main vendors. They are not used to dealing with the demands of gamers, who expect immediate support and rapid fixes for new games. You have to work with a vast array of game developers as well as hardware vendors, and develop that close network.

That's why it's an issue that Intel need to get on top of.

Hardware companies always produce software. I imagine Intel's 112k employees have been on top of this for years.

PRODUCTION of advanced hardware is Intel's focal point, along with the million other technologies they're involved with.
 
Hardware companies always produce software. I imagine Intel's 112k employees have been on top of this for years.

PRODUCTION of advanced hardware is Intel's focal point, along with the million other technologies they're involved with.

All I know is that this is a step up from knocking out some chipset drivers once a year. Software support for discrete consumer graphics cards is critical.

You are saying the same thing - Intel is huge, has all this revenue and all these employees, and is well on top of it.

Sure, it's easy to think that. Until something like Spectre turns up and exposes the complacency that appears to have run as a culture throughout Intel in recent years.

On top of all that? Let's hope so. I want these cards to be really good. :D The industry would get a big boost.

Intel is probably the only one that could pull off this kind of entry into the discrete market. Marrying hardware with the required software, they are one of the tiny number of companies with the resources.
 
All I know is that this is a step up from knocking out some chipset drivers once a year. Software support for discrete consumer graphics cards is critical.

What's critical today that wasn't critical before? Intel has been updating drivers for - mmm, years. I got new Intel drivers packed in a Windows preview release update YESTERDAY. Not graphics, some other hardware - I didn't read it all. Someone is busy doing something over at Shacka Intel.

You are saying the same thing - Intel is huge, has all this revenue and all these employees, and is well on top of it.

There was a bit on Intel: "not used to dealing with the demands of gamers, who expect immediate support and rapid fixes for new games. You have to work with a vast array of game developers as well as hardware vendors, and develop that close network."

I will posit (philosophically) that Intel deals with a more strident group than gamers could ever be: OEMs. And I mean PC gamers, not console/web-browser gamers.

Sure, it's easy to think that. Until something like Spectre turns up and exposes the complacency that appears to have run as a culture throughout Intel in recent years.

The specter of Spectre on Intel's face. Google had something to say about that a year ago: https://blog.malwarebytes.com/cybercrime/2019/03/spectre-google-universal-read-gadget/

As have AMD. "AMD says its hardware has “near zero” risk to one Spectre variant because of the way its chip architecture is designed, but AMD CPUs can still fall prey to another Spectre flaw." https://www.cnet.com/news/amd-spectre-affects-processors-chips-intel-arm/

Blaming Intel for complacency over an architectural fault discovered years later is odd. ...No Spectre variant has been used in a known attack, so either the hardware and software mitigations must be working, or they're not. Who knows?

Anyway, since Skylake some/all of these threats are being precluded by the architecture. I don't expect Meltdown's or Spectre's complicated techniques to account for much beyond academia.

On top of all that? Let's hope so.

They've got a running add-in card going around; we'll see how it runs!
 
Koduri is being sued by the rapper "thin" KXE; he's got beef with the West Coast gang.
OR
That's the barsteward who knocked my wing mirror off!

It will be a crowded market: Ampere (70% better), RDNA 2.0 (refined, with ray tracing), Xe (similar, but much lower power).
 
Realize that this is Raja Koduri's project. It will probably, absolutely, positively, and without any other possibility, require HBM6 in order to do anything. And HBM6 will cost only 12x what HBM5 costs!
 