Intel Xe Graphics Preview: What we know (and what we don't)

Drivers are critical, and you can see Intel has been slowly improving software support for its integrated GPU line over the past year or so.

I don't expect some sort of industry-leading, triumphant launch for these products. It's such a difficult task to enter a market that has been completely dominated by two companies for nearly 20 years. Just think about one small aspect: the back catalogue of game-specific driver fixes that AMD and Nvidia possess, and the experience gained from working continuously on discrete graphics.

So the hurdle is enormous, and the costs of pulling together the hardware and software are huge. That's why nobody else has tried it; only a handful of semiconductor companies would even be capable.

What I do expect is some solid offerings for the price, with a few problems that should be ironed out quite quickly, leading to an eventual ramp in performance and competitiveness after a couple more hardware generations.

We want three big players in the graphics market. As development slows down and the industry fights harder for smaller performance gains, another player is welcome to keep the momentum going.
 
They have quite the pedigree with those AMD & (one) Nvidia staff hires. I wonder why so many more AMD employees jumped ship than Nvidia employees did?
 
This is probably why Nvidia hasn't released a 7nm chip yet. Once they actually know Intel Xe's performance, they can release a 7nm chip to top it. It should definitely be good for the mid-range market, though, with an Nvidia/AMD/Intel pricing war!
 

Whaaa..?
No, Nvidia's new chip is already being worked on. It takes years to design a chip; Nvidia can't just react to a new card on the market and come out with something because of it. It doesn't work like that.

Nvidia is all done making GPUs for right now and is instead waiting another year before releasing their new Ampere chip. There is nothing left in Turing for gamers except excessive prices and marketing gimmicks...

Nvidia's greed has left the market wide open for AMD and Intel to claim.
 
So, I'm assuming they will take the CPU route of pricing?

Or they could take the CPU benchmarking route. I am sure the fine people at BAPCo are cooking up a "more realistic" GPU benchmark as we speak, only using "software that people typically use", "randomly" picked from the top 100 list.

Ok, now I've run out of quotation marks for this post....
 

You really believe that Nvidia decided to release the latest generation (Turing) on TSMC's 12nm instead of TSMC's 7nm because they are "all done making GPUs for right now"? Come on, man.

Nvidia hasn't moved to TSMC's 7nm because they don't have to. They have no competition at the top (ahem, AMD) and their offerings at 12nm have profit margins that are just fine.

What if AMD's 5700 XT had actually beaten the 2080 Ti? Nvidia could just do a die shrink of Turing to 7nm and easily beat AMD again. But AMD's newest offering, the 5700 XT, at 7nm can't even compete with Nvidia at 12nm, so Nvidia has zero incentive to do this.

Now if Intel's Xe is actually faster than the RTX 2080 Ti, you can bet your booty that the 7nm die shrink of Turing will be out there pronto to one-up Intel. This is business chess 101.
 
TSMC started mass production on their 7FF process in April 2018; their 12FF node, on the other hand, started a year earlier. So by the time Turing was released in September 2018, the 12FF process was nicely matured with good yields: ideal for a new product. Nvidia will be going with Samsung for the next chip, rather than TSMC.

It will be interesting to see what process Intel use for the Xe range - I should imagine that they’ll go with their base 10nm process for the mainstream release, and a later 10+ node for the high end models.
 
I expect the graphics to be on par with Nvidia. Driver performance is critical and is probably the biggest reason I use Nvidia cards. Intel's drivers tend to be top-notch, so I don't expect many issues here, maybe some growing pains at first.

What I would like to see is a graphics card with high-end audio support, DSD specifically for me. Everything converts DSD to PCM to transport it that way via HDMI, but native support would be best.

I'm running a GTX 1060 on an Intel Extreme motherboard that still performs great but is finally getting dated technology-wise. I wish Intel still made motherboards. I may make the jump to an Intel GPU if they can come out with a reasonably priced 4K-capable GPU with ray tracing.
 
Intel's biggest hurdle is going to be perception.

With enough time, talent, and resources anyone can create a competitive GPU. And manufacturing costs are relatively equal across the board. So on paper it should be relatively easy for Intel, with its cornucopian resources, to enter the GPU market.

But it really doesn't matter how good a product is if people don't believe it's good. And Intel has a long history of producing barely adequate graphics solutions to overcome. It's kind of like how AMD painted itself into a corner by being known as the price-performance king, making it really hard to offer a premium product at a premium price point. They didn't even bother going after the 2080 this generation.

The only way Intel has a fighting chance, IMHO, is to use the same strategy Nintendo has when going up against the two major consoles: produce a product that fills an unrecognized niche, like Nintendo did with both the Wii series and the Switch. But is there really any value-added niche in the discrete GPU market?

They could have gone after ray tracing, and if memory serves, at one time they were seriously considering just that, but never went anywhere with it. After ray tracing, what unique must-have feature can they offer to drive sales? I can't think of one. So even if Intel can bring a competitive product to market, it most likely won't be enough.

They can try undercutting the market. But as AMD has shown, that's a double-edged sword and can actually hurt as much as it helps. Plus, Nvidia and AMD have been playing this chess match for decades and know every counter-strategy available. Sure, a price war will benefit us consumers, but will it benefit Intel? Considering it's a public company, only for a short time, and then the certain losses will force it to exit the market.

To actually move the needle, Intel needs not just a competitive product but one that crushes the competition, giving consumers no choice but to buy it. Truly viable real-world 4K and ray tracing at the current market price point would do it, since neither has really matured to the point where we see the benefits the marketing hype says are there.

But is it possible? I really don't know, TBH, but I doubt it. If Intel can show a sub-$1,000 card running a AAA game with ultra graphics, 60+ FPS at 4K, and fully utilized ray tracing, then yeah, they can. But that isn't a low bar to jump over, so I personally see Intel entering the market and then exiting it in a couple of years due to not capturing a significant enough portion of the market.
 
No from all. US $1,000 in NOK would be about 8,360 NOK (at roughly 8.36 NOK per dollar). I would be better off using PCIe 4.0 from AMD. If Intel must compete, they have to go to 256 GB on each GPU, up or down in speed, and even with 1 TB of VRAM on the GPU they need to get PCIe speeds of at least PCIe 3.0-7.0 to beat both AMD and Nvidia.

Intel's benchmarks must beat both in ray tracing and AMD's own architecture. That would be expensive for an Intel GPU: more cores, hotter gaming, and at least 5,000 in-game renders. Getting just over 60 FPS is one thing; the next target would be 240 FPS to beat.

The screen needs to support 240 FPS, and games must support that too.
If Intel wants to take down both in a single SHOT (drunk), they have to drink for the two of them.
We just hope the FPS are just as good at the beginning, to get good mid- or low-FPS gaming, rendering, and so on. What if you got two Intel GPUs and one from Nvidia/AMD to test out first?

https://www.tomshardware.com/news/intel-xe-gpu-specs-features,38246.html
 
It really makes me wonder: how much does the effort that went into creating the market-lukewarm Kaby Lake-G have to do with Intel's chances of successfully launching this product?

And that's before we even start talking about Raja at all.
 
The article says: "Intel's cores are nothing like Nvidia's CUDA, but they bear similarity to AMD's stream processors."

The first thing that came to my mind was "hell yeah," because Intel grabbed many AMD Radeon employees.
 
The cores in Intel's GPUs have been similar to the Stream Processors that AMD has used since 2006, at least; so it's unlikely to be due to AMD staff moving to Intel.
 
I'm speculating on whether they will have Linux support. I've only run Windows for gaming in 25+ years, currently Win7, but due to M$ trying to force Win10 on us, I will most likely build my new rig on Linux. :/ That is where future game makers will most likely see my money.
 
One thing I haven't seen any details on is the 4K Blu-ray codec. As far as I knew, a 4K Blu-ray drive won't play 4K if run via a discrete GPU? The only way is if it uses Intel's integrated GPU, as I believe Intel owns the codec?
 
Nobody really expects Intel to dethrone Nvidia in the first round of the match. But with tons of $, they can eventually outpace Team Green. However... Intel will certainly be a no-go option at the beginning if they persist with the ridiculous pricing known from all their other market segments, and all of that with inferior silicon...

I would love for Intel to do well. We need a third player in this market.

And yes, of course nobody will really care about new Intel cards if they can't do well in games and 3D rendering engines. The biggest strength of, for example, Nvidia is that its consumer cards are a much better choice for, say, Iray than any Titan or Quadro.
 
Ultra HD Blu-ray is encoded using HEVC, which isn't owned by Intel. However, 4K Blu-ray playback on PC also requires Intel's Software Guard Extensions (SGX) and driver support for the Advanced Access Content System 2.0. The latter, I think, is the real problem, as it's developed by a consortium that neither AMD nor Nvidia is part of, but Intel is.

So if you have a specific Intel CPU (Skylake or newer, I believe) and a discrete Intel Xe graphics card, you should be fine. Anything else...forget it :confused:
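If you want to check the SGX half of that requirement yourself, here's a minimal sketch in C (assuming GCC or Clang on an x86 CPU; bit 2 of EBX in CPUID leaf 7 is the SGX feature flag - and note this only tells you the CPU supports SGX, not that the AACS 2.0 side is satisfied):

    #include <cpuid.h>
    #include <stdio.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        /* CPUID leaf 7, sub-leaf 0: structured extended feature flags.
           EBX bit 2 reports SGX support - one of several requirements
           for Ultra HD Blu-ray playback on PC. */
        if (__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx) && (ebx & (1u << 2)))
            printf("CPU reports SGX support\n");
        else
            printf("No SGX support reported\n");
        return 0;
    }

Compile with something like gcc sgx_check.c -o sgx_check. Even then, SGX has to be enabled in the BIOS, so treat this as a first-pass check only.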
 
The 512 Xe could be as slow as a Vega 64; we know nothing for now.
The scaling of components usually leads to smaller chips being more efficient, e.g. in games.
Will it only be shown for the first time in 2020, or will it be sold before Christmas?
Some newer slides could be read as Xe being available from 2021; I'm confused by that.
It looks like there will be no Ice Lake desktop CPUs at all, which could mean that the 10nm process isn't capable of high wattage or high voltage; the latter would still allow big GPUs, but the former could mean 10nm is completely useless for big GPUs.
Tiger Lake seems to be coming to the desktop and should use the newer 10nm+ or ++ process, at a time when Intel may already offer 7nm; again, very confusing.
 