Intel's Xe GPUs will feature hardware-level ray tracing support

midian182

Why it matters: While not everyone thinks real-time ray tracing is the best thing to happen to PC graphics in years, it’s definitely having an influence. Intel, for example, has revealed that its upcoming Xe graphics architecture will support hardware-based ray tracing acceleration.

Intel announced the news at this week's FMX graphics trade show in Germany. Xe is primarily designed for data centers, but according to Tom's Hardware, there will be a second architecture for discrete graphics cards aimed at the consumer market. The cards are built on the 10nm process and should arrive next year.

The area to benefit most from Intel's data center Xe cards will likely be the entertainment industry. Rendering animated movies such as How To Train Your Dragon: The Hidden World is usually carried out on the CPU because it is more precise, but Intel’s GPUs could offer similar levels of accuracy while being much faster.

“Studios continue to reach for maximum realism with complex physics processing for cloth, fluids, hair and more, plus modeling the physics of light with ray tracing,” wrote Jim Jeffers, a senior principal engineer and senior director of Intel’s Advanced Rendering and Visualization team. “These algorithms benefit from mixed parallel and scalar computing while requiring ever-growing memory footprints. The best solutions will include a holistic platform design where computational tasks are distributed to the most appropriate processing resources.”
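The CPU rendering work Jeffers describes is generally associated with Intel's open-source Embree ray tracing kernels (Embree isn't named in this article, so treat the connection as context rather than confirmation). As a flavor of what that CPU-side path looks like, here is a minimal sketch against the Embree 3 C++ API; the one-triangle scene and the ray values are invented for the example:

```cpp
// A one-triangle, one-ray CPU ray trace with Intel's open-source Embree 3
// kernels. Sketch only: the geometry and ray values are invented.
#include <embree3/rtcore.h>
#include <cstdio>

int main() {
    RTCDevice device = rtcNewDevice(nullptr);
    RTCScene scene = rtcNewScene(device);

    // Build one triangle: three vertices and one index triplet.
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* v = (float*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_VERTEX, 0, RTC_FORMAT_FLOAT3,
        3 * sizeof(float), 3);
    v[0] = -1.f; v[1] = 0.f; v[2] = 0.f;  // v0
    v[3] =  1.f; v[4] = 0.f; v[5] = 0.f;  // v1
    v[6] =  0.f; v[7] = 1.f; v[8] = 0.f;  // v2
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3,
        3 * sizeof(unsigned), 1);
    idx[0] = 0; idx[1] = 1; idx[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene);  // builds the BVH acceleration structure on the CPU

    // One ray from (0, 0.5, -1) fired along +z at the triangle.
    RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    RTCRayHit rh = {};
    rh.ray.org_y = 0.5f; rh.ray.org_z = -1.f;
    rh.ray.dir_z = 1.f;
    rh.ray.tnear = 0.f;  rh.ray.tfar = 1e30f;
    rh.ray.mask  = 0xFFFFFFFFu;
    rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;
    rtcIntersect1(scene, &ctx, &rh);

    if (rh.hit.geomID != RTC_INVALID_GEOMETRY_ID)
        printf("hit at t = %f\n", rh.ray.tfar);  // tfar holds the hit distance
    else
        printf("miss\n");

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
    return 0;
}
```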

With hardware-based ray tracing appearing in the data center-focused Xe architecture, it seems almost certain that the feature will trickle down to consumer-level products. Following Nvidia's strategy, Intel might offer ray tracing only in the more expensive models.

Last month saw Nvidia add ray tracing support to some of its non-RTX cards via a driver update, but using it brings a huge performance hit. With Intel Xe featuring hardware-based ray tracing, it should offer superior DXR performance compared to the software solution.
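For context on that distinction: DXR doesn't actually tell applications whether ray tracing is hardware-accelerated or running on shader cores; the driver simply reports a support tier, and the GTX driver update above exposes the same tier via a compute fallback. A minimal sketch (assuming an already-created D3D12 device) of the check a game performs:

```cpp
// Querying the DXR support tier on a Direct3D 12 device. The API reports a
// tier, not whether ray tracing runs on dedicated hardware or on shader
// cores -- that choice lives in the driver. Sketch only; assumes the caller
// already created the device.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;  // OS / SDK too old to know about DXR at all
    // TIER_NOT_SUPPORTED: no DXR. TIER_1_0 and up: DXR is exposed, whether
    // hardware-accelerated (RTX) or the shader-core fallback on GTX cards.
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```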

When announcing its 7nm Navi graphics architecture this week, AMD’s Lisa Su declined to comment when asked if it would support ray tracing.


 
My guess is that Intel's renewed interest in making GPU shows that they know the market is hot for low-end gaming computers and they want to be able to claim more market share with strong CPU-GPU ties. I don't see Intel competing with Nvidia anytime soon. They are the undisputed leader for now and AMD shows no signs of catching up.
 

My guess is they're after the growing data center market that both AMD and Nvidia are doing quite well in.

It's where the money will be in 10+ years when everyone moves to streaming games as well.
 
Can we bookmark this for 10 years' time and see how little game streaming has come along?

I'm sorry, but it just isn't happening until way more people can get home Internet at least as good as dedicated leased business lines, and that just isn't happening within 10 years.

Or should I say, it isn't happening in the United Kingdom in the next 10 years, or even 20 to be honest. Virgin Media is the only ISP properly putting fibre in the ground, and only in the areas they deem worth it, while Openreach is still installing copper from the '60s in new builds...
 
Any modern graphics card can do Ray Tracing. Latest buzzword I guess to get the crowd frothing at the mouth.

 
It's where the money will be in 10+ years when everyone moves to streaming games as well.

lol
 
Still gaming with ray tracing off on a GTX GPU, and now you apparently must have an RTX GPU to play games. If you get good performance from a GTX 1080 Ti, you can still play like a pro. Emulating ray tracing on GTX cards obviously wouldn't fit the new world, so get on with life and game on GTX until second-gen RTX arrives. You can't fairly benchmark a three-year-old GTX against new technology.
The GTX 1080 Ti is still the top end of GTX, not RTX; the top cards right now are the RTX Titan in CAD benchmarks and the RTX 2080 Ti in game benchmarks.

Just like Linus said, both cards can run games, but the RTX Titan is the one tested in CAD and Adobe workloads.
So if I'm gaming on an RTX 2070, which has roughly the power of a GTX 1080 (Ti), and just want to game on it, wouldn't that be the best buy in that year's gaming benchmarks? RTX can't outclass the older GTX cards on raw speed; it just adds ray tracing on board.
If I could send in my GTX and get ray tracing added to it, I'd be happy.
"It will never happen."
If I play with ray tracing off on a GTX for the fastest fps, and don't give a damn (Duke Nukem talk) about turning it on with an RTX 20xx card, which would be faster in-game? A GTX can run as fast as, or faster than, a top RTX 20xx Ti card, so be fair with the benchmarks.
Older GPUs obviously can't handle ray tracing, so get over it.
Before RTX, playing on the top cards meant dual monitors at super UHD 4K with everything maxed on the Ti cards.
Then RTX cards hit the market, and suddenly the GTX 1080 Ti was old?
(It's just a gimmick to push the newer version.) The new RTX cards run about 2x better, just like the GTX 980 Ti compared to the GTX 1080 Ti.

And older cards can never run as well as the top Ti GPUs.
If Intel beats both GTX and RTX in ray-traced fps, I'd buy a motherboard with PCIe 4.0/5.0, skip over the RTX series, run Intel instead, and hope for better fps.
"If Intel beats both..."

Update / remake: what if MS-DOS or Windows 3.11, 95, 98 SE, 2000, ME, or XP could run ray tracing with it enabled? Don't forget the older OSes; they could have done it well with ray tracing turned on. But no, the older OSes don't go higher than DX9.0c. And if you're on Vista or Windows 7, 8, or 8.1, you should be auto-patched to run RTX games, but no, it's Windows 10 only.
What if Mac or Ubuntu could run this too? Vista only had DX10/10.1, so Windows 7 was the first to support DX11 gaming.
Microsoft could still give Vista and Windows 7-8.1 DX12 support. (Will never happen.)
They gave up on some good OSes just to push new GPUs with faster fps, never thinking that DX11 was good enough, like playing Stalker in DX10.1/11. Tessellation is good enough on older GPUs, but hardware ray tracing has to be reserved for the top-end RTX GPUs, not the low-end ones, with a WARNING about low fps.
A sticker label.

If you buy a GTX 10xx Ti series card, you don't get the next version's updates; you get low fps and only simulated, not real-time, rendering like the RTX GPUs (shades of the GeForce 3/4 MX).

Don't waste money on low-end GTX 10xx Ti cards. You can buy one and use it for 2013-2025 games, but with low fps once ray tracing is turned on.
There SHOULD be a WARNING on new GTX Ti cards: "we can never run like the RTX 20xx Ti series."

So only games that support DX10.1-DX12 support ray-tracing GPUs.
MS should re-release the older OSes inside Windows 10 to bring (HDD) 12-bit and 16-bit support back.
Like the GeForce 3 Ti, the first GPU that could do real-time rendering, AMD, Nvidia, and Intel must learn that ray-tracing GPUs need a few generations behind them to run well.
3dfx is still emulated in nGlide, and PowerVR games can be rendered on the newest GPUs these days, right?
Just like 3dfx was the new thing before Nvidia (bought it), and AMD and Matrox had their day, we must still wait for full real-time ray-traced gaming. Next gen must be about 3x-8x faster to get decent fps in Windows 10.
MS must support AMD, Nvidia, and Intel alike in 3DMark, games, and CAD (repeating it).

It will take many years to get good fps with ray tracing turned on. OK, let's say 10-25 years just to test it out. We would be OLD before next gen arrives.

upd 2, upd 3
 
Nvidia is out of the Tesla car business, and RTX card sales have been weak; people much prefer GTX for (now) better price/value/performance, thanks to the ridiculous RTX prices.
AMD is somewhat in Nvidia's shadow, and Intel is trying to catch up with Nvidia on data center GPUs, because it already dominates the CPU side of them.

Also, just to point out one similarity between RTX cards and PhysX: both had a rough start, and the cards are/were so expensive that the majority of people simply skipped that generation, and the next one too, before it got "cheaper" (price/value/performance normalized).
 
It's also rubbish with cherry-picked stats.

I'm no fan of the price either, but the 2080 Ti is categorically faster than the Radeon VII.

Yes, all cherry-picked.
But the fact that I'm able to do that is a worry.
When AMD eventually quits selling gaming cards, it will be the end of PC gaming innovation and low prices.
It's already happening, and people are feeling the hurt of snubbing the competition at their own expense.
 
Intel will prob price gouge the sh!t out of their offerings like they do w/ their CPU's and eventually run themselves out of the market.
 
As opposed to offering a very limited supply of the top-end CPU that cannot meet demand, leading to price gouging.

Yeah... go look at the 3900X being more expensive than the 9900K now.

Also, to you folks who think ray tracing is just a "gimmick": that tech has been a wet dream for game developers since the early 2000s.

It won't be widely adopted and deployed until consoles can do it, because PC games are always restricted in how far they can go by what's marketable and portable cross-platform.

To sit and read these absolutely uninformed opinions about ray tracing when you fundamentally do not seem to understand how much it eases up scene design compared with rasterizing... just lol. Instead of hacking in shadow effects, reflections, or refraction, you now get all of that for _free_, no additional power required, and for cinematic effects you can spend the tech budget on shaders in different ways rather than wasting GPU cycles on producing lighting fakes.

Like, you really have no idea how silly your complaints are, especially now that Intel is going to drop a technology for it that will be backward compatible with all DX11 GPUs.
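For what it's worth, the "free shadows" point above can be shown in a few lines: in a ray tracer, a shadow is a single occlusion query rather than a separate shadow-map pass. The sketch below is self-contained and entirely illustrative; the one-sphere "scene" and every name in it are invented, and intersect() stands in for the BVH traversal that ray tracing hardware accelerates.

```cpp
// A shadow as one occlusion query: the whole "shadows for free" point.
// Entirely illustrative; the one-sphere "scene" and all names are invented,
// and intersect() stands in for the BVH traversal that RT hardware speeds up.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float len(Vec3 v)         { return std::sqrt(dot(v, v)); }
static Vec3  norm(Vec3 v)        { float l = len(v); return {v.x/l, v.y/l, v.z/l}; }

// The entire "scene": one sphere hanging between the light and the ground.
struct Sphere { Vec3 c; float r; };
static const Sphere kScene{{0.f, 1.f, 0.f}, 0.5f};

// Does a ray from o along unit direction d hit anything before maxDist?
static bool intersect(Vec3 o, Vec3 d, float maxDist) {
    Vec3 oc = sub(o, kScene.c);
    float b = dot(oc, d);
    float c = dot(oc, oc) - kScene.r * kScene.r;
    float disc = b * b - c;
    if (disc < 0.f) return false;
    float t = -b - std::sqrt(disc);
    return t > 1e-4f && t < maxDist;  // epsilon avoids self-intersection
}

// The shadow test itself: one ray toward the light. No shadow maps, no
// resolution limits, no bias tuning, no cascades.
static bool inShadow(Vec3 p, Vec3 lightPos) {
    Vec3 toLight = sub(lightPos, p);
    return intersect(p, norm(toLight), len(toLight));
}

int main() {
    Vec3 light{0.f, 3.f, 0.f};
    printf("under the sphere: %s\n", inShadow({0.f, 0.f, 0.f}, light) ? "shadow" : "lit");
    printf("off to the side:  %s\n", inShadow({2.f, 0.f, 0.f}, light) ? "shadow" : "lit");
    return 0;
}
```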
 
Time to dust off those Larrabee schematics at Intel HQ....
Although we don't know a huge amount about the Xe architecture, we know that the first versions will be based on the current Gen 11, and there's not much Larrabee about that design. The abandoned processor was essentially a collection of basic x86 cores, each with a wide SIMD vector unit, all connected to a big block of cache through a ring bus. It had no hardware specific to graphics processing, other than units for texture addressing, sampling, and filtering. Gen 11, on the other hand, has no x86 cores, lots of dedicated hardware for various graphics functions, multiple cache levels, and numerous SIMT vector units. The only thing that's really common to both is the ring bus, but that is very much an Intel 'thing'.
 