Intel Arc Alchemist GPU shows off AI-based XeSS magic in new Riftbreaker video

Jimmy2x

Highly anticipated: GPU shortages have been a roadblock for gamers, builders, and other users looking for an upgrade since late 2020. Earlier this year, Intel announced that its in-house graphics lineup, Arc, would launch by the first quarter of 2022 to compete directly with Nvidia and AMD. Based on recent information and videos from Intel, it looks like Team Blue is serious about capturing a piece of the GPU market.

Two of the leading technologies in this generation's GPU wars have been Nvidia's Deep Learning Super Sampling (DLSS) and AMD's FidelityFX Super Resolution (FSR). Both promise an enhanced visual experience by rendering frames at a lower resolution and upscaling them, one through an AI-based algorithm running on specialized hardware, the other through an open-source spatial algorithm. Never one to be outdone by the competition, Intel has thrown its hat in the ring with its own super sampling feature, known as Intel Xe Super Sampling (XeSS).
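For readers who want a concrete mental model, the idea shared by all three techniques is simple: render each frame at a lower internal resolution, then upscale it to the display resolution. The minimal Python sketch below illustrates only that framing, with Pillow's Lanczos filter standing in for the real upscaling pass (FSR's hand-tuned spatial shader, or the trained networks behind DLSS and XeSS); the resolutions and the synthetic frame are made up for illustration.

```python
# Toy illustration of render-low, upscale-high. Not any vendor's actual
# algorithm; Pillow's Lanczos resample stands in for the upscaling pass.
from PIL import Image

RENDER_RES = (1920, 1080)   # internal render resolution (1080p)
DISPLAY_RES = (3840, 2160)  # target output resolution (4K)

# Stand-in for a rendered frame; a real pipeline would hand over the GPU's
# low-resolution color buffer (plus motion vectors, for the temporal/AI
# approaches).
low_res_frame = Image.new("RGB", RENDER_RES, color=(40, 90, 160))

# The single upscale step, conceptually where FSR/XeSS/DLSS slot in.
upscaled_frame = low_res_frame.resize(DISPLAY_RES, Image.LANCZOS)

print(upscaled_frame.size)  # (3840, 2160)
```

The hard part, and where the three vendors differ, is everything hidden inside that one resize call.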

Intel recently released a video of XeSS in action on an Intel Arc GPU. Unlike previous demonstrations, this latest video from Team Blue provides one of the first real looks at 1080p content upscaled to 4K and recorded at 4K. The video leaves little to debate, with the upscaling solution delivering noticeably higher-quality 4K images.

Intel's new super sampling technology brings a distinct advantage to the table. While Nvidia relies on specialized Tensor Cores for a hardware-based upscaling solution, XeSS has been designed using open standards, similar to AMD's FSR, to run across all current GPU platforms. This means users from teams Red, Green, and Blue will be able to use the new upscaling solution in any supported title.

It is currently unknown what level of compatibility Intel's solution will offer with older-generation hardware. AMD's solution was well received for its ability to breathe new life into aging hardware, offering support for architectures as far back as AMD's Radeon RX 400 series and Nvidia's GeForce 10 series.

There is currently no shortage of demonstration videos showcasing DLSS and FSR. Unfortunately, the compression losses incurred by most major video platforms make a true side-by-side comparison with Intel's XeSS difficult. We likely won't see that head-to-head until Intel's Arc GPUs actually launch, but based on the material Intel has released so far, it looks like the company is making big moves to enter the GPU arms race.


 
Rift Breaker?

Is that like the new Ashes of the Singularity? Why do we get so many niche RTS games that most people stop caring about within a few years of release, except as a popular benchmark?
 



FSR right now is unusable at anything less than 4K. I only say that because I don't have a 4K monitor to test it myself, but that's what the community seems to be saying. It's appalling at 1080p; the visual quality takes such a big hit that it's not worth the extra fps. I own an RX 480 and run at 1080p, so I'm probably the ideal user for the tech, and I don't use it.

Now, DLSS was poor for about a year after it was first introduced, so maybe FSR will improve too. But if this XeSS thing comes along and is better from day one, then we need to stop making excuses on AMD's behalf for FSR.

Intel, use a real game to show off new technology, not that crap.

Pfft, The Riftbreaker is more of a real PC game than things like Fortnite, Warzone, or any of that battle royale crap.
 
I'd be happy if these companies would just stop using random combos of letters and pseudo-words to name their products.
 
Graphics are crap.

It's not a AAA game with a huge budget.
But it looks good enough for a strategy game. And when there's a ton of enemies attacking, with tons of explosions, it manages to impress.
Because of this, with so many units in movement, transparencies, destruction, etc., it manages to be an excellent showcase for XeSS.
Consider that one of the problems with DLSS 2.0 in many games is ghosting.
XeSS seems not to have that problem. This is very good for Intel's first outing with an AI upscaler.
 
Graphics are crap.
I thought they were quite good. The ray tracing is awesome: missiles fly past and light up the sky, shadows are thrown across the map, and the lighting engine is outstanding in general. It's also quite impressive how they have tons of enemies on screen at the same time and it doesn't seem to slow down at all.

It's really tower defence, but it's quite a unique game. I only played it because it was on Game Pass though; not something I would buy.
 
It's not a AAA game with a huge budget.
But it looks good enough for a strategy game. And when there's a ton of enemies attacking, with tons of explosions, it manages to impress.
Because of this, with so many units in movement, transparencies, destruction, etc., it manages to be an excellent showcase for XeSS.
Consider that one of the problems with DLSS 2.0 in many games is ghosting.
XeSS seems not to have that problem. This is very good for Intel's first outing with an AI upscaler.
Good for the devs of that game. But did Intel need to use that to show off the technology? That is for children. Use the Total War franchise instead, or something similar, if you want to show new technology.
 
Good for the devs of that game. But did Intel need to use that to show off the technology? That is for children. Use the Total War franchise instead, or something similar, if you want to show new technology.

A complex strategy game is for children?
I could understand if you said that about Fortnite...

You seem a bit of a malcontent today.
Chill out, mate. It's just a video of a tech demo.
 
A complex strategy game is for children?
I could understand if you said that about Fortnite...

You seem a bit of a malcontent today.
Chill out, mate. It's just a video of a tech demo.
Because I am interested in the performance of that new card, I need a real game, not that mockery for children. If most people are childish, it is not my fault, but at 57, I cannot buy something like that. Sorry.
 
I agree with Geralt; this game is not the best example Intel could have chosen to show off this XeSS tech.
It's a niche game and it will stay that way forever, no matter the Steam reviews...

Intel, you need a big AAA game to show off XeSS, so let's see the next one.

I'm actually interested in this and their GPUs, despite not caring about their CPUs. I can do that, right?
 
I agree with Geralt; this game is not the best example Intel could have chosen to show off this XeSS tech.
It's a niche game and it will stay that way forever, no matter the Steam reviews...

Intel, you need a big AAA game to show off XeSS, so let's see the next one.

I'm actually interested in this and their GPUs, despite not caring about their CPUs. I can do that, right?

They already showed XeSS in Hitman 3.
 
I agree with Geralt; this game is not the best example Intel could have chosen to show off this XeSS tech.
It's a niche game and it will stay that way forever, no matter the Steam reviews...

Intel, you need a big AAA game to show off XeSS, so let's see the next one.

I'm actually interested in this and their GPUs, despite not caring about their CPUs. I can do that, right?

I think it's perfect for this demonstration since this is a game that already supports both DLSS and FSR. It's good for comparison purposes. Besides, it's a game with small elements and details, things that are hard to reconstruct from lower resolutions.
 
I agree with Geralt; this game is not the best example Intel could have chosen to show off this XeSS tech.
It's a niche game and it will stay that way forever, no matter the Steam reviews...

Intel, you need a big AAA game to show off XeSS, so let's see the next one.

I'm actually interested in this and their GPUs, despite not caring about their CPUs. I can do that, right?
I'm all for more competition in the GPU space, so even though I don't want Alder Lake, I'm looking forward to seeing how these perform and what the price will be.
 
FSR stands no chance on PC.
Good software is not in AMD's bag. Intel has more software engineers than AMD has staff.

That doesn't guarantee an Intel victory (we haven't seen reviews), but XeSS has a far better chance.

Intel's server parts compete with AMD in the server space, especially with their software optimizations. It's no joke. AMD is still a component company.
 
I'd be happy if these companies would just stop using random combos of letters and pseudo-words to name their products.
And use what instead? AMD is the only one not using "SS".

I think Intel did fine. Xe is their GPU line, and XeSS is the super sampling technique tied to that line. DLSS is descriptive of the technique Nvidia uses. Meanwhile, AMD went with FidelityFX Super Resolution, which tells you absolutely nothing about it other than that it messes with resolution.
 
DLSS and XeSS look much better than FSR, IMO. AMD needs to bring large improvements with FSR 2 on RDNA 3. At this stage, unless RDNA 3 also dramatically improves RT performance as well as FSR, I'd be more inclined to buy Nvidia's Lovelace. I won't buy first-gen Intel Arc, but it looks promising, and it's great for consumers to see a third player in GPUs. Still, I'm not writing off RDNA 3: we should get hardware-accelerated FSR, it will double performance and should keep the rasterisation crown, but a lot will hang on RT.
 
DLSS and XeSS look much better than FSR, IMO. AMD needs to bring large improvements with FSR 2 on RDNA 3. At this stage, unless RDNA 3 also dramatically improves RT performance as well as FSR, I'd be more inclined to buy Nvidia's Lovelace. I won't buy first-gen Intel Arc, but it looks promising, and it's great for consumers to see a third player in GPUs. Still, I'm not writing off RDNA 3: we should get hardware-accelerated FSR, it will double performance and should keep the rasterisation crown, but a lot will hang on RT.

You know XeSS and FSR are GPU independent? So you can choose as you like; if you have Nvidia, you have one extra solution. Though what matters is what developers do, or whether it matters at all.
It's still early days.

RT is still in its infancy for consumer GPUs. Both AMD and Nvidia, and probably Intel, will bring massive improvements in their next iterations, and I'm sure software cheats/techniques will come too.
I know nothing about it, but I could imagine one: instead of running 4K RT on, say, a water texture in real time, run a quick Monte Carlo simulation at, say, 540p, using AI and a known database to build the 4K image.
Monte Carlo simulations build quite an accurate model of things that are nearly impossible or very slow to calculate directly. E.g., maybe 20 quick, slightly varied 540p snapshots, when combined, give you a pretty accurate visual that AI can use, with an existing database, to build a pretty realistic 4K image.
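Just to make that hand-wavy idea a bit more concrete, here's a toy Python sketch of it (my own invention, not how any shipping upscaler works): it accumulates 20 slightly jittered low-resolution samples of a made-up scene into a higher-resolution estimate, Monte Carlo style. Every name and number in it is invented for the example.

```python
# Toy Monte Carlo-style accumulation: combine jittered low-res samples of a
# synthetic scene into a higher-resolution estimate. Purely illustrative;
# not a real renderer or upscaler.
import numpy as np

rng = np.random.default_rng(0)

def scene(x, y):
    """Hypothetical ground-truth image, defined at continuous coordinates."""
    return 0.5 + 0.5 * np.sin(10 * x) * np.cos(10 * y)

LOW, HIGH, SNAPSHOTS = 96, 384, 20   # low-res grid, high-res grid, passes

accum = np.zeros((HIGH, HIGH))   # sum of samples landing in each high-res cell
hits = np.zeros((HIGH, HIGH))    # how many samples landed in each cell

for _ in range(SNAPSHOTS):
    # Jitter the low-res sample positions a little on every snapshot.
    xs = (np.arange(LOW) + rng.random(LOW)) / LOW
    ys = (np.arange(LOW) + rng.random(LOW)) / LOW
    values = scene(xs[None, :], ys[:, None])

    # Splat each low-res sample into the high-res cell it falls in.
    xi = np.clip((xs * HIGH).astype(int), 0, HIGH - 1)
    yi = np.clip((ys * HIGH).astype(int), 0, HIGH - 1)
    accum[yi[:, None], xi[None, :]] += values
    hits[yi[:, None], xi[None, :]] += 1

# Average the accumulated samples; uncovered cells stay at zero (a real
# reconstruction step would fill those in, e.g. with a filter or a network).
estimate = np.divide(accum, hits, out=np.zeros_like(accum), where=hits > 0)
print(f"high-res cells covered after {SNAPSHOTS} snapshots: {(hits > 0).mean():.1%}")
```

Obviously a real engine can't afford 20 passes per frame; the interesting part of the idea is letting a trained model fill in whatever the samples miss.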
 
You know XeSS and FSR are GPU independent? So you can choose as you like; if you have Nvidia, you have one extra solution. Though what matters is what developers do, or whether it matters at all.
It's still early days.

RT is still in its infancy for consumer GPUs. Both AMD and Nvidia, and probably Intel, will bring massive improvements in their next iterations, and I'm sure software cheats/techniques will come too.
I know nothing about it, but I could imagine one: instead of running 4K RT on, say, a water texture in real time, run a quick Monte Carlo simulation at, say, 540p, using AI and a known database to build the 4K image.
Monte Carlo simulations build quite an accurate model of things that are nearly impossible or very slow to calculate directly. E.g., maybe 20 quick, slightly varied 540p snapshots, when combined, give you a pretty accurate visual that AI can use, with an existing database, to build a pretty realistic 4K image.
I believe XeSS works on any GPU but can also use hardware acceleration on Xe GPUs. So it's like the best of both DLSS and FSR, I guess.

Also, don't put down RT so easily. I have an RTX 2080, I've had it for years at this point, and there is no game that I can't run at 60 fps or higher with RT on at 1440p. There is so much bullshit surrounding RT, it's ridiculous. Yes, it does tank performance in general, but not beyond the point of playability. If you own a 60 Hz monitor, you really don't lose much. The Riftbreaker, as mentioned here, runs at about 90 fps at 1440p with RT on. It's a great experience.

I've also noticed that the number of games with RT has ballooned over the last year or so. I now find that most new games I buy have it; even Ubisoft games are getting it, and they completely ignored it in the beginning. Hell, even Fortnite and COD Warzone have it, apparently.

Personally, if I were spending big money on a GPU today, I wouldn't consider a GPU that didn't perform well with RT; you would be losing out on so much in today's games. Of course there will be big improvements to come, but that's always the case.

 
Because I am interested in the performance of that new card, I need a real game, not that mockery for children. If most people are childish, it is not my fault, but at 57, I cannot buy something like that. Sorry.
I'm 56 and find that Riftbreaker on Win10 is interesting, but it doesn't hold my interest long enough to be bothered to play it. I tend to prefer Guild Wars 2, and my RX 5600 XT is fast enough and does what I need at 2K.

Three issues here, pup. First, it involves tweaking the encoder app to get the best images, and I suspect 20+ hours of effort. Second, the card isn't available yet, so as far as I'm concerned this is vaporware until you can actually buy it. Until then, look at what Intel is promising and see how close they actually get with the real product. Third, why no listing of what CPU they used when playing? Maybe they used a tweaked Alder Lake 12900K to get things this clear, so once again: vaporware.
 