Intel Iris Xe DG1 gaming performance revealed in tests

midian182

In a nutshell: We already knew that Intel’s Xe DG1 graphics card isn’t going to be competing with the latest and greatest from Nvidia and AMD, but a new review shows it has potential as a budget 1080p option.

The first benchmark of the Intel Xe DG1 graphics card appeared on the Basemark GPU database back in April. The result suggested it offered performance inferior to the Polaris-based Radeon RX 550 released back in 2017, which isn’t great.

However, YouTube channel ETA PRIME managed to get its hands on a DG1 via a prebuilt, $749.99 CyberPowerPC gaming system that includes an Asus DG1-4G. The card was tested across a range of games, giving us a better idea of its capabilities.

The card's specs come in at 80 execution units (EUs), or 640 shading units, 4GB of LPDDR4X-4266 on a 128-bit interface, and a 1,500 MHz boost clock. It draws just 30W, meaning no external PCIe power connector is required and the card can be passively cooled.
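For a quick sanity check on those figures, the shading-unit count and theoretical memory bandwidth can be worked out from the spec sheet. The sketch below is not from the review; it assumes the usual 8 ALUs per Xe-LP EU and reads the memory spec as a transfer rate in MT/s:

```python
# Rough numbers implied by the DG1 specs quoted above.
# Assumption (not from the article): each Xe-LP EU contains 8 ALUs ("shading units").

eus = 80
alus_per_eu = 8
shading_units = eus * alus_per_eu          # 640, matching the article

bus_width_bits = 128
transfer_rate_mts = 4266                   # LPDDR4X-4266
bandwidth_gb_s = transfer_rate_mts * (bus_width_bits / 8) / 1000

print(shading_units)                       # 640
print(round(bandwidth_gb_s, 1))            # ~68.3 GB/s, roughly half a GDDR5 GTX 1050's ~112 GB/s
```

That bandwidth gap is worth keeping in mind when the card is compared against older GDDR5 designs with a similar shader count.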

The PC comes with a single stick of 8GB DDR4-3000 memory, which ETA PRIME swapped out for dual-channel 16GB DDR4-3600 to give a bit of a performance uplift. Interestingly, it also features a Core i5-11400F, despite Intel previously saying the GPU could only be paired with 9th-Gen (Coffee Lake) and 10th-Gen (Comet Lake) processors. It appears that Rocket Lake has been added to the compatibility list.

The gaming benchmarks show the Xe DG1 can handle most games at 1080p pretty well when the graphics are set to normal or low. GTA V gets between 79 and 92 fps at standard settings, Forza 4 averages over 60 fps (low), and Genshin Impact sits around 60 fps (medium). As you might imagine, some demanding games do struggle: Cyberpunk 2077 can only manage 30 fps at 720p with low settings, while Red Dead Redemption 2 gets between 32 and 47 fps at 900p with settings downgraded to low.

While the Xe DG1 looks to be on par with some of the latest Ryzen APUs, the recently revealed Xe-HPG (DG2) has at least one SKU featuring 512 EUs, which could place it somewhere between the RTX 3070 and RTX 3080, performance-wise.


 
Saw this yesterday on ETA Prime's channel (he does rather good content on low-end rigs, dev boards, and other such devices like tablets), and it is fairly disappointing that Intel couldn't at least match GTX 1050 performance: their dedicated GPU, even if it's the low-end one, can't even compete with AMD's APUs right now.

It really must be just that tiny bit of silicon that would go into the integrated Iris Pro graphics, given its own PCB, heatsink, and more ample power limits, but almost no extra cores.
 
Saw this yesterday on ETA Prime's channel (he does rather good content on low-end rigs, dev boards, and other such devices like tablets), and it is fairly disappointing that Intel couldn't at least match GTX 1050 performance: their dedicated GPU, even if it's the low-end one, can't even compete with AMD's APUs right now.

It really must be just that tiny bit of silicon that would go into the integrated Iris Pro graphics, given its own PCB, heatsink, and more ample power limits, but almost no extra cores.
It's a 30-watt GPU being paired with a CPU without integrated graphics during a GPU drought. Neither Asus nor Intel made any performance claims about this new GPU. Everything from the passive cooling to the low power draw to the likely low cost says "this is a budget GPU." I don't know why you got your hopes up.
 
It's a 30-watt GPU being paired with a CPU without integrated graphics during a GPU drought. Neither Asus nor Intel made any performance claims about this new GPU. Everything from the passive cooling to the low power draw to the likely low cost says "this is a budget GPU." I don't know why you got your hopes up.
As I alluded to, we know what AMD can do given about the same power limit on an APU (65 watts shared with the CPU, so 30 watts is generous).

But to your point, I guess my issue has more to do with how useless I find these types of cards than with this one specifically, whether it's this one or a 1030 or a 710/730 or anything like that: if the option is there to do just as well with integrated graphics, then why not choose AMD for it?

It seems to have a lot less to do with tech and a lot more to do with companies that have an overall contractual obligation with Intel to *not* use AMD APUs and just need something for this very specific "better than UHD but not worth a high-performance GPU" case.

But assuming Intel can scale up to the 100-150 watt range, then I probably would consider them, especially if they do better on supply than Nvidia and AMD right now. We'll see.
 
As I alluded to, we know what AMD can do given about the same power limit on an APU (65 watts shared with the CPU, so 30 watts is generous).

But to your point, I guess my issue has more to do with how useless I find these types of cards than with this one specifically, whether it's this one or a 1030 or a 710/730 or anything like that: if the option is there to do just as well with integrated graphics, then why not choose AMD for it?

It seems to have a lot less to do with tech and a lot more to do with companies that have an overall contractual obligation with Intel to *not* use AMD APUs and just need something for this very specific "better than UHD but not worth a high-performance GPU" case.

But assuming Intel can scale up to the 100-150 watt range, then I probably would consider them, especially if they do better on supply than Nvidia and AMD right now. We'll see.


So what are you asking? Why would Intel bother making this if an AMD APU is almost as good as it?

The answer is that they are not AMD; they are developing GPUs. They haven't produced these benchmarks as some sort of "beats an AMD APU" marketing campaign.

If they had, then your question would be valid; as it stands, I would call it retarded.

Or are you asking why someone would buy this to play games?

The answer is they can't; it isn't available separately.

The only point of this article is to give some indication of performance, and for a 30W card to compete with 30W cards from Nvidia (GT 1030) seems like a decent result.

Yes, the GT 1030 is a few years old now, but this is already last-gen Intel.

Sounds like you have AMD bed sheets and wallpaper, but can't you wait until Intel actually releases a GPU before you say they have failed? It would add to your credibility.
 
Saw this yesterday on ETA Prime's channel (he does rather good content on low-end rigs, dev boards, and other such devices like tablets), and it is fairly disappointing that Intel couldn't at least match GTX 1050 performance: their dedicated GPU, even if it's the low-end one, can't even compete with AMD's APUs right now.

Compared to the 1050, it has the same number of shaders, much slower memory, and less than half the design power. It had no chance against a mature product like the 1050. We'll see if Intel can couple this with proper GDDR5 and a decent TDP to extract something closer to the expected performance.
 
Well, at least they should be available... maybe? Hard to believe Intel screwed this pooch by not making the GPU in their own fabs. The only good thing to come from this is that we will have three manufacturers of discrete cards.
 
Why anyone would spend $750 on a PC that can only run games at 1080p on medium or low, in 2021, escapes me. Honestly, I'd spend that money on a PS4, an Xbox Whatever, or a used rig with significantly more power, and wait until the situation improves. Or get into movies instead.

$750 for that performance is daylight robbery, even if it actually turns on, which given Cyberpower’s reputation might be an ask too far.
 
As I alluded to, we know what AMD can do given about the same power limit on an APU (65 watts shared with the CPU, so 30 watts is generous).

But to your point, I guess my issue has more to do with how useless I find these types of cards than with this one specifically, whether it's this one or a 1030 or a 710/730 or anything like that: if the option is there to do just as well with integrated graphics, then why not choose AMD for it?

It seems to have a lot less to do with tech and a lot more to do with companies that have an overall contractual obligation with Intel to *not* use AMD APUs and just need something for this very specific "better than UHD but not worth a high-performance GPU" case.

But assuming Intel can scale up to the 100-150 watt range, then I probably would consider them, especially if they do better on supply than Nvidia and AMD right now. We'll see.

Here is what I found on the Intel website about this GPU:

"Elevate creativity and productivity with Intel® Iris® Xe dedicated graphics, available in systems through select partners. Experience amazing HD video capabilities for work, home, and remote learning with the first dedicated GPU Add-in Card for desktop PCs based on Xe architecture. With support for up to 3 simultaneous 4K HDR displays, you can immerse yourself in your favorite content or multi-task like a pro." So based on Intel's marketing what is your specific issue with this video card?

Where is the evidence that this GPU, which you can't buy separately at this time, is better or worse than an APU? Do you have comparisons for creative tasks comparing the two?

For gaming, I'm finding that the currently overpriced but available 3400G ($287 vs. $149 MSRP) is well behind the Intel Iris Xe DG1 in Fortnite at 1080p with low settings. It's one of the few tests I could find showing the same game on both the 3400G and the Intel Iris Xe DG1 (terribly long name, by the way):
3400G: 67 fps (low) / 81 fps (high)
Xe DG1: 108 fps (low) / 262 fps (high)

How much is a system based on a 3400G going to cost? The CyberPowerPC with only 8GB of single-channel RAM costs $749; are you going to be able to put together a decent 3400G-based system that can beat the CyberPowerPC? At that price point, does it really matter? No one running a business doing creative work is going to buy such a system, and anyone doing creative work as a hobby doesn't need to save 20 seconds on a video render.

Where did you get your objective evidence that anyone has a contract with Intel specifically barring them from using AMD APUs in their systems? CyberPowerPC has at least 40 AMD-powered systems. What would lead you to believe Intel is stopping them from using APUs? Could it have something to do with APU availability and cost and not some conspiracy? What is CyberPowerPC supposed to do with all the 10400F chips they have? I assume they've had CPUs and motherboards sitting around waiting for GPUs for a while now. Does AMD have an exclusive contract with Microsoft and Sony to make SoCs for their consoles? Is there anything wrong with that?

Instead of basing your opinions on how you feel about something, please bring some objective evidence; a bunch of benchmarks using the same resolution and settings would be a good place to start. You can't buy the card separately anyway, so what does it matter? You can't buy 5000-series APUs either, so what's the point of making comparisons?
 
Why anyone would spend $750 on a PC that can only run games at 1080p on medium or low, in 2021, escapes me. Honestly, I'd spend that money on a PS4, an Xbox Whatever, or a used rig with significantly more power, and wait until the situation improves. Or get into movies instead.

$750 for that performance is daylight robbery, even if it actually turns on, which given Cyberpower’s reputation might be an ask too far.
Because a computer with better performance would cost substantially more. $750 is less than most RTX 2060s are going for. It's as if you didn't realize there was a GPU shortage with high demand for PC gaming.
 
Here is what I found on the Intel website about this GPU:

"Elevate creativity and productivity with Intel® Iris® Xe dedicated graphics, available in systems through select partners. Experience amazing HD video capabilities for work, home, and remote learning with the first dedicated GPU Add-in Card for desktop PCs based on Xe architecture. With support for up to 3 simultaneous 4K HDR displays, you can immerse yourself in your favorite content or multi-task like a pro." So based on Intel's marketing what is your specific issue with this video card?

Where is the evidence that this GPU, which you can't buy separately at this time, is better or worse than an APU? Do you have comparisons for creative tasks comparing the two?

For gaming, I'm finding that the currently overpriced but available 3400G ($287 vs. $149 MSRP) is well behind the Intel Iris Xe DG1 in Fortnite at 1080p with low settings. It's one of the few tests I could find showing the same game on both the 3400G and the Intel Iris Xe DG1 (terribly long name, by the way):
3400G: 67 fps (low) / 81 fps (high)
Xe DG1: 108 fps (low) / 262 fps (high)

How much is a system based on a 3400G going to cost? The CyberPowerPC with only 8GB of single-channel RAM costs $749; are you going to be able to put together a decent 3400G-based system that can beat the CyberPowerPC? At that price point, does it really matter? No one running a business doing creative work is going to buy such a system, and anyone doing creative work as a hobby doesn't need to save 20 seconds on a video render.

Where did you get your objective evidence that anyone has a contract with Intel specifically barring them from using AMD APUs in their systems? CyberPowerPC has at least 40 AMD-powered systems. What would lead you to believe Intel is stopping them from using APUs? Could it have something to do with APU availability and cost and not some conspiracy? What is CyberPowerPC supposed to do with all the 10400F chips they have? I assume they've had CPUs and motherboards sitting around waiting for GPUs for a while now. Does AMD have an exclusive contract with Microsoft and Sony to make SoCs for their consoles? Is there anything wrong with that?

Instead of basing your opinions on how you feel about something, please bring some objective evidence; a bunch of benchmarks using the same resolution and settings would be a good place to start. You can't buy the card separately anyway, so what does it matter? You can't buy 5000-series APUs either, so what's the point of making comparisons?
Well, that's a lot of text to say "This isn't for gaming, it's for content creation."

Which is a fine point: Intel might never push this explicitly for gaming, for all we know at this point, and I am aware that while Intel's iGPUs are decisively slower than AMD's APUs (with the latest Iris Pro and the previous Iris lineup being the one exception where it kind of gets up to par), there are some very heavy optimizations, for example in Adobe software, that lean heavily on Intel graphics and such.

But given all that, is it really that surprising that I was talking strictly about gaming A/B comparisons when commenting on an article describing a video that explicitly talks about gaming?
 
Because a computer with better performance would cost substantially more. $750 is less than most RTX 2060s are going for. It's as if you didn't realize there was a GPU shortage with high demand for PC gaming.
Yeah, I know; that's literally why I said I'd rather spend the money on a PS4, or any other option, if I wanted to game. It's as if you didn't realise I was saying this isn't a viable option for gaming, in an article that describes its gaming performance.
 