Intel Iris Xe DG1 GPU first benchmark suggests inferior performance to the Radeon RX 550

jsilva

Posts: 325   +2
TL;DR: The first benchmark of the Intel Xe DG1 graphics card has been added to the Basemark GPU database. It's still too soon to draw conclusions, but based on this result alone, the OEM-oriented Intel graphics card seems slightly inferior to the Polaris-based Radeon RX 550 that we tested back in 2017.

The Basemark entry spotted by Apisak implies that this is the Asus DG1-4G that we have previously shown to you. To put the entry's result into perspective, Apisak compared it to a Radeon RX 550 graphics card, a sub-$100 entry-level GPU from four years ago.

Indeed, back in 2017 we published an Esports GPU battle between two GPUs that were supposedly selling under $100: AMD's Radeon RX 550 and Nvidia's GeForce GT 1030.

Based on Intel's 10nm SuperFin node, the Asus DG1-4G features a cut-down Iris Xe Max GPU clocked at 1,500 MHz with 80 execution units (EUs). The memory buffer is composed of 4 GB of LPDDR4X-4266 across a 128-bit bus, and the whole card consumes about 30 W.

As for the Radeon RX 550, the 14nm GPU comes with 10 compute units (CUs) clocked at 1,183 MHz and 4 GB of GDDR5 at 7,000 MHz (effective) on a 128-bit bus. Rated at a 50 W TBP, this Radeon GPU offers a memory bandwidth of up to 112 GB/s.
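That 112 GB/s figure follows from the usual peak-bandwidth arithmetic: effective transfer rate times bus width, divided by 8 bits per byte. A minimal sketch (the helper name is my own, not from the article):

```python
# Peak memory bandwidth in GB/s = effective rate (MT/s) * bus width (bits) / 8 / 1000.
def peak_bandwidth_gbs(effective_mts: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_mts * bus_width_bits / 8 / 1000

# RX 550: 7,000 MT/s effective GDDR5 on a 128-bit bus -> 112.0 GB/s
rx550_bw = peak_bandwidth_gbs(7000, 128)
# DG1: LPDDR4X-4266 on a 128-bit bus -> roughly 68 GB/s
dg1_bw = peak_bandwidth_gbs(4266, 128)
```

The same arithmetic shows the DG1's LPDDR4X gives it noticeably less bandwidth than the RX 550's GDDR5, despite the identical bus width.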

Both systems are using an Intel Core i3-10100F. For a fair comparison, only the Vulkan API scores will be considered. Comparing the 17,289 points of the DG1 graphics card to the Radeon RX 550's score of 17,619, there's a small 2% difference between the two.

Also read: The Last Time Intel Tried to Make a Graphics Card

Such a marginal difference in a single benchmark isn't enough to determine which one is faster, but it gives us an idea of what the DG1 will be capable of. As Intel launches more GPUs into the market, we'll certainly see higher-performance cards than this one. We still don't know when that will happen, but it should be sometime this year.


 
I wonder if it will accelerate applications like premiere pro etc. Not everyone uses GPUs for gaming. Right now an Intel iGPU thrashes a 6900XT at Adobe CC apps purely because Radeons don’t support hardware acceleration in that software.

Also I wonder if it can mine. If it can mine then Intel will make money on it regardless of how poor its gaming performance is. And could anyone blame a company for cashing in on this newfound highly lucrative silicon market?
 
Well, I did call this out as being at least as good as a 1650, so it looks like it might be passable as a "well, there's just no other GPU available" option. Well, maybe: we still need to see if they can get close to that $150 mark when it comes to price. If they want over $200 for this, then it's probably not worth it vs. just sticking with what you have and waiting.
 
Like I said years back Raja is nothing special. All the money INTEL spent and the amount of ex AMD workers that INTEL grabbed and still nothing!

Save some money let Raja sell lemonade instead.
 
Like I said years back Raja is nothing special. All the money INTEL spent and the amount of ex AMD workers that INTEL grabbed and still nothing!

Save some money let Raja sell lemonade instead.

He did do a great job of overselling Fiji and Vega.
 
With only 80 EUs, this is the same 80 EU GPU Intel includes in their laptop U-class Tiger Lake 4-core Core i5, and is smaller than the 96 EU GPU found in the Tiger Lake i7.

It's a laptop iGPU.

So matching the RX 550 seems about what to expect from this first try, especially seeing as it doesn't even have GDDR5 or better.
 
If it could work in conjunction with the iGPU, à la AMD Hybrid CrossFire, that'd be interesting for a low-profile media streaming/party gaming PC. But just 80 EUs is boring; I want to see one of these running at the 300 W power draw of an 11900K. 512+ EUs. THAT would be interesting.
 
If their top card can match a GTX 1070, that would be amazing for a first attempt. Nevertheless, the important thing is that hopefully, in 3-5 years from now, we may have a good third option.
 
With only 80 EUs, this is the same 80 EU GPU Intel includes in their laptop U-class Tiger Lake 4-core Core i5, and is smaller than the 96 EU GPU found in the Tiger Lake i7.

It's a laptop iGPU.

So matching the RX 550 seems about what to expect from this first try, especially seeing as it doesn't even have GDDR5 or better.
It is indeed essentially a GPU stripped out of a laptop CPU, but when one compares the component specifications and theoretical throughputs of the DG1 to those of the AMD Lexa used in the Radeon RX 550, the benchmark result is somewhat underwhelming:

DG1 (1500 MHz) vs RX550 (1183 MHz)
FP32 peak throughput = 1.98 vs 1.21 TFLOPS
Peak pixel output rate = 31.0 vs 18.9 GPixels/s
Peak texture sampling rate = 62.0 vs 37.9 GTexels/s
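Those TFLOPS figures are consistent with the usual peak-FP32 arithmetic (FP32 lanes × 2 FLOPs per FMA × clock), assuming 8 FP32 lanes per Xe EU, a ~1,550 MHz boost clock for the DG1, and the 512-shader RX 550 variant; the lane counts and clocks here are my assumptions, not stated in the comment:

```python
# Peak FP32 throughput in TFLOPS = lanes * 2 FLOPs per cycle (FMA) * clock (GHz) / 1000.
def fp32_tflops(lanes: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput in TFLOPS."""
    return lanes * 2 * clock_ghz / 1000

# DG1: 80 EUs * 8 FP32 lanes per Xe EU (assumed), ~1.55 GHz boost (assumed)
dg1 = fp32_tflops(80 * 8, 1.55)     # ~1.98 TFLOPS
# RX 550: 8 CUs * 64 lanes per GCN CU (512-shader variant, assumed), 1.183 GHz
rx550 = fp32_tflops(8 * 64, 1.183)  # ~1.21 TFLOPS
```

Note that peak arithmetic throughput says little on its own; as the benchmark shows, the DG1's sizable theoretical lead doesn't translate into a real-world win over the RX 550.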

One can pull up some other graphics cards, from the database, to compare it to:


The High detail results are rendered at 4K and then scaled down to the monitor's resolution (same as 3DMark), whereas the Medium detail test is run at the monitor's native resolution. One can see that the DG1 lands about level with a GeForce GT 1030 (DDR4 or GDDR5 model). The DDR4 version has especially miserable throughput figures (just 17 GB/s of memory bandwidth, for example), so the DG1 doesn't exactly stand out here.
 
You can't build silicon Rome in a day. Launch it cheap, get it in the wild. Get experience from customer feedback, improve drivers which are critical to GPU performance. Move on from there. Iterate quickly.

Perhaps in another couple of years Intel will have finally fixed what they call their 7nm process node and will be able to turn out competitive midrange GPUs on it. Which will be enough to make the enterprise profitable.
 
You can't build silicon Rome in a day. Launch it cheap, get it in the wild. Get experience from customer feedback, improve drivers which are critical to GPU performance. Move on from there. Iterate quickly.
I don't recall Rome having a GPU, or launching cheap...
 
The only thing worth spending money on is ray tracing performance; otherwise, any graphics card from past gens would do fine, really...
 
Underwhelming, but that was to be expected. Maybe second or third gen Xe will be an actual competitor for AMD and Nvidia.
First off, it's not for sale to consumers yet. Secondly, it was only in Vulkan. Thirdly, it's a low-power video card during a video card drought.
 