Early benchmarks make Intel's Arc mobile GPU look like a gimped GTX 1650 Max-Q

nanoguy

In brief: Intel's first wave of Arc graphics cards isn't here yet, but the company has been hyping up the Xe-HPG core and support for Xe Super Sampling (XeSS). Early benchmarks have started to appear online, and if accurate, they suggest that Intel still has a lot of work to do if it wants to have "Alchemist" ready for a 2022 release.

Intel is currently working on its Arc high-performance GPUs for desktop and mobile form factors, but the first products of that effort won't arrive until sometime in the first quarter of 2022. Chipzilla has been dropping hints about its plans for Arc and how it intends to challenge both Nvidia and AMD in the discrete GPU market, but other than a roadmap and some nifty software tools, we've had little to go on when it comes to how these new GPUs might perform.

A number of leaks have suggested that Intel's upcoming "Alchemist" graphics solutions will come in several SKUs, ranging from 128 execution units and 4 GB of VRAM on a constrained 64-bit bus up to a high-end model with 512 execution units and 16 GB of VRAM on a 256-bit bus, the latter said to land roughly between an Nvidia RTX 3070 and an RTX 3080 in terms of performance.

Thanks to some early benchmarks spotted by @Tum_Apisak, we can get a very rough idea of how well the mobile variant of the top-end Alchemist GPU might perform. These are Geekbench 5 tests performed with an Intel Tiger Lake CPU, and we're likely looking at an early engineering sample, but the picture isn't pretty for Alchemist.

It looks like the engineering sample obtained an OpenCL score of 34,816 points when clocked at 1,800 MHz, which would be a rather disappointing result if taken at face value. For reference, this is around the same level as Nvidia's last-gen GeForce GTX 1650 Max-Q, which is actually able to score in excess of 36,000 points on the same test.
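To put those two numbers side by side, here is a trivial sketch of the gap, using the 34,816-point result from the listing and treating 36,000 as a conservative floor for the GTX 1650 Max-Q, since the article only says it scores "in excess of" that figure:

```python
# Compare the leaked Alchemist OpenCL score against a GTX 1650 Max-Q floor.
alchemist_score = 34_816      # Geekbench 5 OpenCL, engineering sample at 1,800 MHz
gtx_1650_maxq_floor = 36_000  # the 1650 Max-Q scores "in excess of" this

ratio = alchemist_score / gtx_1650_maxq_floor
deficit = gtx_1650_maxq_floor - alchemist_score
print(f"Alchemist sample reaches ~{ratio:.1%} of the 1650 Max-Q floor "
      f"(a deficit of at least {deficit:,} points).")
```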

Granted, this is only a single test that measures the compute performance of the Alchemist GPU, which, again, is an early sample clocked at a rather low 1,800 MHz. Intel suggested during its Architecture Day presentation that we can expect the Xe-HPG graphics engine to offer 1.5 times the performance per watt and clock frequency of the Xe-LP engine found in the Iris DG1 card released earlier this year.

This works out to a theoretical boost clock in excess of 2 GHz, which is a further indication that we're looking at very early silicon here, and in a mobile form factor no less, meaning it could be power-constrained. Back in June, someone tested the Iris DG1's gaming performance and found it surprisingly decent across a number of popular games, despite a barrage of early benchmarks suggesting otherwise.
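As a rough sanity check on that clock-speed math, here is a minimal sketch of the arithmetic under stated assumptions: the Xe-LP baseline boost clocks below are illustrative guesses for Tiger Lake's iGPU and the DG1 card, not figures quoted in the article.

```python
# Back-of-the-envelope check of Intel's "1.5x frequency over Xe-LP" claim.
FREQUENCY_UPLIFT = 1.5  # Architecture Day claim for Xe-HPG vs. Xe-LP

# Assumed Xe-LP boost clocks in GHz (illustrative, not from the article).
xe_lp_baselines_ghz = {
    "Tiger Lake Xe-LP iGPU (assumed)": 1.35,
    "Iris Xe DG1 card (assumed)": 1.65,
}

for name, base_ghz in xe_lp_baselines_ghz.items():
    implied_ghz = base_ghz * FREQUENCY_UPLIFT
    print(f"{name}: {base_ghz:.2f} GHz x {FREQUENCY_UPLIFT} = ~{implied_ghz:.2f} GHz")

# Either baseline puts the implied Xe-HPG boost clock above 2 GHz, which is why
# the 1,800 MHz engineering sample reads as early, conservatively clocked
# (and likely power-constrained) mobile silicon.
```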

If anything, the same could happen with Alchemist, and it goes without saying that Intel only needs to come up with entry-level and mid-range GPUs under the current market conditions, which are expected to persist for at least another year. If it can manufacture them without disruption at its own chip plants, many gamers and cryptocurrency miners will buy them in a heartbeat.


 
Ok so that is the NVIDIA GeForce GTX 1650 Ti Max-Q, a laptop-only GPU that scores 3,162 in 3DMark.

We are talking about 45% of the performance of a stock GTX 1080 Founders Edition / RTX 2060.

Still, it's a lot better than a 1050 Ti and a little slower than the 1060 3GB, Steam's most popular GPU.

Idk why you are so contemptuous and condescending. It's obvious this wasn't meant to pwn the 3090 that guy who posts here owns and reminds us about all the time.
 
I hope they can pull off at least 1070/1080 performance on discrete cards and flood the market. I'm sure I'll be buying one if it's at MSRP.
 
When it comes to hardware, it does look to be a strong offering, don't get me wrong. But given how notoriously bad Intel has been at driver support, and how driver optimization is basically non-existent, it's really hard to be optimistic about this product from a gaming perspective.

So at least for me, I'm okay with being extremely cautious about the eventual release of these cards: if I get one, I expect to wait at least six months for the community to properly test them and figure out where driver support for gaming stands. I really don't expect these to be usable for average gamers before that six-months-after-release timeframe.
 
It's never going to be "ready" until they put more shaders in the thing.

96 Tiger Lake shaders = around GT 1030, so 512 shaders = around 4x the GT 1030's performance, or a 1650.
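For what it's worth, here is a quick sketch of that scaling estimate. It assumes the 96-EU Tiger Lake iGPU as the baseline and the rumored 512-EU top Alchemist SKU from the article; the "around GT 1030" starting point is the comment's own characterization, not benchmark data.

```python
# Naive EU-count scaling from a 96-EU Tiger Lake baseline to a 512-EU Alchemist part.
BASELINE_EUS = 96    # Tiger Lake Xe-LP iGPU, pegged at "around GT 1030" above
ALCHEMIST_EUS = 512  # rumored top Alchemist SKU

linear_ratio = ALCHEMIST_EUS / BASELINE_EUS
print(f"Linear EU scaling: ~{linear_ratio:.1f}x the 96-EU baseline")

# Perfectly linear scaling would suggest ~5.3x a GT 1030-class baseline, but GPUs
# rarely scale linearly with EU count (memory bandwidth, power limits and driver
# overhead all eat into it), so a real-world uplift closer to 4x, i.e. roughly
# GTX 1650 territory, is a plausible reading of these numbers.
```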
 
So what's the AMD equivalent to this Nvidia GPU?

Oh that's right, only Nvidia makes GPUs...

Why do you come here? Literally every article mentioning nvidia, no matter how obviously relevant, you complain about how biased Techspot is. At least QP occasionally makes valid points in between all the boasting. I’m not even going to start on using $ in place of ‘s’ like an early 2000s 12 year old.

Back on point, if Intel can deliver mid-range performance at a mid-range price point then this could be very exciting. The red flag is that Intel has never wanted to play in the value segment and Raja has a history of delivering lacklustre performance. I'm just hoping they've been able to match the R&D budget to the marketing, or it'll be a hot mess in every sense.

 
I'm actually quite hopeful for Intel
1) They have deep pockets.
2) The CPU, AI, and server competition is heating up and there are definitely synergies, so they need to extend their product base, plus incorporate GPU tech into AI, etc.
3) They already have knowledge of GPUs - even if mainly only attached to CPUs - so knowledge of simple drivers at least.
4) There's an evolution at the moment in GPU and chip design, from huge beasties to lots of smaller beasties - they need to be on top of this for CPU design as well.
5) They already have a lot of skill in chip design and, very importantly, intra-/inter-chip communication.
6) In line with number 1 (deep pockets), they have engineers and can buy engineers to analyse, take apart, copy, and x-ray whatever the competition designs.

So there's a big shake-up going on.
And that's ignoring other types of tech (light, quantum, organic, etc.).
After this, we'll see an evolution toward highly reprogrammable arrays/CPUs/GPUs reconfigured on the fly, with super small, fast memory/transistors - probably not tied to shaders, etc. - but maybe generating ultra-realistic graphics from a small data set or seed. Plus, if you have an RTX 9090, you certainly don't want it just producing some pretty graphics and sound; it will need to do AI, help fix old photos/videos, and analyse large data sets. I'd be definitely annoyed to have a $4,000 part just sitting there; I want to be able to talk to it and have it batch-process any data on my network or the internet as I choose. You lot write scripts to select, rename, etc.; I'm too lazy to learn, so I just download an approximation.

So I'm optimistic for Intel. I'm optimistic that in 10 years we'll be able to make seriously good games ourselves, with more powerful GPUs, assets, and engines to create unique-looking games, and that a lot of the hard stuff like game timings can easily be tweaked - i.e., the GPU/engine will playtest the game for human-like play optimisations, all done via natural language.

Who doesn't want their dog or cat helping them out in a game?

tl;dr: Intel will probably do well.
 
Yeah, a decent $200 GPU for 1080p would be appreciated. Even $300. I'm stuck on Raja's previous $200 GPU, the RX 480, which, sure, wasn't much to get excited about, but the performance it offered for its price was decent. I'd take a ray tracing-capable GPU for $200-$300 to put some life in my old LG 1080p 60Hz panel until better GPUs come down in price. Currently $300 might get you a 5500 XT or, if you're very lucky, a used RTX 2060.

But I must admit, my expectations are not high.
 
In my opinion, Intel's Arc dedicated GPUs are too late to the game, especially the top-end model that is expected to compete with the likes of the RTX 3070 and RX 6700 XT. The former was announced and released last year, with a new generation expected in 2022, which is when Arc is supposed to launch if I am not mistaken. So whatever attractiveness the Arc GPUs have will be negated as we close in on the later part of 2022. The only saving grace for Intel is that the GPU shortage may drive a higher adoption rate of its Arc GPUs. But this is a double-edged sword for them, because if Intel messes up the initial experience, not many gamers will be interested in buying one when things eventually recover. It is like how AMD messed up the RDNA launch with very bad driver issues, which made a lot of people skeptical when it released RDNA2. Even now, I see people asking whether AMD GPU drivers are stable or not. That impression sticks around.
 
"Why do you come here?"
There are some smart and informative posts in a few articles here.
"Literally every article mentioning nvidia, no matter how obviously relevant, you complain about how biased Techspot is."
Even though the facts are in front of your eyes, you can't or prefer not to see them, hence your comment.
"At least QP occasionally makes valid points in between all the boasting."
According to my “likes”-to-posts ratio, many agree with my points, but...
"I'm not even going to start on using $ in place of 's' like an early 2000s 12 year old."
If you truly believe that corporations like Intel and Nvidia don't pay SOME venues for positive reviews or constant bombardment of their goods, you are in no position to call anyone a juvenile.
 
Intel really can't fail. After watching the market since the early '90s, Intel can't fail with what's been going on lately. I'm worried about drivers hurting Intel, or APUs from AMD that may push Intel away.

For the first time ever I will say this: I hope Intel succeeds; we need more players. And these stupid miners - miners will buy Intel time.
 
My prediction: Intel's cards will be 100% geared for mining. Miners buy ten times the cards that gamers do - that's their real target market.
 
I'm curious about drivers, and how Intel GPUs will handle older games that had patches and bug fixes for Nvidia and AMD but not for Intel.
 