Intel Arc Alchemist flagship clocks in at 2.4 GHz in leaked benchmark

mongeese

Staff
Forward-looking: After many delays, Intel is finally within months of delivering its first generation of discrete gaming GPUs, branded Arc Alchemist. Engineering samples are appearing regularly in public benchmark databases, giving us insight into their performance and specifications.

Already, we know quite a bit about the series. At the top end, the Arc Alchemist flagship has 512 EUs and 16 GB of GDDR6, and then there are at least two smaller models with 256 and 128 EUs and correspondingly smaller amounts of memory.

Before today, Intel GPUs had been seen at speeds of up to 2.1 GHz. But this morning, a handy Twitter bot surfaced a new OpenCL benchmark entry for the 512 EU model, and it listed a peak frequency of exactly 2.4 GHz.

On paper, a GPU with 512 EUs clocked at 2.4 GHz delivers 19.7 TFLOPS of FP32 compute. Its rumored direct competitors, the RTX 3070 and 3070 Ti, offer just a fraction more.
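For the curious, the math behind that figure is straightforward. Here's a rough sketch, assuming each EU (execution unit, the basic shader building block of Intel's Xe architecture) contains eight FP32 ALUs and counting a fused multiply-add as two operations per clock:

```python
# Back-of-the-envelope FP32 throughput for the Arc Alchemist flagship.
# Assumes 8 FP32 ALUs per EU (Xe-HPG) and 2 FLOPs per ALU per clock (FMA).
eus = 512
alus_per_eu = 8
flops_per_clock = 2        # a fused multiply-add counts as 1 mul + 1 add
clock_ghz = 2.4

tflops = eus * alus_per_eu * flops_per_clock * clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS")  # 19.66, which rounds to the 19.7 quoted above
```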

Performance in the benchmark didn’t quite measure up, though. At 85,448 points, it scored only slightly better than an RTX 2060 Super. But there are any number of legitimate reasons why it might’ve underperformed, the likeliest being that it was clocked below 2.4 GHz for most of the run.

Image credit: Videocardz

In other leaked benchmarks, the card has fared both better and worse.

In an OpenCL benchmark run from November, it achieved just ~68,000 points. But in a recent SiSoftware Sandra benchmark, it traded blows with the 3070 Ti and generally came out ahead.

Intel’s GPU was clocked at 2.1 GHz during the SiSoftware benchmark, which means that at 2.4 GHz it could’ve pulled away from the 3070 Ti. Still, synthetic productivity benchmarks aren’t good indicators of gaming performance. Rest assured, when the time comes, we’ll have full-blown reviews of Intel’s new graphics cards.
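For context, here's the naive scaling math, assuming (generously) that performance scales linearly with clock speed; in practice the gain would be smaller, since memory bandwidth stays fixed:

```python
# Naive estimate of the headroom a 2.1 GHz -> 2.4 GHz clock bump offers.
sisoft_clock_ghz = 2.1   # clock observed during the SiSoftware run
leaked_clock_ghz = 2.4   # peak clock from the new OpenCL entry

uplift = leaked_clock_ghz / sisoft_clock_ghz - 1
print(f"~{uplift * 100:.0f}% potential uplift")  # ~14%
```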


 
And when it shows subpar performance in the real-world tests, they will say there was a leak in the performance during the benchmark.
 
Frankly, if they are priced right I don't see anything wrong with 3070ti levels of performance. I would not complain at all if I could get a 3070ti around MSRP. I play at 4K and a 1070ti is plenty for 4K60. Granted, I only play ESO and EvE, so they aren't demanding, but a 3070ti would be overkill for me and would even let me play new games at 4K.
 
A seriously impressive performer this GPU seems to be, for a first try anyway. At this point I'm actually more interested in how many important features their driver suite will have, since I honestly think AMD is that far ahead of even Nvidia, having overclocking tools and such all in the same package, which I do enjoy. Not everyone will find every feature that important, and I haven't overclocked my current RX 5700 XT either, but I do use other features like Chill, Anti-Lag, Enhanced Sync, the in-game screen overlay, etc. I haven't checked if Intel has a feature similar to Enhanced Sync/Fast Sync, but that particular one is a must for me. The driver suite is certainly going to affect the opinions of many long-time GPU buyers, I'm quite sure.

I'm open to buying an Intel GPU, but the features I mentioned are important, and I also won't accept a GPU like the one in the picture; some more robust custom model is required. Of course, if that thing proves to be an amazing performer, then there's no problem, but that doesn't seem likely given the TDP.

What is also exciting to see is which games work best on Intel's architecture, as we can already see different behavior in different benchmarks, if the leaks are to be believed.
 
Strong likelihood these will come in slower and cheaper in the first instance. Nothing wrong with that.
 
Rembrandt's iGPU also demolishes the Alder Lake iGPU, so it's not looking that good given the clocks, and mainstream desktop cards will be releasing not long before we start getting RDNA3 and Lovelace. This card is way too late. They should accelerate Battlemage and release that next year.
 
Good. I plan never to use AMD or Nvidia again after their lack of caring for us: profit above all other considerations. It will come back to bite them, I hope. Just stop forgiving them and keep your money for others that deserve it.
 
Rembrandt's iGPU also demolishes the Alder Lake iGPU, so it's not looking that good given the clocks, and mainstream desktop cards will be releasing not long before we start getting RDNA3 and Lovelace. This card is way too late. They should accelerate Battlemage and release that next year.
I say Intel should release their GPUs and take it slow. If it is at 3070ti levels of performance, that's already entering enthusiast levels of performance, with mainstream GPUs being the bulk of sales. If they get the price right to dominate the mainstream market, I don't think it matters who has the performance crown. The other thing is that these GPUs will essentially be a BETA. They need to get them into the hands of users and see what works and what doesn't before they bring a new series of GPUs to market. We may very well see these first GPUs sold at a loss to get them into the hands of consumers and written off as development costs.

There are many people on this board, myself included, who want the best and the fastest, but we aren't the mainstream market.
 
A seriously impressive performer this GPU seems to be, for a first try anyway. At this point I'm actually more interested in how many important features their driver suite will have, since I honestly think AMD is that far ahead of even Nvidia, having overclocking tools and such all in the same package, which I do enjoy. Not everyone will find every feature that important, and I haven't overclocked my current RX 5700 XT either, but I do use other features like Chill, Anti-Lag, Enhanced Sync, the in-game screen overlay, etc. I haven't checked if Intel has a feature similar to Enhanced Sync/Fast Sync, but that particular one is a must for me. The driver suite is certainly going to affect the opinions of many long-time GPU buyers, I'm quite sure.

I'm open to buying an Intel GPU, but the features I mentioned are important, and I also won't accept a GPU like the one in the picture; some more robust custom model is required. Of course, if that thing proves to be an amazing performer, then there's no problem, but that doesn't seem likely given the TDP.

What is also exciting to see is which games work best on Intel's architecture, as we can already see different behavior in different benchmarks, if the leaks are to be believed.
Huh? AMD isn't ahead in anything GPU related. Nvidia has had software for over a decade, plus their partners like MSI and EVGA all make some of the best overclocking tools. So no, AMD isn't ahead. This ain't AMD vs Intel, this is lil AMD vs KING OF THE GPU MARKET NVIDIA.
 
Huh? AMD isn't ahead in anything GPU related. Nvidia has had software for over a decade, plus their partners like MSI and EVGA all make some of the best overclocking tools. So no, AMD isn't ahead. This ain't AMD vs Intel, this is lil AMD vs KING OF THE GPU MARKET NVIDIA.
if by "king of the GPU market" you mean inventing proprietary gimmicks that lock competitors out of the market, then yeah, they're king. nVidia makes something, AMD makes a free version and everyone uses that instead of nVidia's. Gsync is dead, all it is now is an nVidia badge on a FreeSync monitor.

And AMD's drivers on the hardware side are better. The GUI's aren't as friendly but they are more stable. The only time they become unstable is when *surprise* nVidia hardcodes their technology into a game, closes the source code and causes compatibility issues with AMD hardware. Their anticompetitive behavior has caused performance issues across the industry and sky-rocketed the price of GPUs.

I say all this as someone who has a 1070ti.... If the market was different I'd have bought a 6800xt because I primarily run Linux and nVidia Linux drivers are trash
 
if by "king of the GPU market" you mean inventing proprietary gimmicks that lock competitors out of the market, then yeah, they're king. nVidia makes something, AMD makes a free version and everyone uses that instead of nVidia's. Gsync is dead, all it is now is an nVidia badge on a FreeSync monitor.

And AMD's drivers on the hardware side are better. The GUI's aren't as friendly but they are more stable. The only time they become unstable is when *surprise* nVidia hardcodes their technology into a game, closes the source code and causes compatibility issues with AMD hardware. Their anticompetitive behavior has caused performance issues across the industry and sky-rocketed the price of GPUs.

I say all this as someone who has a 1070ti.... If the market was different I'd have bought a 6800xt because I primarily run Linux and nVidia Linux drivers are trash

I use Nvidia too, but this is damn accurate. On one hand we went into rampage mode when Epic was store-locking a game for a month, but when Nvidia locks up technology, it's alright?

Not to mention these tech locks make hardware costs ridiculous. Just look at Gsync vs FreeSync monitor costs.
 
Context is important. Full reviews are needed to see the big picture of the overall performance.
 
if by "king of the GPU market" you mean inventing proprietary gimmicks that lock competitors out of the market, then yeah, they're king. nVidia makes something, AMD makes a free version and everyone uses that instead of nVidia's. Gsync is dead, all it is now is an nVidia badge on a FreeSync monitor.

And AMD's drivers on the hardware side are better. The GUI's aren't as friendly but they are more stable. The only time they become unstable is when *surprise* nVidia hardcodes their technology into a game, closes the source code and causes compatibility issues with AMD hardware. Their anticompetitive behavior has caused performance issues across the industry and sky-rocketed the price of GPUs.

I say all this as someone who has a 1070ti.... If the market was different I'd have bought a 6800xt because I primarily run Linux and nVidia Linux drivers are trash
Nvidia is KING of the GPU market, period.

My 1070 works fine. Drivers work fine. Nothing is ever perfect.
 
I say intel release their GPUs and let them take it slow. If it is at 3070ti levels of performance that's already entering the enthusiast levels of performance with the mainstream GPUs being the bulk of sales. If they get the price right to dominate the mainstream market I don't think it matters who has the performance crown. The other thing is that these GPUs will essentially be a BETA. They need to get them into the hands of users and see what works and what doesn't before they bring a new series of GPUs to the market. We very well may see these first GPUs sold at a loss to get them into the hands of consumers and be written off in development costs.

I know many people on this board, myself included, who want the best and the fastest but we aren't the mainstream market.

I'm not after the best and fastest, I'm just pointing out that Intel has been so late to the party with these that they will now be judged against next-gen AMD and Nvidia cards. The top-spec Intel will be a $500+ card and will get its @rse handed to it by the 4070 and 7700XT. If they were priced much lower than the competition then it wouldn't matter, but I doubt they can afford to do that even with their deep pockets; Intel doesn't do cut-price hardware. I'm still excited we'll have a third player, but I won't be buying one of these cards. It depends how desperate you are for a new card. If they can have decent supply it'll probably do well by default, but GPU supply is slowly improving across the board.
 
"Nvidia is KING of the GPU market, period."

And that's good for you how again?
I own both companies' products. Don't see much difference, except when someone's talking about them.
Didn't say anything about me. I said they are the KING of GPUs, which they are. Has nothing to do with me, more so the millions who use their products every day compared to AMD. Which is what makes them KING of GPUs.
 
Didn't say anything about me. I said they are the KING of GPUs, which they are. Has nothing to do with me, more so the millions who use their products every day compared to AMD. Which is what makes them KING of GPUs.
Do you at least own Nvidia shares to justify the degree to which you're ego-welded to the company?
 
Do you at least own Nvidia shares to justify the degree to which you're ego-welded to the company?
Again, nothing to do with me or ego. Not my fault you don't understand who the King of GPUs is. Funny, people didn't have an issue calling Intel King of the CPUs for 15 years. Many still believe they are. AMD has finally become viable in CPUs but not in GPUs.
 
What exactly is an "EU"? You are aware that the first time you use an abbreviation that isn't well known in a piece of writing, you should explain what it is, because someone will be encountering it for the first time? But you guys didn't even do that in the earlier piece you linked to. Once again, y'all need an editor.

(Don't tell me "just Google it" as the onus is on the writer to provide clarity.)
 
Are you serious or is that a lame attempt at sarcasm? If you're serious, what rock have you been living under, eh?

" EU " stands for European Union, a VERY common expression...
"the Arc Alchemist flagship has 512 EUs"

That very clearly DOES NOT mean "European Union" in this context.

EDIT: FFS read the article next time. If you did a CTRL-F on "EU" you would see Europe never came into it.
 