Leaked benchmarks show Intel Arc Alchemist GPU performing like a GeForce RTX 3070 Ti

Daniel Sims

Something to look forward to: For a while now, rumors have pegged the performance of at least one of Intel’s upcoming discrete desktop GPUs at around the level of Nvidia’s RTX 3070 or 3070 Ti. The latest leaked benchmark for what seems to be the top-end Intel GPU lends further support to these rumors.

This week, a benchmark for an Intel graphics card appeared on SiSoftware, and it seems to compare favorably against Nvidia’s RTX 3070 Ti. The site scores the card on stats like clock speed, general-purpose (GP) compute at multiple precision levels, and efficiency.
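SiSoftware doesn’t publish the exact kernel behind its GP compute tests, but the general idea of a precision sweep is simple: run the same arithmetic on the GPU in half, single, and double precision and see how much throughput each sustains. The sketch below is a minimal illustration of that idea in Python using CuPy; the library choice, workload, and sizes are assumptions for illustration, not SiSoftware’s methodology, and a memory-bound loop like this is only a rough proxy for a real compute benchmark.

```python
# Minimal illustration of a precision-sweep throughput test (assumes CuPy + a CUDA GPU).
# This is NOT SiSoftware's kernel; it only shows the shape of such a benchmark.
import time
import cupy as cp

def throughput(dtype, n=1 << 24, iters=50):
    # Keep inputs in [0, 0.5) so repeated x*y + x never overflows float16.
    x = (cp.random.random(n) * 0.5).astype(dtype)
    y = cp.zeros(n, dtype=dtype)
    cp.cuda.Device().synchronize()           # finish setup before timing
    start = time.perf_counter()
    for _ in range(iters):
        y = x * y + x                        # multiply-add style workload
    cp.cuda.Device().synchronize()           # wait for all queued kernels
    return n * iters / (time.perf_counter() - start)  # elements per second

for dtype in (cp.float16, cp.float32, cp.float64):
    print(f"{dtype.__name__:8s} ~{throughput(dtype) / 1e9:.2f} G elements/s")
```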

The benchmark doesn’t name the Intel card (it only lists the driver), but based on its stats it’s undoubtedly the rumored high-end Intel Xe HPG 512 EU. It’s shown with 4,096 ALUs, 512 EUs, and—oddly—12.8GB of VRAM. Earlier rumors gave the Xe HPG 512 EU 16GB of RAM.

The GPU in this test has a clock speed of 2.1GHz, but we don’t know if this is a base or boost clock. In either case, it beats the 1.8GHz clock of a 3070 Ti in the same benchmark.
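For rough context, the leaked EU/ALU counts plus that 2.1GHz clock are enough for a back-of-the-envelope peak-compute estimate. The sketch below assumes one fused multiply-add (2 FLOPs) per ALU per clock, which is a common convention but our assumption rather than anything stated in the leak; the RTX 3070 Ti figure in the final comment is Nvidia’s published spec, not something from this benchmark.

```python
# Back-of-the-envelope FP32 peak estimate from the leaked figures.
# Assumption (not from the leak): each ALU does one FMA per clock, i.e. 2 FLOPs.
eus = 512                       # execution units listed in the benchmark entry
alus = 4096                     # ALUs listed in the benchmark entry
clock_ghz = 2.1                 # clock reported by the benchmark entry
flops_per_alu_per_clock = 2     # one fused multiply-add counted as 2 FLOPs

alus_per_eu = alus // eus       # 4096 / 512 = 8 ALUs per EU
peak_tflops = alus * flops_per_alu_per_clock * clock_ghz / 1000
print(f"{alus_per_eu} ALUs per EU, ~{peak_tflops:.1f} TFLOPS FP32 peak")
# -> 8 ALUs per EU, ~17.2 TFLOPS FP32 peak
#    (a stock RTX 3070 Ti is roughly 21.7 TFLOPS FP32 per Nvidia's specs)
```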

The RTX 3070 Ti wins in single-float GP compute at 25,810.54 megapixels per second (Mpix/s) versus the Intel card's 20,888.29. However, the Intel card beats the 3070 Ti in half-float at 35,093.25 versus 35,072.61 Mpix/s, double-float at 1,000.31 versus 571.93 Mpix/s, and quad-float at 109.46 versus 21.62 Mpix/s.
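To make those results easier to compare, here is a small sketch that turns the reported Mpix/s figures into Intel-to-Nvidia ratios. All of the numbers are copied from the benchmark entry described above; only the rounding in the printed ratios is new.

```python
# Reported GP compute results (Mpix/s) from the SiSoftware entry: (Intel, RTX 3070 Ti).
results = {
    "half-float":   (35_093.25, 35_072.61),
    "single-float": (20_888.29, 25_810.54),
    "double-float": (1_000.31,  571.93),
    "quad-float":   (109.46,    21.62),
}

for precision, (intel, nvidia) in results.items():
    print(f"{precision:13s} Intel/Nvidia = {intel / nvidia:.2f}x")
# half-float ~1.00x, single-float ~0.81x, double-float ~1.75x, quad-float ~5.06x
```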

The 3070 Ti wins slightly in SiSoftware's overall speed efficiency stat, but the Intel GPU ranks slightly higher among GPUs overall (both are in the 99th percentile). Still, we likely won't know how the Arc Alchemist GPUs stack up until they're in the wild and accurate game benchmarks show up.

Earlier rumors positioned another mid-range Intel model as a competitor to the RTX 3060 and a lower-end model against the GTX 1650—though with ray tracing and Intel’s DLSS competitor XeSS, which the 1650 lacks. They’re supposed to launch in March.


 
Interesting. My requirements for a new GPU were: 1. 50% faster than a Vega 64 at 1440p, 2. at least 12GB of RAM, and 3. under $500.

This GPU would seem to tick the first two boxes, much like the 6800 did. Now here is the question: will the GPU launch at MSRP? $700 is more than I want to spend, but if it is consistently available at $700 I may bite anyway, seeing as everything is getting more expensive and my Vega is long dead now. Especially if it overclocks; if it can be pushed to 2.5+ GHz like the RX 6000 series, there might be some serious performance on tap.

But if it's $1,000+ then I'm waiting another generation. Intel has the gorilla size to muscle their way into the market, grabbing huge market share and building a returning fanbase, if they just do the common-sense thing and limit bulk sales to miners and scalpers. That would either force Nvidia/AMD to address market demands or give Intel a huge leg up next generation, similar to what the 9700/9800 Pro did for ATI in 2003.
 
1. Compute performance doesn't automatically translate to gaming performance; otherwise the Radeon VII would have been the most powerful GPU of its time.

2. Performance doesn't matter if driver support continues to suck. Intel doesn't have a good track record with graphics driver support; even reviews of Iris Xe continue to find games that either don't run or run bizarrely poorly compared to others. There's no point having the best-performing graphics card if games either don't run or have weird graphical glitches and frame-timing issues.

3. As someone who followed tech during the i740 hype days and watched Koduri shred his credibility during the Vega days, I will remain firmly in the 'believe it will fail until proven otherwise' camp.
 
Intel has also never made a gaming-focused chip before. The majority of Xe use cases center on some type of production workload. Given that Intel's software dev team is bigger than all of AMD, we should give them a chance at supporting a standalone dGPU first.

Koduri is also responsible for Polaris, the best budget GPU since the 7870. While Vega was a turd, AMD's actions since his departure, whether it's the 5600 XT memory clock fiasco, the RDNA clock rate fiasco, the 6700 XT/6600 XT price stagnation, or now the disastrous 6500 XT, would indicate it wasn't just him. AMD has been broken for a long time, and one man cannot make a GPU successful.

Vega was a clear mistake, but then again the FX series from Nvidia, the FX series from AMD, and NetBurst from Intel were also giant missteps. At least Vega today is good with the compute stripped out, more efficient than Polaris was (and really what he should have done in the first place).
 
I was thinking about Intel as a choice, but then I remembered I never gamble. This is the first gen of something Intel has zero experience with. So many things could go wrong, and they will. I don't want to be a beta tester for them. Besides, it looks like Ethereum is taking a beating right now, so mining demand will surely slow. If, on top of this, they come out with a smug MSRP of 700 euros or more, because "we can", then I'd say keep it.
 
This synthetic performance needs to translate into gaming too, and then it will be good.

But the bigger question will be: at what price? MSRP is one thing; real shop prices are another story...

My overclocked RX 6700 XT gets an 8,600 score in this test (more than a stock 3070 Ti), and the Arc 512 EU gets 9,000. I'm waiting for gaming benchmarks.
 
Drop the board power 10%, save 100 watts, and change the thermal paste, and it's a good GPU then, this Vega 64.
It's easy to make an unlocked GPU look bad, but just look at Nvidia today: their power usage is over 500 watts, pure insanity, and no one says a word.
 
I'll believe it when I see an actual card running a game.

Given the tiny trickle of information thus far, I wouldn't be surprised to see Nvidia launch the 4000 series first.
 
If we look at the performance of the Xe integrated graphics as a precursor, I feel the dedicated high-end solution may end up looking great in benchmarks but falling short in actual games. Intel may have very deep pockets, but I don't believe their drivers will be fantastic from day one. Current games aren't optimized for Intel GPUs in the first place, and it is easier to optimize a driver for benchmarks than for games.
 


Intel tried this before with the i740 in 1998.
 

True that. Also, with game drivers the team has to deal with the timing and pressure of fixing any issues, or most gamers quickly start crying while waiting for a fix. It's like going from playing basketball alone to having millions of people watching your every move, with big competition hovering over you. Intel needs to know that video card making is a very fast-paced business.
 
I can wait for the reviews, but my feeling is that Raja funked up again. Not like it's the first time.

@Achaios
Totally forgot about the GPU drop, made me laugh again.
 
It's a promising start, but regardless of how this does in the end, it's hopefully just the first step on a long road that ends with three companies making usable cards.
 
Show me the card. If you can't get hold of one anywhere near RRP, it doesn't matter one bit how good the card is. I hope it succeeds; we need a rocket up Nvidia and AMD's butts.
 
Personally, I don't have high hopes for the first-gen Intel GPU. This may be a pipe-cleaning card, much like AMD's first RDNA generation, which had plenty of issues to iron out but led to the much better driver experience of RDNA2. And to add to Intel's woes, they are lagging badly behind the competition. If the rumours of launch delays to H2 2022 are true, this card is going to go head-to-head with Nvidia's and AMD's next-gen cards. Unless Intel intends to release a new generation of GPU each year, they are going to be one cycle behind.
 
Imagine one day pairing an Intel GPU with an AMD CPU. But like others have said, it's the driver support I'm worried about. They've got deep pockets at least, especially compared to AMD's driver team.
 
About time. GPUs are more lucrative than CPUs. With datacenters and AI reliant on them, Intel can't afford to be left out. Three graphics companies will be much better than the current duopoly.
 
The only thing I care about now is when these fricken cards are going to show up. Intel has been talking a LOT and delivering very little.
 