Intel shows the Arc A770 beating the RTX 3060 in ray tracing benchmarks

mongeese

Forward-looking: The Intel marketing machine has produced yet another video about the architecture and features of the Arc series, this time with a focus on ray tracing. If Intel's claims are valid, the Arc 7 A770 could have outstanding ray-tracing performance for its tier.

The video is staged as an interview with Ryan Shrout, Intel marketer, posing the questions to infamous engineer Tom Petersen, best known for his 15 years as an Nvidia frontman. Now he's peddling the benefits of Intel's ray tracing hardware acceleration — go figure. He explains it well in the first half of the above video, so give it a watch if you're interested.

In the second half, Petersen graphs the frame rates of the A770 and 3060 in 17 games with ray tracing enabled. Seeing the results, he says, "we win most — we win, really." But to give Intel the benefit of the doubt, I'll also quote the claim it makes in the fine print: the "A770 delivers competitive ray tracing performance against the RTX 3060 at 1080p across a sample of popular games."

Competitive or the winner? Have a look at Intel's graph and decide for yourself.

Here's where I would warn you that Intel could've cherry-picked these games, except that there probably aren't any other games that support ray tracing on Arc GPUs. In any case, don't place too much faith in Intel's numbers.

On average, the Arc A770 is almost 13% faster than the RTX 3060. Intel benefits from some big swings in its favor, like a 56% lead in Fortnite and a 31% lead in Metro Exodus. However, the A770 falls considerably behind in Battlefield V and "Guardians of the Galagy [sic]."

Intel says it conducted the benchmarks with the games' settings maxed out, with the justifiable exception of motion blur. With those settings and at 1080p, every game was playable on the A770 bar maybe Cyberpunk 2077 and Fortnite, though for the most part, you'd want to dial the settings back to medium.

If you look closely, you'll see that Ghostwire: Tokyo was benchmarked with a beta driver that (allegedly) improves performance by 25 percent. Intel could've chosen not to add that label, but it did, so it could reiterate its plan to enhance the Arc series' performance with each driver update.

At the end of the interview, Shrout and Petersen talk briefly about using XeSS in conjunction with ray tracing. Nvidia and AMD each suggest using their respective super sampling technologies, DLSS and FSR, when their GPUs don't have the horsepower to push native resolution with ray tracing enabled, and it's the same deal here. The A770 does get some remarkable gains with XeSS enabled, but we'll have to wait and see what the image quality is like.

Intel still hasn't shared a release date for the A770 or the rest of the series but promises it is getting close. The A770 is expected to cost less than $400 and might be competitive with the 3060 near its MSRP of $330.


 
I will take their results and promises with a block of salt. Marketing machine doing what it does best. However, if it is competitive and can help drive GPU prices down, then good for Intel and all of us gamers who are still waiting for affordable GPU upgrades (my 1070 Ti is really showing its age).
 
Gosh forbid they actually release the Arc A750-A770 so we can find out the truth. At this point, most everyone is mocking their marketing for these cards.
 
This would have been news if it could at least take on the 3070 Ti.

For a 3060? It's just yawn and eye-roll.
 
OMG, a spanking new Intel card beats an ancient Nvidia card if you spend more money than the Nvidia card costs.

OMG....OMG....OMG....nobody cares!

 
I have some doubts:
1. What are the exact configurations and settings for these benchmarks? Various specific setups could have been created to suit them.
2. How is the image quality? A driver may have taken shortcuts to render images, which will give high fps while adversely affecting image quality.

Unless the reviewers get a hold of these and run thorough tests, let's just take these results with a mound of salt.
 
"Intel shows" and that's about it....until independent reviews show...it's just a show.
 
Semi-Accurate has an interesting explanation of what's going on with the drivers. If true, it makes sense.
 
I will take their results and promises with a block of salt. Marketing machine doing what it does best. However, if it is competitive and can help drive GPU prices down, then good for Intel and all of us gamers who are still waiting for affordable GPU upgrades (my 1070 Ti is really showing its age).
It won't be a third player that drives prices down. It will be demand.
 
Intel will never surpass nvidia and AMD in GPUs... They should stick to what they know: CPUs...
 
This seems like a solid first outing for Intel. Looks promising!

Intel will never surpass nvidia and AMD in GPUs... They should stick to what they know: CPUs...
Your opinion doesn't seem to match Intel's showing here. Looks like they're going to compete very well, very soon.
 
I was intrigued until I saw these were Intel's graphs. Not enough salt in the oceans to have any faith in these results. I'll wait for Techspot, Techpowerup, Gamers Nexus etc to test these claims before believing anything Intel says.
 
Intel has cancelled its entire discrete graphics card line, Moore's Law Is Dead has reported. Alchemist and Battlemage are dead and everything is being shut down. Losses have been way too high, and they figure they can never catch up to AMD and Nvidia in performance.

Funny, eh? For weeks now they've been telling everyone just the opposite: that they have a winner of a graphics card and they are going to stick it out through thick and thin. And now it is all gone ... 🤷‍♂️🤨
 
Intel has cancelled its entire discrete graphics card line, Moore's Law Is Dead has reported. Alchemist and Battlemage are dead and everything is being shut down. Losses have been way too high, and they figure they can never catch up to AMD and Nvidia in performance.

Funny, eh? For weeks now they've been telling everyone just the opposite: that they have a winner of a graphics card and they are going to stick it out through thick and thin. And now it is all gone ... 🤷‍♂️🤨
So one outfit has reported this. Nothing official from Intel themselves, and you believe it like it's fact?

Why would they give in now? Half the cards haven't even launched yet...
 
Intel has cancelled its entire discrete graphics card line
Another tree swinger flinging its poo.

When will all of this fanboi disinformation nonsense stop? I mean really, does humanity actually have the maturity to quit with this kind of agenda-based malarkey?
 
That guy at Moore's Law... As long as nobody from Intel steps forward, I don't believe some streamer looking for more clicks/views.
 