Intel Arc B580 Review: Best Value GPU

I wonder if Intel will release an Arc B770 for under $399.
Looking at the die size and power draw of the B580, it might have been intended as their 700-series card. I have a feeling we may not get a B770.
 
It looks pretty decent within this limited scope of games.

It certainly delivers a very honest cost-benefit ratio in this price range. But it seems to me that this is a mid-range chip redirected to compete with low-end GPUs.

Arc B580 -> 272 mm²
7800 XT -> 200 mm² GCD + 4 × 36.6 mm² MCDs
4070 -> 294 mm²
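For a rough total (my own arithmetic, using the figures above): the 7800 XT works out to about 200 + 4 × 36.6 ≈ 346 mm² of combined silicon, though the MCDs sit on a cheaper node, so raw area alone isn't a perfect cost comparison.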

 
Great work, Intel! I can't wait until the APU is sunset and the GPU is linked to the CPU over Intel's enhanced interface. Laptop GPUs are not great; if you want a PC that performs, use a desktop with a GPU like this. My laptop was over $3k and can't put much load on the GPU before gaming crashes miserably. I had to go back to a desktop and better-cooled GPU options like this; I just missed this one, though.
 
Looking at those die sizes, it's just an inefficient architecture.
 
I'm looking forward to any higher (770/780?) models and how they compare to a 3060 Ti... (as well as anything from the red and green teams in the same price segment, if any...)
 
Great review (as always) Steven. Thanks!

I'm glad that Intel has diced up the market in this tier. That said, the used graphics card market is pretty robust once again. In the $250-350 range you can find some compelling choices that would dominate all of these cards.
 
Two years ago I got a used ASUS TUF Gaming GeForce RTX 3070 OC for $300. Today, 3070 cards are around $250.
That is still too much for an 8GB card, but looking at the charts it's still better than this B580: around 22% faster, going by TPU's graphs.
 
I'm still holding onto my RX 6800 16GB; if you get lucky, you can find one for around $300. For as old as it is, it will still hold its own, especially against this collection.
 
Happy to see this is priced appropriately and Intel is bringing at least a small amount of competition to the market.
 
I'd be very tempted to tell people with this budget to get a used RX 6700 XT or a 3060 Ti.
 
Can't wait to see a B780 now. Not that I would buy one, but a third player in the low-to-mid-range GPU market is good.

I could see Intel and AMD smashing heads in this market in the coming years. Hopefully AMD enters the high end again, though, and fights Nvidia with UDNA in 2026.

Nvidia doesn't care about the low end; it's up for grabs. They care about the mid to high end.

Both AMD and Intel should go hard after the sub-$500 market, and both need to improve their upscalers/AA and frame generation to match Nvidia.
 
The A series got my interest; now the B series has my attention. About effing time Intel showed up to the party, and it seems like legit value for the lower-end mainstream. Now, with that very decent balance of specs, features, and performance, give us some more, Intel!
 
Not interested in these releases, as my good old RX 6750 XT outshines them. I guess we will have to wait and see if any B7** models are forthcoming, and with a surprise punch. But sadly, I doubt it.
 
"For these tests, we used quality upscaling at 1080p and 1440p. This means DLSS for Nvidia GPUs that support it, FSR for Radeon GPUs and GeForce GPUs without DLSS, and XeSS for Arc GPUs. Let's see how the B580 performs."
(from the 2nd paragraph of the ray tracing performance section)

Isn't this giving Intel exactly the advantage they were probably hoping for when they decided to drop the render resolution compared to AMD and Nvidia while using the same preset names?

Context: When AMD launched FSR, they followed Nvidia and used the same naming scheme and the same render resolutions for each of the upscaler presets (Quality, Balanced, Performance, etc.).
Intel, however, uses different render resolutions for its identically named upscaler presets. That means FSR and DLSS use the same render resolution when both are set to Quality, but Intel's XeSS uses a lower render resolution at the preset also named Quality.
I suspect that Intel decided to "copy" only the naming of the upscaler presets, while using different render resolutions, specifically with the intent to mislead customers by showing higher performance results with visuals that look equivalent on the surface.
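To make the gap concrete, here's a quick sketch of the internal render resolutions at 1440p Quality, assuming the commonly cited per-axis scale factors (1.5x for DLSS/FSR Quality, 1.7x for XeSS 1.3 Quality); the exact numbers depend on the XeSS version and each game's implementation:

```python
# Rough sketch: internal render resolution at the "Quality" preset, 2560x1440 output.
# Assumed per-axis scale factors: DLSS/FSR Quality = 1.5x, XeSS 1.3 Quality = 1.7x
# (my reading of the SDK defaults; check the vendors' docs before relying on this).

def render_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given per-axis upscale factor."""
    return round(out_w / scale), round(out_h / scale)

for name, scale in [("DLSS/FSR Quality", 1.5), ("XeSS 1.3 Quality", 1.7)]:
    w, h = render_res(2560, 1440, scale)
    share = (w * h) / (2560 * 1440)
    print(f"{name}: {w}x{h} ({share:.0%} of output pixels)")

# DLSS/FSR Quality: 1707x960 (44% of output pixels)
# XeSS 1.3 Quality: 1506x847 (35% of output pixels)
```

If those scale factors are right, XeSS renders roughly a fifth fewer pixels at the same preset name, which is exactly the kind of apples-to-oranges gap I'm pointing at.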

I think you handled this well in the HUB/TechSpot video and article
https://www.techspot.com/review/2860-amd-fsr-31-versus-dlss/
where you pointed out that "Quality vs. Quality vs. Quality" isn't an exact comparison: even at the same preset level, the upscalers aren't equivalent in their visual output.

And I know you didn't do this to intentionally give anyone an unfair advantage. I'm NOT accusing you of any fanboyism. I've followed you for many years and know that those two claims would be stupid, because all your great work is a testament to your dedication to helping consumers.
BUT
I think, because of Intel's deceptive behaviour, reviewers/testers should BE WARY of showing users results they might not fully understand or might misinterpret. It is probably best to stick to native-resolution testing, as upscalers make you compare graphics/visuals that aren't one-to-one.

(If you feel the need to include upscaler performance, e.g. due to frequent requests, I think the best solution is to stick with an open-source upscaler. Sadly, that would only be FSR, as Intel's XeSS, as you know, changes how it works depending on whether it's running on Intel or other vendors' products, and DLSS is of course locked to Nvidia.)
 
AMD's graphics bosses really need a reality check on their gaming engineering division... I am amused that Intel was able to pull this off... in fact, its ray tracing performance is not that bad compared to AMD's cards... AMD couldn't do this despite its long experience in GPU making and wider game support... So AMD's graphics division needs a shake-up and a trimming of inefficiency...
 