Intel Arc B580 massively underperforms when paired with older CPUs

DragonSlayer101

What just happened? Intel's Arc B580 graphics card has been a massive hit with both reviewers and users. So much so, we even hailed it as the best value GPU on the market in our review. However, new reports are coming out indicating that the graphics card's impressive performance is restricted to newer CPUs.

Update (Jan 11, 2025): A few weeks after we published our day-one review of Intel's Arc B580, reports began to surface indicating that the graphics card's impressive performance was largely dependent on newer CPUs. In response, we have published a new review featuring extensive benchmark data that delves into the B580's overhead issues, which cause performance to degrade in CPU-limited gaming scenarios. This updated analysis includes not just the Intel Arc B580, but also the RTX 4060, Radeon 7600, and Radeon 7600 XT. All of these GPUs were retested using the Ryzen 5 5600, allowing us to compare the new results against the original data gathered with the 9800X3D.

As discovered by Hardware Canucks, the Arc B580 suffers massive performance drops when paired with CPUs more than about five years old. The problem appears in many games, causing stuttering and low frame rates that did not show up in earlier reviews, which typically test with more modern CPUs.

The channel tested the Arc B580 with an Intel Core i5-9600K – a relatively older processor released in 2018 – and found that performance in some games, including Marvel's Spider-Man Remastered and Starfield, was so poor that the games were nearly unplayable.

Unfortunately, the B580's reduced performance is not isolated to 9th-gen Intel Core CPUs; it persists with other older processors. In another test, run by our very own Steve Walton at Hardware Unboxed, he confirmed that the GPU has similar problems when paired with a Ryzen 5 2600X.

As shown in Hardware Unboxed's testing, the B580 performed much worse than the RTX 4060 in games like Warhammer 40,000: Space Marine 2 when paired with either a Ryzen 7 9800X3D or a Ryzen 5 2600. In fact, with the latter, the average frame rates hit just 31 FPS, while the 1 percent lows dipped to 25 FPS, making the game almost unplayable.

A similar result was seen in Hogwarts Legacy, where the B580 was around 46 percent slower than the 4060, averaging only 24 FPS. Starfield also yielded poor results for the B580, which was about 45 percent slower than the RTX 4060 when paired with the Ryzen 5 2600.
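As a quick sanity check on the "percent slower" arithmetic used in these comparisons (the FPS figures below are illustrative, not the exact benchmark data):

```python
def percent_slower(card_fps: float, reference_fps: float) -> float:
    """How much slower one card is than a reference card, as a percentage."""
    return (reference_fps - card_fps) / reference_fps * 100

# With the B580 averaging 24 FPS and the 4060 around 44 FPS in the same
# scene (illustrative figures), the gap works out to roughly 45 percent.
print(round(percent_slower(24, 44), 1))  # prints 45.5
```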

However, these problems seem limited to a handful of titles. In many other games, the B580's performance is in line with expectations. For instance, in games such as Alan Wake 2, Doom Eternal, Horizon: Forbidden West, and even Call of Duty: Black Ops 6, the B580 delivers playable frame rates when paired with the i5-9600K.

It is worth noting that all Arc graphics cards require Resizable BAR (ReBAR) and Smart Access Memory (SAM) support. For this reason, Intel only recommends its 10th-generation Core CPUs or newer and AMD Ryzen 3000 series or newer for its Alchemist and Battlemage GPUs. However, the systems tested above had ReBAR backported and enabled, suggesting that the issue lies elsewhere.

It is unclear whether this is an architectural issue or a driver-related problem that can be addressed via a future update. Intel is aware of the issues and is already investigating, so perhaps we can expect an answer sooner rather than later. Needless to say, we also plan to re-review the B580 in the coming days, so stay tuned for more information.

 
Hopefully, it can be fixed. It is really bad for Intel's reputation.
It is like the most obvious choice for people with older CPUs, and now this...
 
So if you use a CPU outside what Intel recommends, you're gonna have a bad time? Got it.

Maybe you can use the money on the Arc GPU to upgrade from your Ryzen 2000 series?

To be perfectly honest, I only expect to keep my main system for about 5 years before it gets moved to the rack in the garage, where it continues to do the stuff it did 5 years ago perfectly fine.
 
That's honestly awful. Nobody should buy this if a 5600 can't extract 100% of its performance.

Nobody should buy it regardless. Intel suffers from AMD syndrome: they can't get the software right. It's the reason I won't use anything but Intel CPUs and Nvidia GPUs. Yeah, there's an occasional problem, but at the end of the day the reliability and stability are light-years ahead of the competition. I would love to see true, genuine competition arise in all PC component markets, but that ain't today.
 
So if you use a CPU outside what Intel recommends, you're gonna have a bad time? Got it.

Maybe you can use the money on the Arc GPU to upgrade from your Ryzen 2000 series?

To be perfectly honest, I only expect to keep my main system for about 5 years before it gets moved to the rack in the garage, where it continues to do the stuff it did 5 years ago perfectly fine.

Except the Ryzen 3000 series is supposed to be supported, so there is an issue out there.

The main reason Steve tested older CPUs outside the specs is that most users probably wouldn't look at the specs. Sure, that's on them, but the way this GPU has been marketed, it's aimed at budget PC gamers, who will often have CPUs 5 years old or older in their systems.

This way, at least they now have visual proof of the issue. But again, even older CPUs that are supposedly supported per Intel's specs show a massive performance hit, with the 4060 and its smaller VRAM outperforming the Intel B580 by wide margins. So something is wrong.
 
Nobody should buy it regardless. Intel suffers from AMD syndrome: they can't get the software right. It's the reason I won't use anything but Intel CPUs and Nvidia GPUs. Yeah, there's an occasional problem, but at the end of the day the reliability and stability are light-years ahead of the competition. I would love to see true, genuine competition arise in all PC component markets, but that ain't today.
What are these horrible lies? Intel CPUs have had horrible stability issues lately. AMD is ahead in terms of stability, software, power consumption, and performance. On the GPU side, I'd say Nvidia is still better.
 
Pretty sure the ReBAR/SAM issue was a problem with the A series as well, Intel specified around launch back then that you shouldn't use one of these cards if your motherboard/CPU didn't support ReBAR/SAM. Interesting that it's still an issue, albeit a bit less as people upgrade/replace their old hardware.

Unfortunate that, even beyond the ReBAR/SAM issue, performance can sometimes be so dependent on the CPU. Performance is hit *so* much harder compared to the 4060 in those benchmarks. Something to keep in mind for potential buyers, I suppose.
 
So if you use a CPU outside what Intel recommends, you're gonna have a bad time? Got it.

Maybe you can use the money on the Arc GPU to upgrade from your Ryzen 2000 series?

To be perfectly honest, I only expect to keep my main system for about 5 years before it gets moved to the rack in the garage, where it continues to do the stuff it did 5 years ago perfectly fine.

My thoughts exactly lol
 
Except the Ryzen 3000 series is supposed to be supported, so there is an issue out there.

The main reason Steve tested older CPUs outside the specs is that most users probably wouldn't look at the specs. Sure, that's on them, but the way this GPU has been marketed, it's aimed at budget PC gamers, who will often have CPUs 5 years old or older in their systems.

This way, at least they now have visual proof of the issue. But again, even older CPUs that are supposedly supported per Intel's specs show a massive performance hit, with the 4060 and its smaller VRAM outperforming the Intel B580 by wide margins. So something is wrong.
It's for anyone who wants reasonable performance without paying absurd GPU prices. While not ideal and unfortunate, the power play would be to buy the Arc and upgrade the rest of the system with the difference.

The thing is, I see this as a driver optimization issue, as the performance deficit on each CPU generation is roughly on par with the increase in single-threaded performance.

What this *hopefully* means is that there is still a lot of performance left on the table by Intel strictly in the drivers.

And, frankly, considering that I don't have a need or desire for an opulent gaming setup, I'd very much be interested in putting the savings into a Ryzen 7600 for something that simply plays games.

Many people, not just "poor people", have no desire to spend the absurd amount of money companies are asking for even midrange gaming setups. This isn't just gaming; it's every industry. People are looking at absurd asking prices and just not buying anymore.
 
Someone really averse to upgrading from AM4 can get a Ryzen 7 5700X for less than $150. Add a Thermalright Assassin King 120 SE for $20 and overclock the CPU. An all-core 4.3 GHz should not be a problem at all; mine is happily chugging along at 4.4 GHz with no issues.

My MPC is running such a setup, along with an ancient GTX 1080 and 32GB of DDR4-3600. Fallen Order looks and plays great at 1080p ultra settings on the Sony 4K TV it is hooked up to. So do AC: Origins and Odyssey. I imagine a B580 would be better than a 1080 in this context.
 
Someone really averse to upgrading from AM4 can get a Ryzen 7 5700X for less than $150. Add a Thermalright Assassin King 120 SE for $20 and overclock the CPU. An all-core 4.3 GHz should not be a problem at all; mine is happily chugging along at 4.4 GHz with no issues.

My MPC is running such a setup, along with an ancient GTX 1080 and 32GB of DDR4-3600. Fallen Order looks and plays great at 1080p ultra settings on the Sony 4K TV it is hooked up to. So do AC: Origins and Odyssey. I imagine a B580 would be better than a 1080 in this context.
Look at the results in the article and HU testing: even a Ryzen 5700X3D experiences a pronounced slowdown in certain (high CPU utilisation) titles with the B580. This isn't just limited to people with potato CPUs from 5 years ago. It also makes the B580 significantly slower than the 4060 in many titles, which in my opinion negates any reason to buy a B580, given they are the same price.

The reality is that most B580 reviews which were done on 7800X3D/9800X3D processors (to remove the CPU bottleneck) are pretty misleading for the types of setups people actually buying a B580 will be using. People aren't buying $250 GPUs to pair with a $500 CPU, it is usually the other way around. If you have a Ryzen 7600-like CPU or slower, the B580 appears inferior to the 4060/7600.
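The overhead effect being described can be sketched as a toy bottleneck model (my own simplification for illustration, not Hardware Unboxed's actual methodology, and the numbers below are made up): the delivered frame rate is capped by whichever of the GPU or CPU is slower, and a heavier driver effectively shrinks the CPU's headroom.

```python
def effective_fps(gpu_fps: float, cpu_fps: float, driver_overhead: float = 1.0) -> float:
    """Toy bottleneck model: frame rate is limited by the slower of the
    GPU and the CPU, with driver overhead eating into CPU headroom.
    driver_overhead > 1.0 models a driver that costs more CPU time per frame."""
    return min(gpu_fps, cpu_fps / driver_overhead)

# On a fast CPU both cards look GPU-bound; on a slow CPU, the card with
# heavier driver overhead falls behind even though its GPU is no slower.
fast_cpu, slow_cpu = 300.0, 80.0
print(effective_fps(100, fast_cpu, 1.6))  # fast CPU hides the overhead
print(effective_fps(100, slow_cpu, 1.6))  # slow CPU exposes it
```

Which would also explain why 9800X3D-based reviews saw nothing wrong: the overhead only bites once the CPU becomes the limit.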
 
So if you use a CPU outside what Intel recommends, you're gonna have a bad time? Got it.

Maybe you can use the money on the Arc GPU to upgrade from your Ryzen 2000 series?

To be perfectly honest, I only expect to keep my main system for about 5 years before it gets moved to the rack in the garage, where it continues to do the stuff it did 5 years ago perfectly fine.
The specs Intel listed date from when Z370 and Z390 boards didn't yet have the BIOS updates to support ReBAR... Intel just never updated its support list. Every board from 10th gen on had ReBAR from the start, which is why. I'm sure they will get to the bottom of it. Z370 and Z390 boards have had ReBAR support for a long time. ReBAR is not CPU-specific; it's board/chipset-specific.
In fact, I was able to enable ReBAR support by adding UEFI modules to my server board's BIOS to support my A380. It now has ReBAR support and works really well. I don't game on this system, though, just encoding/decoding. This is on a 1st-gen Scalable Xeon.
 
The sound in the cube intro on YT could be better. It's nice and smooth, but it might need a little more personality to really make an impact. This type of "commercial" sound is usually used in longer commercial videos that carry a message. It might not be the best fit for an intro, which is short and flashy. The intro should have its own charm, some excitement, and be memorable.

Here's a suggestion: try using this sound on the intro (or something similar with some character).

Ah, and the voice should not be mono.
 
Nobody should buy it regardless. Intel suffers from AMD syndrome: they can't get the software right. It's the reason I won't use anything but Intel CPUs and Nvidia GPUs. Yeah, there's an occasional problem, but at the end of the day the reliability and stability are light-years ahead of the competition. I would love to see true, genuine competition arise in all PC component markets, but that ain't today.

"Occasional problem," like two entire generations of CPUs that explode.
 
31 FPS average with a drop as low as 24 FPS is "unplayable"? That is very much playable! I've played games worse than that. It isn't great, but when you are trying to save money by putting a budget card in a 5-year-old computer, you shouldn't expect everything to be amazing. If Intel can write better drivers that raise this, that would be nice, but the real story is how this card compares to similarly priced cards on the same 5-year-old computers...
 
31 FPS average with a drop as low as 24 FPS is "unplayable"? That is very much playable! I've played games worse than that. It isn't great, but when you are trying to save money by putting a budget card in a 5-year-old computer, you shouldn't expect everything to be amazing. If Intel can write better drivers that raise this, that would be nice, but the real story is how this card compares to similarly priced cards on the same 5-year-old computers...
For the price of this GPU and even some second-hand AM4 kit, you'd be getting a worse experience than a PS5, so why would you accept 24 fps when you can get 30-120 fps (usually 60 fps) with a console?

I get it, a PC does more stuff, but 24 fps is close to unplayable; it's absolutely the lowest fps at which you can play anything. 30 fps is kind of reserved for console games that insist on putting graphics first, but 40-60 fps is kind of the minimum these days, or at least the expectation for the prices they ask.
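The playability argument is easier to see in frame times than in frame rates; a quick conversion (plain arithmetic, nothing benchmark-specific):

```python
def frame_time_ms(fps: float) -> float:
    """Average time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

# 60 FPS leaves about 16.7 ms per frame; at 24 FPS each frame takes
# nearly 42 ms, which is why low averages feel sluggish and stuttery.
print(round(frame_time_ms(60), 1))  # prints 16.7
print(round(frame_time_ms(24), 1))  # prints 41.7
```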
 
If you really think about it, two of the CPUs showing problems are worse than the CPU in the base PS5, lol, and you're pairing it with a $250 card and getting console-like performance (30+ fps) while using maxed-out graphics settings, which a console certainly isn't doing.

There honestly isn't a problem here, because the Ryzen 2600 shown in this test is worse than the PS5's CPU, and it's giving much better performance than a PS5 while paired with a $250 GPU using maxed-out graphics settings in game.
 
I was hoping Intel would succeed in the discrete GPU market, but generally assumed they would fail. Seems to be playing out as expected.
 
If you really think about it, two of the CPUs showing problems are worse than the CPU in the base PS5, lol, and you're pairing it with a $250 card and getting console-like performance (30+ fps) while using maxed-out graphics settings, which a console certainly isn't doing.

There honestly isn't a problem here, because the Ryzen 2600 shown in this test is worse than the PS5's CPU, and it's giving much better performance than a PS5 while paired with a $250 GPU using maxed-out graphics settings in game.
With my 8700K I get 200-400 fps in CS2 and Valorant with an A770. Sure, not high-end games, but nothing wrong for competitive gaming.
 