Intel Arc B580 GPU Re-Review: Old PC vs New PC Test

I'm still in the market for an Intel card to replace my RTX 2060. I'd been an Nvidia guy since my first aftermarket GPU, a GeForce 2 MX, but they have become so anti-consumer since the GTX 1660 that I'm no longer interested in giving them my money. I understand them going all-in on AI (although I think that bubble will burst like the dot-com bubble), but rather than gouging the budget-minded mainstream gamers who got them where they are today, they could use some of those AI profits to subsidize the low- to mid-range GPUs. They could have built massive goodwill rather than driving an entire group of gamers who can't afford $500+ GPUs to look at alternatives. If the 4060 had launched at $200, I would have bought one at release.

Edit: clarified when I feel they became anti-consumer
 
While somewhat disappointing, this card is still perfect for someone who does some work and lighter (for lack of a better word) gaming on their desktop.

Honestly, this doesn't impact people building a new PC; only people hoping to upgrade an old system. HUB was already recommending the 7600 for all but the most budget builds.

 
"..the performance issue is likely to occur only in very CPU-demanding games.."

Um... Yeah, because you purposely bottlenecked the CPU path with a 6-year-old chip that isn't supported by the manufacturer. This entire re-test feels designed to support a biased assumption.
 
"..the performance issue is likely to occur only in very CPU-demanding games.."

Um... Yeah, because you purposely bottlenecked the CPU path with a 6 year old chip that isn't supported by the manufacturer. This entire re-test feels designed to support a biased assumption.

It's a 4-year-old CPU, and the 5600 is from 2022, so really 2+, but exaggeration is a fun thing, right? And my cheap B450 mobo got a BIOS update 3 months ago, so "unsupported" is yet another exaggeration, LOL.

What is purposely bottlenecked? You missed the point: loads of people use CPUs with performance similar to the 5600's, so they need to know what to expect with the B580. You might have noticed that the AMD and Nvidia GPUs were a lot less bottlenecked in some games with the same CPU.

Lower frames is less good. If you play those bottlenecked games a lot, then get AMD or Nvidia. If you don't play those games, then get the B580. It ain't that hard to understand.
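
To put rough numbers on the bottleneck idea: delivered frame rate is capped by whichever of the CPU or GPU takes longer per frame. A toy sketch in Python, with made-up per-frame costs (not benchmark data):

```python
# Toy model: delivered FPS is capped by whichever of the CPU or GPU
# takes longer per frame. Per-frame costs below are invented.

def delivered_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical game where a driver adds CPU-side overhead: on a fast
# CPU the GPU is the limit; on a slower CPU the driver cost dominates.
print(delivered_fps(cpu_ms_per_frame=4.0, gpu_ms_per_frame=8.0))   # 125.0, GPU-bound
print(delivered_fps(cpu_ms_per_frame=12.0, gpu_ms_per_frame=8.0))  # ~83.3, CPU-bound
```

Same GPU, very different result, which is exactly why the same card looks fine on a 9800X3D and bad on a 5600 in driver-heavy titles.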
 
Interesting. I have a question: what parameters were used to choose the games tested? A new game appears out of nowhere, which significantly influences the article's content and conclusion.
 
HUB was already recommending the 7600 for all but the most budget builds.
Hardware Unboxed did extensive tests. Even the 7600 causes performance degradation. The 7600 costs $200. It's hard to recommend a pricier CPU for a $250 GPU.
 
...rather than gouging the budget-minded mainstream gamers who got them where they are today, they could use some of those AI profits to subsidize the low- to mid-range GPUs.

If Nvidia used their dominant position in one market to improve their position in another market by selling at a loss, they'd almost certainly be in trouble with regulators for predatory pricing.
 
Sorry to disagree with the reviewers on the B580. Not about its use in new systems, but in older ones. From the beginning, Intel has made it known that these cards need Resizable BAR enabled to run at their best performance. That was made clear with the original Arc series, and the requirement did not change with the release of the Battlemage cards.
I started using the Arc series when the cards hit the shelves, while I was replacing an MSI Z270 motherboard with an MSI Z790 DDR4 board. That system used an i7-7700K with 32 GB of 3600 DDR4 and two 2 TB M.2 drives, with an Nvidia GTX 1070 for video out. My current build has an i5-13600 CPU, 64 GB of Corsair 3600 MHz (RGB) memory, and 10 TB of Samsung M.2 storage (three 2 TB 980 Pros, one 4 TB 990 Pro). Video output is courtesy of an ASRock B580 Steel Challenger card, which I used to upgrade from an Arc A770 16 GB. I am very happy with the results.
I usually play Fallout 76, Starfield, Doom Eternal, Genshin Impact, BioShock, and others. The only game I've played so far with unfavorable results has been Aliens: Dark Descent, mainly some objects not rendering correctly around the edges. I basically play with the default game settings and change them after playing a couple of times, as needed. Video settings are usually at high, very high, or ultra with very good results.
Unfortunately, most good things come with a gotcha, and the gotcha with this card is the Resizable BAR requirement. Without it the cards run like a Porsche with flat tires. Anyone who wants to use this card needs to do that research (a quick way to check is sketched below). The B580 is definitely worth the price of entry and will do fine for a budget, or even midrange, system. So let's not demean the card for using the newest tech available. It's up to builders to be aware of the technical issues of new hardware, especially those who are just getting their toes wet building their first PC.
I've been playing with computers since the early 80s, mainly systems that ran their programs on punched paper tape. A lot has changed since then. The one thing that hasn't changed is knowing what parts do and running them at practical performance. I've been running Nvidia cards since the late 80s, when I started building my own systems. At that time AMD was relegated to budget builds... period! When Intel released the Arc A770, I jumped on it. Nvidia has just gotten out of hand; those who helped them get where they are have been left in the dirt.
Call me an Intel fanboy if you wish. I prefer a good deal, and Intel provides that with their Arc and Battlemage cards. Until Intel drops out of the market, I'll continue with their offerings. Hopefully that will continue with a B770 card.
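
On the ReBAR research point: if you're on Linux, one rough way to check whether Resizable BAR is actually active is to look for the capability and the current BAR sizes in `lspci -vv`. A sketch only; the text format varies by lspci version, reading capabilities may need root, and on Windows GPU-Z shows the same info:

```python
# Rough ReBAR check for Linux: look for the "Physical Resizable BAR"
# capability and the current BAR sizes in `lspci -vv` output.
# Sketch only: the text format varies by lspci version, and reading
# capabilities may require running as root.
import re
import subprocess

def rebar_report():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    findings = []
    device = None
    for line in out.splitlines():
        if line and not line[0].isspace():  # new PCI device header
            device = line
        elif device and "Physical Resizable BAR" in line:
            findings.append(f"{device}: has Resizable BAR capability")
        elif device and (m := re.search(r"current size: ([0-9.]+[KMGT]?B)", line)):
            # A GPU BAR much larger than 256MB suggests ReBAR is active.
            findings.append(f"{device}: BAR current size {m.group(1)}")
    return findings

if __name__ == "__main__":
    for finding in rebar_report():
        print(finding)
```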
 

That was a long comment that can simply be addressed with "This has nothing to do with Resizable BAR".
 
Well, since you can't buy a B580 unless you pay a premium from WEELIAO on Newegg or Amazon (the cheapest is $340), Intel has some time to optimize their drivers before the 5060 and "9060" come out.
 
We need the real thing 🥇
 
Hardware Unboxed did extensive tests. Even the 7600 causes performance degradation. The 7600 costs $200. It's hard to recommend a pricier CPU for a $250 GPU.
The 7600 had only minor "degradation" compared to the fastest CPU on the planet.

The worrying part was for people with a 5600 or slower, i.e., people trying to upgrade a 4+ year-old PC.
 
That is true. But the minor drop with the 7600 allows the 4060 to take the lead in titles like Hogwarts Legacy or Spider-Man. Someone could make a bad purchase based on tests with a 9800X3D.
 

That's why everyone should look at the balance of games and card features, as few people play only one or two games, especially once they finish their current titles. The 4060 will be faster sometimes, the 7600 will be faster other times, and the B580 will be faster most of the time, especially at 1440p, but it takes a bigger hit in some games than the other two.

Find your personal balance.
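
You can even put arithmetic behind "personal balance": weight each card's per-game frame rate by how much you actually play each game. A sketch with invented numbers, purely to illustrate the idea:

```python
# Score each card by playtime-weighted FPS: what you'd actually
# experience given the games you play. All numbers are invented.

playtime_share = {"Hogwarts Legacy": 0.5, "Cyberpunk 2077": 0.3, "CS2": 0.2}

fps_1440p = {  # hypothetical per-card results
    "B580": {"Hogwarts Legacy": 55, "Cyberpunk 2077": 70, "CS2": 180},
    "RTX 4060": {"Hogwarts Legacy": 62, "Cyberpunk 2077": 64, "CS2": 170},
    "RX 7600": {"Hogwarts Legacy": 58, "Cyberpunk 2077": 62, "CS2": 175},
}

for card, results in fps_1440p.items():
    score = sum(results[game] * share for game, share in playtime_share.items())
    print(f"{card}: weighted {score:.0f} FPS")
```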
 
Excellent review and deep dive. Building with new(er) parts vs. upgrading old(er) parts merits its own consideration. It really matters when weighing which upgrades make the most sense on old(er) systems, whether we're replacing a part (or two) now or planning to upgrade further down the line. In other words, the B580 may be a good upgrade even with an older system if, for example, the plan is to upgrade the CPU soon as well.

The other part that often gets passed over in reviews is testing a low-cost GPU with top-of-the-line system parts, which is not really representative of how that GPU will be used the vast majority of the time (most likely it's going in with a CPU a step or two down).

While 1440p limits the baseline vs. doing everything in 1080p, it's really the new standard and better reflects the resolutions most people game at. I much prefer seeing benchmarks at realistic resolutions (even if that doesn't give a repeatable baseline).
 
Personally, I don't feel an Intel Arc GPU needs to be 20% better value than a Radeon to make it worth it. GeForce, yes, I would agree. But I don't think Radeon has a quality edge over Arc.

From what I can see in this data, I think I would go for a 4060. I'm surprised the GeForce parts are that close in price; I would have thought the gulf would be wider. In over 20 years of building gaming PCs, I have had a far more consistent experience with GeForce parts than Radeon parts.
 
It's a shame that it's very CPU-dependent. A friend of mine has been trying to get a B580 to pair with a Ryzen 7600X, but hasn't been able to find any in stock around MSRP before they sell out, so I don't think this news is impacting sales a whole lot unless there's a scalping problem going on. Maybe I'll mention the RTX 4060 being available at a similar price, even though the Arc card could be a better performer.
 
But I don't think Radeon has a quality edge over Arc.

Radeon has a definite quality edge over Arc in that you can pop one into a non-ReBAR PC, or any PC, and it'll work at close to full performance. Sure, you could get a little more performance in some games if you had ReBAR, but the difference isn't as large or widespread as with Arc: IIRC 9% down in one game, 5% in two others, and within 1-3% (i.e., noise) in the rest. Nothing like the -50% and -20% in a few and -5% in many with the B580 (a rough aggregation is sketched below).

Put simply: I use Radeons in non-ReBAR PCs and they just work (5600 XTs in two OptiPlexes). My B580 stays in ReBAR PCs for now, but I'll be trying the ReBAR hacks in an OptiPlex or two to see if that can get Arc working properly there.

Is Nvidia an even safer bet? It seems so, and you'll pay extra for that, which makes sense as Nvidia has additional features that many like. My 3050 6GB lives a happy life in another OptiPlex but is still slower than the cheaper 5600 XT.
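
For that rough aggregation: a geometric mean of the per-game ratios turns scattered deltas into one number. The lists below just restate the ballpark figures from this comment; they're illustrations, not measurements:

```python
# Turn scattered per-game deltas (vs. a ReBAR-on baseline) into one
# number with a geometric mean of the ratios. These lists restate the
# rough figures above; they're illustrations, not measurements.
from math import prod

radeon_deltas = [-9, -5, -5, -3, -2, -1, 1]    # mostly noise
arc_deltas = [-50, -20, -5, -5, -5, -5, -3]    # a few severe outliers

def geomean_delta(deltas_pct):
    ratios = [1 + d / 100 for d in deltas_pct]
    return (prod(ratios) ** (1 / len(ratios)) - 1) * 100

print(f"Radeon without ReBAR: {geomean_delta(radeon_deltas):+.1f}%")
print(f"Arc without ReBAR:    {geomean_delta(arc_deltas):+.1f}%")
```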
 
Interesting article, and I appreciate the revisit.

Out of curiosity, do HUB/TS have any plans to incorporate at least one older-generation CPU (2-3 generations old) into GPU reviews moving forward? It seems like a case where a few well-placed tests on older hardware could help detect abnormal scaling and provide another data point for upgraders.
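
For what that could look like in practice: compute how much performance each GPU retains when moving from the fast test CPU to the older one, and flag any card that falls well behind its peers. A sketch with invented numbers (not HUB's methodology):

```python
# Flag GPUs that scale abnormally badly from a fast test CPU to an
# older one. FPS values are invented placeholders, not review data.
from statistics import median

fps_fast_cpu = {"B580": 90, "RTX 4060": 85, "RX 7600": 88}  # e.g. 9800X3D
fps_old_cpu = {"B580": 55, "RTX 4060": 78, "RX 7600": 80}   # e.g. Ryzen 2600

retention = {gpu: fps_old_cpu[gpu] / fps_fast_cpu[gpu] for gpu in fps_fast_cpu}
typical = median(retention.values())

for gpu, kept in sorted(retention.items(), key=lambda kv: kv[1]):
    flag = "  <-- abnormal scaling" if kept < typical - 0.10 else ""
    print(f"{gpu}: keeps {kept:.0%} of its fast-CPU performance{flag}")
```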
 