Intel's Arc B580 debuted as a budget-friendly GPU champion, but new data reveals performance struggles with older CPUs. Our re-review dives deep into benchmarks, CPU limitations, and gaming scenarios.
"...the performance issue is likely to occur only in very CPU-demanding games..."
Um... Yeah, because you purposely bottlenecked the CPU path with a six-year-old chip that the manufacturer no longer supports. This entire re-test feels designed to support a biased assumption.
Hardware Unboxed did extensive tests. Even the 7600 causes performance degradation, and the 7600 costs $200. It's hard to recommend a pricier CPU for a $250 GPU.

While somewhat disappointing, this card is still perfect for someone who does some work and lighter (for lack of a better word) gaming on their desktop.
Honestly, this doesn't impact people building a new PC; only people hoping to upgrade an old system. HUB was already recommending the 7600 for all but the most budget builds.
I'm still in the market for an Intel card to replace my RTX 2060. I'd been an Nvidia guy since my first aftermarket GPU, a GeForce 2 MX. But they have become so anti-consumer since the GTX 1660 era that I am no longer interested in giving them my money. I understand them going all-in on AI (although I think that bubble will burst like the dot-com bubble), but rather than gouging the budget-minded mainstream gamers who got them where they are today, they could use some of those AI profits to subsidize low- to mid-range GPUs. They could have built massive goodwill rather than driving an entire group of gamers who can't afford $500+ GPUs to look at alternatives. If the 4060 had launched at $200, I would have bought one at release.
Edit: clarified when I feel they became anti-consumer
Sorry to disagree with the reviewers on the B580. Not about its use in new systems, but in older ones. From the beginning, Intel has made it known that Resizable BAR must be enabled for these cards to reach their best performance. That was made known with the Arc series of cards, and the requirement did not change with the release of the Battlemage cards.
I started using Arc cards when they hit the shelves, while in the process of replacing an MSI Z270 motherboard with an MSI Z790 DDR4 board. That system used an i7-7700K with 32 GB of DDR4-3600 and two 2 TB M.2 drives, with an Nvidia GTX 1070 for video out. My current build has an i5-13600 CPU, 64 GB of Corsair 3600 MHz (RGB) memory, and 10 TB of Samsung M.2 storage (three 2 TB 980 Pros and one 4 TB 990 Pro). Video output is courtesy of an ASRock B580 Steel Challenger card, which I used to upgrade from a 16 GB Arc A770. I am very happy with the results.
I usually play Fallout 76, Starfield, Doom Eternal, Genshin Impact, Bioshock, and others. The only game I've played so far with unfavorable results has been Aliens: Dark Descent, mainly some objects not rendering correctly around the edges. I basically play with the default game settings and change them after a couple of sessions as needed. Video settings are usually at high, very high, or ultra, with very good results.
Unfortunately, most good things come with a gotcha. The gotcha with this card is the Resizable BAR requirement; without it, the cards run like a Porsche with flat tires. Anyone who wants to use this card needs to do the research. The B580 is definitely worth the price of entry and will do fine for a budget, or even midrange, system. So let's not demean the card for using the newest tech available. It's up to builders to be aware of the technical issues of new hardware, especially those who are just getting their toes wet building their first PC.
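For builders who want to verify that ReBAR is actually active before blaming the card, a minimal sketch of how you might check on Linux by scanning `lspci -vv` output for the Resizable BAR capability line. The sample text below is illustrative, not output from any real system, and on Windows the Intel Arc Control / Graphics Software apps report ReBAR status directly.

```python
# Hedged sketch: detect whether a GPU's lspci -vv dump reports the
# Resizable BAR capability. The device ID and sizes in `sample` are
# made up for illustration.
import re

def rebar_enabled(lspci_text: str) -> bool:
    # lspci prints a "Physical Resizable BAR" capability block when the
    # device exposes ReBAR; we just look for that phrase.
    return bool(re.search(r"Resizable BAR", lspci_text, re.IGNORECASE))

sample = """\
03:00.0 VGA compatible controller: Intel Corporation Device e20b
        Capabilities: [420 v1] Physical Resizable BAR
                BAR 2: current size: 8GB, supported: 256MB 8GB
"""
print(rebar_enabled(sample))  # → True
```

In practice you would feed it the real output of `sudo lspci -vv` filtered to your GPU; if the capability is missing, ReBAR likely needs to be enabled in the motherboard BIOS (often alongside "Above 4G Decoding").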
I've been playing with computers since the early '80s, mainly systems that ran their programs on punched paper tape. A lot has changed since then. The one thing that hasn't changed is knowing what parts do and running them at practical performance. I've been running Nvidia cards since the late '80s, when I started building my own systems. At that time AMD was relegated to budget builds... period! When Intel released the Arc A770, I jumped on it. Nvidia has just gotten out of hand; those who helped them get to where they are have been left in the dirt.
Call me an Intel fanboy if you wish. I prefer a good deal and Intel provides that with their Arc and Battlemage cards. Until Intel drops out of the market I'll continue with their offerings. Hopefully that will continue with a B770 card.
The 7600 had only minor "degradation" compared to the fastest CPU on the planet.
That is true. But the minor drop with the 7600 allows the 4060 to take the lead in titles like Hogwarts Legacy or Spider-Man. Someone could make a bad purchase based on tests with the 9800X3D.
The worrying part was for people with a 5600 or slower. I.e., people trying to upgrade a 4+ year old PC.
Personally, I don't feel an Intel Arc GPU needs to be 20% better value than a Radeon to be worth it. For GeForce, yes, I would agree. But I don't think Radeon has a quality edge over Arc.
From what I can see in this data, I think I would go for a 4060. I'm surprised the GeForce parts are that close in price; I would have thought the gulf would be wider. In over 20 years of building gaming PCs, I've had a far more consistent experience with GeForce parts than Radeon parts.