Are Intel Arc graphics cards dead on arrival?


Posts: 700   +630
What Intel needs is a CEO on par with Lisa Su. That woman turned AMD from a laughingstock into a better-run company than Intel, and the results have made AMD investors very, very happy. Pat Gelsinger is proving not to be that CEO.


Posts: 311   +433
Intel just needs to concentrate on their core business: bad*** CPUs that used to crush everyone else. Now they're about at the point of playing catch-up.


Posts: 66   +96
I saw a YouTube video (from a smaller tech channel whose name I don't remember) that tested these Arc cards in a streaming machine. Their results showed that you could live stream with these cards at twice the quality for half the bitrate (using AV1) compared with other streaming machines. That sounds like it would be pretty handy for Twitch streamers and YouTubers. I'd like to see you guys test that, since I'd trust your results more than some random YouTube video I came across.
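For anyone wanting to poke at this themselves, here is a minimal, hardware-dependent sketch of encoding AV1 through Intel's Quick Sync encoder in ffmpeg. It assumes an ffmpeg build with QSV support and an Arc GPU present; the filenames and the 3500k target bitrate are placeholders, not tested settings.

```shell
# Hardware-dependent sketch: transcode to AV1 using Intel Arc's Quick Sync encoder.
# Assumes an ffmpeg build with Quick Sync (QSV) support and an Arc GPU installed.
ffmpeg -hwaccel qsv -i input.mp4 \
  -c:v av1_qsv -b:v 3500k \
  -c:a copy output_av1.mkv
```

The "twice the quality at half the bitrate" claim would be the thing to verify: encode the same source at matched bitrates with av1_qsv and an H.264 encoder and compare the results.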


Posts: 7,968   +7,008
The takeaway from the 1080p gaming tests is what a POS the RX 6400 is, even over PCIe 4.0.

Moore's Law is Dead is reporting that Intel higher-ups are deciding whether to cancel Arc altogether, at least for desktop. This has been a total fluster cuck for Intel, and Raja should be worried.
Intel will try to tout it as a feature. šŸ¤£


Posts: 351   +798
Intel just needs to concentrate on their core business: bad*** CPUs that used to crush everyone else. Now they're about at the point of playing catch-up.
The overall role of the badass x86 CPU is shrinking, while ARM and GPU compute are growing. Intel has two competitors that are both way ahead in GPUs, and at least one that is already leveraging ARM in applications previously dominated by x86. Unlike the railroads of yesteryear, Intel realizes that it is in the computer business, not just the x86 CPU business.

Intel needs GPU compute. If we get consumer gaming cards, that's a bonus for us. Intel folding their GPU division would be a disaster for them.


Posts: 113   +67
The Airbus A380 is the king of airliners, and Intel used such a powerful name on such a crappy product. Why didn't they name the first series A180, then release an A280 series after more learning and debugging, and only in the third incarnation, after learning from their mistakes, release something as an A380, where the name and the product would finally be in sync?

Ah, yeah... it's Intel. That's why. The same company that brought us CPUs named as intuitively as "1155G7". Their marketing department is legendary. If they outsourced it to North Korea, the results wouldn't be any worse.

Tom Yum

Posts: 182   +438
Anybody remember nVidia's NV1? What if they had quit after that failure?

The difference being that the NV1 was actually a decent GPU, but Nvidia bet that quadratic surfaces would be the future of 3D graphics before DirectX shipped with support only for triangular polygons. That miscalculation is understandable given how nascent consumer 3D acceleration on the PC was at the time (1994-1995). It's not really applicable to Intel, who 1) have had three attempts at dGPUs and 2) are entering a GPU market with well-established norms and standards.