Intel confirms Battlemage, promises great value GPUs with better performance scaling

nanoguy

Staff member
Why it matters: After a troubled launch and several big driver updates, Intel's Arc Alchemist GPUs aren't exactly on all PC gamers' wishlists. However, the company is taking the negative feedback in stride by incorporating the lessons learned from architectural and software mistakes into its upcoming Battlemage discrete graphics solutions. There is no release date or confirmed specs to get excited about, but we now know that Team Blue is looking to push both AMD and Nvidia to rethink their pricing strategies moving forward.

Intel's foray into the discrete graphics market has been marred with problems, from the unfortunate timing of Arc Alchemist's launch to the various performance and stability issues experienced by early adopters. That said, Team Blue appears committed to the long, arduous process needed to make its products appealing to more gamers.

Tom Petersen, an ex-Nvidia veteran turned Intel Fellow, confirmed as much during a recent podcast with PCWorld's Gordon Mah Ung and Brad Chacos. Petersen candidly agreed with the public sentiment that the first generation of Arc hardware isn't exactly a viable option for people building a new PC or looking for an upgrade over an aging Nvidia or AMD graphics card, especially for anyone looking to play any of the noteworthy VR titles out there.

One of the biggest reasons for that is Intel's software stack, which lags behind those of the other two companies. However, Petersen pointed to the great strides made in the driver department since launch and explained that it's only a matter of time before Intel can squeeze all possible performance out of the Arc silicon, particularly in DirectX 9, 10, and 11 titles.

Of course, most people may well avoid the first-generation Arc hardware due to its disappointing cost per frame in most regions. Petersen won't say how well Arc Alchemist is selling so far, but he insists that Intel is focused on growing market share and mindshare through aggressive pricing at a time when Team Green and Team Red are even reducing supply to protect their profit margins. The end goal isn't to bring back sub-$200 graphics cards, but to offer better value for the majority of PC gamers out there.

With last month's leaked desktop graphics roadmap in mind, it's no surprise the podcast hosts asked some questions about the upcoming Battlemage GPUs. Unfortunately, Intel is staying tight-lipped on this beyond hinting that development is progressing as expected. If the leak is accurate, we should see the first products from that family as soon as next year.

Petersen did say that Intel learned a great deal from its architectural mistakes with first-gen Arc silicon, which is informing the design of Battlemage. This could mean a lot of things, but, if anything, Intel is pooling most of the resources of its reorganized graphics division into making sure Battlemage doesn't suffer the same launch problems as Alchemist. And while the company did rush the first generation of its discrete GPUs out the door to gather user feedback, this won't be the case with future generations.

One thing is for sure — Intel's efforts in the short term are more focused on chipping away at AMD's market share and building "cool new open technologies" and less on challenging Nvidia at the high end. The company also wants to get better at DirectX 11 and 12 performance scaling as well as ray tracing and resolution upscaling, and expects these to evolve at a "discreet rhythm."


 
Go Intel! A viable GPU for fun gaming at affordable prices will help keep PC gaming accessible to a wider audience. I think they could have a lot of impact even without the bells & whistles, an "8K gaming solution", or even deep games library support - simply being the affordable choice for the most popular games will make them enough of a threat to help the other players remember the sub-$500 price point.
 
"And while the company did rush the first generation of its discrete GPUs out the door to gather user feedback, this won't be the case with future generations"

That is a very charitable take. Intel released desktop Arc when they did because they told investors it would launch by Summer 2022 (which they just missed, launching instead in mid Oct 22). It was clearly not ready, and not just at the point where only wider use would find bugs, but so early there were obvious issues that Intel themselves knew about.

Based on that, there is actually no guarantee that the same can't happen again. Intel has shown countless times they don't really care about customers, but they do care very much about their share price.
 
I can say that, as much as Intel is having driver problems in Windows, in Linux the driver situation is completely different. In short, any hardware newer than about 15 years old has modern, up-to-date drivers that support everything up to the limits of the hardware, running right up to current ARC. Even Sandy Bridge will run a fair portion of DX11 games, and anything newer than about 7 or 8 years old has full OpenGL, Vulkan, and (in Wine and Steam) full DX9/10/11/12 support.

They had an "i965" driver that supported all Intel GPUs, but it wasn't following Mesa's modern programming practices and I'm sure was getting increasingly difficult to maintain with like 20 years of GPUs in it. Within the last couple of years, this got rewritten into an "Iris" driver (for gen 8 -- Broadwell -- and newer GPUs) and a "Crocus" driver for gen 4-8 GPUs (yes, this means there is overlap, and apparently both support gen 8). The Crocus driver supports the older hardware right up to its limits. The Iris driver is excellent: full OpenGL 4.6, full Vulkan, and in Wine full DX9/10/11/12 support. Yes, hardware going back to like 2006 has fully modernized and optimized drivers. (The old driver, for even older OpenGL 2.1-only GPUs, is in an "Amber" branch and removed from newer Mesa.) I've used all 3!

Amber -- my dad still uses a Core 2 Quad desktop. I installed Amber and found the GPU is slow enough (and CPU rendering in newer Mesa fast enough) that software rendering is faster than the GPU (and supports a newer OpenGL version for better compatibility). This isn't the Amber driver being slow; these GPUs were always slow enough (edit: at 3D, the 2D and video scaling are plenty fast) that they would just about max out running a screen saver. I removed Amber.

My friend's Sandy Bridge (running Crocus) ironically can run (or "walk", depending on the frame rate...) many DX11 games in Wine, while Intel never shipped DX11 drivers for Windows for it. The number of games that work on hardware this old is shocking. Sandy Bridge supports OpenGL 3.3, and hardware as old as Ivy Bridge supports OpenGL 4.x and "partial" Vulkan (a bit too "partial" -- Vulkan doesn't really work on it).

Iris -- I have a Tiger Lake notebook with "Intel Xe" graphics (ARC is based right off this). There's seriously nothing to comment on -- it supports OpenGL 4.6 and Vulkan 1.3. In Wine and Steam I can start games up with no glitches, no crashes, and good performance. Reportedly ARC in Linux is like this too. It's a stark contrast from like 10 years ago, when the Intel GPU drivers for Linux were god-awful and I'd have to have an Nvidia card to have any hope of most games actually working.
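For anyone wanting to check which of these Mesa drivers their Intel GPU is actually using, a quick sketch: `glxinfo -B` prints the renderer and GL version strings, and Mesa's `MESA_LOADER_DRIVER_OVERRIDE` environment variable can force a particular driver (e.g. `iris` or `crocus`) on the overlapping gen 8 hardware. A live query needs a GPU and display session, so the snippet below runs the same grep against a hypothetical sample of `glxinfo -B` output; the renderer string on real hardware will differ.

```shell
# On a real system with mesa-utils installed, run:
#   glxinfo -B | grep -E 'OpenGL (renderer|core profile version) string'
# To force a specific Mesa driver on overlapping hardware (gen 8):
#   MESA_LOADER_DRIVER_OVERRIDE=crocus glxinfo -B | grep renderer

# Hypothetical sample output (a Tiger Lake laptop on the Iris driver),
# used here so the snippet is self-contained:
sample='OpenGL renderer string: Mesa Intel(R) Xe Graphics (TGL GT2)
OpenGL core profile version string: 4.6 (Core Profile) Mesa 23.1.0'

# Extract the two lines that identify the driver and GL version
printf '%s\n' "$sample" | grep -E 'OpenGL (renderer|core profile version) string'
```

The renderer string names the Mesa driver family in use, and the core profile version confirms the OpenGL level (4.6 on Iris-era hardware, as described above).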
 
"And while the company did rush the first generation of its discrete GPUs out the door to gather user feedback, this won't be the case with future generations"

That is a very charitable take. Intel released desktop Arc when they did because they told investors it would launch by Summer 2022 (which they just missed, launching instead in mid Oct 22). It was clearly not ready, and not just at the point where only wider use would find bugs, but so early there were obvious issues that Intel themselves knew about.

Based on that, there is actually no guarantee that the same can't happen again. Intel has shown countless times they don't really care about customers, but they do care very much about their share price.

The problem as I see it is that'll be the case no matter how long they hold back these first few generations of cards. Intel simply doesn't know how to avoid some of these looming mistakes until they make them. As the old saying goes "hindsight is 20/20". So yeah I don't expect to see anything noteworthy from Intel until the 3rd gen at the earliest, 4th or 5th gen more realistically. I remember the first few generations of both Nvidia and ATI cards, and considering how much simpler the hardware was back in those days they still didn't have a great record. Especially ATI. Much of the criticism that AMD cards get IMHO is due to the total crap that ATI used to sell. I personally know this from the hell I suffered with my Rage Pro All-in-Wonder. So as long as Intel is committed, they'll get there, and the early adopters will pay the price...
 
The problem as I see it is that'll be the case no matter how long they hold back these first few generations of cards. Intel simply doesn't know how to avoid some of these looming mistakes until they make them. As the old saying goes "hindsight is 20/20". So yeah I don't expect to see anything noteworthy from Intel until the 3rd gen at the earliest, 4th or 5th gen more realistically. I remember the first few generations of both Nvidia and ATI cards, and considering how much simpler the hardware was back in those days they still didn't have a great record. Especially ATI. Much of the criticism that AMD cards get IMHO is due to the total crap that ATI used to sell. I personally know this from the hell I suffered with my Rage Pro All-in-Wonder. So as long as Intel is committed, they'll get there, and the early adopters will pay the price...
Wholeheartedly agree on the Rage Pro All-In-Wonder, I had a card of that era and it was like it just never quite came together.

I still think a big issue for Intel may be drivers (for Windows gamers), and I don't mean the current situation with ARC, I mean the support policy in general. The recent dropping of driver support for GPUs only about 2 years old is troubling. But even before that, they've had a history of going from active development of new features, bug fixes, and speedups to "security updates only" maintenance mode for any given GPU pretty quickly. I for one wouldn't care to splash $150-200+ on a GPU with the near-certainty that it'll go into "maintenance mode" support in maybe 2 or 3 years. (Running Linux, all but the oldest (15+ year old) Intel GPUs have fully modern drivers, and I wouldn't think twice about buying ARC over concerns about it becoming unsupported; but I'm sure they want to sell these to Windows gamers too.)
 
The problem as I see it is that'll be the case no matter how long they hold back these first few generations of cards. Intel simply doesn't know how to avoid some of these looming mistakes until they make them. As the old saying goes "hindsight is 20/20". So yeah I don't expect to see anything noteworthy from Intel until the 3rd gen at the earliest, 4th or 5th gen more realistically. I remember the first few generations of both Nvidia and ATI cards, and considering how much simpler the hardware was back in those days they still didn't have a great record. Especially ATI. Much of the criticism that AMD cards get IMHO is due to the total crap that ATI used to sell. I personally know this from the hell I suffered with my Rage Pro All-in-Wonder. So as long as Intel is committed, they'll get there, and the early adopters will pay the price...

Intel has driver issues on their discrete GPUs regardless of platform. It's also not just driver issues; the cards themselves have issues that no firmware or driver can fix.

ATI cards were great? I owned a couple back then. I immediately sold my card and refused to go back. I was extremely upset when AMD bought them; I was justified, as AMD has crapped the bed ever since. And no, the crap AMD gets for its cards is because of what THEY do and has nothing to do with ATI. Look no further than the most recent driver corrupting entire Windows installations. This has been a consistent problem for AMD - it has never not been a problem, despite the fanboys crying that it isn't.

AMD can't make software, it's that simple. Their BIOSes are always horribly unstable as well, especially when they launch a new generation. My system on Intel and Nvidia? Rock solid. Four different friends with AMD who didn't listen to me? Constant problems. They haven't had a stable system since the day they were built.
 
AMD can't make software, it's that simple. Their BIOS are always horribly unstable as well, especially when they launch a new generation. My system on Intel and Nvidia? Rock solid. Four different friends with AMD that didn't listen to me? Constant problems. They haven't had a stable system since built.
I wonder why my system is rock solid and it only has AMD 🤔
 
From what I have read on the subject, Intel has gotten way, way too big to even keep track of what is going on within its own corporate structure. The left hand literally doesn't know what the right hand is doing, and managers of all these various substructures don't tell the truth about their progress or actual status to their executive managers. So Intel is a corporate Frankenstein lurching from one disaster to another, and Moore's Law Is Dead is reporting from its sources within Intel's upper financial management that ARC is going to be cancelled completely because they have literally no way of making it profitable for years out into the future and they want to stop the financial hemorrhaging as soon as they can. That is apparently how bad it really is at Intel.


 
Intel has driver issues on their discrete GPU's regardless of platform. It's also not just driver issues, the cards themselves have issues that no firmware or driver can fix.

ATI cards were great, I owned a couple back then. I immediately sold my card and refused to go back. I was extremely upset when AMD bought them. I was justified as AMD has crapped the bed ever since. And no, the crap AMD gets for its cards is because of what THEY do and has nothing to do with ATI. Look no further than the most recent driver corrupting entire Windows installations. This has been a consistent problem for AMD, it has never not been a problem, despite the fanboys crying that it isn't.

AMD can't make software, it's that simple. Their BIOS are always horribly unstable as well, especially when they launch a new generation. My system on Intel and Nvidia? Rock solid. Four different friends with AMD that didn't listen to me? Constant problems. They haven't had a stable system since built.
Sure they were. Dude your bias is showing...
 
From what I have read on the subject, Intel has gotten way, way too big to even keep track of what is going on within its own corporate structure. The left hand literally doesn't know what the right hand is doing, and managers of all these various substructures don't tell the truth about their progress or actual status to their executive managers. So Intel is a corporate Frankenstein lurching from one disaster to another, and Moore's Law Is Dead is reporting from its sources within Intel's upper financial management that ARC is going to be cancelled completely because they have literally no way of making it profitable for years out into the future and they want to stop the financial hemorrhaging as soon as they can. That is apparently how bad it really is at Intel.
Not unprecedented -- before the i810 Intel "non-extreme" graphics came out, they sold an i740 graphics card. I think they found sales were poor; it's a nice value-add to have a graphics chip built into your chipset, but it was not fast enough to make people spend money on it in card form. Pretty sure some of those GPUs from about 10 years ago that had variants with embedded DRAM were also designed with the possibility of popping into a discrete card -- not pursued, because Nvidia and ATI/AMD still had some sub-$100 cards and it would not have been competitive with them.

ARC at least doesn't look terrible (in Linux). The market is different now; with Nvidia and AMD GPUs finally selling near MSRPs that in some cases were set 3 or 4 years ago, pricing is rather high. If Intel prices these cards right, they could provide some competition. And if they're actually available -- I look up the Alchemist A380, for example, on Google Shopping and find one ASRock model listed (then some scale-model Airbus A380s to fill out the page). But they do have a cash crunch; I could absolutely see them cancelling these cards even if they do manage to sell some.
 
Intel has shown countless times they don't really care about customers, but they do care very much about their share price.
Corporations are designed that way. They are not designed to be altruistic.

If people don't like it they should try to replace them with charities.
 
Sure they were. Dude your bias is showing...
First, there's no such thing as unbiased. Everybody is biased, at all times, on everything, no exceptions. However, the difference is whether the facts support your biases or not. They do in my case.
 
I wonder why my system is rock solid and it only has AMD 🤔
I also have one of the affected Samsung NVMe drives but didn't experience the wear issue... there are always exceptions. If you search for issues with AMD software, you will find endless posts, news articles, knowledge-base articles, and all sorts of information about issues with AMD's software. Silicon can vary wildly, even within the literal same SKU. This causes the same software to perform differently on the "same" hardware. The main problem (among many) with AMD's software is that they don't account for this variability well at all. Let's also be clear: are Nvidia and Intel perfect? Far, far, far from it. Are they the "lesser of evils"? Absolutely. I would love a reliable competitor; AMD isn't it, and they never have been. Could that change some day? Absolutely.
 
If you search for issues with AMD software you will find endless posts, news articles, knowledgebase articles, and all sorts of information about issues with AMD's software.

You're talking specifically about AMD's GPU software I assume, not CPUs, even though you said 'all-AMD' systems? There is nothing wrong with their CPUs. I'm on my 2nd system (AM5, up from AM4) and both my sons have AM4 systems: 3600, 5600, 5600X, 7600. No issues whatsoever on any system. All cards are now Nvidia (3060 Ti, 3070, 4070 Ti), but I had a Sapphire 5700 XT after my 1070 and had no issues with it at all - I did buy very late, though, before the crypto nonsense began. Before that I had Intel, also no issues.

If you are suggesting that AMD systems have major issues, you are a drama queen, nothing more. GPUs, yeah, they had driver issues with RDNA2, well-known, and it continues to hurt them in mindshare. Had the 7900 XT been intelligently priced, however, I would have likely bought that instead of the 4070 Ti, but AMD are experts at shooting themselves in the foot.
 
No, I'm referring to their BIOS/UEFI as well. And no, I'm not being a drama queen. Reliability is paramount. Even a 0.5% rate of instability is grossly unacceptable, and AMD is way beyond that. I work in IT and I use what is safe and reliable. That is not now, nor has it ever been, AMD. Google is your friend.

Also, your anecdotal personal issues, or lack thereof, are irrelevant. A prime example? Cyberpunk 2077. I experienced NONE of the issues that others did while playing it. I played it day one and my experience was essentially flawless. Does that make the issues others experienced untrue? Of course not. AMD is not reliable. I knew this back when they bought ATI, which is why I stopped using those cards as soon as they were bought. Every time I dip my toes back into the AMD waters, they get burned, without fail.

I desperately wish AMD was a reliable competitor, I do, but they just aren't and never have been.
 