Intel expected to unveil faster Battlemage GPUs by the holiday season

Daniel Sims

Staff
Rumor mill: Intel has remained tight-lipped about its plans for next-generation Battlemage graphics cards so far, but it's certainly hoping to avoid the multiple delays that plagued the launch of the Alchemist lineup in 2022. Recent leaks and reports indicate that the company wants to begin shipments before the end of this year.

Sources at the 2024 Embedded World Conference told ComputerBase that Intel hopes to introduce its second-generation dedicated graphics cards – codenamed Battlemage – before Black Friday this year. The new GPUs could launch around November, provided the company meets its timetable.

A leaked internal roadmap and a public presentation slide from last year indicated that Intel intended to release Battlemage sometime in 2024. The report from Embedded World suggests that Chipzilla wants to beat the holiday shopping season, during which it'll likely compete against new graphics cards from Nvidia and AMD.

Still, Intel has to prove it can maintain its roadmap, which it missed when launching the Alchemist series in 2022. The company planned to reach wide availability early that year. Instead, it only managed a slow rollout in China before shipping the Arc A750 and A770 to other countries in October.

The delays put Alchemist behind the performance curve compared to its contemporaries – budget and mainstream models from Nvidia's RTX 4000 series and AMD's Radeon RX 7000 lineup. Inefficient drivers have also been a chronic problem for Intel, but numerous software updates have dramatically improved DirectX 9 and DirectX 11 performance on Arc cards. Significant improvements for XeSS and PresentMon also show that Intel remains committed to dedicated GPUs despite the setbacks.

Exact details on individual Battlemage cards are scarce, but it's possible that the new generation will feature an enthusiast-level GPU, unlike Alchemist, which only included mainstream and entry-level products. Based on the Xe2 microarchitecture, Intel's next-generation GPUs should arrive with improvements in rasterization, ray tracing, and AI upscaling.

Graphics card vendors expect Nvidia to unveil enthusiast and high-end RTX 5000 products in the fourth quarter of this year. The upcoming GeForce RTX 5080 could handle ray tracing on par with the current flagship, the RTX 4090. Meanwhile, that card's successor – the 5090 – is expected to provide a 70-percent performance boost and a 512-bit memory bus. Both cards and possibly the RTX 5070 will reportedly include GDDR7 VRAM.
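For a rough sense of what a 512-bit bus paired with GDDR7 could mean, here is the standard theoretical-bandwidth calculation as a short Python sketch. The 28 Gbps per-pin data rate is purely an illustrative assumption; actual memory speeds for these unannounced cards are not known.

# Theoretical memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# The 28 Gbps GDDR7 figure below is an assumption for illustration, not a confirmed spec.
def theoretical_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(theoretical_bandwidth_gbps(512, 28.0))  # rumored RTX 5090 bus width -> 1792.0 GB/s
print(theoretical_bandwidth_gbps(384, 21.0))  # RTX 4090 (384-bit, 21 Gbps GDDR6X) -> 1008.0 GB/s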

Meanwhile, AMD should launch its RDNA 4 series sometime this year, but it will only include mid-range and mainstream hardware. Recent reports indicate that a mid-range Radeon RX 8000 card might outperform the RTX 4070 Ti at a much lower price.


 
Being inefficient in CPUs will most probably carry over to their GPUs. Still, more options are better for the consumer and, in the long run, mean more competition and better prices. FG/upscaling techniques are reaching a high point and there isn't much more to squeeze out of them; I believe the next step will be improving path tracing speed. Intel and AMD currently suck at it (everyone sucks at it, but Nvidia at least manages some playable results).

As far as I've seen, path tracing is the only meaningful way forward, the only thing that really changes image quality enough to be worth it. Next gen will be AMD being as fast at RT as current Nvidia chips, and Intel being as fast as current AMD.
 
RT is such a gimmick; it adds little to image quality, and it's only usable at high-end graphics settings that few people can afford. Is it worth so much R&D investment for a marketing stunt?
I'd much rather have better upscaling techniques and raw performance than a lighting trick that cripples the GPU.
4K over 60 fps is only going to be achieved with upscaling techniques, better software, and better hardware implementations (quick math on that below).
There is still a long road ahead for rasterization GPUs that fully take advantage of AI processing to optimize.
The other thing I'm concerned about is perf/watt. Nvidia's 40 series is really power efficient, and I'm looking forward to seeing how AMD and Intel can beat that. It's not evolution if you just ramp up power consumption to get better performance.
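To put a number on the 4K-over-60-fps point, here's a small Python sketch comparing how many pixels per second have to be shaded natively at 4K versus rendering internally at 1440p and upscaling. The 1440p internal resolution is just an assumed example of what an upscaler might use.

# Pixels shaded per second = width * height * frames per second.
# Upscalers (DLSS/FSR/XeSS) cut this by rendering at a lower internal resolution;
# 1440p internal for a 4K output is an assumed example, not a specific quality mode.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

native_4k = pixels_per_second(3840, 2160, 60)       # 497,664,000
internal_1440p = pixels_per_second(2560, 1440, 60)  # 221,184,000
print(f"Native 4K60 shades {native_4k / internal_1440p:.2f}x as many pixels as 1440p-internal upscaling")  # 2.25x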
 
Calling RT a gimmick, then putting all your eggs in the AI processing basket. LMFAO, dude.

RT isn't a gimmick. It's here to stay. IDK what your point even is. A "better hardware implementation" would include better path tracing.
 
Being inefficient in CPUs will most probably carry over to their GPUs.
Then why does AMD have the most efficient CPUs but the least efficient GPUs? You're making a big jump with that assumption...
RT is such a gimmick; it adds little to image quality, and it's only usable at high-end graphics settings that few people can afford. Is it worth so much R&D investment for a marketing stunt?
Sorry, just to confirm, you then immediately go with:
I'd much rather have better upscaling techniques and raw performance than a lighting trick that cripples the GPU.
What a completely messed-up line of thinking that is. I don't want my graphics to get better, I don't want easier development tools bringing the cost of games down, and I certainly don't want them to take less time to develop; I just want my fake frames and fake pixels. Nothing to actually get better, just more of them...
 
Calling RT a gimmick, then putting all your eggs in the AI processing basket. LMFAO, dude.

RT isn't a gimmick. It's here to stay. IDK what your point even is. A "better hardware implementation" would include better path tracing.
RT is here to stay as long as people believe it's worth paying $2k for a capable GPU. There are more efficient ways to approximate RT in rasterization (pre-baked lighting), but Nvidia introduced the "on-the-fly" RT-capable GPU model...
Maybe path tracing is an effective model, but it still adds nothing to gameplay, only to visuals, and for me that benefits just a few titles.
By your take on it, I keep wondering whether Counter-Strike 2, Fortnite, Call of Duty, or World of Warcraft would benefit from RT...
That's my approach... nothing against RT fanboys.
 
What a completely messed-up line of thinking that is. I don't want my graphics to get better, I don't want easier development tools bringing the cost of games down, and I certainly don't want them to take less time to develop; I just want my fake frames and fake pixels. Nothing to actually get better, just more of them...
The thing is, GPUs have limitations. Notice how big GPUs are nowadays, even with crazy small nodes. There will come a point where everything needs to be virtually upscaled/resized.
But yeah, game dev nowadays has taken a weird path of lacking core optimizations.
 
RT is such a gimmick; it adds little to image quality.

I remember when this same thing was said about DX7 and Shader Model 3 doing away with fixed-function pipelines, and only certain GPUs supporting it at the time...

I, for one, quite enjoy light acting like light should in games. It is a subtle effect that does quite a lot to trick your brain into immersion, more than texture quality/fidelity IMO.
 
There is no problem with RT functionality; the problem is how it's being used as a selling point to keep GPUs in such an overpriced state, and how it brings any GPU below $800 to its knees... "Want better lighting at a playable frame rate? Go buy a $2k GPU."
 
The thing is, GPUs have limitations. Notice how big GPUs are nowadays, even with crazy small nodes. There will come a point where everything needs to be virtually upscaled/resized.
But yeah, game dev nowadays has taken a weird path of lacking core optimizations.
I'll just do a really quick Google...
Yep, I'll just leave this here...
[attached chart]

You didn't even do a quick Google to check whether what you're saying is true?
 
You probably didn't understand what I meant by GPU sizes. It's not about the die size; it's the overall package needed to sustain the power consumption that delivers that performance. GPUs have been scaling up power draw for years, even with smaller nodes that provide "power efficiency," but that doesn't mean the GPU is smaller as a package: VRMs, memory chips, controllers, and so on. This was also discussed by an AMD engineer (on GamersNexus) regarding monolithic versus chiplet size limits. The monolithic era is on its way out, replaced with chiplets to get more performance per "die," but even that has limitations.
Even if the PCB is smaller because of smaller nodes, the power draw is massive and needs gigantic coolers... If they keep scaling up power draw, the cooling system needs to get bigger... I can imagine one day buying a mid-tower RTX 9090 with a 2000 W PSU next to my micro-ATX CPU...

 
You probably didn't understand what I meant by GPU sizes. It's not about the die size; it's the overall package needed to sustain the power consumption that delivers that performance.
Oh right, yeah sorry, I thought you meant die size.
GPUs have been scaling up power draw for years, even with smaller nodes that provide "power efficiency," but that doesn't mean the GPU is smaller as a package: VRMs, memory chips, controllers, and so on. This was also discussed by an AMD engineer (on GamersNexus) regarding monolithic versus chiplet size limits. The monolithic era is on its way out, replaced with chiplets to get more performance per "die," but even that has limitations.
The 4090 is one of the smaller GPUs I've owned. The cooler might be massive, but the physical space taken up by VRMs, memory chips, controllers, and so on is almost hilariously small. I have a custom water loop, and with a water block on it, the card looks almost cute.

We also need to remember that GPU power usage has been going up since they were invented. I bet someone 10 years ago said, "No way can GPUs use any more power; the limit must be soon."

TechSpot has done an article on this:
[attached chart from the TechSpot article]

The question is, why are you only worried about power consumption now, and not 10 years ago?
 
A simple Google search turns up 47 titles with ray tracing to date. If ray tracing were a gimmick, AMD would be gaining market share rather than losing it to historically overpriced Nvidia products, but unfortunately it lost share. Ray tracing may be a gimmick to some, but it seems it's here to stay. There are still titles like Forbidden West that look great and run amazingly with rasterization, but they are becoming rarer every year.
My personal favorite RT feature is global illumination; it makes the whole atmosphere pop, especially with HDR. Rasterized ambient occlusion doesn't come close in image quality, IMO.
Circling back to the topic of Intel's Battlemage: Intel's software, namely XeSS and its drivers, is currently improving at a faster rate than AMD's, so there is hope for them.
AMD seems to be fortifying its position in the midrange with RDNA 4 and the (rumored) PS5 Pro, and playing a stagnation game with Nvidia. This could benefit AMD long term: if Nvidia inflates prices with Blackwell, AMD can conveniently fill the gap with RDNA 5.
The GPU market desperately needs Intel's Battlemage to be successful, for Intel's sake and ours.
 
The question is, why are you only worried about power consumption now, and not 10 years ago?
Well, when we have a GPU that literally melts cables because of 600 W power draw, and a new spec has been released to deliver even more power to the GPU... it's concerning.
The 4090 is at the top of the food chain in consumer-grade GPUs, and it's great, no question, but it's a mini nuclear power plant at home every time you play a game. Energy is not getting cheaper...
In that context, for me the evolution of GPUs means the same or better performance than last gen with less power consumption.

For example, a 3080 tops out at 350 W while a 4070 Ti tops out at 285 W; that is an evolution that outshines the previous generation in a performance-per-watt analysis (quick math below). Most people don't seem to care enough, but for me it's an important factor when choosing a GPU.

And, from my experience with older GPUs, anything that draws too much power without proper cooling will most likely fail in the long run.

If boosting RT performance means more power draw just because it sells, IMHO that's bad.
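Here's a quick back-of-the-envelope sketch of that perf-per-watt comparison in Python. The power figures are the ones quoted above; the relative-performance number for the 4070 Ti is an assumption purely for illustration, not a benchmark result.

# Performance per watt = relative performance / board power.
# 350 W and 285 W come from the comment above; the assumption that a 4070 Ti is
# roughly 10% faster than a 3080 is illustrative only, not measured data.
def perf_per_watt(relative_perf: float, board_power_w: float) -> float:
    return relative_perf / board_power_w

rtx_3080 = perf_per_watt(1.00, 350)     # baseline
rtx_4070_ti = perf_per_watt(1.10, 285)  # assumed ~10% faster
print(f"4070 Ti delivers {rtx_4070_ti / rtx_3080:.2f}x the 3080's performance per watt")  # ~1.35x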
 
A simple Google search turns up 47 titles with ray tracing to date. If ray tracing were a gimmick, AMD would be gaining market share rather than losing it to historically overpriced Nvidia products, but unfortunately it lost share. Ray tracing may be a gimmick to some, but it seems it's here to stay. There are still titles like Forbidden West that look great and run amazingly with rasterization, but they are becoming rarer every year.
My personal favorite RT feature is global illumination; it makes the whole atmosphere pop, especially with HDR. Rasterized ambient occlusion doesn't come close in image quality, IMO.
Circling back to the topic of Intel's Battlemage: Intel's software, namely XeSS and its drivers, is currently improving at a faster rate than AMD's, so there is hope for them.
AMD seems to be fortifying its position in the midrange with RDNA 4 and the (rumored) PS5 Pro, and playing a stagnation game with Nvidia. This could benefit AMD long term: if Nvidia inflates prices with Blackwell, AMD can conveniently fill the gap with RDNA 5.
The GPU market desperately needs Intel's Battlemage to be successful, for Intel's sake and ours.
47 games over the span of 4 years isn't impressive, especially considering only the high-tier GPUs make RT worthwhile. The fact that you have to buy the higher-tier products to enjoy a feature actually makes it kind of gimmicky... duh. Also, how many of those titles are actually good enough to be forever memorable? Five? The reason AMD isn't gaining market share isn't that ray tracing isn't a gimmick; it's that AMD still sucks.

FSR is way worse. Their Anti-Lag gets you banned and is worse than Nvidia's. Their RT performance is, and always will be, worse than Nvidia's. The in-game color adjustments you can make are trash compared to Nvidia's. They can't implement a proper "use the 3D application setting" V-Sync option, Enhanced Sync sucks, and most of the features in the control panel either don't work or apply only to DirectX 9 games. There hasn't been a "prefer maximum performance" option in the AMD control panel for probably a decade now, so I get to look at a 5 MHz GPU clock on the desktop instead of getting the snappiest performance possible. Their video super resolution stuff sucks, their "frame gen for every game" is buggy at best, Wattman always has issues for someone, and creating a custom resolution is worse on AMD than on Nvidia (at least Nvidia lets you change your resolution in their control panel, lol). You also can't install just the driver on AMD while keeping a control panel that lets you adjust individual game settings, or any settings at all: you either go minimal, which gives you only global 3D settings rather than per-game ones, or you install the full package, which includes Wattman, which honestly nobody wants. I'm sure I could think of more just off the top of my head as a current 6650 XT owner who has also owned a 2070 and sold a 3070 when I needed pandemic money.

AMD just sucks in the graphics department. What they have going for them is lower prices (they should be even lower, lol) and the fact that their drivers usually install now; they still crash everywhere, though. Look it up online.
 
There is no problem with RT functionality; the problem is how it's being used as a selling point to keep GPUs in such an overpriced state, and how it brings any GPU below $800 to its knees... "Want better lighting at a playable frame rate? Go buy a $2k GPU."

My 3080 handles 2077 PT quite well at 1440p output, upscaled from a custom internal resolution of 1422x800 (quick math on that below). It looks fine and generally hangs around 60+ fps in Night City, dropping to the mid-40s in Dogtown, which is within the VRR range of my monitor.

Mostly my point is that options that don't cost $1k-2k do exist right now and provide playable experiences below 4K at acceptable settings, but yes, I understand your hyperbole and was being a bit snarky.
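For reference, here is the math on that custom internal resolution as a short Python sketch; the 2560x1440 output is an assumption based on the 1440p mentioned above.

# Compare the custom internal render resolution (1422x800, from the comment above)
# against an assumed 2560x1440 output to see how aggressive the upscale is.
internal_w, internal_h = 1422, 800
output_w, output_h = 2560, 1440  # assumed 16:9 1440p output

pixel_ratio = (output_w * output_h) / (internal_w * internal_h)
print(f"Upscale ratio: {pixel_ratio:.2f}x the pixels (~{output_w / internal_w:.1f}x per axis)")  # 3.24x, ~1.8x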
 
Well, when we have a GPU that literally melts cables because of 600 W power draw, and a new spec has been released to deliver even more power to the GPU... it's concerning.
The 4090 is at the top of the food chain in consumer-grade GPUs, and it's great, no question, but it's a mini nuclear power plant at home every time you play a game. Energy is not getting cheaper...
The 4090 is the most efficient GPU I've ever owned; some games take less than half the power my old 1080 Ti needed.

The only games that actually make it sweat and pull all of its power budget (I have mine overclocked to 3 GHz) are games like Cyberpunk. In most other games, it produces even more frames than my 1080 Ti while using less power.

With how efficient the GPU is, if you're worried about power, why not just limit its power draw (rough sketch below)? It would still be the most efficient GPU, and you could pick your own power usage.
In that context, for me the evolution of GPUs means the same or better performance than last gen with less power consumption.

For example, a 3080 tops out at 350 W while a 4070 Ti tops out at 285 W; that is an evolution that outshines the previous generation in a performance-per-watt analysis. Most people don't seem to care enough, but for me it's an important factor when choosing a GPU.
No doubt, it's great to see efficiency going up and up. The problem is, people aren't going to buy a 4070 Ti when they have a 3080 purely for lower power draw; they want more performance.

Again though, the 40 series are the most efficient GPUs ever made. Have you bought a 40 series card and set its power draw lower to meet your demands?
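On the "just limit its power draw" point, here is a minimal sketch of how that can be done on Nvidia cards, assuming the stock nvidia-smi tool is on the PATH (changing the limit typically needs administrator rights, and 300 W is just an example value):

# Minimal sketch: cap an Nvidia GPU's board power via the nvidia-smi CLI.
# Assumes nvidia-smi is installed and the process has admin/root privileges.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Set the board power limit (in watts) for the given GPU index."""
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

set_power_limit(300)  # example value; must stay within the card's supported range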
 

The thing is, that's a $2k GPU with a massive size and enormous power draw (performance aside), and it's concerning if Nvidia follows that trend for upcoming GPUs.

Imagine a 5060 that needs a 1000 W PSU and costs $500; is that budget tier?
It's about the trend of pumping ever more power into GPUs for more performance, and of course the money it costs.

It gets concerning when your only options are mid-tier GPUs with a $1k price tag. A few years ago that price tag bought a high-end GPU. The scaling of performance per watt and of prices is reaching the point where you have to ask how much it's really worth... because of RT marketing? Really?

Nvidia and Intel have always had a reputation for power efficiency and price, and that earned them market share. But there is a limit to a "fair price tag"...

I'm still holding on to my 2070, because the most logical upgrade would be a 4070, double the performance within the same power draw, but the selling price is prohibitive where I live.
 