Intel Battlemage desktop GPU hits Geekbench with 12GB of VRAM, 2,850 MHz boost

Nintendo uses Nvidia in the Switch, with 150+ million units sold. AMD lost the Switch 2 deal because its chips use far more power and because FSR is beaten by DLSS. FSR is generally really bad at low resolutions, where DLSS does fine.

Sony did not use FSR in the PS5 Pro but PSSR, an in-house solution that beats FSR as well. Even XeSS beats FSR. You see, AMD simply doesn't have the R&D funds to really compete.

Even Nvidia doesn't care how many GT 710s it sells. They only look at profitable products. There is no money in the low-end GPU market; people on tight budgets buy second hand.

Anything else?

Nintendo sales do not count toward PC market share. Sony and MS chose superior AMD over low-powered Nvidia. Nintendo only uses slow hardware.

AMD is adding more units in future GPUs; they simply decided it was not worth it yet. Despite XeSS's "awesomeness", Intel somehow didn't even reach 1% market share. AMD was right.

Every GT 710 sold counts toward dGPU market share. So you actually do care. Stop lying.
 
Is Intel going to be the most performant competitor next to Nvidia? AMD falling out of the race is a huge opportunity for Intel to really cement itself in the dGPU area above AMD in performance.
 

FYI, unlike with CPUs, making a high-performance GPU is pretty easy: add moar cores and moar of everything. Challenging Nvidia on the GPU side is easy: make a bigger chip. It's not much harder.

Whether Intel or AMD sees that as profitable remains to be seen.
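For a sense of why "just make a bigger chip" gets expensive fast, here's a back-of-the-envelope sketch using a simple Poisson yield model and assumed wafer numbers (real defect densities and foundry prices are not public, so treat every figure below as illustrative):

```python
# Rough sketch of why bigger dies cost disproportionately more.
# Wafer cost and defect density are assumptions, not real foundry figures.
import math

WAFER_COST = 15000.0      # assumed cost of one 300 mm wafer, USD
WAFER_DIAMETER = 300.0    # mm
DEFECT_DENSITY = 0.1      # assumed defects per cm^2

def dies_per_wafer(die_area_mm2):
    # Classic approximation: usable wafer area minus edge loss.
    d = WAFER_DIAMETER
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2):
    # Poisson model: yield drops exponentially with die area.
    return math.exp(-DEFECT_DENSITY * die_area_mm2 / 100.0)

def cost_per_good_die(die_area_mm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2)
    return WAFER_COST / good_dies

for area in (200, 400, 600):
    print(f"{area} mm^2 die: ~${cost_per_good_die(area):.0f} per good die")
```

With these assumed numbers, tripling the die area multiplies the cost per good die by roughly five, before the bigger board, cooler and memory bus are even counted, so "just make a bigger chip" is a pricing decision as much as an engineering one.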
 
AMD will be fighting the 5060 series soon with RDNA4, so what is the difference here exactly?

AMD has officially left the high-end gaming GPU market. The top RDNA4 SKU will deliver 7900 GRE / 4070-series performance. The focus will be improved RT performance; raster performance is not really going up. It is actually going down versus the Radeon 7900 XTX and 7900 XT... Hopefully efficiency is improved as well.


Intel is coming for AMD's prime market share, which is low to mid-range, and Xe2 looks very good so far. AMD should be worried, considering XeSS also beats FSR with ease. Upscaling is here to stay: it replaces AA in pretty much all new games and is listed in the system requirements of most new games as well.

Nvidia has the high end to itself, and neither AMD nor Intel will change anything about this. They are both years behind.

AMD has been in the GPU market for almost 20 years now; Intel is a new player and can already match AMD's mid-range parts with Battlemage, and its drivers have improved immensely in just a few years. AMD should be worried.

When Intel starts spitting out its own GPUs on 18A and better, it can undercut AMD with ease, because AMD is stuck with TSMC, which has raised prices a lot over the last five years. AMD can't even afford to use TSMC's leading nodes anymore. Apple is always first at TSMC, but now both Intel and Nvidia have priority over AMD too; both are using TSMC N3 long before AMD will.

Let's see if AMD comes crawling to Samsung for cheap chips; performance will suffer, though. I know AMD has been in talks with Samsung before.

You generally don't need top-tier nodes to compete in the low to mid-range, so it would make sense for AMD to go cheaper. Margins will probably be higher then.

Question for you: when did you last USE an AMD card? (1070 for me.) Fanboys are one thing, but being objective is another. I see the pros and cons of both.

Yes, AMD claims it has left the high end. Well, that means Nvidia will be more expensive for everyone, being the largest vendor on Steam... yeah, that hurts the community as a whole.

OK, RDNA4 is supposed to be ~7900-level performance... but at a lower price. Who is losing there? That is a great level of performance for people on older generations. Don't forget your beloved 4090 CAN be beaten in raster by the XTX.

Nvidia has the ray-tracing advantage, I'll give them that. One thing to consider, though, is that it's still an emerging tech, gen 3-ish... also, Nvidia has the first-mover advantage.

AMD is behind, sure, but do you see Nvidia taking on Intel in the CPU space? AMD is much smaller and competing on both fronts. I'd love AMD to be more competitive in the GPU division, else Nvidia will pull an Intel... gotta keep them on their toes.

Also note, all the main consoles are AMD and have been for a while; Nvidia has the Switch 1 and 2... low-power handhelds.

"When Intel starts spitting out own GPUs on 18A and better, they can undercut AMD with ease" Please provide evidence that the Intel GPU will be on 18a. - also note .. if true... 18a will also be supply constrained... as 18a would then be for.. server/desktop/laptop/gpu... yeah... and thats not inc any advanced packaging limitations.

AMD going to Samsung is pretty far-fetched. Unless you have evidence of TSMC's pricing to AMD, this is speculation. TSMC might cost more... but it's still ahead of Samsung... and is any of the 40 series on Samsung? Last I recall, it's all TSMC as well.

"Apple is always 1st at TSMC but now both Intel and Nvidia has priority over AMD too" Please provide evidence. Also note, AMD is juggling their allocation with their desktop AND server market... and yeah... have a look at intels server share.

Going to more advanced nodes also allows less silicon to be used; there are pros and cons to both.
 
The Xe2 GPU in Lunar Lake performs really well and is efficient too.

Xe2 scaled up for desktop will easily be able to compete in the low to mid-range market.
 
Meanwhile, Nvidia 8GB cards perform like AMD 16GB cards in many new games:

https://www.techpowerup.com/review/god-of-war-ragnarok-fps-performance-benchmark/5.html

The 6800 XT at 3070 / 3070 Ti level...
Yeah, VRAM really future-proofs a card, I see :D

However, this Intel 12GB card is not their top SKU; the top Battlemage part will get 16GB.

99.9% of PC gamers don't need 20, 24, or even the 32GB of VRAM the 5090 is rumoured to get...

12-16GB is plenty, especially if you don't care about ray tracing or path tracing, which most don't. RT uses far more VRAM than raster. What's funny here is that AMD can't do RT properly anyway, meaning 20-24GB of VRAM is pure waste.
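As a rough illustration of where that VRAM actually goes, here is a hedged back-of-the-envelope sketch. All the budget numbers are illustrative assumptions, not measurements from any particular game; real usage is dominated by the streaming texture pool and, with RT, the acceleration structures:

```python
# Illustrative VRAM budget sketch; every size below is an assumption.
WIDTH, HEIGHT = 3840, 2160          # 4K render resolution

def mb(num_bytes):
    return num_bytes / (1024 ** 2)

# A deferred G-buffer: assume 5 render targets at 8 bytes/pixel plus 4-byte depth.
gbuffer = WIDTH * HEIGHT * (5 * 8 + 4)

# Assumed streaming texture pool and geometry budget (varies wildly per game).
textures_gb = 6.0
geometry_gb = 1.5

raster_total_gb = mb(gbuffer) / 1024 + textures_gb + geometry_gb

# With RT, add BVH acceleration structures and denoiser history (assumed sizes).
bvh_gb = 1.5
denoiser = WIDTH * HEIGHT * 3 * 8   # assume 3 float16 RGBA history buffers

rt_total_gb = raster_total_gb + bvh_gb + mb(denoiser) / 1024

print(f"raster-only estimate: ~{raster_total_gb:.1f} GB")
print(f"with RT extras:       ~{rt_total_gb:.1f} GB")
```

The exact split varies enormously per engine, but the point stands: it's the RT extras (BVH plus denoiser history) that push a budget past what raster alone needs.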

Going all out on VRAM is pointless if the GPU lacks power to begin with. Just look at the 6700 XT today: beaten by the 3070 in 9 out of 10 games, and the launch price was the same for both cards. People praised the 6700 XT for having 12GB and said it would age like wine. It aged like milk instead, due to lacking GPU power and RT capability (many games have forced RT these days, though it can be reduced).

What matters more for longevity is upscaling, and DLSS beats FSR with ease.

I will take superior upscaling (with built-in AA and sharpening, replacing any other third-party AA solution) over "just VRAM" any day. DLSS Quality looks GREAT at 1440p. FSR at 1440p, not so much...
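To put numbers on that, here is a small sketch of the internal render resolution per quality mode. The per-axis scale factors are the commonly published DLSS 2 / FSR 2 preset ratios, so treat them as approximate; games can and do override them:

```python
# Internal render resolution per upscaling quality mode.
# Scale factors are the commonly cited DLSS 2 / FSR 2 presets (approximate).
SCALE = {
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

def render_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = render_resolution(2560, 1440, mode)
    print(f"1440p {mode:<17} -> {w}x{h} internal")
```

At 1440p output, even Quality mode reconstructs from roughly 960p internally, which is exactly the low-input-resolution regime where FSR tends to struggle most and DLSS holds up better.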

First point: uh-uh. The main reason I chose a 6800 XT over a 3080 (apart from the latter costing a whole £600-1200 more) was the prior knowledge that whatever boost GDDR6X had over GDDR6 on paper or in synthetic benchmarks didn't hold up in reality. That 3080's 10GB was getting ground up while the 6800 XT still had 2-3GB left on the table.
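For context, the on-paper gap being referred to comes straight from the usual bandwidth arithmetic (effective per-pin data rate times bus width). A quick sketch using the launch specs as I recall them, so treat the figures as approximate:

```python
# Peak memory bandwidth = per-pin data rate (Gbit/s) * bus width (bits) / 8.
# Specs below are launch figures as recalled; treat as approximate.
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RTX 3080 (GDDR6X, 320-bit, 19 Gbps)":  (19.0, 320),
    "RX 6800 XT (GDDR6, 256-bit, 16 Gbps)": (16.0, 256),
}

for name, (rate, bus) in cards.items():
    print(f"{name}: {bandwidth_gb_s(rate, bus):.0f} GB/s")

# The 6800 XT also carries 128 MB of Infinity Cache, which is how it narrows
# that on-paper gap in practice.
```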

Everything else you said here was either a fair point or describes an outlier. I only ever needed FSR in one game with that 6800 XT, while the 3080 led the way on DLSS only because it needed it more and sooner, even without RT making an appearance. With RT... well, Nvidia do have their trends. Firstly, AMD is the one of the two competitors still sticking to the old standard (Turing and before) of fairly priced GPUs, with roughly a £250 spread across one card tier from low/base/reference card to premium AIB (where Nvidia run a grand or more for one model). And even after a record-breaking price hike per generation over the last two of them, that RT advantage is only a single generation / ~30 fps on average, and nowhere near lossless like you'd expect when RT alone costs up to a grand more on the card price once raster performance is compared.

And you might be right that nobody needs X or Y amount of VRAM... yes, assuming the rest of the card is up to the job. Unfortunately, having run the VRAM hard on a 6800 XT at 3440x1440 two years ago, the 7900 XT had enough VRAM but not enough uplift, while the 7900 XTX might be overkill on VRAM but has enough uplift, at a third to half the price of any 4090 for only ~21% / 30 fps behind at 4K (it traded blows with the 4080 for less, too, until the 4080 Super launched). Remember that while VRAM and the other card assets are only half the story given the rise of RT and upscaling, a low-VRAM card with perfect RT would be even worse than anything AMD might be guilty of.
 
You are the fanboy spreading misinformation; I could not care less which brand I use. Luckily I use a 4090, meaning AMD doesn't have anything for me and won't for years, since they left the high end.

The reality, though, is that AMD's 16GB cards are beaten by an 8GB card, and there are zero issues with textures or stuttering; those would show up in the minimum-fps testing as usual. That is the whole point of minimum-fps testing ;)

The 3070 mostly had problems in The Last of Us, which was AMD-sponsored. The game had issues on many AMD cards as well, and it was fixed long ago. It was a rushed console port, nothing more, and not a baseline for PC gaming at all.

This is not the first game where 16GB AMD cards are merely close to 8GB Nvidia ones, or even beaten:


The 3070 8GB beats the Radeon 6800 16GB even at 4K/UHD at Ultra settings, minimum fps included.

Sad but true. AMD is lagging behind in more and more new games, due to an inferior architecture and missing features. This is not fanboyism; it is reality.

I was a 6800 XT owner before I got my 4090, so I know exactly what I am talking about. My 6800 XT died on me; it had a 110C hotspot temperature from day one (AMD says that's fine, look at their official forums if you don't believe me) and tons of issues in all kinds of games. FSR was horrible as well.

With my 4090 I am getting 60C load temps, 85C hotspot peaks, silent operation, zero issues in any game, DLSS/DLAA (which is awesome), Frame Generation that actually works, and the option for RT and even path tracing, which AMD can't do at all.

It is not a coincidence that AMD will soon drop below 10% dGPU market share ;)

Again, this one makes me wonder where and how AMD hurt you.
And you're talking about outliers, of which AMD has had a few in its favour too. There's a pattern in that as well: not so much the cards or the differences between them, but certain trends in game studios and development... Those haven't been any good to anybody, whichever side they use, and will only get worse for everyone. Unless you buy the best card of a generation and run it at a lower resolution, you might just about last out a gen without losses you've never seen before...

As for the 6800 XT... that's a you problem then, because again, it's very much an outlier. This ******* had nothing but good times with theirs, though I did buy a good model, did my due diligence, etc. The only time I had the hotspot temp push over 90C was running a very poorly optimised demo of a game that ran the same high edge and hotspot temps whatever preset was used (the game was 40K Battlesector, fwiw).

On to the 3070... the old board assets aside, that 8GB was woeful. Honestly, it needed a bit more given how cut down the card was, and 8GB on any prior gen was getting increasingly hard-pressed in AAAs at 1080p before I even got my 6800 XT. I had a 1440p laptop with a full-power 3070 Ti (equivalent to a desktop 3060 Ti-3070); it should've been 1080p, because that might've held up. But look how easy it is for Nvidia to get it right when they need to for whatever reason... or at least have a crafty strategy... the 4070 launches with 12GB (OK, there was that 4080 16/12GB furore, but honestly, it's like they did that for the attention, because nobody could be that feckless, right...). Then the 4070 Ti Super at 16GB... I had to look twice to check I hadn't misread 'AMD'.

In fact, I had such a good time with that 6800 XT that when the new gen dropped I went around again. Tbh, I did wait to see if Nvidia had made a rounding error, what with scalpers and crypto long gone after 2021, but no... they really saw nothing wrong with prices that finally put the concept of MSRP, or the idea of a fair premium, to bed. Like the 6800 XT, the 7900 XT has all the pros of your 4090 (cool temps, good hotspot, quiet, no issues... in fact all better than the 6800 XT), except of course the RT part and needing DLSS to get back to being a 4090 rather than a lower card, and I still don't absolutely need to use FSR or FG either... unless, in most such cases, I want more frames than my monitor can actually display. Not bad. I could also have bought two or three of them for the price of any 4090 (or 4080 in 2023) and matched its fps by cutting a couple of unnoticeable settings... but that was also the cost of a full upgrade for every other part, so there's that.

Finally, as for AMD leaving the high end: we all know why and how that is. It's because it's been an exercise in futility for them ever since they came back to compete with Nvidia on the important side (moving lots of pixels around a screen fast) for the first time in years (before that, they were barely competing mid-range at best). All they've had is short shrift for doing damn near the same for far less. It's just not worth the R&D, production etc. anymore, never mind that they compete on both fronts like nobody else, and do it with less. I just don't know what's going to happen first: Nvidia being sane on actual value versus price, or Nvidia fans balking at next-gen pricing and maybe downgrading to 1440p because they 'prefer it to 4K'...
 
You can ramble about AMD having more VRAM for hours; it does not change the fact that Nvidia beats AMD while using less VRAM overall, and it is a non-issue for actual gamers. Besides, 99% of PC gamers use 3440x1440 or less, not 4K/UHD or higher.

AMD can't do RT, and RT is where you need a lot of VRAM, path tracing especially.

I laugh when AMD users praise having a lot of VRAM yet can't use RT, which is where you actually need a lot of VRAM.

There is no game at all where you need more than 12GB with raster only at full ultra settings at 4K/UHD. That is the reality. In 9 out of 10 games, even 8GB is plenty.

When you enable RT, Frame Gen, or even PT, then sure, some games might require slightly more than 16GB at 4K, yet most people will be forced to use upscaling with settings like that unless they own a 4090, or soon a 5090 or 5080. AMD has nothing that will run demanding games at 4K with RT or PT.

The 7900 XTX with path tracing is a slideshow at 4K, at 3-4 fps on average.
 