Apple's M1 Ultra is a performance beast, but definitely not an RTX 3090 killer

Claiming that Apple didn't beat the comparison hardware, when it did beat that hardware in performance per watt, is an overstatement.

A lot of that difference could be down to the 100-watt power envelope of the M1 Ultra GPU and the way it shares memory bandwidth with the CPU. For reference, the RTX 3090 alone has a TGP of 320 watts and the Core i9-12900K can add over 241 watts on top of that.

…an Nvidia RTX 3090 GPU and obtained a Geekbench 5 Compute score of over 215,000 points. By comparison, the M1 Ultra in the Mac Studio was only able to muster … 102,156 points when using Metal.

Getting 50% of the score at 1/3 of the power consumption is a win.
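
As a rough sanity check on that ratio, here's a minimal sketch using the Geekbench 5 Compute scores quoted above together with the nominal power figures from the article (the ~100 W M1 Ultra GPU envelope and the 3090's 320 W TGP are ratings, not measured draw during the benchmark, so treat this as a ballpark only):

```python
# Rough performance-per-watt comparison from the figures quoted above.
# Power numbers are nominal envelopes, not measured benchmark draw.

m1_ultra_score, m1_ultra_watts = 102_156, 100   # Geekbench 5 Compute (Metal), ~100 W GPU envelope
rtx_3090_score, rtx_3090_watts = 215_000, 320   # Geekbench 5 Compute, 320 W TGP

print(f"Score ratio: {m1_ultra_score / rtx_3090_score:.0%} of the 3090")        # ~48%
print(f"Power ratio: {m1_ultra_watts / rtx_3090_watts:.0%} of the 3090's TGP")  # ~31%
print(f"Points per watt: M1 Ultra ~{m1_ultra_score / m1_ultra_watts:.0f}, "
      f"RTX 3090 ~{rtx_3090_score / rtx_3090_watts:.0f}")                       # ~1022 vs ~672
```

On those numbers the M1 Ultra lands at roughly 1.5x the points per watt, which is what the "50% of the score at 1/3 of the power" shorthand is pointing at.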

The other question is the comprehensiveness of the benchmarks. One review I saw showed Apple's M-series chip (don't recall which one) crushing the competition in some benchmarks and falling short in others. If the reviewer had only picked three benchmarks the verdict could have been so overstated as to be worthless.
 
Getting 50% of the score at 1/3 of the power consumption is a win.

It's a win if you have a battery in the mix, but in the workstation/desktop segment that's not the point; the central matter is performance per buck. If you pay over 5k bucks for a workstation (one that doesn't run games), you will want raw performance, period. (Remember, for a nice family desktop you have the iMac and the Mini, not the Studio.)

If Apple had what it takes to blow everything out of the water on raw performance numbers, they would have done it already by raising the M1's frequency. But because of yields and the architecture, the cost budget went through the roof (no one has the money to mass-produce a 5 nm chip that huge, and pushing serious frequency on top of it is just going too far), so they can't afford it right now. That's also why there isn't a Mac Pro yet: even if the Studio beats the current Pro, Apple needs the Mac Pro to beat everything out there, everything; if it doesn't, it could be a failure, a very expensive one.
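
Since performance per buck is the yardstick being argued here, a minimal sketch of what that comparison would look like; the system prices below are placeholder assumptions (only the "over 5k" figure comes from the post above), so swap in real configured prices before reading anything into the output:

```python
# Hypothetical performance-per-dollar comparison. Scores are the Geekbench 5
# Compute numbers quoted earlier in the thread; both prices are PLACEHOLDER
# assumptions for illustration and should be replaced with real quotes.

systems = {
    "Mac Studio (M1 Ultra)":    {"score": 102_156, "price_usd": 5_000},  # assumed price
    "RTX 3090 + i9-12900K box": {"score": 215_000, "price_usd": 4_500},  # assumed price
}

for name, s in systems.items():
    print(f"{name}: {s['score'] / s['price_usd']:.1f} Geekbench points per dollar")
```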
 
It's a win if you have a battery in the mix, but in the workstation/desktop segment that's not the point; the central matter is performance per buck.
I would normally say yes, however... even you have to admit that the power consumption of many PC hardware components is approaching hideous levels. Yes, it may be a desktop or workstation, but if you need to rewire your house (as some people suggest you might have to do for the coming RTX 4090), it's beyond stupid.

Hell, I remember the jokes people often made about Intel chips. Yep, they're powerful; no doubt. But will you need your own personal nuclear power station to run one?
 
I don't know why there's hype over this M1 Ultra chip compared to Intel's; the darn chip is four times the size of Intel's. What if Intel were to join 4 chips together?
 
I would normally say yes, however... even you have to admit that the power consumption of many PC hardware components is approaching hideous levels. Yes, it may be a desktop or workstation, but if you need to rewire your house (as some people suggest you might have to do for the coming RTX 4090), it's beyond stupid.

Hell, I remember the jokes people often made about Intel chips. Yep, they're powerful; no doubt. But will you need your own personal nuclear power station to run one?

On desktops (and especially on workstations) we've been using dual sockets for years, and more than one GPU for at least 10 years. High power consumption in this segment is quite normal; if you can do your job more quickly, you get more $$ in return. It's that simple.

Some folks who use this kind of system would, if they could, put in two 64-core 280 W Threadrippers without hesitation. Just go and look at Puget Systems and you'll get some idea.
 
I would normally say yes, however... even you have to admit that the power consumption of many PC hardware components is approaching hideous levels. Yes, it may be a desktop or workstation, but if you need to rewire your house (as some people suggest you might have to do for the coming RTX 4090), it's beyond stupid.

Hell, I remember the jokes people often made about Intel chips. Yep, they're powerful; no doubt. But will you need your own personal nuclear power station to run one?
Our wall sockets take 2400 W devices like heaters. It's common to have 3 or 4 such heaters in a house. PCs aren't really high-power devices by kitchen or heating standards.
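
To put the heater comparison in numbers, here's a quick sketch that adds up the rated figures mentioned earlier in the thread against a single 2400 W outlet; the "rest of system" allowance is an assumption, and rated TGP/turbo power isn't the same as sustained wall draw:

```python
# Ballpark: how much of a 2400 W wall socket does a high-end gaming PC need?
# Uses rated figures from the thread (RTX 3090 TGP 320 W, i9-12900K 241 W)
# plus an assumed allowance for the rest of the system.

gpu_w, cpu_w = 320, 241
rest_of_system_w = 150              # assumption: board, RAM, storage, fans, PSU losses
total_w = gpu_w + cpu_w + rest_of_system_w

socket_w = 2400                     # the heater-class outlet figure from the post above
print(f"Estimated peak draw: ~{total_w} W, "
      f"about {total_w / socket_w:.0%} of a {socket_w} W outlet")  # ~711 W, ~30%
```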
 
It's luxury. That doesn't make the card not for gaming. Like a car can go faster than anyone needs it to.
No. It’s not luxury. 24 GB are useful for rendering and video editing to some extent. People buying a 3090 for gaming either don’t have enough competence or have too much money … or a combination of the above.
 
No. It’s not luxury. 24 GB are useful for rendering and video editing to some extent.
rendering and video editing to some extent.
to some extent.
So, if it was a video editing card, why would I buy it if it's only useful to an extent? You just admitted it's not for those things. It could be used for editing like any card, of course, but it's made primarily for gaming and it's marketed as such.

Also, 24GB might be too much today but it might not be a few years down the line.

People buying a 3090 for gaming either don’t have enough competence or have too much money … or a combination of the above
That's a different discussion entirely.
 
So, if it was a video editing card, why would I buy it if it's only useful to an extent? You just admitted it's not for those things. It could be used for editing like any card, of course, but it's made primarily for gaming and it's marketed as such.
The extent is related to the software you are using, not the purpose.
Some software doesn't really use the graphics card properly.

Also, 24GB might be too much today but it might not be a few years down the line.
No, it's not.
24 GB could be useful for resolutions above 8K. When (and IF, I would add) such a resolution is widely available, the 3090's performance won't be enough anyway.
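
For a sense of scale on the 8K point, here's a back-of-the-envelope framebuffer calculation; it deliberately ignores textures, geometry and extra render targets (which are what actually fill VRAM in practice), so it only shows that raw resolution by itself is nowhere near 24 GB:

```python
# Back-of-the-envelope: raw framebuffer size at 8K versus 24 GB of VRAM.
# Assumes 4 bytes per pixel (RGBA8); real games add many more buffers and assets.

width, height, bytes_per_pixel = 7680, 4320, 4
frame_mb = width * height * bytes_per_pixel / 1024**2

print(f"One 8K RGBA8 frame: ~{frame_mb:.0f} MB")                    # ~127 MB
print(f"Triple buffered:    ~{3 * frame_mb:.0f} MB of 24,576 MB")   # ~380 MB
```

The gap between that and 24 GB gets filled by texture resolution, asset streaming and render-target count, which tend to scale up alongside display resolution.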

That's a different discussion entirely.
It is the main point of the discussion, since you are confusing marketing with design.
The 3090 was designed to replace the TITAN class of graphics cards.
 
The extent is related to the software you are using, not the purpose.
Some software doesn't really use the graphics card properly.


No, it's not.
24 GB could be useful for resolutions above 8K. When (and IF, I would add) such a resolution is widely available, the 3090's performance won't be enough anyway.


It is the main point of the discussion, since you are confusing marketing with design.
The 3090 was designed to replace the TITAN class of graphics cards.
Suuure, if that's the main point of the discussion then I agree on that point, but the main purpose of the card is gaming, and since it's the most powerful card in the series it's likely to have a load of VRAM as well.
 