Intel Core Ultra 9 285K Review: Arrow Lake is a Mess

I'm not ignoring the 5800X3D and the 7800X3D, but those are mega-slow CPUs on every other task; they are not meant to compete with the 285K. The 9950X is. Nobody looking to get a 9950X is remotely interested in the 5800X3D, for obvious reasons.
Then why do you keep referring to the 285K’s power efficiency in games? Why are you cherry-picking specifically its efficiency in games and not its efficiency in other workloads?
 
Most of the review is fine, until they start throwing "flop" around. That is where their professionalism left the building. Intel has fallen on some hard times, just as AMD did in years past, but there is no need for such obvious fanboyism in published articles, especially when the Arrow Lake versus 14900K comparison gets treated so much more harshly than the AMD 7000 versus 9000 comparison did.
But it is a massive flop. That's the reality.
 
Benchmarking a new flagship processor only at 1080p is a farce. Since this seems to be TechSpot's new standard now, it's time to go back to scanning multiple websites and painstakingly piecing together some real-world-relevant results.
 
Then why do you keep referring to the 285K’s power efficiency in games? Why are you cherry-picking specifically its efficiency in games and not its efficiency in other workloads?
First of all, when I'm using averages across 15 games, you can't say I'm cherry-picking, by definition. I'm doing exactly the opposite. Cherry-picking would be using only one game. I'm not. You just don't like the results, so you call them cherry-picked.

If you actually read my post, you would see I wasn't just using gaming. The 285K is competitive or winning in efficiency across the board compared to the 9950X, at least according to the TPU, Tom's Hardware, Igor's Lab, ComputerBase.de, and Club386 reviews. I'm mentioning these because they monitor power draw across a lot of applications, not just one.
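The averaging point can be made concrete. A minimal sketch, with made-up frames-per-watt numbers (not figures from any of those reviews), showing how quoting one favorable game differs from averaging across many:

```python
from math import prod

# Hypothetical frames-per-watt figures across six games (NOT real review data).
chip_a = [1.9, 2.1, 2.4, 1.8, 2.0, 2.2]  # one CPU
chip_b = [1.7, 2.0, 3.1, 1.6, 1.9, 1.8]  # another CPU, with one outlier win

def geomean(xs):
    # Geometric mean: the usual way to average ratio-like metrics
    # (fps, frames-per-watt) so no single title dominates the result.
    return prod(xs) ** (1 / len(xs))

# Cherry-picking: quote only game index 2, where chip_b looks great.
print(chip_b[2] > chip_a[2])               # True  - the single-game story
# Averaging: across all six games, chip_a is actually ahead.
print(geomean(chip_a) > geomean(chip_b))   # True  - the aggregate story
```

With one outlier title, either chip can be made to "win"; the average over the full suite is the opposite of cherry-picking.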
 
Benchmarking a new flagship processor only at 1080p is a farce. Since this seems to be TechSpot's new standard now, it's time to go back to scanning multiple websites and painstakingly piecing together some real-world-relevant results.
Why? Higher resolutions are pointless for a CPU review. What will higher resolutions that are GPU-bottlenecked tell you about the chip?
 
Regarding this review: even as someone who has an entirely AMD-based PC at this point, I actually don't see this as a total loss for Intel. At the very least, I think this is a better showing than they've had for the last couple of generations; they've managed to put performance in the new chips that more-or-less matches the previous gen on average, while significantly reducing power consumption. While I don't think this generation will be worth buying, it makes me hopeful that they could at least be laying a good foundation here that could ramp up rapidly in the next couple of gens.
Most of the efficiency gains come from the fact that Intel is now using a better node* than AMD. Still, Intel is far behind. I'd really like to know what this "good foundation" is. Basically, if Intel were using the same node that AMD does, we would not be talking about any good foundation here.

*CPU tile.

AMD's Zen 5 is much faster than Zen 4 if developers would just spend a few seconds adding support for AVX-512. That gives mid to high double-digit performance gains in many cases and makes Zen 5 future-proof. Arrow Lake? There is no AVX-512 support on Arrow Lake. Therefore AMD maintains the lead on future-proofing AND power consumption, and good AVX-512 support does not come free.
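A rough back-of-the-envelope on why the wider vectors matter (idealized throughput only; real AVX-512 gains also come from new instructions, masking, and more registers, and actual speedups are workload-dependent):

```python
# Idealized fp32 lanes processed per vector instruction for each ISA width.
lanes = {
    "SSE (128-bit)": 128 // 32,      # 4 floats per instruction
    "AVX2 (256-bit)": 256 // 32,     # 8 floats per instruction
    "AVX-512 (512-bit)": 512 // 32,  # 16 floats per instruction
}

for name, n in lanes.items():
    print(f"{name}: {n} fp32 elements per instruction")

# Best-case throughput ratio of AVX-512 over AVX2 on pure fp32 math:
print(lanes["AVX-512 (512-bit)"] / lanes["AVX2 (256-bit)"])  # 2.0
```

Compilers only emit AVX-512 when told to target it (e.g. GCC/Clang flags such as `-mavx512f` or an appropriate `-march=`), which is roughly the "few seconds" of developer work being referred to.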

There is nothing good about Arrow Lake; AMD is just too far ahead.
 
First of all, when I'm using averages across 15 games, you can't say I'm cherry-picking, by definition. I'm doing exactly the opposite. Cherry-picking would be using only one game. I'm not. You just don't like the results, so you call them cherry-picked.

If you actually read my post, you would see I wasn't just using gaming. The 285K is competitive or winning in efficiency across the board compared to the 9950X, at least according to the TPU, Tom's Hardware, Igor's Lab, ComputerBase.de, and Club386 reviews. I'm mentioning these because they monitor power draw across a lot of applications, not just one.
So I thought I’d bite: TPU's review of the 285K gives no recommendation, while TPU's review of the 9950X says highly recommended but expensive.

Before I go and check out the rest, are they all the same? Do all of them not recommend the 285K?

Also, what do you mean I don’t like the results? I WISH Intel would become competitive again, I’m bored of prices constantly spiralling over the last 5-6 years. All they needed to do was sell these at a much reduced price (say $100 less or more) and they’d actually become far more palatable.
 
So I thought I’d bite: TPU's review of the 285K gives no recommendation, while TPU's review of the 9950X says highly recommended but expensive.

Before I go and check out the rest, are they all the same? Do all of them not recommend the 285K?

Also, what do you mean I don’t like the results? I WISH Intel would become competitive again, I’m bored of prices constantly spiralling over the last 5-6 years. All they needed to do was sell these at a much reduced price (say $100 less or more) and they’d actually become far more palatable.
The CPU market has never been as competitive as it is now. It was only 7 years ago that Intel was selling the i7-7700K, a quad-core CPU, for £350/$400. For the same price now you're literally getting 12 or more cores, or 8 cores with a **** ton of cache for gaming.
 
So I thought I’d bite: TPU's review of the 285K gives no recommendation, while TPU's review of the 9950X says highly recommended but expensive.

Before I go and check out the rest, are they all the same? Do all of them not recommend the 285K?

Also, what do you mean I don’t like the results? I WISH Intel would become competitive again, I’m bored of prices constantly spiralling over the last 5-6 years. All they needed to do was sell these at a much reduced price (say $100 less or more) and they’d actually become far more palatable.
What do I mean, you don't like the results? Exactly what it sounds like. Instead of verifying the data, you're bothered with whether or not they're recommended. Bro, whatever, it's a lost cause. The 285K is terrible; let's move on.
 
Only TechSpot? Please can you link me to a CPU review out there done at higher resolutions?
By the way, I didn’t say it's only TechSpot. It's just as stupid when Tom's or whoever else does it.
 
Why? Higher resolutions are pointless for a CPU review. What will higher resolutions that are GPU-bottlenecked tell you about the chip?
I have explained it numerous times in these comments, there's no point doing it again. If you don't get it, you don't get it.

Although, I'm not even sure it needs explaining that people might want to use this processor for gaming, that at this price point they will probably want a bit more than 1080p, and that higher resolutions are not always GPU-bottlenecked, especially with the friggin' 4090.

If I were considering an upgrade, this review wouldn't tell me much. Though seeing as it's mostly negative, it's a bit better than if it were talking about an overall uplift, which could be much more misleading. Nevertheless, the trend remains. If that's what will happen soon with the 9800X3D, then it's really curtains for TechSpot.
 
By the way, I didn’t say it's only TechSpot. It's just as stupid when Tom's or whoever else does it.
Oh wow, a whole chart of CPUs where the difference is 101 fps at the top and 95 fps at the bottom. What useful information that is…
 
I have explained it numerous times in these comments, there's no point doing it again. If you don't get it, you don't get it.

Although, I'm not even sure it needs explaining that people might want to use this processor for gaming, that at this price point they will probably want a bit more than 1080p, and that higher resolutions are not always GPU-bottlenecked, especially with the friggin' 4090.

If I were considering an upgrade, this review wouldn't tell me much. Though seeing as it's mostly negative, it's a bit better than if it were talking about an overall uplift, which could be much more misleading. Nevertheless, the trend remains. If that's what will happen soon with the 9800X3D, then it's really curtains for TechSpot.
If the game isn't GPU-bottlenecked at higher resolutions like you are claiming, then it won't be GPU-bottlenecked at lower resolutions either.

There is no information you can get from testing at higher resolutions. If the chip can get 100 fps at 720p, it will get them at 8K as well. It's the GPU you need to worry about.
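The logic here can be sketched as a toy model (all frame rates below are invented for illustration): delivered fps is roughly the minimum of what the CPU can feed and what the GPU can render, and only the GPU side scales with resolution.

```python
# Toy bottleneck model with made-up numbers (not benchmark data).
CPU_FPS = 140  # frames/sec the CPU can prepare; roughly resolution-independent

# Frames/sec a hypothetical GPU can render at each resolution.
GPU_FPS = {"1080p": 220, "1440p": 150, "4K": 75}

for res, gpu_fps in GPU_FPS.items():
    delivered = min(CPU_FPS, gpu_fps)  # the slower component sets the frame rate
    limiter = "CPU" if CPU_FPS < gpu_fps else "GPU"
    print(f"{res}: {delivered} fps ({limiter}-bound)")
```

In this sketch the 1080p number exposes the CPU's 140 fps ceiling, while at 4K every CPU would report the same GPU-capped 75 fps, which is the argument for low-resolution CPU testing.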
 
If the game isn't GPU-bottlenecked at higher resolutions like you are claiming, then it won't be GPU-bottlenecked at lower resolutions either.

There is no information you can get from testing at higher resolutions. If the chip can get 100 fps at 720p, it will get them at 8K as well. It's the GPU you need to worry about.
You and I don’t agree on very many things, but at least we agree on how to test things.
 
But it is a massive flop. That's the reality.
The 9000 series was not a "massive" flop. AMD improved other aspects of the CPU that didn't translate to much in gaming performance increases. They demolish Intel in AVX-512 workloads. Not the best choice for gamers, but not a massive flop. Just as Intel improved multi-core performance with the 285K: it isn't a great choice for gamers and more often loses to the 14900K. But I get it, fanboys are going to fight for Intel no matter the facts/data.
 
This is sad. How is it that leading both single- and multi-core Cinebench performance does not translate to real-world performance?

There must be some serious inter-core latency going on here, and that used to be Intel's strength?

Weird!
 
I wonder if Memory Integrity or virtualization-based security is at least partly responsible for the 24H2 performance hit.
 
With such a big improvement in E-core performance, they should have just put in 32 E-cores and given them HT; that would destroy everything :)
 
What a shame. Not ready, I guess. We won't be seeing any mercy with the prices of the 9800X3Ds.

AMD is not stupid enough to ask 500 dollars for 8 cores; it would affect review scores too.

Also, 3D is mostly for gaming. It's not a do-it-all chip, so AMD can't really ask 500-600 dollars for the 9800X3D, even though it will probably beat even the 9950X3D in gaming while costing less and using less power.

A single CCD pretty much always beats dual CCD in gaming.

Will they put 3D cache on both CCDs for the 9900X3D and 9950X3D, though? Most leakers say no.
 
The 9000 series was not a "massive" flop. AMD improved other aspects of the CPU that didn't translate to much in gaming performance increases. They demolish Intel in AVX-512 workloads. Not the best choice for gamers, but not a massive flop. Just as Intel improved multi-core performance with the 285K: it isn't a great choice for gamers and more often loses to the 14900K. But I get it, fanboys are going to fight for Intel no matter the facts/data.
For what they test here, it is a massive flop, and the sales reflect that. And you forgot the context: the launch prices were higher than they are now, while Zen 4 was really cheap.

The fact that it improved value-wise means that AMD understood the problem and adjusted their strategy... somewhat. They need to save Zen 5 with a good X3D launch.
 
I wonder if Memory Integrity or virtualization-based security is at least partly responsible for the 24H2 performance hit.
Possibly. There's also the efficiency focus on datacenter in their architecture; AMD is eating their lunch in the datacenter space.

I also assume the X3D cache is patented by AMD, so Intel can't go that route?
 
Possibly. There's also the efficiency focus on datacenter in their architecture; AMD is eating their lunch in the datacenter space.

I also assume the X3D cache is patented by AMD, so Intel can't go that route?
While AMD has patents for their implementation, Intel also has their own patents for stacking chips (Foveros). But more specifically, 3D V-Cache is TSMC's tech.

"When you reference V-Cache, you're talking about a very specific technology that TSMC does with some of its customers as well. Obviously, we're doing that differently in our composition, right? And that particular type of technology isn't something that's part of Meteor Lake, but in our roadmap, you're seeing the idea of 3D silicon where we'll have cache on one die, and we'll have CPU compute on the stacked die on top of it, and obviously using EMIB that Foveros we'll be able to compose different capabilities." - Intel CEO Pat Gelsinger, 2023
 