AMD: efficiency will soon be more important than core count

Emil

Posts: 152   +0
Staff

Today, Advanced Micro Devices and Intel are competing over which company's processors have the most cores, but AMD says the core war will end soon, replaced by an efficiency war. In the coming years, useful on-die specialized computing capabilities will matter more than core count. Under this approach, known as heterogeneous computing, processors will resemble systems-on-a-chip, with sections of each chip dedicated to specific tasks such as encryption, video rendering, or networking, instead of being largely composed of general-purpose processing cores.

"There will come an end to the core-count wars. I won't put an exact date on it, but I don't myself expect to see 128 cores on a full-sized server die by the end of this decade. It is not unrealistic from a technology road map, but from a deployment road map, the power constraints that people expect [servers] to live in" wouldn't be feasible for chips with that many cores, Donald Newell, AMD's chief technology officer for servers, told PC World.

Newell compares the core-count wars to the frequency wars, which ended recently. Improvements in CPUs were measured largely by clock speed, with each new generation of processors sporting "more and more GHz." Eventually chip companies realized that chips simply got too hot and increasing clock speed indefinitely just wasn't possible.

The solution was dual-core server and desktop chips, followed by quad-core, six-core, and eight-core options from both AMD and Intel. The core war will end someday too, but what will be claimed as the optimal core count?


 
I have 6 and, to be fair, I feel it's unnecessary. I only got the X6 1055T because I believe it is more future-proof than the X4 965.
 
So, once again, they can't. So they will do just what they can, and of course take our money.
 
For most users I've never seen the need for anything more than dual core, and even then a really fast single core is plenty in most cases. Even for gamers, there are very few games that can take advantage of 3 cores, let alone 6 or 8, anyway.
 
Eddo22 said:
As far as I'm concerned efficiency has always been more important.

This.

Besides, it is indeed a vague and redundant statement, since the two things are intrinsically correlated; more cores translate to a much more efficient CPU.

Now, the idea that the number of cores will eventually become irrelevant because of the introduction of the company's Fusion technology, that I see as plausible. But until then, even if a hexa-core's capacity is never really fully utilized, it is nevertheless a more efficient CPU compared to one with fewer cores. That's common sense.
 
For the desktop, performance is the key. But I think that desktop use will probably cap out at 6 cores, consisting of 3 CPU cores and 3 GPU cores, in the next 10 years. And that would be an enthusiast's part number.

The typical power-efficient laptop would consist of no more than 4 cores. And that would probably consist of 3 CPU and 1 GPU, or possibly 2 CPU and 2 GPU cores, in the same chip.

But I wanted to comment on their "systems-on-a-chip" design and outlook. While I think that this is promising, and probably the future, it starts to sound a lot like console systems. My one concern is that their hardware remain available to open standards (if there is even such a thing in this future). As a consumer and business technology supporter, I am leery of buying hardware for the sake of hardware. Even if I can capitalize on this design in many ways, the hardware design needs to first support all the applications that I'm wanting to run.
 
Guest said:
For most users I've never seen the need for anything more than dual core, and even then a really fast single core is plenty in most cases. Even for gamers, there are very few games that can take advantage of 3 cores, let alone 6 or 8, anyway.

I will not deny that this is currently true. However, with ray tracing coming onto the graphics scene quite soon, multicore processing will be a big hit. Real-time ray tracing requires a minimum of an eight-core processor. Given that eight-core processors are already at the top end of the market, this is not so far-fetched. I recommend you check it out.

On the other hand, multicore processing for everyday home usage is a null factor. Most people do not use programs that require anything above a dual core processor, if that.
 
9Nails said:
For the desktop, performance is the key. But I think that desktop use will probably cap out at 6 cores, consisting of 3 CPU cores and 3 GPU cores, in the next 10 years. And that would be an enthusiast's part number.

The typical power-efficient laptop would consist of no more than 4 cores. And that would probably consist of 3 CPU and 1 GPU, or possibly 2 CPU and 2 GPU cores, in the same chip.

But I wanted to comment on their "systems-on-a-chip" design and outlook. While I think that this is promising, and probably the future, it starts to sound a lot like console systems. My one concern is that their hardware remain available to open standards (if there is even such a thing in this future). As a consumer and business technology supporter, I am leery of buying hardware for the sake of hardware. Even if I can capitalize on this design in many ways, the hardware design needs to first support all the applications that I'm wanting to run.

See my above post about ray tracing.

I do admit that you have some valid points; however, your numbers are a bit off. If anything, the GPU (read: general processing unit) will cap out at 8&8 or 8&16, depending on your needs and cash flow.

This will require a complete rethink of how programs are written, nonetheless. Programmers need to take multithreading to a level that is just not seen today. I also would not be surprised to see the average enthusiast processor move beyond the 3.0GHz range into the 4.0GHz bracket, given the supposed increases in efficiency.
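Taking multithreading seriously mostly means structuring work so independent pieces can run on separate cores. A minimal sketch in Python (the function names here are illustrative, not from any real program; note that CPython threads only overlap for I/O-bound work, so truly CPU-bound code would use a process pool instead):

```python
from concurrent.futures import ThreadPoolExecutor

def sum_of_squares(chunk):
    # The per-worker unit of work; each chunk is independent,
    # so workers never need to coordinate or share state.
    return sum(n * n for n in chunk)

def parallel_sum_of_squares(numbers, workers=4):
    # Split the input into one chunk per worker, fan the chunks
    # out to the pool, then merge the partial results.
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_of_squares, chunks))
```

The design point is the decomposition, not the pool: a program written this way scales with core count, while one long serial loop cannot.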
 
Guest said:
For most users I've never seen the need for anything more than dual core, and even then a really fast single core is plenty in most cases. Even for gamers, there are very few games that can take advantage of 3 cores, let alone 6 or 8, anyway.

Even though almost every company just makes horrid console ports and should be destroyed for good, many games take advantage of far more than 4 cores. If I remember correctly, Medal of Honor can use all parts of an i7: a theoretical 8 cores.
 
The question is not whether we can fully utilize multiple cores. It is: will it run Crysis?
 
2 or 4 cores is all you will use in everyday programs and games. A handful of games use 4 cores, so clock speed, RAM, and HD speed are the numbers to look at too.
 
Did AMD just figure this out?

This is why the Core 2, and now the i7, is destroying your products. IPC should always be the first priority.

It seems they forgot this after the Athlon 64 days.
 
Guest said:
For most users I've never seen the need for anything more than dual core, and even then a really fast single core is plenty in most cases. Even for gamers, there are very few games that can take advantage of 3 cores, let alone 6 or 8, anyway.

You have no clue what you are talking about.

Any game released in the last 2 years requires a dual core.

Hmm, let's see: StarCraft II can use more than 2 cores, Battlefield: Bad Company 2, etc.

Now let's move away from games for a second.

I have a PSP, a laptop, and a phone, and let's imagine for a second that I want to put some videos I downloaded on my devices. Encoding video needs processing power: you can wait 2 hours for your dual-core rig to convert a file, while I will do it in 30 minutes on my quad, transfer it, and be out of the house and on the road while you are still sitting there waiting.

And this isn't an extremely difficult task; a 12-year-old can do it.

People who render or even use Photoshop for work will need a lot of cores.
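The encoding speedup described above comes from the fact that much of the work is per-frame and the frames can be compressed independently on separate cores. A rough sketch of the idea (`encode_frame` here is a hypothetical stand-in, not a real codec, and the same thread-pool caveat applies: a real CPU-bound encoder would use native threads or a process pool):

```python
from concurrent.futures import ThreadPoolExecutor

def encode_frame(frame):
    # Stand-in for per-frame compression work; a real encoder would
    # run transforms, motion search, and entropy coding here.
    return bytes(b // 2 for b in frame)

def encode_video(frames, workers=4):
    # Independent frames are compressed concurrently, which is why a
    # quad-core can finish the same conversion well ahead of a dual-core.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(encode_frame, frames))
```

With four workers and frame-level parallelism, the 2-hour vs. 30-minute comparison in the post is exactly the ideal 4x case.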
 
If they are talking up efficiency, then look out for Bulldozer and its Fusion derivatives. Also, looking at their Llano demo, running HyperPi, 1080p video, and an n-body physics simulation simultaneously, AMD seems to have finally caught up.
 
Did AMD just figure this out?
No.
But since AMD will soon (hopefully) be moving to a process that allows them to make a CPU that's smaller than a Big Mac, now is the time to get revved up about efficiency. Can't bang the performance-per-watt drum if you're churning out K7/K8/K10 on 45nm.
More info here
 
I find multitasking to be the main benefit of a multi-core setup. Being able to encode something or burn a Blu-ray disc while playing a resource-heavy game is what I want. I remember back in the day when I'd be afraid of opening a browser window lest it interrupt my 20-minute CD burning session.
 
This thread looks so much like the funny quotes from the '80s about computers.
There's no way we'll need more than 6 cores.
Games only need 2 cores.
Fusion will make cores irrelevant.

I wish I was anal-retentive enough to archive these threads to laugh at in 20 years.

Guest said:
Hmm, let's see: StarCraft II can use more than 2 cores, Battlefield: Bad Company 2, etc.

According to the StarCraft II review done on TechSpot (https://www.techspot.com/review/305-starcraft2-performance/page13.html), there's no benefit derived from more than 2 cores.
 
Is this news? Anyone who wasn't stupid enough to fall for the multi-core thing, back when we all knew software and OSes weren't made for it, damn well knew it would mean as little as increasing RAM access does.

Given the "my pants are bigger than yours" society, it's not going to mean a damn thing.
 
As hellokitty pointed out, this comment is taken a little out of context, and was primarily aimed at the server market. Servers generally are able to take much more advantage of multiple cores, which is why the "core wars" are rampant on that side of the business now. If the true context of the comment is considered, Newell's observations and predictions are pretty valid. Efficiency is always a big concern with servers, particularly in server farms (which are popping up in ever-increasing numbers and densities). At some point, the "throw more cores at it" approach would hit diminishing returns, where diversifying the processors to do more around the cores might pay bigger dividends in efficiency. Sort of a brute-force vs. finesse comparison.
 
Sounds great, and just imagine if they can make the Phenom II X6 very efficient; then you could stick them in laptops, making them... crap no more, hahaha. Wouldn't be too costly either =P
 