AMD's Zen processors to feature up to 32 cores, 8-channel DDR4

Take 2: they're small...and slow...and hot (temperature-wise, not sales). :)
With 32 cores I can't imagine more than 1 GHz or so on an air cooler - massively parallel - and a nitrogen cooler might get it to 3 GHz. Of course, a couple of 16-core Xeons would be fine too.
 
Intel has been outdoing AMD every year without fail since AMD started competing, if competing is the appropriate word when there's no actual competition.
The same is true of nVidia on their other front.
That isn't entirely true and smacks of a very blinkered memory. AMD's ClawHammer/SledgeHammer Athlon 64 was generally the equal or better (depending upon SKUs being compared) of Intel's Pentium 4, and the Athlon 64 X2 traded blows with Pentium D - often besting the Intel product while generally being much cheaper.

As for the Nvidia comment...well, firstly it doesn't really belong in a thread about processors, and secondly, AMD/ATI's graphics hardware has generally been the equal of Nvidia's - and in many cases, its superior.

If you're talking about management rather than product, that is an entirely different conversation.
 
That isn't entirely true and smacks of a very blinkered memory. AMD's ClawHammer/SledgeHammer Athlon 64 was generally the equal or better (depending upon SKUs being compared) of Intel's Pentium 4, and the Athlon 64 X2 traded blows with Pentium D - often besting the Intel product while generally being much cheaper. [...]
This is true. Unfortunately many AMD fanbois are still reveling in the glory of that era.

AMD had its own hype when numbering those old dual cores: it gave its CPUs model numbers meant to match what it felt equaled (or bettered) the much higher-clocked single-core P4. Plus, Intel's first foray into dual-core territory resulted in some very primitive space heaters (er, also).

Now, this is where history is a metaphor of itself. Hollywood (and historical rumor) alleges that after the sneak attack on Pearl Harbor, Japanese Admiral Yamamoto said, "I fear we have awakened a sleeping giant..." (and some other stuff).

So, this is pretty much an analog of what has happened to AMD after its brief run of CPU glory. They "awakened a sleeping giant", in Intel.

That being said, and for God knows what reason, I do read the gaming benchmarks in reviews. AMD acquits itself pretty admirably, but the highest overall scores in everything do rest with Intel. But AMD certainly places well before 31st.

OTOH, Intel's i3 series will do as much as or more than many AMD quads, on about half the power. I award Intel a big chunk of points on the subtlety factor of this alone.

For example, the i3 sitting next to me is idling, but using its IGP, and is only registering 28/30 C (82/86 F) for each core. The air cooler is an Arctic 7 Pro (or something). It's not the stocker, but it was less than 20 bucks. The thermometer on the wall behind me is claiming 69 F. Hey, even at idle, I'll take 13 degrees F over ambient any day of the week. It's an Ivy Bridge i3-3225 and TDP is claimed as 54 watts. (y)
 
Last edited:
I liked my FX-6300, but it stood no chance against a Core i5, and for the better part I never thought of switching lanes; the difference is also clear in the pricing. As someone said, after the Phenom II they stopped trying to win the fight.

There is nothing wrong with people enjoying one or the other, but blinding themselves because "it works for them" - which again isn't bad - is not a valid point of comparison when you are comparing raw power.
 
That isn't entirely true and smacks of a very blinkered memory. AMD's ClawHammer/SledgeHammer Athlon 64 was generally the equal or better (depending upon SKUs being compared) of Intel's Pentium 4, and the Athlon 64 X2 traded blows with Pentium D - often besting the Intel product while generally being much cheaper.

As for the Nvidia comment...well, firstly it doesn't really belong in a thread about processors, and secondly, AMD/ATI's graphics hardware has generally been the equal of Nvidia's - and in many cases, its superior.

If you're talking about management rather than product, that is an entirely different conversation.
Intel has always made more money than AMD; that's how a company outdoes another. That's what they compete for, and that's why they are in business in the first place.

As for the Nvidia comment, I guess you are unaware that Nvidia produces GPUs for use in compute applications such as deep learning and AI, as well as eco tracking.
(Kespry, a commercial drone system company, demonstrated a prototype drone that uses NVIDIA artificial intelligence technology to recognize objects. The Kespry prototype uses the newly introduced NVIDIA Jetson TX1 module for deep learning, which offers complex algorithms to make autonomous devices…)
 
You will be digging real deep to find something related to GPUs that DBZ is unaware of.
As for the Nvidia comment, I guess you are unaware that Nvidia produces GPUs for use in compute applications such as deep learning and AI, as well as eco tracking.
@Technician. No worries regarding my understanding of DNNs (or cuDNN if you want to be pedantic) with regard to Nvidia's GPUs. I've been discussing the technology on this site for almost a year before you penned your first post here. You could also add any number of HPC tasks using bespoke UNIX code (weather/energy/economic simulations, medical imaging, even rendering), and of course the Quadro workstation ecosystem that also leverages CUDA (CompleX, OptiX, SceniX, and of course the cross-processor/-product-segment IRAY renderer).
The point is, as I've pointed out many times over the years, Nvidia's pre-eminence is not based on hardware but on realizing very early (around 2003), thanks to SGI's example*, that leveraging a software ecosystem engages hardware on a deeper level.
Then why question the inclusion of another processor company in a discussion about processors? Oh well.
I'd ask you the exact same thing, since all I did was add a counterpoint to your original post concerning Nvidia.
Intel has been outdoing AMD every year without fail since AMD started competing, if competing is the appropriate word when there's no actual competition.
The same is true of nVidia on their other front.
If you think Nvidia has no place in the conversation, why even mention them in the first place? Sounds like you first attempted to troll this thread and are now attempting to troll me. Not a particularly auspicious start as a community member, IMO.



* Obviously helped by Nvidia and SGI having a relationship that dates back to the first Quadros, and by Nvidia acquiring SGI's professional graphics business back in 2002.
 
Last edited:
I don't see what the date of your first post on a particular web site denotes in your mind, but your impeccable pedigree is on a web site; that's all I need to know.
;)
 
I don't see what the date of your first post on a particular web site denotes in your mind, but your impeccable pedigree is on a web site; that's all I need to know.
;)
It is pretty basic really.
1. I don't really have a lot of time for trolls. Calling out a forum member when you are the one initiating the exchange is just sad...
Then why question the inclusion of another processor company in a discussion about processors?
Oh well.
As for the Nvidia comment...well, firstly it doesn't really belong in a thread about processors
The same is true of nVidia on their other front.

The second point I was making is that if you are going to make a freshman attempt at schooling someone, why not spend a few seconds ascertaining whether it's all going to blow up in your face by checking their previous postings and knowledge base? Your little attempt at condescension was obviously a quick-and-dirty attempt to deflect from your earlier trolling, but even a rudimentary knowledge of GPGPU would reveal that Nvidia's overwhelmingly strong advantage in the industry is predicated upon CUDA and the foresight to integrate it into a framework of professional software tools suited to the GPU architecture (compare this to the immature state of OpenCL API options).

TL;DR: You shouldn't have brought up the subject of Nvidia in this thread as it wasn't warranted, but once you did, you shouldn't have predicated your argument on hardware, since that isn't anywhere close to being the differentiator between Nvidia and AMD in GPGPU.
[OT²]Concerning deep learning and your blathering on about the Jetson TX1 - the only real advantage comes from the fact that it implements half precision (FP16), which is ideal for most neural-net applications (even quarter precision - FP8 - is enough in some instances), whereas GPUs in general - for many generations, and from all vendors - have concentrated on single (FP32) and double precision (FP64). Note that both AMD's and Nvidia's next architectures will have FP16 and mixed-precision support.[/OT²]
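For anyone curious why FP16 suits neural nets: half precision keeps only a 10-bit mantissa and a 5-bit exponent, so it rounds aggressively but still covers the small-magnitude range that weights and activations live in. A quick illustrative sketch (Python's `struct` module supports the IEEE 754 half-precision `'e'` format; the round-trips below are just demonstrations, not benchmarks):

```python
import struct

def round_fp16(x):
    # Pack to IEEE 754 half precision and unpack back,
    # i.e. round x to the nearest representable FP16 value.
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(round_fp16(0.1))      # 0.0999755859375 - small values lose only a little
print(round_fp16(2049.0))   # 2048.0 - above 2048 the gap between values exceeds 1
print(round_fp16(65504.0))  # 65504.0 - the largest finite half-precision value
```

That coarse spacing at large magnitudes is harmless for typical weight values, which is why FP16 (and in some cases even FP8) suffices for many deep-learning workloads.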
 
I don't change my opinion based on the audience, as you suggest I should. I would suggest you ignore me if my comments upset you to such a point as this.
Relax, life is short; enjoy it and don't worry about me - I do fine on my own. I live for myself, my family, and God, not for you, and you shouldn't care what I think.
 
While I have a lot of respect for IBM and their management, the fact that INTEL still exists was a bone-headed mistake by IBM. People bring up history - have you ever tried to program an 8086-based system? Segmented memory? BRILLIANT!!
The INTEL programming model was straight STUPID. Know what AMD64 is? Why were there references to AMD64 even on x86-based systems? AMD was first; Intel had to catch up (cue Homer Simpson)...
The Motorola 680x0 kicked the tush of EVERY NEW INTEL 80286, 386, 486...
I have a $79.00 smartphone (Moto E gen 2) that runs Android 5.1 and connects 4G/LTE.
While INTEL does have the top CPUs on PassMark, at what cost?
They are like EXXON or BP.
Someone bragged about an i3-3225. At $308.00 and a 4347 score, while an FX-6300 is $99.00 with a 6345 score? You go ahead and start recalculating that spreadsheet; I'll give you the head start you'll need... I know I would be using 40 more watts, but seriously.
I have a Core 2 Duo (E8500) CPU that has a higher PassMark score than an Intel Core i7-640LM. It is 7 years old and still considered a high mid-range CPU by PassMark, and you can get one for $20.00 on eBay.
INTEL is greedy, plain as that...
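For anyone who never had the pleasure: the segmented-memory gripe refers to 8086 real mode, where a 20-bit physical address is formed as segment × 16 + offset, so thousands of different segment:offset pairs alias the same byte. A minimal sketch of the arithmetic (Python used purely for illustration; 0xB8000 is the classic text-mode video memory address):

```python
def real_mode_phys(segment, offset):
    # 8086 real mode: physical = segment * 16 + offset,
    # truncated to the 20-bit address bus (the infamous A20 wraparound).
    return ((segment << 4) + offset) & 0xFFFFF

print(hex(real_mode_phys(0xB800, 0x0000)))  # 0xb8000
print(hex(real_mode_phys(0xB000, 0x8000)))  # 0xb8000 - different pair, same byte
print(hex(real_mode_phys(0xFFFF, 0x0010)))  # 0x0 - wraps past the 1 MB boundary
```

The aliasing and the 64 KB segment-size limit are exactly what made flat-memory architectures like the 680x0 (and later AMD64's flat long mode) so much more pleasant to program.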
 
Totally agree Anteater.

Till you made that comment, I had decided to ignore this "gaggle of Intel lovers". I bet most of them (sorry about the swear word) love iPhones by Apple as well :eek:
 
While I have a lot of respect for IBM and their management, the fact that INTEL still exists was a bone-headed mistake by IBM.
Sorry, but that doesn't make any sense at all. Intel was founded on IP that basically tore the heart out of IBM's magnetic core memory business.
Someone bragged about an i3-3225. At $308.00 and a 4347 score, while an FX-6300 is $99.00 with a 6345 score? You go ahead and start recalculating that spreadsheet,
Maybe you should start recalculating first. The i3-3225 launch price was $134. Comparing prices for an old EOL'd part is a flawed exercise from the outset - after all, nobody today spends $700 to buy a Phenom II X4 and HD 5870, do they?
 
Great job comparing a 65 W desktop CPU to a 25 W (LM - low-power mobile) notebook CPU. Do you want to compare them to Atom while you are at it? Everybody knows a notebook CPU has less performance than its desktop variant. Oh my, would you look at that - even the newer Atom blows the doors off the Core 2 Duo.
  • Intel Atom C2750 @ 2.40GHz - Passmark score - 3874
Let's not mention the next 4 generations, which are a two-fold performance boost over the platform chosen for comparison, on top of it being an LM variant.
 
@cliffordcooley
I think our new member Anteater is just a drive-by troll, TBH. A one-post wonder. A like-for-like comparison wouldn't fit his narrative (i.e., pick a current $120 i3-6100 with a 27% higher CPU and 123% higher GPU PassMark score instead of a three-generation-old, EOL'd Ivy Bridge, or the cherry-picked E8500/i7-640LM comparison of parts that aren't even remotely in the same market segment).
New wave of uninformed trolling. Maybe refreshing their facebook page is losing its lustre.
 
While I have a lot of respect for IBM and their management, the fact that INTEL still exists was a bone-headed mistake by IBM. People bring up history - have you ever tried to program an 8086-based system? Segmented memory? BRILLIANT!!
The INTEL programming model was straight STUPID. Know what AMD64 is? Why were there references to AMD64 even on x86-based systems? AMD was first; Intel had to catch up (cue Homer Simpson)...
The Motorola 680x0 kicked the tush of EVERY NEW INTEL 80286, 386, 486...
I have a $79.00 smartphone (Moto E gen 2) that runs Android 5.1 and connects 4G/LTE.
While INTEL does have the top CPUs on PassMark, at what cost?
They are like EXXON or BP.
Someone bragged about an i3-3225. At $308.00 and a 4347 score, while an FX-6300 is $99.00 with a 6345 score? You go ahead and start recalculating that spreadsheet; I'll give you the head start you'll need... I know I would be using 40 more watts, but seriously.
I have a Core 2 Duo (E8500) CPU that has a higher PassMark score than an Intel Core i7-640LM. It is 7 years old and still considered a high mid-range CPU by PassMark, and you can get one for $20.00 on eBay.
INTEL is greedy, plain as that...
If you are a stockholder, that's a good thing. :)
 
If your concern is strictly gaming, why limit yourself to an 'also-ran'? If your concern is BUDGET gaming, AMD enters the picture. Of course, there are also less expensive Intel/Nvidia packages than the best one.


But does it djent?

Why are you mad, bro? :eek:
Mad? Not angry at all. I have provided evidence - or DATA, as DBZ called for - often and frequently, for a long time and in many forms: game frame rates, CPU and GPU load graphs, placings in the top 10 against Intel OC CPUs, and scaling results.
If they don't like what they see, then so be it.
My 'DATA' results have been posted time and time again and glossed over time and time again, so whatever they want to think.
I built it, benched it, presented it. If those who wish to ignore it want to ignore it... then ignore it.
I enjoy building these and demonstrating that "it can't be done" or "it doesn't work" is spin from those who need to defend their decision; they can keep spouting all they want.
 