Intel 28-core fantasy vs. AMD 32-core reality

And you should be able to comprehend the basics.

I'll get you your link since even using Google challenges you. I have to remember the title. In the meantime I'm taking note how none of your comments to me contain any links to back up your claims after I challenged them since they seem so important to you now. Just saying.

Start with these poor Xeon yields you claim exist.

ECC support doesn't cost Intel much of anything. Not sure where you even got that.
What the actual point was, was they CHARGE more for it. Cause they can!!

That 8180 is built using 14nm and launched Q3 2017 so not sure where you're getting this nonsense about incredibly low yields.

GIMME THE LINK!!! smh...
Don't bother looking, because one doesn't exist.

14nm is a MATURE process. Google how many chips Intel built using it.

This is BASIC stuff, so I'm confused on how you can be so uneducated about what you're talking about, and even more confused about how your comment got so many likes.

FML....

I found the video....

You just commented...

"ECC support doesn't cost Intel much of anything. Not sure where you even got that."

Your prior comment...

"Xeons are more expensive primarily, because of ECC support, which is crucial for servers...."

You should probably remember what you said or at least make an effort to make sure you aren't completely flip flopping like you just did. This is pretty blatant pandering to be "correct".


"I'm taking note how none of your comments to me contain any links to back up your claims after I challenged them since they seem so important to you now. Just saying."

First you'd have to challenge one of my claims, just saying. Given how you already completely flipped on your xeon point, I'd have to question how good those notes you take are.


"What the actual point was, was they CHARGE more for it. Cause they can!!"

This isn't entirely true either. Sure the profit margins are higher but they are also more expensive to produce and support.


"That 8180 is built using 14nm and launched Q3 2017 so not sure where you're getting this nonsense about incredibly low yields."

My comment mentioned "in comparison to lower end products", which will always hold true as it correlates to die size. Unless you are going to tell me Intel gets more 8180s out of a single wafer than it does any i3 model, which would be funny.
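
For anyone who wants rough numbers on the dies-per-wafer point, here's a quick back-of-the-envelope sketch using the common gross-die approximation for a 300mm wafer; the die areas and the function itself are just illustrative assumptions, not official Intel figures:

```python
import math

# Gross candidate dies per wafer: wafer area / die area, minus an edge-loss
# correction. Ignores scribe lines, defects and partial edge dies entirely.
def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

print(gross_dies_per_wafer(700))  # ~700mm2 28-core XCC die  -> roughly 70-80 candidates
print(gross_dies_per_wafer(150))  # ~150mm2 quad-core die    -> several hundred candidates
```

Even before any yield loss, the small die gets several times more candidates out of the same wafer.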


FYI the video you linked is a review for Xeon-W processors and even in that video Linus specifically states they have no idea why the price is so high. Word for word

"What is a Xeon-W and how does Intel justify the hefty premium that you would pay over a desktop Core i9? We're not sure."

It does not include regular Xeons, nor does it explain why they are priced so high.

I don't write comments for the links, I
 
What are you rambling on about?

My E3-1245 had ECC support and it didn't cost anything more than a normal Skylake.
You do realise that you are talking about some of the lowest-end server CPUs Intel was offering 7 years ago, right?
That CPU is, for all intents and purposes, just a lower-clocked i7 2600 that works on the more expensive server boards.
 
That 8180 is built using 14nm and launched Q3 2017 so not sure where you're getting this nonsense about incredibly low yields.

GIMME THE LINK!!! smh...
Don't bother looking, because one doesn't exist.

14nm is a MATURE process. Google how many chips Intel built using it.

This is BASIC stuff, so I'm confused on how you can be so uneducated about what you're talking about, and even more confused about how your comment got so many likes.

So, with more "basic stuff": Intel's Xeon 8180 is based on the XCC die, which has 28 cores and is around 700mm² in size. Because the 8180 uses all 28 cores, EVERY core must work.

Now take, say, the i7-7700K. Die size around 150mm².

We don't need any further evidence to see that yields for a fully working 700mm² chip are miles lower than yields for a fully working 150mm² chip.

Now take AMD Epyc into account: 32 cores total, split across four dies of roughly 220mm² each. So AMD Epyc also has miles better yields than Intel's 28-core Xeon.
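
To put some illustrative numbers behind that, here's a minimal sketch using the textbook Poisson defect-density yield model; the defect density below is an assumed value purely for illustration, not anything published by Intel, AMD or their fabs:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies with zero random defects under a simple Poisson model."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

D0 = 0.2  # assumed defect density in defects/cm^2, for illustration only

for name, area_mm2 in [("~700mm2 XCC (8180)", 700),
                       ("~150mm2 quad-core", 150),
                       ("~220mm2 Epyc die", 220)]:
    print(f"{name}: {poisson_yield(area_mm2, D0):.0%} defect-free")
```

At that assumed defect density the 700mm² die comes out around 25% defect-free versus roughly 74% and 64% for the 150mm² and 220mm² dies, and the 8180 can't hide defects behind disabled cores, since all 28 have to work.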

Not surprising an Intel fanboy struggles with the basics :D
 
You do realise that you are talking about some of the lowest-end server CPUs Intel was offering 7 years ago, right?
That CPU is, for all intents and purposes, just a lower-clocked i7 2600 that works on the more expensive server boards.

It was Skylake-based: the E3-1245 v5. I should have been more specific, but Intel's naming scheme for those is kind of a mess.
 
Of course, 5 GHz would be exciting even with far fewer cores than 28. And Intel does have that eight-core anniversary part. But I saw that AMD had made a 5 GHz eight-core chip too, the FX-9590. Unfortunately, though, I later discovered that, no, it was 4.7 GHz, and it wasn't really all that good either. Which underscores what a major advance Ryzen has been for AMD.
 
Of course, 5 GHz would be exciting even with far fewer cores than 28. And Intel does have that eight-core anniversary part. But I saw that AMD had made a 5 GHz eight-core chip too, the FX-9590. Unfortunately, though, I later discovered that, no, it was 4.7 GHz, and it wasn't really all that good either. Which underscores what a major advance Ryzen has been for AMD.

The FX-9590 was 4.7 GHz base, 5 GHz turbo, so it was a 5 GHz part just like Intel's i7-8086K or whatever it is.
 
So, with more "basic stuff": Intel's Xeon 8180 is based on the XCC die, which has 28 cores and is around 700mm² in size. Because the 8180 uses all 28 cores, EVERY core must work.

Now take, say, the i7-7700K. Die size around 150mm².

We don't need any further evidence to see that yields for a fully working 700mm² chip are miles lower than yields for a fully working 150mm² chip.

Now take AMD Epyc into account: 32 cores total, split across four dies of roughly 220mm² each. So AMD Epyc also has miles better yields than Intel's 28-core Xeon.

Not surprising an Intel fanboy struggles with the basics :D

I never said Xeon and Core yields were similar. It was your buddy who tried to say Intel was getting 1 chip per wafer, and you're defending him, so that says a lot about both of you.
 
You just commented...

"ECC support doesn't cost Intel much of anything. Not sure where you even got that."

Your prior comment...

"Xeons are more expensive primarily, because of ECC support, which is crucial for servers...."

You should probably remember what you said or at least make an effort to make sure you aren't completely flip flopping like you just did. This is pretty blatant pandering to be "correct".


"I'm taking note how none of your comments to me contain any links to back up your claims after I challenged them since they seem so important to you now. Just saying."

First you'd have to challenge one of my claims, just saying. Given how you already completely flipped on your xeon point, I'd have to question how good those notes you take are.


"What the actual point was, was they CHARGE more for it. Cause they can!!"

This isn't entirely true either. Sure the profit margins are higher but they are also more expensive to produce and support.


"That 8180 is built using 14nm and launched Q3 2017 so not sure where you're getting this nonsense about incredibly low yields."

My comment mentioned "in comparison to lower end products", which will always hold true as it correlates to die size. Unless you are going to tell me Intel gets more 8180s out of a single wafer than it does any i3 model, which would be funny.


FYI the video you linked is a review for Xeon-W processors and even in that video Linus specifically states they have no idea why the price is so high. Word for word

"What is a Xeon-W and how does Intel justify the hefty premium that you would pay over a desktop Core i9? We're not sure."

It does not include regular Xeons, nor does it explain why they are priced so high.

I don't write comments for the links, I

I'll say it again, YOU were the one bringing up the cost TO Intel to implement ECC when I was talking about what they charge US for ECC.

The video DEFINITELY explains ECC is THE primary diff between WS and Desktop. If you can't see it then try opening your eyes and ears.

Ask any author on this site.....
 
I never said Xeon and Core yields were similar....

That 8180 is built using 14nm and launched Q3 2017 so not sure where you're getting this nonsense about incredibly low yields.

It needs no more than common sense to tell that a fully working 700mm² die will have very low yields, even on a mature 14nm process.
 
It needs no more than common sense to tell that a fully working 700mm² die will have very low yields, even on a mature 14nm process.

Did I stutter? Or are you just set on putting thoughts in my head to win an argument?

Either way, your reading comprehension is poor at best.
 
Intel has nothing to compete with the AMD APUs, and I love that AMD is putting up enough of a fight to make Intel squirm a bit... but this delusion that AMD is any more pro-consumer is dumbfounding. Outside of the APUs, benchmarks would still lead me to believe the best value at most price points still falls to an Intel/Nvidia combo.

I hope my post didn't suggest that AMD was any more pro-consumer than Intel...? Unless by "pro-consumer" you mean NOT engaging in shady, underhanded, deceptive practices to make a sale. But they're probably both a bit guilty of stretching the truth about their products.

I have no love or allegiance to either company. I'm just happy that competition between AMD and Intel is becoming more of a reality now. I pretty much agree with your entire post. However, I paid almost double for my Haswell quad-core CPU just a year (or two) earlier than what I paid for my 2200G. Comparing them side by side right now makes me want to hang myself when looking at the iGPU capabilities of each CPU. I know that's the general price pattern (more for less) of buying technology as time goes on. But still, I think AMD made it happen a whole lot faster than it otherwise would have, by releasing a great line of CPUs.

EDIT: It is interesting to see how Intel reacts to the pressure AMD is bringing to the game. "At most price points" may still be true, mostly from a hardcore gamer's perspective, I imagine. But for how long?
 
I'll say it again, YOU were the one bringing up the cost TO Intel to implement ECC when I was talking about what they charge US for ECC.

The video DEFINITELY explains ECC is THE primary diff between WS and Desktop. If you can't see it then try opening your eyes and ears.

Ask any author on this site.....

WS != Xeon
 
X vs Xeon
6:30 mark of Linus' video....

6:30 pretty clearly illustrates my point: Linus clearly states that it is unclear why the CPUs and chipset even exist at their current pricing. As per Linus, ECC support is one of the few features this platform even offers over consumer products, not that it is a major factor in the cost. The video is littered with moments where Linus explicitly states that he does not know why this CPU and chipset are so expensive, completely contrary to your alternative facts of the situation.


"I'll say it again, YOU were the one bringing up the cost TO Intel to implement ECC when I was talking about what they charge US for ECC.

The video DEFINITELY explains ECC is THE primary diff between WS and Desktop. If you can't see it then try opening your eyes and ears."

For everyone's convenience, the following is the first post to bring up the cost of ECC on Xeons, which is your post...

"Xeons are more expensive primarily, because of ECC support, which is crucial for servers..."

You continue to try to put words into other people's mouths, completely switch opinions, etc., all to try to back out of admitting you were wrong.

On top of that, you tried to use WS Xeons, which have been panned as overpriced, as a reason that all Xeons are expensive due to ECC support.

Just so everyone can follow your "oops, I flipped my opinion again" meter, you once again flipped back to saying that Xeons are expensive due to ECC support after you said just the opposite. This is the second time you've completely flipped sides in this discussion, from "expensive ECC" to "non-expensive ECC" and back to "expensive ECC".

For the record, here is the last time you completely flipped sides

"ECC support doesn't cost Intel much of anything. Not sure where you even got that."
 
Yep, Linus mentioned that the WS prices could only be justified when they were multi-CPU-socket capable, but he is confused as to why there is such a high price difference now.

Hell, vPro would be a better argument for the price increase.

Back on topic though, it is clear Cascade Lake is a long way off, and AMD will be sitting very comfortably in the HEDT market.
 
6:30 pretty clearly illustrates my point: Linus clearly states that it is unclear why the CPUs and chipset even exist at their current pricing. As per Linus, ECC support is one of the few features this platform even offers over consumer products, not that it is a major factor in the cost. The video is littered with moments where Linus explicitly states that he does not know why this CPU and chipset are so expensive, completely contrary to your alternative facts of the situation.


"I'll say it again, YOU were the one bringing up the cost TO Intel to implement ECC when I was talking about what they charge US for ECC.

The video DEFINITELY explains ECC is THE primary diff between WS and Desktop. If you can't see it then try opening your eyes and ears."

For everyone's convenience, the following is the first post to bring up the cost of ECC on Xeons, which is your post...

"Xeons are more expensive primarily, because of ECC support, which is crucial for servers..."

You continue to try to put words into other people's mouths, completely switch opinions, etc., all to try to back out of admitting you were wrong.

On top of that, you tried to use WS Xeons, which have been panned as overpriced, as a reason that all Xeons are expensive due to ECC support.

Just so everyone can follow your "oops, I flipped my opinion again" meter, you once again flipped back to saying that Xeons are expensive due to ECC support after you said just the opposite. This is the second time you've completely flipped sides in this discussion, from "expensive ECC" to "non-expensive ECC" and back to "expensive ECC".

For the record, here is the last time you completely flipped sides

"ECC support doesn't cost Intel much of anything. Not sure where you even got that."

Watch it again and you'll hear what he clearly says about the differences between the two chips compared. There is no other way to say it.
 
Yep, Linus mentioned that the WS prices could only be justified when they were multi-CPU-socket capable, but he is confused as to why there is such a high price difference now.

Hell, vPro would be a better argument for the price increase.

Back on topic though, it is clear Cascade Lake is a long way off, and AMD will be sitting very comfortably in the HEDT market.

He clearly says ECC and vPro are the only differences he could make out between the two. Oh, and an additional 4 PCIe lanes.
 
The video DEFINITELY explains ECC is THE primary diff between WS and Desktop. If you can't see it then try opening your eyes and ears."

Wow. I've said this from the BEGINNING. See what talking in circles does to you?

"On top of that you tried to use WS Xeons, which have been panned as overpriced, as a reason that all xeons are expensive due to ECC support."

ALL Xeons could fall under that category. You'd know that if you did some research as to exactly why. I didn't "try" anything. That's what happens when you jump into a conversation without getting clarification. You just assumed I meant all Xeons. And your buddy assumed I was comparing Core to Xeon in his wafer talk. Get clarification, and ask the proper questions!

Rookie mistake, which I will forgive you for this time. Always get clarification. When you don't, you end up writing novels and talking yourself in circles.

You're no techie. You don't know how to present facts or ask the right questions.
 
Watch it again and you'll hear what he clearly says about the differences between the two chips compared. There is no other way to say it.

Correct, he does note the differences in the chips, but that does not mean he attributes the cost to those differences. He even states multiple times throughout the video that he is completely confused as to what accounts for the product's high price; after all, you can get Xeon processors with ECC and more performance at a lower price. Even business-class motherboards with compatible processors that come with the secure processor (if that's a feature you need) can be had elsewhere for much less than the Intel WS lineup.
 
Correct, he does note the differences in the chips, but that does not mean he attributes the cost to those differences. He even states multiple times throughout the video that he is completely confused as to what accounts for the product's high price; after all, you can get Xeon processors with ECC and more performance at a lower price. Even business-class motherboards with compatible processors that come with the secure processor (if that's a feature you need) can be had elsewhere for much less than the Intel WS lineup.

I feel sorry for you. I really do.
You just called Linus an incompetent reviewer.
You do know it was a review, right?
You do know the head-to-head between the two chips was chosen for a reason?
You do know the 6:30 mark was the conclusion, right?
You do know any respectable reviewer comparing two CPUs would NEVER say they didn't know the differences in the conclusion when the whole point was to find the performance differences and the justification for the price difference. That's employment suicide, and not how you keep your job.

ESPECIALLY with the industry connections and experience he has!
 
I feel sorry for you. I really do.
You just called Linus an incompetent reviewer.
You do know it was a review, right?
You do know the head-to-head between the two chips was chosen for a reason?
You do know the 6:30 mark was the conclusion, right?
You do know any respectable reviewer comparing two CPUs would NEVER say they didn't know the differences in the conclusion when the whole point was to find the performance differences and the justification for the price difference. That's employment suicide, and not how you keep your job.

ESPECIALLY with the industry connections and experience he has!

Incorrect, you're trying to put words into others' mouths again.

"You do know any respectable reviewer comparing two CPUs would NEVER say they didn't know the differences in the conclusion when the whole point was to find the performance differences and the justification for the price difference"

And yet Linus specifically states that neither the performance nor the features justify the cost, and that he does not know why Intel charges as much as it does for WS products. That's not a fault of Linus; it's a fault of Intel for overcharging for the chips.
 
Some are working hard to distract from the topic at hand.

The fact is, the best Intel HEDT will get this year is an overpriced, 22-core SKL-X part, as CCL-X won't be here until next year.

Meanwhile, AMD is launching 24- and 32-core TR+ parts next month.

The butthurt is strong in this thread.
 
I advise any tech enthusiasts to check out AdoredTV on YouTube and his video on Intel's scams throughout the last few decades. This recent news is just another antic of theirs... truly deplorable.

Exactly, they are just freaking out that AMD is now putting up some real competition, and winning... they just got a little too cosy over the last decade or so, without any challenge coming from other CPU manufacturers... :p

And doing so just shows how pitiful their reaction is...
 