Intel 28-core fantasy vs. AMD 32-core reality

Isn't this all just posing, on both sides of the aisle, until there are some actual solutions to Meltdown/Spectre? I don't care how fast their new CPU is if it requires firmware that turns off all its features in order to run securely when connected to the internet.
Good point! This is part of the reason that I am waiting to upgrade.
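For anyone in the same boat, you can at least see what mitigations your current machine reports. On Linux (kernel 4.15 or newer) the kernel exposes Meltdown/Spectre status under sysfs; here's a minimal Python sketch (Linux-only assumption, paths per the kernel docs):

from pathlib import Path

# Linux 4.15+ exposes per-vulnerability mitigation status here.
VULNS = Path("/sys/devices/system/cpu/vulnerabilities")

for entry in sorted(VULNS.iterdir()):
    # Each file holds a line like "Mitigation: ..." or "Vulnerable".
    print(f"{entry.name}: {entry.read_text().strip()}")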
 
Hmmmm... your comparison is forgetting one thing that every HEDT enthusiast, except apparently you, would take into consideration: price! While you may be willing to pay $10,000-plus for your 4-socket Threadripper destroyer, I'm pretty sure 99% of people would go for AMD's $1,799 option, as it will work out of the box with most software used in the enthusiast market. Your 4-socket solution will require expensive optimisations to take advantage of all those cores and will still suffer from latency issues, as those 4 sockets won't be linked as efficiently as Threadripper's on-package dies.

Go ahead and pull the trigger on your "superior" approach and be sure to post your benchmark scores to show everyone how much of a genius you are.
Multi-CPU configurations are supported by most PC software, no extra work required. This is true for both Intel and AMD; in fact, if it weren't, Threadripper simply wouldn't work, because that's all it is: a number of CPUs bundled together. If you're so concerned with price, you can get a multi-CPU Xeon system for half the price of the equivalent Threadripper.
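To put the "no extra work" point in concrete terms, here's a minimal Python sketch (the work function and numbers are illustrative, not from any particular system): an ordinary process pool spreads across every logical CPU the OS exposes, no matter how many sockets they sit on.

import os
from multiprocessing import Pool

def burn(n):
    # Trivial CPU-bound loop so each worker keeps one core busy.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    cores = os.cpu_count()  # logical CPUs across ALL sockets
    print(f"Scheduling work across {cores} logical CPUs")
    with Pool(processes=cores) as pool:
        results = pool.map(burn, [10_000_000] * cores)
    print("Workers finished:", len(results))

Whether it runs at full efficiency across sockets is a separate question (that's the NUMA latency point raised above), but nothing special is needed just to use the cores.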
 
Nice to see Techspot keeping up their AMD fanboi status. How about comparing Intel's 28-core single-die offerings with AMD's single-die offerings? Or perhaps compare Threadripper with a multi-Xeon system; you can get motherboards that will take 4 Xeons, so how does a 112-core quad-Xeon setup compare with Threadripper? Just because you bolt two CPUs together and sell them as a package doesn't make it any different to plugging multiple CPUs into one motherboard.

Also, any allegations of shenanigans can be totally dismissed if you actually watch the presentation; the box was right there on stage, with the massive cooling very apparent. Perhaps Techspot thought people didn't watch it.
Maybe they should just wait and compare the upcoming dual-socket 64-core Epyc processors with the quad-socket 112-core Xeon system.

But this article was about exposing Intel's BS and not a processor review.
 
Hmmmm... your comparison is forgetting one thing that every HEDT enthusiast, except apparently you, would take into consideration: price! While you may be willing to pay $10,000-plus for your 4-socket Threadripper destroyer, I'm pretty sure 99% of people would go for AMD's $1,799 option, as it will work out of the box with most software used in the enthusiast market. Your 4-socket solution will require expensive optimisations to take advantage of all those cores and will still suffer from latency issues, as those 4 sockets won't be linked as efficiently as Threadripper's on-package dies.

Go ahead and pull the trigger on your "superior" approach and be sure to post your benchmark scores to show everyone how much of a genius you are.
Multi-CPU configurations are supported by most PC software, no extra work required. This is true for both Intel and AMD; in fact, if it weren't, Threadripper simply wouldn't work, because that's all it is: a number of CPUs bundled together. If you're so concerned with price, you can get a multi-CPU Xeon system for half the price of the equivalent Threadripper.
Yeah, in the 10-year-old used market on eBay.

I think most of us here will stop feeding you.
 
Intel's feat was a shocker for all of about a day. It lasted long enough to steal AMD's thunder and maybe panic a few of their executives, but it turned out to be nothing more than an exercise in extreme overclocking. Threadripper CPUs had already gone over 5GHz as far back as August of last year, and some Intel chips have gone over 7GHz, but all required massive voltages, extreme liquid-nitrogen cooling and probably delidding. Maybe next time AMD will keep quiet until the day of their announcement.
 
Sigh...
Xeons are more expensive primarily because of ECC support, which is crucial for servers...

I'd prefer a link. You said there was a video, you should be able to provide a link.

Regardless of what the actual answer is, it is a fact in the foundry business that yields go down as die size goes up. Therefore, it is a fact that Intel's Xeon 8180 has lower yields than their lower-end chips.

I find it very hard to believe that ECC support costs Intel more money than bad chips on a wafer.
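For anyone who wants to see why die size dominates here, a back-of-the-envelope sketch using the textbook Poisson yield model helps (the defect density and die areas below are illustrative assumptions, not actual foundry numbers):

import math

def poisson_yield(defect_density, die_area_cm2):
    # Poisson model: fraction of dies with zero defects is exp(-D * A).
    return math.exp(-defect_density * die_area_cm2)

D = 0.2  # assumed defects per cm^2 on a mature process
small_die = 2.0  # ~200 mm^2, mainstream-class die (assumed)
big_die = 7.0    # ~700 mm^2, roughly XCC-class server die (assumed)

print(f"small die yield: {poisson_yield(D, small_die):.0%}")  # ~67%
print(f"big die yield:   {poisson_yield(D, big_die):.0%}")    # ~25%

Real foundry models are more involved, and binning recovers partially defective dies as lower-core-count parts, but the direction never changes: bigger dies mean fewer good chips per wafer.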
 
I'd prefer a link. You said there was a video, you should be able to provide a link.

Regardless of what the actual answer is, it is a fact in the foundry business that yields go down as die size goes up. Therefore, it is a fact that Intel's Xeon 8180 has lower yields than their lower-end chips.

I find it very hard to believe that ECC support costs Intel more money than bad chips on a wafer.

Exactly. All AMD Ryzen chips (except Raven Ridge?) support ECC, so much for the "expensive ECC" BS...
 
@Tim Schiesser, it would probably be a good idea to give credit to the sites where the pictures in this article originated. The one with the water chiller was taken by Paul Alcorn of Tom's Hardware. It makes the people at those sites just a little happier when you do.
 
Exactly. All AMD Ryzen chips (except Raven Ridge?) support ECC, so much for the "expensive ECC" BS...

Yeah, you just have to make sure you get the correct motherboard. Unfortunately not all motherboards provide support for it (at least 1st-gen Ryzen boards; can't speak for 2nd gen). Technically speaking, though, AMD does not validate ECC operation on its consumer processors, which means there isn't a 100% chance it will work 100% of the time. Not that it's a concern for consumers or small businesses, and it's easy to find out if ECC is working (see the sketch below). You really only need that 100% guarantee for the enterprise, which Epyc covers.
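As for checking it: on Linux, the kernel's EDAC subsystem reports whether ECC is active. A minimal Python sketch (assuming a Linux box with the EDAC driver for your memory controller loaded; sysfs paths are the standard ones from the kernel docs):

from pathlib import Path

EDAC_MC = Path("/sys/devices/system/edac/mc")

# If no mcN directories exist, either ECC is inactive or no EDAC
# driver is loaded for this memory controller.
controllers = sorted(EDAC_MC.glob("mc[0-9]*"))
if not controllers:
    print("No EDAC memory controllers found: ECC likely inactive")
for mc in controllers:
    ce = (mc / "ce_count").read_text().strip()  # corrected errors
    ue = (mc / "ue_count").read_text().strip()  # uncorrected errors
    print(f"{mc.name}: corrected={ce} uncorrected={ue}")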
 
I advise any tech enthusiasts to check out AdoredTV on YouTube and his video on Intel's scams throughout the last few decades. This recent news is just another antic of theirs... truly deplorable.
You need to realise that AdoredTV is an AMD fanboy channel. All he does is scold Intel and Nvidia. Even though some of what he says is true, he definitely exposes his bias.
 
Competition is good for us. I don't want AMD or Intel to win. I want them to keep fighting. Just like I don't want liberals or conservatives to win. That would be a disaster for us. I want them to fight. As long as they fight, it's good for us.
 
You need to realise that AdoredTV is an AMD fanboy channel. All he does is scold Intel and Nvidia. Even though some of what he says is true, he definitely exposes his bias.
Of course I know that he's more on AMD's side, but the facts he presents in the video I mentioned are objective things that happened. Everyone is biased to some degree. To his credit, he does do videos slamming AMD as well.
 
You need to realise that AdoredTV is an AMD fanboy channel. All he does is scold Intel and Nvidia. Even though some of what he says is true, he definitely exposes his bias.

I wouldn't go so far as to call him a fanboy, but he does lean AMD. To consider him a fanboy, he would have to never criticize AMD, which he has already done a few times.

It may appear that he's more pro-AMD lately simply because Nvidia and Intel are making headlines with the stupid decisions they're making. It's not like AdoredTV goes out of his way to criticize them; Intel was chastised by plenty of the tech press before he made his video.
 
This will be the same as the server market: it will all come down to power requirements, heat and scalability.

I have to say that I think AMD wins all 3 of those at the moment... There is a reason Cisco is picking them up for their UCS line, and Cisco does billions and billions a year in business.
 
I'd prefer a link. You said there was a video, you should be able to provide a link.

Regardless of what the actual answer is, it is a fact in the foundry business that yields go down as die size goes up. Therefore, it is a fact that Intel's Xeon 8180 has lower yields than their lower-end chips.

I find it very hard to believe that ECC support costs Intel more money than bad chips on a wafer.

Exactly. All AMD Ryzen chips (except Raven Ridge?) support ECC, so much for the "expensive ECC" BS...
No, the feature itself doesn't cost Intel much more, but the validation process and customer support do cost a pretty penny.
The main reason it costs so much is that server profit margins are much, much higher than mainstream ones. Offering ECC support on mainstream parts would cut into their profits and would also fundamentally change their business model.
 
But this article was about exposing Intel's BS and not a processor review.
Perhaps if you'd actually watched the video, you wouldn't need an article to point out what was clearly shown. The only BS is this article pretending to expose something that was never hidden.
 
No, the feature itself doesn't cost Intel much more, but the validation process and customer support do cost a pretty penny.
The main reason it costs so much is that server profit margins are much, much higher than mainstream ones. Offering ECC support on mainstream parts would cut into their profits and would also fundamentally change their business model.

I can understand that, but it still doesn't explain why AMD can provide that same feature in their enterprise-class processors at half the cost. If ECC use and validation were the main drivers of cost, AMD EPYC processors would be nearly as expensive as Xeons. It's pretty clear that while ECC validation costs them money, it's a fraction of their overall costs.
 
Perhaps if you'd actually watched the video, you wouldn't need an article to point out what was clearly shown. The only BS is this article pretending to expose something that was never hidden.

So in the future we should ignore whatever an Intel SENIOR VICE PRESIDENT OF CLIENT COMPUTING says, and instead guess at things based on what we might glimpse in a corner of some video? Anything to excuse Intel chicanery, right? Which of course got them dozens of headlines such as:

"Intel will launch a 28-core 5GHz CPU by the end of the year" - Engadget Today

Never mind that no CPU manufacturer should ever introduce a part based on its OC potential, because that's explicitly NOT part of the deal when you buy one.

This stupid demo of a never-to-be product is just another data point in the sad, sad decline of Intel's technical and marketing leadership. Thank god AMD was able to come from behind, somehow, and restore progress and competition to the market. Think about it: for a comparatively tiny outfit like AMD to even do that shows how badly Intel has been dragging its feet, and milking us all.

The big lesson: We desperately need AMD to stay in the game, and I for one will be buying "AMD Instead" for a long, long time.
 
I'd prefer a link. You said there was a video, you should be able to provide a link.

Regardless of what the actual answer is, it is a fact in the foundry business that yields go down as die size goes up. Therefore, it is a fact that Intel's Xeon 8180 has lower yields than their lower-end chips.

I find it very hard to believe that ECC support costs Intel more money than bad chips on a wafer.

And you should be able to comprehend the basics.

I'll get you your link, since even using Google challenges you. I have to remember the title. In the meantime I'm taking note of how none of your comments to me contain any links to back up your claims after I challenged them, since links seem so important to you now. Just saying.

Start with these poor Xeon yields you claim exist.

ECC support doesn't cost Intel much of anything. Not sure where you even got that.
The actual point was that they CHARGE more for it. Because they can!!

That 8180 is built on 14nm and launched in Q3 2017, so I'm not sure where you're getting this nonsense about incredibly low yields.

GIMME THE LINK!!! smh...
Don't bother looking, because one doesn't exist.

14nm is a MATURE process. Google how many chips Intel has built using it.

This is BASIC stuff, so I'm confused about how you can be so uneducated about what you're talking about, and even more confused about how your comment got so many likes.

FML....

I found the video....
 
No, the feature itself doesn't cost Intel much more, but the validation process and customer support do cost a pretty penny.
The main reason it costs so much is that server profit margins are much, much higher than mainstream ones. Offering ECC support on mainstream parts would cut into their profits and would also fundamentally change their business model.

What are you rambling on about?

My E3-1245 had ECC support and it didn't cost anything more than a normal Skylake.
 
I can understand that, but it still doesn't explain why AMD can provide that same feature in their enterprise-class processors at half the cost. If ECC use and validation were the main drivers of cost, AMD EPYC processors would be nearly as expensive as Xeons. It's pretty clear that while ECC validation costs them money, it's a fraction of their overall costs.

Sigh...
AMD isn't Intel.

You're a frequent flyer on this site, right? So why are you talking like an outsider? The things you're wondering about can be researched, and you're not doing that. Please start, and I don't mean by asking for links.
 