Intel's upcoming Coffee Lake CPUs will reportedly require a new motherboard

Shawn Knight


In response to a user question regarding Intel’s upcoming Coffee Lake CPU lineup, motherboard maker ASRock recently said on Twitter that Coffee Lake chips will not be compatible with 200-series motherboards.

Revelations of this nature are typically shared by chipmakers at the time of reveal, not motherboard partners during the lead-up.

The tweet in question has since been removed, although the information has been independently confirmed by other publications.

Most assumed that, because Coffee Lake shares a similar microarchitecture with Kaby Lake, it would be compatible with 100- and 200-series LGA 1151 boards. As AnandTech highlights, if the rumor is indeed true, it could mean one of several scenarios: the new chips will not be LGA 1151 (which goes against an earlier rumor), motherboards could lock out the new processors using firmware, or the CPUs and sockets will use a different notching system to physically prevent them from being installed in certain boards.
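Of those scenarios, the firmware lockout is the most mundane to picture: board firmware can read the installed chip's CPUID family/model identifiers and simply refuse to POST anything not on an allow-list. Below is a minimal user-space sketch of that kind of check; the CPUID decode is standard, but the allow-list and the idea that vendors would gate on it this way are assumptions, not actual board-vendor code.

```c
/* cpu_gate.c - illustrative sketch of a firmware-style CPU allow-list.
 * Real lockout logic would run in UEFI during early boot; this only
 * shows the CPUID decode and the gating idea. Build: gcc cpu_gate.c */
#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang helper for the CPUID instruction */

int main(void) {
    unsigned eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;  /* leaf 1 unsupported (won't happen on modern x86) */

    /* Decode display family/model per Intel's CPUID conventions. */
    unsigned family = (eax >> 8) & 0xF;
    unsigned model  = (eax >> 4) & 0xF;
    if (family == 0x6 || family == 0xF)
        model += ((eax >> 16) & 0xF) << 4;   /* extended model bits */
    if (family == 0xF)
        family += (eax >> 20) & 0xFF;        /* extended family bits */

    /* Hypothetical allow-list: 0x5E (Skylake-S), 0x9E (Kaby Lake-S).
     * A board vendor could halt POST for any model not listed, even
     * if the CPU physically fits the socket. */
    const unsigned supported[] = { 0x5E, 0x9E };
    int ok = 0;
    for (size_t i = 0; i < sizeof supported / sizeof *supported; i++)
        if (model == supported[i])
            ok = 1;

    printf("family 0x%X, model 0x%X -> %s\n",
           family, model, ok ? "supported" : "locked out");
    return 0;
}
```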

Given that Coffee Lake is expected to be a six-core part, some were no doubt hoping for a plug-and-play upgrade using their existing motherboard. That no longer appears to be possible, and it could present a scenario in which those looking to upgrade throw in the towel and jump ship to AMD's camp.

As always, keep in mind that this is just a rumor and nothing is truly official until Intel publicly announces it.


 
Shawn, would you guess that these Coffee Lake six-core processors will be competitively priced against AMD (around $200-220), or do you believe prices will begin at $300+? I'm not holding your feet to the fire if you're wrong, just curious what you think or may have heard.
 
Aaaand that's why you go AMD this time around. The AM4 platform, while not yet up to Intel's speed, is much more future-proof and has room to grow.

AMD has said it expects 15% gains going to gen 2 and another 15% to gen 3. Whether that's a clock speed bump or an IPC increase, I'm down for whichever.
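For the arithmetic behind those figures: two successive 15% gains compound to roughly 32% over the gen 1 baseline, not 30%. A quick sketch using only the percentages quoted above:

```c
/* gains.c - compounding the generational uplifts quoted above. */
#include <stdio.h>

int main(void) {
    double perf = 1.00;     /* gen 1 (Ryzen) as the baseline */
    perf *= 1.15;           /* claimed +15% for gen 2 */
    printf("gen 2: %.2fx the baseline\n", perf);  /* 1.15x */
    perf *= 1.15;           /* another claimed +15% for gen 3 */
    printf("gen 3: %.2fx the baseline\n", perf);  /* ~1.32x */
    return 0;
}
```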
 
Given that Intel knew AMD was competitive and still priced its X299 chips out of the market, I would not expect much from Intel.
 
Considering most consumers buy prebuilt machines, Intel isn't under much pressure to cut retail prices.

Not saying it couldn't happen, but an aggressive price drop from Intel would cheapen the brand, and that's the last thing they want.
 
The problem with that is Intel has been cheapening its brand this whole time without AMD's help: cheap, dinky little CPU coolers, thermal paste instead of solder on thousand-dollar-plus CPUs, thin, bendable Skylake substrates. At every turn where Intel could save even a penny, it has, while charging as much as it could get away with. Intel has spent most of the last decade forcing a monopoly onto the market and bleeding PC owners dry.
 
This makes AMD even cheaper when you consider that there's no need to throw away a perfectly good motherboard when you upgrade.
 
It's really odd that, when actually presented with competition, Intel becomes even more anti-consumer.
It's almost as if they have a huge fanbase that they know will buy their **** regardless of price and performance. They are similar to Apple in that sense.
 
Yeah, I don't believe that. CPU performance doesn't scale the way GPU performance does. 10nm is also an issue, meaning Intel had to stay on 14nm longer than anticipated; that could also explain the new mesh interconnect. Intel is trying.

TSMC is even attempting a full-node (7nm) process for the first time, so expect them to experience delays too.
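An aside on the scaling claim above: one common way to frame why adding parallel hardware keeps paying off for GPUs but not CPUs is Amdahl's law, speedup = 1 / ((1 - p) + p/n), where p is the parallel fraction of the workload and n the number of execution units. The fractions below are illustrative assumptions, not measurements:

```c
/* amdahl.c - Amdahl's law with assumed parallel fractions. */
#include <stdio.h>

/* speedup = 1 / ((1 - p) + p / n) for parallel fraction p, n units */
static double speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void) {
    const int units[] = { 2, 8, 64, 1024 };
    for (int i = 0; i < 4; i++) {
        int n = units[i];
        printf("n=%4d  GPU-like p=0.99: %6.1fx   CPU-like p=0.60: %4.2fx\n",
               n, speedup(0.99, n), speedup(0.60, n));
    }
    return 0;
}
```

With a mostly serial workload the gain saturates almost immediately (the p=0.60 column flatlines near 2.5x), which is one reading of why per-core gains matter more for CPUs than unit counts do.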
 
The only reason CPU power hasn't scaled like GPU power is that Intel has kept it that way. Remember how Intel was saying it was super hard to improve CPUs, and that's why they were only getting 5% each gen? Yeah, and then AMD came in with Ryzen. Intel was spending the entire market value of AMD on R&D alone each quarter, and I really don't think that money was going to their CPUs. It was either going to other projects or being used as a slush fund. You don't spend that much money and get zero results, and if Intel had had any competition at the time, its shareholders would have been stringing them up.
 
No, when Intel was giving us 5%, AMD dropped out for five years. And do you really think GloFo has better fabs than Intel? You think AMD has better engineers? You know Intel is the only one doing full nodes versus hybrids, right? 10nm is tough. We know this because we read the articles. We know this because TSMC is attempting a full 7nm node for the first time, meaning hybrids aren't cutting it anymore. TSMC could stumble. Intel could stumble. Probably will stumble.

What Intel can't control is software devs, and that's what we could use right now. You can only do so much with hardware before you need to start working more closely with the software guys. It's hard, though, because there are a lot of them. And now AMD wants a piece of that optimization.

A lot can happen between now and 7nm/10nm... or not.
 
I know they are milking the tech. I also know the government has technology years ahead of the public; that's probably where the R&D budget goes. One day they'll release killer robots on the public and then we won't care about the latest CPUs. HA
 
At this point, the process node has very little to do with the performance of a CPU. Sure, you can throw more transistors at the problem, but you can get a lot more by innovating at the architectural level. Graphics cards have slowed their node changes as well, and yet they are constantly eking out more and more performance. So why are graphics cards continually able to make large improvements even on the same node while CPUs are not? Simple: Nvidia and AMD are actually improving their architectures.

If you haven't noticed, Intel could also have given everyone more performance by removing the iGPU, which takes up a significant amount of die space. Intel has consistently offset its node gains by integrating more and more onto the CPU and shrinking the die; Kaby Lake and Skylake both have tiny dies. No, Intel could have given everyone more performance, but it chose, on purpose, to keep everyone on a string and force incremental upgrades. It's the reason you'd have to be pretty uninformed to buy a 7700K even for a gaming rig: it's already maxed out in most games, while its counterpart, the Ryzen 7 1700, has plenty of room to spare. Who in their right mind would advise someone to buy a CPU that's already topped out for a tiny and likely imperceptible increase in FPS (especially without a high-refresh-rate monitor)?

We don't need anyone to apologize for Intel. It has been obvious from the start that they conduct business in a way specifically designed to hamper the market and extract maximum profit. That includes releasing hamstrung products that regress in features, and overlapping product lines, because Intel knows exactly the maximum people will pay.
 
Wow...
A smaller process allows for better efficiency, lower power usage, and more dies per wafer. Not sure how you ignored the key reasons shrinking transistors is a good thing. Not everyone wants AMD's hot stuff.

An Intel iGPU was the top GPU on the Steam hardware survey before the GTX 970 took its spot. So, um, removing it to give "everyone" more performance is foolish at best. Not everyone is into computers. They turn the machine on, open their handful of apps and websites, and they're happy as can be. They care about video and pictures, and that GPU is more than good enough.

If you actually did some research, you'd see that everyone wants to move to the next process for the reasons above. That's EVERYONE.

You seem to know very little about what goes on behind the scenes. You're just that "gimme gimme" type. I get it: little performance improvements year after year suck. We might be surprised in the future, but in the meantime, no one is forcing anyone to upgrade.
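To put rough numbers on the "more dies per wafer" point from this exchange: a common first-order estimate is dies ≈ πr²/A − πd/√(2A) for wafer diameter d, radius r, and die area A. The die sizes below are made-up illustrations, not actual Intel or AMD figures:

```c
/* dies.c - first-order dies-per-wafer estimate on a 300 mm wafer.
 * Die areas are hypothetical; build with: gcc dies.c -lm */
#include <stdio.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

static int dies_per_wafer(double wafer_mm, double die_mm2) {
    double r = wafer_mm / 2.0;
    /* gross dies minus an edge-loss correction term */
    return (int)(M_PI * r * r / die_mm2
                 - M_PI * wafer_mm / sqrt(2.0 * die_mm2));
}

int main(void) {
    /* A hypothetical shrink cutting die area from 160 mm^2 to 100 mm^2: */
    printf("160 mm^2 die: ~%d per wafer\n", dies_per_wafer(300.0, 160.0)); /* ~389 */
    printf("100 mm^2 die: ~%d per wafer\n", dies_per_wafer(300.0, 100.0)); /* ~640 */
    return 0;
}
```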
 
/facepalm

"Small process allows for better efficiency, power usage and more dies per wafer. Not sure how you ignored the key reasons shrinking transistors is a good thing"

And yet that wasn't what I was saying. I was saying that the process node doesn't mean much anymore, because we've hit a point where very few gains come from node size alone.

I don't know why I even bother; you just jump to an incorrect assumption and start insulting me. Instead of even attempting to understand, you attack. This is a perfect example of why it's so hard to have a conversation on the internet.

But yeah, I know nothing. It's not like you can look at my many, many previous posts here and confirm any of it. Nope, can't do that at all.
 