Intel Core i9-7980XE & 7960X Review: 16-Core Is Old News!

This is much worse than I expected. I knew the lower clocks would keep the performance numbers in check, but with 60/80% more cores than the 7900X, I seriously expected a lot more from them. With the exception of a few extreme situations, it's just not worth it.
And if we're talking about big companies with money to burn, ECC memory will be a much more important feature than the best workstation performance money can buy. Proper Epyc/Xeon servers or TR workstations are a much better buy.
 
Nice review. I was actually expecting a bigger performance advantage from these monstrosities, yet only the power consumption is as big as I thought.

BTW, I think the Power Consumption and Blender charts on the 4th page are swapped: the text under the power consumption chart describes Blender performance and vice versa.
 
These things are getting like smartphone processors. For 99% of people, they are worthless. Gamers and benchmark geeks would be about the only ones with a reason to own one of these.
 
This is much worse than I expected. I knew the lower clocks would keep the performance numbers in check, but with 60/80% more cores than the 7900X, I seriously expected a lot more from them.
Why?

The limited performance of these chips in many tests has absolutely NOTHING to do with their capabilities and EVERYTHING to do with the lack of proper utilization/coding/programming/support.
 
These things are getting like smartphone processors. For 99% of people, they are worthless. Gamers and benchmark geeks would be about the only ones with a reason to own one of these.
I'm of the opinion that, at the moment, phone CPUs are not powerful and/or efficient enough. The same goes for their GPUs. Both users and developers are heavily restricted by the hardware, especially since very few people have $1,000 smartphones. It's only a matter of time until your "99%" becomes "10%". Remember when smartphones launched and how people didn't think they needed one?
Ideally, within a few short years we should get the current iPhone X level of performance in budget phones ($150-200).
 
Why?

The limited performance of these chips in many tests has absolutely NOTHING to do with their capabilities and EVERYTHING to do with the lack of proper utilization/coding/programming/support.
It actually has. They added a lot more cores, but as you've seen with the OC results... these chips don't have very high clocks for all of their cores. While software can indeed be optimised further, the biggest advantage Intel had over AMD was the much higher clocks.
With the core counts being very similar, you can say that any optimisation that benefits Intel will also benefit AMD (we've seen this with Ryzen patches, where multithreading optimisations sometimes helped the 8 threads of the i7 too).
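To put the clocks-versus-cores trade-off in rough numbers, here is a minimal Amdahl's-law sketch; the all-core clocks and parallel fractions are illustrative assumptions, not figures from the review:

```python
# Rough Amdahl's-law sketch: more cores at a lower clock only win
# when the workload is highly parallel. The clocks and parallel
# fractions are illustrative assumptions, not benchmark data.

def relative_throughput(cores, clock_ghz, parallel_fraction):
    # Serial part runs on one core; parallel part splits across all cores.
    speedup = 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)
    return clock_ghz * speedup

for p in (0.50, 0.90, 0.99):
    many_slow = relative_throughput(cores=18, clock_ghz=3.4, parallel_fraction=p)
    fewer_fast = relative_throughput(cores=10, clock_ghz=4.3, parallel_fraction=p)
    print(f"parallel fraction {p:.2f}: 18C @ 3.4 GHz = {many_slow:5.1f}, "
          f"10C @ 4.3 GHz = {fewer_fast:5.1f}")
```

The many-core chip only pulls ahead once nearly all of the work is parallel, and any optimisation that pushes the parallel fraction up helps a 16-core Threadripper just as much as an 18-core i9.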
 
It actually has. They added a lot more cores, but as you've seen with the OC results... these chips don't have very high clocks for all of their cores.
Clock speed will certainly help performance but not that much.


While software can indeed be optimised further, the biggest advantage Intel had over AMD was the much higher clocks.
Even with the same clock speed Intel destroyed AMD chips, until Ryzen.
 
These things are getting like smartphone processors. For 99% of people, they are worthless. Gamers and benchmark geeks would be about the only ones with a reason to own one of these.
About the Intel parts, I agree. They seem like nothing more than bragging rights.

Besides server workloads, the kind of workstation workloads that these high-core-count, wide-memory-bus CPUs are good at are CAD/CAM, especially running FE (finite element) analysis. That said, I cannot see spending twice the money for, at best, a 25-percent improvement in performance. For those workloads, the extra money would be better spent on additional RAM.

Intel is still crying "glue"; however, I think the kind of clientele these parts appeal to will not be so easily fooled.
 
The results are not surprising. Games are not properly optimized for CPUs with more than four cores and eight threads. However, it's surprising to see BOTH Threadripper and Ryzen 7 offering almost identical performance in some tasks.

Overall, my stance on Threadripper remains the same: X399 has officially KILLED the X299 platform. There is almost no justification to purchase X299 CPUs when X399's Threadripper lineup, despite currently having only three CPUs, clearly offers better value, better price-to-performance and performance-per-watt, doesn't turn your computer into a winter heater, and doesn't force you to break the bank.

Intel is now relying on blind brand loyalty to keep the profits coming. Let's hope consumers wake up and realize this.
 
I wonder if we could have some benchmarks with programs such as SolidWorks or CATIA. It would be nice to see how much benefit these CPUs bring to workloads with huge assemblies and structural simulation.
 
This is much worse than I expected. I knew the lower clocks would keep the performance numbers in check, but with 60/80% more cores than the 7900X, I seriously expected a lot more from them. With the exception of a few extreme situations, it's just not worth it.
And if we're talking about big companies with money to burn, ECC memory will be a much more important feature than the best workstation performance money can buy. Proper Epyc/Xeon servers or TR workstations are a much better buy.

I am very curious why Intel still won't allow ECC memory support on their HEDT parts, yet TR supports ECC, has more PCIe lanes, and costs much less. I guess some people would still pay double the money for an i9 chip for 10% more performance, which also means running much hotter and drawing more power, while also giving up ECC support.
 
It actually has. They added a lot more cores, but as you've seen with the OC results... these chips don't have very high clocks for all of their cores.
Clock speed will certainly help performance but not that much.


While software can indeed be optimised further, the biggest advantage Intel had over AMD was the much higher clocks.
Even with the same clock speed Intel destroyed AMD chips, until Ryzen.

Athlon destroyed P4 the same way with 2/3 of the clock speed. Guess tides do turn.
 
Athlon destroyed P4 the same way with 2/3 of the clock speed. Guess tides do turn.
What tides? Intel having the better architecture 90% of the time for the past 15 years, with an occasional temporary challenge from AMD? Don't mold information to fit your views; mold your views to fit the information.

AMD makes solid stuff, and Ryzen is the best they have had in many years; it's rivaling Intel's best. But saying or implying it's been about equal over the last decade-plus is blatant bias and a rejection of the facts. AMD fans are like wrestling fans: the ref counted too slow!
 
There is nothing fundamentally wrong with these chips except the one critical thing that really matters: the price! How can Intel price these so high? Do they really cost that much to produce? Surely they can't cost Intel more than $300-400 tops to actually manufacture; they are getting a big profit margin on them.

If the 7960X were $500 cheaper, at around $1200, it would be a perfectly reasonable proposition against the generally slower $1000 1950X Threadripper. At $1700 it's terrible.
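As a back-of-envelope value check, here is a tiny sketch using the prices quoted above; the overall performance advantage of the 7960X is an assumed figure for illustration, not a number from the review:

```python
# Rough performance-per-dollar comparison. Prices are the figures
# quoted in the thread; the relative performance of the 7960X is an
# assumption for illustration only.
chips = {
    "1950X": {"price": 1000, "perf": 1.00},  # baseline
    "7960X": {"price": 1700, "perf": 1.10},  # assumed ~10% faster overall
}

for name, c in chips.items():
    value = c["perf"] / c["price"] * 1000  # performance per $1000 spent
    print(f"{name}: {value:.2f} perf per $1000")
```

On those assumptions the 7960X would have to drop to roughly $1100 just to match the 1950X on performance per dollar, which lines up with the pricing argument above.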
 
I read another review that got the 7980XE to 4.5 GHz across all cores with an air cooler. Check the mount. J/k, looking for the unboxing link!
 
Oh? I guess these weren't as much fun to receive as the snatchdrippers, er, I mean, pantyrippers, er, Threadrippers. Yeah, that's it.
No jacket? No SWAG? WTF Intel? FAIL!
What were they thinking? Can't just send a proc in a box anymore...
 
Why were these 16- and 18-core CPUs not tested at 4K resolution?

For the same reason they don't test 4-, 6-, & 8-core CPUs at 4K resolutions. Repeat after me, class:

At 4K, it's the GPU that becomes the limiting factor in your FPS, not the CPU.

It doesn't matter whether it's a "mere" 4C/8T i7-7700K or some monster 20C/40T beast that you managed to get to 5 GHz; even a GTX Titan (Pascal) or 1080 Ti is going to struggle to render everything at 4K (especially on Ultra settings). Heck, in some games at 4K you'd see little difference in performance between an i7-7700K and an FX-8350, because the GPU is struggling to keep up with almost any CPU.
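To make the point concrete, here is a toy model of a pipelined game loop where the slower of the CPU and GPU frame times sets the frame rate; the frame times below are assumptions chosen to illustrate a GPU-bound 4K scenario, not benchmark results:

```python
# Toy model: in a pipelined renderer, frame rate is limited by
# whichever of the CPU or GPU takes longer per frame. The frame
# times are illustrative assumptions, not measured data.

def fps(cpu_frame_ms, gpu_frame_ms):
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

GPU_4K_MS = 18.0  # assumed GPU render time per frame at 4K Ultra

for cpu, cpu_ms in [("i7-7700K", 5.0), ("18-core HEDT", 6.0), ("FX-8350", 12.0)]:
    print(f"{cpu}: {fps(cpu_ms, GPU_4K_MS):.0f} fps at 4K")
```

All three land at the same ~56 fps because the GPU term dominates; only when the GPU frame time drops below the CPU frame time (lower resolutions, faster cards) does the CPU start to matter.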
 
I'm surprised to see Intel is still arrogantly overpricing its CPUs.

I thought they would have woken up after the arrival of Ryzen and Threadripper, but no.

I'm an i7-4790K user, and it's still fine for all my needs. I don't see any upgrades in the near future, unless the CPU fries. If I need to upgrade, I think I will go for AMD.
 