Building an Affordable 16-Core, 32-Thread Xeon Monster PC

I just picked up a Dell T7600 on eBay; it's dual socket and "should" support the 2670s. When all is said and done I'll have $675 in a dual-socket system with 32 GB RAM, a RAID card, a 1300 W Gold PSU, a FirePro 2270, a case and 3 x 2 TB HDDs, plus some RAM and a CPU to sell, and a Windows 7 key. Really this is about doing it cheap, and I spent days trying to find the cheapest way. 2011 boards are not cheap. The Dell isn't perfect, but I could not build it myself for less, so it wins. If you want a single-socket 2011 system, the Dell T3600 can be had for even less.
 
Hopefully people don't discount the cost of actually using this PC. The increased horsepower over a more energy-efficient model comes at a price. If you were to build this computer with two Xeon E5-2670s and two GTX 980 Tis, its at-load power draw could exceed 900 watts. Assuming you used the computer 8 hours a day for intensive tasks, or you just left it on all the time and didn't use it at load quite as often, it would likely consume $28-$33 a month in electricity (I'm using an average cost of $0.13 per kWh), or about $360 per year. If you put together the same components on a newer board with a more efficient processor, the computer would draw anywhere from 220 to 290 fewer watts. That doesn't sound like a huge difference, but it would be about $9 per month cheaper to run, which is over a hundred dollars per year. In a typical use case, I'll guess that a person might keep this setup for about 3 years before upgrading/swapping any components. That's $300 more in operating costs compared to a more efficient model over the typical-use lifespan of the computer. Just something to keep in mind. By the way, I ran my numbers using the calculator at http://outervision.com/power-supply-calculator.
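The arithmetic above is easy to sanity-check. Here's a quick sketch using the comment's own estimates (not measurements): ~900 W at load for the dual-Xeon build, roughly 250 W less for a newer platform, and an average rate of $0.13/kWh.

```python
# Rough electricity-cost sketch; all wattage and rate figures are the
# comment's own estimates, not measured values.
RATE_PER_KWH = 0.13

def monthly_cost(watts, hours_per_day, days=30, rate=RATE_PER_KWH):
    """Dollar cost of drawing `watts` for `hours_per_day` every day for a month."""
    kwh = watts * hours_per_day * days / 1000
    return kwh * rate

dual_xeon = monthly_cost(900, 8)   # ~$28/month
efficient = monthly_cost(650, 8)   # ~$20/month on a ~250 W leaner build
print(f"dual Xeon ${dual_xeon:.2f}/mo, efficient build ${efficient:.2f}/mo, "
      f"difference ${dual_xeon - efficient:.2f}/mo")
```

At 8 hours a day that lands right in the $28-$33 and "about $9 per month cheaper" ranges quoted above.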

There is no way you are going to exceed 900 watts, even with a pair of 980 Tis. As shown in the article, they are actually quite efficient, using less than 50% more power than a single 5960X. Under maximum load the system can't draw more than 120 watts above a single 5960X, assuming the same GPU setup, which it obviously would in an apples-to-apples comparison.

Even if you take your figures, which are massively inflated, you say this system will cost $300 more after 3 years of operation. Compared to what, though? Some other 16-core rig?

Let's compare it to the 5960X, because we have those figures and it is the most powerful consumer-grade desktop processor you can buy. The entire dual-Xeon build can easily be done for $1,000, while the 5960X build would cost over $2,000. So you would have to run them for 10 years before the Xeons' extra running cost ate up the price difference.
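The 10-year figure follows directly from the numbers in this thread; here's the arithmetic spelled out (build prices from this comment, the ~$100/year delta from the earlier cost estimate):

```python
# Break-even sketch: the dual-Xeon build is ~$1,000 up front versus ~$2,000
# for a 5960X build, but costs ~$100/year more to run ($300 over 3 years).
xeon_build_cost = 1000
i7_build_cost = 2000
extra_running_cost_per_year = 100

breakeven_years = (i7_build_cost - xeon_build_cost) / extra_running_cost_per_year
print(f"break-even after {breakeven_years:.0f} years")  # 10 years
```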

We can also see from the power consumption figures that a single E5-2670 consumes less power than the 5960X and the FX-8350. What's more, it has twice the cores of the Core i7-6700K yet consumes just 40% more power, not twice as much as you might expect (this is obviously due to the lower clock speeds).

Finally, I should point out that at idle the dual Xeons consumed the same amount of power as a regular desktop system using, for example, a Core i5 processor. At a cost of $0.13 per kWh the Xeon system can't cost more than $15 extra per month versus the 5960X, and that assumes it sees 100% load across all 32 threads, 24/7. If you are buying it as a rendering box it obviously won't be doing that, and the time you save will be incredibly valuable.
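The "$15 a month, worst case" claim is easy to check: just invert the cost formula to see how much continuous extra draw $15/month buys at that rate.

```python
# Sanity check: how many watts of continuous 24/7 extra draw does
# $15/month correspond to at $0.13/kWh?
RATE_PER_KWH = 0.13
HOURS_PER_MONTH = 24 * 30

extra_watts = 15 / (RATE_PER_KWH * HOURS_PER_MONTH) * 1000
print(f"$15/month buys ~{extra_watts:.0f} W of continuous extra draw")
# ~160 W, comfortably above the ~120 W worst-case load delta cited earlier
```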

As of this moment, a pair of E5-2670s can be purchased from Amazon for $134.

Meanwhile, the i7-5960X is frequently available for $900 in some shops, some with delivery included, some without: http://www.microcenter.com/product/437205/Core_i7-5960X_30_GHz_LGA_2011-V3_Boxed_Processor

In fact, Amazon was selling it for $900 just yesterday, and that stock is already gone; prices are up again.

This means the i7-5960X is selling really well, and the E5-2670 isn't selling at all :)

You will get much better pricing on the E5-2670 from eBay.

How would it handle stuff like ZBrush and Maya?

Anything that can fully utilize the huge number of threads this configuration offers will be incredibly fast.


Steve-
After reading all the issues various people have listed about power consumption, I'm more in the camp that anyone bright enough and motivated enough to assemble and use a dual-Xeon burner like this one will simply run with a solar mini farm and deal with it.
 
"run with a solar mini farm"
What exactly would that be, one of these?

[attached image: solar-farm.jpg]
 
I can tell you guys that I will actually SAVE some money running this on my electric bill! No need to run the heater as long in the winter months :)

This thing gets hot! I have a test bench setup in my room because I'm out of room in my workshop, and walking into my room yields a huge temperature difference. That's at IDLE!!! Imagine having it perform a 48 hour render at 100% load!
Yep, gonna dial down the heater in those winter months :p Proud owner of a Dual-Xeon Space Heater Rendering Machine.

Now if I could figure out a way to harness the heat during the summer months... water heater???
 
I just finished building a system like this one after reading this article. I've been using it for a week now, and it rocks! Even gaming is very good on my MSI Gaming GTX 970; GTA V etc. run at 60 FPS. Encoding is just great: the latest version of Handbrake does use more cores, and I get 97% core utilization and really good FPS when re-encoding DVDs. In Premiere Pro I also see a 2-minute gain over my i7 3770K for the same render, but Premiere is *not* efficient with many cores, and that's a shame! (See other videos on YouTube demonstrating that.) An overclocked Skylake beats the dual Xeon in Premiere and leaves it in the dust (see here: https://youtu.be/L2Kr_yjQiDU ).
So, yes, it's good, but not *that* good at everything, so think about it. For me it's fine, as it's an improvement over my previous rig, but it is definitely *not* a rational choice I made here! I would have been better off buying an X99 board and a top CPU, but that would have cost a little more. As I'm not a pro, the Xeons are good enough and I'm happy with what I've got. The AIDA64 memory bandwidth benchmarks are tremendous (almost 100 GB/s!), and overall it remains faster than most computers, except the most recent, overkill-expensive machines, for a fraction of the cost. So... do a little thinking and choose wisely!
 
Steve, thank you for the article. I followed your build to the letter. Could you please share your optimized BIOS/RAM settings? I am not so familiar with all the server settings.
Thanks in advance.
 
Hello Steve, thank you for your great effort writing the reuse-"old"-Xeons article. I used it to build a new PC, and it works fine.
While changing BIOS settings (CPU Power Management Configuration, Energy Performance from Balanced Performance to Performance, or enabling VT-d), I killed my ASRock EP2C602 motherboard. After returning it for repair (I think the BIOS was reflashed), it is back on track, but now with VT-d disabled and balanced CPU energy settings. Have you enabled VT-d and set CPU Energy Performance to Performance in your BIOS? Can you give advice?
 
I just spent all summer researching hardware and have been trying to put this build together, and I keep having issues. This is the first time EVER I've tried to build a computer from scratch. I intend to use it as a render farm node. The first time around I had an issue with the ASRock EP2C612D16C-4L: the second CPU socket would not detect the Xeon. I swapped the CPUs and it still wouldn't detect a Xeon in the second socket, so the processors themselves are fine. Eventually I returned the board for a refund. Now I have purchased the ASRock EP2C602 and I can't even get POST to run. I bought a new motherboard battery and reset the CMOS with the ME recovery jumper. Now ASRock support is saying my RAM, the exact same G.SKILL Ripjaws Z series as in your article, is gaming RAM and has not been validated for that board. I'm at a loss here and am hoping someone can help me out. Please.
 
I pulled all the RAM and one of the CPUs, then started loading RAM one stick at a time. I was able to get to the start-up screen with each one. I placed the other CPU in and tested again; same thing. Now with both CPUs in and RAM in slots A, C & G I can get to the settings. As soon as I place a 4th stick in the E slot I get the "b9" code on the motherboard LED display, and the monitor & keyboard won't start up.

I've tried to find a way to message Steve Walton, but his contact info is on lock-down.

Can anyone please help me?

MB- ASRock EP2C602
PSU- EVGA 850
RAM- G.SKILL Ripjaws Z(same as linked in article)
 
I can tell you guys that I will actually SAVE some money running this on my electric bill! No need to run the heater as long in the winter months :)

This thing gets hot! I have a test bench setup in my room because I'm out of room in my workshop, and walking into my room yields a huge temperature difference. That's at IDLE!!! Imagine having it perform a 48 hour render at 100% load!
Yep, gonna dial down the heater in those winter months :p Proud owner of a Dual-Xeon Space Heater Rendering Machine.

Now if I could figure out a way to harness the heat during the summer months... water heater???

Did you get this running with the G.Skill RAM and the ASRock EP2C602 motherboard? I am not convinced the RAM works with this MB.
 
So I just finished this build, mainly for 3ds max and other rendering software. I opted for a dual Strix GeForce GTX 970 OC for budget reasons and since the article mentioned somewhere about using two GTX 980, I assumed that there would be a benefit of using a pair of GPUs.
The thing is, Asrock EP2C602 doesn't seem to support SLI, only Crossfire. Does a dual GTX offer any advantage on this build, or is it just a waste of money?
 
For anyone interested, my dual Intel Xeon E5-2670 build with the ASRock EP2C602-4L/D16, a GeForce GTX 970 OC and 128 GB RAM has an idle consumption of 150-165 watts, depending on use.
Of course, if I render a scene in 3ds Max and use all the cores, consumption can reach 370 watts, but what do you expect?
By the way, I have a spacious Cooler Master HAF XB Evo case with the tried and trusted Cooler Master Hyper 212 EVO coolers (no liquid cooling), and the processors run at 52-58 degrees Celsius. I can actually feel cool air coming from the box. And I'm in a relatively hot climate, I might add.
One note: the ASRock EP2C602-4L/D16 is huge but fits the Cooler Master HAF XB Evo like a glove. You will have to bend the internal motherboard tray a bit so it doesn't touch the board, but that's an easy hack to do.
 
Funny how prices are region-dependent. I just checked and E5-2670 retails for… wait for it — between $2000 and $2150 here (EU, or more precisely Poland).

No cheap workstation for me, I'm afraid.
 
Funny how prices are region-dependent. I just checked and E5-2670 retails for… wait for it — between $2000 and $2150 here (EU, or more precisely Poland).

No cheap workstation for me, I'm afraid.
You should be looking for a used E5-2670 (SR0H8), which you can find for as low as 70 Euro. That's the whole point of this article.
 
I can't say which is the best, but the Cooler Master HAF XB Evo, a horizontal-mount case, worked for me. It's spacious and has fantastic airflow; no need for water cooling.
The removable motherboard tray makes for an easy assembly, but you will have to bend the tray's lip a bit so it doesn't touch the motherboard. That was the case with the EP2C602-4L/D16, as it was too big for the tray (but a perfect fit for the case itself). Also note that some of the motherboard's holes might not match all the screw positions on the tray, but you don't have to use all of them.
In the end, I would definitely recommend the Cooler Master HAF XB Evo if you are willing to tamper with the tray a bit.

Hi All,

What would be the best cabinet for the ASRock EP2C602 board?
 
I have a question for anyone with an ASRock EP2C602 (or variants).
I ran the PassMark performance test and found that my GPU gets a much lower score than it should. In fact I tested two GPUs: a GTX 970, which scored 6500 when it should score 8000-9000, and a GTX 1080, which scored around 8000 when it should score around 12000. I tested all the PCI Express slots with both cards and got similar results. A friend of mine with an i5 processor tested the GTX 970 and got a score of 9300, higher than what I got with the GTX 1080.
Has anyone noticed such a drop in GPU performance with the EP2C602? I can't tell if I got a faulty motherboard, or if it's something to do with my system settings/drivers. Or is this normal for a server motherboard? Can anyone offer any input?
The CPU performance, on the other hand, was better than expected.
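Just to put a number on the gap described here, this quick sketch computes the percentage drop using the post's own scores (the "expected" values are its ballpark figures, with the 970's taken as the midpoint of 8000-9000):

```python
# Quantifying the reported PassMark GPU gap on the EP2C602.
# Scores come from the forum post above; "expected" values are its ballparks.
reported = {
    "GTX 970":  {"observed": 6500, "expected": 8500},   # midpoint of 8000-9000
    "GTX 1080": {"observed": 8000, "expected": 12000},
}
for gpu, s in reported.items():
    drop = (s["expected"] - s["observed"]) / s["expected"] * 100
    print(f"{gpu}: {drop:.0f}% below the expected score")
```

Both cards land in the same rough 25-33% range, which suggests a systematic bottleneck rather than one faulty card.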
 
I'm calling this article out as bogus; the author is getting paid by G.SKILL to include their RAM. The RAM in this article IS NOT compatible with the ASRock motherboard. Has anyone actually built this system as detailed? ANYONE? THIS BUILD IS BOGUS AND DOES NOT WORK WITH THE RAM STATED.
 
I'm calling this article out as bogus; the author is getting paid by G.SKILL to include their RAM. The RAM in this article IS NOT compatible with the ASRock motherboard. Has anyone actually built this system as detailed? ANYONE? THIS BUILD IS BOGUS AND DOES NOT WORK WITH THE RAM STATED.

He must have gone to a lot of trouble to build that system (but with non-G Skill RAM), run all those benchmarks, then claim to have used the G Skill RAM.... If G Skill had actually paid him, don't you think he'd have simply done a "rigged" review of their RAM instead of hiding it in an article that barely features it at all?
 
He must have gone to a lot of trouble to build that system (but with non-G Skill RAM), run all those benchmarks, then claim to have used the G Skill RAM.... If G Skill had actually paid him, don't you think he'd have simply done a "rigged" review of their RAM instead of hiding it in an article that barely features it at all?
All I know is I have tried this build with three brand-new motherboards and keep getting a "b9" code and no POST. When I take all the RAM out but one stick, I can get the machine to POST. I have cycled through each stick and am able to get POST each time. I place a stick in the second slot and can POST. Third slot, POST. Fourth slot, and I'm back to the "b9" code. I have spent countless hours working with ASRock support trying to get it running, and even when I shipped the boards back they found no defects and were able to get them running with the same CPUs and a different set of RAM. They even tested the third board in-house before sending it to me. Still, I am getting the "b9". I have swapped the CPUs and even tried it with just one CPU installed at a time; both E5s work. So tell me why I can't get this working. It would be nice if Mr. Steve Walton were reachable via on-site messaging, but I have found no way to contact him directly through the website. I will happily rescind my comment if anyone can explain what I am doing wrong when I have every single part he specs out and the only variable not working is the RAM. Until then, I don't believe those tests were actually run with the G.SKILL RAM.
 