AMD Threadripper 3990X Review: Absolute Madness!

As new PC buyers come of age (read: get a job and start burning their paychecks on PC parts), whatever hardware has just been released becomes the "must have".

The 8700k became the 9900k and will become the 10900k and will become the 11900k.

My thing is: these CPUs are ridiculously overpowered for gamers. Most games nowadays are GPU intensive and will run fine even on a Core i3 with any RTX card or any new AMD card.

We don't have any "Crysis moments" where you get a game that threatens to melt your computer.

These ridiculous core counts are for workstations, for the same people who buy a new Mac Pro and don't blink.

These AMD processors would look fantastic if I were doing work, but in gaming Intel is blowing their socks off.
 
That's not even the case at 1080p with an RTX 2080 Ti. If you can spot the difference between the Ryzen 9 3900X and Core i9-9900K with an RTX 2080 Ti @ 4K or even 1440p, I'll eat a TRX40 motherboard.


The 2080 Ti accounts for less than 2% of all registered Steam users.

Even though I have one, the reality is the bulk of the market doesn't.

Tell me what the difference looks like when you're using a GTX 1060.
 
We don't have any "Crysis moments" where you get a game that threatens to melt your computer.

You are completely right!
I miss those Crysis "WOW" moments, just taking in the photorealistic graphics. They were truly groundbreaking: you could see the difference from other games, and on top of that, everything in that world was destructible. It launched 12 years ago and I have never seen any game truly make me feel that way since, which is very frustrating. I blame consoles, of course.
 
The 2080 Ti accounts for less than 2% of all registered Steam users.

Even though I have one, the reality is the bulk of the market doesn't.

Tell me what the difference looks like when you're using a GTX 1060.

That's my point, mate! With a GTX 1060 you'll be 100% GPU bound even with a Ryzen 5 2600, so there is no chance you'll notice the difference between AMD and Intel with a more realistic GPU. Hell, we've provided GPU/CPU scaling content proving this.
 
You are completely right!
I miss those Crysis "WOW" moments, just taking in the photorealistic graphics. They were truly groundbreaking: you could see the difference from other games, and on top of that, everything in that world was destructible. It launched 12 years ago and I have never seen any game truly make me feel that way since, which is very frustrating. I blame consoles, of course.


I've run Crysis, Warhead, and Crysis 2 on a lowly i7 with a 1080 and 16GB of DDR4 at Ultra.

The game is amazing, but it makes me wonder how they thought anyone could run it with anything less.

The game's minimum requirements are damn near unplayable, and the recommended requirements aren't high enough either.

When you step up to my i9 Extreme and 2080 Ti with 32GB, it gets even better.
 
You are completely right!
I miss those Crysis "WOW" moments, just taking in the photorealistic graphics. They were truly groundbreaking: you could see the difference from other games, and on top of that, everything in that world was destructible. It launched 12 years ago and I have never seen any game truly make me feel that way since, which is very frustrating. I blame consoles, of course.


We aren't getting any games lately that actually justify the cost of our rigs.
Red Dead Redemption 2 doesn't count because it's a poorly optimized console game ported to the PC.
 
The 2080 Ti accounts for less than 2% of all registered Steam users.

Even though I have one, the reality is the bulk of the market doesn't.

Tell me what the difference looks like when you're using a GTX 1060.

Did you just ask what the difference looks like when using a GTX 1060?!
Zero. Zero!
Forget about the 1060; the difference between the two is non-existent unless you're using flagship GPUs at 1080p. And by flagship I mean the 2080 Ti.

Now let's consider the following:
AMD and Intel gaming performance is identical unless we're talking about 1080p gaming with a 2080 Ti.
But wait, you just said the 2080 Ti was 2% of the market, as if implying it didn't matter.
And yet here you are saying Intel blows AMD's socks off in gaming.
But negate the 2080 Ti.
Intel blows AMD's socks off in gaming.
Negate the 2080 Ti.
Blows socks. Negate the 2080 Ti. Asks about the 1060 difference. Zero!

Man, I smell hypocrisy, bias, and illogic.

 
I appreciate your hard and precise work and hope you don't think I underestimate your tough job.
Can you please test some pro apps like Media Composer, Maya, Nuke 3D, Mocha tracking, etc.? I think those apps are the most targeted ones for such power and price.
 
The 2080 Ti accounts for less than 2% of all registered Steam users.

Even though I have one, the reality is the bulk of the market doesn't.

Tell me what the difference looks like when you're using a GTX 1060.
I think that was Steve's point: you will barely notice the difference with the highest-end 2080 Ti, even at resolutions below what you'd buy such a GPU for.

Going with anything lower tier, there should not be any difference at all, provided you only run the game and nothing else at the same time.

The point of these super-high-core-count CPUs is that you can do things that you simply couldn't do before. Waiting for a render, encode, or calculation job to finish? Why not do a bit of gaming at the same time, both with good performance.

You could also game, software encode, and stream on the same PC.

Those are some specific use cases, but they are there.
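
To make that multitasking point concrete, here is a minimal sketch (not from the review) of keeping a background encode off the cores a game is using, written in Python with psutil. The ffmpeg process name and the 50/50 core split are purely illustrative assumptions.

```python
# Rough sketch: pin a background encode to half the logical CPUs so it
# stays off the cores a game is using. Assumes psutil is installed; the
# ffmpeg process name and the 50/50 split are illustrative only.
import psutil

total = psutil.cpu_count(logical=True)         # e.g. 128 threads on a 3990X
encode_cores = list(range(total // 2, total))  # back half for the encode job
game_cores = list(range(0, total // 2))        # front half left for the game

for proc in psutil.process_iter(["name"]):
    name = (proc.info["name"] or "").lower()
    if name.startswith("ffmpeg"):
        try:
            # Note: on Windows with more than 64 logical CPUs, affinity only
            # applies within a single processor group, so this is simplified.
            proc.cpu_affinity(encode_cores)
        except psutil.AccessDenied:
            pass  # pinning another user's process may need elevated rights

print(f"{total} logical CPUs: encode pinned to {len(encode_cores)}, "
      f"game keeps {len(game_cores)}")
```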
 
Quick question @Steve: which OS version did you run your tests on?

Asking because the AnandTech review looked at the performance difference between Windows 10 Pro and 10 Enterprise, as regular Win 10 will actually see the 64C TR as a dual-socket system.

Edit: I know doing the review is a lot of work, but how about a multi-task scaling review for 8C-64C CPUs, i.e. to see how well they support running multiple intensive tasks simultaneously?
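
For background on why the Windows edition matters here: once a machine has more than 64 logical processors, Windows splits them into "processor groups", which is roughly what "sees the 64C TR as a dual-socket system" is getting at. A minimal Windows-only sketch (using ctypes, not from the review) for checking how many groups the OS created:

```python
# Minimal sketch: query Windows processor groups via the Win32 API (ctypes).
# On a 64C/128T 3990X under regular Windows 10 you would expect two groups
# of 64 logical processors each. Windows-only.
import ctypes

kernel32 = ctypes.windll.kernel32
ALL_PROCESSOR_GROUPS = 0xFFFF  # value from winnt.h

groups = kernel32.GetActiveProcessorGroupCount()
total = kernel32.GetActiveProcessorCount(ALL_PROCESSOR_GROUPS)

print(f"{groups} processor group(s), {total} logical processors in total")
for g in range(groups):
    print(f"  group {g}: {kernel32.GetActiveProcessorCount(g)} logical processors")
```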
 
To be fair, it should now be clear that Crysis was just really badly coded, given some parts *still* struggle.
I don't agree with you, sorry.
I tell you, with some mods applied I still think it's one of the best-looking games ever launched, even 12 years later. And the fact that almost EVERYTHING was destructible (every tree, bush, house, brick, etc.) is impressive. Not one single game is like that, even today.
 
I think that was Steve's point: you will barely notice the difference with the highest-end 2080 Ti, even at resolutions below what you'd buy such a GPU for.

Going with anything lower tier, there should not be any difference at all, provided you only run the game and nothing else at the same time.

The point of these super-high-core-count CPUs is that you can do things that you simply couldn't do before. Waiting for a render, encode, or calculation job to finish? Why not do a bit of gaming at the same time, both with good performance.

You could also game, software encode, and stream on the same PC.

Those are some specific use cases, but they are there.

Yeah, for the non-professional, the multitasking is what it's all about IMO. This is the chip for the guy who wants a desktop mainframe and has the deep pockets to accommodate such a behemoth.
 
Am I the only one who doesn't like that these days a system with a 280W CPU consumes 452W? This kind of power consumption is crazy. AMD usually just throws more dies at it, and it works, it's performant, but it's wasteful AF.
 
Am I the only one who doesn't like that these days a system with a 280W CPU consumes 452W? This kind of power consumption is crazy. AMD usually just throws more dies at it, and it works, it's performant, but it's wasteful AF.
How is that wasteful? Sure, the CPU uses more power, the mainboard does too, then there's cooling, and PSUs are not 100% efficient...

But look at it another way: it has eight times the cores/threads of the i9-9900K system but does not even use twice as much power (452W vs. 244W).

Now, this is not something that most of us would buy, but if you are, e.g., doing rendering, you are getting around 5x the performance for 1.9x the power draw. Seems quite efficient to me.

If you get this to browse the web or play Fortnite...much less so.

Efficiency = the good use of time and energy in a way that does not waste any.
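
To put rough numbers on that trade-off, here is a back-of-the-envelope sketch using the 452W and 244W system figures and the roughly 5x rendering speedup cited above; the speedup is the comment's rough estimate, not a measured value, and varies by workload.

```python
# Back-of-the-envelope perf-per-watt comparison from the figures above.
# The 5x throughput number is the rough rendering speedup cited in the
# comment; substitute measured benchmark data for a real comparison.
tr_3990x_watts = 452       # full-system load, 3990X build
i9_9900k_watts = 244       # full-system load, 9900K build
throughput_ratio = 5.0     # assumed 3990X rendering speedup over the 9900K

power_ratio = tr_3990x_watts / i9_9900k_watts         # ~1.85x the power draw
perf_per_watt_gain = throughput_ratio / power_ratio   # ~2.7x the work per watt
energy_per_job = power_ratio / throughput_ratio       # ~0.37x the energy per render

print(f"power draw: {power_ratio:.2f}x, perf/W: {perf_per_watt_gain:.2f}x, "
      f"energy per finished job: {energy_per_job:.2f}x")
```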
 
Server-grade hardware on a consumer board LOL :joy:

It's extremely niche, but for those looking at this processor it will still be a good deal compared to having multiple workstations, perhaps simplifying workflow as well.
 
You are completely right!
I miss those Crysis "WOW" moments, just taking in the photorealistic graphics. They were truly groundbreaking: you could see the difference from other games, and on top of that, everything in that world was destructible. It launched 12 years ago and I have never seen any game truly make me feel that way since, which is very frustrating. I blame consoles, of course.
I'd rather we have games that are playable on mid-range hardware at 1080p and 60 fps than another Crysis. Today you can find a ton of badly optimized games that will struggle even with powerful hardware.
 
These AMD processors would look fantastic if I were doing work, but in gaming Intel is blowing their socks off.
Yeah, the 3800X versus the 9900K tells you all you need to know about gaming when you match them up core for core and thread for thread: the Ryzen gets beat pretty badly, and you'll need the 3900X just to match the 9900K. That being said, while Intel has 1st and 2nd place for gaming CPUs with the 9900K and 9700K, it's nice to see AMD at least hanging in there now, in a well-deserved 3rd place, even if the value isn't there for gaming-only purposes unless you're getting something like the 3600.

It will be interesting to see if they make a Comet Lake KS processor that will hit 5.3GHz or so on all cores. Good review, and same ol' same ol': happy to see AMD pushing Intel; it's been 10 years, so it was bound to happen sooner or later.
 
Did you just ask what the difference looks like when using a GTX 1060?!
Zero. Zero!
Forget about the 1060; the difference between the two is non-existent unless you're using flagship GPUs at 1080p. And by flagship I mean the 2080 Ti.

Now let's consider the following:
AMD and Intel gaming performance is identical unless we're talking about 1080p gaming with a 2080 Ti.
But wait, you just said the 2080 Ti was 2% of the market, as if implying it didn't matter.
And yet here you are saying Intel blows AMD's socks off in gaming.
But negate the 2080 Ti.
Intel blows AMD's socks off in gaming.
Negate the 2080 Ti.
Blows socks. Negate the 2080 Ti. Asks about the 1060 difference. Zero!

Man, I smell hypocrisy, bias, and illogic.

If you haven't realized it by now, all Quantum ever does is come into AMD threads, boast about how much he's spent on his 2080 Tis, and/or say Intel is still better for gaming, despite the opening sentence of the gaming benchmarks indicating "it's dumb" to game on this processor.

Big shocker: here he is AGAIN.
 
If you haven't realized it by now, all Quantum ever does is come into AMD threads, boast about how much he's spent on his 2080 Tis, and/or say Intel is still better for gaming, despite the opening sentence of the gaming benchmarks indicating "it's dumb" to game on this processor.

Big shocker: here he is AGAIN.


He defeats his own logic with his examples LOL.
 
Crysis used a lot of techniques that were so sophisticated that the hardware of the time couldn't handle them.

It's kinda like forcing Ray Tracing to be a new standard.

Except the hardware can't really handle it even NOW. There are still segments that have performance issues.

Crysis, and the engine it's built on, is an unoptimized mess.
 