Razer Core X External GPU Review

So they pitched an i9-10900K against an i7-10750H? A 10C/20T 5.3GHz CPU versus a 6C/12T 5GHz one? A 125W TDP chip pitted against a 45W TDP processor? At 1080p?

And there was a performance difference? Well colour me surprised...

Sarcasm aside, a far more appropriate setup would have been to use the same desktop system and compare the RTX 3090 in the motherboard's PCIe slot against the external enclosure. There are plenty of LGA1200 motherboards that support Thunderbolt 3.

While it's certainly not a close comparison between processors, those differences are directly relevant to what was tested. In theory I'd love a direct comparison with identical processors, just to see what the bandwidth reduction and latency penalties of TB3 are in different games (rough numbers at the end of this comment), but that's not representative of how you'd actually use a TB3 eGPU.

Because you won't be using a TB3 enclosure for your GPU on any LGA1200 motherboard. Why would you?

Instead, many people will be using TB3 with an i7-10750H or similar; that's a middle-of-the-road processor for the kind of TB3 laptop someone would be considering an eGPU for. There are less powerful 4c8t and more powerful 8c16t systems out there, so this 6c12t i7-10750H is a reasonable option to test with.

And the results certainly suggest that spending on the highest-tier GPUs for an eGPU setup is just throwing money away (current ridiculous pricing aside). Spending the same on an eGPU w/3080 ($350+$800, a realistic eventual price?) and a desktop w/3060Ti ($650+$500) will net you more FPS on the desktop and greater headroom for performance upgrades in the future.
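(Making that arithmetic explicit, using the hypothetical prices quoted above rather than real street pricing:)

```python
# Illustrative cost tally using the hypothetical prices from the comment above.
egpu_route    = {"TB3 enclosure": 350, "RTX 3080": 800}        # laptop assumed already owned
desktop_route = {"desktop sans GPU": 650, "RTX 3060 Ti": 500}

print("eGPU route   : $", sum(egpu_route.values()))     # $1150
print("Desktop route: $", sum(desktop_route.values()))  # $1150
```

Same outlay either way; the difference is which route buys more FPS and leaves the clearer upgrade path.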

Yes, the one-computer advantage is really nice, but IMO the performance loss at the top end, as tested by KG, suggests that money spent there is mostly wasted. IMO the performance loss with my 1080 is not acceptable (GPU money wasted), but the smaller loss seems OK at around the 1060 level.

I'd like to get my 1660 Super (and the 1080) in the TB enclosure again now that I recently acquired a 6c6t TB3 Core i5-8500B Mac Mini (booted into Win 10) to compare to my 6c6t Core i5-8400 PC. There's only a 0.1 GHz difference in all-core turbo and I can manually match the RAM timings, so this will be about as close as I can get to an apples-to-apples comparison.
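(For a rough sense of the bandwidth side of that question, here's a back-of-the-envelope sketch. The figures are commonly quoted approximations rather than anything from KitGuru's testing, and real-world throughput varies with the controller, cable and how much of the link DisplayPort is using.)

```python
# Back-of-the-envelope link bandwidth comparison (illustrative figures only).

def gbps_to_GBps(gbps: float) -> float:
    """Gigabits per second -> gigabytes per second."""
    return gbps / 8

# Desktop slot: PCIe 3.0 x16 = 8 GT/s per lane, 16 lanes, 128b/130b encoding.
pcie3_x16 = gbps_to_GBps(8 * 16 * 128 / 130)   # ~15.75 GB/s

# Thunderbolt 3 is a 40 Gbps link, but the PCIe tunnel is generally limited
# to roughly 22 Gbps of usable data (an often-quoted approximation).
tb3_egpu = gbps_to_GBps(22)                    # ~2.75 GB/s

# Some laptops only wire 2 PCIe lanes to the TB3 port (a "20 Gbps" port),
# roughly halving the usable PCIe bandwidth again.
tb3_half = tb3_egpu / 2                        # ~1.4 GB/s

print(f"PCIe 3.0 x16 slot : ~{pcie3_x16:.1f} GB/s")
print(f"TB3 eGPU (40Gbps) : ~{tb3_egpu:.1f} GB/s")
print(f"TB3 eGPU (20Gbps) : ~{tb3_half:.1f} GB/s")
```

Raw bandwidth is only half the story, of course; latency over the tunnel is the other penalty, and it's exactly what a same-platform test would isolate.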
 
Testing an eGPU system on a laptop is perfectly sensible - as you point out, it's the typical application for such a thing, and since that's the bulk of KitGuru's review, there's no problem with any of that.

But comparing a high-end desktop against a decent, but power-limited, laptop in a CPU-bound configuration isn't showing the relative merits or drawbacks of using an eGPU. Far too many variables are being altered between the test platforms to draw any particularly meaningful conclusions.

Had they tested the eGPU setup on the same platform, at least one could see what the actual difference is between using a Thunderbolt-connected chassis and the native PCIe slot. If it turned out to be minimal, then the reason for the performance gap seen in KitGuru's desktop-vs-laptop results is clearly the platform. This could be investigated further by comparing a mobile RTX chip against a reasonably similar discrete GPU in an eGPU enclosure on the same laptop. Only then could one really say whether or not using a high-end eGPU is cost effective.

On the other hand, if there turned out to be a large fps difference between using TB3 and the motherboard slot, then the issue becomes more straightforward - there's little point in purchasing a high-end graphics card if the eGPU mechanism is going to throttle it to death.
 
I bought a Core X Chroma, and it works as I expected, giving me good graphics performance on my Dell 9560 laptop. I need a laptop for travel, don't want two PCs, and can make do without extra graphics on the road where it's mostly work. Some gotchas:
* The Dell's Thunderbolt port is only 20Gbps (on the 9560, at least), so it's not as fast as it could be.
* I have mine on the floor, which needs a long Thunderbolt cable. Those are few and far between, and you really have to go for the Apple one IMO, which is well over $100 here in Oz.
* I went for an NVIDIA graphics card. The laptop has an NVIDIA 1050 chip, and the drivers won't support both it and my 3070 at the same time, so I have to disable the 1050 on the laptop. That means while I can have two screens, the laptop's is driven by the Intel on-chip graphics, which is very slow. OK for work. I might have been better off with an AMD card, but I don't know whether AMD and NVIDIA will work together. At worst, you'd have to disable the 1050 like I did, but you might get something better than I have.
* The Chroma has an Ethernet port. One less cable, as it comes in over Thunderbolt. But the internet is full of stories about it not working, dropping out and so on, with no fixes. I gave up and use a USB dongle off the laptop. I won't try again until things are reported as being better, if ever.
 
I saw this and wanted it for an easy expansion into crypto mining. Looks great for those unable to go ASIC!
 
This kind of peripheral has always confused me. Is there a significant market? Is it really that much cheaper than simply buying or building a dedicated desktop, given that the performance you get is quite a lot less for any GPU you fit inside?

I mean, realistically you would only have to fit an RTX 3070 to get the same performance as the 3080 tested here. Probably more consistent too.

With the money saved on that GPU, combined with the steep cost of this enclosure (and particularly the $500 version they want you to buy), you can get a decent machine. You would probably have more muscular CPU performance as well, unless you have a monster notebook.

So if the budget for a desktop machine is, say, around $700 without the GPU, I would almost certainly go for a proper desktop build. Hmmm. The usage scenarios where the cost makes sense seem pretty slim.

Personally I love my Razer Core because it's like a portable powerhouse. I can easily afford the most specced-out PC; my problem is that I have to travel a lot and can't take it with me. My MacBook 16 is always with me, and it has a 4K screen and a maxed-out i9 CPU...
Even the onboard Radeon GPU is capable enough to run things at 1080p...

But when I'm home the Core X lets me power it up to something much better... it won't overheat and it will simply run anything (not to mention rendering stuff is super fast)...

I think there is a market for it, and if Apple hadn't killed eGPUs it would do very well. The performance downside is a matter of technology; there's no fundamental problem with it. It's just new tech and it's not well optimised yet.
 