GeForce GT 1030: The DDR4 Abomination Benchmarked

 
In Australia this practice would be illegal if brought to the attention of our consumer affairs office. Nvidia would be slapped with very large fines and ordered to label the products clearly as different. I guess in the USA big corporate scum can do whatever they like.
 
It’s their reputation to uphold, or in this case not to. Seems like a pretty dumb decision not to make a clear distinction between what are clearly two different products.
 
I have an nVidia card but this really is bad. DDR4 shouldn't exist at all on any dGPU, even the bottom rung tier. Do the right thing and pull them from the market.

Edit: Not sure what's up with your power consumption numbers. A +30w disparity for the i3-8100 vs the 2200G when both are idling sounds way off compared to others (link and link). Likewise, "207w" for a 60w 750 Ti? Eh? You were maxing out at 120-140w on the same card with less efficient 22nm 95w Haswells in previous reviews, and moving from 95w to 65w test CPUs increases it by 70w?
 
The first one has a discrete GPU in both the AMD and Intel systems, and the second one lacks a dGPU for both. In these results the AMD system does not have a dGPU while the Intel one does. It's normal for an APU to draw less power than a CPU + dGPU combo.
As for the power draw of the 750 Ti, it depends a lot on how it is measured and what game is used while doing it (AnandTech has the 750 Ti at around 185W total system draw in Crysis 3). The 750 Ti itself is also known to have fairly low power usage.
 
No, that's not it by itself. I asked because in my 2nd rig (HTPC) I have an i5-7500 (virtually the same chip as an i3-8100) + GTX 1050 Ti (twice the horsepower and wattage of a GT 1030), and I'm seeing 22w idle on the iGPU / 25w idle on the dGPU (pulling the dGPU only saves 3w, nowhere near 30w) and 98w max gaming load.

Even ignoring the AMD, I'm just curious how it's possible to draw +50w more power at only half the fps with a dGPU of half the wattage, and also for a 750 Ti to leap up +70w after downgrading from 95w to 65w test CPUs. Is everything else the same (i.e., not a case of suddenly switching to a 550w Platinum vs a 1200w "White" PSU or something, which screws up old-vs-new low-end comparisons)? The APU power figures are accurate, but the rest seem to be virtually double what they should be (and what other sites have), given that the target market for this stuff is running on sub-500w OEM PSUs, down to and including 120w Pico PSUs + laptop bricks for GT 1030s, and still not hitting triple digits. (Edit: In fact, again, Techspot's own previous 1030 review was 30w lower on a platform drawing +10w more, so unless the old card was drawing minus 10 watts...)
 
Tech enthusiasts will largely forgive Nvidia. The same thing happened with the 970: a collective sigh. Nvidia doesn't care about possible reputation damage because it knows its customers will buy its products regardless.
 

As @Puiu pointed out, the Core i3-8100 configurations have a discrete GPU and a higher-rated PSU, while the APU has a 220w PSU. It's not a fair comparison, but that's not at all the focus here ;)


Bloody big difference mate. The GTX 970 performed exactly as advertised by the reviews, the DDR4 GT 1030 is around 50% slower!
 
The big question is why use DDR4?
Surely any cost saving is negated by the massive bottleneck in performance.
This just seems like a freak lab experiment that accidentally made it to store shelves.
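The bandwidth gap behind that bottleneck is easy to put a number on. The snippet below is a back-of-envelope sketch using the commonly published specs for the two GT 1030 variants (a 64-bit bus on both, roughly 6 Gbps effective GDDR5 vs 2100 MT/s DDR4); treat the figures as illustrative rather than measured.

```python
# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * data rate (GT/s).
# The data rates below are the commonly published specs for the two
# GT 1030 variants, not measurements from this review.

def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gtps

gddr5 = bandwidth_gbs(64, 6.0)  # GDDR5 variant: 64-bit @ 6 Gbps  -> 48.0 GB/s
ddr4 = bandwidth_gbs(64, 2.1)   # DDR4 variant: 64-bit @ 2.1 GT/s -> 16.8 GB/s

print(f"GDDR5: {gddr5:.1f} GB/s, DDR4: {ddr4:.1f} GB/s "
      f"({gddr5 / ddr4:.1f}x gap)")
```

With roughly a third of the bandwidth feeding the same GPU core, the ~50% performance deficit the review measured is about what you'd expect.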
 
I never understood the 970 backlash because of what you just stated; once the memory specs were clarified performance was exactly the same.

I am curious how Nvidia has stayed silent on this one though. Half the performance for the same name/price is unconscionable. Thanks for highlighting an issue I had no idea was occurring.
 
This has been happening for as long as I can remember.
About 15 years ago I needed the cheapest card I could get and landed on an MX4000. I didn't really care about its performance, but after playing with it for a while I found out it had a 32-bit (!) memory bus, when one could buy a card with a 128-bit bus under the same name.
Maybe that's why I really don't trust nVidia :)
BTW, Wikipedia does a good job of tracking all these versions; it's always good to take a look before a purchase to know about potential traps.
 
I haven't been a PC gamer for that long, but when I jumped into it the first card I got was a 750 Ti, so it was a good jumping-in point and saved money. You just gotta research.
 
TS: good article. nVidia: bad marketing. FTC: enforcement opportunity, if a misled US customer / DDR4 1030 owner can document it and is willing to file a complaint. (IANAL)
 
This always happens at the low end, and it is crap. What bugs me the most is that the people buying these cards are usually poor and cannot afford to make a mistake like this. It isn't like somebody dropping $800 on a 1080 Ti. Even if you "screw up" and get the wrong OC'ed version of one of the 90 different eVGA cards doing the rounds, you are still getting a great card that can play anything at super high frame rates.
 
Specs should not be something magical and mythical but something clearly written in black and white. Besides, in the few games that did use more than 3.5GB of VRAM, the 970 did see some frametime spikes or slowdowns.
It's indeed not a big deal (certainly not as big as some people may want you to believe), but it is something that should not be allowed. :D
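A rough way to see why spilling past 3.5GB hurt: Nvidia's disclosure put the 970's main 3.5GB segment at about 196 GB/s and the upper 0.5GB segment at about 28 GB/s. The toy model below blends the two with a harmonic mean (time per byte, not bandwidth, is what averages); it's a simplification, since real behavior depends on access patterns and driver placement.

```python
# Toy model of the GTX 970's segmented VRAM. Segment bandwidths are the
# figures Nvidia disclosed (~196 GB/s main, ~28 GB/s upper 0.5GB);
# the blend is a simplification, not a measurement of real games.

def effective_bandwidth(slow_fraction: float,
                        fast_gbs: float = 196.0,
                        slow_gbs: float = 28.0) -> float:
    """Harmonic-mean blend: average the time per byte, then invert."""
    time_per_gb = slow_fraction / slow_gbs + (1 - slow_fraction) / fast_gbs
    return 1 / time_per_gb

print(f"{effective_bandwidth(0.0):.1f} GB/s with no spill")        # 196.0
print(f"{effective_bandwidth(0.1):.1f} GB/s with 10% spilled")     # 122.5
```

Even a small fraction of traffic landing in the slow segment drags effective bandwidth down sharply, which is consistent with the frametime spikes seen above 3.5GB of usage.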
 
Good article all things considered but I didn't enjoy all the constant rhetorical questions in it.

We get it, you think it's rubbish, it is actual rubbish, you don't need to overstate it.
 
Sometimes people need to be told multiple times before they understand just how bad it is :D
 