Today we’re taking a look at one of the worst graphics cards ever released. But before we get to that... imagine you're on a seriously tight budget. Likely some of you won’t have to imagine too hard; after all, PC gaming is something most of us do to unwind after a day of work, relax a bit, and have some fun.
So not everyone can afford to invest hundreds or thousands of dollars in PC gaming when there are other bills to be paid. You might also be a young kid getting into PC gaming and computing. That can be tough for some; most parents are going to be more interested in the idea of buying a console than a computer, which I feel is a mistake, but let’s not get into that here.
The point is, all of these people don’t have endless money to spend on upgrading or building a new PC, so they’re after an affordable graphics card. More often than not they’re also just getting started on their PC gaming journey, so they’re not that experienced when it comes to buying a graphics card.
Some light research will show you that sub $100 graphics cards such as the GeForce GT 1030 are capable of delivering over 60 fps in Fortnite at 1080p, using competitive quality settings. So for most that will be enough and although something like the GTX 1050 is only about $30 more, the GT 1030 does what you need, so why spend the extra money? Of course, we could come up with a few good reasons but I understand why so many people buy these entry-level graphics cards.
So you’ve seen the benchmarks, you know your system can produce at least 60 fps with the GeForce GT 1030, so you buy one for the typical asking price of $90. However, what you don’t know is that not all GT 1030 graphics cards are the same, not even close. You could be buying a GT 1030 that will indeed spit out over 60 fps in Fortnite at 1080p, or you could be getting one that barely keeps the frame rate above 30 fps.
The problem for you is they both have the same name, the exact same name. You might expect the version that’s around 50% slower to be called the GT 1030 LE or 1030 SE or better yet a completely different name like GT 1020 for example. Instead, Nvidia has decided to grossly mislead customers into thinking all GT 1030 graphics cards are equal.
So what exactly is going on here then? Well, more than a year ago, in May 2017, Nvidia released the GT 1030. As the most affordable GPU in the GeForce 10 series it came at an MSRP of just $70. It packed 384 CUDA cores clocked at 1.28 GHz with a boost frequency of 1.47 GHz, while the GDDR5 memory was clocked at 1.5 GHz and, over a mere 64-bit wide memory bus, provided a bandwidth of 48 GB/s.
The GT 1030 wasn’t blowing any socks off but it was what it was, an ultra affordable current generation GPU.
However as of March 2018 budget shoppers could no longer buy a GT 1030 and know exactly what they were getting, at least not without taking note of the memory type used. This is because Nvidia quietly introduced a DDR4 version using the type of memory modern desktop PCs use as system memory. This is a big problem because DDR4 memory is significantly slower than GDDR5 memory.
In this transition, the memory base frequency has been reduced by 30%, and as bad as that sounds, the end result is far worse. GDDR5 memory uses two parallel data links, which provide double the I/O throughput per clock compared to DDR4. This means the effective memory data rate has actually been reduced by 65%, and because we’re still using the same 64-bit wide memory bus, the memory bandwidth has also been reduced by 65%, down from an already anemic 48 GB/s.
The end result sees the DDR4 version of GT 1030 with a 16.8 GB/s link to its 2GB memory buffer. That’s only slightly more bandwidth than the miserable DDR3 version of the GeForce GT 730, released back in 2014. Then for whatever reason Nvidia has also cut down on the core frequency, dropping the DDR4 version down to 1.15 GHz, a 6% reduction.
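The bandwidth math above can be sanity-checked with a quick sketch. This uses the figures quoted in this article (1.5 GHz GDDR5 base clock, a 30% lower DDR4 base clock, 64-bit bus); the per-clock transfer counts are a simplification of how each memory type moves data:

```python
# Peak memory bandwidth sketch, using the figures quoted above.
BUS_WIDTH_BITS = 64

# Simplified model: GDDR5 at a 1.5 GHz base clock moves 4 bits per pin
# per clock (double data rate across two parallel links), while DDR4
# moves 2 bits per pin per clock, from a base clock 30% lower.
gddr5_effective_mts = 1500 * 4   # 6000 MT/s
ddr4_effective_mts = 1050 * 2    # 2100 MT/s

def bandwidth_gbs(effective_mts, bus_bits=BUS_WIDTH_BITS):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

gddr5_bw = bandwidth_gbs(gddr5_effective_mts)   # 48.0 GB/s
ddr4_bw = bandwidth_gbs(ddr4_effective_mts)     # 16.8 GB/s
reduction = 1 - ddr4_bw / gddr5_bw              # 0.65, i.e. the 65% cut
print(gddr5_bw, ddr4_bw, round(reduction * 100))
```

Note how the 30% clock reduction compounds with the halved per-clock throughput to produce the 65% overall bandwidth cut.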
So, the DDR4 version has way less memory bandwidth, but has the same name and looking around online looks to come in at about the same price. So as a potential GT 1030 buyer, how worried should you be? Nvidia didn’t feel the need to change the name, so can it really be that different? Well, we’re about to find out.
For testing, all graphics cards were benchmarked on our Core i3-8100 test system with 8GB of DDR4-2666 memory. Also thrown in for comparison is the Ryzen 5 2400G APU using its integrated Vega 11 graphics, again with 8GB of DDR4-2666 memory. Let’s get to the results...
For testing Battlefield 1 we’ve gone with the low quality preset. Normally we test low-end graphics cards and even APUs with the medium quality settings, but at 720p the GT 1030 DDR4 version was barely able to deliver playable performance. Using the low quality preset did allow for 51 fps on average, and that doesn’t seem too bad, that is until you realize the GDDR5 version is 104% faster.
Moving to 1080p, and now the DDR4 version of the GT 1030 can’t even achieve 30 fps using the lowest possible quality settings. Are you kidding me? The Ryzen 5 2400G APU was 64% faster, and way worse than that is the fact that the GDDR5 version, the original GT 1030, is 118% faster. 118%, I’m not even sure what to say about that right now so let’s move on.
Using the lowest possible quality settings in Prey at 720p, the DDR4 GT 1030 did manage to average just over 60 fps, and again that meant the GDDR5 model was roughly twice as fast.
Upping the resolution to 1080p, and now the GDDR5 model is more than twice as fast as the pathetic DDR4 version. Again we see that with the lowest possible quality settings enabled the DDR4 version can’t even achieve 30 fps on average.
Far Cry 5 is a new title and it’s fairly well optimized, but if you bought the DDR4 version of the GT 1030 you’re gonna have a bad time. Even at 720p using the lowest possible quality settings we couldn’t average 30 fps, meanwhile the GDDR5 model was almost 80% faster, and that’s the smallest margin we’ve seen yet.
Here are the 1080p results and really what can you say? The GDDR5 version of the GT 1030 sucks here, but the DDR4 model is something else, pure garbage it really is.
Although DiRT 4 was playable using the lowest possible quality settings it was still a far cry from what the GDDR5 versions are capable of.
At 1080p the DDR4 version dips well below 60 fps while the GDDR5 model was able to keep the frame rate above 80 fps at all times.
I suspect a lot of GT 1030 graphics cards are currently being purchased by those wanting to get into the Fortnite action, and if true, those who get caught out by the DDR4 version will be livid. Benchmarks will show the GT 1030 pushing over 100 fps at all times using the medium quality settings at 720p. The DDR4 version though struggles to keep the dips above 60 fps, but I guess it’s playable, so there is that.
Of course most won’t want to play at 720p if they can avoid it and avoid it they can’t with the DDR4 version. Here we see just 38 fps on average while the original GT 1030 spat out an average of 66 fps.
Moving on to Rocket League and here we see just 60 fps on average at 720p using the quality preset named ‘quality’. In this instance the GDDR5 model was almost 90% faster with 114 fps on average.
Then at 1080p the DDR4 GT 1030 scrapes by with 32 fps on average and at this more memory intensive resolution the GDDR5 model was 122% faster, just ridiculous that is.
The last game we’re going to look at is Rainbow Six Siege and here we see when using the lowest possible quality options at 720p, the DDR4 GT 1030 is good for 73 fps which isn’t bad, apart from the fact that the GDDR5 model is almost 80% faster with 130 fps.
Then once again at 1080p the DDR4 model’s performance is heavily limited, and here we saw an average of 39 fps, making the GDDR5 model 64% faster.
Not that it matters in the least but here are the power consumption figures. Because the memory is choking the GPU the DDR4 version uses less power, reducing total system consumption by 16%. That said given we often saw a 50% reduction in performance that actually makes the DDR4 version significantly worse in terms of performance per watt.
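That performance-per-watt point is easy to illustrate with a quick back-of-the-envelope sketch. The frame rate and power numbers below are hypothetical placeholders chosen to match the rough margins quoted above (16% lower total system power, roughly half the performance), not actual measurements:

```python
# Hypothetical baseline numbers chosen to mirror the margins in the text:
# the DDR4 card draws 16% less total system power but delivers half the fps.
gddr5_fps, gddr5_watts = 60.0, 100.0
ddr4_fps, ddr4_watts = gddr5_fps * 0.5, gddr5_watts * 0.84

gddr5_eff = gddr5_fps / gddr5_watts   # 0.60 fps per watt
ddr4_eff = ddr4_fps / ddr4_watts      # ~0.36 fps per watt

# Despite drawing less power, the DDR4 card is far less efficient.
print(round(gddr5_eff / ddr4_eff, 2))  # GDDR5 is ~1.68x the fps per watt
```

In other words, a 16% power saving is nowhere near enough to offset a 50% performance loss, so efficiency gets dramatically worse.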
Wow. Where do you even start with a product like this? I’m still coming to terms with the fact that this product even exists, how the bloody hell does this exist?
Actually I should be clear about this, as rubbish as the DDR4 GT 1030 is, I don’t have too much of a problem with its existence. The real issue is Nvidia had the gall to call it a GT 1030 and not the GT 1020, for example. I don’t even know how they can get away with doing that.
It’s often 50% slower, yet has the exact same name and price, again how is this possible?
It’s not like we were trying to cripple performance or test under some unusual condition to make it look worse than it really is. Almost all of the games were tested using the lowest possible quality preset at 720p and 1080p. The GDDR5 model can actually handle higher quality visuals at 1080p and doing so would further cripple the DDR4 version, so we’re actually showing a best case scenario, if anything.
I can’t imagine how upset I’d be if I’d been saving to upgrade my graphics card only to end up with this thing. I couldn't imagine buying another GeForce product. I’d also be very upset with the retailer that sold me the product, even though I do realize they are not the problem here.
This actually led me to reach out and ask a few local retailers what their take was on the DDR4 version of the GT 1030, and whether they ever see any kind of backlash from this sort of thing.
For obvious reasons I won’t name any of the retailers, but I can tell you what they said, and they all said pretty much the same thing. First, the big issue for retailers is that they often don’t know exactly which models they are buying, and I know that sounds silly but it makes sense. The people in these purchasing roles often aren’t computer geeks; they pay attention to product codes and pricing, not what type of memory a graphics card uses. They’ll see GT 1030 stock is low and they’ll order more GT 1030 stock, and it’s not like each brand only has one GT 1030; even before the DDR4 mess most brands offered 4 or 5 models.
So many of them unknowingly bought up DDR4 GT 1030 stock, assuming it was the same GDDR5 GT 1030 stock they’d been buying for the past year. The specs are often supplied and then just copied and pasted to the website, and this isn’t unusual.
Almost all of them admitted that if it wasn’t for the controversy around the spec change they would have been none the wiser, and it would have just been pure chance if they happened to notice the change at all. They also said unless the model name is changed or there is some obvious difference this stuff often goes under the radar.
This then causes massive issues for these retailers, as they are unknowingly selling their customers inferior products, and that’s not something they want to do for obvious reasons. So Nvidia aren’t just hurting their customers, but also potentially hurting the retailers and their own board partners. This is a bad look for all involved and it needs some kind of fix, soon.