The New 3GB GeForce GTX 1050: Good Product or Misleading Product?

This could be because of crypto mining, but perhaps there is also demand for budget 'Fortnite'-capable cards, or for 4K video playback upgrades such as Netflix streaming. I can think of a few other games besides Wolfenstein where the extra memory over the 2GB GTX 1050 might have advantages.

This depends on pricing; it seems like Nvidia are squeezing all they can from their inventory. If they were going to cut the memory bus down, they should have added faster RAM. 8 Gbps modules would have done the trick.
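Rough back-of-the-envelope numbers for what that would mean, assuming the reference GTX 1050 figures (7 Gbps GDDR5, 128-bit on the 2GB card, 96-bit on the 3GB card); these are peak theoretical figures, not measurements:

```python
# Peak bandwidth (GB/s) = bus width in bits / 8 * effective data rate in Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(128, 7.0))  # 2GB GTX 1050, 128-bit @ 7 Gbps -> 112 GB/s
print(bandwidth_gbs(96, 7.0))   # 3GB GTX 1050, 96-bit @ 7 Gbps  ->  84 GB/s
print(bandwidth_gbs(96, 8.0))   # hypothetical 96-bit @ 8 Gbps   ->  96 GB/s
```

8 Gbps modules would claw back a good chunk of the bandwidth lost to the narrower bus.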

What I would say is that a re-test with overclocks would be very interesting, at least with a memory overclock. That could reveal whether memory bandwidth has a significant impact on this card.
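A quick way to read such a result, as a sketch (the offset and FPS numbers below are made up for illustration, not measurements): if the FPS gain tracks the bandwidth gain closely, the card is bandwidth-limited in that title; if FPS barely moves, the 96-bit bus isn't the main constraint.

```python
# Sanity check: does the FPS gain from a memory overclock track the bandwidth gain?
def bandwidth_gain(base_rate_gbps: float, oc_rate_gbps: float) -> float:
    return oc_rate_gbps / base_rate_gbps

def fps_gain(base_fps: float, oc_fps: float) -> float:
    return oc_fps / base_fps

bw = bandwidth_gain(7.0, 8.0)  # stock 7 Gbps pushed to 8 Gbps (~14% more bandwidth)
fps = fps_gain(45.0, 50.0)     # hypothetical before/after average FPS
print(f"bandwidth x{bw:.2f}, fps x{fps:.2f}, ratio {fps / bw:.2f}")
# A ratio near 1.0 suggests the card is bandwidth-bound; well below 1.0 suggests it isn't.
```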

I have had great experiences with memory overclocking on various Nvidia parts over the past few years, and it looks like this card could benefit significantly if the memory used is receptive.

I would also be curious about what core speeds could be achieved.
 
This could be because of crypto mining, but perhaps there is also demand for budget 'Fortnite'-capable cards.
Speaking as someone who has a 4GB 1050 Ti in my secondary rig, it would make far more sense to stop churning out cut-down junk versions (DDR3 vs DDR5, 128-bit cut to 96-bit, etc.) and simply lower the price of the 4GB card by $20 or so. Here in the UK, barely £10 separates the 3GB 1050 from the 4GB 1050 Ti (literally £135 vs £145 for the cheapest versions), and even the most budget of budget gamers isn't going to argue over £10 to avoid losing up to 20% performance (e.g. in ARMA 3) to a 96-bit bottleneck that's supposed to "save" money but clearly isn't. That's why 128-bit was chosen in the first place: it "just fit" 640-768 shaders without being too much or too little. I can only hope their GT 1130 / 1150 cards turn out to be a nice improvement, because their current low-end "affordability enhancements" to the 1030/1050s are a total mess.
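To put my own numbers into cost-per-performance terms (worst-case ~20% deficit, £135 vs £145, illustrative rather than a benchmark average):

```python
# Worst-case cost per unit of relative performance, using the prices and
# the ~20% ARMA 3 deficit quoted above.
cards = {
    "GTX 1050 3GB (96-bit)":     {"price_gbp": 135, "relative_perf": 0.80},
    "GTX 1050 Ti 4GB (128-bit)": {"price_gbp": 145, "relative_perf": 1.00},
}

for name, c in cards.items():
    print(f"{name}: £{c['price_gbp'] / (c['relative_perf'] * 100):.2f} per performance point")
# The Ti works out cheaper per unit of performance despite the higher sticker price.
```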
 
This could be because of crypto mining, but perhaps there is also demand for budget 'Fortnite'-capable cards.
Speaking as someone who has a 4GB 1050 Ti in my secondary rig, it would make far more sense to stop churning out cut-down junk versions (DDR3 vs DDR5, 128-bit cut to 96-bit, etc.) and simply lower the price of the 4GB card by $20 or so. Here in the UK, barely £10 separates the 3GB 1050 from the 4GB 1050 Ti (literally £135 vs £145 for the cheapest versions), and even the most budget of budget gamers isn't going to argue over £10 to avoid losing up to 20% performance (e.g. in ARMA 3) to a 96-bit bottleneck that's supposed to "save" money but clearly isn't. That's why 128-bit was chosen in the first place: it "just fit" 640-768 shaders without being too much or too little. I can only hope their GT 1130 / 1150 cards turn out to be a nice improvement, because their current low-end "affordability enhancements" to the 1030/1050s are a total mess.

Don't expect much of an improvement without a price increase until AMD counters them. If Nvidia pulls off another 30% performance jump and lowers power consumption even further, they can comfortably raise low-end card prices.
 
Speaking as someone who has a 4GB 1050 Ti in my secondary rig, it would make far more sense to stop churning out cut-down junk versions (DDR3 vs DDR5, 128-bit cut to 96-bit, etc.) and simply lower the price of the 4GB card by $20 or so. Here in the UK, barely £10 separates the 3GB 1050 from the 4GB 1050 Ti (literally £135 vs £145 for the cheapest versions), and even the most budget of budget gamers isn't going to argue over £10 to avoid losing up to 20% performance (e.g. in ARMA 3) to a 96-bit bottleneck that's supposed to "save" money but clearly isn't. That's why 128-bit was chosen in the first place: it "just fit" 640-768 shaders without being too much or too little. I can only hope their GT 1130 / 1150 cards turn out to be a nice improvement, because their current low-end "affordability enhancements" to the 1030/1050s are a total mess.

You can get this card for under £130 in the UK. Nvidia are clearly using it to clear inventory, putting every viable die to use. It's not a terrible product, but as I pointed out, it hinges heavily on pricing, which in fairness to Nvidia hasn't been entirely within their control given the past year's inflated GPU and DRAM prices. The real problem is that you can get the 2GB version for only about £115, which means you wouldn't buy this unless it was priced virtually the same.

Really, this card should be priced around the £110 mark, as the original 2GB GTX 1050 was before the market blew up.

As the article says though, unless you are desperate you should wait. That applies to any GPU purchase at this stage, while we wait with bated breath to see if Nvidia throw us a bone this month.
 
Just seems odd to me to cripple bandwidth. Why bother? Do they have a huge pile of chips with defects in the memory subsystem?

Makes me glad I went for the GTX 1050 Ti.
 
With all the bad reviews from the last one, I'm surprised they would unleash this dog .....
 
I think this is (or was supposed to be) a mining card that just no longer has a market. 2GB isn't enough to mine Ethereum, but 3GB is, at least for now. For mining, more cores help and more RAM is necessary, so here's the GTX 1050-based mining card. But now nobody is buying cards to mine with, so here they are.
 
You must stop using percentages for FPS and then distorting things further by averaging those percentages; that creates factless "facts", a big no-no in statistics.
A difference of 30 vs 34 FPS and 100 vs 104 FPS works out to very different percentages, much larger for the lower numbers, when the true fact is the same 4 FPS average gap. For the 8 detailed games, the true difference is about 5 average FPS and 3 low FPS, with the 3GB faster in 4 games, the 2GB faster in 3, and a tie in 1. (Rounded, the 3GB wins by +4.75 avg / +2.25 low, the 2GB by +5 avg / +3.33 low.)
You made it worse by testing at Ultra settings, which kept FPS low and produced larger percentage differences, when you should have used settings that keep the average FPS figures closer together.
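To illustrate with the two FPS pairs above (same 4 FPS gap, very different percentages):

```python
# Same absolute 4 FPS gap, very different percentage figures depending on the
# baseline -- and averaging the percentages skews toward the low-FPS cases.
pairs = [(30, 34), (100, 104)]  # (lower FPS, higher FPS)

pct_diffs = [(hi - lo) / lo * 100 for lo, hi in pairs]
abs_diffs = [hi - lo for lo, hi in pairs]

print(pct_diffs)                        # [13.3, 4.0] percent
print(sum(pct_diffs) / len(pct_diffs))  # ~8.7% "average", dominated by the low baseline
print(sum(abs_diffs) / len(abs_diffs))  # 4.0 FPS, the plain absolute average
```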
 