GeForce GTX 1650 graphics cards are transitioning to GDDR6 due to "industry running out...

midian182


Speaking to PC Gamer, an Nvidia rep said the company released the new card because “the industry is running out of GDDR5, so we’ve transitioned the product to GDDR6.”

An early review of the new GTX 1650 version appeared on Expreview (via Videocardz). It uses the same TU117-300 GPU with 896 CUDA cores as its predecessor, though the base clock in the newer card drops from 1485 MHz to 1410 MHz, while the boost falls from 1665 MHz to 1485 MHz.

The core clocks might be lower, but the updated card comes with 12 Gbps memory modules instead of the 8 Gbps chips found in the GDDR5 version, pushing the maximum memory bandwidth from 128 GB/s to 192 GB/s.
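Those bandwidth figures follow directly from the card's 128-bit memory bus: peak bandwidth is the bus width in bytes multiplied by the per-pin data rate. A quick sketch (the helper name is illustrative, not from any real tool):

```python
def bandwidth_gb_s(bus_width_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * rate_gbps

# Both GTX 1650 variants use a 128-bit memory bus
print(bandwidth_gb_s(128, 8))   # GDDR5 at 8 Gbps  -> 128.0
print(bandwidth_gb_s(128, 12))  # GDDR6 at 12 Gbps -> 192.0
```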

Overall, it was found that the latest GTX 1650 is 5 to 8 percent faster in synthetic benchmarks and 2 to 10 percent faster in most games than its GDDR5-sporting predecessor.

We weren’t overly impressed with the GTX 1650, giving it a score of 60 in our review; the GTX 1650 Super, which uses GDDR6, is a much better buy.

Most modern graphics cards now use GDDR6. The GTX 1660 is one exception, but buyers will likely opt for the GTX 1660 Ti or GTX 1660 Super, both of which use GDDR6 instead of GDDR5.


JohnSmithESP

I really wish Nvidia would have just made a sub 2060 model (2050?) instead of spending the money on the 1660 and 1650.

Everyone should be buying into RTX now.
The problem comes when you want to put in a "worse" chip while keeping some RTX performance; I don't see it being worth it.
With that said, five models under the 2060 are too many.

0dium

"Nvidia isn’t known for sneaking out new graphics cards without announcing it to the world, even when they only offer minor upgrades over their predecessors"

Yeah, they do it only when they get worse memory

Nobina

Imagine the stress of wanting to buy a Ryzen 5 1600AF and a GTX 1650 GDDR6. You'll be looking for a needle in a haystack.

Lew Zealand

Honestly, they shouldn't have made GPUs lower than the 2060 other than for laptops and specific OEMs.
RTX is the future regardless of some morons' statements, so if you want to transition customers it would have been better not to offer cards that specifically couldn't run ray tracing at some level. Yes, it is expensive, but the 1060 was $250 and up and sold like hotcakes, while the 1050 Ti dominated cheap gaming laptops. At this point I hardly see the relevance of sub-$300 desktop GPUs when APUs are capable. Something tells me the problem this gen won't affect the 3000 series as much; the performance from 7nm and the reliability of EUV consistency should in theory allow for RTX pretty much across the board, other than something like the MX lines.
The problem is that APUs are *not* capable.

I have a GTX 1660 Super and it's a great GPU, and I didn't have to pay extra for useless RTX cores. With DLSS 2.0 there's finally an argument to be made for the tensor cores, but GTX 1660 Super levels of CUDA core performance are 4.5x the level of the best APU, the Vega 11 in the Ryzen 5 3400G. I'd rather have that than pay 10-15% more for RTX cores which cut my framerate in half.

Rayneofpayne

The problem is that APUs are *not* capable.

I have a GTX 1660 Super and it's a great GPU, and I didn't have to pay extra for useless RTX cores. With DLSS 2.0 there's finally an argument to be made for the tensor cores, but GTX 1660 Super levels of CUDA core performance are 4.5x the level of the best APU, the Vega 11 in the Ryzen 5 3400G. I'd rather have that than pay 10-15% more for RTX cores which cut my framerate in half.
You are not making sense...
You won't pay for a 2060 because it cuts your framerates in half, yet you buy a lower-tier card with no RT capability, meaning you never have the option and you lose out on the better performance; not all games require 120+ frames.
There is a difference between graphical context and graphical fidelity.

Also, TechSpot, I am calling people out for what they are; people troll and spout nonsense without understanding the critical elements they are typing about.
Sprites, vectors, rasterization, ray tracing.
Ray tracing is a rendering method, not a gimmick, though people of lesser qualification can't seem to distinguish the two. The fact that Nvidia made it possible at this level of compute is amazing. I could argue that, in order for us to progress, unified shaders and the like are becoming irrelevant next to what we could be capable of with specialized cores, and that we could be returning to more of an early 2D/3D video card nightmare, but apparently that is above people. People also apparently don't remember the beginning of raster during the PS1 era, where resolution and framerates were garbage...
Trolls need to be slain, not encouraged.
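For what it's worth, the distinction is easy to make concrete: rasterization projects triangles onto the screen, while the core operation of ray tracing is a per-pixel ray-geometry intersection test. A toy sketch of that test in Python (nothing here comes from a real renderer):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest hit of a ray with a sphere, or None.
    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t
    (direction is assumed to be normalized, so the t^2 coefficient is 1)."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# A ray from the origin looking down -z toward a sphere at (0, 0, -5), radius 1
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1))  # 4.0
```

A real ray tracer runs a test like this (against far more complex geometry) for every pixel, which is exactly the workload the RT cores accelerate in hardware.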

Lew Zealand

You are not making sense...
You won't pay for a 2060 because it cuts your framerates in half, yet you buy a lower-tier card with no RT capability, meaning you never have the option and you lose out on the better performance; not all games require 120+ frames.
There is a difference between graphical context and graphical fidelity.
The point of buying a 2060 over a 1660S is more CUDA cores (good) and RTX/tensor cores (not so good). More CUDA cores get you higher performance, great. RTX gets you much lower performance and is not worth the slight improvement in graphical fidelity in all of what, 6 games? I think HUB says RTX actually seems to be useful in one game: Control. Tensor now seems better and may be worth the expense. In 6 games.

The point of buying the 1660S is significantly lower cost, and correspondingly not wasting those extra dollars on features which are a mixed bag in 6 whole games and not implemented in all the rest of the games out there.

RTX and DLSS are the tech of the future, but not today. There aren't enough cores in the 2060 to be useful, and they're not implemented in the vast majority of games. I wish there were a 1670 Super (for the CUDA cores) instead of the 2060, though the 2060 is an OK, if a little overpriced, card just based on the CUDA cores alone.

JohnSmithESP

You are not making sense...
You won't pay for a 2060 because it cuts your framerates in half, yet you buy a lower-tier card with no RT capability, meaning you never have the option and you lose out on the better performance; not all games require 120+ frames.
There is a difference between graphical context and graphical fidelity.

Also, TechSpot, I am calling people out for what they are; people troll and spout nonsense without understanding the critical elements they are typing about.
Sprites, vectors, rasterization, ray tracing.
Ray tracing is a rendering method, not a gimmick, though people of lesser qualification can't seem to distinguish the two. The fact that Nvidia made it possible at this level of compute is amazing. I could argue that, in order for us to progress, unified shaders and the like are becoming irrelevant next to what we could be capable of with specialized cores, and that we could be returning to more of an early 2D/3D video card nightmare, but apparently that is above people. People also apparently don't remember the beginning of raster during the PS1 era, where resolution and framerates were garbage...
Trolls need to be slain, not encouraged.
Obviously you need to have another tier under it, because defective chips (whatever they're called in English) exist, but I'd like to see the previous gens really cut in price, because at least in Spain prices are super slow to come down.

And RTX is impressive, but at the end of the day it's a visual effect; if it isn't enough of an improvement over traditional effects... why? And to be better ENOUGH it has to beat traditional effects plus higher framerates.
With that said, the improvement in RTX will come at a cheaper cost for the user, but it won't happen until AMD does it too.

Apart from that, if DLSS 2.0 is as good as it seems it would be reason enough, but that would make the RT cores worth it, not path tracing.

Rayneofpayne

Obviously you need to have another tier under it, because defective chips (whatever they're called in English) exist, but I'd like to see the previous gens really cut in price, because at least in Spain prices are super slow to come down.

And RTX is impressive, but at the end of the day it's a visual effect; if it isn't enough of an improvement over traditional effects... why? And to be better ENOUGH it has to beat traditional effects plus higher framerates.
With that said, the improvement in RTX will come at a cheaper cost for the user, but it won't happen until AMD does it too.

Apart from that, if DLSS 2.0 is as good as it seems it would be reason enough, but that would make the RT cores worth it, not path tracing.
Ray tracing is a form of rendering, not an effect... Also, welcome to tech progression; as I stated, at the beginning of rasterization in the PlayStation era it was expensive and a subpar experience.

JohnSmithESP

Ray tracing is a form of rendering, not an effect... Also, welcome to tech progression; as I stated, at the beginning of rasterization in the PlayStation era it was expensive and a subpar experience.
Excuse me, my bad for saying fx.
And the problem is that rasterization was expensive... but RTX (on Nvidia) is expensive, plus the hardware is expensive. And this extra cost is hard to justify in a low tier.