Nvidia might have killed off the RTX 2060 and GTX 1660

midian182

Staff member
Rumor mill: Some of Nvidia's budget-oriented Turing-era cards remain incredibly popular among gamers, but the company is reportedly calling time on some of them: the RTX 2060 and GTX 1660 models. According to the latest Steam survey, the RTX 2060 is the second-most-common graphics card among participants, while the GTX 1660 sits in eighth position.

News that Nvidia was ending production of the RTX 2060 arrived earlier this month via Chinese media outlets. ITHome has backed up these reports, adding that the GTX 1660 is also being killed off.

The RTX 2060 line covers three cards: the standard 6GB version from 2019, the RTX 2060 12GB released in late 2021, and the RTX 2060 Super. The original card sits behind the GTX 1060 as the second-most-popular GPU among Steam users. Despite its age, the RTX 2060 also saw the second-highest number of new Steam users in October.

The GTX 1660 line, meanwhile, also includes the GTX 1660 Super and the GTX 1660 Ti. It's another budget line that appears in both the top-ten most-popular and top-ten best-performing Steam charts for October.

Ending production makes sense for Nvidia. The company is trying to reduce its inventory of RTX 3060 cards—a GPU that will likely overtake the RTX 2060 on the Steam survey at some point—and is hoping to push buyers toward the RTX 4060 once that arrives next year.

Despite the end of production, it'll be a long time before these cards fade out of existence. The RTX 2060, which is still a solid performer, sells for an average of just $167 on eBay, while the GTX 1660 goes for around $107. The RTX 3060, while admittedly more powerful, usually costs more than its $330 MSRP when bought new, and its used price averages about $275.

The global economic downturn has seen demand for expensive tech items crash, from smartphones to PC components. Even the powerful RTX 4080 has failed to sell as well as expected. The situation has made older, budget cards a more appealing prospect.

Conservative consumer spending could be good news for AMD, whose upcoming RX 7900 XTX ($999) and 7900 XT ($899) will likely be compelling buys for those looking to upgrade to new flagships without paying Nvidia's prices.


 
About time.
Now if they would also adjust 3060 pricing accordingly... Because the 3050 won't cut it.


That's caused by the never-ending demand from mid-range miners - won't go away until we have a 4060 12GB card to replace it

Two months after the Merge, several forks of the original Ethereum are worth at least a dollar now.
 
"best-performing" - isn't that a subjective term?

Who's to say that a GTX 960 isn't a "best-performing" card for them? Maybe it plays all the games for that person the best and they don't need anything else.
Or an RX 5700?
Or an RTX 4090?

My point is, you can't claim any one card is "best-performing" over another. I can see the idea of claiming the cards are most popular based on the Steam survey data provided, but that doesn't make them "best-performing".
 
Article Title: Nvidia decided it does not yet have enough money, resulting in killing off the RTX 2060 and GTX 1660 best-selling cards.

FTFY
:)
 
I think the RX 6600 Black Friday deal for $190 after MIR beat the hell out of any of these cards but it still sucks to see two budget cards leave production. We desperately need more competition in the sub-$300 market.
 
"best-performing" - isn't that a subjective term?

Who's to say that a GTX 960 isn't a "best-performing" card for them? Maybe it plays all the games for that person the best and they don't need anything else.
Or an RX 5700?
Or an RTX 4090?

My point is, you can't claim any one card is "best-performing" over another. I can see the idea of claiming the cards are most popular based on the Steam survey data provided, but that doesn't make them "best-performing".

No it's not; the article is clearly talking about performance in terms of user adoption. By that metric, the 2060 and 1660 series are among the best-performing on Steam.

As always words don't mean anything without their surrounding context.
 
These moves make some sense, as long as Nvidia has stock of the cards it should be selling in their stead at decent prices.

2060 $250 --> 3050 $265
1660 $180 --> 1660 Super $200

No fool should be buying a 1660 when they can get a 1660 Super for $20 more.

The 2060 is 15% faster than the 3050 for less money, so that's just a cash grab. The 3050 needs to be $220 or less, as its only advantage is 8GB, which may serve it better in the long run than the 2060's 6GB.
 
The 1660 variants might as well be grouped together; that would make it the second-most-popular card series.

Right now the best card for the money used is the 5700XT. The best card for the money new is the RX 6800 XT.

I think an RTX 2060 12GB would be a decent back up card if prices keep crashing.
 
It is about time! With Nvidia facing a GLUT of cards in inventory, it makes no sense to keep producing these old cards.
 
No it's not, article is clearly talking about performance in terms of user adoption. By that metric the 2060, 1660 series are among the best-performing on steam.

As always words don't mean anything without their surrounding context.
I was going to "grammar Nazi" that that post myself, but you beat me to it.
 
It is about time! With Nvidia facing a GLUT of cards in inventory, it makes no sense to keep producing these old cards.
It may not make any sense to you, but for those of us not bent on "world domination of the gaming realm", it's a huge disappointment.

If only for the fact that after these cards are gone, the majority of gamers on this site will be back to crying their eyes out about the price of the new cards. And those of us who don't need them will be stuck paying the new card prices as well... Then we'll have to listen to a bunch of crap from the elitists here, proclaiming that if "computers are too expensive, we should find another hobby".

Which causes me to wonder, if "money were no object", why people have been buying Intel "F" model CPUs to save $20.00. "You don't need IGP if you're going to use a separate card". Get back to me on that, when you fry that precious card, and can't use the computer. I know, I know, you've all got those contemptible "last year's cards" around to "hobble your machine with". But what a letdown, ay? Me, I'd just re-plug into the IGP, and keep up the "lively discussions", here at Techspot.
 
F processors are not really a $20 saving :( What is a saving is non-K models with external clock generator OC; OC is useless if you pay $100 more for it :(
 
I *just* got a GTX 1650 GDDR4 for my system -- it's like the ONLY non-ancient video card (i.e., GeForce2-era cards, which I was shocked were still available for sale at all) that will run off a PCIe slot without needing extra power connectors -- it just stays within the 75W power limit. (Barely -- with the GTX 1650 GDDR5, the faster memory is enough extra power draw to push it over the limit.) I hope they don't discontinue this! (Edit: Or, if they do, they offer a sub-75W replacement, even if they have to de-rate a card to do so -- which I think is what they did with the GTX 1650; the "no power connector" model I think cut the max clock by about 5% to cut max power from 80W or so down to 75W. Whatever they did, it works.)

Surprisingly, I used to always get a ~$100-150 video card, both because I didn't want to spend more and because that was about the range where you could get a video card that didn't have unusual power or cooling requirements. Now, rather than those cards being raised in price to $200, they are just completely off the market (other than the GTX 1650)... the AMD competition all needs a power connector, and so does the rest of the Nvidia lineup.

Luckily, I got one lightly used for $100; and also luckily, I'm happy with the purchase. I put it in an Ivy Bridge system and the card is essentially a bit overpowered for it -- Furmark can max out the GPU, but generally the CPU (or one core, if the app isn't multithreaded enough) will max out before the GPU does. Every game I've thrown at it runs a treat!
 
I *just* got a GTX 1650 GDDR4 for my system -- it's like the ONLY non-ancient video card (I.e. Geforce2-era, which I was shocked was still available for sale at all) that will run off a PCIe slot without needing extra power connectors -- it just stays within the 75W power limit. (Barely -- the GTX 1650 GDDR5, the GDDR5 is enough extra power draw to push it over the limit.) I hope they don't discontinue this! (Edit: Or, if they do, they offer a sub-75W replacement, even if they have to de-rate a card to do so -- which I think is what they did with the GTX1650, the "no power connector" model I think cut the max clock by about 5% to cut max power from 80W or so to 75W. Whatever they did, it works.)
GDDR4, are you sure? I've only seen them for sale as GDDR5 & GDDR6.
Yours is what's called a "reference card". It's not really "underclocked" as such, but rather not overclocked. I think the stock clock speed of the GDDR5 is slightly higher than the GDDR6's.

What many miss about the video card market today is the fact that the dealers were forced to buy them at jacked-up dealer prices. Consequently, they still have to sell them at marked-up prices, or take a loss.

I watched with particular interest the prices and options on the 1650 models. As of a month or so ago, Newegg was still trying to pull in $220.00 for the Asus "Phoenix Fan" 1650 (1 fan, GDDR5), while a later arrival, the Asus "TUF" 1650 OC cards (2 fans, GDDR6), were selling at the same price.

Up until the spring, maybe summer, people were trying to get at least $190.00 for a 1050 Ti. (No way Jose, I paid a buck forty for mine and wasn't going any higher on a re-buy.)

To make a long story even longer, I recently treated myself to a 1660 Ti, for the same price as the 1650. It's another of the Asus "TUF" OC series. I can speculate that Asus is blowing out their inventory, as the 1660 Ti had a list price of $280.00.

As for cooling, all of the OC'd cards have massive coolers on them, and twin fans, so I don't think heat in the 1650 ranks is that much of an issue, whether the card is a reference card or an OC'd model.

I don't game, so I don't know how much this will matter to you. But I have an EVGA "FTW" (2 fan, 4 GB) 1050 Ti, and I've never heard the fans start up. With the photo editing I do, it's essentially "passively cooled". The fans aren't designed to spin up until the GPU temp hits 50 C.

So, with proper case fans and PSU, I wouldn't fear an OC'd 1650 or 1660. I think the 1660s come in at about 124 watts, with an 8-pin connector. I panicked when I saw "8-pin connector" in the reviews, but the 500-watt PSU for the new rig has a "6+2" connector.
 
It may not make any sense to you, but for those of us not bent on "world domination of the gaming realm", it's a huge disappointment.

If only for the fact that after these cards are gone, the majority of gamers on this site will be back to crying their eyes out about the price of the new cards. And those of us who don't need them will be stuck paying the new card prices as well... Then we'll have to listen to a bunch of crap from the elitists here, proclaiming that if "computers are too expensive, we should find another hobby".

Which causes me to wonder, if "money were no object", why people have been buying Intel "F" model CPUs to save $20.00. "You don't need IGP if you're going to use a separate card". Get back to me on that, when you fry that precious card, and can't use the computer. I know, I know, you've all got those contemptible "last year's cards" around to "hobble your machine with". But what a letdown, ay? Me, I'd just re-plug into the IGP, and keep up the "lively discussions", here at Techspot.

Ever hear of this new thing? It's called the second-hand market. There are MILLIONS of choices!
 
Sure, I have. But couldn't buying a used VGA be proverbially sort of like, "marrying an ex-porn star"?
🤣 🤣 🤣 🤣 🤣 🤣 🤣
Not the same; the card isn't supposed to give you children. More like getting engaged to one, since you only plan to keep it 1-2 years. Now tell me what will it make buying a used card from a miner??
 
Now tell me what will it make buying a used card from a miner??
You mean, "what difference will it make buying a used card from a miner"??? You write a bit like me. I have the words in my head, and then forget to write them down.

I guess you could say there's no warranty. Or, "it could have been run hard and put away wet". Maybe ask yourself, how many times did it hit the century mark with the temp? I'm not denying there could be some deals out there. But you have to be an astute shopper to make sure you find them.

But really, when you crack open a brand spanking new piece of gear, isn't it a bit like giving yourself a big hug, perhaps even a wet kiss or two? :heart_eyes::D And well, I confess to sniffing that 'new box' before I open it. ;).
 
You mean, "what difference will it make buying a used card from a miner"??? You write a bit like me. I have the words in my head, and then forget to write them down.
I'm not a native English speaker, so I guess my writing is not quite grammar-proofed.
The phrase order is reversed in my native language.
Now what I wanted to say is:
Since I got a used card from a miner, that would make me look worse than "marrying an ex-porn star". And since I didn't receive any BJ, DT, CIM, HJ or any other p0rn-related stuff from my card, that makes me look even worse.
Warranty, I got that on purchase, since the miner kept the original invoice; in theory I still have 4 months left of it.
But I tore down the card and replaced the paste and pads. Also swapped the fans with regular ones and gave it a custom shroud.
It looks now like a bimbo, tuned with big "titties" hanging out, aka new fans.

Now back to the topic: Nvidia didn't kill only old cards, it also wants to kill the entire gaming industry. The prices of new GPUs are still way overinflated and above what most gamers are looking to spend. The 4090 is like a Titan-class or a datacenter/professional card and priced accordingly. But the 4080 is the elephant in the room; this card makes no sense to me even priced at $700+VAT. Waiting for the card formerly known as the 4080 12GB to see pricing and laugh my arse off.

Looks like my ghetto-modded bimbo card is here to stay for at least 6 months. I would most probably buy used, since for me buying new equals more money from me to GPU makers.
 
snip

Now back to the topic: Nvidia didn't kill only old cards, it also wants to kill the entire gaming industry. The prices of new GPUs are still way overinflated and above what most gamers are looking to spend. The 4090 is like a Titan-class or a datacenter/professional card and priced accordingly. But the 4080 is the elephant in the room; this card makes no sense to me even priced at $700+VAT. Waiting for the card formerly known as the 4080 12GB to see pricing and laugh my arse off.

snip
I doubt Nvidia wants to "kill the gaming industry". It may seem like that but I guarantee they want that market. Unfortunately for them their plans were greatly impacted by supply chain shortages, Covid and inflation. Their design is expensive to manufacture and they thought the increasing sales and high prices would continue forever, or for a long time. But, crypto mining imploded and there you are.

As Cap'n Cranky stated, many of the AIB companies bought at high prices and cannot absorb the losses to excessively discount old cards to clear out inventory. Nvidia is assisting by withholding inventory of new cards to force sales of last gen series. If the 4080 pricing makes no sense, then AMD is not in a better situation. Even a $1000 7900XTX is out of reach for many gamers.

Enterprise products will generally get you much better margins than consumer-grade devices. Nvidia knows this, and you can see it in the increase in their datacenter sales while consumer GPU sales are falling. I don't blame a company for wanting to be profitable; that's kind of what they are supposed to do. If you're looking for someone to blame, blame the miners and other gamers that paid top dollar for GPUs, driving prices up and keeping them there. I mean, if someone offered you a 100% raise on your salary, would you decline because you don't want to be viewed as "greedy"? Somehow I don't think you would. Also, people have to accept that spending thousands of dollars on gaming machines is a luxury and, frankly, not the best use of that money.

Maybe we all have to accept that getting 400 fps isn't imperative and that a good GPU that can deliver 100-150 FPS is fine. There seem to be plenty of options in that range.
 
GDDR4, are you sure? I've only seen them for sale as GDDR5 & GDDR6.
Yours is what's called a "reference card". It's not really "underclocked" as such, but rather not overclocked. I think the stock clock speed of the GDDR5 is slightly higher than the GDDR6's.

What many miss about the video card market today is the fact that the dealers were forced to buy them at jacked-up dealer prices. Consequently, they still have to sell them at marked-up prices, or take a loss.

As for cooling, all of the OC'd cards have massive coolers on them, and twin fans, so I don't think heat in the 1650 ranks is that much of an issue, whether the card is a reference card or an OC'd model.

I don't game, so I don't know how much this will matter to you. But I have an EVGA "FTW" (2 fan, 4 GB) 1050 Ti, and I've never heard the fans start up. With the photo editing I do, it's essentially "passively cooled". The fans aren't designed to spin up until the GPU temp hits 50 C.

So, with proper case fans and PSU, I wouldn't fear an OC'd 1650 or 1660. I think the 1660s come in at about 124 watts, with an 8-pin connector. I panicked when I saw "8-pin connector" in the reviews, but the 500-watt PSU for the new rig has a "6+2" connector.

Indeed, you're correct, I got a GDDR5 card, my mistake. I got an Asus GTX 1650 4GB Dual Fan OC edition (which I do not think is overclocked, despite the "OC edition" in the name). You could be right about it just being run at reference speed; I do see how many cards are factory overclocked. Indeed, this thing has a nice thick heatsink and dual fans, and even running Furmark, "nvidia-smi" says the fans ramp up just a tad under that load, but even then I cannot hear them at all over the also pretty quiet power supply fan (I'd hope that fan was quiet, given it's only a 240W power supply).

Indeed, I'm wondering what is going to ultimately happen to some of this stock that stores bought at well over what its current market value would be. I was amused when recently looking for a couple of SATA SSDs (for my parents' computers) to find the Samsung 860 Evo 1TB is going for like $140 (I assume because places stocked up on them over the past year when prices were high) while the Samsung 870 Evo 1TB is running under $90. That did make me suspicious that the 870 Evo did something nasty like go DRAMless... nope! For the SATA model, the 870 Evo is not really *better* than the 860 Evo (since both easily max out the SATA interface), but it's not worse either (both have DRAM cache and both have the same TBW rating). (I've dealt with the "cacheless" or "DRAMless" SSDs and I think they are crap; I'm avoiding them like the plague.)
 
I honestly don't even know why they're still producing these out-of-date cards to begin with. The RTX 2060 is now two generations old and the GTX 1660 is 2.5 generations old.
 
I honestly don't even know why they're still producing these out-of-date cards to begin with. The RTX 2060 is now two generations old and the GTX 1660 is 2.5 generations old.

The problem is that the 2060 is only 1 generation old at its market point (to the 3050) and the 1660 is barely 0.1 generations old at its market point (to the 1660 Super).

Nvidia fails to make any new generation cards at appropriate price points to replace these.
 