Nvidia might have killed off the RTX 2060 and GTX 1660

The problem is that the 2060 is only 1 generation old at its market point (to the 3050) and the 1660 is barely 0.1 generations old at its market point (to the 1660 Super).

Nvidia has failed to make any new-generation cards at appropriate price points to replace them.
According to TechPowerUp's relative-performance graphs, the current RTX 3050 is roughly 15% slower than the 2060, while the RTX 3060 is roughly 19% faster than the 2060.
https://www.techpowerup.com/gpu-specs/geforce-rtx-2060.c3310
That gap was too big not to fit their old 2060 into it.
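To put rough numbers on that gap, here's a back-of-the-envelope sketch. The percentages are the TPU figures quoted above, treated as assumptions rather than fresh benchmarks:

```python
# Rough sketch of the gap, using the TechPowerUp relative-performance figures quoted above
# (assumptions: RTX 2060 = 100%, RTX 3050 ~15% slower, RTX 3060 ~19% faster; not fresh benchmarks).
rtx_2060 = 100.0
rtx_3050 = rtx_2060 * (1 - 0.15)   # ~85
rtx_3060 = rtx_2060 * (1 + 0.19)   # ~119

gap = rtx_3060 / rtx_3050 - 1      # ~0.40, i.e. a ~40% hole between Nvidia's two current cards
print(f"RTX 3050: {rtx_3050:.0f}  RTX 3060: {rtx_3060:.0f}  gap: {gap:.0%}")
```

In other words, the 2060 sits almost exactly in the middle of a roughly 40% hole between Nvidia's two current cards.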
 
I honestly don't even know why they're still producing these out-of-date cards to begin with. The RTX 2060 is now two generations old and the GTX 1660 is 2.5 generations old.
OK, the GTX 16xx cards were brought out a little after the RTX 2xxx cards, circa 2019. They were an alternative for those who didn't want, or couldn't afford, the ray tracing offering. Then the pandemic and mining happened, and the scalping began in earnest.

I began to buy parts to build a new box around the beginning of the year, but based it on DDR4, since DDR5 was way overpriced at the time.

So, a cheapie B660 board was purchased, 16 GB of DDR4, and, don't laugh, an Intel i3-12100 w/IGP.

When the prices dropped in April, I purchased an Asus "TUF" GTX-1650 OC 4GB @ $220.00, which at the time was a huge "bargain".

Fast forward to a couple of weeks ago, and Newegg had the Asus "TUF" GTX-1660 Ti OC 6GB on sale for $230.00.

So, like Pinocchio, I wanted to be a "real (gamer) boy" (**), and called Newegg to see if I could trade in the unused, still-in-the-box 1650.

Well, they said NO, but I did get them to match the price of the 1650, plus throw in a $10 gift card, and I bought it. So, effectively I got the 1660 Ti for $210.00 :) Which is $70 below its release price of $280.

Last night, out of curiosity, I went back and shopped the same 1660 Ti, wondering if they were still in stock. Well, it's back-ordered; a new run is supposed to be in tomorrow.

The new run of that very same 1660 Ti is tentatively priced @ $359.95. Whether that's a typo or a scalp, I have no idea whatsoever.

(**) Even though I don't game. But since the xx60-series cards have always come in as the most popular on Steam, I, like Pinocchio, wanted to feel like I was "part of something". :rolleyes:

To conclude, "Jezus H. Christmas", I'm as thankful as the Pilgrims and the Indians at their first dinner together that I bought it when I did!

(In case you weren't wondering.) Since it's brand new, I do go into the closet and sniff its box every coupla days. :heart_eyes:
 
So, a cheapie B660 board was purchased, 16 GB of DDR4, and, don't laugh, an Intel i3-12100 w/IGP.
I'm not laughing, I'm shaking my head sadly because this means that you've never known the joys of cheap and easy AM4 upgrades...

Ok, that's not true... :laughing: :laughing: :laughing: :laughing: :laughing:

But hey, you said yourself that you're not a gamer, and that system is more than enough for general Windows use. Everyone has their own uses for PCs and not everyone is a gamer, so paying more for high-end parts would've been a waste.

However, having to buy even a "cheapie" B660 motherboard was a waste, because if you had jumped on the Ryzen train back in 2017, you could've gotten 10+ years of great performance out of an A320 motherboard, and that would've ended up being much cheaper.
When the prices dropped in April, I purchased an Asus "TUF" GTX-1650 OC 4GB @ $220.00, which at the time was a huge "bargain".
Yeah, I suppose... but the question is, what was the point? I know that you don't game and would've been just as well off with a much cheaper GT 730. That's why I mentioned the A320 board. You don't game and, judging by the Intel model you chose, you don't overclock either. A gaming card just doesn't make sense for you. Well, I mean, based on what you've told me in the past.
Fast forward to a couple of weeks ago, and Newegg had the Asus "TUF" GTX-1660 Ti OC 6GB on sale for $230.00.
So, like Pinocchio, I wanted to be a "real (gamer) boy" (**), and called Newegg to see if I could trade in the unused, still-in-the-box 1650.
Well, they said NO, but I did get them to match the price of the 1650, plus throw in a $10 gift card, and I bought it.
I'm a bit confused by this part. Are you saying that you bought two cards?
So, effectively I got the 1660 Ti for $210.00 :) Which is $70 below its release price of $280.
Well, the thing is... a GTX 1660 Ti for $210 is actually a horrible value. This is especially true when you consider that the RX 6600 has been $216 on Newegg for about 2-3 weeks now. The RX 6600 is 31% faster than the GTX 1660 Ti (according to TechPowerUp) and comes with a free copy of The Callisto Protocol. There isn't even a huge power difference, with the RX 6600's TDP being only 12W more.
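Just to make the value argument concrete, here's a rough performance-per-dollar sketch. The prices and the 31% TPU figure are taken from this post as assumptions, not fresh measurements:

```python
# Back-of-the-envelope value comparison using the figures in this post
# (assumptions: GTX 1660 Ti = $210, RX 6600 = $216 and ~31% faster per TechPowerUp; not lab data).
cards = {
    "GTX 1660 Ti": {"price": 210, "perf": 1.00},
    "RX 6600":     {"price": 216, "perf": 1.31},
}

for name, card in cards.items():
    value = card["perf"] / card["price"] * 1000   # relative performance per $1000 spent
    print(f"{name}: {value:.2f} perf per $1000")
# At these prices the RX 6600 works out to roughly 27% more performance per dollar.
```

And that's before even counting the bundled game.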
Last night, out of curiosity, I went back and shopped the same 1660 Ti, wondering if they were still in stock. Well, it's back-ordered; a new run is supposed to be in tomorrow.

The new run of that very same 1660 Ti is tentatively priced @ $359.95. Whether that's a typo or a scalp, I have no idea whatsoever.
Even if it were, the fact that it went up to some ungodly bad price doesn't mean that it was worth what you paid for it. This is what I talk about when I say that people screw themselves because they're willing to pay anything for a video card in a green box without even considering looking at Radeons. This is exactly why Jensen thinks it's ok to do all the bad things that nVidia does, like this:
He knows that so many will make bad purchasing decisions just because they want a green box so he doesn't care. It's the same reason that Intel pulled all of their crap. People just seem to spend their money on autopilot and they get burned.
(**) Even though I don't game. But since the xx60-series cards have always come in as the most popular on Steam, I, like Pinocchio, wanted to feel like I was "part of something". :rolleyes:
And if they all jumped off of a bridge.....? (Sorry, couldn't resist!) :laughing:
To conclude, "Jezus H. Christmas", I'm as thankful as the Pilgrims and the Indians at their first dinner together that I bought it when I did!
Now that I've pointed out that the RX 6600 would've been an order of magnitude better buy, I don't know if you'll remain that thankful.
(In case you weren't wondering.) Since it's brand new, I do go into the closet and sniff its box every coupla days. :heart_eyes:
You know, I don't blame you. I also love the smell of new circuit boards. I don't know what it is, but I've loved it ever since I was a kid. Going into a computer store, there was that "new PCB" smell all through the store and I couldn't get enough. Maybe that's called a tech addiction? :laughing:
 
I'm not laughing, I'm shaking my head sadly because this means that you've never known the joys of cheap and easy AM4 upgrades...

Ok, that's not true... :laughing: :laughing: :laughing: :laughing: :laughing:
No, it's not true. After all, I get inundated with all aspects of AMD's superiority, including performance, corporate integrity, and morality, every time I log in here. One might even say "regaled", or degraded and talked down to, about my (suspected) allegiance to "team blue".

So, as morally bankrupt and naive as it would seem to many here, I build with Intel and Nvidia parts. It isn't out of a sense of loyalty, nor am I an Intel groupie; it's just that I'm the farthest thing from a rabid AMD fanboi that you're ever likely to find.

I "build", (plug parts together), with those brands, simply because, I understand the numbering systems they have always worked thus far..

Yes, I have a GTX-1650 AND a GTX-1660 ti. (Feast or famine, so they say).

However, spread across my other machines, I have, (wait for it), a GT-730 (which I just grabbed recently for $59 US), 2 GT-1030s, a GTX 1050 Ti (4GB), and a lowly GT-710 (I only own that turd for the 2 GB of VRAM, and it's better than nothing).

The point you've missed is, I never bother with "upgrading", I simply build another machine. Although I have added extra RAM and SSDs to all my machines.

I would like to discuss this further; however, I have to go outside and do more "manly things", like playing with an acetylene torch on my 25-year-old Suzuki "Sidekick's" exhaust system.

We'll chat further, but for now Adieu. :)

BTW, "Box" is a colloquial term in the US in reference to.....(never mind).
 
No, it's not true. After all, I get inundated with all aspects of AMD's superiority, including performance, corporate integrity, and morality, every time I log in here. One might even say "regaled", or degraded and talked down to, about my (suspected) allegiance to "team blue".
I've never accused you of that and never would. I don't believe for a second that you have an allegiance. You have far too much sense for that. I don't have an allegiance either, as in, I'm not a fanboy, I'm a hater! :D
So, as morally bankrupt and naive as it would seem to many here, I build with Intel and Nvidia parts. It isn't out of a sense of loyalty, nor am I an Intel groupie; it's just that I'm the farthest thing from a rabid AMD fanboi that you're ever likely to find.
That's fair. I can't blame another for being a hater because I am one myself. I am curious though, what made you hate AMD? I ask not to belittle you but because I know that you won't BS me and I really am curious.
I "build", (plug parts together), with those brands, simply because, I understand the numbering systems they have always worked thus far..

Yes, I have a GTX-1650 AND a GTX-1660 ti. (Feast or famine, so they say).

However, spread across my other machines, I have, (wait for it), a GT-730 (which I just grabbed recently for $59 US), 2 GT-1030s, a GTX 1050 Ti (4GB), and a lowly GT-710 (I only own that turd for the 2 GB of VRAM, and it's better than nothing).
I see nothing wrong there. For your purposes, to spend more would be a waste. Since AMD has no competitor to the GT 730, that's what I usually recommend to people who just want a non-gaming display adapter. It's cheap and it works which is the sensible thing to do.
The point you've missed is, I never bother with "upgrading", I simply build another machine. Although I have added extra RAM and SSDs to all my machines.
Waste-not, want-not. That's the best way.
I would like to discuss this further; however, I have to go outside and do more "manly things", like playing with an acetylene torch on my 25-year-old Suzuki "Sidekick's" exhaust system.
OMG, I love Sidekicks! I was a Suzuki fanatic back in the 90s. I personally had a Swift GT and I used to marvel at how many derivatives of the Swift and Sidekick there were.

Chevy/GMC/Geo Tracker, Asuna Sunrunner, Chevy Sprint, Geo Metro and Pontiac Firefly come to mind. Those were the days and as long as it doesn't rust, you can't kill that Suzuki 1.6L L4 in the Sidekick.
We'll chat further, but for now Adieu. :)
I look forward to it good buddy. Have a good evening!
BTW, "Box" is a colloquial term in the US in reference to.....(never mind).
That reminds me of a blonde joke that I think you'll enjoy..

Q: What does a fake blonde have in common with a Boeing 747?
A: They both have a black box.

Cheers! ;)
 
The problem is that the 2060 is only 1 generation old at its market point (to the 3050) and the 1660 is barely 0.1 generations old at its market point (to the 1660 Super).

Nvidia has failed to make any new-generation cards at appropriate price points to replace them.
Since the current generation is 4000 and the last generation is 3000, doesn't that make the 2000 two generations old? It does by my count. It was only one generation old until the RTX 4090 was released.

Derivatives are semantics because the GPU in the GTX 1660 variants is the same.
 
Since the current generation is 4000 and the last generation is 3000, doesn't that make the 2000 two generations old? It does by my count. It was only one generation old until the RTX 4090 was released.

Derivatives are semantics because the GPU in the GTX 1660 variants is the same.

The tech is definitely 2 generations old for the 2060 and 1660, as is the corresponding power efficiency (or performance level at the same wattage). However, there are zero better choices at these price points from Nvidia in the current generation, and 1 gen back, the choice is actually *worse* at the ~$270 price point than the 2-gen-old parts.

Discontinuing older gen parts usually happens because there are newer gen parts to replace them which are better. But not for Nvidia below a laughable $360.
 
That's fair. I can't blame another for being a hater because I am one myself. I am curious though, what made you hate AMD? I ask not to belittle you but because I know that you won't BS me and I really am curious.
I don't "hate" AMD. It's just that I'm vexed and very annoyed at the cult worship displayed by many members here.. The simple fact here is, I'm not ramming Intel down anyone's throat, but I seem to be consistently having AMD rammed down mine. I get it, people are trying to, "lead me out of the darkness into the light". Unfortunately, I'm a night owl. Whooooo, Whooooo.

AMD doesn't have foundries of their own. TSMC (?) cooks all their wafers. Which makes them, (IMO), a CPU "design firm", not a manufacturer. Granted designing these chips is no mean feat. However, I think that TSMC should be given the lion's share of the accolades, now being bestowed upon AMD. It's one thing to think it, but an entirely different and more difficult thing, to bring it to physical fruition.

On a sociopolitical note, suppose China decides they really want Taiwan under their control and invade. Point blank, that could spell AMD's demise. Would the Taiwanese government destroy the foundries? We obviously won't know until if, or when, that happens.

TSMC is building fabs here in the US. Interesting choice, ay? What that does is tacitly make TSMC a US company, which of course forces the US into a (tacitly, again) mutual defense pact with Taiwan. Which could possibly force the US into a direct war with China in defense of TSMC's "overseas holdings". That's, of course, the worst case. Obviously there are many other potential scenarios or outcomes.

The dumbest thing I heard was that "the CHIPS Act" was going to give Intel an "unfair advantage over poor beleaguered AMD". Intel, granted, is being given a sweetheart deal. However, AMD doesn't have fabs, nor is it intent on building any. Thus, the CHIPS Act can have no direct effect on them.

When Intel released the Core 2 Duo E6300, it, for all intents and purposes, got AMD kicked out of Silicon Valley because they couldn't pay their bills. Then AMD hibernated for a decade, while Intel either rested on its laurels or was incapable of getting 10 nm or smaller processes to function.

As a result, I swore off building a new machine from Skylake (Gen 6?) to Alder Lake (Gen 12). My "newest" machine, ATM, is based on a Z170 / i5-6600K. The i3-12100 handily kicks its a** in all performance aspects, and competes reasonably with higher-core-count (1 gen earlier?) AMD CPUs. "Wait until Ryzen 7" then becomes the cult battle cry. To which I say, "fu*k off and die".

My initiative is to build something better than my last project, but not necessarily something better than anybody else has.

As for my overbuilding with respect to not being a gamer: FWIW, all modern imaging programs, even the lowly Photoshop Elements, use VGA hardware acceleration. So it's (IMO) better to have an overabundance of VGA than to whimper in forums, "my video card is running 100 C when I run 'Warlike Simpleton', what can I do about it?" Meanwhile, I sit here with my lowly 1050 Ti running just above room temperature. Who's better off?

Anyway, my desktop monitors are all 1440p now (60 Hz though). So, should I dip my toe into light gaming, the 1650 & 1660 Ti should work for that resolution.

Affinity Photo 2 is, (I hope), still on sale for $41, and I'm considering a later version of M$ Flight Simulator. Again, my new cards just about hit the recommended requirements for that. Who knows, maybe I'll come down with a case of "gaming fever" and hit up GoG for a coupla freebies.

So, "AMD is morally upstanding, while Intel is a lecherous bully". I honestly don't care either way. My avoidance or ignorance of those facts is, as they say, "bliss".

Last year I bought an MSI GT-730 from an Amazon seller for $50.00. It was almost DOA. That didn't deter me from buying another GT-730, it just deters me from buying anything else from MSI. Can't help it, that's the way I roll. (It was the first part I ever bought from them.) The EVGA GT-730 I bought to replace it works all fine and dandy. It doesn't crash the driver until I have about 200 tabs of 'durty pitchurs' open. And even then, all I have to do is walk around the back, pull the DVI cable, plug it back in, and we're back up and drooling. :rolleyes: The GT-710 it replaced crapped out much sooner.

As to the MSI card debacle, to go through a bunch of aggravation, phone calls, emails, obtaining an RMA, along with return shipping charges over a $50 part, is, (IMO), pointless. So, I wrote it off, considered it a life lesson, and moved on.

That's about all I have the literary ineptitude to post for now,
Cheers.

We can talk about ships, shoes, sealing wax, and Suzukis, down the road a bit. :) (y) (Y)
 
I don't "hate" AMD. It's just that I'm vexed and very annoyed at the cult worship displayed by many members here.. The simple fact here is, I'm not ramming Intel down anyone's throat, but I seem to be consistently having AMD rammed down mine. I get it, people are trying to, "lead me out of the darkness into the light". Unfortunately, I'm a night owl. Whooooo, Whooooo.
:laughing: No, I'm not trying to lead you "out of the darkness" and I'm not a fan of AMD. I only buy AMD by default because I hate both Intel and nVidia for what they've done and there's nobody else. If VIA/S3 or Matrox came back, I'd consider them just as much as I do AMD. I was never personally fleeced by Intel or nVidia because I dropped them before their worst practices took place. I was lucky, working at Tiger Direct meant that I saw it coming and got out of the way in time. My hate isn't about what Intel or nVidia have done to me, it's about what they've done to others. I just don't like seeing good people getting fleeced. I suppose you could say that I'm being altruistic.

The other thing I had learnt from Tiger Direct was how delicate the market is and how easily it can be destroyed if it goes too far out of balance, something that would impact all of us directly. We saw what Intel did when they had the dominant position and we see now what nVidia does. I don't want anyone to dominate, I want them all to compete so that the market remains fairly even. It's what's best for all of us and I think you'd agree with that.

The thing is, with the market nowhere close to even right now, it's in all of our best interests to buy AMD products in an attempt to "right the ship" so to speak. Right now, AMD has less than 10% of the video card market. One more generation like that and I can guarantee you that AMD will stop making video cards and Jensen will get to keep telling us that "Moore's Law is dead" as he smiles and charges you $3000 for a mid-range card.

It's not about "loving" AMD (which would be ludicrous), it's about loving the community and the PC/PC gaming market in general. Both Intel and nVidia have committed acts that have severely damaged these markets and thus, the community. This is not open for debate, it is proven and indisputable. To be fair to them, if people weren't a bunch of spineless sheep just buying Intel and nVidia because they don't know better and don't want to, the tactics of Intel and nVidia wouldn't have been successful. Someone like you who buys nVidia because of "AMD fanboys" isn't the norm. If you look at the market, the vast majority of fanboys are fanboys of Intel and nVidia and they always have been.

The reason you may see or hear AMD fanboys more is because they're not the establishment, they're the rebels. Now, a lot of the time you'll probably mistake someone like me who just wants the best and healthiest market for all of us as an AMD fanboy because a lot of them realise just how close we are to market collapse. Intel and nVidia can afford to stumble and stumble badly. AMD doesn't have that luxury because they never had the fanbase of the other two. If they did, they wouldn't be where they are today.

AMD has lost market share during a GPU generation in which they were closer to nVidia than they had been since the R9 Fury came out almost 8 years ago. It's patently clear that buying a Radeon right now is a far better idea than buying a GeForce card. Punishing AMD for the actions of some *****s who seem to worship them, *****s over whom they have no control, is only slitting your own throat. Sure, it might get annoying (VERY ANNOYING, I know) and doing what you did might make you feel good in the short term, but if Radeons leave the market, we're all royally screwed.

Keep one other thing in mind. We've never met, and the same thing is true about probably 99.9% of the people that you've spoken to on the internet. So, like, how do you know that these "AMD fanboys" aren't "nVidia fanboys" who are deliberately being annoying to push you towards nVidia through reverse psychology? If nothing else, nVidia has shown that it knows marketing, and marketing is just psychology. I'm not saying that this is the case, but what I am saying is that the chance of it being the case is no less than 50% because we literally don't know who we're talking to on the internet.
AMD doesn't have foundries of their own.
Neither does nVidia but I don't see you knocking them for it.
TSMC (?) cooks all their wafers. Which makes them, (IMO), a CPU "design firm", not a manufacturer. Granted designing these chips is no mean feat.
I don't think that you're aware that the only chipmaker that doesn't use 3rd-party fabs is Intel. All other chipmakers like AMD, nVidia, Apple, etc. use 3rd-party fabs like TSMC, Samsung and GlobalFoundries.
However, I think that TSMC should be given the lion's share of the accolades, now being bestowed upon AMD. It's one thing to think it, but an entirely different and more difficult thing, to bring it to physical fruition.
Ok, now you've really lost me because by that (completely flawed) logic, the lion's share of accolades bestowed upon nVidia, Apple and others should also go to TSMC, Samsung and GlobalFoundries. You seem to be hating AMD for doing exactly what everyone else does which isn't even remotely fair.

Have you ever heard of GlobalFoundries? They used to be AMD's fab but Intel's illegal practices hurt AMD so badly that they had to spin it off. They were trying to compete with Intel at Intel's level which would've been better for everyone but Intel misused their market position to cut AMD off at the ankles. Intel did this knowing that it would be worse for everyone in general (including YOU) but just didn't give a rat's posterior. I'm guessing that you didn't know any of this because I can't believe that you'd support a company that did that to you and everyone else. You don't strike me as the spineless type.
On a sociopolitical note, suppose China decides they really want Taiwan under their control and invade. Point blank, that could spell AMD's demise. Would the Taiwanese government destroy the foundries? We obviously won't know until if, or when, that happens.
No it wouldn't. AMD would simply switch to Samsung or GlobalFoundries just like Apple and nVidia would. Intel has even said that they would act as a third-party fab for those three companies if the need ever arose (and of course they would because it's money).
TSMC is building fabs here in the US. Interesting choice, ay? What that does is tacitly make TSMC a US company, which of course forces the US into a (tacitly, again) mutual defense pact with Taiwan. Which could possibly force the US into a direct war with China in defense of TSMC's "overseas holdings". That's, of course, the worst case. Obviously there are many other potential scenarios or outcomes.

The dumbest thing I heard was that "the CHIPS Act" was going to give Intel an "unfair advantage over poor beleaguered AMD". Intel, granted, is being given a sweetheart deal.
You really don't know what happened between Intel and AMD do you? Now, I'm starting to understand where you're coming from. I don't blame you for not knowing because it happened long ago (well over a decade) but the damage that Intel did to AMD meant that AMD was almost forced into insolvency and had to sell their fabs. If you want to know what Intel did, here's the cold, hard truth of the matter:
Now, to be 100% fair, AMD was on a spending spree because the Athlon 64 had been a resounding success so AMD was trying to expand while they could afford to do so. This is also when they bought ATi. Now, AMD wasn't expecting Intel to do what they did and the money that they spent made them all the more vulnerable to the anti-competitive actions that Intel was about to take.

I'm going to not read the rest because without knowing what went down, you lack the needed context to understand where I'm coming from. To be honest, if I didn't know what I know, I'd have the exact same opinion that you do. I do not fault you for it because nobody is born knowing anything and when this was going on, only people who worked in the industry knew about it. Remember that I worked at Tiger Direct and it was then that I swore off of Intel. Well this is why. I assure you that Jim's video is 100% accurate and what you're about to see was only ever disputed by Intel themselves. Other collaborators like Dell, HP, Compaq and Acer all admitted to what Intel had been up to.

You may not know me well but I think that you've seen enough of my posts to know that I don't make anything up.
 