The Real Nvidia GPU Lineup: GeForce RTX 5060 is Actually a Mediocre 5050

GPUs get shrinkflated because the scalper problem continues to go completely unmitigated.
Apple decided to put an end to scalpers with the release of the iPhone 7 Plus. They didn't want lines. They didn't want campers. They simply made you preorder online and then come pick up your device with your ID. That's it. Suddenly, scalping dropped to a minimum and people weren't walking away disappointed. Nvidia, Micro Center, Best Buy - they all want the optics of high demand.

What they didn't want was the leaks proving the scarcity: 100 guys in line and only six 5090s in stock.

No one is lining up or camping out for anything lower than the 5070 Ti, and they know it. The lower-end cards sit on the shelves despite being far more available.

Full disclaimer: I've been invested in Nvidia since before 2018, so this is excellent for my shares and dividend.

My only regret is that EVGA stopped doing business with them. I loved the EVGA product. I bought a 5090 FE, but I gladly would have bought a Kingpin 5090.
 
I sold the last of my NV shares earlier this year. I have a rule where anytime something doubles in the last 12 months, I sell half. So I could have made a lot more money in NV, but when you have a strategy, you stick to it. Also, I have a few ETFs that hold NV stock, so I'm not exactly worried about it. I've been invested in them since 2011. I've been selling my NV stock and buying Intel. I'm sure many of you are laughing at that, but I have my reasons. TSMC has a monopoly on fabs right now, and Intel started to lose because they got lazy; it wasn't because they aren't capable. With the China/Taiwan tensions, NV and AMD could go to zero at any point, and I honestly believe that Intel will become competitive again with AMD within the next 5 years.
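For what it's worth, that "sell half after a 12-month double" rule is simple enough to sketch in a few lines. This is a hypothetical illustration only; the prices, share count, and function names below are made up and are not the poster's actual process:

```python
# Minimal sketch of the "sell half if it doubled in the last 12 months" rule.
# All numbers and names here are made up for illustration.

def should_trim(price_now: float, price_12m_ago: float) -> bool:
    """True when the position has at least doubled over the trailing 12 months."""
    return price_now >= 2 * price_12m_ago

def shares_to_sell(shares_held: int) -> int:
    """Sell half the position (rounded down); let the rest run."""
    return shares_held // 2

# Hypothetical position: 300 shares that went from $48 to $110 in a year.
if should_trim(price_now=110.0, price_12m_ago=48.0):
    print(f"Trim: sell {shares_to_sell(300)} of 300 shares")  # -> sell 150 of 300 shares
```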
 
TSMC has a monopoly on fabs right now, and Intel started to lose because they got lazy; it wasn't because they aren't capable. With the China/Taiwan tensions, NV and AMD could go to zero at any point, and I honestly believe that Intel will become competitive again with AMD within the next 5 years.
Whilst I completely agree with you on the fab front (even Intel uses TSMC), and I do think Intel will become competitive again there, when it comes to their product lines I can see them making a comeback on CPUs. The thing about the CPU market, though, is that there's competition: x86, ARM, RISC-V, and then various manufacturers between them, all doing different things for different purposes.

With GPUs, though, you don't have that. It's three companies: one of them completely dominates the discrete GPU market, the other (mostly) dominates the high-end APU market, and then there's Intel.

Unless Intel truly pulls something special to really pull ahead of AMD (and with fab improvements hitting physical limits, I doubt they can), I don't actually think Intel will win much back from AMD in the CPU market, unless AMD just sits on its hands and does nothing for years, Intel style.

In the GPU market, though, I do truly believe Intel has a chance to pick up a decent chunk of the market if they can get their own fabs making the chips and price them properly. The GPU market is crying out for an alternative to Nvidia and AMD.
 
I've been selling my NV stock and buying Intel. I'm sure many of you are laughing at that, but I have my reasons.

I'm not laughing. I'm buying Intel too. It's cheap and pays dividends.
 
Unless Intel truly pulls something special to really pull ahead of AMD (and with fab improvements hitting physical limits, I doubt they can), I don't actually think Intel will win much back from AMD in the CPU market, unless AMD just sits on its hands and does nothing for years, Intel style.
As Zen 1 and 2 showed, they don't have to beat them; they just have to be competitive. Intel has a lot of talent, talent it was probably wasting for the benefit of shareholders. With all the bad PR they're getting, I'm sure Intel is pushing funding into their engineering department.
 
I created my own Index: MANGOAT

Microsoft, Apple, Nvidia, Google, oil/gas stocks, Amazon, Tesla.

Add Intel, TSM, AMD, Micron, SMCI, credit card stocks, and bank stocks, and you have my portfolios.

I refuse to give money to Smuckerberg over at Meta, and if Musk pumps Tesla to $500, I'll sell it all just to get out. I don't stick to any "forced sell" rules because I focus on dividend building and dividend reinvestment.

Nvidia blew up in a way I couldn't imagine. I got really lucky there.
 
I focus on two things for long-term investments: growth has to outpace inflation, and it has to pay dividends.
 
This is a fascinating article, I love these! Can you make one with AMD? I'd love to see how that lineup has (or has not) changed over time, and if that sheds light on the interplay between Nvidia and AMD both in terms of tech specs at each tier but also their pricing.
AMD are exactly the same. They are both in lockstep with each other as they are too busy feeding off the server (Epyc) and AAAAAAAAAAIIIIIIIIIIIIIIIIIIIIII market.
 
I'm not making excuses; I'm describing the exact reality: Nvidia decides what the product is. Period.

And I'm saying that as someone who hasn't used any Nvidia product for over a decade.

However, I'm surprised TechSpot posted such a bland article, with little or no technical content from either a financial or a chip-development point of view. Normalize your table with this information; here's what I want to know:
Chip size vs. cost per wafer vs. total development cost vs. dGPU market size to amortize the total amount.
I love it when readers complain that an article didn't delve into their own obsessive informational "needs" - which are completely irrelevant and uninteresting to the other 99% of us.
 
We must really love gaming on PCs and building our own machines because, if we were rational, we would have given up this hobby years ago. Yet here I am, trying to buy a GPU worth 600–700 dollars that can barely run games at native 4K. We can complain as much as we like, but we brought this nonsense upon ourselves.
 
I love it when readers complain that an article didn't delve into their own obsessive informational "needs" - which are completely irrelevant and uninteresting to the other 99% of us.
Are we in the Matrix, and you speak for all connected minds? No? Then you speak only for yourself. Don't claim to know what you don't.

Yes, I'd hope a site called TechSpot offers content that goes beyond the superficial takes fanatics point to on social networks.

TechSpot's comments/forums need a Reddit type format where you can downvote crap comments and upvote the good stuff. It's so annoying when the first comment is nonsense and yet it's always the first thing everyone sees.

If it had, the article would have already been downvoted to death.
 
TechSpot's comments/forums need a Reddit type format where you can downvote crap comments and upvote the good stuff. It's so annoying when the first comment is nonsense and yet it's always the first thing everyone sees.
We can like the good stuff and report the egregious stuff. I imagine egregious comments could be hidden or removed.

There are pros and cons to having a downvote button, one of which is that downvotes can be spammed. If TechSpot did add that functionality, it would probably make sense to keep the downvote counts hidden, at least at first, so that the staff could get a feel for when to respond and such.
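For what it's worth, the "record downvotes but hide the count" idea is easy to picture. Here's a minimal, purely hypothetical sketch (it has nothing to do with how TechSpot's forum software actually works): downvotes still feed ranking and give staff a signal, but readers only ever see upvotes.

```python
# Hypothetical sketch: downvotes are stored and used for ranking/moderation,
# but only the upvote count is ever shown to readers.
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0  # stored, never displayed publicly

    @property
    def public_score(self) -> int:
        return self.upvotes                    # what readers see

    @property
    def internal_score(self) -> int:
        return self.upvotes - self.downvotes   # what staff and the sort order use

comments = [Comment("useful analysis", upvotes=40, downvotes=2),
            Comment("first!", upvotes=15, downvotes=60)]
comments.sort(key=lambda c: c.internal_score, reverse=True)  # spammy comment sinks
for c in comments:
    print(f"{c.text}: +{c.public_score}")
```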
 
I've been seeing this since last gen: smaller returns, hunger meets desire, so now we have products pushed well beyond their expected place just to pad the profits of the already filthy-rich Jen-Hsun. The worst part is that people seem dormant; they enjoy being robbed while over 90% of GPU market share belongs to Nvidia. I understand the tech part, the higher BoM from expensive, complex chips, but pushing the chips to the max so they can be sold as higher-tier stuff is absurd. Sub-200 mm² GPUs should sit closer to 50-100 W of power usage, some available without external power (up to 75 W can be delivered from the PEG slot), with prices to match. We have to vote with our wallets, fellas...
 
TechSpot's comments/forums need a Reddit type format where you can downvote crap comments and upvote the good stuff. It's so annoying when the first comment is nonsense and yet it's always the first thing everyone sees.


I agree. We should be able to downvote comments out of existence.
 
Plenty of this hinges on the idea that everything must have the same performance relative to a 90 class card, but the 90 class cards have always been outliers. Redo things to normalize to the 80 class cards and you are likely not going to see many charts that support this argument. The main thing I see here is that the 30 series was an amazing deal compared to other generations.

There might be an argument to be made for memory, but that is not under Nvidia’s control. Memory prices have not declined over the past decade like they had in previous decades. Nvidia also tends to use the latest GDDR version, which carries a premium. This is why the memory bandwidth figures have been rising so much over the last several generations. They had used GDDR6X and now are on GDDR7. AMD on the other hand just launched the Radeon RX 9070 XT with GDDR6. Intel’s B580 also uses GDDR6.

That said, I realize this will be a very unpopular opinion, but the days of Moore’s Law giving us big jumps in capabilities from generation to generation without corresponding price increases are gone. Additionally, SRAM scaling is also dead, and the main way to improve performance in these chips is to add SRAM, which is increasingly expensive to do the more the process nodes shrink, since an ever larger amount of die area is needed for SRAM.
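For anyone who wants to eyeball that, here's a rough sketch of the re-normalization suggested above, using approximate CUDA-core counts from public spec listings; treat the exact figures as illustrative rather than authoritative:

```python
# Compare each generation's 60-class card against both the flagship and the 80-class
# part. Core counts are approximate public spec-sheet figures, quoted from memory.
cores = {
    # generation: (xx60, xx80, flagship)
    "RTX 30": (3584, 8704, 10496),   # 3060 / 3080 / 3090
    "RTX 40": (3072, 9728, 16384),   # 4060 / 4080 / 4090
    "RTX 50": (3840, 10752, 21760),  # 5060 / 5080 / 5090
}

for gen, (x60, x80, flagship) in cores.items():
    print(f"{gen}: 60-class is {x60 / flagship:.0%} of the flagship "
          f"but {x60 / x80:.0%} of the 80-class card")
# Normalized to the 80-class card the trend is much flatter (~41%, ~32%, ~36%),
# which is the point being made here.
```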
 
I'm pretty convinced that the 40 series and the 50 series are the same chips. You can say different this or that, but really.

There were no yields between 16K and 9,700 cores on the 40 series, and the same thing happened, on the same node, with the 50 series: 21K down to 10,700? They didn't have any 12Ks or 14Ks, or even 18Ks?

AMD is on the same process node from the same company and they got good yields, where Nvidia didn't? 🤷‍♂️

Their focus is on AI, and they've said as much. Just my opinion, I could be completely wrong here. I think they had great yields on the Lovelace chips, and they are using the bottom-tier yields for each class of chip they produced, slowly working back up to the good yield numbers. I think the 20 series was a test of this: 4,300 cores down to just under 3K, while the flagship had 4,600. The performance increase was meh, and people complained. The 30 series had everyone happy again.

There has been a supply issue with the 50 series since launch; it could just be a low number of yields in those ranges. Look at the ROPs issue.

I think the new 6000 series flagship will have a meh performance increase, while the 80 class and below see huge jumps.

Just my opinion.
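For context, here's a quick sketch of the gap being described: the step down in enabled CUDA cores from each generation's top consumer card to the next SKU. The counts are approximate public spec figures, listed only to illustrate the argument, not to prove it:

```python
# Step between the top consumer card and the next SKU down, per generation.
# Core counts are approximate public figures; treat them as illustrative.
top_two = {
    "RTX 20": ("2080 Ti", 4352, "2080", 2944),
    "RTX 30": ("3090", 10496, "3080", 8704),
    "RTX 40": ("4090", 16384, "4080", 9728),
    "RTX 50": ("5090", 21760, "5080", 10752),
}

for gen, (top, top_cores, nxt, nxt_cores) in top_two.items():
    drop = 1 - nxt_cores / top_cores
    print(f"{gen}: {top} -> {nxt} loses {drop:.0%} of the cores in a single step")
# -> roughly 32%, 17%, 41%, and 51%
```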
 
I agree. We should be able to downvote comments out of existence.

That just creates an echo chamber. I agree the article is fundamentally flawed because it's based on "relative configuration vs. flagship". The configuration and economics of flagship chips vary wildly over the years, so asserting that the x60 should always be some fixed percentage of the top chip is a completely made-up expectation. Without deeper analysis of relative costs and performance, it's misleading at best.
 
This dirtball we live on has finite resources. Stuff doesn't just magically replenish itself overnight after we've dug it out of the ground (or wherever it originates).

None of the resources involved in making GPU chips (or any other kind of chips) are scarce, at least not in the 'OMG we're gonna run out' sense. The only material that might be significantly characterized as 'scarce' is the ultra-pure quartz found in the Spruce Pine mining district in North Carolina. The scarcity is that it's virtually the only source on earth for the quartz used in making the typically one-time-use crucibles for silicon ingots. Estimates are around 100 years of reserves at current consumption rates. There is also a continuous use method that obviously uses less of that quartz, and which industry will likely move to in time.

Many resources are finite, of course. Some are geopolitically limited in availability. None involved in making CPUs and GPUs (and ugh, all the AI-specific NPUs and TPUs and DPUs and other pu's) are scarce to the degree that we need to be particularly concerned about them. Cautious, obviously. None are a factor in chip prices in the here and now or the foreseeable future.
 
I bought my EVGA FTW 3080 (plus plus extra super whatever they appended) at the peak of the COVID shortages, after a year on a waiting list with EVGA. When the 4080 series came out, it seemed too expensive for the relative uplift to a new generation, so I passed, biding my time until the 5080 generation came around.

You can pretty much take that last sentence, swap in "5080" for 4080, and "6080" or whatever for the 5080.

I can comfortably play most of the games I frequently play at 4K at 60 fps or better, and more typically at 144 fps. The card is overclocked a bit, but nothing outrageous (though it gets a bit loud in game). The graphics are stunning, splendid, fantastic, etc.

Nvidia hasn't given me a good reason to upgrade yet. Nvidia also could not care less about that. My fear is they just up and abandon graphics cards entirely for the AI segment. Then again...they sort of have already.
 
There are lots of arguments to be made about inflation, looking at the wrong metrics, etc.

But IMO it has been very clear that as soon as GTX turned into RTX, value took a massive nosedive. The first generation of it (RTX 2xxx) was overpriced; 3xxx reined this in somewhat, and if it had actually sold at MSRP initially and gradually dropped in price over time it would have been alright (neither really happened), not amazing but alright. The pandemic combined with the mining bubble led to crazy prices.
4xxx was lazy, very much more of the same to keep prices high: the same performance for the same price (that's not how a new generation is supposed to go).
5xxx is very much the same again, but now they want even more money because it can do even more fake frames.

Who's to blame? I'd say 20 percent TSMC and 80 percent NVIDIA.
TSMC has been raising prices a lot with the last few shrinks. However, the die is just one part of the card; it's the most expensive part, but not as expensive as NVIDIA's prices would have you believe.
And the dies for the low-end cards have been getting really small whilst the prices have been going up.
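As a back-of-the-envelope check on that claim, the standard dies-per-wafer approximation gives a feel for the raw silicon cost. The wafer price and die areas below are rough guesses, not known figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Classic approximation: whole dies that fit on a round wafer, ignoring defects."""
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST = 18_000  # guessed price of a leading-edge 300 mm wafer, USD

for label, area in [("small ~180 mm^2 die (60-class sized)", 180),
                    ("large ~750 mm^2 die (flagship sized)", 750)]:
    n = dies_per_wafer(area)
    print(f"{label}: ~{n:.0f} dies/wafer, ~${WAFER_COST / n:.0f} of raw silicon each")
# Under these assumptions, the small die works out to a few tens of dollars of silicon,
# before yields, packaging, memory, VRM, cooler, and margins are added on top.
```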

Oh well, ever since AMD's first APU I've thought they would replace entry-level graphics cards. I was expecting AMD to release APUs at the PS4's level two years after the PS4 launched, but it never happened. It's only just now that AMD is starting to put in some serious effort with handhelds and laptops. We need a beefier desktop version of that: bring the price of an entire system down to about 600 bucks for high-refresh 1080p and ~90 fps 1440p gaming.
Something to make PC gaming affordable again. Something to grow the market.
AMD is taking so long at this point that it will probably end up being NVIDIA with ARM instead, at a 900-buck price tag.
 
So a 5080 is basically an xx60-class card. And yet an xx60 GPU is faster than AMD's $700 9070 XT. What the hell is AMD thinking, selling an xx50 GPU for $700?
 
I think Tim's argument doesn't hold up as well for US readers, because consumers will just see the 5060 as a 5060.

But in Tim's home country (Australia), which has suffered a dip in exchange rates, in many cases you quite literally have to pay double in local currency in 2025 to buy the same performance that was available in 2023-24.

Someone who purchased a 4080 in 2022 would have to pay double or more in 2025 to upgrade to a 5080, which makes no sense for locals.
 
Honestly, I'm with @Scorpus on this one: the "60-class" badge doesn't mean what it used to.


Back in the Pascal/Turing days a 60-class card gave you roughly a third of the flagship's muscle; now the RTX 5060 offers barely 18% of the 5090's CUDA-core count and ships with just 8 GB of GDDR7 VRAM. That's not just shrinkflation, it's a full-on tier shift.
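Those percentages roughly check out against public CUDA-core counts (approximate figures, quoted from memory, so treat this as illustrative):

```python
# Rough check of the 60-class share of the flagship's CUDA cores across generations.
pairs = {
    "GTX 1060 vs GTX 1080 Ti": (1280, 3584),
    "RTX 2060 vs RTX 2080 Ti": (1920, 4352),
    "RTX 5060 vs RTX 5090":    (3840, 21760),
}
for name, (x60, flagship) in pairs.items():
    print(f"{name}: {x60 / flagship:.0%} of the flagship's cores")
# -> about 36%, 44%, and 18%
```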


Price-wise, $299 feels okay on paper, but when you realise you're effectively getting what would've been a "50-class" part a few years ago, the value proposition gets shaky fast. For anyone shopping around, I'd rather grab a used or discounted RTX 2070: it still handles 1440p in plenty of titles, has 8 GB of VRAM that actually gets used, and costs about the same these days.


Until Nvidia puts more silicon (and VRAM) back into the mid-range, the “new” 5060 looks like a side-grade at best.
 