Nvidia reportedly spent $10 billion for a chunk of TSMC's 5nm manufacturing capacity

I know a lot of people are hoping that this next gen will bring prices down (the way they hoped Ampere would after Turing), but everything points in the opposite direction.

Manufacturing costs have skyrocketed, shipping costs have increased severalfold (perhaps for the long term), raw material prices have gone up, labor costs have gone up, and other inflationary pressures are pushing up the price of everything.

It seems like people are willing to pay about $600 for a midrange x60 card, so I wouldn't be surprised if they charged that for the 4060. $999 for the 4070, $1300 for the 4080, $2500 for a 4090.
Think this sounds insane? It's the pricing that retailers are already asking right now. Remember when we expected prices to come down with each generation? That's over. People accepted the Turing price increase, and there's no arguing Nvidia back down.
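For scale, here's a quick back-of-the-envelope script comparing those guesses against Ampere's launch MSRPs (the Ampere figures are Nvidia's announced US MSRPs; the next-gen figures are just the speculation above, not leaks):

```python
# Compare speculated next-gen prices against Ampere launch MSRPs.
# Ampere numbers are Nvidia's announced US MSRPs; the "speculated"
# numbers are just the guesses from this post.
ampere_msrp = {"x60": 329, "x70": 499, "x80": 699, "x90": 1499}
speculated  = {"x60": 600, "x70": 999, "x80": 1300, "x90": 2500}

for tier in ampere_msrp:
    jump = (speculated[tier] / ampere_msrp[tier] - 1) * 100
    print(f"{tier}: ${ampere_msrp[tier]} -> ${speculated[tier]} (+{jump:.0f}%)")
# x60: +82%, x70: +100%, x80: +86%, x90: +67%
```

Even the "mild" tier in that guess is a two-thirds jump over the equivalent Ampere MSRP.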
The only hope is that, as coins transition from mining (proof of work) to proof of stake, demand for GPUs returns to more normal levels.

For gaming and CAD, you really only need one GPU per person. Maybe 2-3 if you get into some really specialized use cases. But for mining? The more GPUs per person, "the better". If BTC follows ETH into the proof-of-stake model, we should see GPU demand plummet. Then a bunch of scalpers will be left holding the bag, and retailers and OEMs/ODMs will need to completely re-evaluate their financial plans.
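Here's a toy model of the difference: a gamer's benefit saturates at one card, while a miner's profit scales roughly linearly with card count, so miners keep buying until cards or power run out. Every number below is a made-up placeholder, not a real mining figure:

```python
# Toy model: why mining demand scales per GPU while gaming demand doesn't.
# All figures are illustrative placeholders, not real mining numbers.
def mining_profit_per_day(gpus: int,
                          revenue_per_gpu: float = 4.00,   # $/day, assumed
                          watts_per_gpu: float = 220.0,    # board power, assumed
                          power_cost_kwh: float = 0.10) -> float:
    """Daily farm profit: each additional card adds roughly the same margin."""
    power_cost = gpus * watts_per_gpu / 1000 * 24 * power_cost_kwh
    return gpus * revenue_per_gpu - power_cost

for n in (1, 10, 100):
    print(f"{n:>3} GPUs -> ${mining_profit_per_day(n):.2f}/day")
```

As long as the per-card margin stays positive, a miner's rational move is always "buy another card", which is exactly what a gamer never does.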
 
The worst problem with BTC is that the maxis are so heavily invested in the status quo. They think the corruption of the network by ASICs is somehow intended, and they won't change it beyond minimal upgrades out of some misplaced reverence. It's code. Code needs updating. I see next to zero chance they'd go PoS, because they'd rather cling to their ASICs as nations outlaw PoW mining.
They don't realize how tremendous the risk of ASIC-based mining is, especially with countries cracking down on the industry.
 
This may seem hyperbolic, but I just can't see PC gaming as we know it today surviving the next few years. GPU prices are going to keep skyrocketing for various reasons, including the increasing complexity of these devices and, of course, scalpers.

Yeah, we're seeing the end of PC gaming as we know it. At this point, if and when my desktop dies, I'll be buying a Mac. Heck, I don't even game these days, because I just don't have the time between work and the various other things that adults with responsibilities need to deal with.
 
Oh, absolutely. I've been saying PC gaming is on the verge of dying out, at least when it comes to AAA gaming.

In fact, I would argue this is already happening: single-player experiences in AAA games are very few, just derivative open-world sandbox games from Ubi that keep making somewhat decent numbers, but that's about it, really. Everybody else has moved, is moving, or wants to move to live services, possibly even free-to-play live services, and the current king of those, Fortnite, will run on basically any CPU from the last 5 years or so even without a discrete GPU: a Kaby Lake chip with just integrated graphics can run it.

Not that I think this is necessarily a bad thing: it could give indie games a chance to grab that spotlight again.
 
Well, we know it's not going to be Apple that gets squeezed out of capacity, simply because Apple has a sweetheart deal as the biggest TSMC customer of them all.

Which means that, naturally, at least one of them will be AMD. And from their perspective, AMD knows this is coming: we have seen them voluntarily leave market segments, like the lower-end chips with no offering below the 5600G, and no desktop Zen 3+ announced beyond a single chip, the 5800X3D. That suggests they want as much allocation as they can get for Zen 4 chips, for both the enterprise and consumer segments.

What I think has a good chance of happening is that AMD's eventual RDNA launch of a 7000 series will redefine what a paper launch is, maybe even releasing only the top SKU and its derivatives (think 7800 XT, 7800, and 7700 XT all the same chip, with cut-down, partially defective versions for the lower tiers) and pretty much nothing else.
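To illustrate the "all the same chip" idea, here's a hypothetical binning sketch, where partially defective dies are sold as the cut-down lower-tier SKUs. The SKU names, unit counts, and cutoffs are invented for the example, not actual AMD specs:

```python
# Hypothetical die-harvesting sketch: one physical chip, three SKUs.
# Unit counts and cutoffs are invented for illustration only.
def bin_die(working_units: int, total_units: int = 96) -> str:
    if working_units == total_units:
        return "7800 XT"        # fully enabled die
    if working_units >= 84:
        return "7800"           # a few defective units fused off
    if working_units >= 72:
        return "7700 XT"        # more heavily cut down
    return "scrap / salvage"    # too defective to ship in this stack

print([bin_die(u) for u in (96, 88, 75, 60)])
```

This is why a one-chip launch can still cover several price tiers: the lower SKUs come free out of the defect distribution instead of needing their own wafer allocation.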

I think we'll run a real risk of the GPU race turning into just Nvidia vs. Intel for a good chunk of the next generation. And if Nvidia was already at the very least equal to AMD in tech and ahead in software support and features, then it's going to be a bloodbath against Intel, which will probably not be able to match even AMD in software support and features, let alone stay competitive with Nvidia.

This is...Overall terrible news for AAA pc gaming.

I agree with most of this except your expectation that Intel's software support will be worse than AMD's (although, tbf, I've only ever had AMD GPUs and have never had a problem). Intel has a SIGNIFICANTLY larger pool of software engineers to pull from than AMD: most of AMD's engineering team consists of hardware engineers, while the bulk of Intel's are software engineers, and the company in general is MUCH larger. I don't think they'll have any trouble keeping up with Nvidia on the software front.
 
The NVIDIA tax is going up. If you thought Ampere cards were expensive, well, you ain't seen anything yet.
 
Well, I can see prices going up everywhere in the near future thanks to the new "cold war" happening right now. I'll say 500 euros/USD for the 4050, 700 for the 4060, 900 for the 4070, and so on. So, guys, hold on dearly and tightly to your current PC.
 
And that's why I basically think that PC gaming as we know it today is dead. I can't see how your average person on the street will be able to afford a GPU from this point on.
 
Well, since I think AMD is going to follow suit and raise prices, I am hoping that Intel might help keep GPU prices at a reasonable level. But again, it's Intel, and they need to feed those greedy stockholders.
 
Uh huh. Yeppers. If people think that Intel is coming to save the day, they're going to be sadly mistaken. Intel simply wants more money, like any big corporation. They just want a piece of the expensive GPU pie.
 
Graphics drivers take many iterations to mature into well-optimized software, and Intel has shown that time and time again: even with all their money, the driver support for the graphics on their iGPUs is beyond terrible.

Sorry, but until they actually release a GPU product to the public and the DAY 0 DRIVERS (because I don't want to hear about later improvements and optimizations; that's the same BS AMD always pulls, and it contributes to their terrible product launches) are actually on par with Nvidia's, or at least somewhat close, I have to go by their previous graphics driver efforts: not only unoptimized but EXTREMELY buggy.
 