GPU Mining Is Dead, Where Are My Cheap GPUs?

Prices have been expected to fall for years now, and they have to some degree, but it seems they won't fall as fast as people thought, or far enough to justify the wait.
If prices aren't dropping, there are only two possible reasons: either supply is lower than expected or demand is higher than expected. We know supply is going to increase vastly over the next few weeks and months as used mining GPUs hit the market, so if prices only drift down slowly, demand must be higher than expected.

Keep your pants on, you greedy fools, and don't buy a GPU until prices get lower, much lower.
 
I would imagine prices on these cards will start to fall, probably dramatically, but that will be governed by the market. Used cards should start to show up on eBay soon.
It's also governed by profit and loss. If it's true that Nvidia GPUs have less profit margin built in, then it's unlikely prices will fall significantly until companies start to feel the pinch from lower sales. At some point they may have to dump the 30 series if the new AMD cards show better price/performance. For example, if I can get 3080 Ti performance at $700 instead of $800, you may see some adjustments. On the other hand, if AMD is at parity in price and performance, then prices will remain high until companies need to reduce inventory.

Black Friday sales should be interesting this year.
 
Let's look on the bright side. Prices are steadily getting lower and we are no longer all lamenting the insanity of the crypto boom. I, for one, am happy. My target price range is around $500. I paid around $400 for a 1070 Ti in 2017, so for me the equivalent today is a GPU just shy of the $500 mark. I am really hoping the new AMD cards can make this a real option for me.

Am I happy with the new Nvidia pricing? Not at all. And it's more than just the price; it's the size and power requirements of the card. The joke now seems to be that you plug your motherboard/CPU into your GPU rather than the other way around.

For some context:
1) 1080 Ti was released in 2017 at $699. Inflation adjusted for today it would cost $850.
2) The Titan Xp (maybe this is a more realistic comparison to the 4090?) was released in 2017 at $1,200 and is inflation adjusted to $1,500. The MSRP for the 4090 will be $1,600, which is actually pretty close (rough math sketched below).
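As a rough sanity check, here is a minimal sketch of that inflation adjustment in Python. The ~1.22 multiplier is back-solved from the post's own $699-to-$850 figure rather than taken from official CPI data, so treat it purely as an illustration; it lands near the quoted $850 and a little under the rounded $1,500.

```python
# Rough 2017 -> present inflation adjustment: price_then * multiplier.
# The multiplier is back-solved from the figures above (850 / 699 ~= 1.22),
# not from official CPI tables - an assumption for illustration only.

def adjust_for_inflation(price_2017_usd: float, multiplier: float = 1.22) -> float:
    """Scale a 2017 USD price by a cumulative inflation multiplier."""
    return price_2017_usd * multiplier

print(f"1080 Ti:  $699 in 2017 -> ~${adjust_for_inflation(699):.0f} today")
print(f"Titan Xp: $1200 in 2017 -> ~${adjust_for_inflation(1200):.0f} today")
```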

I guess it is all about perspective.
I would argue that after the 40 series announcement they are not getting steadily lower. In fact, I've seen some price increases in the last week. The two GPUs I'm tracking haven't moved a penny in the past two weeks. Mobos have dropped a tiny bit, maybe $10-20. Otherwise, things are either out of stock or at the same price.
 
A lot of former GPU miners are probably holding out hope that another POW will moon, but I think the landscape is too fragmented; Ravencoin, Ergo, Eth Classic (and technically, ETHW) are all jockeying for it and all of them seem to have pretty entrenched communities. At some point they'll have to liquidate something to keep running.
They're all sh1tcoins. People always make the same mistake: they expect past performance to be indicative of future results. ETH's success story may never be replicated, and there's a big chance ETH's merge throws all of POW under the bus. But try telling that to a guy who spent a ton of money on a mining rig.
 
I don't know how much capacity AMD has available, or how quickly, but Nvidia seems to be leaving them a very nice window to introduce, say, a 7800, 7770, and 7600 at prices well below the new Nvidia cards, with performance that would still be exciting to many consumers, and maybe make themselves a nice jump in market share with it.
 
My guess is that both Nvidia and AMD will create a false shortage of the new products until they can sell off their remaining inventory of the previous gen. By that time, the gullible will be bursting to give them their cash for the new (very expensive) stuff, that gives them 10 more FPS in their games.
 
As long as there is a gap in pricing, like the price model Nvidia showed off with the 4000 series on top and the 3000 series on the bottom, Nvidia wants them to coexist: 1) probably because Ampere yields are so good that they are still making a profit and have room to move; 2) they want to sell as many 4000 series cards as possible, especially before AMD can launch competitive products.
I have heard they will likely have more 4090s at launch than 4080s (source: Moore's Law Is Dead), which is what they did with the 3090 vs the 3080 at launch.
Once AMD launches the 7000 series, Nvidia can take the worse AD102 dies and will likely make a 4080 Ti, plus a fuller-fat 4090 Ti, to fill in the gaps and adjust the price stack accordingly. Nvidia is selling the AD104 chip with a 192-bit memory bus for $900; they want as many suckers as possible to buy it.
I am not sure Nvidia can afford to artificially hold back 4000 series supply to create a panic response and drive up demand.
What I do see happening from both camps is that they will not release new mid- to low-end products until previous-gen stock dries up. That means the previous generation is used as the new mid to low end, and the new generation only launches as high-end/enthusiast, as we are seeing.

Update: the 4000 series also has a hidden cost of a new PSU, given the current news of adapter cables burning out and wearing down.
 
People, wake up!! The reason you don't need this **** isn't only the prices; it's the actual gain gaming-wise. Really: I have an RTX 2080 with a 1440p 240 Hz monitor and can get 240 fps, so I was thinking of buying a new card just to get there. OK, a good reason, but let's see what it would bring to my visual experience: NOTHING! So what's the point of paying 1,600 euros for a 4090 that will give me 30, 40, or 60 fps more, when the price doesn't even include the rest of the hardware needed to use it (PSU, CPU, motherboard)? Who cares? Unless prices come down to a human level rather than an alien level, nothing is happening for me. I would like an answer to this question: what is the cost of production for one 4090?
 
If they aren't bottoming out by Cyber Monday, they'll probably keep gouging right through the holidays, and that is exactly what I expect. January 1st will probably be the magic time.


Yeah man, Cyber Monday was when the best clearance deals on Pascal cards showed up: they offered Zotac OEM 1070 cards for $175 (better than half off), and their two 1060 cards for half off!

I missed the 1070, but the 1060 AMP! Edition for $125 was incredibly good value, and I hope to see a 3060 AMP to upgrade my HTPC at the same price as that 1070 (once miners get freaked out and start liquidating stock).

Or, worst case, I compromise and grab a 3050 for $125 (I'm currently running a 960 in the HTPC, so 50% higher performance at 4K, plus DLSS, over my 1060 6GB would be a nice upgrade).
 
We all need to stop calling the 4080 12GB a 4080 and just call it the 4070, for two reasons: firstly, that's what it is; secondly, it will annoy Nvidia, which can only be a good thing.
 
Still on an FHD 60 Hz monitor. Just swapped my GTX 1060 (sold for 160 USD) for an RTX 2070 Super (bought for 250 USD). Pretty happy with the upgrade, and the RTX might be just enough to move to a QHD monitor with at least a 75 Hz refresh rate.
 
Looking at the DX12 3DMark benchmark, a 76% speed increase would be welcome. Games that already run well should perform up to 100% better going from a 30xx Ti to a 40xx Ti. AMD gives you great GPUs without RTX, using their own technology; Nvidia's RTX is still fairly new and it takes time to get it into games, and when it can be implemented in older games as well as new ones, that will be fantastic.
The prices are a big joke when blamed on COVID-19. Stop bugging us with these prices and give us real prices back, not 100,000 NOK (roughly 10,000 USD) for an old GPU. Oil prices and the oil index should not dictate how much Europeans end up paying once taxes and VAT (MVA) are added. We just want normal sales.
 
"Is this a deliberate strategy from Nvidia to limit losses from the high volume of RTX 30 series inventory left to be sold?"

Yes. They said exactly that on their end-of-quarter investor call. Nobody should be surprised: they stated they had adjusted 30 series pricing in anticipation of the 40 series launch, for a coexistence through the new year.
 
Denialism about price corrections is common, especially after a period of shortages. Then overhead begins its inevitable squeeze and prices go down. Panic ensues, and prices decline more aggressively.

Nvidia management screwed up by overpaying for manufacturing capacity, couldn't renegotiate its contracts, and is now trying to pass those costs on to customers while slow-walking Wall Street with these gimmicks, delaying in case some surprise bails them out. GPU miners screwed up by investing in a bubble; they are hoping the market bottoms and another coin bails them out.

No one wants to book a loss, so be patient.
 
Can anyone give me an answer? How much does it cost to manufacture a 4090 GPU (end product)?
Nobody can give you an actual answer, because such information is proprietary and closely guarded. One can make some educated guesses, though. The AD102 has a 608 square millimetre die and is fabricated on TSMC's N4 process - this is based on their N5 node, but is a little cheaper to manufacture. A single 300 mm wafer on the N4 process can produce approximately 90 dies in total, although the percentage that will be usable, let alone 4090-usable, is unknown.

But let's say 70% of them are good enough to be used in the relevant models and out of those, say 40% are for the 4090 (the remainder will get used for the professional line of Lovelace-powered cards, and the new 4080s). A single wafer can cost up to $15k to manufacture, and given that N4 is relatively new, there's not going to be any huge cost savings yet. But let's take a round number of $10k - that means each AD102 die costs around $110 to make. However, since 25 dies from each wafer end up in a 4090, Nvidia will charge a higher price for them to AIB vendors, simply because the supply of them is less than the more damaged ones that get used for the 4080s.
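For what it's worth, here is the same back-of-envelope die-cost math as a small Python sketch. Every number in it is one of the assumptions from the paragraphs above (90 dies per wafer, a round $10k wafer cost, 70% usable, 40% of usable dies binned as 4090) - none are official figures, so the output is illustrative only.

```python
# Back-of-envelope AD102 die cost, using the assumed figures from the post above.
# None of these are official numbers - they are illustrative guesses only.

wafer_cost_usd   = 10_000   # assumed round number ($15k was quoted as the upper bound)
dies_per_wafer   = 90       # approximate gross AD102 candidates per 300 mm N4 wafer
usable_fraction  = 0.70     # assumed share of dies good enough for any Lovelace SKU
rtx4090_fraction = 0.40     # assumed share of usable dies binned for the 4090

usable_dies  = dies_per_wafer * usable_fraction          # ~63 per wafer
rtx4090_dies = usable_dies * rtx4090_fraction            # ~25 per wafer

cost_per_gross_die  = wafer_cost_usd / dies_per_wafer    # ~$111 (the ~$110 above)
cost_per_usable_die = wafer_cost_usd / usable_dies       # ~$159 if scrap is written off

print(f"4090-grade dies per wafer: {rtx4090_dies:.0f}")
print(f"Cost per gross die:        ${cost_per_gross_die:,.0f}")
print(f"Cost per usable die:       ${cost_per_usable_die:,.0f}")
```

Note that spreading the wafer cost only over the usable dies, rather than all 90, is one reason the real per-die cost to Nvidia is likely somewhat higher than the simple ~$110 figure.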

The dies need to be packaged, and it's anyone's guess how much that costs in terms of materials and manufacturing; then one needs to add in the mainboard, all of the components on it, 24GB of 21Gbps GDDR6X, and the cooling system being used. Once again, the price of that RAM isn't made public, but with all electronic components still in relatively short supply and priced higher than normal, none of it will be cheap.

Nvidia stores and distributes its packaged GPUs from a center in Hong Kong, as the manufacturers of every other part are based in China or Taiwan, so those costs need to be included as well. All told, one could be looking at as much as $400 to $500 to manufacture and distribute a 'standard' 4090, perhaps considerably more.

Nvidia charges a small fortune for that model for three reasons: (1) because they can; (2) because it's an LVS (low volume seller); (3) market positioning.
 
Jensen can eat my *** with a spoon. YOU serve gamers. Gamers will not bend to your will. You are going to learn that real fast Jensen. The age of affordable GPUs IS returning. Mining is over. Get bent.
 
So, in short and understated, that's over 100% profit on each card, regardless of the model! Well, KMA Nvidia!! And resellers are looking at 50% profit - KMA again!!
 
Thanks for the effort on the cost breakdown!
 