Nvidia's Jensen Huang once again claims Moore's Law is dead

Imagine the scene at Nvidia HQ at the meeting to set the prices of the 40xx cards...

Jensen: 'Guys, I've come up with a genius solution to beating the scalpers: let's just set the MSRP at scalper prices!'

Rest of the room: 'Jensen, you're so smart, we love you'
I'd rather give my money to nVidia than to scalpers. The former actually provides something of value; scalpers just exploit the market and add nothing. But if nVidia sets the MSRP low and scalpers take all the supply, consumers will complain that nVidia isn't doing enough to stop them. Consumers will ***** regardless.
 
And the 4080 12 GB, which isn't and won't sell well at all.
I doubt it. I think all the 4000 series cards will sell very well.
If the 4080 12 GB doesn't sell as fast as they want, they'll just drop $100 off it and it will be gone. Most people aren't aware of things like AD104 and AD103. The fact that the 4080 12 GB is practically a 4070 is something 99% of buyers will never discover.
Well, we'll see very soon.
 
Frankly, I don't see the cost of silicon being the biggest driving factor in the increased cost of these cards. They are using absolutely RIDICULOUS coolers; I wouldn't be surprised if the cost on nVidia's side for the 4080 cooler is well over $100 a unit. The other part of this is that since they are setting absolutely crazy power limits, they need to build out the power-delivery backbone on the boards to provide proper, clean power to these chips. nVidia's solution to performance is to just pump more power into the chip; there is very little innovation on their part.

nVidia is catering to the deep learning crowd and putting all this AI stuff on their chips, taking up valuable silicon. These don't seem like consumer cards; they seem like enterprise cards, with consumers as an afterthought. Then they find software solutions to use up the extra silicon they wasted on the chip. Going on power limits alone, these cards look more like something that belongs in a server rack, not a computer case. And I would cite the MASSIVE coolers and video card support beams as evidence of that.

And now, if we want to use these cards, not only do we have to go out and buy a new power supply, we have to buy a VERY EXPENSIVE power supply, something in the 1000 W+ range. I'd say if you want a 4080 you'd likely need something in the 1200 W+ range. That is getting very close to what you find in server racks.
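For what it's worth, here's the kind of back-of-the-envelope math that gets you to those numbers. This is just a sketch in Python with my own assumed figures (board power, transient spike factor, CPU draw), not anything nVidia publishes as a sizing formula:

```python
# Rough PSU sizing sketch. Every number here is my own assumption,
# not an official spec or nVidia recommendation.
GPU_BOARD_POWER_W = 320      # assumed 4080-class sustained board power
GPU_TRANSIENT_FACTOR = 2.0   # assumed worst-case millisecond spike vs. sustained draw
CPU_POWER_W = 250            # assumed high-end CPU under load
REST_OF_SYSTEM_W = 100       # fans, drives, RAM, motherboard (assumed)
HEADROOM = 1.1               # extra margin so the PSU isn't running flat out

peak_draw_w = GPU_BOARD_POWER_W * GPU_TRANSIENT_FACTOR + CPU_POWER_W + REST_OF_SYSTEM_W
suggested_psu_w = peak_draw_w * HEADROOM
print(f"Estimated peak draw: {peak_draw_w:.0f} W, suggested PSU: {suggested_psu_w:.0f} W")
```

Under those assumptions you land right around 1000-1100 W, which is exactly the kind of server-class power supply I'm complaining about.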

It just seems like laziness, like nVidia doesn't want to develop cards for consumers. They want to keep development costs low and pass the secondary costs of using their cards onto the consumer (i.e. coolers, power supplies, video card support beams, etc.).

Like, what is actually going on with this crap? It was crazy watching the power consumption numbers of things like the GTX 280 and Vega 64, but these cards are DOUBLE those. These numbers are DOUBLE what people already thought was absurd, and now they're just saying, "shut up, buy it, Moore's Law is dead."
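To put rough numbers on that "double" (these are the reference board power figures as I remember them, so take them as ballpark only):

```python
# Approximate reference board power limits, from memory -- ballpark only.
tdp_w = {
    "GTX 280": 236,
    "Vega 64": 295,
    "RTX 4090": 450,   # stock power limit; some partner cards allow up to ~600 W
}

baseline = "GTX 280"
for card, watts in tdp_w.items():
    print(f"{card}: {watts} W ({watts / tdp_w[baseline]:.1f}x the {baseline})")
```

So roughly double the GTX 280 at stock, and past double if you count the 600 W limits some partner cards expose.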

We all complained about the 280 and Vega 64 cards, but the industry accepted it. In my 25 years of following the tech industry I have never seen anything this absurd. This is craziness, and I don't think the market will accept it like it did in the past. These cards are too expensive and there are too many downsides to owning them. They want me to pay $1200 for a card that's going to make it uncomfortable to sit in my room? I mean, I guess I could run some cables and keep it in the garage or something, but I like the aesthetics of having the computer in my room. It's fun choosing coolers and picking out a case I like. That's a major part of the hobby for me: building something I think is cool and showing it off in my gaming room.

I guess what I'm getting at is that I don't really see where these products fit into the market. It isn't just a question of price anymore; there are too many things surrounding these cards that make them unattractive to consumers. I have never seen such strong pushback from consumers in all my years in the hobby.

Sorry for the long post
They can make a three-fan AIO liquid CPU cooler for $90 brand new (meaning the production cost was less than $45), but suddenly the 1/8-size mini cooler on the card costs $100 to make? Since when does 100 grams of aluminum and copper cost $100?
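Just to spell that out, using the 100 g figure above and rough 2022-ish metal prices that I'm assuming rather than quoting:

```python
# Raw metal cost only, using the 100 g figure above.
# Commodity prices are rough assumptions, not quotes.
COPPER_USD_PER_KG = 8.0
ALUMINUM_USD_PER_KG = 2.5

mass_kg = 0.100  # 100 grams, split half copper / half aluminum for the sake of argument
material_cost = (mass_kg / 2) * COPPER_USD_PER_KG + (mass_kg / 2) * ALUMINUM_USD_PER_KG
print(f"Raw metal: ~${material_cost:.2f}")  # on the order of cents, not $100
```

The metal itself is pocket change; whatever the real cooler cost is, it's in the heatpipes, fans, machining, and assembly, not in 100 grams of aluminum and copper.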
 
That's interesting that you brought up the 6900. I was thinking about why that is and came up with the thought that maybe one of the reasons AMD cards are slightly cheaper is that they don't have as many capacitors, or use lower quality capacitors, on the board. That would lead to increased transient spikes hitting the power supply. However, capacitors are so cheap that it doesn't really make sense that this would be a cost-cutting measure. Maybe it requires increased board complexity and therefore cost?

I don't have an answer, just an observation
The main difference in production costs between Nvidia and AMD is that AMD uses a chiplet design, which is easier to produce.
 