Nvidia's Jensen Huang once again claims Moore's Law is dead

Nvidia doesn't "expect" anything; they dictate to AIBs what to do. Just look at EVGA.
Don't forget Sparkle (no longer makes video cards), XFX (jumped to ATi) and BFG (RIP).

I expect that the next one to fall to nVidia will be Zotac because they're not as big as ASUS, ASRock, Palit or Gigabyte. PNY is safe because their expertise lies primarily in the professional-series cards (like the old Quadros).
 
It's interesting that you brought up the 6900. I was thinking about why that might be and came up with the thought that maybe one of the reasons AMD cards are slightly cheaper is that they have fewer capacitors, or lower-quality capacitors, on the board. That would lead to increased transient spikes hitting the power supply. However, capacitors are so cheap that it doesn't really make sense as a cost-cutting measure. Maybe adding them requires increased board complexity and therefore cost?

I don't have an answer, just an observation
The finger of blame appears to be pointed, sometimes equally, sometimes in one direction, at the PCIe power connectors on the cards and the PSUs. Some graphics cards appear to create an oscillation on the +12V sense pin, which ultimately causes (cheaper) PSUs to go a bit nuts. It doesn't require a heavy load on the 12V delivery pins: it just 'happens', although Ampere cards do seem to be more prone to it than others.

Take TechPowerUp's results again: using Furmark to ramp up the power demand, they showed some interesting culprits. The figures below are the maximum transients recorded during the Furmark test, as a percentage increase over the mean power demand recorded:

RX 6900 XT = 103%
Vega 64 = 73%
RX 6800 = 35%
RX 6800 XT = 31%
RTX 3090 = 28%
RTX 3080 Ti = 26%
RTX 3070 Ti = 17%
RTX 3090 Ti = 11%
RTX 2080 Ti = 10%

Yes, that first entry is correct! One 'solution' to the whole problem is to disable the 12V sense pin, on the 8-pin block, altogether. A far better solution is to use the best quality PSU you can get.
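
To make those percentages concrete, here's a quick back-of-envelope sketch (the 300 W mean figure is just an assumed example, not TechPowerUp's measured number):

```python
# Rough sketch: turn a mean power draw and a transient percentage from the
# table above into an estimated peak transient. The 300 W mean is an
# assumed example value, not measured data.
def peak_transient(mean_watts: float, transient_pct: float) -> float:
    """Peak power if the largest transient is transient_pct above the mean."""
    return mean_watts * (1 + transient_pct / 100)

print(peak_transient(300, 103))  # ~609 W for a card averaging 300 W
```

Brief spikes like that are exactly the sort of thing that can trip the protection circuitry in a cheaper PSU, even though the average load looks perfectly fine on paper.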
 
A far better solution is to use the best quality PSU you can get.
I do that anyway. Skimping on the PSU is a false economy because it's what actually runs your PC. Everything is connected to it and everything depends on it. I won't touch a PSU that isn't at least 80+ Gold and rated for at least 1 kW. At least, not for my main rig. My main PSU is an EVGA 1000 G2 Supernova and my backup PSU is an OCZ Z1000M. Both are rated for 1 kW and both are 80+ Gold-certified.

I really think that more people should pay attention to sage advice like yours because while, sure, the higher-end PSUs are more expensive, they're more or less immortal. I bought my EVGA because it was on some stupidly-crazy sale and figured that my Z1000M was getting old. I realised this year that I was a fool because there was no need to replace my Z1000M.

I was using the Z1000M in my mining build where it was powering an FX-8350, an RX 5700 XT and an RX 6800 XT without issue for over 3,600 hours straight. Not bad for a PSU that's 12 years old, eh?
 
I'm quite interested in seeing how moving to a chiplet design will affect their power consumption.
I *really* want to see them on par with NV as far as additional features (specifically raytracing) and performance goes.

That said, as power use goes, I'm already pulling 400 W under load on the GPU as it is, so *for me*, as long as whatever I decide to upgrade to (if I choose to do so) is at or below that, it's fine.

We already know RDNA3 is another 50% performance-per-watt gain.
Also, read ANY reviews of the 3080 Ti vs AMD cards..? AMD typically wins every game and benchmark (price/performance)... and uses less power while doing it. RDNA3 is going to be an even bigger win...

Again, the "ray treason" criterion is fringe. Nearly ALL gamers (i.e. players) would rather have 3D positional sound (ray-traced sound) than ray treason. No gamer was sitting around asking for this... the industry itself will not move to ray tracing for another 5 years.

Just ask 90% of the market.... RDNA Consoles.
 
Even so, AMD video cards are better priced than their Nvidia counterparts for the same performance, and often offer better performance.
Nvidia suffers from a chronic illness called greed, far beyond ordinary business sense.
And even so, Nvidia still enjoys a quasi-monopoly in the dGPU market. Either people really value Nvidia's better ray-tracing performance and proprietary tech like DLSS, or, I don't know, Huang is just a magician...
 
Just make a cost-effective and energy-efficient X060 card.
Immersion is lots of things: real-life effects, sound, dust kicked up, etc. 1080p done well is better than an 8K high-detail mess.
 
Frankly, I don't see the cost of silicon being the biggest driving factor in the increased cost of these cards. They are using absolutely RIDICULOUS coolers; I wouldn't be surprised if the cost on nVidia's side for the 4080 cooler is well over $100 a unit. The other part to this is that since they are setting absolutely crazy power limits, they need to put the electrical backbone on the boards to provide proper, clean power to these chips. nVidia's solution to performance is just to pump more power into the chip; there is very little innovation on their part.

nVidia is catering to the deep-learning crowd and putting all this AI stuff on their chips, taking up valuable silicon. These cards don't seem like consumer cards; they seem like enterprise cards where consumers are an afterthought. Then they find software solutions to use up the extra silicon they wasted on the chip. Going on power limits alone, these cards look more like something that belongs in a server rack, not a computer case. And I would cite these MASSIVE coolers and video card support beams as evidence of that.

And now, if we want to use these cards, not only do we have to go out and buy a new power supply, we have to buy a VERY EXPENSIVE power supply. Something in the 1000 W+ range. I'd say if you want a 4080, you'd likely need something in the 1200 W+ range. That is getting very close to what you find in server racks.

It just seems like laziness, and that nVidia doesn't want to develop cards for consumers. They want to keep development costs low and pass the secondary costs of using their cards onto the consumer (i.e. coolers, power supplies, video card support beams, etc.).

Like, what is actually going on with this crap? It was crazy watching power consumption numbers like the GTX 280's and Vega 64's, but these cards are DOUBLE those. These numbers are DOUBLE what people already thought was absurd, and now they're just saying, "shut up, buy it, Moore's Law is dead."

We all complained about the 280 and Vega 64 cards, but the industry accepted it. In my 25 years of following the tech industry, I have never seen anything this absurd. This is craziness and I don't think the market will accept it like it did in the past. These cards are too expensive and there are too many downsides to owning them. They want me to pay $1,200 for a card that's going to make it uncomfortable to sit in my room? I mean, I guess I can run some cables and keep it in the garage or something, but I like having the computer in my room for the aesthetics. It's fun choosing coolers and picking out a case I like. That's a major part of the hobby for me: building something I think is cool and showing it off in my gaming room.

I guess what I'm getting at is that I don't really see where these products fit into the market. It isn't just a question of price anymore; there are too many things going on surrounding the cards that make them unattractive to consumers. I have never seen such a strong push-back from consumers in all my years involved in the hobby.

Sorry for the long post
I have a spacious full tower, a platinum Corsair 1200W PSU, cheap electricity and so on. I would only need to buy a support stick for the behemoth. Anyway, I would never buy that monstrosity. Bad/lazy engineering.
 
Typical corporate speak for taking advantage of a supply shortage: 'Yeah, even though things were short during a global catastrophe, we're going to charge even more now that we have too much supply to sell.'

How is Moore's Law dead, exactly? We have 5 nm chips coming this year from AMD and 3 nm after that. Smaller chips are CHEAPER to make; they take less silicon.

This is the most obvious play they could possibly make to sell their remaining RTX 3000 stock.
 
insisted that the price increases were justified when compared to the performance levels offered by previous generations.

Nah, give us affordable prices instead. A performance increase every generation is basic common sense, not an additional feature that justifies the insane pricing.

This guy is crazy.
 
Well, let's see if Jensen Huang is right, shall we? Now, originally, Gordon Moore specifically wrote the following:
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly, over the short term, this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years
So let's take all of Nvidia's top-end chips (i.e. the ones with the most transistors) and plot a logarithmic graph of transistor count against year, just like Moore did for his article:
[Attached graph: nvidia_versus_moore.png]

So, Huang is right. Moore's Law is indeed 'dead'. Turns out, for Nvidia, it hasn't been relevant for quite a few years.

Joking aside, though, this is just transistor count, after all -- modern chips contain many components, so perhaps this is what Huang is referring to. But I suspect not and really, just as we all suspected, the claim of 'Moore's Law is dead' is nothing more than marketing hyperbole.

Not that he needed to really make such a vacuous statement -- the AD102's transistor count speaks for itself:

[Attached graph: nvidia_versus_moore2.png]

This graph alone shows just how much of an engineering leap it is, although a significant reason for the leap is the switch to TSMC's N4 process node. This is evident when you look at a logarithmic graph of the die densities:

[Attached graph: nvidia_versus_moore3.png]
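
If anyone wants to recreate the first graph, here's a minimal sketch. The transistor counts are approximate figures from public spec listings (treat them as illustrative, not authoritative), and the dashed line uses the commonly quoted two-year doubling; swap in one year for Moore's original 1965 rate:

```python
import matplotlib.pyplot as plt

# Approximate transistor counts for Nvidia's largest consumer GPUs
# (figures from public spec listings -- illustrative, not authoritative).
chips = {
    2006: 0.681e9,   # G80
    2008: 1.4e9,     # GT200
    2010: 3.0e9,     # GF100
    2012: 7.1e9,     # GK110
    2016: 12.0e9,    # GP102
    2018: 18.6e9,    # TU102
    2020: 28.3e9,    # GA102
    2022: 76.3e9,    # AD102
}

years = sorted(chips)
counts = [chips[y] for y in years]

# Projection from the first data point: doubling every two years
# (use 1 instead of 2 for Moore's original one-year rate).
moore = [counts[0] * 2 ** ((y - years[0]) / 2) for y in years]

plt.semilogy(years, counts, "o-", label="Nvidia flagship (approx.)")
plt.semilogy(years, moore, "--", label="2x every 2 years")
plt.xlabel("Year")
plt.ylabel("Transistor count (log scale)")
plt.legend()
plt.show()
```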
 
These high-end cards have support for up to 675 watts of power draw between the PCIe slot and the 2x12-pin connectors. Why would they put that in there if they didn't expect people to use it or need it in the future? And as scummy a company as they are, I wouldn't doubt they hide behind their board partners. From what I could see of the EVGA situation, they were sick of being told what to do and taking the heat for what nVidia was telling them to do. "The reference cards are 450 W; we're giving you 675 watts to play with. We want you to go wild."
OK, let's say they can eat 675 watts; is it safe for the PSU, wires, and motherboard, though?
 
OK, let's say they can eat 675 watts; is it safe for the PSU, wires, and motherboard, though?
Speaking of wires, that card will be drawing almost 7A at 120V. In North America, a standard 14ga house wire is rated for 15A at 120V. It's getting to the point that you're going to need a separate 15A circuit just for the PC wired into your walls because the power that it draws may become a fire hazard. This may be especially true in older buildings or in countries where the quality of electrical wires isn't up to snuff.
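
For the curious, the back-of-envelope arithmetic behind that figure, with the rest-of-system draw and PSU efficiency as assumed round numbers:

```python
# Back-of-envelope current draw at the wall (illustrative assumptions):
gpu_watts   = 675    # worst-case board power discussed above
rest_of_pc  = 150    # assumed CPU + motherboard + drives + fans
psu_eff     = 0.90   # assumed ~90% efficient (80 Plus Gold-ish)
mains_volts = 120    # North American outlet

wall_watts = (gpu_watts + rest_of_pc) / psu_eff
print(wall_watts / mains_volts)  # ~7.6 A -- in the ballpark of the ~7 A above,
                                 # on a 15 A (14 ga) circuit
```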
 
Man, people are in for a surprise when AMD GPUs end up more expensive than they thought they'd be, because increased wafer and other component costs get passed on to the consumer through them as well.

This is simple business practice. Neither Nvidia, AMD, nor Intel is going to just eat the cost of these products; we are, and it's not going to be exclusive to Nvidia either.
They can be profitable at much lower prices; these cards don't actually cost that much to make. The difference in revenue can be made up by more units sold, unlike the 3000-series update and the 4080 12 GB, which aren't selling and won't sell well at all.

You should check out how much cheaper it is for AMD to make chiplet CPUs and GPUs vs Nvidia's monolithic dies; you'd be surprised. It is completely Nvidia's fault their production costs are high, and they're not nearly as high as they claim anyway.

It's hilarious to think people would buy a 4080 for $900-1,200 when the 3090 Ti has already dropped to $1,000 on Best Buy lmfaooo. Once they come out, the 3090 Ti will be 700 bucks.
 
Speaking of wires, that card will be drawing almost 7A at 120V. In North America, a standard 14ga house wire is rated for 15A at 120V. It's getting to the point that you're going to need a separate 15A circuit just for the PC wired into your walls because the power that it draws may become a fire hazard. This may be especially true in older buildings or in countries where the quality of electrical wires isn't up to snuff.
That's a ridiculous concern when people run 1,200-watt radiators, air fryers, and air conditioners. Borderline hysterical.

P.S. They're called breakers, heard of 'em?

P.P.S. Miners run 8 GPUs pulling over 1,200 W at once, and they're very popular in poor countries, heard of 'em?

The only reason this is even a concern is that it's all coming out of one port on the PSU; the total power draw is laughable as a danger to your in-wall house wiring.
 
So, Huang is right. Moore's Law is indeed 'dead'. Turns out, for Nvidia, it hasn't been relevant for quite a few years.
Incorrect. The graphs you yourself posted prove that. Context is important...
Joking aside, though, this is just transistor count, after all -- modern chips contain many components, so perhaps this is what Huang is referring to. But I suspect not and really, just as we all suspected, the claim of 'Moore's Law is dead' is nothing more than marketing hyperbole.
Ah... Well... nevermind then...
 
Incorrect. The graphs you yourself posted prove that. Context is important...

Ah... Well... nevermind then...
It was a bit of a joke post, but also meant to point out that even Nvidia's best chips haven't quite followed Moore's prediction for some time now. Even if one is very generous and assumes the AD102 has as many other components as it does transistors, the total count is still well short of what Moore's estimate suggests the AD102 would have.

So Huang is right, but not in the way he wants us to think, i.e. he's suggesting their chips are 'better' than the law. But let's face it: Moore made the statement 57 years ago, when VLSI chips were yet to be born.
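
To put a rough number on 'well short', here's a quick hedged calculation, starting from GK110's roughly 7 billion transistors in 2013 (approximate figure) and applying Moore's original yearly doubling alongside the gentler two-year version:

```python
# Hedged back-of-envelope: what Moore's doubling rates would project for 2022,
# starting from GK110 (~7.1e9 transistors, 2013 -- approximate figure).
gk110 = 7.1e9
years = 2022 - 2013

yearly     = gk110 * 2 ** years         # original 1965 rate: ~3.6e12
two_yearly = gk110 * 2 ** (years / 2)   # revised two-year rate: ~1.6e11

ad102 = 76.3e9  # approximate published count
print(ad102 / two_yearly)  # ~0.47 -- under half even of the gentler projection
```

Even doubling the AD102 figure to account for 'other components' still falls short of the two-year projection, and it's nowhere near what the original yearly rate implies.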
 
Moore's Law (IMO) was stillborn anyway. At most, it should have been categorized as "Moore's theorem": a theory based on the prevailing advancements of the period in which it was conceived.
Speaking of wires, that card will be drawing almost 7A at 120V. In North America, a standard 14ga house wire is rated for 15A at 120V. It's getting to the point that you're going to need a separate 15A circuit just for the PC wired into your walls because the power that it draws may become a fire hazard.
I believe most, if not all, electrical codes in the US have upped the minimum wire size to 12 gauge, which yields a 20-amp circuit.

We had a big hiccup a few years back, when a copper shortage forced us into wiring with aluminum. That was a recipe for disaster, mixing copper connections with aluminum wiring.

I did some wiring a few years back with split grades, 14 gauge for lightly loaded light circuits, with 12 gauge on separate circuits for air compressors and air conditioning.

That 12 gauge is running about treble the price it was 20 years ago.

You're absolutely correct, though, about older homes, thinner wires, and tired insulation. The house I'm living in still has knob-and-tube wiring upstairs. Since it's only on the ceiling light circuit, which draws at most about 300 watts, I still sleep well.
 