More rumors suggest Nvidia Lovelace a.k.a. RTX 4000 series will use TSMC N5

Daniel Sims

Posts: 1,376   +43
Staff
Rumor mill: A leaker with a strong track record added to rumors that Nvidia's next generation of gaming graphics cards will use TSMC's 5-nanometer process node. It may also arrive a bit earlier than previously expected.

Kopite7kimi, who knew details of Nvidia's current Ampere graphics card lineup before it launched last year, asserted last week that Nvidia's next series—probably the RTX 40 cards—would use TSMC's N5 process node. The Ampere series uses Samsung's 8nm silicon, so a move back to TSMC would mark the second consecutive generation in which Nvidia has switched between the two foundries.

Known leaker Greymon55 said as much last month, so Kopite's claim adds weight to existing rumors. Kopite also said the RTX 40 series, supposedly running on an architecture called "Ada Lovelace," should arrive a bit earlier than previously expected. Those earlier expectations put it around the end of 2022; since Nvidia tends to launch new generations of graphics cards every couple of years, the latest rumors could mean a launch as early as Q3 2022.

Earlier, both leakers suggested the top-of-the-line cards in the RTX 40 series, like the rumored RTX 3090 Super, would draw over 400W of power, extending a concerning upward trend.

The flagship of the 40 series will supposedly be based on the AD102 GPU, which, according to estimates based on the structure Kopite described, might have as many as 18,432 CUDA cores, compared to the RTX 3090's 10,496. Based on these rumors, the flagship 40 series card may double its predecessor's performance.
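For a rough sense of what those numbers imply, here is a minimal back-of-the-envelope sketch. The core counts come from the leaks above; the linear-scaling model and the resulting clock/IPC figure are simplifying assumptions, not leaked data.

```python
# Back-of-the-envelope check of the rumored uplift.
# Core counts are from the leaks; the scaling model is a simplifying assumption.
rtx_3090_cores = 10_496  # GA102 (known spec)
ad102_cores = 18_432     # rumored AD102 maximum

core_ratio = ad102_cores / rtx_3090_cores
print(f"CUDA core ratio: {core_ratio:.2f}x")  # ~1.76x

# If performance scaled linearly with core count, doubling it would still
# require roughly this much extra from clocks and/or architectural gains:
extra_needed = 2.0 / core_ratio
print(f"Additional clock/IPC uplift needed for 2x: {extra_needed:.2f}x")  # ~1.14x
```

In other words, the extra cores alone would account for about a 1.76x uplift; the rest of the rumored doubling would have to come from clock speed or architectural improvements.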


 
The move is not unexpected: it's clear by now that Samsung's node still isn't as good as TSMC's. If Nvidia wants to keep up with the competition, and it's getting really hot up there, they will have to make the jump.
 
Could you imagine the energy bill running an RTX 3090 Super at 400W alongside an Intel Alder Lake-S chip that's rumored to draw 250W?

I'm all for high-end gaming, but this is getting ridiculous.
A trivial amount of energy compared to running a load of wash through your 220V dryer, charging 10 miles in your electric car, or turning your AC on for a few hours.

It wasn't all that long ago that a single lightbulb pulled 65 watts. A 400-watt peak draw isn't that much.
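For what it's worth, here's a quick sketch of that comparison; the appliance wattages and the electricity rate are illustrative assumptions, not measured figures:

```python
# Rough energy/cost comparison. Wattages and the electricity rate below
# are illustrative assumptions, not measured figures.
RATE_USD_PER_KWH = 0.14  # assumed residential electricity rate

def cost(watts, hours):
    """Return (kWh consumed, USD cost) for a load of `watts` run for `hours`."""
    kwh = watts * hours / 1000
    return kwh, kwh * RATE_USD_PER_KWH

for label, watts, hours in [
    ("2 h gaming on a 400 W GPU", 400, 2.0),
    ("One dryer load (~3000 W, 1 h)", 3000, 1.0),
    ("~10 EV miles (7 kW charger, 0.5 h)", 7000, 0.5),
]:
    kwh, usd = cost(watts, hours)
    print(f"{label}: {kwh:.1f} kWh, ~${usd:.2f}")
```

Under those assumptions, the gaming session is around 0.8 kWh (roughly a dime), against about 3 kWh for the dryer load or the EV charge.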
 
Oh goody, expect massive price increases thanks to TSMC milking the chip shortage for all it's worth.

I'm starting to think the only reason the chips get smaller is the silicon shortage. America actually has more silicon than anyone, but they gotta keep making bigger boobies, forcing the phones to be smaller.
Because there is less room in bras?

Well, you tell me I'm mad then.
 
A trivial amount of energy compared to running a load of wash through your 220V dryer, charging 10 miles in your electric car, or turning your AC on for a few hours.

It wasn't all that long ago that a single lightbulb pulled 65 watts. A 400-watt peak draw isn't that much.
Exactly this. Using a vacuum cleaner for 30 minutes will consume as much energy as 2 hours of flat-out, full-power gaming at 400W. That is, of course, unless you're in the EU, where they've already banned high-powered vacuum cleaners (that's why they're all crap in Europe) because they'd rather punish consumers for using energy than actually bother to produce it cleanly. Much easier that way.

Or be like Germany, where the closure of nuclear power plants is causing its net emissions and emissions per kWh to INCREASE compared to the early '90s. Absolute madness.
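Quick check on the vacuum math, assuming a 1600 W motor (a common pre-ban rating, implied by the comparison rather than stated in it):

```python
# Checking the vacuum-vs-gaming comparison. The 1600 W vacuum rating is
# an assumption implied by the comparison, not a figure from the thread.
vacuum_wh = 1600 * 0.5  # 30 min at 1600 W -> 800 Wh
gaming_wh = 400 * 2.0   # 2 h at 400 W     -> 800 Wh
print(vacuum_wh, gaming_wh)  # 800.0 800.0 -- the claim holds for a 1600 W vacuum
```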
 
Could you imagine the energy bill running an RTX 3090 Super at 400W alongside an Intel Alder Lake-S chip that's rumored to draw 250W?

I'm all for high-end gaming, but this is getting ridiculous.

I agree, we should just ignore the rabid fanbois.

Personally, I find a max power consumption of about 180W for the GPU and about 100W for the CPU acceptable, which is what I see while using my GTX 1080 and 4770K.

This reckless misuse of energy and NVIDIA's 500W GPUs designed for mining are pure insanity.

I can game just fine on my reasonably powered components. I have no need for kilowatt PSUs, 300W Intel CPUs, and 500W GPUs.
 
The best thing to do with GPU rumors is ignore them. The more you pay attention the more you will want the card. Then the more you will be willing to pay. And then there will be no stock so you'll be even more desperate. Until one appears on eBay at twice the price you had convinced yourself it was worth. So you snap it up and slowly watch resentment breed as you consider what else you could have bought with the majority of that cash. And then, just as the resentment has fizzled out, the next card rumors will appear. And so the cycle continues.
 
The best thing to do with GPU rumors is ignore them. The more you pay attention the more you will want the card. Then the more you will be willing to pay. And then there will be no stock so you'll be even more desperate. Until one appears on eBay at twice the price you had convinced yourself it was worth. So you snap it up and slowly watch resentment breed as you consider what else you could have bought with the majority of that cash. And then, just as the resentment has fizzled out, the next card rumors will appear. And so the cycle continues.
Yeah, that's about how marketing works. It's a never-ending cycle of rumors generating hype, making you want to consooom, and when you do, the novelty wears off pretty quickly and you end up justifying your purchase on TechSpot. Meanwhile a new product better than yours is already cooking, starting the cycle again.
 
Exactly this. Using a vacuum cleaner for 30 minutes will consume as much energy as 2 hours of flat-out, full-power gaming at 400W. That is, of course, unless you're in the EU, where they've already banned high-powered vacuum cleaners (that's why they're all crap in Europe) because they'd rather punish consumers for using energy than actually bother to produce it cleanly. Much easier that way.

Or be like Germany, where the closure of nuclear power plants is causing its net emissions and emissions per kWh to INCREASE compared to the early '90s. Absolute madness.

What's concerning is the reversal of a trend where GPUs were actually getting more efficient while maintaining similar incremental performance improvements. It's fair to ask what the reasons are.

 
Could you imagine the energy bill running an RTX 3090 Super at 400W alongside an Intel Alder Lake-S chip that's rumored to draw 250W?

I'm all for high-end gaming, but this is getting ridiculous.

Add your monitor, printer, router, and other accessories, and you could be maxing out a single electrical circuit, especially in older buildings.

Add to that the extra cost of electricity and of PSU and cooling upgrades, and it all leads to more cost and waste. It's not a good trend.
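A rough tally illustrates the point; all the wattages and the 15 A / 120 V circuit below are illustrative assumptions, and wiring in older buildings varies:

```python
# Rough tally of a gaming setup against a single household circuit.
# All wattages and the 15 A / 120 V breaker are illustrative assumptions.
CIRCUIT_W = 15 * 120                 # typical North American circuit: 1800 W
SAFE_CONTINUOUS_W = CIRCUIT_W * 0.8  # common 80% rule for continuous loads

loads_w = {
    "GPU (rumored flagship)": 400,
    "CPU (rumored Alder Lake-S)": 250,
    "rest of the PC": 150,
    "monitor": 60,
    "printer (idle)": 10,
    "router": 15,
}
total = sum(loads_w.values())
print(f"{total} W of {SAFE_CONTINUOUS_W:.0f} W safe continuous capacity")
# 885 W -- under the limit, but anything else on the same circuit adds up fast.
```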
 
Exactly this. Using a vacuum cleaner for 30 minutes will consume as much energy as 2 hours of flat-out, full-power gaming at 400W. That is, of course, unless you're in the EU, where they've already banned high-powered vacuum cleaners (that's why they're all crap in Europe) because they'd rather punish consumers for using energy than actually bother to produce it cleanly. Much easier that way.

Or be like Germany, where the closure of nuclear power plants is causing its net emissions and emissions per kWh to INCREASE compared to the early '90s. Absolute madness.

Akchually, regarding vacuum cleaners, Dyson appealed this law and won. The result is that vacuum cleaners are still as powerful as they were before. My new Miele sucks harder than my 10-year-old one (pun intended).
 
Could you imagine the energy bill running an RTX 3090 Super at 400W alongside an Intel Alder Lake-S chip that's rumored to draw 250W?

I'm all for high-end gaming, but this is getting ridiculous.

If you can afford such a system, you can afford the energy costs.
 
What's concerning is the reversal of a trend where GPUs were actually getting more efficient while maintaining similar incremental performance improvements. It's fair to ask what the reasons are.

Is it though? The rate at which games were evolving was pretty stagnant for a while. Come RTX, every game wants to push the limits again. So instead of reducing power usage while maintaining the same level of detail, they have to push it up to make up the difference.
 
I think all the fabs are the same. Whether you stick with TSMC or Samsung, eventually they will raise prices due to high demand.
Not to support price increases, but the cost of a new leading-edge fab runs well into the billions of dollars. The time to produce a finished wafer at 5nm is also longer than at 7nm, since the additional layers can add weeks to the processing time. Even with more chips per wafer, the cost is staggering.
 
Well, we were expecting that already anyway: it's either that, or $800 for a 3060-class card and $1,000 for a 3070-class card becomes the new official MSRP.
Just wait for the instant upgraders to jump on Lovelace, then try to get a heavily discounted Ampere card. That's how I got my 2080 Super for a very good price.
 
Just wait for the instant upgraders to jump on Lovelace, then try to get a heavily discounted Ampere card. That's how I got my 2080 Super for a very good price.
It's not a terrible idea, but 3000 series hand-me-downs are very likely former mining cards, so it's not exactly a great prospect, at least not as good as that 2080 deal you got.

I'd much rather Nvidia just release a desktop 3050 Ti already, but they're doing so much more business with the far more expensive cards that they might not even release it for desktop this generation.
 