Nvidia GeForce RTX 40 series might get announced as early as July

Tudor Cibean

Something to look forward to: Nvidia might push up the release date of its next-gen RTX 40 lineup in order to get a head start on AMD, which is also planning to launch new GPUs later this year. So far, leakers have teased massive generational performance gains and TGPs that saturate the new PCIe 5.0 power connector, but no one has a clear idea of pricing yet.

According to reliable GPU leaker kopite7kimi, Nvidia might release its GeForce RTX 40 series in early Q3. While just a rumor at this point, that means we could see an announcement of the eagerly anticipated GPU lineup as early as July.

Such a launch wouldn't be without precedent. The GeForce RTX 20 Super lineup was made public on July 2, 2019, while the first RTX 30-series graphics cards got announced in early September 2020.

Nvidia's RTX 40 series is based on the Ada Lovelace architecture and will probably be built on TSMC's N4 process node. The company is expected to launch the high-end models first, comprising the RTX 4090, RTX 4080, and RTX 4070, with more affordable cards coming a few months later.

The tipster has also shared a table with the configuration of the AD102 GPU that's going to be used in Nvidia's new flagship.

According to the leaks, the chip could feature 96MB of L2 cache, 16 times more than its predecessor.
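That multiple is easy to sanity-check; a quick sketch (both the 96MB figure and the 16x claim come from the leak, and the implied 6MB matches the L2 size of the current flagship GA102 chip):

```python
# Sanity check on the leaked cache claim: 96 MB of L2 on AD102,
# said to be 16x its predecessor's.
ad102_l2_mb = 96
claimed_multiple = 16

predecessor_l2_mb = ad102_l2_mb / claimed_multiple
print(f"Implied predecessor L2 cache: {predecessor_l2_mb:.0f} MB")  # 6 MB
```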

The reference RTX 4090 is also rumored to come with a whopping 600W TGP (or more?), while some third-party cards might include two 16-pin power connectors, allowing enthusiasts to push the power limit even higher for overclocking.

As for the RTX 4080, it's believed that it'll use the AD103 GPU paired with 16GB of GDDR6X memory and a 400W TGP. The RTX 4070 would feature an AD104 GPU, 12GB of GDDR6 memory, and a relatively tame 300W TGP.
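For context on how those rumored TGPs map onto the new connector: each 16-pin 12VHPWR plug is rated for up to 600 W, and the PCIe slot itself supplies up to 75 W. A minimal sketch of the resulting headroom (the TGPs are the rumored figures above; the connector and slot ratings are from the spec):

```python
# Rumored TGPs vs. available board power, assuming one 16-pin
# 12VHPWR connector (rated up to 600 W) plus 75 W from the PCIe slot.
CONNECTOR_W = 600
SLOT_W = 75

rumored_tgps = {"RTX 4090": 600, "RTX 4080": 400, "RTX 4070": 300}

for card, tgp in rumored_tgps.items():
    headroom = CONNECTOR_W + SLOT_W - tgp
    print(f"{card}: {tgp} W TGP, {headroom} W headroom on one 16-pin")

# Two 16-pin connectors, as rumored for some third-party RTX 4090s,
# would raise the ceiling to 2 * 600 + 75 = 1275 W for overclocking.
print("Dual 16-pin ceiling:", 2 * CONNECTOR_W + SLOT_W, "W")
```

This is why a 600W reference 4090 would effectively saturate a single connector, and why dual-connector boards are pitched at extreme overclocking.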


 
I'll believe it when I see it.
The economic downturn, sky-high energy fees, and the crypto collapse might make the 4000 series accessible, but we won't know what's worth what till they are released.

I will definitely build a new system with a 1,000+ W PSU and a liquid-cooled CPU, but this time around, I'll hold out for the 4000 Hybrid with its own AIO.
 
Ew, an AIO GPU? Disgusting. Custom water block or GTFO, man.
 
I won't be surprised if GeForce 4000 GPU chips use 7nm or 8nm DUV.
Using EUV to make consumer GPUs is still very expensive.
EUV production rates are also low compared to DUV.
 
First 900-watt TDP GPUs, now early launch rumors. They must be really threatened by rumors of RDNA 3 MCM at 3GHz. "Nvidia, quickly launch the 4000 series so we can capitalize on the gap; our falling stock needs a boost."
Wait, it must be the 6950 XT's performance that has them worried. On one side, cryptomining is on a downward trend; on the other, they have Intel potentially eating their low-end market and AMD potentially taking the GPU crown. A desperate Nvidia will indeed do strange things.
 

I reckon you overestimate the impact that a competitor's moves have on launch planning. It is mostly dictated by silicon yields, the board-component supply chain, and the fiscal reporting schedule. Nvidia is selling everything it can produce while maintaining strong profit margins.

Meanwhile, AMD has so far not even allocated enough wafers to its RDOA chips to have a chance of seriously growing its tiny discrete GPU market share.
 
Let the rumor games begin.....

OK,
Ada Lovelace is just the cover story.
These cards were secretly named after a porn star.

With two 16-pin power connectors that can deepthroat 600 throbbing watts of power all night long, we can hardly wait to see it suck so hard.
 
Let's hope prices will drop by then. I am still rocking a GTX 770 and have skipped all the price insanity so far. I'll play all the missed games later.
 
I think we need a company that will sell portable nuclear reactors ASAP.

Even if I were still on my old 1080 Tis, I would require some serious convincing to even think about the 4xxx series. My whole system eats about 650W when rendering, and I'm very happy with the performance. To think that one GPU alone would raise that to 1kW of sustained load. It would have to be like 10x+ the performance for me to even - maybe - consider it. It's bad enough that electricity already got a 30% price hike, and I have no idea if the frozen household pricing stays after 31/05, which I very much doubt.

The only positive take on those rumors is that Nvidia will - maybe - finally stop offering high-end GPUs with 8GB or even 10GB of VRAM. While that should still be fine for most games, for work even 10GB is a laughably small buffer.
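To put that running-cost worry in numbers, here is a minimal sketch, assuming a hypothetical 0.30 EUR/kWh tariff and six hours of rendering load per day; only the 650 W and ~1 kW figures come from the comment above, the rest is illustrative:

```python
# Back-of-the-envelope running cost: a ~650 W system vs. a ~1 kW one.
# Assumptions (illustrative only): 0.30 EUR/kWh and 6 hours of
# sustained rendering load per day -- plug in your own tariff and usage.
PRICE_PER_KWH = 0.30   # EUR, hypothetical tariff
HOURS_PER_DAY = 6

for watts in (650, 1000):
    kwh_per_month = watts / 1000 * HOURS_PER_DAY * 30
    cost = kwh_per_month * PRICE_PER_KWH
    print(f"{watts} W load: {kwh_per_month:.0f} kWh/month, ~{cost:.2f} EUR/month")
```

At those assumed rates, the jump from 650 W to 1 kW adds roughly 19 EUR a month of rendering-time electricity, before any price hike.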
 
1.21 Gigawatts? Great Scott!!!

 
With those TGPs, GeForce is a big NO for me this generation. I will continue with Radeons, unless they go crazy too.
 
If this drops current-gen GPU prices, then I'm all for it.

Not really trusting the next-gen launch/availability to be better than this gen's, so if it means I can get something in the 3060 Ti / 6700 XT range for really low prices, I'll probably go for that.

 
I reckon you overestimate the impact that a competitor's moves have on launch planning. It is mostly dictated by silicon yields, the board-component supply chain, and the fiscal reporting schedule. Nvidia is selling everything it can produce while maintaining strong profit margins.

Meanwhile, AMD has so far not even allocated enough wafers to its RDOA chips to have a chance of seriously growing its tiny discrete GPU market share.
It's true, but Nvidia plays the mindshare game; it simply can't have the inferior GPU, hence the historic 900-watt TDP. The launch can also be a paper launch: limited silicon at extremely high premiums, with supply scaled up as yields improve.
 
Let the rumor games begin.....

OK,
Ada Lovelace is just the cover story.
These cards were secretly named after a porn star.

With two 16-pin power connectors that can deepthroat 600 throbbing watts of power all night long, we can hardly wait to see it suck so hard.

So many puns in one post......well done!!
 
At this rate, nGridia will be producing GPUs with their own ravenous external power supplies, maybe by next year at the latest. I could see an RTX 5090 sucking 1,200 W or even more.

Not even 3 years ago, some shilly tech websites and fankids were practically lynching AMD for daring to have a product with 125 watts!!
 
I remember an interview, though I don't recall the person's name, who stated that AMD, Nvidia, and Intel talk to each other openly about their upcoming products and seem to plan ahead accordingly.

For example, Nvidia knew that AMD didn't have anything to compete with their RTX 20 series, hence that "nice and sweet" price increase at launch.

Now, they must know that RDNA 3 will be so good that it has them spending extra money on YouTubers and tech sites to start the rumor mill like this.

I simply wish that the nvdrones would open their eyes and see what a dirty company it is, so they would not be as loyal as some of the comments here indicate.
 
I simply wish that the nvdrones would open their eyes and see what a dirty company it is, so they would not be as loyal as some of the comments here indicate.

Company loyalty is dead for the majority of consumers (but not for fanboys on tech sites)... whoever makes the best products for the best prices will get most of the money. Or, failing that, whoever can actually ship these products instead of simply "paper launching"...

I have owned Radeon and Nvidia GPUs - currently using a Radeon HD 7970 in my work PC, which has done yeoman's work for over a decade, while using a 2080 Ti in my home PC.

For the past few years, however, AMD GPUs have tended to be pretty crappy at the high end - perhaps that will change this generation - I hope so :)
 
Company loyalty is dead for the majority of consumers (but not for fanboys on tech sites)

That's the sad part: consumers these days don't exist, only hardcore fanbois.

When you tell someone to think like a customer by demanding more, it tends to fall on deaf ears, especially on tech sites, as you said.
whoever makes the best products for the best prices

See, that's the other problem: somehow, I have noticed that people loyal to Nvidia don't look at that part (best prices). Really strange.
AMD GPUs have tended to be pretty crappy at the high end - perhaps that will change this generation

I agree, and sadly, for some use cases that's still true (anyone who needs a program that only uses CUDA), but performance-wise, I have to say RDNA 2 has really matched and sometimes beaten RTX 30 GPUs.

Yes, yes, I know what the nvdrones will say next: "bUt mAh DLSS and RT yoo!"

Well, on an even playing field (FSR 2.0), with resizable BAR and even some RT, they do compete.

Personally, I have seen only a couple of games where I would say, "wow, RT is great," but so far it's not something I would deem worthy of the performance hit involved.

And the usual disclaimer: I hate Nvidia because of how they treat their customers and the industry, but I am not a rabid fanboi of AMD; I simply like their current approach to customers and the industry.

If that changes, then so will I.
 
For that much wattage, you may need to plug the machine into its own dedicated electrical circuit and avoid running other appliances on the same circuit. Most household bedrooms are on a 15-amp circuit, which is safe for up to 1,440 watts of continuous load (15 amps x 120 volts x 80% safety margin).
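That 80% continuous-load rule is easy to verify; a minimal sketch (the 15 A, 120 V, and 80% figures come from the comment above; the example system wattage is hypothetical):

```python
# Rule of thumb for continuous loads: stay at or below 80% of a
# circuit's rated capacity.
AMPS = 15
VOLTS = 120
SAFETY_FACTOR = 0.80

continuous_limit_w = AMPS * VOLTS * SAFETY_FACTOR  # 15 * 120 * 0.8 = 1440 W
print(f"Continuous limit on a {AMPS} A circuit: {continuous_limit_w:.0f} W")

# Hypothetical build: 600 W GPU plus ~400 W for CPU, board, and PSU losses.
system_w = 600 + 400
print("Left for everything else on the circuit:",
      continuous_limit_w - system_w, "W")
```

A 1,000 W system would leave only about 440 W of continuous headroom for everything else plugged into that circuit.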
 
Remember when they said that increasing miniaturisation would enable ultra-low-power systems?
And yet, here we are, talking about 1kW PSUs for a desktop computer....
They do both... we HAVE ultra-low-power laptops/tablets/smartphones.... you can also use the miniaturisation to cram in as much as you can - hence the 1kW beasts...
 
Like a high-priced hooker, she gives you a couple of extra-long headers to suck all that power down.

But can she do it all night long while keeping her cool?
 
If the rumors are true, I'm not terribly excited about RTX 40 at the moment. The AD102 GPUs are all likely to be priced over $1,000 and use a minimum of 450 watts of power. Once you go down to AD103, the increases over the current gen are more moderate. The 4080, from what the rumors have been claiming, has been moved to AD103 and is currently looking to be the least impressive jump in terms of performance, with only 768 more CUDA cores than the current 3080. Obviously, the architecture and third-gen RT will help as well, but I would not be surprised if 3080 to 4080 is only a 15-20% jump.

Having a 3080 and not wanting to spend $1,000+ to upgrade nor replace my PSU, this makes the lineup rather mundane. The rest of the lineup is getting a nice CUDA core boost, but nothing like the 3090-to-4090 jump. Not that I need a new GPU with the 3080, but because I more than recouped the cost of my 3080 due to the overpriced used GPU market in late 2020, I thought I would perhaps upgrade if the 4080 was significantly better. Maybe the RDNA 3 7800 XT range will prove to be the appropriate upgrade path.
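Taking the comment's figures at face value, the core-count side of that estimate checks out. A minimal sketch (the RTX 3080's 8,704 CUDA cores are its published spec; the +768 delta is the rumor cited above, and real-world performance would also hinge on clocks and architecture):

```python
# Core-count uplift implied by the rumor: the RTX 3080 has 8704 CUDA
# cores (published spec); the leak adds only 768 more for the RTX 4080.
rtx_3080_cores = 8704
rumored_extra_cores = 768

rtx_4080_cores = rtx_3080_cores + rumored_extra_cores
uplift_pct = rumored_extra_cores / rtx_3080_cores * 100
print(f"Rumored RTX 4080: {rtx_4080_cores} cores (+{uplift_pct:.1f}%)")
# ~8.8% more cores; clocks and architecture would have to supply the
# rest of the estimated 15-20% generational jump.
```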
 