Rumored flagship Nvidia Lovelace card to pack 48GB of GDDR6X, 18,176 CUDA cores, and an 800W TDP

midian182

Rumor mill: Every day that passes brings us one step closer to the launch of Nvidia's RTX 4000 (Ada Lovelace) series. As is always the case with hyped upcoming products, rumors about what the cards might offer are landing thick and fast. The latest involves what could be the series flagship: a card with 18,176 CUDA cores, 48GB of GDDR6X memory, and a TDP of 800W.

Regular leaker kopite7kimi posted a tweet about what he appropriately nicknamed "the beast." The card is supposedly based on a new PG137-SKU0 board design, rather than the PG139 expected to be utilized by the RTX 4090, and could be the first to feature a triple-fan reference cooler from Nvidia.

kopite7kimi believes the card will pack an AD102-450-A1 GPU featuring 142 streaming multiprocessors (SMs) on 18,176 CUDA cores. That's still a cut-down version of the full-fat AD102 GPU, which packs 144 SMs on 18,432 CUDA cores.

The beast is also said to come with a monstrous 48GB of 24 Gbps GDDR6X VRAM with a 384-bit bus interface, enabling a maximum bandwidth of 1.15 TB/s. That represents an increase of about 14% over the RTX 3090 Ti.
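For reference, both the core count and the bandwidth figure fall straight out of the numbers above: 128 CUDA cores per SM, and peak bandwidth equal to the per-pin data rate times the bus width. Here is a rough sketch of that arithmetic, using the RTX 3090 Ti's 21 Gbps GDDR6X on the same 384-bit bus for comparison:

```python
# Back-of-the-envelope check of the rumored specs.
# CUDA cores = SMs * 128 (implied by 18,432 cores across 144 SMs on the full AD102).
# Peak bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.

def cuda_cores(sms: int, cores_per_sm: int = 128) -> int:
    return sms * cores_per_sm

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(cuda_cores(142))                  # 18,176 (rumored cut-down AD102)
print(cuda_cores(144))                  # 18,432 (full AD102)

beast = bandwidth_gb_s(24, 384)         # rumored 24 Gbps GDDR6X, 384-bit bus
rtx_3090_ti = bandwidth_gb_s(21, 384)   # 21 Gbps GDDR6X, 384-bit bus
print(f"{beast:.0f} GB/s vs {rtx_3090_ti:.0f} GB/s "
      f"(+{beast / rtx_3090_ti - 1:.0%})")  # 1152 GB/s vs 1008 GB/s (+14%)
```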

The card's final spec is one that lines up with previous rumors of Lovelace being obscenely power hungry: an 800W TGP, which means we can expect it to feature dual 16-pin connectors.

The specs of the card, especially that 48GB of GDDR6X VRAM, suggest that the beast could be the first Titan-branded Nvidia product we've seen since 2018. Titans are aimed more at developers, researchers, and creators, rather than gamers. Nvidia never released any during the current Ampere era, noting that its BFGPU (big ferocious GPU) RTX 3090 and RTX 3090 Ti are much faster than the previous-gen Titans.

As with all rumors, this one needs to be taken with a heavy dose of salt, but we seem to be hearing similar claims about the RTX 4000 series every week, including a recent leak suggesting the RTX 4090 will be twice as fast as the RTX 3090. But with consumers reining in their spending, energy prices at an all-time high, and more people opting for cheaper Ampere cards, it'll be interesting to see just how well received Lovelace proves to be, and whether Nvidia releases more than one card this year.


 
While 48GB of memory is cool, I have no idea how this would be useful outside of workstation applications. It certainly wouldn't be a gaming card, and until we have a 16k60 standard it seems pretty moot. Just think, that's more memory than many high-end PCs have. I don't see many PCs with more than 32GB.

That said, this is a cool piece of tech, as ridiculous as that power limit is. Are we sure this isn't a Quadro-series card, or whatever they're called these days?
 

A game developer could create very high-resolution textures just for this card, but I doubt many would bother.
 
But can it play Crysis?

Yes. It can render at max spec and run a neural net to play by itself.

And supply a secure source of revenue for a regional power plant.
 

Machine learning and GPU porn. Even at that, the 800W specification means this has to be intended for either niche applications or home users who have a very good electrician.
 
You'd have to make absurdly high-resolution textures. I mean, we'd be talking games in the terabyte-plus range. But that's part of the reason I said this wouldn't be practical unless we had cards capable of 8k120 or 16k60. You'd need an absurdly high resolution to take advantage of textures that large.

That brings me back to my original point: there are 8k60 screens, but can this card even play at 8k60? It's pointless as a gaming card. Even the 3090 couldn't use all of its 24GB while gaming, so why double it?
 
You don't need higher screen resolutions to notice a difference. Most games at the moment assume limited VRAM and use reduced texture sizes on most non-essential objects so the game will run on as many devices as possible. A game aimed solely at this card could apply 16K textures to almost everything. Large file sizes in games are usually due to cinematic movie files and audio files along with localized versions; more textures would only increase the overall size by 2x at most.
 
Going from 8K to 16K is a 4x increase. It also depends on the size of whatever the texture is on. You won't notice going from 8K to 16K on something like a pot; it would have to be pretty large to be noticeable at 1080p or 1440p, and 16K textures would be hard to notice even at 4K. That brings me back to the 8K/16K resolutions. And this card might not have the GPU power to run 16K textures at 8K resolution. We will just have to wait and see.
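To put rough numbers on that 4x scaling, here's a quick sketch of per-texture VRAM cost. The RGBA8 (4 bytes per texel) and BC7 (roughly 1 byte per texel) figures are illustrative assumptions, not anything from the article:

```python
# Rough texture memory footprint: doubling resolution per side quadruples texel count.

def texture_size_mib(width: int, height: int, bytes_per_texel: float, mipmaps: bool = True) -> float:
    """Approximate VRAM footprint of a single texture in MiB."""
    size = width * height * bytes_per_texel
    if mipmaps:
        size *= 4 / 3  # a full mip chain adds roughly one third
    return size / (1024 ** 2)

for side in (4096, 8192, 16384):  # "4K", "8K", "16K" textures
    uncompressed = texture_size_mib(side, side, 4)  # RGBA8, 4 bytes per texel
    bc7 = texture_size_mib(side, side, 1)           # BC7 block compression, ~1 byte per texel
    print(f"{side}x{side}: ~{uncompressed:,.0f} MiB uncompressed, ~{bc7:,.0f} MiB BC7")

# Each step up is ~4x: an 8K RGBA8 texture with mips is ~341 MiB, a 16K one ~1365 MiB.
```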
 
The only consolation of these high-end cards is that maybe they'll leave more room to actually give us solid mid-range cards this generation. The mid-range offerings of Ampere and RDNA2 could have been a lot better than they were. The 3060 was about 11% faster than the 2060 and 1% slower than the 2060 S. The RX 6600 XT was only about 15% faster than the RX 5600 XT. The 3080 on the other hand was a whopping 55% faster than the RTX 2080 and even surpassed the 2080 Ti. It would be nice to see the 4060 at +50% compared to the 3060, but don't hold your breath.
 
Oh lawd, he coming... that's a phat-arse BFGPU, but damn, 800 watts.
I suspect transient current to be 2-2.5 times that, which means you should be safe with a 2kW power supply just for this card.
I just wonder how US households would be able to use them: 120V and 15A circuits = 1800W.
People in the northern parts of the world will use 3DMark to heat up the place.
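For the circuit math, here's a quick sketch; the 800W figure is the rumored TGP, while the rest-of-system draw and the 2x transient multiplier are just assumptions for illustration:

```python
# Household headroom for a rumored 800 W card on a standard US 120 V / 15 A branch circuit.

circuit_watts = 120 * 15        # 1800 W available on the breaker
gpu_tgp = 800                   # rumored TGP
rest_of_system = 300            # assumed CPU, drives, fans, monitor, etc.
transient_multiplier = 2.0      # assumed short power spikes at ~2x TGP

steady_state = gpu_tgp + rest_of_system
spike = gpu_tgp * transient_multiplier + rest_of_system

print(f"Circuit capacity: {circuit_watts} W")
print(f"Steady-state draw: ~{steady_state} W")  # ~1100 W, fits
print(f"Transient spike:   ~{spike:.0f} W")     # ~1900 W, over the 1800 W breaker
```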
 
These are not for gaming... they are a cheaper-sourced dGPU for content creators.

Gamers don't need 48GB, they need faster rasterization.
 
We're just going to have to run 220V lines to our PCs or set up next to the washer and dryer. Maybe we can tap into the dryer's exhaust port to remove some of the heat from the room.
 
Given that this is probably meant for researchers, commercial use, etc., why call it RTX 4090?
I seriously think Nvidia wants to sell this to gamers as well, if only to cream the huge markups and the bragging rights of having the fastest GPU.
Outside of curiosity, I have no interest in having this near any PC I build.
I don't even look at celebrity gamers' machines, but even they normally build sensible PCs, from the few I saw.
 
No one actually knows what this card will be called. Also, from all that is known of this rumor and card, it's experimental. I seriously doubt it is 800W. I seriously doubt it will even be 500W, if and when it releases. There is some speculation that this is a high-end Quadro card. Why not? Nvidia sells those cards for a considerable markup.
 
Sounds like a Titan spec, or really the basis of a high-end Quadro. No way this will become even a 4090 Ti for regular desktop users, IMO.
 
Yeah, went back and read it: just 4000 series, and a different board design than the 4090. As for price for that market, I suppose there are lots of factors: sunk investment in Nvidia, calculation per dollar, and many more. Normally people need a reasonable % difference to change to a competitor. I believe there is an algorithm that shows how much better something needs to be, given a number of inputs, qualifications, etc.; some things need to be twice as good to break into markets (not the case here).
 
This is clearly not meant for gamers. But I am sure those with deep pockets will still go for it just for the sake of having the "fastest" Nvidia GPU. Assuming the specs are true, you will surely need a very high-wattage PSU, or two PSUs with at least one supporting dual 12-pin connectors, as one investment on top of the GPU. In addition, you will likely need to deploy custom cooling to keep the GPU cool, so that's another costly investment.
 

48GB is PURELY for mining... keeping all the data assets OUT of super-slow main RAM and avoiding paging to RAM or the drive.

This is the "IT" card for the next mining rush.
 