Are these the first photos of a quad-slot, 800W Nvidia Titan RTX Ada?

midian182

Rumor mill: When it comes to performance, size, and price, the RTX 4090 is a beast of a graphics card, but an even more monstrous product appears to be coming to the Ada Lovelace line. Alleged images of what is probably a Titan RTX Ada, though it could be an RTX 4090 Ti, have leaked, showing a massive quad-slot design and four display connectors on the I/O bracket aligned vertically rather than in the usual horizontal arrangement.

Assuming they're the real deal, the photos from hardware leaker MEGAsizeGPU confirm that the most powerful Lovelace card will use an Nvidia design quite different from anything we've seen before. They suggest the PCB is rotated 90 degrees from the usual orientation, sitting parallel to the motherboard when the card is installed.

We can also see an exhaust grille so large that it covers two PCI slots, which could be why Nvidia stacked the three DisplayPort and one HDMI connectors vertically instead of horizontally.

There has been debate over whether this new card will be an RTX 4090 Ti or a Titan RTX Ada. Given the gold theme, it seems we're looking at the latter. There are more hints on the I/O bracket's sticker: the PG137 board number, which kopite7kimi revealed last July in a leak about a flagship Lovelace card with 48GB of GDDR6X, 18,176 CUDA cores, and an 800W TDP, earning it the fitting nickname of 'the beast.' For reference, the RTX 4090's board number is PG139, and it has 16,384 CUDA cores, meaning the rumored card packs roughly 11% more.

Elsewhere, the Titan is said to be the first Nvidia card to feature a triple-fan reference cooler, and it will draw its 800W TBP via dual 16-pin connectors (each rated for up to 600W), which sounds concerning given how much some people have struggled with the RTX 4090's adapters.

Other than the gold theme, it's the rumored 48GB of 24 Gbps GDDR6X memory that suggests this card is a Titan RTX Ada, as the brand has long been aimed chiefly at creators and the scientific community; the RTX 6000 Ada also has 48GB of memory, albeit slower 20 Gbps GDDR6. Further evidence comes from Micron's announcement last year that it was producing 24 Gbps modules, which have yet to appear in an Nvidia card.
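
For a sense of scale, here's a quick back-of-the-envelope bandwidth calculation, assuming (and this is only an assumption, since no bus width has leaked) that the card pairs those 24 Gbps modules with the same 384-bit bus as the RTX 4090's AD102:

```python
# Theoretical peak memory bandwidth = per-pin data rate x bus width / 8 bits per byte
data_rate_gbps = 24     # rumored 24 Gbps GDDR6X modules
bus_width_bits = 384    # assumed, borrowed from the RTX 4090 (AD102)

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 1152 GB/s, vs ~1008 GB/s on the RTX 4090
```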

There were rumors (emphasis on that word) back in October that Nvidia had canceled the Titan RTX Ada because it was tripping breakers, melting PSUs, and occasionally even melting entirely. It could be that the company has since solved these issues and the card is on its way to becoming a purchasable, and no doubt very expensive, product.


 
*the only acceptable GPUs for gaming are the ones I like*
What makes me angry about this card is that, if it's real, it's going to be ungodly expensive, and that's on top of the 4090 (which is really an 80-series card based on chip size). After everyone flocked to the 4090, nVidia comes out with this cash grab. I'd be furious at nVidia if I had bought a 4090.

That said, the 4090 is a beast of a card and there really isn't anything that can max it out. A Titan-class card seems kind of pointless for gaming. But it also seems pointless for compute workloads because they want to paywall them behind the Quadro series.

Nobody is happy with nVidia's business practices right now, but I'll give them this: they know how to make money.
 
But it also seems pointless for compute workloads because they want to paywall them behind the Quadro series.
AI workloads would suit this card perfectly and don’t require anything firmware locked to a Quadro.
 
AI workloads would suit this card perfectly and don’t require anything firmware locked to a Quadro.
I can maybe see an AI researcher having one of these in some type of HEDT, but AI research is more suited to something rack-mounted.

Without a price, or even confirmation that this is real, it's pointless to speculate.
 
It's great to see graphics cards coming out like this, when the tech becomes obsolete they can be repurposed as boat anchors.
 
How to turn your computer into a heater for your room. 800W TDP, wow. This is doable in the UK, but from what I gather, in the US this type of load plus the rest of a computer could trip a breaker.
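
A rough sketch of the breaker math behind that worry, assuming a typical US 120V/15A circuit and hypothetical draws for everything besides the GPU:

```python
# US residential circuits are commonly 120V at 15A; the NEC treats loads
# running 3+ hours as "continuous" and limits them to 80% of capacity.
volts, amps = 120, 15
circuit_watts = volts * amps            # 1800W absolute ceiling
continuous_watts = circuit_watts * 0.8  # 1440W safe continuous budget

gpu_watts = 800          # rumored Titan RTX Ada board power
rest_of_system = 400     # hypothetical: CPU, drives, fans, PSU losses
monitor_and_misc = 150   # hypothetical: display and peripherals

total = gpu_watts + rest_of_system + monitor_and_misc
print(f"{total}W of a {continuous_watts:.0f}W continuous budget")  # 1350W of 1440W
```

That leaves only about 90W of headroom for anything else on the same circuit, while a UK 230V/13A fused plug allows roughly 3kW.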
 
The room where I keep my PC is kind of cold, so if I buy this I'm getting a heater and a GPU... and in that order too. Hopefully they'll make a 220V version so that it can heat the room a little more efficiently. How cool would it be if the cooler glowed red, and not from the LED lights, but from the 800 watts of power?
 
We need a new PC case standard for this: the NVTX. Based on how many Nvidia cards it can fit, they'd be named NVTX1 to NVTX12.

Something like this: [image]


How cool would it be if the cooler glowed red, and not from the LED lights, but from the 800 watts of power?
They should also make green and blue versions for RGB fans.
 
It's not such an unusual heatsink if the thing has a board power of 800W.



Just another engineering marvel. I like it, though.
 
I've said this before, but nVidia does seem to be going down the 3DFX rabbit hole, where you just stick more cooling and bigger and bigger heatsinks and fans on the same product and then just continue to crank it up. It's noisy, uses ridiculous amounts of power, it's expensive, and it takes up half the house, but it does run fast. Barn-door engineering.
 
I've said this before, but nVidia does seem to be going down the 3DFX rabbit hole, where you just stick more cooling and bigger and bigger heatsinks and fans on the same product and then just continue to crank it up. It's noisy, uses ridiculous amounts of power, it's expensive, and it takes up half the house, but it does run fast. Barn-door engineering.
Not quite there yet; waiting for the 1.6kW dual-chip or 3.2kW quad-chip 4090 Ti.

At least those guys got the power delivery connectors right.
 
This is fake; no way we get a quad-slot GPU. It's almost as big as a PC case itself.

The photos don't seem render-like at all to me, and the manufacturing is way too complex for someone doing a gag, so ehh, it's probably a real card. Whether it gets released, who knows.
 
Remember the days when they would increase performance while simultaneously increasing power efficiency? Pepperidge Farm remembers.
 
Remember the days when they would increase performance while simultaneously increasing power efficiency? Pepperidge Farm remembers.
That still happens today; perf/watt has improved dramatically every gen. If by "power efficiency" you mean "reduce TDP", then no, I don't remember that; every gen of GPUs has chased higher power usage as nodes are pushed to the limit.
 
Nvidia, when will you learn that it is going to make A LOT more sense to water cool something with this extreme of a TDP? Look at how big that is! Just put a rad on it and be done with it!
 
That still happens today; perf/watt has improved dramatically every gen. If by "power efficiency" you mean "reduce TDP", then no, I don't remember that; every gen of GPUs has chased higher power usage as nodes are pushed to the limit.
At least in the days of the GeForce 400 to 900 series, many cards of the same class would stay in the 200-250W TDP range generation to generation. So yes, increased performance at the same TDP.

 
There's a Kickstarter initiative to produce a special edition for those cards. It will come with 4 specially designed pipes that attach to 2 special shoes and 2 special gloves, enabling heating of your feet and hands for winter gaming. As the experts predict the winter of 2023/2024 to be extremely cold, this combo may become very popular.
 
At least in the days of the GeForce 400 to 900 series, many cards of the same class would stay in the 200-250W TDP range generation to generation. So yes, increased performance at the same TDP.

The 400-900 series was also the weirdest time for GPUs. For the decade preceding that, GPUs continuously pushed higher power envelopes. The GeForce 2 Ti was entirely slot-powered. The GeForce4 Tis needed Molex for the high-end card, then both the mid-range and high-end FX did, etc. The 8800 Ultra normalized dual 6-pin connectors, the GTX 280 a 6+8-pin setup. With all these steps, the power use of low-end cards also went up.

Fermi pushed the envelope too far, and we had to wait a few generations for cooler design to catch up. Kepler gen 2 pushed the envelope once again with the 780 Ti, then again with the 1080 Ti, then the 2080 Ti, and so on.

There's still plenty of improvement in perf/watt. If you want to stay in the same TDP range, you have plenty of choices. You can also apply a lower TDP and, for minimal performance loss, dramatically increase perf/watt if you want to. The fact that we can now push even larger GPUs than ever before hasn't changed that. The names are just different, that's all.
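
To put that power-capping point in numbers, here's a minimal sketch with hypothetical figures (the fps and wattage values are illustrative, not measured):

```python
# Illustrative only: high-end cards typically lose just a few percent of
# performance when power-capped well below stock.
stock  = {"fps": 100.0, "watts": 450.0}  # stock power limit
capped = {"fps": 95.0,  "watts": 350.0}  # assumed ~5% fps loss at a 350W cap

eff_stock  = stock["fps"] / stock["watts"]    # ~0.222 fps/W
eff_capped = capped["fps"] / capped["watts"]  # ~0.271 fps/W
print(f"perf/watt gain: {eff_capped / eff_stock - 1:.0%}")  # ~22%
```

On Nvidia cards a cap like that can be applied with nvidia-smi's power-limit option (nvidia-smi -pl <watts>), no firmware changes needed.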
 