Nvidia RTX Titan Ada reportedly canceled after it melted PSUs, tripped breakers

midian182

Rumor mill: The God-level RTX 4090 Ti or Lovelace Titan many have been expecting has reportedly been put on hold by Nvidia because it was tripping breakers, melting PSUs and occasionally dissolving. It's believed that when (if) it does arrive, the card might be one of the first to pack GDDR7 memory.

With its 450W TDP, $1,600 MSRP, and 24GB of GDDR6X clocked at 21 Gbps, the GeForce RTX 4090 is a monster in every respect. However, we'd heard plenty of rumors before Lovelace's launch of a flagship with 48GB of memory at 24 Gbps and a TGP of 900W.

Update: Our GeForce RTX 4090 review is now live. Is it a power hog? Let's find out...

According to Moore's Law is Dead, which can admittedly be a bit hit-and-miss with its claims, this card is the Titan Ada. The channel cites a source that has supposedly proved "more than legit" over the last six months, but as always, take it with a grain of salt.

The source adds that the Titan Ada never had an 800W – 1,000W TGP like many claimed but was around 600W – 700W in testing. It was supposedly so big that the motherboard was usually mounted to the side of the card, as opposed to the card being seated in the motherboard. It's also said to require two 16-pin power connectors, be four slots thick, and pack the full-fat AD102 GPU rather than the cut-down version found in the RTX 4090.

But it appears that the Titan Ada was simply too much card to handle, tripping breakers, breaking PSUs, and melting. As such, the project has been canceled -- at least for now.

It's only a theory, but Nvidia could release a Titan Ada card in the future. The company might be waiting for 27 Gbps memory or GDDR7 to arrive first, allowing more throughput without increasing power consumption. The Titans have traditionally been aimed more at creators, researchers, and enterprises, so even with those power demands and what will doubtlessly be a comically high price, it'll likely still sell plenty of units.

In other graphics card news, we've just seen the Radeon RX 6900 XT, which can match the best of Nvidia's RTX 30 series, drop below $700.


 
"The company might be waiting for 27 Gbps memory or GDDR7 to arrive first, allowing more CUDA cores without increasing power consumption."

I am unclear on how this would allow more CUDA cores without increasing power consumption. Is it chip power consumption, or board power consumption? Is GDDR7 less power hungry than GDDR6?
 
Hmm, this is interesting. Because the core count is already so high, even 2,100 more cores is only a 13% increase. If you slow the core down by even just a couple of hundred MHz to keep the 4090 Ti (18,400 cores) at the same power level as the 4090 (16,300 cores), you get nearly identical TFLOPS ratings. I thought 300 MHz wouldn't matter that much when you're adding 2,000 cores, but it does with a core count this high. So yeah, you're going to need more power, not just more cores, to really see a big leap for a 4090 Ti. I bet in this scenario the 4090 Ti would still outperform the 4090, but not by much.

16,300 cores × 2 × 2.5 GHz = 81.5 TFLOPS
18,400 cores × 2 × 2.2 GHz ≈ 81.0 TFLOPS
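
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same back-of-the-envelope math (the core counts and clocks are the poster's rounded figures, not confirmed specs):

```python
# Theoretical FP32 throughput: cores * 2 ops per clock (FMA) * clock in GHz.
# The core counts and clocks below are the poster's rounded guesses, not official specs.

def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = cores * 2 (fused multiply-add) * clock in GHz / 1000."""
    return cuda_cores * 2 * clock_ghz / 1000

print(fp32_tflops(16_300, 2.5))  # ~81.5 TFLOPS -- "RTX 4090" at the poster's numbers
print(fp32_tflops(18_400, 2.2))  # ~81.0 TFLOPS -- full-die card clocked ~300 MHz lower
```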
 
I guess my days of wanting/needing the best are long gone. And even if I still had that mindset I'm not going to have a 600w+ video card. I bought a 3060 Ti specifically because of the wattage/price to performance, I honestly didn't want anything above it. But that's just me, I can understand that I'm not the hardcore audience this is intended for.
 
I guess my days of wanting/needing the best are long gone. And even if I still had that mindset I'm not going to have a 600w+ video card. I bought a 3060 Ti specifically because of the wattage/price to performance, I honestly didn't want anything above it. But that's just me, I can understand that I'm not the hardcore audience this is intended for.
Strongly depends on the screen you're using. I have a 38" ultrawide and I need something stronger to run it. Currently you can get a relatively cheap 6900 XT, and that can run a 4K display without too crazy an investment. And because I work on my PC, an ultrawide monitor is so much better than 1080p...
 
Frankly, I don't see the problem here. :confused:

All those uber gaming addicts (you know who you are) have to do is these simple things:

1: Go back to full tower cases, but newly designed models with provisions for two (2) PSUs, perhaps a 1200 watt for the VGA, and an 800 watt for the system.
2: Most PSUs (all?) offer an operational voltage of 110 to 220 VAC.
3: Lobby the electrical standards board to provide 10 gauge 220 volt extension cords with two (2) female 120 volt outlets for the PC.
4: Be patient with Nvidia while they develop a housing for this monster with four (4) 110 CFM cooling fans. (Or replace the air cooling with a triple-fan AIO liquid cooler, with its own separate PSU.)

Then when you're ready to "game your a**es off", simply unplug your electric clothes dryer, and plug the rig into that outlet.

After these "simple steps" have been completed, you can kick back, and when anyone questions your ultimate rig's abilities, say, "yeah, m*****r f****r, it runs Crysis at 1,000 FPS." :mad:
 
Full-fat versions of the AD102 will obviously exist (that's just the way chip binning works), so Nvidia and its board test partners would have certainly made a 'Titan' version, just as part of their standard product development cycle.

If it was 600-700W in terms of total board power, and used two 12VHPWR cables, it categorically wouldn't have been breaking PSUs. Seasonic has a 1600W PSU, the Prime TX, that comes with two 12VHPWR cables, and a sustained +12V current delivery of 108 to 133A. Is Moore's Law is Dead suggesting that this would blow up? A 700W graphics card, using both 12-pin connectors, would draw 58A (roughly 325W/27A per power connector, the rest via the PCIe slot) leaving plenty for the rest of the test PC and well within the limits of the PSU.
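
To make that current math concrete, here is a rough Python sketch of the same calculation (it assumes the PCIe spec's 75 W slot ceiling, whereas the figures above use a slightly smaller slot share, hence their ~27 A per connector):

```python
# Rough sanity check of the current-draw claim above. The 700 W board power is the
# rumored Titan Ada figure; 75 W is the maximum the x16 slot can deliver per the spec.

BOARD_POWER_W = 700      # rumored total board power
PCIE_SLOT_W = 75         # maximum power the PCIe x16 slot can supply
CONNECTORS = 2           # two 16-pin 12VHPWR connectors
RAIL_V = 12.0            # nominal +12 V rail

cable_power_w = BOARD_POWER_W - PCIE_SLOT_W      # 625 W carried by the cables
per_connector_w = cable_power_w / CONNECTORS     # ~313 W each (12VHPWR is rated for 600 W)
per_connector_a = per_connector_w / RAIL_V       # ~26 A each
total_12v_a = BOARD_POWER_W / RAIL_V             # ~58 A total on +12 V

print(f"{per_connector_w:.0f} W / {per_connector_a:.1f} A per connector; "
      f"{total_12v_a:.1f} A total on +12 V")
```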

Not that this is the first ultra-high-power card/model that Nvidia has developed. Their recent H100 has a TDP of 700W, although it does use the SXM5 module for connectivity and power (thus designed for rackmounts only). Servers do have different PSUs to desktop PCs but they're not magically more powerful.

Their old desktop Quadro Plex units used standard PSUs and required around 600W or so, and this was back in 2008. The GTX Titan Z from 2014 was a dual GPU card, with a TDP of 375W, and could be run in SLI, albeit in a very custom setup - no blowing up of anything. And high power isn't exclusive to Nvidia. AMD's 2015 Radeon R9 390 X2, another dual GPU card, had a TDP of 580W and it didn't go around tripping PSUs of that time, either.

With 76.3b transistors, the AD102 is the second-largest single-die GPU ever made (the top slot is taken by Nvidia's GH100 at 80b and that's not for general use). Even using TSMC's tweaked N4 process node, it's going to require a lot of power -- 450W for the 4090 is clearly a lot, but that's the same for the 3090 Ti and it's 130W less than the 390 X2.

One can successfully argue that it didn't need to be that high (i.e. Nvidia could have clocked the 4090 lower and it still would perform very well), but stories about the full chip blowing things up are just silly.
 
Full-fat versions of the AD102 will obviously exist (that's just the way chip binning works), so Nvidia and...

Thank you for saying something rational and intelligent in this whirlpool of intellectual mediocrity.

As for this so-called article, quoting "Moore's Law Is Dead", a well-known AMD fangirl from the cesspools of YouTube, about anything GPU related is not a very wise thing to do. The guy is NOT a trained technician, he is not a professional, he is NOT part of the GPU industry. What he is, is a guy with a loud mouth and a YT channel, who really likes tech stuff, worships AMD and probably has regular wet dreams about Lisa Su (which must certainly be an improvement over Raja Koduri). Stop quoting these YouTube buffoons, because by doing so, you only hurt your own credibility.
 
Stop quoting these YouTube buffoons, because by doing so, you only hurt your own credibility.
I personally see nothing wrong with letting people know what others are claiming, in the world of computing, et al. It is all of interest to me, simply because it generates discussion. The backgrounds of the claimants and their choice of platforms are irrelevant.
 
The more opinions the better; it's up to us to select valid and trustworthy sources.
P.S. I can't watch MLID because it makes me sleepy.
 
This is a good idea. Hopefully they get to this idea too, release SLI again, and work it so that it is software transparent/agnostic.
I miss how cool it was having two graphics cards in a PC case. CrossFire and SLI had a frame stuttering issue that would need to be resolved. They also need a link where the cards pool their RAM together instead of each card using its own RAM. I don't know if NVLink has solved this issue or not.

But with all the progress we've made with multi-chip processing, I would think that solving the stuttering issue would be easy.

Heck, maybe we could have a card dedicated to Ray tracing? I don't know too much about this kind of thing but it's cool to think about.
 
Well, I saw the problem right here:
[Image: titan-collectors-edition-p_1100.webp]

That looks remarkably like a Romulan Disruptor Bank mounted on the ventral side of the saucer section of a Starfleet hull. Starfleet phasers are rated at 5.1MW and Romulan Disruptors are pretty much on par with them. It's no surprise that it's melting everything! ;):joy:
 