New leaks reveal Nvidia RTX 5090's massive PCB and full specs for 5070 Ti, 5070

Daniel Sims

In context: Nvidia is expected to unveil the next-generation RTX 5000 graphics cards next month, but leaks have uncovered most of the new lineup's technical details. The latest reports offer a close-up of the flagship GPU and answer lingering questions regarding the mid-range options.

Leakers on the Chiphell forums recently provided two close-up images of the upcoming GeForce RTX 5090. Meanwhile, established tipster @kopite7kimi has provided near-complete specs for the 5070 Ti and 5070.

The first RTX 5090 photo, which appears to show a bare custom PNY variant, confirms prior rumors that the card will be big. A PCIe 5.0 connector, a single 12V-2x6 power connector, 16 memory modules, and over 40 capacitors are visible. Another image shows the flagship card with its components installed, including its GB202-300-A1 GPU, said to be Nvidia's largest consumer GPU die in six years at 24mm x 31mm.

Based on prior reports, the RTX 5090 should substantially outperform the 4090, packing 32 GB of GDDR7 VRAM running at 28 Gbps on a 512-bit bus and 21,760 CUDA cores. While its TDP is rumored to be 600W, the final product's power draw might be considerably lower.
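For context, the rumored memory bandwidth follows directly from those two figures. Here is a back-of-the-envelope sketch in Python; keep in mind the 28 Gbps and 512-bit numbers are still leaks, not confirmed specs:

```python
# Peak memory bandwidth from the rumored RTX 5090 figures.
# bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte
data_rate_gbps = 28    # rumored GDDR7 speed
bus_width_bits = 512   # rumored bus width

bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
print(f"~{bandwidth_gbs:.0f} GB/s")  # ~1792 GB/s
```

The same arithmetic on the RTX 4090's shipping specs (21 Gbps on a 384-bit bus) gives roughly 1008 GB/s, so the rumored card would offer well over 1.7x the memory bandwidth.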

Meanwhile, two recent Twitter posts from Kopite corroborate prior leaks regarding the RTX 5070 Ti and 5070. Notably, the 5070's 6,144 CUDA core count confirms that it is slightly cut down from the GB205's maximum of 6,400. The GPU features 12 GB of VRAM on a 192-bit bus and draws 250W – a slight increase over the 4070.

Customers who are disappointed with the RTX 5070's modest VRAM pool but unwilling to pay the 5080's likely quadruple-digit price tag should consider the 5070 Ti. This card boosts memory to 16 GB while drawing 300W with 8,960 CUDA cores. All RTX 5000 graphics cards are equipped with GDDR7 VRAM.

Clock speeds and prices are the only critical details that remain unknown. The latter likely won't leak before Nvidia's expected unveiling during its January 6 CES keynote, but the new images of the RTX 5090's monstrous PCB will likely stoke fears that it will break the $2,000 mark.

The upcoming Radeon RX 9000 series from AMD is also expected to debut at the trade show. Together with Intel's recently launched Arc Battlemage lineup, it could pose a serious challenge to the RTX 5060, which Nvidia might introduce later in the first quarter of 2025.

 
Look again at the PCB, there's nothing "massive" about it. Maybe the heatsink will be massive.
The PCB in the picture is no longer than 7 inches (~18 cm). That was the standard PCB length before external power connectors were introduced.
Expect some PCB cracks from the weight of the cooler if it isn't reinforced correctly on some AIB cards.
600W TGP = VRAM + GPU. And looking at the 20+3 VRM design in such a small footprint, the cooler will be impressive.
 
Look again at the PCB, there's nothing "massive" about it. Maybe the heatsink will be massive.
The PCB in the picture is no longer than 7 inches (~18 cm). That was the standard PCB length before external power connectors were introduced.
Expect some PCB cracks from the weight of the cooler if it isn't reinforced correctly on some AIB cards.
600W TGP = VRAM + GPU. And looking at the 20+3 VRM design in such a small footprint, the cooler will be impressive.
Considering their connector is rated at a max of 600 W, I think they'll need to make sure it stays well below that. 600 consistent watts on that plug will be dangerous; as we've already seen, it has definite design flaws.
 
Look again at the PCB, there's nothing "massive" about it. Maybe the heatsink will be massive.
The PCB in the picture is no longer than 7 inches (~18 cm). That was the standard PCB length before external power connectors were introduced.
Expect some PCB cracks from the weight of the cooler if it isn't reinforced correctly
Well, the PCB on my GTX 1050 Ti (twin fan) tapes out to 10" (254 mm). That will fit in pretty much any ATX or mATX case.
In the same position as the photo, my hand, from tip of thumb to just before the wrist, is maybe 5 1/2".
Granted, there are two PCBs in the first photo, but the photo is cut off at the capacitors on the right.
If nothing else, the board looks fairly deep. You can gauge the width by the length of the PCI-e connector. The top photo's scale is somewhere close to 1:1.
 
Considering their connector is rated at a max of 600 W, I think they'll need to make sure it stays well below that. 600 consistent watts on that plug will be dangerous; as we've already seen, it has definite design flaws.
There's nothing dangerous about using a part within its specification. That's the point.
 
Look again at the PCB, there's nothing "massive" about it. Maybe the heatsink will be massive.
The PCB in the picture is no longer than 7 inches (~18 cm). That was the standard PCB length before external power connectors were introduced.
Expect some PCB cracks from the weight of the cooler if it isn't reinforced correctly on some AIB cards.
600W TGP = VRAM + GPU. And looking at the 20+3 VRM design in such a small footprint, the cooler will be impressive.

-The board's y-axis is pretty standard, but we don't know its z-depth (which will be determined by the cooler, but will almost certainly be a 3- or 3.5-slot monstrosity) or its x-length, because the picture cuts off on the right-hand side (though at the end of the day it won't matter, because it's only going to come with massive 13" coolers anyway).
 
There's nothing dangerous about using a part within its specification. That's the point.
Aside from the fact that the connector melts well below its specified limits. So at best, the specifications are inaccurate and, at worst, a complete lie. Just because the melting connector issue is no longer in the news does not mean it has gone away. Repairing a melted power connector is the most common repair done on 4090s. The second is repairing the PCIe x16 connector, because the weight of the card causes it to crack in the slot if not given secondary support beyond the mounting screws in the case.
 
Aside from the fact that the connector melts well below its specified limits. So at best, the specifications are inaccurate and, at worst, a complete lie. Just because the melting connector issue is no longer in the news does not mean it has gone away. Repairing a melted power connector is the most common repair done on 4090s. The second is repairing the PCIe x16 connector, because the weight of the card causes it to crack in the slot if not given secondary support beyond the mounting screws in the case.
A bad specification, or a part that doesn't meet the specification, doesn't invalidate my post. In this particular case, there's not much difference in risk between a 600 W connector delivering 600 W and one delivering 450 W. The flaws stem from separate issues with the design and implementation.
 
A bad specification, or a part that doesn't meet the specification, doesn't invalidate my post. In this particular case, there's not much difference in risk between a 600 W connector delivering 600 W and one delivering 450 W. The flaws stem from separate issues with the design and implementation.
Well, it's more about the connector not being able to meet specifications when made by multiple manufacturers. The problem isn't unique to a single brand or manufacturer, which makes it an engineering problem with the design itself. Since you're bringing up risk: there is significantly more risk pushing 600 watts over it than 450 watts. If we see enough problems to make the news at 450 watts, we'll almost certainly see more at 600 watts. It's why 12-gauge extension cords don't blow breakers when 14-gauge ones do. The difference here is that there is no breaker to shut off the power. Even if the smart features they're adding to the cable DO work, they will throttle the card down to cool the cable. If your $2,000-3,000 GPU has to power down to 300 watts so the connector doesn't melt itself, then I'm going to call that a design flaw and a failure. That's a flat-out bad product. Sure, they might be able to keep the connectors from melting with some electronic trickery, but it'll be at the expense of performance.
 
12GB on the 5070? Please tell me that's a mistake. The 4070 already has 12GB, and that's barely enough for the newest games out. I had to lower Indiana Jones to Very High (from Supreme) because it was capping out my VRAM, which caused micro-stuttering. /facepalm
 
A bad specification, or a part that doesn't meet the specification, doesn't invalidate my post. In this particular case, there's not much difference in risk between a 600 W connector delivering 600 W and one delivering 450 W. The flaws stem from separate issues with the design and implementation.
OK, if a connector is rated at 600 watts, it needs a safety overhead of at least maybe 800 watts, full stop.

The problem is the Molex-type circular pin-and-socket plug. They're simply not designed to handle that much current. After all, 600 watts at 12 volts is >50< (!) amps. The pins bend and become loose, which develops resistance, which in turn causes heat. They work fine for cards up to maybe 250 watts; after that, you're on your own.
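The 50-amp figure above checks out, and splitting it across the connector's pins shows how thin the margins are. A quick sketch (it assumes the 12V-2x6's six 12 V pins share the load perfectly evenly, which worn or partially seated pins in practice do not):

```python
# Current through a 12V-2x6 connector at various board power draws (I = P / V).
# Assumes a perfectly even split across the six 12 V pins; in reality a loose
# pin shifts extra current onto its neighbors, which is the melting scenario.
VOLTS = 12.0
POWER_PINS = 6

for watts in (250, 450, 600):
    total_amps = watts / VOLTS
    per_pin_amps = total_amps / POWER_PINS
    print(f"{watts} W -> {total_amps:.1f} A total, {per_pin_amps:.2f} A per pin")
```

At 600 W, each pin carries over 8 A on average, so even a modest imbalance pushes individual pins toward their limits.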

IMO (which likely doesn't matter), somebody's going to have to come up with an "ATX 4.0" PSU spec that uses blade-and-spring-blade sockets for the VGA, the same as house plugs use.

Making the insulators out of carbon fiber instead of plastic wouldn't hurt either.
 
12GB on the 5070? Please tell me that's a mistake. The 4070 already has 12GB, and that's barely enough for the newest games out. I had to lower Indiana Jones to Very High (from Supreme) because it was capping out my VRAM, which caused micro-stuttering. /facepalm
Life sucks, and then you keep demanding more photorealistic game imaging. Life sucks, and then you want 600 FPS. Life sucks, and then the designers don't bother to optimize the games because you keep wanting them released before they're truly ready. Life sucks because people absolutely "need" to game at 4K.

Life really sucks when you get the $2K bill for the graphics card you "can't live without".

Life sucks in general, period. But your gaming "habit" seems to compound it in multiples.
 
It's why 12-gauge extension cords don't blow breakers when 14-gauge ones do. The difference here is that there is no breaker to shut off the power.
I don't know whether this is intentionally oversimplified or just plain wrong.
The max draw on a 15-amp circuit is 1875 watts. It's the wattage load that blows the breaker, not the gauge of the extension cord. The 1875-watt figure is predicated on the fact that that is all a 14-gauge household circuit can safely carry (125 volts x 15 amps = 1875 watts).
Using the same calculation, I get 2400 watts for 12-gauge wiring at the socket, albeit at only 120 volts.
Now, if you want those 1875 watts 100 feet from the socket, you would likely need a 12-gauge extension cord. At 25 feet, you can likely get away with 14 gauge.
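Those household figures are just P = V x I with the breaker rating. A quick check of the arithmetic (the voltages follow the post above, 125 V vs. 120 V; 14 AWG branch circuits conventionally get 15 A breakers, 12 AWG get 20 A):

```python
# Circuit capacity comes from the breaker rating, not the cord: P = V * I.
circuits = {
    "14 AWG / 15 A breaker": (125, 15),  # volts, amps
    "12 AWG / 20 A breaker": (120, 20),
}

for name, (volts, amps) in circuits.items():
    print(f"{name}: {volts * amps} W max")
```

This reproduces the 1875 W and 2400 W figures quoted above.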

As "watts" equals volts times amps, when the voltage drops, the current draw goes up for the same load, which is what causes power tools to heat up in the presence of lower voltage.

FWIW, solid copper wire will carry more current than stranded wire of the same gauge.

On a side note, you didn't like my "Skyrim X-mas card"?
 
I don't know whether this is intentionally oversimplified or just plain wrong.
The max draw on a 15-amp circuit is 1875 watts. It's the wattage load that blows the breaker, not the gauge of the extension cord. The 1875-watt figure is predicated on the fact that that is all a 14-gauge household circuit can safely carry (125 volts x 15 amps = 1875 watts).
Using the same calculation, I get 2400 watts for 12-gauge wiring at the socket, albeit at only 120 volts.
Now, if you want those 1875 watts 100 feet from the socket, you would likely need a 12-gauge extension cord. At 25 feet, you can likely get away with 14 gauge.

As "watts" equals volts times amps, when the voltage drops, the current draw goes up for the same load, which is what causes power tools to heat up in the presence of lower voltage.

FWIW, solid copper wire will carry more current than stranded wire of the same gauge.

On a side note, you didn't like my "Skyrim X-mas card"?
I simplified it because other people are dumb. And yes, I did get your Xmas card. I haven't "opened" it yet; it's been a busy few days, and I mostly just argue with strangers on the internet to pass the time while hopping between jobs. I planned on looking at it tomorrow; I'm finally off.

Did you get that pillow I sent? It belonged to my uncle.
 
-The board's y-axis is pretty standard, but we don't know its z-depth (which will be determined by the cooler, but will almost certainly be a 3- or 3.5-slot monstrosity) or its x-length, because the picture cuts off on the right-hand side (though at the end of the day it won't matter, because it's only going to come with massive 13" coolers anyway).
As I mentioned earlier, what you're calling the "y-axis" looks deeper than normal as well. If you feel like goofing around, you could use the PCIe x16 socket as a reference to scale the width.

But you're right, I don't know where anybody could come up with the idea that the board will only be "7 inches long" when the photo cuts off at the VRM capacitor line.
 
There's nothing dangerous about using a part within its specification. That's the point.
That's assuming the spec is realistic and that the design is good enough. It's like you're forgetting the fire hazard that was the original 12VHPWR, and the updated connector isn't trouble-free either.
 
Life sucks, and then you keep demanding more photorealistic game imaging. Life sucks, and then you want 600 FPS. Life sucks, and then the designers don't bother to optimize the games because you keep wanting them released before they're truly ready. Life sucks because people absolutely "need" to game at 4K.

Life really sucks when you get the $2K bill for the graphics card you "can't live without".

Life sucks in general, period. But your gaming "habit" seems to compound it in multiples.

Pro tip: if your comment can be used to excuse ANY poor product design, you might just have a bad take.

The real LPT: be wrong or be an *******, but don't be both.
 
Pro tip: if your comment can be used to excuse ANY poor product design, you might just have a bad take.

The real LPT: be wrong or be an *******, but don't be both.
Well, I'm not wrong, but by gamers' standards, I suppose I am an *******. By the same token, though, with 7 posts, I'm not sure you're entirely qualified to give me "pro tips" either.

I don't game myself, but I do spend too much time at GamersNexus, listening to Steve rant. It somehow soothes me.
Half the things that "the real experts" have posted in this thread are f***ing dead wrong. The most salient being "that circuit board is only 7" long". Another being "if a part is rated at 600 watts, it should handle 600 watts". Well, no. If a part is rated at 600 watts, it should be able to handle an 800+ watt transient load (without going up in flames). You know, "safety margin" and all that.

This industry keeps d!cking around with pin-and-socket connectors as VGAs consume more and more power with each generation. By all rights, at 600 watts "stated draw", and still using Molex-style pin-and-socket connectors, the standard should be up somewhere near four 2x4 (8-pin) connectors. And that's assuming you manage to get them all plugged in correctly.

Several things stand out to me regarding gaming mentality and gear "necessity". Since our "muscle cars" have been taken away, and there's no draft or war to supply the bloodthirsty with "recreation", those same personalities have "gone underground", into their houses, to fantasize about those activities through video games.

There's an old saying that goes, "the first sign of being crazy is denying it". That holds true for more things than people might care to admit. The first two things that spring to mind following that logic are cellphones and video gaming. Sic: "The first sign of being addicted is denying it."

So, every generation of games seems to "magically" require more powerful gear to run it. Then come the histrionics: "oh my god, I'll have to upgrade". Then reality sets in: "holy crap, those prices are outrageous".

Well, the first shot of heroin is usually free too. But many of y'all don't know when to (figuratively) "pull the needle out". There are many forms of addiction, and most have an "********* Anonymous" organization to cope with them. I'm wondering if "Gamers Anonymous" will be next on the list.

Now, you can still buy 700- and 900-watt microwave ovens. But at least they put the wattage on the label. Not to mention that they pull that load off an 1875-watt-capable circuit..! While it's likely "a true rumor" that the new 5090 will be rated at 600 watts, it's liable to have transient spikes of up to 1000 watts. IMHO (for whatever that's worth), this new generation of VGAs needs an entirely different connector "solution" than what is being offered at present. I have a couple of suggestions, which would likely be dismissed as "too ugly". But then again, form doesn't dictate functional ability; it's the other way around.

And I'm certainly not arguing that some of what's available in connectors, adapters, and even PSUs isn't garbage. A lot of them are, very much so.
 
Regarding the disclosure of information about soon-to-be-released tech products, why are we still using the word "leaked"? Seriously.
 