Nvidia shares RTX 3000 graphics card details ahead of August 31 unveiling

Shawn Knight

In brief: Nvidia confirmed that it created a new 12-pin power connector to replace the commonly used 8-pin connector. The company said the new connector is smaller than its predecessor but is able to carry more power.

Nvidia on Wednesday published a video in which it explained some of the design philosophies that went into creating the thermal solution for its upcoming Ampere family of GPUs.

As Nvidia thermal architect David Haley recounts, the first law of thermodynamics states that energy cannot be created or destroyed in a closed system.

Thus, in order to get more performance out of a GPU, you need to be able to bring in more power and effectively dissipate the increased heat. To do that, Nvidia’s engineering team used computational fluid dynamics tools to simulate how air flows through a system. From there, they tweaked the PCB design, created a new spring system to attach the cooler to the card, relocated the cooling fans and adjusted the software that controls the fans.
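
To put rough numbers on that idea, here is a quick back-of-envelope sketch in Python. The 320 W board power and 15 °C air temperature rise are purely illustrative assumptions, not figures Nvidia has confirmed.

```python
# Back-of-envelope: airflow needed to carry away a given amount of heat.
# All numbers here are illustrative assumptions, not Nvidia's figures.

AIR_DENSITY = 1.2   # kg/m^3, air at roughly room temperature
AIR_CP = 1005.0     # specific heat of air, J/(kg*K)

def airflow_cfm(heat_watts, delta_t_c):
    """Volumetric airflow (CFM) required to remove heat_watts of heat
    if the air warms by delta_t_c degrees C while passing through the cooler."""
    mass_flow = heat_watts / (AIR_CP * delta_t_c)  # kg/s
    volume_flow = mass_flow / AIR_DENSITY          # m^3/s
    return volume_flow * 2118.88                   # m^3/s -> cubic feet per minute

# Hypothetical 320 W board power, air allowed to warm by 15 C:
print(f"{airflow_cfm(320, 15):.0f} CFM")  # ~37 CFM, before fan and heatsink losses
```

The raw airflow figure is only part of the story, which is where the simulation work comes in: how that air actually moves over the fins and through the case matters just as much.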

The heatsink and fan tweaks are in line with what we’ve seen from leaked images thus far, further bolstering their validity.

At the end of the video, we get a brief preview of what is believed to be the 3080 cooler.

Nvidia is expected to share all during an online presentation scheduled for August 31.


 
"...engineering team used computational fluid dynamics tools to simulate how air flows through a system. From there, they tweaked the PCB design, created a new spring system to attach the cooler to the card, relocated the cooling fans and adjusted the software that controls the fans."

Sounds like they're trying to cool a nuclear reactor.
 
...and on the eighth day GOD said "let there be Thermal Dynamics" ... and there were thermal dynamics, and the world was Gooooooood.... Some millennium we'll all be able to afford them.
 
Oh God, just think about it.
Ampere will be made on a smaller node, right?
But still they had to make cards bigger so they can cool them down.
Also, they had to make a new connector to feed cards with even more power.
Something doesn't add up.
 
Something doesn't add up.
There's nothing suspicious going on here - smaller nodes don't automatically mean lower power consumption/less heat. You have the choice of reducing power, but retaining the previous transistor performance, or doing the opposite: increase performance but at the higher power cost. Generally speaking Nvidia have typically gone down the low-power-big-die route, but with a change to a new node, that's probably not the case here.
 
This level of power was previously unimaginable.

I am truly impressed.

I may even be able to run Microsoft Flight Simulator at max settings in 4K
 
Nvidia is clearly gimping the RAM on these cards. Going from an 8 GB standard to a 10 GB standard instead of 16 GB is probably a decision they made to give people more reason to upgrade again before the PS5 era ends. Since PC exclusives don't happen anymore, a gaming PC only needs to be as good as a console, but if they don't make any cards with at least as much VRAM as the consoles will have, these cards won't run certain games well. Nvidia is playing a shitty game here and I hope people see it for what it is and skip these cards. The 1080 Ti has 11 GB but the 3080 will have less? Clearly planned obsolescence.
 
Ok, so new connector... from where? inside the box, provided with the card? or....?

According to rumors, an adapter will be included. I have also heard Nvidia did not consult any of the PSU companies about the new connector, so no power supply will support the new connector, at least at launch.

Nvidia is clearly gimping the RAM on these cards. Going from an 8 GB standard to a 10 GB standard instead of 16 GB is probably a decision they made to give people more reason to upgrade again before the PS5 era ends. Since PC exclusives don't happen anymore, a gaming PC only needs to be as good as a console, but if they don't make any cards with at least as much VRAM as the consoles will have, these cards won't run certain games well. Nvidia is playing a shitty game here and I hope people see it for what it is and skip these cards. The 1080 Ti has 11 GB but the 3080 will have less? Clearly planned obsolescence.

Yep, I imagine that it won't make an immediate impact, as RAM doesn't become an issue until you exceed capacity by 40%, but when it does, it'll be a stuttery mess. We've had games that can use 12 GB of RAM since 2016.

More power, more heat and a bigger card. If AMD did this the Internet would lose its collective mind. Suddenly since it's Nvidia doing it everything is A OK lol!

Yep, it's the reason Fermi still sold well despite being extremely hot. Nvidia just has a massive brand power advantage.
 
Going from an 8 GB standard to a 10 GB standard instead of 16 GB is probably a decision they made to give people more reason to upgrade again before the PS5 era ends.
10 GB means 10 memory controllers, so the RAM configuration can only be 10 or 20 GB. Assuming all the various rumours are true, the full GA102 chip has 12 controllers, so any card based on that will be 12 or 24 GB.
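
A quick sketch of that arithmetic (Python; the one-chip-per-32-bit-controller layout and the 1 GB / 2 GB GDDR6X densities follow the rumours, not confirmed specs):

```python
# Why a 10-controller GPU ends up at 10 or 20 GB: one GDDR6X chip per 32-bit
# controller, and chips only come in fixed densities. Figures follow the
# rumours, not confirmed specs.

def possible_vram(controllers, densities_gb=(1, 2)):
    """VRAM capacities possible with one memory chip per controller."""
    return [controllers * d for d in densities_gb]

print(possible_vram(10))  # [10, 20] -> the rumoured 3080 configuration
print(possible_vram(12))  # [12, 24] -> a full GA102 configuration
```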
 
There's nothing suspicious going on here - smaller nodes don't automatically mean lower power consumption/less heat. You have the choice of reducing power, but retaining the previous transistor performance, or doing the opposite: increase performance but at the higher power cost. Generally speaking Nvidia have typically gone down the low-power-big-die route, but with a change to a new node, that's probably not the case here.
If it's a smaller node, you can squeeze out more performance at the same power cost or do the opposite. This time both the power consumption and performance go up. Looks like Ampere is a bad design to me. I hope they just decided to go for a big jump in performance.
 
This time both the power consumption and performance go up.
Performance has gone up mostly through component scaling. There doesn’t seem to be much, if any, increase in the clocks - time will tell on that one, of course. Consumer grade Ampere may well be Nvidia’s FX moment again but even if it’s not, I suspect many people’s expectations are not going to be met.

Reading around the web shows that too many people are hoping for huge performance gains, in the region of 50% or more. That's just not going to happen for these products - the chip required would be the size of the GA100. If the likes of the 3090 are, all things considered, 30% better than the 2080 Ti, then despite the decent improvement (how often does one see a CPU improve by 30% with each generation?), Nvidia is likely to get panned for it.
 
If it's a smaller node, you can squeeze out more performance at the same power cost or do the opposite. This time both the power consumption and performance go up. Looks like Ampere is a bad design to me. I hope they just decided to go for a big jump in performance.


I've been watching power connectors go up in size since I got my n6600, which had a 4-pin Molex connector. Then my X1900 XT had a single 6-pin, as did my 8800 GT. I don't remember what I had next, but my GTX 580 had two 6-pin connectors, and my 1070 Ti has a single 8-pin. Just did a build for my buddy; his 2080 Ti had two 8-pin connectors. Now we are getting 12-pin connectors. When are cards going to have 2x 12-pin?
 
Just did a build for my buddy; his 2080 Ti had two 8-pin connectors. Now we are getting 12-pin connectors.
One 12-pin is fewer pins than two 8-pins? The latter, including the PCIe slot's power, is good for 375 W - I would be quite surprised if more than that is needed just yet.
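
The arithmetic behind that 375 W figure, using the rated per-connector budgets from the PCIe spec (a quick Python sketch):

```python
# Rated power budgets per the PCIe specification.
POWER_BUDGET_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_limit(*aux_connectors):
    """Total rated board power: the PCIe slot plus the listed auxiliary connectors."""
    return POWER_BUDGET_W["slot"] + sum(POWER_BUDGET_W[c] for c in aux_connectors)

print(board_power_limit("8-pin", "8-pin"))  # 375 W, the figure quoted above
print(board_power_limit("6-pin", "6-pin"))  # 225 W
```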
 
For efficiency I only bought a 650 watt PSU with a titanium rating. Seeing how Seasonic recommends at least 850 watts for even the lower end cards, I will not be buying this card. We will see if the more entry-level cards will be worth getting.
 
One 12-pin is fewer pins than two 8-pins? The latter, including the PCIe slot's power, is good for 375 W - I would be quite surprised if more than that is needed just yet.
I should have said it in my post, looking back on it, but my point there was that most people already have 2x 8-pin on their PSU; they could have just used that. Now 1x 12-pin will start having its own port on PSUs - we're adding complexity for no reason. I bet the 12-pin adapter will likely be a 2x 6-pin-to-12-pin anyway, since most 8-pins are actually 6+2-pin connectors. Like, thanks Nvidia. Adapter or not, when this connector starts appearing on PSUs in a year or two it's going to start causing legacy issues. Very few cards are going to use it - likely only the highest end. The high end is also the smallest market for these cards.

I just don't understand what the point is. 2x6pin has more power than 1x8 pin. There is no point to this, it's annoying and will cause some legacy issues. I don't see it being very many, but they'll be there.
 
Performance has gone up mostly through component scaling. There doesn’t seem to be much, if any, increase in the clocks - time will tell on that one, of course. Consumer grade Ampere may well be Nvidia’s FX moment again but even if it’s not, I suspect many people’s expectations are not going to be met.

Reading around the web shows that too many people are hoping for huge performance gains, in the region of 50% or more. That's just not going to happen for these products - the chip required would be the size of the GA100. If the likes of the 3090 are, all things considered, 30% better than the 2080 Ti, then despite the decent improvement (how often does one see a CPU improve by 30% with each generation?), Nvidia is likely to get panned for it.

Disappointing if it turns out to be true. I think a lot of people would be fine with smaller gains if the prices weren't so crazy right now but Nvidia set that one up themselves.

On CPU performance improvements, that depends which category you are looking at. If you look at multi-threaded, AMD doubled core counts twice over 3 generations and performance doubled each time the core count was doubled (sometimes more than doubled due to AMD's SMT) in applications that can take advantage of it. Of course there are applications that take time to adapt to use the extra cores; after all, the market was stagnant for 10 years on 4 cores. We are seeing more and more games use 6, 8, or in some cases even more. I think there are a lot of things game devs can do now with that extra CPU horsepower. I would say as the push for multi-core performance continues, it should definitely be feasible to see large performance gains between generations. Who knows, it might even stem over-reliance on the GPU for tasks that can be done on the CPU and stop the ever increasing die size of GPUs.
 
These are more of a creator's card than a gaming one. Given the size and new node, expect these to be in limited quantity, and game devs and other workstation types will be the ones buying these Ampere cards up on the cheap (i.e. $1,699), instead of paying $3k+...

As such, Nvidia will market these as dual-role cards, like the Radeon VII was marketed. Expect Nvidia to offer big discounts on the full GA102 die...!

A win/win for those types who will use the card's duality.
 
I just don't understand what the point is. 2x6pin has more power than 1x8 pin. There is no point to this, it's annoying and will cause some legacy issues. I don't see it being very many, but they'll be there.
The purpose of using the new 12 pin connector is explained a little in the video -- it's about reducing the footprint of the power connectors, I believe.
 
Smaller nodes don't automatically mean lower power consumption/less heat. You have the choice of reducing power, but retaining the previous transistor performance, or doing the opposite: increase performance but at the higher power cost.
Often, a new node means both. If you increase both performance and power draw at the same rate, the ratio doesn't improve ... and if a new node doesn't drastically improve the performance/power ratio, it's generally considered a failure.
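
A trivial illustration with made-up numbers:

```python
# Made-up numbers: if performance and power both rise by the same 30%,
# performance-per-watt does not move at all.
old_perf, old_power = 100, 250   # arbitrary baseline units
new_perf, new_power = 130, 325   # both up 30%

print(old_perf / old_power)  # 0.4
print(new_perf / new_power)  # 0.4 -> no efficiency gain despite more performance
```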

And Nvidia may have used computational fluid dynamics to better simulate how to shed heat from the card ... but that's not going to help the heat escape the chassis of an end-user's computer. If the 3090 really does have a 350 W TDP, that's a lot of waste heat for case fans to remove...
 
The purpose of using the new 12 pin connector is explained a little in the video - it's about reducing the footprint of the power connectors, I believe.
The graphics card looks like it takes up 3 slots and is far taller than previous cards, so footprint is a non-issue. 6-pin and 8-pin connectors are already smaller than the single-slot coolers they were designed to fit into, so why is footprint even an issue?
 