Nvidia's RTX Ampere details leaked: powerful and expensive

midian182

Highly anticipated: Nvidia’s Ampere-based RTX graphics cards are arriving later this year, which means we’re being swamped with leaks and rumors. The latest of these reveals some juicy (alleged) details about the three launch cards, including a flagship model that features 24GB of GDDR6X and a 350W TDP. There’s also some news about those recent photographs of the GeForce RTX 3080 Founders Edition.

The report comes from Igor’s Lab, which claims that the leaked images of the RTX 3080 Founders Edition are legitimate. Nvidia is said to be furious about the leak and is conducting an internal investigation to find the culprit. The top suspects are Foxconn and BYD (Build Your Dreams), both of which build the Founders Edition cooler.

Igor Wallossek believes not even Nvidia’s own product and sales manager had knowledge of the new Ampere Founders Edition design. Now that people know what to expect—and not everyone likes what they see—Wallossek says the design is likely to change before release. With Nvidia’s (and AMD’s) next-gen cards rumored to arrive in September, that doesn’t give team green a huge amount of time for a redesign.

Another interesting piece of info involves the cost of manufacturing. It was predicted that this would be higher than Turing’s, and with the FE’s cooling system alone costing up to $150, we can expect the cards to carry a hefty price tag. Additionally, the Founders Edition is reportedly using a completely different PCB from the boards going out to manufacturers such as MSI, which will use a more rectangular shape.

Part | PCB | Chip | Model | Extension | Memory | Interface | TBP | Connectors
SKU10 | PG132 | GA102 | RTX 3090 (Ti/Super)* | NVLink | 24 GB GDDR6X (Double-Sided) | 384-bit | 350 W | 3x DP, HDMI
SKU20 | PG132 | GA102 | RTX 3080 (Ti/Super)* | - | 11 GB GDDR6X* | 352-bit* | 320 W | 3x DP, HDMI
SKU30 | PG132 | GA102 | RTX 3080 | none | 10 GB GDDR6X | 320-bit | 320 W | 3x DP, HDMI

Moving on to the specs, all three launch cards are said to be based on the GA102 GPU, rather than each using a different GPU, as was the case with the RTX 2080 Ti, RTX 2080, and RTX 2070.

The three products will all use GDDR6X memory, with the GeForce RTX 3090 (Ti or Super) featuring a monster 24GB and a 384-bit memory interface, which suggests it could be marketed as a Titan RTX successor.

Even if the RTX 3090 is aimed squarely at enthusiasts with deep pockets, the RTX 3080 (Ti or Super) and RTX 3080 boast 11GB GDDR6X/352-bit and 10GB GDDR6X/320-bit, respectively, with both cards having a 320W TDP.
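Those capacity and bus-width pairings are at least internally consistent. GDDR6-class memory gives each chip a 32-bit channel, so assuming 1GB (8Gb) chips (an assumption, not a confirmed spec), the arithmetic checks out, with the 3090's 24GB requiring the double-sided (clamshell) mounting the leak describes. A quick sketch:

```python
# Sanity check of the leaked memory configurations. Assumes the usual
# GDDR6/GDDR6X layout of one 32-bit channel per chip and 1 GB (8 Gb)
# chips - illustrative figures only, nothing here is confirmed.

GB_PER_CHIP = 1  # assumed 8 Gb (1 GB) GDDR6X chips

def capacity_gb(bus_width_bits: int, double_sided: bool = False) -> int:
    """VRAM implied by a bus width at 32 bits per chip."""
    chips = bus_width_bits // 32
    if double_sided:
        chips *= 2  # clamshell: two chips share each 32-bit channel
    return chips * GB_PER_CHIP

print(capacity_gb(384, double_sided=True))  # 24 -> the 24 GB RTX 3090 rumor
print(capacity_gb(352))                     # 11 -> the 11 GB RTX 3080 (Ti/Super) rumor
print(capacity_gb(320))                     # 10 -> the 10 GB RTX 3080 rumor
```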

As with all leaks and rumors, take this one with a pinch of salt, but it goes without saying that these flagship Ampere cards will have to be beastly to compete with AMD’s Big Navi, and high performance often means high prices.

In other Nvidia news, AMD recently revealed more details about its rival's DGX A100 system, for which it provides two Epyc 7742 processors.


 
Yes, absolutely: 3080Ti or RTX 3000 model, depending on the pricing strategy.

In addition to the YouTube revenue made just from showing everyone how well the card runs the newest games, the card reviews and the card comparisons, or the sheer enjoyment of being able to play your games - never having to worry about your place in benchmarks - it literally pays for itself.

One thing I did learn after the 2080 Ti: make sure you buy the model that has an integrated closed-loop liquid cooler. My 2080 Ti FTW3 Hybrid was definitely cooler than my 2080 Ti Black (dual fan): temperatures during gaming sessions on my Black peaked at 65 degrees Celsius, while the FTW3 stayed below 55 degrees C.
 

350 watt TDP? That doesn’t sound good!
It wouldn't be the first consumer graphics card to have that kind of TDP - some examples:

Radeon RX Vega 64 = 295 W
Radeon VII = 295 W
GeForce GTX 690 = 300 W
Radeon HD 7970 GHz Edition = 300 W
Radeon HD 5950 = 300 W
Radeon RX Vega 64 LC = 345 W
GeForce GTX 490 = 365 W
GeForce GTX 590 = 365 W
GeForce GTX Titan X = 375 W
Radeon HD 7970 X2 = 500 W
Radeon R9 295 X2 = 500 W
Radeon R9 390 X2 = 580 W
 
It wouldn't be the first consumer graphics card to have that kind of TDP - some examples:

Radeon RX Vega 64 = 295 W
Radeon VII = 295 W
GeForce GTX 690 = 300 W
Radeon HD 7970 GHz Edition = 300 W
Radeon HD 5950 = 300 W
Radeon RX Vega 64 LC = 345 W
GeForce GTX 490 = 365 W
GeForce GTX 590 = 365 W
GeForce GTX Titan X = 375 W
Radeon HD 7970 X2 = 500 W
Radeon R9 295 X2 = 500 W
Radeon R9 390 X2 = 580 W

Check your facts: I checked just one card from your list (Titan X), and it has a TDP of 250W.
 
It wouldn't be the first consumer graphics card to have that kind of TDP - some examples:

Radeon RX Vega 64 = 295 W
Radeon VII = 295 W
GeForce GTX 690 = 300 W
Radeon HD 7970 GHz Edition = 300 W
Radeon HD 5950 = 300 W
Radeon RX Vega 64 LC = 345 W
GeForce GTX 490 = 365 W
GeForce GTX 590 = 365 W
GeForce GTX Titan X = 375 W
Radeon HD 7970 X2 = 500 W
Radeon R9 295 X2 = 500 W
Radeon R9 390 X2 = 580 W
It's still a lot for a single GPU. If I am not mistaken, of the ones over 350 W, the Titan X is the only one with a single GPU, right?
 
"Even if the RTX 3090 is aimed squarely at enthusiasts..."

Doesn't look square to me at all... ;-)
 
Sure, I'm interested big time, but price will be the main factor. I'm ready to dump my 1080 Tis and replace them with just one card.

For rendering alone, yes, I want that 3090-whatever, but it will probably cost $4,000 or even more. A possible 24GB of VRAM @ 350W is not that terrible, considering this card would basically multiply my current compute performance by 6 (going by leaks and previous-generation analysis), with the added bonus of OptiX support (which Pascal doesn't have). I want it, but I will probably settle for the normal 3080 and 2x more performance, simply because there is no way I can splash that kind of dosh, especially not this year after moving to TRX40 - that bankruptcy red line is fluorescent: "You shall not pass." ;D

It is good to know that the 3080 will potentially have that magic 10GB. That's basically the main barrier to rendering; the previous 2080 had only 8GB, which limited options so much.

Certainly, I'm not interested in upgrading for gaming.
 
Check your facts: I checked just one card from your list (Titan X), and it has a TDP of 250W.
Sorry, that was a simple keyboard typo - I meant to hit Z rather than X, i.e. the GeForce GTX Titan Z is 375 W. The Titan X (Pascal) - note, no 'GeForce' or 'GTX' tag - is indeed 250W.

It's still a lot for a single GPU. If I am not mistaken, of the ones over 350 W, the Titan X is the only one with a single GPU, right?
Yes, it certainly is a lot. Here's the list again, but this time with additional info to clarify some of the figures, and some errors fixed:

Radeon RX Vega 64 = 295 W (1 GPU, 8GB HBM2)
Radeon VII = 295 W (1 GPU, 16GB HBM2)
GeForce GTX 690 = 300 W (2 GPUs, 2x 2GB GDDR5)
Radeon HD 7970 GHz Edition = 300 W (1 GPU, 3GB GDDR5)
Radeon HD 5970 = 300 W (2 GPUs, 2x 1GB GDDR5)
Radeon RX Vega 64 LC = 345 W (1 GPU, 8GB HBM2)
GeForce GTX 490 = 365 W (2 GPUs, 2x 1.5GB GDDR5)
GeForce GTX 590 = 365 W (2 GPUs, 2x 1.5GB GDDR5)
GeForce GTX Titan Z = 375 W (2 GPUs, 2x 6GB GDDR5)
Radeon HD 7970 X2 = 500 W (2 GPUs, 2x 3GB GDDR5)
Radeon R9 295 X2 = 500 W (2 GPUs, 2x 4GB GDDR5)
Radeon R9 390 X2 = 580 W (2 GPUs, 2x 8GB GDDR5)

However, given that we don't know anything concrete about consumer Ampere models, especially with regard to how big the chip is (i.e. transistor and core count), we can't really judge whether or not 350 W is excessive for a single-GPU card.
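To put that judgment in code form: the figure that actually matters is performance per watt, not the headline TDP. A throwaway sketch with placeholder numbers (the scores below are invented, since no Ampere benchmarks exist yet):

```python
# Perf-per-watt sketch. The benchmark scores are placeholders; until
# reviews land there is nothing real to plug in for Ampere.

cards = {
    # name: (relative benchmark score, board power in watts) - hypothetical
    "RTX 2080 Ti (baseline)": (100, 250),
    "Rumored RTX 3090":       (150, 350),  # assumed +50% performance
}

for name, (score, watts) in cards.items():
    print(f"{name}: {score / watts:.3f} points per watt")

# 0.400 vs 0.429: if the rumored card delivered +50% performance for
# +40% power, efficiency would still improve despite the scarier TDP.
```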
 
I don't see a non-Ti/Super or Titan card having a TDP over 300 watts. Not on a consumer board. I'm not buying the 'leak.'

Yes, OK, on a one-off flagship GPU it's possible. But not on a regular old RTX 3080 with two other GPU tiers above it.
 
Yes, absolutely: 3080Ti or RTX 3000 model, depending on the pricing strategy.

In addition to the YouTube revenue made just from showing everyone how well the card runs the newest games, the card reviews and the card comparisons, or the sheer enjoyment of being able to play your games - never having to worry about your place in benchmarks - it literally pays for itself.

One thing I did learn after the 2080 Ti: make sure you buy the model that has an integrated closed-loop liquid cooler. My 2080 Ti FTW3 Hybrid was definitely cooler than my 2080 Ti Black (dual fan): temperatures during gaming sessions on my Black peaked at 65 degrees Celsius, while the FTW3 stayed below 55 degrees C.

Or you can get the cheapest AIB/reference card and put a block on it yourself; it will look and perform even better.
 
I'm a little wary of the GDDR6X element of the rumour-mill. There's been nothing from JEDEC on the subject (nor from Samsung, Micron, and SK Hynix), and JEDEC released the specification for GDDR5X well over six months before any commercial use of it was in place.
 
I am planning on getting something to replace my Radeon VII, as it does struggle at 4K in quite a few games, and the RT is tempting. But with how little time I've got to play games these days, I think the 3080 will be my limit - unless the 3080 Ti is way, way faster but not way, way more expensive, then maybe. I'm just not sure that a few hours a week of gaming is worth $1300+.
 
I am planning on getting something to replace my Radeon VII, as it does struggle at 4K in quite a few games, and the RT is tempting. But with how little time I've got to play games these days, I think the 3080 will be my limit - unless the 3080 Ti is way, way faster but not way, way more expensive, then maybe. I'm just not sure that a few hours a week of gaming is worth $1300+.
Well, you can tell everyone that you have a 3080 Ti - that's a definite plus.
 
Well, if you want the absolute best graphics performance, you sure have to pay for it, even on the electric bill. If I had the money, I wouldn't worry about the TDP.
 
All I ever want from a new generation of GPUs is at least a 50% performance increase at a similar price to what I paid in the past (allowing for reasonable inflation). It seems like it's taking a lot longer between video card generations to hit that mark than it used to.
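For what it's worth, that rule of thumb is simple enough to write down. A quick sketch with made-up prices and performance numbers, purely for illustration:

```python
# "50% faster at a similar real price" upgrade test. All numbers below
# are hypothetical - swap in your own when the cards actually launch.

def real_price(nominal_usd: float, annual_inflation: float, years: float) -> float:
    """Deflate a new card's price back to old-generation dollars."""
    return nominal_usd / ((1 + annual_inflation) ** years)

old_price, old_perf = 700.0, 100.0   # what I paid last generation (hypothetical)
new_price, new_perf = 750.0, 155.0   # rumored next-gen card (hypothetical)

adj_price = real_price(new_price, annual_inflation=0.02, years=2)
fast_enough = new_perf >= old_perf * 1.5        # at least +50% performance
price_similar = adj_price <= old_price * 1.05   # allow ~5% wiggle room

print(f"Inflation-adjusted price: ${adj_price:.0f}")
print("Upgrade." if fast_enough and price_similar else "Wait another generation.")
```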
 