Nvidia RTX 3080 prototype photographed (with an absurd design)

mongeese

Staff
In a nutshell: A photograph of two RTX 3080 prototypes has been uploaded to the Chiphell forums, which regularly intercept early hardware designs. New GPU releases rarely innovate much on the cooling front. But if this image is real, then Nvidia has something special, or at least especially perplexing, in the works.

The design of this GPU pair is immediately strange. First, there’s one fan on the bottom of the GPU and one fan on the top. Second, the power connector isn’t visible, which implies it sits on the rightmost surface of the card (on the bottom prototype), a surface that can’t be seen in the photos. That’s actually where Quadro GPUs have theirs. But third, and perhaps strangest of all, the PCB is only two-thirds the length of the chassis and has a triangular cutout to fit the bottom fan.

Observe, however, that these GPUs are in pretty rough nick. The colors are dull, the heatsinks are misaligned and unfiled, and there’s no branding apart from the ‘RTX 3080’ and ‘GeForce RTX’ logos. They’re more likely to be from a minor OEM than Nvidia reference models. There may be no Ampere GPU in there at all; these might be pre-emptive designs prototyped ahead of hardware shipments. So although at first sight these photos suggest an imminent release, there’s really no telling where Nvidia is in the development cycle.

Is unconventional good or bad?

This cooling solution isn’t as unconventional as it looks, but it is nonetheless quite remarkable. The arrangement of the heatsink fins prevents airflow from traveling laterally across the PCB, which is the opposite of how a normal GPU cooler works. The left fan – which would sit above the power supply in a regular chassis – pulls air up onto the GPU die and then exhausts it out the rear of the chassis (that’s normal). The rightmost fan appears to suck air up from the bottom of the case and through a heatsink before exhausting it at the approximate location of the RAM.

This technique is unlikely to provide a direct cooling advantage relative to existing designs. However, because it directs airflow through standardized pathways within the computer chassis, approximately drawing air in from the bottom-front of a case and exhausting it out the top-left, it can use other cooling fans within the computer much more effectively.

Holistically, then, if it is well executed this design could keep both the GPU and the air within the chassis cooler than existing designs do. This may be an attempt to differentiate from AMD, or it could signify that these GPUs run excessively hot. The latter theory would contradict rumors that Ampere will offer significant efficiency improvements, but it would support leaks that suggest flagship Ampere GPUs will have a very high power draw.

Wait, what needs cooling?

Cooling is all well and good, but what needs cooling is more interesting. When it comes to the specifics of the RTX 3080, details are still sparse, or at least well-evidenced details are. But there are two leakers with good track records who deserve recognition: kopite7kimi and KkatCorgi.

A year before its announcement, Kopite leaked the rough die size, transistor count, and memory configuration of the Nvidia A100. By February this year, KkatCorgi knew the exact die size and between the two of them, they created an accurate picture of the A100’s final core count. It was impressive.

Regarding the RTX 3080, they both predicted the unusual PCB months ago. They both believe that the RTX 3080 will have 10 GB of memory operating at a minimum of 18 Gbps, versus 8 GB at 14 Gbps on the RTX 2080. Kopite speculates that the RTX 3080 will have 4,352 CUDA cores, like the RTX 2080 Ti. Nothing either of them says can be verified, however, so it would be unwise to place bets.
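To put rough numbers on that rumored memory bump, here’s a back-of-the-envelope bandwidth sketch. It assumes standard GDDR6 with one 32-bit channel per chip, so 10 GB would imply a 320-bit bus and 8 GB a 256-bit bus; the RTX 3080’s bus width is an assumption here, not something either leaker has confirmed.

```python
# Back-of-the-envelope GDDR6 bandwidth estimate.
# Assumes one 32-bit channel per memory chip; the 320-bit bus is NOT confirmed.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_2080 = bandwidth_gb_s(256, 14)   # 8 GB @ 14 Gbps  -> 448 GB/s
rtx_3080 = bandwidth_gb_s(320, 18)   # 10 GB @ 18 Gbps (rumored) -> 720 GB/s

print(f"RTX 2080: {rtx_2080:.0f} GB/s")
print(f"Rumored RTX 3080: {rtx_3080:.0f} GB/s (~{rtx_3080 / rtx_2080 - 1:.0%} more)")
```

On those assumptions, the rumored card would have roughly 60 percent more memory bandwidth than the RTX 2080, though the figures are only as good as the leaks behind them.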


 
I'm glad to finally see this.

The shroud design may be polarizing to some, but for me: I don't use glass cases so I really don't care.

I just can't wait till we see 3080Ti performance data compared to everything else. Definitely gonna sell my 2080Ti on the market while it's hot and upgrade to the 3080Ti.


Used 2080Ti's easily go for $800+ so that's a recoup of about 72%
 
To anyone who is a prospective buyer of an RTX card, the best advice I can give you is to go with the highest 3000 series GPU you can afford. Some people out there will claim that the 2080Ti isn't worth it - and they'll say the same thing about the 3080Ti...just like they did about the Titan X and Xp.

Just remember:

#1 Even if you aren't running your games in 4K, you'll have the absolute highest 1440p and 1080p framerates possible even with all settings at maximum. (Most gaming monitors are 1440p if they are ultrawide).

#2 You'll never have to worry about how well you'll be able to perform in a benchmark because you'll always be at the top of it - for a long time to come.

#3 The resale value of these top-tier GPUs is very high because there are some people who will wait to buy them just so they can add one to their existing card for a dual-card setup.
 
To anyone who is a prospective buyer of an RTX card, the best advice I can give you is to go with the highest 3000 series GPU you can afford. Some people out there will claim that the 2080Ti isn't worth it - and they'll say the same thing about the 3080Ti...just like they did about the Titan X and Xp.

Just remember:

#1 Even if you aren't running your games in 4K, you'll have the absolute highest 1440p and 1080p framerates possible even with all settings at maximum. (Most gaming monitors are 1440p if they are ultrawide).

#2 You'll never have to worry about how well you'll be able to perform in a benchmark because you'll always be at the top of it - for a long time to come.

#3 The resale value of these top-tier GPUs is very high because there are some people who will wait to buy them just so they can add one to their existing card for a dual-card setup.
Awful advice. Get the card that complements your other components. Balance your system. You don't want a top GPU stuck at 60% usage.
 
Awful advice. Get the card that complements your other components. Balance your system. You don't want a top GPU stuck at 60% usage.
I agree, buy what you need.... Quantumphysycs is just saying: spend more than a thousand dollars so you don't have to worry...

A thousand dollars might be a portion of your paycheck in one country, or a life-saving amount in another!
 
To anyone who is a prospective buyer of an RTX card, the best advice I can give you is to go with the highest 3000 series GPU you can afford. Some people out there will claim that the 2080Ti isn't worth it - and they'll say the same thing about the 3080Ti...just like they did about the Titan X and Xp.

Just remember:

#1 Even if you aren't running your games in 4K, you'll have the absolute highest 1440p and 1080p framerates possible even with all settings at maximum. (Most gaming monitors are 1440p if they are ultrawide).

#2 You'll never have to worry about how well you'll be able to perform in a benchmark because you'll always be at the top of it - for a long time to come.

#3 The resale value of these top-tier GPUs is very high because there are some people who will wait to buy them just so they can add one to their existing card for a dual-card setup.

Flagship GPUs always take the largest hit to resale value. 28% is pretty significant when you consider it's 28% of $1,200. You just lost $400, the cost of a good GPU in and of itself.

That's the devaluation without next-gen cards even releasing.

Which GPU to buy isn't nearly as simple as "Buy MOAR!".
 
I'll wait to read the reviews before commenting ... they obviously think they have a winning design, but let's see what the users say after playing with it for a few months .....
 
I agree, buy what you need.... Quantumphysycs is just saying: spend more than a thousand dollars so you don't have to worry...

A thousand dollars might be a portion of your paycheck in one country, or a life-saving amount in another!

I too agree it is bad advice, but because it is blanket advice that's only relevant to the people who actually have a good reason to follow it. Up to a certain point, unless you are gaming's counterpart to an audiophile, you aren't going to care about framerates AND every noticeable difference between ultra and high graphical settings, unless there is a counter in the top corner telling you exactly what the framerate is. Unless your workflow calls for it, or a newer kind of graphical fidelity becomes mainstream and you really want it, there are very few practical reasons to upgrade.

Though to be fair, I believe the message is not to cheap out on it for the sake of saving money; that if you need it and can afford it, just go and get it. I mean, though, look at the prices. The 2070 Super performs similarly to the 1080Ti, at probably half the 2080Ti's initial pricing. The top card performs well through another generation, so if you would otherwise have bought both a 1070 and a 2070, you might as well spend it on the 3080Ti: you get the absolute top performance for the current generation and something more than competitive in the next gen (minus the new bells and whistles) for roughly the cost of two 1070-class cards. If you are the type that happily bought a 1070 and ignored the entire 2000 series, meh, just get whatever falls into your budget, because even a 3060 would be a huge improvement over your current card.
 
Though to be fair, I believe the message is not to cheap out on it for the sake of saving money; that if you need it and can afford it, just go and get it.

I tend to look at computer equipment as an investment in the exact same manner that I look at buying cars. If you’re going to spend $700 or $800 to buy something, why not just go ahead and spend the extra $300 or $400 to buy it even better?

It’s not as if I am asking you to double your money and spend $2,000 on an RTX Titan.

The future proofing that you get from buying it “right” the first time means that in the event the next generation isn’t terribly overwhelming you can hold out a generation.

It’s like buying a car: if you get all of the major options like moonroof, navigation, heated and cooled seats, safety tech, etc., then you have something you can hold onto longer if you decide that the next replacement model that comes out isn’t necessarily worth going into more debt for.

As for the cost: you have to pay to play!!!
 
The drawing posted is wrong. The fans' fins are attached to an outer ring, and one fan has reversed blades.

Looks like the first fan's heatsink and pipes cool the GPU, and the second fan and its surrounding heatsink fins cool the memory and VRM.

It's a suck-blow design, pulling in fresh air from the ends of the card internally across the GPU cooler (think OEM Intel CPU cooler), and it exhausts the air right out near the GPU and out the back of the case.
 
I tend to look at computer equipment as an investment in the exact same manner that I look at buying cars. If you’re going to spend $700 or $800 to buy something, why not just go ahead and spend the extra $300 or $400 to buy it even better?

It’s not as if I am asking you to double your money and spend $2,000 on an RTX Titan.

The future proofing that you get from buying it “right” the first time means that in the event the next generation isn’t terribly overwhelming you can hold out a generation.

It’s like buying a car: if you get all of the major options like moonroof, navigation, heated and cooled seats, safety tech, etc., then you have something you can hold onto longer if you decide that the next replacement model that comes out isn’t necessarily worth going into more debt for.

As for the cost: you have to pay to play!!!

You certainly didn't seem to care about future-proofing when commenting on a CPU-related article. You advocated for the 8700K and 9900K over CPUs that could easily be considered more future-proof.
 
Very bad tagline. Of course Ampere will run hotter than Turing, because it's 7nm vs 12nm... With the same cooler and TDP, a 7nm part will run hotter than a 12nm part. I mean, just look at reference Vega 56 and reference 5700 XT operating temperatures.

Anyways, it just seems like endless Ampere leaks are coming out these days

 
With the same cooler and TDP, a 7nm part will run hotter than a 12nm part. I mean, just look at reference Vega 56 and reference 5700 XT operating temperatures.
That's not a particularly good example to validate this point, given that they don't have the same TDP nor the same cooler. The latter also has base/boost clocks of 1605/1905 MHz while the former is 1156/1471 MHz; and the Vega uses HBM, which is packaged directly around the GPU, whereas the Navi doesn't. They're not manufactured by the same company and the nodes aren't revisions of each other.

One could use the same website's data on the Vega 64 and Radeon VII as a counter to the argument that 7nm is automatically hotter than a larger node: the VII has the same TDP, but has higher clocks (core and memory), more transistors, and twice the HBM surrounding the GPU. It does, of course, have a much larger cooler.

The actual picture is far more complex than just a case of 'xxx nm is smaller than yyy nm so the temp will be higher' - current leakage levels, number of metal layers, substrate composition, logic density, and cache amount and distribution within the die all play significant roles, and that's before one starts to factor in operating voltages, clock speeds, and so on. It's not a given that Ampere graphics cards will have higher temperatures than Turing, by any means.
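As a rough illustration of just one of those variables, here's a crude power-density comparison using approximate public board-power and die-size figures (my ballpark numbers, not anything confirmed in this thread). It suggests the 7nm parts do pack more watts into each square millimetre, but as the Radeon VII example above shows, cooler size, clocks, and voltage decide the actual temperatures.

```python
# Crude power density (W/mm^2) from approximate public board power and die area.
# Higher density makes heat harder to remove, but the cooler, clocks, voltage,
# and heat distribution across the die ultimately set operating temperature.

cards = {
    # name: (approx. board power in W, approx. die area in mm^2)
    "Vega 56 (14nm)":   (210, 495),
    "Vega 64 (14nm)":   (295, 495),
    "Radeon VII (7nm)": (295, 331),
    "RX 5700 XT (7nm)": (225, 251),
}

for name, (watts, area) in cards.items():
    print(f"{name:18s} ~{watts / area:.2f} W/mm^2")
```

By those ballpark figures the two 7nm cards land at a similar power density, yet the Radeon VII's much larger cooler keeps it in check, which is exactly the point: node size is one input among many.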
 
I tend to look at computer equipment as an investment in the exact same manner that I look at buying cars. If you’re going to spend $700 or $800 to buy something, why not just go ahead and spend the extra $300 or $400 to buy it even better?

It’s not as if I am asking you to double your money and spend $2,000 on an RTX Titan.

The future proofing that you get from buying it “right” the first time means that in the event the next generation isn’t terribly overwhelming you can hold out a generation.

It’s like buying a car: if you get all of the major options like moonroof, navigation, heated and cooled seats, safety tech, etc., then you have something you can hold onto longer if you decide that the next replacement model that comes out isn’t necessarily worth going into more debt for.

As for the cost: you have to pay to play!!!

A depreciating asset is not an investment, but each to their own
 
If true, this can only be in response to the GPU speeds claimed for the PS5, so clearly Nvidia is worried if it's designing its OEM cards to be even better at cooling to get higher clocks. I guess competition comes with innovation. About time NVIDIA did something for its consumers instead of raising the prices every generation by 20%.
 
I tend to look at computer equipment as an investment in the exact same manner that I look at buying cars. If you’re going to spend $700 or $800 to buy something, why not just go ahead and spend the extra $300 or $400 to buy it even better?

It’s not as if I am asking you to double your money and spend $2,000 on an RTX Titan.

The future proofing that you get from buying it “right” the first time means that in the event the next generation isn’t terribly overwhelming you can hold out a generation.

It’s like buying a car: if you get all of the major options like moonroof, navigation, heated and cooled seats, safety tech, etc., then you have something you can hold onto longer if you decide that the next replacement model that comes out isn’t necessarily worth going into more debt for.

As for the cost: you have to pay to play!!!

Buying a new car is never a cost-effective solution, which is why buying a 3+ year old, well-specced vehicle is nearly always the way to go if you value money. Cars are never an investment; they are a tool. And buying one on credit tends to mean you are spending money you don't have. The last thing someone should do is put a video card on credit.

The Nvidia flagship cards, while nice, are nearly never worth it from a value perspective. They are for people who want the best, nothing more. They make for great second-hand cards later in their life.

I still rock a GTX 1080. Why? Because even at 3440x1440 it still plays every game with great results. Why upgrade when video games have yet to improve themselves? Mid-to-high-end cards like the RTX 2080 & RTX 2070 have been the clear choice for higher-end video cards for the last year+.

So many people still rock GTX 1080 and 1080 Ti cards simply because games just have not improved visually much over the last few years. I'm more interested in the video cards coming a year+ after the next-gen consoles drop.
 
I wouldn't call this design absurd. Is it good? Well, I don't see why it would be bad.
 