Nvidia's GeForce RTX 3080 and 3090 could enter mass production in August

I'm really curious to see where this goes in the next generation, both in and out of the consoles. AMD showed that the 5600XT's original 1560 MHz spec was running close to the sweet spot of power efficiency, then pushed it up from there with the BIOS update that raised performance to 1750 MHz. TechPowerUp's power-efficiency graphs are great for visualizing this, with the 5600XT on its original BIOS clearly the most efficient video card out there. I believe laptop GPUs run in this range as well to get the most performance from the fewest watts.
Well, to be honest, I got the 1750 MHz/150 W numbers not from any rigorous testing, but from playing around with different TDP limits and undervolting in my Radeon driver.
That said, now that you mention more rigorous testing, the numbers seem to add up. My card is much more recent (bought only 3 months ago) and, more importantly, all these Radeon chips are the same die with different levels of binning: A-level chips go to the 5700XT, B to the 5700, C to the 5600XT and D to the 5600. So it would make sense that higher-quality silicon would have a curve shifted towards higher efficiency.
Just as an aside, the most efficient design AMD has come up with is a custom 5600M for Apple, which is in fact the A-quality die (5700XT, 40 CUs) but downclocked to only 1035 MHz and using HBM2 instead of GDDR6.

I do the same for my GTX 1080 using Afterburner, and peak efficiency lands around 0.9 V at 1911 MHz (or 1923 MHz with a better cooler), but that's optimally undervolted, which needs to be done on a per-card basis. The standard clock for 0.9 V is 1733 MHz, right at the number you mention.
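To show why that per-card sweet spot exists at all, here's a toy sketch: dynamic power scales roughly with V² × f while performance scales roughly linearly with f, so performance-per-watt peaks at the lowest stable voltage point on the curve. Every constant and V/f pair below is illustrative, not a measurement from any real card.

```python
# Toy model of the undervolting sweet spot. Dynamic power ~ k * V^2 * f;
# a fixed "base" term stands in for static/memory/fan power.
# All numbers are hypothetical, for illustration only.

def perf_per_watt(freq_mhz: float, volts: float, base_watts: float = 30.0) -> float:
    """Performance-per-watt (MHz/W) under P = base + k * V^2 * f."""
    k = 4.0e-5  # hypothetical constant folding in capacitance and switching activity
    dynamic = k * volts ** 2 * freq_mhz * 1000  # scaled into watts
    return freq_mhz / (base_watts + dynamic)

# Hypothetical (volts, MHz) points along a GPU's voltage/frequency curve
curve = [(0.80, 1600), (0.85, 1750), (0.90, 1911), (1.00, 2050), (1.05, 2100)]

best = max(curve, key=lambda vf: perf_per_watt(vf[1], vf[0]))
for volts, freq in curve:
    print(f"{volts:.2f} V @ {freq} MHz -> {perf_per_watt(freq, volts):.2f} MHz/W")
print("peak efficiency point:", best)
```

In this simple model efficiency keeps improving as you drop voltage, which matches the experience above: the limit is stability of the silicon, not the math, and that limit is exactly what varies card to card.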

It will be interesting to see whether clocks increase noticeably with Nvidia's node shrink and whether AMD's IPC and other improvements in Navi 2 can keep up. Intel has already hinted that some designs do not see a clock-speed improvement on a smaller node: its new 10 nm laptop CPUs have lower top speeds than its very mature and optimized 14 nm CPUs, though IPC improvements make up for the speed deficit.
Yes, I'm really curious to see how this one goes, too. I'm pretty sure we'll see higher clocks from AMD this fall; it's already their second generation on this node or thereabouts, and historically ATI/AMD engineers seem to be better at silicon-level technology, while Nvidia engineers seem to be better with architectures. Nvidia might also have the handicap of having delayed the choice of its node, playing (and losing) the competition between Samsung and TSMC in order to get a better price.
One thing is sure, though: the barrier you are speaking about, which both AMD and Intel are facing, seems to be somewhere between 4 and 4.5 GHz, so I'd say that for GPUs we've got some leeway yet. I imagine the problems are different in nature and mostly related to the huge size of the dies.
 
My feeling is the 3090 or Titan will be a twin-GPU model with 24GB.
The feeling is supported by the fact that the leaked 3080 images show no SLI port, so SLI is clearly not supported, and that's because they will be selling a twin-GPU Big Navi killer. They have seen what AMD have done to Intel, and Nvidia will be taking no chances. It's deffo a twin-GPU card to my mind, and the 3090 will be untouchable.

My concern is this:
Nvidia sold us the ray tracing "dream" with plenty of talk, and two years on that is all it is: talk.
Whilst the raw performance was a reasonably worthy upgrade from the 1080 Ti to the 2080 Ti, the ray tracing was frankly pathetic; it killed the fps and there was little of it to enable.

Two years on there are still only half a dozen titles with ray tracing in them, and the 2000-series ray tracing selling point was, for me, a total rip-off.

My feeling is it'll be the same rip-off with the 3000 series: loads of talk and few titles that actually use it. Even if the ray tracing performance is at last what it should have been the first time, with little fps cost, ray tracing is pointless without a volume of games supporting it.

Unless these cards are cheap I won't be adopting one, and knowing Nvidia they won't be cheap.
My real hope is that Big Navi surprises everyone and really kicks ***; I'd love Big Navi to make me want to buy it.
I'd love to see AMD do well and close the gap, because whilst Nvidia are unlikely to make the same mistake Intel did, a decent Big Navi card might keep prices of the 3000 and 2000 series down, and as PC gamers that can only be good for all of us.
 
I doubt Nvidia would have a twin-GPU card for the launch or be making one any time soon.
Twin-GPU cards are far more complex to develop, and the design can't be scaled down without cutting into other segments, while being far less cost-effective than scaling down a single-GPU card.

I agree on the RT part but I would like to mention a few things about it:
- this was the first iteration of the technology and, much like the first 3D games, it was very underwhelming compared to rasterized graphics with a lot of baked effects + screen space effects (basically the norm in AAA games today)
- even this next iteration will be underwhelming because going from a highly-optimized raster development pipeline to a strictly RT one is impossible right now as the hardware availability just isn't there... and that doesn't happen overnight
- game development will slowly evolve by adding small parts of the RT spectrum to enhance the current effects and reduce the development costs associated with having to use a complex raster pipeline
- games won't be able to use the full spectrum of RT any time soon as the hardware power just isn't there, and we're talking orders of magnitude here, not percentages
- I recommend watching some of Digital Foundry's YouTube videos about how RT is handled in games today to better understand this
- the full spectrum of RT is nowhere near even the "Ultra" RT settings of current games that use it: imagine dropping from 4K 60 FPS with no RT, to 4K 30 FPS with "Ultra" RT, to 4K 2 FPS with full RT... and yes, the quality differences are massive between "Ultra" and the full shebang of RT features
- it will likely take until the next-next generation of consoles before RT use starts becoming mainstream
- I wouldn't be surprised if, at one point, we get a crazy game that only uses RT and becomes the next "Crysis", needing at least 2 GPU generations to be playable at max RT
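The "orders of magnitude, not percentages" point above can be put in rough numbers. This is a back-of-envelope sketch: the 2 rays per pixel for hybrid RT and the 500 samples × 4 bounces for offline-quality path tracing are illustrative assumptions, not figures from any real engine or GPU spec.

```python
# Back-of-envelope ray budget: hybrid RT vs. "full" path tracing at 4K 60 FPS.
# All per-pixel figures are illustrative assumptions for the sake of scale.

pixels_4k = 3840 * 2160          # ~8.3 million pixels
fps_target = 60

# Hybrid RT: a couple of rays per pixel for select effects (reflections, shadows)
hybrid_rays = pixels_4k * 2 * fps_target       # ~1e9 rays/s

# Offline-quality path tracing: hundreds of samples per pixel, several bounces each
full_rays = pixels_4k * 500 * 4 * fps_target   # ~1e12 rays/s

print(f"hybrid RT budget:  {hybrid_rays / 1e9:.1f} gigarays/s")
print(f"full path tracing: {full_rays / 1e9:.0f} gigarays/s")
print(f"gap: ~{full_rays // hybrid_rays}x")
```

Even under these generous assumptions the gap is about three orders of magnitude, which is why "Ultra" in-game RT settings are still nowhere near the full shebang.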
 
My feeling is the 3090 will be a 24GB twin-GPU card.
The fact that there is no SLI socket in any of the leaked pictures, to my mind, says Nvidia are not supporting SLI.
I just got that feeling.
The 295 cards were twin-GPU; Nvidia will not want to take any chances on Big Navi, and they will want a killer card.
They have seen what happened to Intel at AMD's hands.
There will deffo be two GPUs on a single card, with 12GB on each side of the card.
I actually think the leaked cooler was the twin-GPU cooler.
I guess time will tell.
 