Photo of MSI GeForce RTX 3080 Ti SUPRIM X suggests launch is close

midian182

Posts: 6,796   +61
Staff member
Rumor mill: With the RTX 3080 Ti's announcement and launch supposedly just a few weeks away, a purported image showing one of MSI's cards has leaked. The MSI RTX 3080 Ti SUPRIM X in the photo reportedly comes from a distribution center and is said to be the final retail product.

Rumors that an RTX 3080 Ti card was in the works have been circulating since November. Alleged details of the card have changed since then; once believed to feature 20GB of GDDR6X, it seems Nvidia has instead opted for 12GB of GDDR6X at 19 Gbps. Virtually every month of 2021 has seen a predicted launch date, but it looks increasingly likely that it will land in May.

A photo of what’s claimed to be pallets of MSI cards, including the GeForce RTX 3080 Ti Ventus 12G OC, was published earlier this month. Now, a VideoCardz source has provided an image of the MSI RTX 3080 Ti SUPRIM X from one of the distribution centers that have received the next Ampere release.

The SUPRIM X is the series’ flagship model, offering the best performance and materials. The RTX 3090 version features a nickel-plated copper baseplate and precision-machined heat pipes.

A recent GPU-Z screenshot of the RTX 3080 Ti showed it with 10,240 CUDA cores, a 384-bit memory bus, a 1,365 MHz base clock, and a 1,665 MHz boost. It’s expected to come with a mining limiter and is thought to have an MSRP of $1,099. Rumors put the announcement date at May 18, with a release on May 26. Just don’t expect it to be any easier to buy than any other graphics card right now.


 

Cycloid Torus

Posts: 4,747   +1,547
I'm still waiting for the version with 5,120 CUDA cores, a 192-bit memory bus, a 1,365 MHz base clock, a 1,665 MHz boost, and 6 GB of GDDR6... a plain old gaming card for the masses.
 

Adi6293

Posts: 802   +1,070
Where did you learn that from? I've known quite the opposite - 12GB is plenty enough for any game at 4K.

For how long, though? We are no longer in the old console generation; games will start using more memory. My wife's 3070, for example, quite often uses all of its 8GB at 1440p now.
 

Burty117

Posts: 4,086   +2,080
Control very often uses 12.5 GB, for example. The latest COD sucks 14.5 GB, another example.
That's not true at all; you're confusing allocated VRAM with dedicated VRAM.

As far as I can find, no game, even at 4K, fully utilizes 10GB of VRAM. Control supposedly gets close at 9.5GB with DLSS disabled, so it's running poorly at that point; once you turn DLSS on, it drops down to 7GB.
For how long, though? We are no longer in the old console generation; games will start using more memory. My wife's 3070, for example, quite often uses all of its 8GB at 1440p now.
It'll be years before games actually start using the full 8GB of VRAM. Here are some examples:


Witcher 3 - actual VRAM usage was 2.3GB but showed as 6.5GB in Afterburner
Tony Hawk's Pro Skater 1+2 - actual VRAM usage goes up to 6GB but shows as 9.2GB in Afterburner
RE2 - actual VRAM usage is 6.6GB but 9.4GB in Afterburner

These are at 4K as well, using DSR (my monitor is 1440p) with a 1080 Ti.
 
Last edited:

fluffydroid

Posts: 34   +22
Well this is good news. Of course your odds of getting one are about the same as the odds of finding a unicorn, so good luck!
 

Geralt

Posts: 288   +341
TechSpot Elite
That's not true at all; you're confusing allocated VRAM with dedicated VRAM.

As far as I can find, no game, even at 4K, fully utilizes 10GB of VRAM. Control supposedly gets close at 9.5GB with DLSS disabled, so it's running poorly at that point; once you turn DLSS on, it drops down to 7GB.

It'll be years before games actually start using the full 8GB of VRAM. Here are some examples:


Witcher 3 - actual VRAM usage was 2.3GB but showed as 6.5GB in Afterburner
Tony Hawk's Pro Skater 1+2 - actual VRAM usage goes up to 6GB but shows as 9.2GB in Afterburner
RE2 - actual VRAM usage is 6.6GB but 9.4GB in Afterburner

These are at 4K as well, using DSR (my monitor is 1440p) with a 1080 Ti.
Sorry, but I believe Afterburner, not you.
 

pcnthuziast

Posts: 1,061   +755
Based on the effort miners and scalpers put into constantly improving and diversifying their methods of acquisition, these cards are already gone.
 

Austinturner

Posts: 159   +165
This card makes no sense, as the gap between the 3080 and 3090 is very small. And 12 GB of VRAM is hardly enough for 4K. The price is high, too.
I wonder if it would be possible for it to have a lot more ray tracing cores. If the 3080 Ti were great at up to 4K with ray tracing, and the 3090 great at professional work thanks to its large memory capacity, they could both be good in their own way. I don't know if this is plausible, but it's the only way I can think of to differentiate them.

edit:
Nope, it looks like this isn't possible, because the RT cores sit on the same SMs as the CUDA cores, so the ratio is fixed.
 
Last edited:

Geralt

Posts: 288   +341
TechSpot Elite
I wonder if it would be possible for it to have a lot more ray tracing cores. If the 3080 Ti were great at up to 4K with ray tracing, and the 3090 great at professional work thanks to its large memory capacity, they could both be good in their own way. I don't know if this is plausible, but it's the only way I can think of to differentiate them.

edit:
Nope, it looks like this isn't possible, because the RT cores sit on the same SMs as the CUDA cores, so the ratio is fixed.
As a matter of fact, the mistake is not this card but the nonsensical 3090.
 

Theinsanegamer

Posts: 2,452   +3,607
Control very often uses 12.5 GB, for example. The latest COD sucks 14.5 GB, another example.
Then you should be able to find sources showing this, correct? When VRAM becomes a limitation, the effects are immediately obvious: frame times go to pot, stuttering is rampant, and pop-in occurs where it did not before.

Funny thing is, none of the professional reviews online see this occurring.
 

Geralt

Posts: 288   +341
TechSpot Elite
Then you should be able to find sources showing this, correct? When VRAM becomes a limitation, the effects are immediately obvious: frame times go to pot, stuttering is rampant, and pop-in occurs where it did not before.

Funny thing is, none of the professional reviews online see this occurring.
No, because I have a 6900 XT with 16 GB of VRAM, and I've never exceeded 16 GB so far. Direct experience is the best source of truth; the opinions of others are important, but secondary. I start Afterburner and Control, and sometimes the game uses 12.5 GB (in certain areas). COD is worse, at 14.5 GB. Whether the games are doing this simply to use as much memory as is available, I don't know, but that's what they're doing. And RDR2 sometimes uses 11 GB. All at 4K and maxed out. I cannot deny what I am directly watching. Another explanation might be that Afterburner is inflating the numbers, but I don't know about those intricacies.

Here: https://www.thetechlounge.com/how-much-vram-do-I-need/
In the author's opinion, you need 8-12GB for 4K. Well, some games just cross that line sometimes.
 
Last edited:

envirovore

Posts: 193   +459
TechSpot Elite
Control very often uses 12.5 GB, for example. The latest COD sucks 14.5 GB, another example.

That's allocation for use if needed, not actual use.

The system says, "we're setting aside this amount of VRAM for asset streaming *if* we need it."
Once you've actually hit a VRAM limit, the engine has to stream assets from system RAM into VRAM (or worse, from storage into VRAM), which is why frametime spikes and stutters occur.

Hell, TechSpot even did an article about this; while it focuses primarily on system RAM, it also explains the above:
https://www.techspot.com/article/1770-how-much-ram-pc-gaming/

You've been able to verify this via Task Manager, of all things, for years now...
https://www.reddit.com/r/nvidia/comments/iy9ehm
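If you want to see the allocated-vs-used distinction outside of GPUs, system RAM behaves the same way: reserving memory costs almost nothing until the pages are actually touched. Here's a rough Python sketch of that idea (not GPU code, just an analogy; it assumes a Unix-like system where `ru_maxrss` reports resident size in kilobytes and pages are 4 KB):

```python
import mmap
import resource

PAGE = 4096               # assumed page size
SIZE = 256 * 1024 * 1024  # reserve 256 MB of address space

def peak_rss():
    # Peak resident set size of this process (KB on Linux).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss()
buf = mmap.mmap(-1, SIZE)  # "allocated": address space reserved, no pages backed yet
after_alloc = peak_rss()

for off in range(0, SIZE, PAGE):  # touch one byte per page -> pages become resident
    buf[off] = 1
after_touch = peak_rss()

print(f"allocated only: +{after_alloc - before} KB resident")
print(f"after touching: +{after_touch - before} KB resident")
```

The first delta should be near zero while the second approaches the full 256 MB. Allocation-style counters, like Afterburner's per-game VRAM number, are reporting the first kind of figure.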
 
Last edited:

Geralt

Posts: 288   +341
TechSpot Elite
That's allocation for use if needed, not actual use.

The system says, "we're setting aside this amount of VRAM for asset streaming *if* we need it."
Once you've actually hit a VRAM limit, the engine has to stream assets from system RAM into VRAM (or worse, from storage into VRAM), which is why frametime spikes and stutters occur.

Hell, TechSpot even did an article about this; while it focuses primarily on system RAM, it also explains the above:
https://www.techspot.com/article/1770-how-much-ram-pc-gaming/

You've been able to verify this via Task Manager, of all things, for years now...
https://www.reddit.com/r/nvidia/comments/iy9ehm
My question, then, is: why does Afterburner show allocated VRAM rather than the VRAM actually in use?