Zotac confirms 32GB RTX 5090 is on the way, 16GB RTX 5070 Ti and 5060 Ti could follow

Daniel Sims

Highly anticipated: As Nvidia teases the imminent launch of its next-generation graphics cards, codenamed Blackwell, leaks continue to intensify. In a situation resembling Asrock's recent Intel Battlemage leak, a board partner has jumped the gun on listing the first Nvidia RTX 5000 GPUs.

Zotac briefly leaked listings for five graphics cards from Nvidia's upcoming GeForce RTX 5000 series. The mistake marks the first official confirmation that the lineup, codenamed Blackwell, continues the RTX branding. More information regarding video memory arrangements has also emerged.

VideoCardz captured screengrabs of Zotac's listings before the company removed them, and they still appear in Google search results for Zotac's version of the flagship RTX 5090. The 5090D (for the Chinese market), 5080, 5070 Ti, and 5070 also appear.


Furthermore, the VRAM configuration options on Zotac's store briefly alluded to the RTX 5000 series. The memory type section added a selection for GDDR7 VRAM, which is expected to debut with Blackwell. A new option for 32GB of memory also appeared, likely confirming rumors that the 5090 will become the first gaming card to feature that amount.

Meanwhile, other recent reports suggest that the rest of the RTX 5000 lineup will include VRAM pools identical to their corresponding RTX 4000 predecessors. If Wccftech's unnamed sources prove accurate, the RTX 5070 Ti and 5060 Ti will feature 16GB, echoing the 4070 Ti and some models of the 4060 Ti.

The amounts contrast with the RTX 5060 and 5070, which disappointingly include only 8GB and 12GB, respectively. Recent high-end games have become increasingly taxing for cards with under 12GB, especially in 4K and at high graphics settings.

Moreover, new and upcoming mid-range GPUs could present RTX 5000 with stiff competition. Intel's recently released Arc B580 includes 12GB of VRAM for only $250 and trades blows with Nvidia's 4060, and AMD promises to catch up in the ray tracing and AI upscaling departments with its mainstream-focused next-generation RDNA4 lineup.

However, Nvidia might have an advantage in memory speed across the board. The RTX 5060 and 5060 Ti reportedly include GDDR7 VRAM, contrasting with prior leaks indicating the desktop 5060 would stick to GDDR6 while its laptop counterpart upgrades to GDDR7.

Nvidia will deliver a CES keynote on January 6, where it is expected to unveil RTX 5000. AMD is also rumored to reveal RDNA4 at the trade show.


 
Rumours claim 5080 will get 30 Gbps memory, up from 28 Gbps on all the other 5000 cards.
Are those overclocked 28 Gbps modules then? I thought the next step up from 28 Gbps was 32 Gbps...

5070 looks severely gimped tho, with just 6400 cores. That is like 50% less than 5070 Ti.

5060 gets 8GB.

Maybe they will talk about Neural Texture Compression at CES 2025. Even AMD talked about something similar. Better compression instead of more VRAM.
 
8GB in entry-level GPUs until 2029, only then will Jensen have mercy and make the huge upgrade to 9GB!

I know GDDR7 is much more expensive, but that doesn't justify keeping 8GB in 2025. They should jump to at least 10GB for the sake of graphics quality, but that's just another bad thing a monopoly can do... lol
 
The gap between the 5080 and 5090 is huge (if these specs are accurate)! The 5080 reportedly has half the cores and half the VRAM of the 5090. Looks like Nvidia is at it again, positioning the xx90 "Titan" class GPUs for gamers, just like they did with the 4090. Nvidia being Nvidia, still sticking with 12GB of VRAM for the 5070, while the 5080 gets "only" 16GB.

Who’s brave enough to predict the MSRP for the 5090?
 
I don't see the problem with 8GB for entry-level GPUs. There is not a single game on the market that does not run well on an 8GB GPU with the right settings. As long as those GPUs don't cost over 300 dollars, it is not the issue people try to make it out to be.
 
Hello guys. Sup?

 
8GB in entry-level GPUs until 2029, only then will Jensen have mercy and make the huge upgrade to 9GB!

I know GDDR7 is much more expensive, but that doesn't justify keeping 8GB in 2025. They should jump to at least 10GB for the sake of graphics quality, but that's just another bad thing a monopoly can do... lol
There's no reason to, so long as people keep up with this argument:
I don't see the problem with 8GB for entry-level GPUs. There is not a single game on the market that does not run well on an 8GB GPU with the right settings. As long as those GPUs don't cost over 300 dollars, it is not the issue people try to make it out to be.
This is factually incorrect. Techspot already covered the 8GB issue over a year ago, and the problem is not getting better. GPUs like the 4060 are powerful enough to run settings higher than 8GB will allow.

You are gimping your hardware, paying for performance you cannot use. 8GB first appeared on cards in 2013. It's almost 2025. Stop hanging onto obsolete memory configs and stop going to bat for multi-million dollar corpos. 8GB should be in the dustbin.
Rumours claim 5080 will get 30 Gbps memory, up from 28 Gbps on all the other 5000 cards.
Are those overclocked 28 Gbps modules then? I thought the next step up from 28 Gbps was 32 Gbps...

5070 looks severely gimped tho, with just 6400 cores. That is like 50% less than 5070 Ti.

5060 gets 8GB.

Maybe they will talk about Neural Texture Compression at CES 2025. Even AMD talked about something similar. Better compression instead of more VRAM.
5070 and up are using GDDR7, not GDDR6. Different speed tables. Different available module speeds.
 
There's no reason to, so long as people keep up with this argument:
This is factually incorrect. Techspot already covered the 8GB issue over a year ago, and the problem is not getting better. GPUs like the 4060 are powerful enough to run settings higher than 8GB will allow.

You are gimping your hardware, paying for performance you cannot use. 8GB first appeared on cards in 2013. It's almost 2025. Stop hanging onto obsolete memory configs and stop going to bat for multi-million dollar corpos. 8GB should be in the dustbin.
5070 and up are using GDDR7, not GDDR6. Different speed tables. Different available module speeds.
I know they are using GDDR7. I am talking about the actual speed of these modules. I doubt there are actual 28 and 30 Gbps modules. They are probably just clocking the same modules differently.

Besides, 8GB is plenty for many 1080p gamers, which is why you get to choose. More than 60% of PC gamers use 1080p or less. Which game will punish an 8GB card at 1080p unless you enable RT or Path Tracing?

At 1440p, you don't need more than 12GB. There is not a single game even in 4K/UHD that uses up 12GB unless you crank Path Tracing to max at native res, which no one does. Even the 4090 buckles without DLSS here. The 5090 will buckle too, and no AMD cards can do Path Tracing at all.

Generally, GPU power will dry out long before VRAM becomes a problem on most GPUs. Pretty much all of them. Lowering VRAM usage is easy too. First, remove motion blur and DoF, which suck anyway. Then lower shadows. A lot more fps yet a much cleaner image with better visibility = Win/Win.


VRAM is not the only thing that matters for longevity.
GPU power and features matter. Features being RT performance and upscaling for example.

I never use RT or Path Tracing and own a 4090. It would perform identical in 99% of games with half the VRAM.

4070 Ti 12GB beats 3090 24GB in 4K/UHD and is closer to 3090 Ti here, just at half the wattage. GPU power > VRAM, always.

3070 8GB slapped 6700XT 12GB on release and still does.

Proof:


Yet 6700XT was praised for having 12GB VRAM and therefore being "futureproof" - Yeah ... Not so much. Even in 4K/UHD the 3070 wins.

All new and demanding AAA games released this year.

3070 also very often beats Radeon 6800 16GB and sometimes even 6800XT.

VRAM doesn't matter if the GPU is weak to begin with.
 
I don't see the problem with 8GB for entry-level GPUs. There is not a single game on the market that does not run well on an 8GB GPU with the right settings. As long as those GPUs don't cost over 300 dollars, it is not the issue people try to make it out to be.

I guess it depends on how you define 'entry level'.

Back in the day anything with an XX60 on it was mid tier. So a 5060 would be a mid tier GPU, albeit lower mid tier.

For me entry level is an XX50 part. A presumed RTX 5050 is entry level. 8GB for such a part seems perfectly reasonable. It's not like you're using expensive GDDR7 for such a card. You would assume slower GDDR6.

RTX5060 should be at least 10GB because it would likely have enough performance to run games at settings that require at least 10GB of VRAM these days. If the GPU is at least as good if not better than the 2020 consoles, it is going to need at least 10GB to run games with the same texture quality.

There has to be more flexibility for memory configurations and there will be, with 3GB modules available in 2025. After this point there is no excuse to have inadequate levels of VRAM on GPUs, especially ones that will be well over $500 no doubt.

Take Indiana Jones, for example: an RTX 4070 12GB, or particularly the RTX 3080 10GB, could probably manage a bunch of the ray tracing settings, but there isn't enough memory. That's very poor for cards that are not particularly old nor cheap, even if they are no longer the latest models.

Nvidia big up features like frame generation, but they swallow up a chunk of extra VRAM too.
 
I know they are using GDDR7. I am talking about the actual speed of these modules. I doubt there are actual 28 and 30 Gbps modules. They are probably just clocking the same modules differently.

Besides, 8GB is plenty for many 1080p gamers, which is why you get to choose. More than 60% of PC gamers use 1080p or less. Which game will punish an 8GB card at 1080p unless you enable RT or Path Tracing?

At 1440p, you don't need more than 12GB. There is not a single game even in 4K/UHD that uses up 12GB unless you crank Path Tracing to max at native res, which no one does. Even the 4090 buckles without DLSS here. The 5090 will buckle too, and no AMD cards can do Path Tracing at all.

Generally, GPU power will dry out long before VRAM becomes a problem on most GPUs. Pretty much all of them. Lowering VRAM usage is easy too. First, remove motion blur and DoF, which suck anyway. Then lower shadows. A lot more fps yet a much cleaner image with better visibility = Win/Win.


VRAM is not the only thing that matters for longevity.
GPU power and features matter. Features being RT performance and upscaling for example.

I never use RT or Path Tracing and own a 4090. It would perform identical in 99% of games with half the VRAM.

4070 Ti 12GB beats 3090 24GB in 4K/UHD and is closer to 3090 Ti here, just at half the wattage. GPU power > VRAM, always.

3070 8GB slapped 6700XT 12GB on release and still does.

Proof:


Yet 6700XT was praised for having 12GB VRAM and therefore being "futureproof" - Yeah ... Not so much. Even in 4K/UHD the 3070 wins.

All new and demanding AAA games released this year.

3070 also very often beats Radeon 6800 16GB and sometimes even 6800XT.

VRAM doesn't matter if the GPU is weak to begin with.
Are you familiar with how game engines like UE5 manage low VRAM situations? Things like lowering the resolution of textures or loading in fewer lighting effects?

The side effect of that is higher framerates, because you are not rendering as much. Therefore, a GPU like the 3070 suddenly looks significantly faster than the 6700XT, because it isn't loading as much. Techspot covered this in their 8GB review over a year ago.

I don't know why people want to meatshield corpos and 8GB GPUs so much.
 
Are you familiar with how game engines like UE5 manage low VRAM situations? Things like lowering the resolution of textures or loading in fewer lighting effects?

The side effect of that is higher framerates, because you are not rendering as much. Therefore, a GPU like the 3070 suddenly looks significantly faster than the 6700XT, because it isn't loading as much. Techspot covered this in their 8GB review over a year ago.

I don't know why people want to meatshield corpos and 8GB GPUs so much.
Yeah, I do, and I have also seen pretty much all the games listed running on an 8GB GPU and it looked fine. Personally I use a 4090 24GB and could not care less. I just know those 24GB are complete overkill and wasted for the most part. Especially for people running 1440p or lower, which is like 98% of PC gamers.

You only need 20+ GB VRAM when you completely crank games like Cyberpunk 2077 at native 4K/UHD with Ultra settings, Path Tracing, Frame Gen, EVERYTHING applied. And guess what, No AMD card can do it, meaning 24GB on 7900XTX is completely useless. 7900XTX does like 3.5 fps average with those settings. My 4090 does 25-30 fps and needs DLSS and FG to deliver 80-100 fps. Input lag is too high for me tho, I prefer 100 base fps before enabling FG, so it's more like 50-60 fps. Useless, I use 240 Hz.

VRAM never made a GPU, and a lot of VRAM won't magically futureproof a lacking GPU, which is why the Radeon 6000 series aged badly even tho it had plenty of VRAM. Can't do RT, yet many new games have forced RT elements now, drowning those AMD cards.

This is what Radeon 8000 series are trying to fix. Vastly improved RT perf.

What is great for longevity tho, is upscaling, and DLSS is the best one today, by a big margin. Both in terms of visuals and support, 600+ games have DLSS now. Upscaling with built-in AA and sharpening is replacing 3rd party AA already. TAA is trash.

DLAA is the best AA solution today.
DLSS is the best upscaler today.

Digital Foundry say this. Techpowerup say this. Everyone with experience with all upscalers say this. I tried tons of GPUs in the last 5 years and DLSS/DLAA is clearly the holy grail, especially true IN ACTUAL MOTION. FSR has tons of shimmering issues, artifacts and just weird stuff happening. DLAA is clean as day, beats any other AA solution. DLSS Quality is very impressive at 1440p. DLDSR can make a 1080p IPS panel look good too. DLSS even works with DLDSR, so you can downsample 1440p or even 4K to 1080p with only a small performance penalty.

And this is why Nvidia has 90% GPU marketshare now. Feature-wise, AMD is lagging far, far behind. Something that should improve, hopefully, in 2025+ starting with the Radeon 8000 series.

FSR 4 should launch at CES 2025 along with Radeon 8000 series.

I write all this because VRAM is not the end-all be-all. It is a small part of a GPU. What is more important for MOST PEOPLE is GPU power (you can always adjust settings easily, with little to no visual loss) and FEATURES that will improve performance or make visuals better.

Go read about Neural Texture Compression. I would not be surprised if Nvidia launches this at CES 2025. Even AMD speaks about this stuff now. The answer is not MORE VRAM, it is better COMPRESSION, and developers need to USE THE TECH because it is already available. BETTER LOOKING TEXTURES with MUCH LESS VRAM USAGE, YES PLEASE.


 
Yeah, I do, and I have also seen pretty much all the games listed running on an 8GB GPU and it looked fine. Personally I use a 4090 24GB and could not care less. I just know those 24GB are complete overkill and wasted for the most part. Especially for people running 1440p or lower, which is like 98% of PC gamers.

You only need 20+ GB VRAM when you completely crank games like Cyberpunk 2077 at native 4K/UHD with Ultra settings, Path Tracing, Frame Gen, EVERYTHING applied. And guess what, No AMD card can do it, meaning 24GB on 7900XTX is completely useless. 7900XTX does like 3.5 fps average with those settings.
So... is 24GB overkill or not? You're just contradicting yourself. If the 4090 can saturate games with settings high enough to use 20+ GB, then 24GB is, by definition, not overkill.
VRAM never made a GPU, and a lot of VRAM won't magically futureproof a lacking GPU, which is why the Radeon 6000 series aged badly even tho it had plenty of VRAM. Can't do RT, yet many new games have forced RT elements now, drowning those AMD cards.
VRAM has destroyed GPUs when they are inadequately equipped. It became an issue for the 8000 series way back in the day, and more recently the GTX 600 series and then the 4GB Maxwells. It was also an issue on the infamous 4GB Fury cards. Even on low-end cards, the RX 5500 was kneecapped by a combo of a small VRAM buffer and an x8 PCIe bus that, when put in Gen 3 systems, resulted in significant performance issues.

8GB belongs on sub $200 cards, at best.
This is what Radeon 8000 series are trying to fix. Vastly improved RT perf.

What is great for longevity tho, is upscaling, and DLSS is the best one today, by a big margin. Both in terms of visuals and support, 600+ games have DLSS now. Upscaling with built-in AA and sharpening is replacing 3rd party AA already. TAA is trash.

DLAA is the best AA solution today.
DLSS is the best upscaler today.

And this is why Nvidia has 90% GPU marketshare now. Feature-wise, AMD is lagging far, far behind. Something that should improve, hopefully, in 2025+ starting with the Radeon 8000 series.

FSR 4 should launch at CES 2025 along with Radeon 8000 series.
None of this has anything to do with the 8GB limit, so imma ignore it.
 
So... is 24GB overkill or not? You're just contradicting yourself. If the 4090 can saturate games with settings high enough to use 20+ GB, then 24GB is, by definition, not overkill.

VRAM has destroyed GPUs when they are inadequately equipped. It became an issue for the 8000 series way back in the day, and more recently the GTX 600 series and then the 4GB Maxwells. It was also an issue on the infamous 4GB Fury cards. Even on low-end cards, the RX 5500 was kneecapped by a combo of a small VRAM buffer and an x8 PCIe bus that, when put in Gen 3 systems, resulted in significant performance issues.

8GB belongs on sub $200 cards, at best.

None of this has anything to do with the 8GB limit, so imma ignore it.

24GB is overkill. Even with the best GPU today, you need to use settings that are too high for the GPU to handle, and even then it's only at 20GB.

You know games work even if they don't run the Ultra preset, right? Tons of settings in the Ultra preset eat a lot of VRAM while providing the user with NOTHING but SMEAR (motion blur and DoF) and FOG, limiting visibility. Even with a 4090 I disable crap like this.

Fact is that the 4060 with its 8GB outsold the entire Radeon 7000 lineup multiple times over. So you can vote with your own wallet. People don't care.

Again, 60% of PC gamers use 1080p or less. 98% use 1440p or less. They don't need 24GB VRAM. It is as simple as that. Also, RTX owners have DLSS to improve longevity, which works great and replaces AA completely.

Proof: https://www.rockpapershotgun.com/outriders-dlss-performance

4K DLSS looks better than 4K Native here with 75% higher perf.
I rarely play any games without DLSS or DLAA today. Pretty much all new games have DLSS/DLAA, and it beats native and any other upscaler with ease, while lowering VRAM usage.

So you can scream all you want, people vote with their wallets. You don't decide what people buy and 8GB VRAM is plenty for the majority of PC gamers. Many PC gamers don't touch new and demanding AAA games running on ultra preset anyway. They could not care less.

Why do you care if people buy an 8GB GPU? Like seriously? I bet you own an AMD GPU.
 
24GB is overkill. Even with the best GPU today, you need to use settings that are too high for the GPU to handle, and even then it's only at 20GB.

You know games work even if they don't run the Ultra preset, right? Tons of settings in the Ultra preset eat a lot of VRAM while providing the user with NOTHING but SMEAR (motion blur and DoF) and FOG, limiting visibility. Even with a 4090 I disable crap like this.

Fact is that the 4060 with its 8GB outsold the entire Radeon 7000 lineup multiple times over. So you can vote with your own wallet. People don't care.

Again, 60% of PC gamers use 1080p or less. 98% use 1440p or less. They don't need 24GB VRAM. It is as simple as that. Also, RTX owners have DLSS to improve longevity, which works great and replaces AA completely.

Proof: https://www.rockpapershotgun.com/outriders-dlss-performance

4K DLSS looks better than 4K Native here with 75% higher perf.
I rarely play any games without DLSS or DLAA today. Pretty much all new games have DLSS/DLAA, and it beats native and any other upscaler with ease, while lowering VRAM usage.

So you can scream all you want, people vote with their wallets. You don't decide what people buy and 8GB VRAM is plenty for the majority of PC gamers. Many PC gamers don't touch new and demanding AAA games running on ultra preset anyway. They could not care less.

Why do you care if people buy an 8GB GPU? Like seriously? I bet you own an AMD GPU.

This sounds like the Mac argument for RAM from the last few years. And what did they end up doing? 16GB across the board.

I don't care if people buy an 8GB GPU. But there's just no compelling case for it anymore above $250. Even the B570 will have 10GB @$220. I'm sure the 5060 will perform better and have better drivers, but for what price?

The RX 6800 can be readily had for $350 with a full 16GB and blow the doors off a 4060 (and 4060 Ti). Yes, I know it uses more power.

I do own an AMD GPU. And I had Nvidia before it. Loved my GTX 1070, but my 6800 XT tripled its performance. Agree about AMD 7-series though. I want a 7900 XT but can't justify it above $500. Can't justify the price of any Nvidia option. Maybe the 4070 Super @$500.
 
Probably with some new proprietary tech to make loyal followers prematurely discard their GPUs, paying extra for the supposedly advertised "premium product".

Ironically, 8GB of VRAM can hinder the performance of fake frames in many games. Huang always outdoes himself when it comes to manipulation. "Just buy it"
 
The gap between the 5080 and 5090 is huge (if these specs are accurate)! The 5080 reportedly has half the cores and half the VRAM of the 5090. Looks like Nvidia is at it again, positioning the xx90 "Titan" class GPUs for gamers, just like they did with the 4090. Nvidia being Nvidia, still sticking with 12GB of VRAM for the 5070, while the 5080 gets "only" 16GB.

Who’s brave enough to predict the MSRP for the 5090?

I’m gonna say either $1999 or $2499 at least. As for the 5080 - $1299
 
I'm surprised, I keep getting told that 8GB is plenty for games and devs are just lazy. Why would Nvidia keep wasting money on more VRAM?

Do I even need the /s?
I don't even know why they chase higher performance overall.
Just downgraded from 3070 to 2070 🤷🏻‍♂️
 