Zotac confirms 32GB RTX 5090 is on the way, 16GB RTX 5070 Ti and 5060 Ti could follow

Nvidia: "Here’s an 8GB RTX 5060, good luck running Starfield in 4K."
Intel and AMD: "Here’s 12GB for $250, and yes, you can still have ray tracing."
 
This sounds like the Mac RAM argument from the last few years. And what did Apple end up doing? 16GB across the board.

I don't care if people buy an 8GB GPU. But there's just no compelling case for one above $250 anymore. Even the B570 will have 10GB at $220. I'm sure the 5060 will perform better and have better drivers, but at what price?

The RX 6800 can readily be had for $350 with a full 16GB and blows the doors off a 4060 (and 4060 Ti). Yes, I know it uses more power.

I do own an AMD GPU, and I had Nvidia before it. Loved my GTX 1070, but my 6800 XT tripled its performance. I agree about AMD's 7-series, though. I want a 7900 XT but can't justify it above $500. Can't justify the price of any Nvidia option either. Maybe the 4070 Super at $500.
Surely blows the doors off it - https://www.techpowerup.com/review/silent-hill-2-fps-performance-benchmark/5.html

The 6000 series is old and dated tech that can't do RT well, which is why pretty much all new games perform miserably on it. You know RT elements are forced in tons of games today, right? You've got 16GB, but your GPU's capabilities and features are lacking big time, making it useless anyway. I would not even pay 200 bucks for a 6800 XT in 2024; it's soon five-year-old tech, and the Radeon 8000 series launches in a month with vastly better RT for 400-500 bucks brand new. Probably with FSR 4 hardware-locked too. AMD is going the AI route as well with FSR 4, and old cards won't be supported.

The 4070 SUPER 12GB beats the 6800 XT in everything with ease and has superior features across the board. Nvidia just manages VRAM better: better compression, a better cache hit/miss system, which is why AMD generally needs more VRAM than Nvidia. The 6700 XT 12GB aged worse than the 3070 8GB even though they both launched at 480-500 USD, and DLSS/DLAA beats FSR every single time, which matters far more for longevity than VRAM. All new demanding games have DLSS. Also, the 3070 can actually do RT somewhat decently, which is why its performance doesn't drop hard in UE5 games the way it does on Radeon 6000 series cards.

So sure, enjoy your 16GB and talk crap about 8GB cards; they beat your card in tons of titles anyway, even with half the VRAM and half the power usage, plus superior upscaling with widespread support - 600+ titles now ;)

At CES 2025 Nvidia will be talking about Neural Texture Compression, which will deliver better textures at much lower VRAM usage. This is Nvidia's next big feature, and AMD has already tried to copy it - https://www.tomshardware.com/pc-com...n-rivals-nvidias-texture-compression-research
 
I will not defend Nvidia, AMD, nor Intel. I will defend my wallet. I'll buy the best possible GPU in the 500 to 700 USD/euro price range to play at 1440p native res, and that's all. My 3070 Ti is still good enough, so I'll enjoy reading reviews and wait for sales season xD
I know like 10 people with 8GB GPUs who don't have any issues in the games they play. They are satisfied and happy. Several of them are still using a 3070, after like 5 years now.

Most people here don't have a clue what the average PC gamer is playing. Hint: it is not brand-new AAA games running at ultra settings in native 4K/UHD with path tracing. Hahahaa.

90% of the most popular games on PC barely use any VRAM - eSports titles etc. Most successful games are made for the masses, meaning no developer is building their game to require 16-24GB of VRAM, or they won't sell any copies, because that targets only 10% of PC gamers.

Even the PS5 and XSX have 16GB of total RAM shared by the entire system. OS, game, and graphics together use less than 16GB.

According to the Steam Hardware Survey, 75% of PC gamers have 8GB of VRAM or less, and most of these people also play at 1080p or lower.

I would take a 3070 8GB over a 6700 XT 12GB any day of the week. A faster GPU (that actually doesn't take a huge performance hit when RT elements are forced) plus DLSS with widespread support is more useful than 4GB of extra VRAM. Even without upscaling, the 3070 beats the 6700 XT in 99% of games, including at 1440p, which neither really handles at native anyway.

Hell, even at 4K the 3070 wins... This is at ultra settings.


So much for VRAM...

Upscaling > VRAM for longevity, and DLSS beats FSR every single time, both in visuals and in support. 600+ games have DLSS now, and the number is rising every week. Pretty much all new games have DLSS. DLSS Quality at 1440p is GREAT. FSR is only somewhat usable at 4K and still loses to DLSS there. FSR has issues in motion: tons of shimmering and artifacts. DLSS is clean and can easily replace most AA solutions. If you prefer top-tier visuals, DLAA exists and beats any other AA solution.

Disabling garbage like motion blur, DoF, and fog lowers VRAM usage too, and increases performance and visibility = pure win. Lowering shadows gives a huge fps boost with little visual difference and much lower VRAM usage. There are plenty of ways to lower VRAM usage while losing out on NOTHING, which is why most VRAM testing is flawed.
 
Wow dude, I stated my point of view quite clearly and you still had to write a small essay to prove that you are a true fan. I get it. I have my own criteria for purchasing a GPU, thank you xD
 
True fan? More like a realist who looks at actual performance. Numbers don't lie. Reality hurts, I see. I use an AMD CPU + Nvidia GPU because they offer the best available overall in terms of drivers, performance, visuals, and features in general. This is why Nvidia has like 85-90% of the dGPU gaming market now.

AMD GPUs are in a horrible state right now and AMD doesn't seem to care. Probably because it's a niche market for them. No money to be made.

Yes, you buy AMD, like the other 10% of people who can't afford Nvidia. Nvidia doesn't care about the low-end market, so it's fine.

Let's hope Radeon 8000 won't be shite, or AMD might ditch gaming GPUs completely. They make no money on them. Which is pure fact; go read their latest financial report if in doubt ;)

What is your criterion? That the price is below 200 dollars? The best-selling AMD GPUs of the last 10 years are the RX 480, 470, 580, and 570, because they were dirt cheap. 95% of AMD GPU buyers won't even consider a GPU priced above 300-400 dollars, and you expect AMD to care and spend a ton of R&D money on gaming GPUs? Ha.

AMD makes great CPUs. It is what they do; 90% or more of their income comes from that market.
GPUs are an afterthought, a niche market for them, and they spend more R&D on AI and enterprise than on gaming.
 
Starting at £1650/£1899 top end
Yeah, expect 2000+ USD, maybe even 2500. It is the 5080 times two, after all: twice the core count, twice the memory and bus width. However, 550-600 watts will scare many away (they should have used TSMC 3nm).

5080 will be 1199

5070 Ti probably 899

And the 5070 at 499-599 (only 6,400 cores - room for a 5070 SUPER)

I expect the 5000 series to be a forgettable gen: delayed, the same 5nm-class tech as the 4000 series, and no competition at the 5070 Ti and up.

Nvidia has no reason to sell them cheap and is mostly focused on AI anyway.

Hopefully the Radeon 8800 XT can compete with the 5070. The 5070 Ti and up will have NO COMPETITION at all, and that usually doesn't mean low prices ;)
 
Starting at £1650/£1899 top end

I think it will be more. As of today, a Palit OmniBlack 4090 is £1899. The cheapest new 4090 on NewEgg is $2899.

My guess is £2499 MSRP in the UK, or $2500 in the USA.

Of course, you'll need to pay more than that if you want anything but the Founders Edition... (which will be out of stock)
 

yeah tbf that doesn't sound outlandish, oh deeeeer
 
Surely blows the doors off it - https://www.techpowerup.com/review/silent-hill-2-fps-performance-benchmark/5.html
Here's the average FPS over 25 games:


Yes, at all 3 resolutions, I would indeed say it "blows the doors off" the 4060, and handily defeats the 4060 Ti. I chose the 7900 GRE review for the sake of something recent.

As for the 4070 Super, I agree it's the performance choice over the 6800 XT. It's newer, more efficient, performs close enough in raster, and offers far better ray tracing. It's also $200 more. And it's 12GB, not 8GB.

Looking at Alan Wake 2 and Jedi: Survivor, two demanding 2023 titles, the 6800 XT STILL comes out on top of the 4060s (and ekes out a win over the vanilla 4070!).

You're right: it's old tech, and it absolutely should be replaced by the more complete, five-years-newer 8800 XT. Can't wait for it. Though I predict both the 8800 XT and Nvidia's new releases will still be priced too high and perform too low. I hope I'm wrong.
 
True fan? More like a realist who looks at actual performance. Numbers don't lie.

Funny, I have an Intel CPU and an AMD GPU. (They were about half the price of the comparable AMD CPUs and Nvidia GPUs.)
 