Nvidia tried to bury early reviews of the 8GB RTX 5060 Ti, but now the truth is out – and it's not pretty. With outdated specs and poor value, this GPU leaves a lot to be desired in 2025.
TechPowerUp has several of them, clamoring that it's bad game developers' fault that 8GB cards are not enough.

I still hear fanboys saying that 8GB is enough.
Why so? 2 extra GB either way, but totally opposite result?

Planned obsolescence of the 3080 10GB (as even 12GB has much better longevity).
Lazy or not, a decade of VRAM stagnation will cause issues with game development. There is only so much optimisation you can do for textures until you hit a brick wall. Outside of maybe a 5030, 8GB should not be a thing anymore.

Just can't wait for the RTX 5060 reviews.
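For context on that texture "brick wall", here is a rough back-of-the-envelope sketch (mine, not from the comment above) of how quickly 4K texture sets eat VRAM. The compression format, maps per material, and material count are illustrative assumptions, not measurements from any game.

```python
# Rough sketch of texture VRAM pressure on an 8GB card.
# Assumes BC7 block compression (~1 byte per texel), a full mip chain (~1.33x),
# and a hypothetical number of resident materials -- illustrative only.

def texture_mib(resolution: int, bytes_per_texel: float = 1.0, mip_factor: float = 1.33) -> float:
    """VRAM footprint of one square texture, in MiB."""
    return resolution * resolution * bytes_per_texel * mip_factor / (1024 ** 2)

maps_per_material = 3        # e.g. albedo + normal + roughness/metallic
materials_resident = 100     # hypothetical count of unique materials kept in VRAM

per_material_mib = maps_per_material * texture_mib(4096)   # ~64 MiB per 4K material
textures_total_gib = materials_resident * per_material_mib / 1024

print(f"One 4K BC7 map:  {texture_mib(4096):.1f} MiB")
print(f"Textures alone:  {textures_total_gib:.1f} GiB")
# ~6.2 GiB for textures alone; add geometry, render targets, a ray-tracing BVH
# and the framebuffer, and an 8GB card is already forced to stream or drop quality.
```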
The entry-level cards were never meant to play current games at high/ultra settings.
Looking back, this has always been the case since the GeForce 2.
Entry level was always medium/low settings or resolution. DLSS is just a nice trick, but it won't help much at this level.
And game developers are getting really lazy, too.
"basically" is not equal. we gamers generally look at optimal settings to achieve playable FPS with the highest possible image quality that we subjectively accept. some prefer high 100+ FPS (like me), and some want better image quality at maybe 60FPS.TechSpot, why don't you do benchmarks at DLSS Performance instead of Quality? The Transformer model, whenever it can be used, offers basically the same image with much better fps at Perf.
TechSpot, why don't you do benchmarks at DLSS Performance instead of Quality? The Transformer model, whenever it can be used, offers basically the same image with much better fps at Perf.
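For readers wondering what the Quality/Performance presets actually change: a small sketch of the internal render resolution at each preset. The per-axis scale factors are the commonly documented DLSS values; the resulting fps gain is game-dependent, so treat this as pixel-count arithmetic only, not a benchmark.

```python
# Why DLSS Performance is so much faster than Quality: it renders far fewer
# pixels before upscaling. Scale factors are the commonly documented per-axis
# values for each preset; actual fps depends on the game.

PRESET_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    s = PRESET_SCALE[preset]
    return round(out_w * s), round(out_h * s)

for preset in ("Quality", "Performance"):
    w, h = internal_res(3840, 2160, preset)
    print(f"{preset:>11}: renders {w}x{h} ({w * h / 1e6:.1f} MP) -> upscaled to 4K")

# Quality renders ~2560x1440 (3.7 MP), Performance ~1920x1080 (2.1 MP):
# roughly 1.8x fewer pixels shaded per frame at Performance.
```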
TechPowerUp has several of them, clamoring that it's bad game developers' fault that 8GB cards are not enough.
Like... did these people do the same thing with 2GB cards? Or 512MB cards? IDK what it is about 8GB, but some people are determined to clutch their pearls over it.
There are several games that use over 10GB of VRAM at reasonable settings but stay under 12GB.

Why so? 2 extra GB either way, but totally opposite result?
Previous developer optimization is largely a myth.

Actually, I would have, but no games came out that wouldn't run on cards just a few years old; it's only with UE5 and RTX that VRAM has suddenly become an issue. Quite frankly, at 1080p every game should fit into 6GB of VRAM at medium/high settings, 8GB at the most.

To your point, 2GB cards first released in 2010 for AMD and 2012 for Nvidia, and 2GB remained the standard until Pascal in 2016 brought 3GB and 6GB cards. Remember, the GTX 660, 660 Ti, 760, and 960 were standard 2GB cards, and until about 2020 you could run every new game at 1080p on a 2GB GTX 960 at medium/high settings; on the older 660-760 cards, medium was fine. The same thing happened in 2005: ATI and Nvidia went 512MB on the high end, the midrange got it with the 8800 GT, and 512MB remained viable until about 2013/2014 for running games at medium settings.

Now why was this? Because game devs programmed efficiently, they optimized properly, and they weren't as rushed. Today the publishers rush them, and they don't optimize.