Early RTX 5060 Ti benchmarks reveal why Nvidia withheld the 8GB version from reviewers

Huh, that's interesting. I remember when GTA IV came out: max settings at 1200p used more VRAM than the flagship of the day had. It offered 1 GB, but you needed at least 1.2. Doom 3 maxed out needed 512 MB when high-end cards at the time only had 256. And then there was Crysis: the 8800 GTX couldn't max it, nor could the GTX 280; the GTX 480 could at 1080p, but you needed the 590 for 1600p.

This is all to say that what you're experiencing isn't all that uncommon; it used to be expected that you'd turn down settings on mid-range cards. I'm not defending Nvidia, I think their prices are insane, but the old expectation was that you couldn't run max settings on the flagship, let alone the mid-range.

When I had my 7600 GT, do you think I was maxing out F.E.A.R. or Oblivion? lol
But the major difference these days isn't so much just turning settings down. People say engines are bloated or unoptimized. That's not necessarily true: they're running how they were intended to run. It's just that when they were being designed 5-6 years ago, they were built around projections of the hardware that would exist. UE5 gets a lot of crap, some deserved and some undeserved. In the same way that your OS has a kernel that needs to be in RAM before you can even open an application, a game engine needs to be resident in VRAM before it can even start loading assets. Today's engines have such high minimum requirements relative to the hardware available that you have to turn settings down so far that games look worse than stuff that came out in the mid-2000s. Performance has been essentially stagnant for the last three generations.
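The kernel analogy can be put in rough numbers. A quick sketch of the budget, where every figure is made up purely for illustration (no real engine publishes a fixed "baseline" like this):

```python
# Back-of-envelope VRAM budget: a fixed engine baseline squeezes the
# asset budget much harder on small cards than on big ones.
# All numbers below are hypothetical, for illustration only.

ENGINE_BASELINE_GB = 2.5   # assumed engine/runtime overhead resident in VRAM
FRAMEBUFFERS_GB = 1.0      # assumed render targets at 1440p with post effects

def asset_budget(card_vram_gb: float) -> float:
    """VRAM left over for textures, meshes, etc. (clamped at zero)."""
    return max(0.0, card_vram_gb - ENGINE_BASELINE_GB - FRAMEBUFFERS_GB)

for card, vram in [("8GB card", 8.0), ("12GB card", 12.0), ("16GB card", 16.0)]:
    budget = asset_budget(vram)
    print(f"{card}: {budget:.1f} GB for assets ({budget / vram:.0%} of total)")
```

With these made-up numbers, the 8GB card keeps barely half its capacity for assets while the 16GB card keeps most of it, which is the shape of the problem being described.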
 
They will sell well anyway, because not many kids, or their parents, are willing to do some research before purchasing a computer. And even if they do, if their budget is limited and they don't focus much on the cases where VRAM becomes a problem, they'll just buy these GPUs anyway.

How many non-tech-savvy parents are willing to buy, say, a 6800 XT for their kids instead of a 5060, when most salesmen will tell them the 5060 is more than enough? Or they'll simply buy a console and avoid all the headaches xD

Whether we like it or not, NVIDIA is controlling the market and they can do whatever they want.

This is what I'm looking forward to: these kids will sell their 3060 or 4060 for fast money, and I can upgrade my old GPU for a bargain.
 
Hi Tom, I'll burst your bubble: performance has been stagnant much longer than that.

I tested it the other day with a Ryzen 5 1400, 16 GB of DDR4-2933, and an RX 480 8GB; every game I play was fine at 2560x1080.

My usual games are Ultimate Admiral: Dreadnoughts, WoW, Satisfactory, Cities: Skylines 2, and Hogwarts Legacy.

Sure, it's not the latest, but it all runs fine. I'd wager that for most people that system would still be plenty.
 
My bubble remains unburst. Notice how I was referring to development cycles, and to how things like Unreal Engine are having performance issues because of the hardware specs that were expected years before release? No?
 
Sure I did. They got it wrong, and that actually is their fault.
 
They're far from the only ones. nVidia promised us real-time ray tracing for all within a generation when they released the 20 series. They still haven't delivered after four generations, and I bet they love how everyone blames the developers instead of nVidia for failing to keep that promise.
 
It doesn't matter; it's the devs' job to release for the hardware that exists. The fact is 6GB and 8GB cards are still the vast majority, so that should be the target for mainstream settings.

The RTX 2060, 2060 Super, RTX 3060, RTX 3060 Ti, RX 5600, and RX 5600 XT are all still very popular, along with the RTX 2070, 2070 Super, 3070, 3070 Ti, and RX 5700 XT. I could go on; this is a dev issue.
 
In the sense that it's nVidia's job to develop hardware that can run the technology they created? Even if you take game devs out of the equation altogether, the 5060s can't run upscaling or frame generation the way nVidia intended.
 
I'm not excusing Nvidia, but I'm pointing out this is a dev issue, the same way Crysis was a Crytek screw-up.
 
Crysis wasn't a screw-up; it was a demo they made to showcase what their engine could do, in the hope of licensing it out to developers. They knew what they were doing when they released it, because they were making something that was supposed to be used in games releasing 2-4 years after Crysis came out. Crysis was an advertisement.

So when nVidia goes to Epic and says "this is where we want graphics to be in 6 years, make an engine for it," and then nVidia fails to deliver hardware that can run any of the technologies they developed, it's on them. And don't forget, people were complaining about how little VRAM the 30 series had.

And sure, many devs could have dialed this stuff back a bit and done some optimization, but they had already spent millions of dollars and years of development time making something they had every right to believe the hardware would exist to run properly.

There is stuff in development right now for next-gen consoles where 16GB of VRAM is going to be the minimum, and there is a market for low-end products that do the bare minimum. There is nothing wrong with 8GB cards, but it is unacceptable to sell them at $400+. It would be insulting to call the 5060 Ti 8GB even a 50-class card at this point.
 
The fact is, developers already knew what the most common hardware was. The GTX 1060 and RX 580 are still really common, though I believe they've passed their prime and shouldn't be a target anymore; something like an RTX 2060 or an RX 5600 XT certainly still should be. Frankly, if you design a game that only the top-end models can run, that's the developer's problem. It's the reason I don't buy any brand-new AAA games; the developers are lazy.
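The kind of defaulting being argued for, targeting the common cards rather than the top end, can be sketched as a simple preset picker. The tier thresholds here are illustrative assumptions, not any engine's real logic:

```python
# Sketch: choose a default settings preset from detected VRAM so the
# mainstream cards land on a playable tier out of the box.
# Thresholds are assumed for illustration.

def default_preset(vram_gb: int) -> str:
    if vram_gb >= 16:
        return "ultra"
    if vram_gb >= 12:
        return "high"
    if vram_gb >= 8:
        return "medium"   # RTX 2060 Super / RX 5600 XT class
    if vram_gb >= 6:
        return "low"      # RTX 2060 6GB class
    return "minimum"

print(default_preset(8))   # medium
```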
 
What grinds my gears about the 8GB version of the 5060 Ti is that, given the GPU shortages right now, there isn't even a good financial reason for it to exist. Nvidia and its board partners would make more money by just producing the higher-margin 16GB version; the cheaper 8GB version doesn't make a lick of sense to produce until the 16GB version stops selling out so fast.
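The arithmetic behind that claim is simple enough to sketch. The memory and board costs below are invented placeholders (only the two MSRPs match the real $379/$429 list prices), so this only shows the shape of the argument, not actual margins:

```python
# Hypothetical per-card margin comparison for the two 5060 Ti SKUs.
# While every die sold finds a buyer, the SKU with the higher margin
# per die is the one worth building. Costs are assumptions.

GDDR7_COST_PER_GB = 4.0   # assumed memory cost, $/GB
SHARED_BOM = 180.0        # assumed cost of die, board, cooler, etc.

def profit(msrp: float, vram_gb: int) -> float:
    return msrp - SHARED_BOM - vram_gb * GDDR7_COST_PER_GB

p8 = profit(379.0, 8)    # 8GB model at $379 MSRP
p16 = profit(429.0, 16)  # 16GB model at $429 MSRP
print(f"8GB: ${p8:.0f}/card  16GB: ${p16:.0f}/card")
```

Under these assumptions the $50 price gap more than covers the extra 8 GB of memory, so every die put into an 8GB card earns less than it would in a 16GB card.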
 
8GB isn't a problem in itself; it's a problem when paired with an upper-mid-tier GPU that sells for $379 ($719 here in Australia). If NV had called it a 5050, or even a 5060 LE (bring that naming convention back!), and priced it at $250, it would have been better received.

I was looking at 5060 Ti prices here in Oz, and surprise surprise, a lot of retailers are doing their best to hide the 8GB models among the 16GB models for maximum confusion, with "8GB" buried in the specs rather than in the item title. You can bet less informed buyers are going to make that mistake.
 
We're gonna have this odd situation where the RTX 3060 12GB will destroy the RTX 5060 8GB in performance in a few select cases.

Odd you should say that, as I have two computers (I'm budget-limited): one with a Ryzen 3600 and an RTX 4060 Ti 8GB, and an older Intel i5-3570K with an RTX 3060 12GB.

When I'm pratting about with Blender etc., I tend to throw it at the older RTX 3060 rig. It's probably not as fast, but dang, it's far more stable, and there are some things where, due to VRAM, you just get a hard no from the 4060 Ti.

I would much rather have VRAM than raw GPU grunt, but both are important.
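That "hard no" is the key point: running out of VRAM fails the job outright, while a slower GPU merely takes longer. A hypothetical dispatch rule for a two-rig setup like the one described might look like this (the helper and names are invented for illustration):

```python
# Sketch: send a render job to whichever GPU can actually hold it,
# since exceeding VRAM is a hard failure rather than a slowdown.

def pick_gpu(job_vram_gb, gpus):
    """Return the name of a GPU with enough VRAM for the job, or None.

    gpus maps GPU name -> VRAM in GB.
    """
    fitting = {name: vram for name, vram in gpus.items() if vram >= job_vram_gb}
    # Prefer the smallest card that fits, keeping the bigger one free.
    return min(fitting, key=fitting.get) if fitting else None

rigs = {"RTX 4060 Ti": 8.0, "RTX 3060": 12.0}
print(pick_gpu(10.0, rigs))  # RTX 3060: a 10 GB scene gets a hard no on 8 GB
print(pick_gpu(14.0, rigs))  # None: neither card can hold it
```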
 
I'm still tired of the lack of modular VRAM.
While I agree to some extent, we're at the point where the latency from socketed VRAM is an issue; you'd give up GPU performance to accommodate modular VRAM. But those were the days. I still have a special set of pliers for removing RAM chips from their sockets.
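To put a rough number on why a socket hurts at memory speeds, here is some back-of-envelope timing arithmetic. The trace-length detour is an assumption, not a measurement of any real board; only the signal speed in FR4 and the GDDR6 data rate are standard ballpark figures:

```python
# Rough signal-timing arithmetic behind "socketed VRAM costs performance".
# EXTRA_TRACE_MM is an assumed detour a socket/module would add.

SIGNAL_SPEED_MM_PER_NS = 150.0   # roughly c/2 in FR4 PCB material
EXTRA_TRACE_MM = 40.0            # assumed extra routing for a socketed module

extra_delay_ns = EXTRA_TRACE_MM / SIGNAL_SPEED_MM_PER_NS

# GDDR6 at 16 Gb/s per pin has a unit interval of 1/16 ns = 0.0625 ns.
unit_interval_ns = 1.0 / 16.0
print(f"extra flight time: {extra_delay_ns:.3f} ns "
      f"= {extra_delay_ns / unit_interval_ns:.1f} bit periods at 16 Gb/s")
```

Even a few centimetres of extra routing costs several bit periods at GDDR6 speeds, which is why modern cards solder the memory millimetres from the GPU; a socket would force lower clocks to keep the link reliable.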
 