The RTX 5050 is Nvidia's third 8GB GPU to launch without day-one reviews

Daniel Sims


Public drivers are now available for Nvidia's RTX 5050. Although the company confirmed that the desktop GPU will launch later this month, board partners such as Inno3D claim it's available now and have already benchmarked it. However, independent reviews have not appeared yet, suggesting that Nvidia is bypassing media outlets with yet another entry-level product. Customers interested in the $250 card currently only have preliminary benchmarks for guidance.

None of the 8 GB RTX 50 series graphics cards received proper pre-launch reviews, prompting suspicion that Nvidia attempted to hide their disappointing performance from early adopters. The strategy feels especially misleading in the case of the 5060 Ti, which comes in both 8 GB and 16 GB variants.

Nvidia only released early review samples for the 16 GB RTX 5060 Ti, which we found to be a decent GPU. The 8 GB model, however, felt outdated the moment it launched, often lagging far behind its 16 GB sibling in our benchmarks.

The situation repeated with the standard RTX 5060. While reviewers were able to acquire the card, Nvidia withheld the preview drivers. Furthermore, the company only sponsored reviews from outlets that agreed to compare the GPU's AI-generated frames against purely rendered frames from the 3060 and 2060, obscuring the actual performance difference.

In our review, not only did the $299 GPU further prove that 8 GB of VRAM is a handicap in 2025, but its performance fell short of the five-year-old RTX 3070. An early benchmark of the RTX 5050, which includes 8 GB of GDDR6 VRAM and 2,560 CUDA cores, suggests a similar outcome.

Inno3D recently shared a few synthetic and gaming benchmarks on Weibo. Although the RTX 5050 pulls slightly ahead of the 4060 in Steel Nomad, Speed Way, Time Spy, and other synthetic tests, the older GPU inched ahead in Borderlands 3, Far Cry 6, Horizon Zero Dawn, and Assassin's Creed Valhalla.

Although 8 GB GPUs can technically play almost any modern title, especially at 1080p, analysis of other cards shows that upgrading to 12 or 16 GB in the same performance tier brings 4K gaming well within reach. Our comparison with the RTX 5060 Ti delivers solid proof of this, but Intel's Arc B580 provides especially fascinating evidence.

Although finding Intel's GPU at MSRP is difficult, it launched at the same price as the RTX 5050 late last year, and its 12 GB VRAM pool propels it past the 8 GB 5060 Ti in some situations.

The laptop version of the RTX 5050 is also now available in devices starting at $999. It is expected to perform similarly to its desktop counterpart.

 
"Although 8 GB GPUs can technically play almost any modern title, especially in 1080p, analysis of other cards shows that upgrading to 12 or 16 GB in the same performance tier brings 4K gaming well within reach."

A lot of people are still using 1080p monitors.

I have to imagine Nvidia sales data tells them it's worth it to keep cranking out 8GB cards.

Otherwise, they'd sit on the shelf, rot, stores would stop ordering them and Nvidia would stop building them.
 
That's not why Nvidia does this.

Nvidia really only started skimping on VRAM when DLSS became a big deal. They know that any decent modern Nvidia GPU can keep up with new games by simply dropping to a lower DLSS setting over time as needed, as long as the GPU has enough VRAM to avoid memory-related bottlenecks. Especially when you consider that DLSS improves over time, gamers can eventually drop to an even lower DLSS setting without losing image quality. If they gave the cheaper cards sufficient VRAM, DLSS would eat into the sales of their more expensive cards.

Also, I game in 4K. I know several other gamers who also play in 4K, and others who use 1440p; I don't know anyone who still plays at 1080p. I have to imagine that most of the people who do are only playing a few casual games and aren't in the market for GPU upgrades anyway.

All in all, a GeForce RTX 3070 is faster and goes for about the same price on the used market. It's sad that Nvidia can't or won't give us something that offers more value than a used card from several years ago.
 

This story was posted today, here at this very website. Did you miss it?

Mindfactory customers are overwhelmingly choosing 16GB Nvidia and AMD cards over their 8GB versions

Some quotes:

"the more expensive models of the Nvidia RTX 5060 Ti and AMD RX 9060 XT are outselling the less powerful alternatives by an enormous amount: up to 30 times, in the case of the AMD card."

For Nvidia: "retail sales for the 16GB model are 16 times higher."

They're dribbling out 8GB cards at best and they're rotting on the shelves, unless you think there are 16-30x as many 16GB cards made as 8GB ones. Which would strongly argue against anyone "cranking" them out.
 
Next gen, they will send "advisors" with each of these cards sold just to make sure people do not say "wrong" things. And whenever you try to post something negative about them, the advisor would gently adjust your comment or review to better represent the great products Jensen generously gave us.
 