Nvidia RTX 5050, 5060, and 5060 Ti specs leaked, slight core and TDP boosts expected

Daniel Sims

Rumor mill: Following the disappointing launches of Nvidia's RTX 5090, 5080, 5070 Ti, and 5070 graphics cards, the Blackwell lineup's mainstream and entry-level products are expected to emerge soon. Unsurprisingly, they feature slightly more CUDA cores, draw more power, and upgrade to GDDR7 VRAM without increasing the VRAM pool. The exception is the 5050, which closely resembles the RTX 3050.

Prominent leaker "Kopite7kimi" recently shared specifications for Nvidia's upcoming RTX 5050, 5060, and 5060 Ti graphics cards, which the company may announce this week. Like the other Blackwell cards, the three lower-end GPUs appear to be, at best, minor upgrades over their predecessors.

The most glaring indication is that the RTX 5060 and 5050 will still only have 8GB of VRAM. Come on, Nvidia. It's 2025. The 5060 Ti will again come in a 16GB option, just as the 4060 Ti did, confirming that the flagship 5090 is the only Blackwell GPU to feature more VRAM than its predecessor.

Nvidia may promote its new mainstream products as budget-oriented options for multi-frame generation on 480Hz and 500Hz 1080p monitors. However, even at mainstream price levels, some users might find three consecutive generations of 8GB graphics cards hard to swallow, especially considering that competing options, like Intel's Arc B580, include 12GB.

The RTX 5060 and 5060 Ti each feature a few hundred more CUDA cores than their 40 series equivalents. The 5060 moves from 3072 to 3840 cores, while the 5060 Ti climbs from 4352 to 4608. The extra cores also make the cards more power-hungry, with TDPs rising from 115W and 165W to 150W and 180W, respectively.
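Taking the leaked figures at face value (they remain unconfirmed), a quick back-of-the-envelope calculation shows how modest the uplifts are. The sketch below uses only the numbers quoted above:

```python
# Rough generational deltas from the leaked specs above (illustrative only).
leaked = {
    "RTX 5060 vs 4060":       {"cores": (3072, 3840), "tdp_w": (115, 150)},
    "RTX 5060 Ti vs 4060 Ti": {"cores": (4352, 4608), "tdp_w": (165, 180)},
}

for name, s in leaked.items():
    core_gain = (s["cores"][1] / s["cores"][0] - 1) * 100
    tdp_gain = (s["tdp_w"][1] / s["tdp_w"][0] - 1) * 100
    print(f"{name}: +{core_gain:.0f}% cores, +{tdp_gain:.0f}% TDP")

# RTX 5060 vs 4060:       +25% cores, +30% TDP
# RTX 5060 Ti vs 4060 Ti: +6% cores, +9% TDP
```

In other words, the 5060 Ti's core count grows by only about six percent, and in both cases the power budget rises faster than the core count.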

Meanwhile, the RTX 5050 looks identical to the 3050, which launched in 2022. It still has 2560 CUDA cores, a 130W power draw, and 8GB of GDDR6 VRAM, while Nvidia upgraded all the other Blackwell GPUs to GDDR7. However, clock speeds for the three upcoming Blackwell cards remain unknown.

Each Blackwell graphics card released so far has provided smaller gains over its predecessor than the GPU ranked above it. The RTX 5070 performs almost identically to the 4070 Super, and sometimes worse. If that trend holds, the improvements could be even smaller, or nonexistent, for the mainstream and entry-level variants.

Despite their unimpressive specs, history suggests that the RTX 5060 and 5060 Ti might become the most popular Blackwell products, as 60 series GPUs usually dominate Steam hardware surveys. Still, the latest models might only be attractive to consumers upgrading from a card that is multiple generations old, and that's if AMD's upcoming Radeon RX 9060 and 9050 don't offer better value.

According to prior reports, Nvidia might announce the RTX 5060 and 5060 Ti this Thursday, March 13. However, availability isn't expected until next month.

 
8 GBs? Really nGreedia?
8gb is just disrespectful 😂
They're angling for the subset of the market that argues until they are blue in the face that 8GB is totally fine and developers are just lazy.
8GB is FINE for a GPU that "gets you in the door" for gaming. I don't think anyone is expecting to be playing games at max settings on an "entry level" graphics card. The thing is, it needs to be priced correctly.

Oh, who am I kidding, these will be selling for $500 and they'll be shouting at us to be thankful we got anything.
 
Why bother at this point? These cards serve no real purpose over the current gen - same **** as 2 years ago except with 2 extra fake frames in their software (and the latency that goes with that) to pretend they are better than they really are.
 
The 5060 gets the biggest boost by far in core count. It needs to launch with 3GB dies so it comes with 12GB, and it would probably be a decent card at $380. Dump the disgraceful 8GB 5060 Ti and just release the 16GB model. Or do the honourable thing and rename the 5070 as the 5060 Ti, 5070 Ti -> 5070, 5080 -> 5070 Ti, and launch a 24GB 5080 Ti with 13K cores for $1000.
 
8GB is FINE for a GPU that "gets you in the door" for gaming. I don't think anyone is expecting to be playing games at max settings on an "entry level" graphics card. The thing is, it needs to be priced correctly.

Oh, who am I kidding, these will be selling for $500 and they'll be shouting at us to be thankful we got anything.


For the free-to-play crowd, 8GB is fine. These cards are meant for all those Steam Survey players playing CS2 and the other top 10 games that can run on potatoes.

Nvidia is just going to market these 5050 and 5060 cards for them. "It's the King's nephew**, go team green" (**nephew by naming only, the actual family member is a distant 7th cousin we can't remember anything about)

Those players will buy this up. Just like they did when they sold their 3060 Ti to get a 4060. Higher number better and all.
 
For the free-to-play crowd, 8GB is fine. These cards are meant for all those Steam Survey players playing CS2 and the other top 10 games that can run on potatoes.

Nvidia is just going to market these 5050 and 5060 cards for them. "It's the King's nephew**, go team green" (**nephew by naming only, the actual family member is a distant 7th cousin we can't remember anything about)

Those players will buy this up. Just like they did when they sold their 3060 Ti to get a 4060. Higher number better and all.
This is just my opinion, but there really haven't been many good game releases in the last 5 years. I would point to BG3, CP2077, and SM2 as the notable ones. Just look at the Steam charts: most of the top played games are basically 10 years old or older, and Skyrim still holds a significant place on those charts.

I think we reached a point during the PS4/XBone era where graphics were good enough to essentially become timeless. If you just want to play some Skyrim or BG3, or basically any entry-level gaming, I think plenty of people would be thrilled with a $100 8-gig card. I think people are getting pissed about $1000 GPUs with 16 gigs of RAM. iGPUs are so good these days that there isn't any point in paying for a discrete card outside of the desire to do some casual gaming.

I'm speculating here, but I wouldn't be surprised if an 8GB 5060 could beat a 1080 Ti. I ended up replacing my 1080 Ti with a 6700 XT because I had to; it died in the middle of the shortage. But what am I going to do if my 6700 XT dies tomorrow? I'm either going to have to pay scalper pricing for a 9070 or make do with the Arc iGPU on my 12th gen Intel laptop. Which, to be fair, does pretty well for on-the-road gaming in hotel rooms and whatnot. I've considered eGPUs for it, but then I see 7600M eGPUs for $600-700 and say "I'm not paying for that". Not kidding when I say I was looking at an Asus laptop today with a 7700S 8GB in it for $900, because I can get a whole Ryzen laptop with a GPU for less than an eGPU and dock would cost.

If someone could make a decent 8GB eGPU for $200 that I could stick in my laptop bag for road trips or when I'm working out of town, I'd be thrilled.
 
I've considered eGPUs for it, but then I see 7600M eGPUs for $600-700 and say "I'm not paying for that". Not kidding when I say I was looking at an Asus laptop today with a 7700S 8GB in it for $900, because I can get a whole Ryzen laptop with a GPU for less than an eGPU and dock would cost.
If someone could make a decent 8GB eGPU for $200 that I could stick in my laptop bag for road trips or when I'm working out of town, I'd be thrilled.

A hub/dock like that without the GPU would be $150 at a minimum considering it has to reliably handle data, ports, and power. Add $300 for a modern x60 class GPU and then you’d still have to pay for the ancillaries like R&D, Marketing, Logistics, and Insurance. It sucks but $600 is about as cheap as those discrete docks are gonna get.
 
A hub/dock like that without the GPU would be $150 at a minimum considering it has to reliably handle data, ports, and power. Add $300 for a modern x60 class GPU and then you’d still have to pay for the ancillaries like R&D, Marketing, Logistics, and Insurance. It sucks but $600 is about as cheap as those discrete docks are gonna get.
It doesn't have to be that way; look at M.2. There is no reason we can't have external M.2 ports, and an x4 5.0 port would be overkill for those things. Everyone is so obsessed over USB4 and Thunderbolt whatever-we're-on-now that people are forgetting the easy solution. Most laptops have two x4 M.2 slots; just make an external PCIe connector with power and you don't have to worry about all the USB4 nonsense, using the right cables, and having the correct dock.

I honestly think Framework, the laptop company, is basing their "graphics modules" on this concept.
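As a rough sanity check of the "x4 5.0 would be overkill" point above, here is the nominal per-lane PCIe math (standard published per-lane rates, purely illustrative; none of these figures come from the comment itself):

```python
# Approximate usable PCIe bandwidth per lane, one direction, in GB/s
# (after 128b/130b encoding overhead). Nominal figures, for illustration.
PER_LANE_GBS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBS[gen] * lanes

# An M.2 slot exposes an x4 link; compare against the x8 Gen 4 link
# that some current mid-range desktop cards (e.g. the RTX 4060) use.
print(f"x4 4.0: {link_bandwidth('4.0', 4):.1f} GB/s")  # ~7.9 GB/s
print(f"x4 5.0: {link_bandwidth('5.0', 4):.1f} GB/s")  # ~15.8 GB/s
print(f"x8 4.0: {link_bandwidth('4.0', 8):.1f} GB/s")  # ~15.8 GB/s
```

An x4 Gen 5 link matches the x8 Gen 4 interface that cards like the RTX 4060 already use, so raw bandwidth would not be the limiting factor for an entry-level M.2-based eGPU.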
 
It doesn't have to be that way; look at M.2. There is no reason we can't have external M.2 ports, and an x4 5.0 port would be overkill for those things. Everyone is so obsessed over USB4 and Thunderbolt whatever-we're-on-now that people are forgetting the easy solution. Most laptops have two x4 M.2 slots; just make an external PCIe connector with power and you don't have to worry about all the USB4 nonsense, using the right cables, and having the correct dock.

I honestly think Framework, the laptop company, is basing their "graphics modules" on this concept.

OcuLink is the closest thing we have to externalized M.2, and it comes with its own issues. I don't think anyone would want to lose out on hot-plugging, PD (good luck powering a laptop via OcuLink or M.2, you're talking about making a major change to the PCIe standard in order to accomplish this), or the ancillary DP/USB connectivity that is already built into USB4/TB for a $50 savings at best (that's not even a 9% discount off a $600 eGPU), given the fact that they're already plugging in a USB4 cable anyways for power/hub (and you can thank the EU for basically mandating USB-C).

As for graphics modules, they're cool, but proprietary.
 
OcuLink is the closest thing we have to externalized M.2, and it comes with its own issues. I don't think anyone would want to lose out on hot-plugging, PD (good luck powering a laptop via OcuLink or M.2, you're talking about making a major change to the PCIe standard in order to accomplish this), or the ancillary DP/USB connectivity that is already built into USB4/TB for a $50 savings at best (that's not even a 9% discount off a $600 eGPU), given the fact that they're already plugging in a USB4 cable anyways for power/hub (and you can thank the EU for basically mandating USB-C).

As for graphics modules, they're cool, but proprietary.
Things being hot-swappable is a relatively new thing. The idea that we should sacrifice cost and compatibility to make sure something is hot-swappable is absurd. Turn your machine off, plug it in, turn it back on.

Convenience should be illegal the moment it hinders progress. Now we have several USB-C standards existing concurrently, there's no easy way to tell what works with what, and you still need a special cable; it's just usually not marked and easier to lose since it isn't easily identifiable.

It should be used for mobile devices and charging; it doesn't belong on everything between here and the sun. I guarantee we'll see talk of a USB "Type-D" by the end of the decade.
 
Things being hot-swappable is a relatively new thing. The idea that we should sacrifice cost and compatibility to make sure something is hot-swappable is absurd. Turn your machine off, plug it in, turn it back on.

Hot-swappability for eGPUs is over a decade old. And USB-C is far more compatible today than OCuLink, let alone a hypothetical M.2 external connector, so I'm not sure where that perspective comes from.


Convenience should be illegal the moment it hinders progress. Now we have several USB-C standards existing concurrently, there's no easy way to tell what works with what, and you still need a special cable; it's just usually not marked and easier to lose since it isn't easily identifiable.

It should be used for mobile devices and charging; it doesn't belong on everything between here and the sun. I guarantee we'll see talk of a USB "Type-D" by the end of the decade.

Seems a bit paradoxical, as convenience is strongly indicative of progress (In fact I'm genuinely struggling to think of any time the opposite was true to be honest). And yes, USB-C started out as a mess but it's getting better. We have cleaned it up significantly from the "USB 3.2 Gen 2x2" nonsense days. The Type-C port itself was a necessary evolution and consolidation of the classic Type-A and largely pointless Type-B ports, into a port that had the advantages of having more pins in a smaller form factor and also being reversible, thus becoming essentially ubiquitous on those merits BEFORE the EU mandated it as the sole charging port in new devices.

As for a Type-D port, given that Type-C resolved the longstanding physical and electrical issues of Types A/B, I doubt we'll be seeing a physical change to what is essentially a "solved" port any time soon, unless a major paradigm shift occurs in the industry and we wind up NEEDING a significantly thinner connector or more pins. Not saying it's impossible, just highly unlikely to happen in the near future given the fact that the Type-C port is now capable of a seriously impressive 120Gbps asymmetrically. In the meantime, Type-C will get updates, and as we are no longer updating USB-A or B, USB-C is implied in the USB4 and USB4v2 specifications.

As for cabling being an issue, it might seem annoying to need specific cables for certain bandwidth and power delivery capabilities, but that's not a problem solved by introducing a new connector. You'll still need specific cables. In fact, such a situation would only be less flexible.
 
I'm speculating here, but I wouldn't be surprised if an 8GB 5060 could beat a 1080 Ti. I ended up replacing my 1080 Ti with a 6700 XT because I had to; it died in the middle of the shortage.

The 3060 already beats the 1080 Ti in most modern DX12 games, sometimes by a wide margin. The 1080 Ti is still faster in older DX11 games.

Which means the 5060 will be considerably faster as the 4060 is already 17% faster than the 3060, and the 5060 is getting a decent core count increase over that.

The 1080 Ti was excellent but that was 4 generations and 8 years ago. As of yesterday, as it happens.
 
"the RTX 5050 looks identical to the 3050, which launched in 2022. It still has 2560 CUDA cores, a 130W power draw, and 8GB of GDDR6 VRAM"

Well, 50 is roughly 30 + (30*60%), which means the 5050 MSRP should be roughly the 3050 MSRP + 60%.
That's $250 + $150 = $400 for the 5050.

Add the gree-n/d tax on top and it's going to sell for about $4000.

Remember:
That's MSRP without going through our friends' hands:
Scalper1 (online shops, those guys pricing the "€550" RX 9070 at €903+)
Scalper2 (sick ducks buying en masse to sell overpriced on eBay)

So, $400 is now... well does it even matter at this point?

Reference for sanity:
RTX 5090 for about $800
RTX 5080 for about $550
RTX 5070 for about $400
RTX 5060 for about $200
RTX 5050 for about $ 99

There was a mythical time in history when PC Hardware would improve from gen to gen and pricing wouldn't change or would even drop.
Now, everybody is amazed by how much more performant the new GPU is, without realizing it's the old MSRP + 4 young and healthy kidneys.
 
For the free-to-play crowd, 8GB is fine. These cards are meant for all those Steam Survey players playing CS2 and the other top 10 games that can run on potatoes.

Nvidia is just going to market these 5050 and 5060 cards for them. "It's the King's nephew**, go team green" (**nephew by naming only, the actual family member is a distant 7th cousin we can't remember anything about)

Those players will buy this up. Just like they did when they sold their 3060 Ti to get a 4060. Higher number better and all.

Why would anybody spend that kind of money to play CS2? I built a computer for my nephew to play CS2; it uses an RX 480 and he does just fine at 1080p.
 