Nvidia GeForce RTX 3050 Review: Availability is not guaranteed

Microcenter had the 6500 XT available for $199 for some time, so I don't think it's fair to list it at a $270 price in the cost-per-frame chart.
I doubt we will see the RTX 3050 close to its MSRP for some time; its 8GB of VRAM, DLSS capability, and GTX 1660-like performance would probably place it above $450 initially.
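
To put numbers on the cost-per-frame complaint: the metric is just assumed street price divided by average frame rate, so the price you plug in swings it a lot. A quick sketch (the 60 fps average is a made-up placeholder, not a figure from the review):

```python
# Cost per frame = assumed street price / average frame rate.
# The 60 fps average is a hypothetical placeholder, not a review figure.
AVG_FPS = 60

for price in (199, 270):
    print(f"${price} -> ${price / AVG_FPS:.2f} per frame")

# $199 -> $3.32 per frame
# $270 -> $4.50 per frame (~36% worse from the price assumption alone)
```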
 
Normally in the UK the prices are pretty much £ = $ for GPUs. The 6500 XT is selling for £199.99 at the usual suspects (which is still overpriced for the performance), the same price they are selling the 1050 Ti and 1650 (non-Super) at. It looks like good value in that comparison...
 
Interesting card: a decent one for DLSS 2.x, at least when it comes to actually being able to enable other stuff like higher settings and ray tracing at 1080p. (I'd like to see what playing at 4K with the fastest DLSS setting looks like, as it might actually make some games sort of playable and better looking than just upscaling 1080p.)

Of course, the list of titles where this trick is even possible is tiny, and I'm not sure it will ever grow much, so it's probably not enough to justify a purchase if you could afford anything better. But, well, you won't be able to: I expect this card, even after the inevitable doubling of its MSRP, to still land at about half of what a 3060 goes for, so a lot more people will be able to afford it, at least as a placeholder while they wait for other cards to come down. (Although I'm of the opinion that they might be waiting another one or two years at best, with prices never returning to pre-pandemic levels at worst.)
 
Interesting review and conclusion, thanks.

If pricing ends up where you suspect, the 3050 looks more like a 6600 competitor than a 6500 XT competitor.

Just like the 6600 XT ended up being a 3060 competitor rather than a 3060 Ti competitor, based on real pricing.

I think it was very nice to show that the x8 PCIe bus limitation is not an issue for the 3050, rather than what you wrote in the 6600 review without testing whether there is an effect. That's definitely preferable.

Finally, like the 6600 XT, the non-XT version is limited to PCIe 4.0 x8 bandwidth. So when installed in a PCIe 4.0 system this is a non-issue, but performance-related problems could arise when installed in a system that only supports PCIe 3.0, which currently is most systems.
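
For context, the bandwidth math behind that concern, using standard PCIe figures rather than review data: a 4.0 x8 link delivers the same peak bandwidth as a 3.0 x16 link, while dropping to 3.0 x8 halves it.

```python
# Peak one-way PCIe bandwidth: per-lane transfer rate (GT/s)
# x 128b/130b encoding efficiency / 8 bits per byte x lane count.
def pcie_gbs(transfer_rate_gts: float, lanes: int) -> float:
    return transfer_rate_gts * (128 / 130) / 8 * lanes

print(f"PCIe 4.0 x8:  {pcie_gbs(16, 8):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 3.0 x16: {pcie_gbs(8, 16):.1f} GB/s")   # ~15.8 GB/s (same)
print(f"PCIe 3.0 x8:  {pcie_gbs(8, 8):.1f} GB/s")    # ~7.9 GB/s (halved)
```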

One thing I would have liked to see in the review of an entry-level graphics card is at what CPU level driver overhead is still a thing.
 
Why the hell do you use medium settings in many of these games?! LOL
To make the 6500 XT's cost per frame not look too bad?

I'm pretty sure you can easily use high or max textures in Cyberpunk with a very small performance penalty (on an 8GB GPU). Why stick to medium textures on an 8GB GPU? The funny thing is that in the RTX 3050 Ti 4GB laptop review, TechSpot tested all games on ultra/max settings to make it look bad.
https://www.techspot.com/review/2297-geforce-rtx-3050-ti-laptop/

But on the faster desktop RTX 3050 GPU with larger memory, you use medium settings. LOL It should be the other way around.

In the laptop RTX 3050 Ti review, TechSpot concluded that 4GB of memory is not enough for modern games. Obviously, when you test all your games on max settings (including demanding games like Cyberpunk), then yeah, you are going to have problems. Had you used medium settings (at least when running demanding games) like in this review, you would not have reached the same conclusion.

I feel TechSpot has been unfair toward Nvidia. RTX 3050 Ti laptops are not as bad as the review makes them look. A lot of those issues could have been fixed by just lowering a few memory-hungry settings.

On the other hand, you use medium settings on the faster GPU, the desktop RTX 3050 8GB, even in games that aren't too demanding (in Hitman, Far Cry 6, and Watch Dogs the RTX 3050 still has enough headroom for higher settings at 1080p). I mean, in Hitman at 1080p medium, the RTX 3050 got 112 fps. Seriously, why the hell would you use medium settings for this game?! The RTX 3050 8GB is not benefiting from its larger memory at those settings.
 

The GTX 1650 is around £250-£280 with most UK retailers, with the GTX 1050 Ti at £200-£220. The current retail prices in the UK make the RX 6500 XT seem like decent price-to-performance. I'm hoping the RTX 3050 isn't any more than £300; I think over that price point it loses any real-world value. It has its benefits, but it barely matches the 2060, and that was considered a bad card for the money when it launched.

Hopefully there's plenty of stock of the 3050, but let's be real: Nvidia will pump numbers out for a couple of months max, and then this will be another unicorn card with silly prices at retail and on eBay.
 
The 8GB version is like an old GTX 1070, then. Fine for 1080p.

I certainly wouldn't be hoping for many rays to be traced, but with DLSS it should still be useful.
 
How much do used GTX 1070s usually go for nowadays? That seems like another decent compromise.
I think they land at about their launch MSRP, last time I checked: $400 to $500.

So honestly, if you can start finding 3050s, you might as well buy one if the price is the same: you get almost exactly the same performance (Tech Jesus did test it against the 1070; check that review on Gamers Nexus), but it uses slightly less power, it's brand new, and it can access DLSS 2.0 and ray tracing, at least in a limited capacity at 1080p with both enabled, so it's just an overall better card to get.

But if you can't wait, or Nvidia takes 2 to 3 months to actually have regular restocks of the cards (a very likely scenario), you could just buy a 1070 and probably not regret it for a couple more years.
 
I think it's disgusting that TechSpot didn't compare the 3050 to a 6500 XT in the ray tracing tests because they didn't want to "be so cruel". You mean you don't want to see AMD get its *** handed to it? I've seen 6500 XTs on sale in shops now, and they have "ray tracing capable" written on the back of the box. It's the biggest lie I think I've ever seen written on the side of a graphics card box, and TechSpot for some reason isn't calling that bullshit out. Especially as I have no doubt TechSpot would call it out if it were the other way around. They would "be so cruel" to Nvidia; their beef with them is highly publicised.
 
The RTX 3050 actually doesn't look that bad. $370 would be a terrible price though; the 1070 was $350 back in 2016. Going backwards on price/perf is just awful. If they get it out at an MSRP of $250, it will be a slap in the face for AMD, and if it can be found for $300 or less IRL, then AMD is going to be stuck with piles of worthless 6500s.

I'm also rather disappointed in the power draw, not because it's bad, but because this is the 3050, likely meaning there will be no 75W 1650 replacement for low-profile builds. There's been no talk of an RTX 3040, so I'm likely stuck waiting another generation for an upgrade. Given the utter disappointment that was the 6500 XT, my only hope is that we get a good 75W A380 from Intel.
I think TechSpot's repeated bashing of the 6500 XT in its own review, let alone it constantly being called out in this review as a worthless pile of garbage, makes the point pretty clear.
 
Where I live in the UK you can easily buy an ex-mining GTX 1660 for about £300. These have all appeared quite recently. From the results here, if you can get a 3050 for MSRP then you will be better off. But some of the retail pricing has leaked, and this thing is going to cost considerably more than Nvidia suggests. Still, it's good to see ex-mining cards start to hit the market.

The pricing just goes to show how little confidence people have in Radeon these days. This card will likely compete on price with a 6600 or 6600 XT, and it will be much slower when not using RT or DLSS. The inflated and very fluid pricing recently has really highlighted how much people are willing to pay for cards, and people are clearly willing to pay much more for GeForce than Radeon (myself included).

Oh, also, anyone who claims this card is being bought by miners is lying. The 3050 has been gimped big time when it comes to mining; it mines at less than half the rate of a Radeon 6600 (which, by the way, is apparently a very good mining card for the money). The scalping done these days is mostly by OEMs and retailers. But if they didn't do it, then some schmuck on eBay would.
 
Tom's Hardware tested both medium and ultra settings.

Just look at Far Cry 6 running at medium settings:
the RTX 3050 beats the 6500 XT, but not by a massive margin.

Now run the game at ultra settings with the HD textures.

The RTX 3050 completely destroys the 6500 XT at those settings. In fact, even the GTX 1060 beats the 6500 XT because of its larger memory, though it loses when you use medium settings.

So I wish TechSpot had used higher settings (or at least higher-resolution textures). The RTX 3050 shines when its 8GB of memory is put to good use.
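
As a rough illustration of why texture quality is mostly a VRAM capacity question rather than a GPU speed question (generic texture math, not figures from either review): a BC7-compressed texture stores one byte per texel, and a full mip chain adds about a third on top, so a few hundred 4K textures resident at once already eat several gigabytes.

```python
# Rough VRAM cost of one block-compressed texture, including mipmaps.
# BC7 stores 1 byte per texel; a full mip chain adds ~1/3 on top.
def texture_mib(width: int, height: int, bytes_per_texel: float = 1.0) -> float:
    return width * height * bytes_per_texel * (4 / 3) / 2**20

one_4k = texture_mib(4096, 4096)
print(f"One 4K BC7 texture:    ~{one_4k:.0f} MiB")                # ~21 MiB
print(f"300 resident textures: ~{300 * one_4k / 1024:.1f} GiB")   # ~6.2 GiB
```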
 
I love the meatshielding for billion-dollar companies in the comments section here and at other tech sites. You were too nice, you were too mean. Normal people read the reviews of the card and the reviews of its competitors and can actually make informed decisions.

Which means the RTX 3050 is relatively slow but does exactly what you'd expect it to for its specs. The price may suck, but the product is competent. And the 6500 XT is garbage. It should be a whole lot better with a few design changes, but it's not. In the distant past I've owned products like this, and it sucks. One or two changes and it would be a competent product like the 3050. If it still ended up a bit slower than the 3050, who cares, as long as it's competent?

And for the record my 2 most recent GPUs are from AMD and I game on all 3 brands. Yes, including Intel.

I just want competent products, even with today's uber-craptacular pricing.
 
The 3050 seems to have almost exactly the same performance as my 1070 Ti, which is about the same as a 2060. It's interesting, but not surprising, that Nvidia has maintained the same level of performance each generation by "downgrading" the model number: 1070 Ti -> 2060 -> 3050.

For comparison, I purchased my 1070 Ti card in November of 2018 for $369, which was right at the tail end of the previous crypto boom and bust. Interesting to see that a card released in 2017 matches a card in 2022, but costs $75 more. The mid-range value consumer has really gotten the dirty end of the stick over the last several years.

Note to TechSpot: why do you not include the 10XX series in your charts anymore (a 1080 would be a good choice)? I know it is three generations old, but I am willing to bet that a lot of your readers still have them. I know I do, because the 20XX series was a complete dud and the newer 30XX series has been unobtainable due to the crypto boom.
 
A respectable 1080p card. It stays above 60 fps on average, and even in the 1% lows, for most titles tested at 1080p, and even in many 1440p titles. It looks like it's at about 1660 Ti level, which was selling for around $280 in 2019, and just below an RTX 2060, which was selling for $299 in 2019, back when GPU prices were normal. So $250 is not a terrible price considering the current market, if, of course, you can get one near MSRP.
 
Irrelevance, thy name is RX 6500 XT.

It does beg the question, though: why would anyone stick the XT moniker on that abomination? All they needed to do was sell the RX 6500 (in its current configuration) and the XT with PCIe x8.

I really hope some corporate fool at AMD was fired, because this is not the kind of galactic-level screw-up that an engineer at ATi would've made.
 