Nvidia GeForce RTX 3050 6GB Review: Quiet Release, Dodgy Name

The RX 6400 is also available in half-height, single-slot form factors for ultra-small cases/spaces. But it lacks video encoders for those who need them.
Not bad! Looks like (testing in Windows) the GTX 1650 just edges it out; but Mesa (the Linux 3D driver stack) support for AMD chips is amazing, and consistently delivers higher FPS than in Windows. I don't have the hate for Nvidia's binary drivers that seems to be common in the Linux community, but if I were in the market now, with the 1650 and the 6400 so close in performance, I'd probably go for the 6400 just for the built-in Mesa support rather than having to install aftermarket drivers. (Although I have dabbled with CUDA now and then...)

I for one don't care about the video encoder; I don't routinely stream, and I've got 6C/12T on my desktop, so I imagine I have plenty of spare CPU power for Steam Remote Play without NVENC.
 
What y'all seem to have missed is the fact that this card is a far better offering than that dreadful excuse for a "gaming card", the GTX-1630 (4GB), which Huang tried to foist on people for $190 at release!

As for the GTX-1650s & 1660s, they were all GDDR5 when released, then moved to GDDR6. The later AIB cards all required external power, with the 1660's power capped at 135 watts (or thereabouts).

As for it "not being able to play AAA 2024 titles": are we entirely sure they're not just bulking those titles up a bit so as to throw another grand or two Nvidia's way?

Besides, if you're only interested in pitching some smut over to the TV, or surfing the web for more, a GT-1030 will do just fine. Those you can get in low profile.

But then I guess it's fair to say, "I amuse easily".
 
Embarrassingly, I got my GTX 1650 when prices were very high; I ended up paying about $200 for it, used. No regrets, since I didn't want to wait (I had an onboard Ivy Bridge GPU which was not going to cut it, and my old GTX 650 would not transfer over due to a lack of power connectors). When there was a temporary drop in cryptomining demand about 6 months later, it was down to $120 new, though. Ah well.

Anyway... the RTX 3050 6GB is no 4090. But I had a go at playing CP2077 and The Last of Us Part I on my 1650; CP2077 is fine on it, while TLOU Part I has it at about its limit: on low it runs about 30 FPS (with lows in the low 20s), I imagine mainly due to lack of VRAM. (Is this a Wine/vkd3d thing, or does TLOU Part I just report usage like this? On this 4GB GTX 1650, the graphics settings menu claims it's using 6 out of 12GB of VRAM.) But the 3050 6GB has about double the performance across the board (and of course has raytracing hardware), plus 50% more VRAM, so if the 1650 can just barely handle essentially the most unoptimized and bloated game on the market, the 3050 will have some headroom to last a while. Not a lot of headroom, but some.

Just for the LOLs, I attempted to fire it up on a Tiger Lake i3-1115G4 notebook (48 EU / 384-shader integrated Intel GPU). It's a rather slow GPU, but with 20GB of RAM in there (and since it's an integrated GPU, it can get large amounts of VRAM), I was curious whether having something like 16GB of VRAM available to the game would make up for the GPU being a bit slow. I didn't find out: it sat at the spinning dogtag for a good while (while it built some shaders), then some shader or other actually hung the GPU. The GPU hang didn't lock up the desktop permanently; the game hung, but I could alt-tab out of it (the kernel logged some unhappy messages about the GPU being hung for over 10000ms and having to reset it).
 
What y'all seem to have missed is the fact that this card is a far better offering than that dreadful excuse for a "gaming card", the GTX-1630 (4GB), which Huang tried to foist on people for $190 at release!

LOL, I believe I excised that card from my life experience, as it sold for 2x+ the price it should have. For me, the 1050, 1050 Ti, 1650, 1650 D6, 3050 6GB, and RX 6400 are the recent choices for slot-powered gaming.

The 1030 was just too slow, though at least it was only $80. But the 1630 for $160+?? Released 4+ years later, yeah, it should be 2-3x faster than the 1030, but at maybe $10-20 more! Though when you see that the 1650/D6 were going for well over $200 at the same time, the 1630's insane prices are somewhat explained.

So I take your point, this is at least vaguely reasonable at $180.
 
@Lew Zealand OK, I have an EVGA GTX-1050 Ti (4 GB) and an Asus "TUF" GTX-1650 (GDDR6), and neither card is slot powered.

The 1650 cost me $210 US at the very tail end of "the great video card famine". I still had it unopened when Newegg managed to obtain Asus "TUF" GTX-1660 Ti cards for about $220 US. As I had held the 1650 for more than 30 days, Newegg refused to swap it out for the 1660. I said "f**k it" and bought the 1660 outright. Newegg did compensate me with a price match and a $10 gift card. After they sold out of those two shipments, the prices went back up to about $270!

Over the years I've built up a collection of "relics" (or e-waste, if you prefer). One such novelty is a Gigabyte P45 board ("Performance", with NO IGP) with an Intel Core 2 Duo E7300. The 32-bit XP was swapped out for Windows Pro 64-bit, and an SSD and 8 gigs of RAM were installed. (Keep in mind this is a SATA 2 rig.)

OK, I don't game at all, and confine my online activities to searching for "erotic art", arguing in forums (some would say "trolling"), and being talked down to by the new-wave AMD crowd, as I build with Intel. Then there's the whole paying-my-bills-once-a-month thing.

Back to the P45 and its current lowly GT-1030. The card only draws 30 watts, which is a big plus. I started with a GT-710 I had laying around, which wouldn't work for an hour without crashing the driver; about every hour I had to yank the monitor cable and reinsert it to get the display back. Pull the 710, insert a GT-730, which only crashed every couple of hours (same procedure required). Enter the GT-1030.

No crashes whatsoever, and the only occasional negative symptom it displays is a failure to overwrite the former page's background color without a tiny bit of perceptible lag. It drives 2K monitors just fine. It does want to argue about having to output 4K, so you just let your TV scale up from 1080p, and "all's well that ends well".

Brace yourself: outrageous and blasphemous statements incoming. Excluding any gaming use, the GT-1030 could almost be considered the "sweet spot" for the average mainstream user.

But I did see that nasty GT-1630 (did they have the hubris to call that turd a "GTX"? I can't recall) for $189.95 at the time of release. The reality, though, is that at between $100 and $125 it would have been a decent upgrade from the GT-1030 for the mainstream user.

In other news: while it may be beneath most Techspot members' dignity to consider buying an Intel CPU, let alone a two-generation-old one, Newegg has been offering the Alder Lake 12900K for $314, or less than half of list. (Which I think was somewhere around $650.)
 
So it's a good product. If you want to get upset over its name, that's on you; it really doesn't matter! And I've already seen them for $160 online.

Besides, I find the naming scheme less annoying than AMD's, with their 7xxx GPUs and their 7xxx CPUs. Can they not use a naming convention that distinguishes between CPU and GPU a bit better?

Although in both cases, if you don't do your research, you can't really blame the company if you get burned. That's on you.
 
Didn't you ever attend school? A "70" would be a "C", which equates to 3.5 out of 5 stars. How does that make you feel?
I was being less "objective" and more "subjective" in my wording all around here.

What I mean is: what is the psychological impact of a 7 compared to that of a 3.5? The 7 feels "much" higher.

I do believe I am making a relevant point here.

Now, this might be a European thing (I am from Europe), I mean a cultural thing.

How educated would you guess that I am?

Since I am not a communist, education is highly valued here where I live.
 
I was being less "objective" and more "subjective" in my wording all around here.
"5-star" ratings are numerically equivalent to 20% (or 20 points) per star. "10-star" rating systems are rarely used, one oddball being IMDB. (Rows of 10 stars simply take up too much space to be practical.)

However, cumulative ratings are pure mathematical averages over the number of each "star" rating received.
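The cumulative-average idea can be sketched in a few lines of Python (the star counts below are made-up illustrative numbers, not taken from any real product listing):

```python
# Weighted average of star ratings: sum(stars * count) / total votes.
# The counts here are hypothetical, purely for illustration.
counts = {5: 120, 4: 45, 3: 10, 2: 5, 1: 20}

total_votes = sum(counts.values())
average = sum(stars * n for stars, n in counts.items()) / total_votes

print(f"{average:.2f} stars from {total_votes} ratings")  # prints: 4.20 stars from 200 ratings
```

This is exactly the "pure mathematical average" described above; sites differ only in how they round and display the result.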

Whether I'm shopping or reading a review, I attempt to psychologically evaluate the buyer's or reviewer's abilities and/or motivations for the mark given. At one extreme is motivation by compensation for a given product, so "5 stars" is applied. At the other is the complete inability of a buyer to use and understand the intended purpose of the item; in the case of a buyer being a complete imbecile, this nets "I'd give this piece of sh!t no stars if they'd let me." After which come the realistic appraisals from knowledgeable buyers.

Then there are comparative reviews. For example, comparing a GTX-1630 to an RTX-4090 might give the 1630 1 star and the 4090 10 stars. (Assuming the 4090 didn't burn down the reviewer's house.)

As for your level of education, I'll refrain from making a guess. You might be studying for your master's in social psychology, but suck at math. Or, put differently (and more tactfully), math might not be a priority in your career objectives.

Math, however, is a concrete science, and terms like "subjective" don't apply. But "you might be overthinking the emotional implications of the star rating system", IMO, does. Not that 1- or 5-star ratings can't be emotionally impactful to any given buyer; they certainly are.

I'm still sticking to my story that 1 star equals 20/100, and 3.5 stars equals 70/100, which equals a "C" in most scholastic environments.
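The stars-to-points conversion being argued here can be sketched in Python (the letter-grade cutoffs are one common US grading scale, as an assumption; schools vary):

```python
# Convert a 5-star score to a 100-point score (20 points per star),
# then map the points to a letter grade. The cutoffs below are a
# common US scale, not a universal standard.
def stars_to_percent(stars: float) -> float:
    return stars * 20.0

def letter_grade(percent: float) -> str:
    for cutoff, grade in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
        if percent >= cutoff:
            return grade
    return "F"

score = stars_to_percent(3.5)
print(score, letter_grade(score))  # prints: 70.0 C
```

So 3.5 stars lands exactly on 70/100, i.e. a "C" on this scale, while 1 star (20/100) is a hard "F".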

PS, I didn't proofread this perhaps as well as I should have. Now it's time to dust off that intuition to figure out what I was trying to say. (y) (Y)
 