Can 2022's $300 GeForce GPU beat 2016's $600 GPU?

He was trying to make a point: in a 1080 vs. 3050 comparison, the RX 6600 wins.
True enough, at least. The AMD card is better priced and can pull more FPS.
The 3050 is a damn joke at the money asked; it's not even equal to or better than the 2060.
 
Of course the 3050 won't last as long; it's a budget card.

If you burn the cash on the top end, then you'll get a lot of life out of your hardware. It may be expensive, but you do get your money's worth, and that's coming from someone who recently replaced a 980 Ti.
 

Can you please explain how you enabled Resizable BAR on a GTX 1080? I would really like to know.

"Resizable BAR has been used for all 50 games we tested at 1080p and 1440p"
 
The 3050 was available for $249 direct from EVGA a few months ago; I would assume that's still the case…

A 3050 paired with a 12400F would make a pretty smooth entry-level gaming rig imo
 
I'm not sure that I see the point of this article because the $300 RX 6600 XT whips BOTH of these cards. It gets higher fps in Cyberpunk 2077 when set to high quality than either of these cards do at medium quality:
[charts: Cyberpunk 2077 benchmarks at 1080p and 1440p]

Death Stranding gives us a better apples-to-apples comparison because all of these cards are set to very high quality:
[charts: Death Stranding benchmarks at 1080p and 1440p]

At $300, the RX 6600 XT renders this test irrelevant because it renders the RTX 3050 irrelevant.
 
I'm not sure that I see the point of this article because the $300 RX 6600 XT whips BOTH of these cards. […]

At $300, the RX 6600 XT renders this test irrelevant because it renders the RTX 3050 irrelevant.

Not if you can read, it doesn't: "Can 2022's $300 GeForce GPU beat 2016's $600 GPU?"

Since when was a 6600 XT a GeForce GPU?

You've rendered your whole half-page post, void quote and all, irrelevant. Well done.
 
This is very interesting. It shows the generations have big improvements. I would be interested to see a similar comparison of how a 3060 or 3070 compares to the laptop version of the 3060 or 3070. How bad is laptop gaming really? I don't know if that is possible to test, though.
 
Not if you can read, it doesn't: "Can 2022's $300 GeForce GPU beat 2016's $600 GPU?"
You've just answered your own question. The comparison should have been with the RX 6600 XT because comparing it to a $300 GeForce card is irrelevant. The article should have been this:
"Can $300 today get you more performance than a GTX 1080?"
Since when was a 6600 XT a GeForce GPU?
Since never and it doesn't matter for the reasons I stated above.
You've rendered your whole half-page post, void quote and all, irrelevant. Well done.
I'm sorry, did you think I care what you think? Did you think that anyone cares what you think? Now that is an entertaining idea! :laughing:

Let's face it, who's interested in paying $300 for a card that gets absolutely stomped by the RX 6600 XT at the same price? Maybe an absolute fool, sure, but I don't think that's the kind of purchasing behaviour that should be encouraged. Usually somebody that stupid doesn't have a lot of money to flush down the Nvidia toilet, because they've already been flushing it down elsewhere.
 
Interesting comparison, but I'm going to play devil's advocate and say that the RTX 3050 held its ground reasonably well. Compare the GTX 780 and GTX 1050 Ti, which also had a two-generation gap: the former held a roughly 50% performance advantage over the latter.
 
Two generations on, I would have expected this to be a win for the 3050, but that ol' 1080 is still pretty good. Yeah, the 3050 is a joke at its price, and the 4050 needs to boost performance by 75% at the same price to be viable, but it'll still struggle against the 7600 XT.
 
So can someone explain why the comparison sites that give a ranking to each video card show the 1080 as faster/better than the 6600 XT?

Not disagreeing with the article at all, I simply want to be more informed as I'd normally look at a ranking list when considering a new card.
 
So can someone explain why the comparison sites that give a ranking to each video card show the 1080 as faster/better than the 6600 XT?
Some may do it simply on the basis of review scores, accumulated across multiple websites - if you look up the 1080 and 6600 XT in our Product Finder database, for example, you'll see that the Nvidia card has a significantly higher metascore than the 6600 XT.

But that simply reflects the fact that when the 1080 came out, it was simply an amazing graphics card, whereas the 6600 XT was good but not outstanding.

Not disagreeing with the article at all, I simply want to be more informed as I'd normally look at a ranking list when considering a new card.
I can't recommend that as being a good approach to selecting a new card. The ideal way is to set your budget cap (e.g. $400) and stick to it; then check out various stores to see what products are available for that amount, and then start to go through different reviews.
 
This is very interesting. It shows the generations have big improvements. I would be interested to see a similar comparison of how a 3060 or 3070 compares to the laptop version of the 3060 or 3070. how bad is laptop gaming really? I don't know if that is possible to test though.

Actually, there are YouTube tests doing that. Sure, it's not 100% apples to apples, but the laptop 3060 in particular is very competitive with the desktop 3060. I'm in the process of trying to get a new company laptop with a 3060, so I had to inform myself a little :D Also, the laptop 3060 beats the desktop 1070 Ti by some 40%, in case someone's wondering ;) Yes, the old desktop this would replace has a 1070 Ti.
 
IMO, the lack of real progress on this front is due to Nvidia's focus on ray tracing. The 2000 series seemed to focus only on RT, with little progress in raster performance. The 3000 series saw progress in rasterization, but again most of the focus was on RT, and then crypto totally screwed up pricing.

AMD mostly ignored RT and focused on turning its barely competitive Vega cards into serious raster GPUs that could match or beat Nvidia. That is how I think we got to today's state, where AMD GPUs are better in terms of price/performance and Nvidia seems to have lost its edge in the low/mid range. Though many people still go Nvidia because AMD is still the underdog for your average consumer. It will be interesting to see if AMD can change that perception like they did with their Ryzen CPUs.
 
Of course the 3050 won't last as long; it's a budget card.

If you burn the cash on the top end, then you'll get a lot of life out of your hardware. It may be expensive, but you do get your money's worth, and that's coming from someone who recently replaced a 980 Ti.

Well, it mostly depends on which generation you're betting on... Kepler GPUs aged somewhat poorly, Maxwell was fine, and then we have Pascal, which, even lacking asynchronous compute, aged quite well, besides increasing performance A LOT in a single generation while maintaining efficiency.

It'll take quite some time for Nvidia to launch another great microarchitecture that gives a lot of value :/
 
The 3050 was available for $249 direct from EVGA a few months ago; I would assume that's still the case…

A 3050 paired with a 12400F would make a pretty smooth entry-level gaming rig imo
I'd say it's only worth picking the 3050 if it's 20% cheaper than the 6600...
 
IMO, the lack of real progress on this front is due to Nvidia's focus on ray tracing. The 2000 series seemed to focus only on RT, with little progress in raster performance. The 3000 series saw progress in rasterization, but again most of the focus was on RT, and then crypto totally screwed up pricing.

AMD mostly ignored RT and focused on turning its barely competitive Vega cards into serious raster GPUs that could match or beat Nvidia. That is how I think we got to today's state, where AMD GPUs are better in terms of price/performance and Nvidia seems to have lost its edge in the low/mid range. Though many people still go Nvidia because AMD is still the underdog for your average consumer. It will be interesting to see if AMD can change that perception like they did with their Ryzen CPUs.

Vega cards were 5th-generation GCN architecture... Radeon 7000 series GPUs will be based on the 3rd generation of the RDNA architecture, RDNA3.


Understand what AMD's CEO Dr Lisa Su did. Six years ago, she split AMD's graphics division into Gaming and Compute, giving us the RDNA architecture for gaming and graphics and the CDNA architecture for Compute & Enterprise.

The new architecture required a whole new driver stack and software suite, which AMD secretly worked on until Dr Su made her stunning announcement five years ago, revealing the Radeon 5000 Series and the new RDNA gaming architecture, and how it was 100% scalable, etc.

The RDNA architecture is what the new consoles, the PS5 & XSX, use, and it is the architecture nearly every major AAA game will be designed and optimized for.
 
" a complete mystery as to how the green team is able to shift a sub-par product at such a premium."

Maybe because the green team keeps getting the press.
 