Nvidia GeForce RTX 4070 Review: $600 Mid-Range is Here

Not cheap, but then what GPU is these days? With the next step up costing $200 more, and the immediate step below (presumably the 4060 Ti) potentially coming with only 8GB of memory, this is about the best value you are going to get in Nvidia's lineup.
Steve from Hardware Unboxed said it best: 8GB of VRAM is the new entry level. Nvidia has been stagnating at 11-12GB in this tier for almost 5 years now (2080 Ti, 3080 Ti), closer to 7 if you count the 11GB 1080 Ti, and now the 4070/4070 Ti. If I had to guesstimate, the minimum VRAM for future-proofing is probably around the 16GB threshold that AMD, the supplier of console hardware, has offered in its GPUs since last generation (at the $579 price point).
The problem going forward for these 12GB cards and below is the potential of dropping to potato graphics sooner rather than later. Yes, you can play at 240fps with RT on, but running out of VRAM on 12GB cards is probably just around the corner, and we have not even started with the Unreal Engine 5 games yet. That said, I do believe 12GB makes more sense on the $599 4070 than it does on the 4070 Ti at its price point.
 
It's a bit of a shame that Nvidia didn't stick with the die usage scheme it followed with Ampere, namely using the top-tier chip for the 4090 and 4080. The 4070 Ti is literally a 75% version of the AD103 -- a 4080 using a 75% version of the AD102 would have 35% more SMs, 29% more ROPs, 13% more L2 cache, and could have been furnished with a 320-bit memory bus for 20 GB of VRAM.

The AD103 actually used in the 4080 could then have been a 4070 Ti, with the full AD104 being the 4070 (i.e. the 4070 Ti would just be a 4070), and it would have been a seriously good card with 16 GB of VRAM. This, of course, would have eaten into the profit margins for the AD102 and AD103, but hey ho -- no going back now.
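If anyone wants to sanity-check those percentages, here's a quick back-of-the-envelope sketch. The full-die figures below (AD102: 144 SMs / 192 ROPs / 96 MB L2; AD103: 80 SMs / 112 ROPs / 64 MB L2) are the commonly reported specs rather than anything official, so treat them as assumptions:

```python
# Hypothetical "4080" cut down to 75% of a full AD102, compared against a full AD103.
# Die figures are commonly reported specs (assumed), not official Nvidia numbers.
ad102 = {"SMs": 144, "ROPs": 192, "L2 (MB)": 96}
ad103 = {"SMs": 80, "ROPs": 112, "L2 (MB)": 64}   # the real 4080 enables 76 of the 80 SMs

hypothetical_4080 = {k: 0.75 * v for k, v in ad102.items()}

for key in ad102:
    gain = hypothetical_4080[key] / ad103[key] - 1
    print(f"{key}: {hypothetical_4080[key]:.0f} vs {ad103[key]} -> {gain:+.1%}")

# Prints roughly: SMs +35.0%, ROPs +28.6%, L2 +12.5% -- the ~35/29/13% quoted above.
```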
 
"In an ideal world, the GeForce RTX 4070 would be a $500 product"

In an ideal world, it would be free, wouldn't it?

Are we considering inflation here? I used an inflation calculator to compare the prices of the last five generations of xx70 cards, anchoring off the $329 970 from 2014:

970 $329
1070 $373
2070 $470
3070 $456
4070 $470

Since the 2070 five years ago, MSRPs for this series have been rather stable after accounting for inflation.
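For anyone who wants to reproduce that table, here's a rough sketch of the maths. The launch MSRPs and the cumulative CPI factors (2014 to each launch year) are approximate assumptions on my part, so a proper inflation calculator will land within a few dollars of these numbers rather than exactly on them:

```python
# Express each xx70 launch MSRP in 2014 dollars (the year of the $329 GTX 970).
# MSRPs and cumulative CPI factors below are approximations, not official figures.
cards = {            # name: (launch MSRP in USD, launch year)
    "970":  (329, 2014),
    "1070": (379, 2016),
    "2070": (499, 2018),
    "3070": (499, 2020),
    "4070": (599, 2023),
}

cpi_since_2014 = {   # assumed cumulative US inflation from 2014 to each year
    2014: 1.00, 2016: 1.02, 2018: 1.07, 2020: 1.10, 2023: 1.28,
}

for name, (msrp, year) in cards.items():
    deflated = msrp / cpi_since_2014[year]
    print(f"{name}: ${msrp} ({year}) ~= ${deflated:.0f} in 2014 dollars")
```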
 
I'm really curious about next year's new cards....or the cards at the end of this year I guess.
Will be very interesting to see how pricing and performance compare.
 
It's a bit of a shame that Nvidia didn't stick with the die usage scheme it followed with Ampere, namely using the top-tier chip for the 4090 and 4080.
I don't know, I think the 4080 is actually quite a nice leap over the 3080, over 50%. It also returns the 80 series to a 103-class die, considering the 3080 on GA102 was a fluke. The problem is just the price: the savings from the smaller die definitely weren't passed on. I imagine that AD102 will be used for a 4080 Ti in the future, which will be interesting from a pricing perspective. Will the 4080 have dropped by then, or will the 4080 Ti be $1,400 with more VRAM and 15% more performance?

Nvidia actually did a decent job tiering their cards this time around from a performance perspective. The 30 series was a bit out of whack: less than 50% of the price for the 3080 but 90% of the performance of the 3090. This time the 4070 Ti is 20% faster than the 4070, the 4080 is about 25% faster than the 4070 Ti, and the 4090 is about 25% faster than the 4080, so you are getting nice performance leaps that match what you would expect for the given tier (a rough compounding of those gaps is sketched below). The only thing really out of whack is the pricing, especially the 4080. The 3080 made the 3080 Ti and 3090 look like horrendous value (which they were, but it only made it more obvious). If the 4080 was $1,000 it would make a lot more sense, even if it was still too expensive.
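To put those tier gaps in one picture, here's a tiny sketch that just compounds the approximate percentages quoted above (rough review-average figures, not exact benchmarks):

```python
# Compound the rough tier gaps quoted above, using the RTX 4070 as the baseline.
tier_gap = {"4070 Ti": 1.20, "4080": 1.25, "4090": 1.25}   # approximate relative steps

relative_perf = 1.0
for card, step in tier_gap.items():
    relative_perf *= step
    print(f"{card}: ~{relative_perf:.2f}x the 4070")

# -> 4070 Ti ~1.20x, 4080 ~1.50x, 4090 ~1.88x the 4070
```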
 
I imagine that AD102 will be used for a 4080 Ti in the future, which will be interesting from a pricing perspective.
More likely to just be a full die AD103, clocked quite highly (those on the 4080 are quite conservative), as the lower binned AD102 chips are going to be kept for the professional sector.
 
ASRock just lowered its 7900 XT slightly below $799.99, to $779.99; with the 5% Micro Center discount it comes down to about $741, FYI.

https://www.microcenter.com/product...d-triple-fan-20gb-gddr6-pcie-40-graphics-card

Hopefully this is a foreshadowing of more price cuts.
Also, the 7900 XTX by ASRock fell to $969.99 ($921.50 after the 5% off).
 
The only thing this card really has going for it is the power consumption. Everything else about it, for the price, is just pure sh*t. It's not faster than the 3080 and it even loses to it at 4K (in before the folks who scream, "but it's not meant for 4K!").

I'll just do what I can do and vote with my wallet - this generation of GPUs (from Nvidia and AMD) is not impressive enough to warrant spending my money on their crap. Maybe next gen will be a proper improvement.
Personally, I have a bigger problem with the whole market.

In this case, reviewers, for some "reason", are ignoring what proprietary tech like DLSS is doing to the market, the industry, and to us, the customers.

For example, remember PhysX?
One bad case: the Arkham games will only display all the eye candy if you are using an Nvidia GPU; everyone else can pound sand, and that is horrible on what is supposed to be an open platform.
Imagine the miracle happens and Nvidia dies, and no other company can buy or access their tech: those games will be forever locked to their cr@p.

DLSS is actually worse; just look at what they are doing:
Nvidia: Do you want DLSS 3.0? You need to buy a new 40-series GPU.
I bet you crazy money that they will pull the same sh!t with the 50 series.

And guess what, Tim (expected, the man simply worships Nvidia) and now Steven will be super happy shoving DLSS 4.xxx down our throats without a warning or a negative word about the precedent being set.

Same for RT.
I have tried over and over to see how an over 50% drop in performance is worth the eye candy on display.

Some games (and I mean very, very few) do look that much better, enough to accept such a penalty; yet, again, it seems that Steven can't get enough of the RT stuff, never saying it's not worth the hit.

Anyways, it is what it is and we are at their mercy.
 
Same for RT.
I have tried over and over to see how an over 50% drop in performance is worth the eye candy on display.

I've tried RT in one game, Metro Exodus, and I really couldn't care less about it. I had to enable DLSS to get decent frame rates, and when looking around, some spots were visibly blurred/smeared. I didn't care for the added ambience given the performance hit it came with, and the downscale/upscale of DLSS didn't help my opinion of things. That was the one and only time I tried RT and DLSS. Yeah, no thank you.

I've said it before and I'll say it again:
I don't know who I feel worse for: AMD, for being so far behind Nvidia in RT performance, or Nvidia, for having dedicated RT cores and still sucking at it.

RT is still in the gimmicky stage and if none of the three dGPU manufacturers can improve upon it, it's just going to continue to be the bastardized red-headed step-child in the room - sure he's there, but most people are going to ignore him or verbally abuse him every chance they get and right now that is exactly what RT is.
 
On the one hand this card sounds like a similar price and performance to the 3080 and 6800 from last generation.

On the other, its place within the overall industry context feels very different to me. The prior gen cards were premium cards, offering near the top of available performance for their time, and aligned with the start of a new console generation. (Even as a PC gamer I care about console lifecycles because I believe a lot of game development is strongly influenced by console capabilities, even for games that ship on PC.) If you wanted a premium card nearly three years ago, those cards seemed like they fit the bill at an appropriate price.

The 4070 does not fit in the same place today. You are still paying premium class dollars, but you are getting nowhere near the top available performance, and you are buying in at what may be closer to the mid-point of this console generation.

Some people will need a card today and if they need this level of performance, that's where they are. For people who have more choice this feels like a pass and wait for future offers situation.
 
I don't know who I feel worse for: AMD, for being so far behind Nvidia in RT performance, or Nvidia, for having dedicated RT cores and still sucking at it.

RT is still in the gimmicky stage...

That's exactly it for me.

When I finally got an RT-capable GPU, I tried it in CP2077 and Control, the 2 premier games for the tech. In CP2077 I couldn't tell a difference that was worth more than maybe a 20% frame loss. In Control I was not in a spot with lots of shiny glass panes, so I saw zero improvement in image quality. In both, the frames looked a little different, but not necessarily better either way. I want something notably better for the frame loss, like what good AO or better textures give you.

I was pretty disappointed, to say the least. If I had a 3080, I would probably run CP2077 with regular rasterization, though I'd definitely give RT+DLSS a go for a while. But in Control I'd just go raster (I'm late in the game and those shiny surfaces seem less common there). As it is, I use a 6800 XT at 1440p and the RT frame-rate hit is bigger there, so that makes for an easy choice.

Raster is still king. RT is the future, but I'm playing games now.
 
No clue why DLSS is considered a desirable feature anymore. Any game that has DLSS integrated also has FSR; pretty sure there's no game that has only one or the other, and FSR can look better than DLSS in motion. I understand DLSS 3 exists, but AMD will obviously have their own implementation coming to market soon, which will make the argument between the two null and void. Nvidia really screwed up with their VRAM offerings on these cards, and their prices relative to the amount of VRAM they offer:
4090 - 24 GB
4080 - Should have 20 GB
4070/Ti - Should have 16 GB
4060 - Should be 12 GB
4050 - Nobody cares.
 
Techspot says "The Radeon 6800 XT can be had for around $570 and it offers more VRAM, but you'll have to weigh that up against the lack of DLSS support and inferior ray tracing performance - we'd probably still go with the RTX 4070."

16GB can save you from a game stuttering, versus RT and DLSS (a better FSR, lol).
Come on. In a few years people will say: this works fine on the 6800 XT but not on the 4070, just because of VRAM.
 
More likely to just be a full die AD103, clocked quite highly (those on the 4080 are quite conservative), as the lower binned AD102 chips are going to be kept for the professional sector.
Looks like it would be possible to get an FP32 uplift of about 17% (56 TFLOPS vs 48 TFLOPS) just from the 512 additional shaders and a 10% (2.75 GHz vs 2.5 GHz) increase in boost clock speeds, so yeah, I guess that could be a real possibility.
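A minimal sketch of that maths, assuming the usual 2 FLOPs per CUDA core per clock; the 9728-core / ~2.5 GHz figures for the 4080 are the commonly cited specs, and the 2.75 GHz boost for a full AD103 is the speculation above, not a confirmed product:

```python
# FP32 throughput = CUDA cores x 2 FLOPs per core per clock x boost clock (GHz) -> TFLOPS.
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000.0

rtx_4080   = fp32_tflops(9728, 2.505)   # ~48.7 TFLOPS (current 4080)
full_ad103 = fp32_tflops(10240, 2.75)   # ~56.3 TFLOPS (9728 + 512 cores, speculative clock)

print(f"Uplift: {full_ad103 / rtx_4080 - 1:.0%}")   # ~16%, in line with the ~17% above
```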
 
Do you look at a review, see 47 FPS at 4K ultra, and think this card is useless at 4K? Know what that number really means.

I read TV reviews of the latest OLED and QD-OLED TVs and see people raving that Samsung has more colour volume and so on, as if that were the pinnacle. I'll probably upgrade when that or MLA hits 83 inches; I'm still happy with my plasma, even though it's now supposed to be rubbish (no 4K, no HDR, etc.), and it's still excellent for classic SDR movies.

Anyway, my conclusion is: know yourself and know your usage going forward. It's very easy for the well-moneyed, triple-A, 4K crowd: get the 4090, or possibly the top AMD card.

Is 4K important to you? RT, super-high FPS, price, add-on encoding features, and so on? Consider your overall system too; at 4K it might be quite constrained. Then there's the type of games you want to play: "Oh, I must have headroom for those GPU-heavy AAA games," yet you rarely buy or play them. Do you really need high FPS for your type of games?
Add in power draw, card size, noise, ease of use, and warranty. It would be crazy to get a 4090, with all its hassle, for only 5% real need, when a GPU aimed at 1440p will give you 95% satisfaction in that situation.

Most AAA games need to be playable on a PS5 or Xbox Series X anyway.

And maybe you are smarter than the average bear and can get the same FPS at 4K as the ordinary bear gets with a 4080 at ultra settings, for little real loss in quality.

So know who you are and what you will use the card for, and learn to tweak settings to keep up, somewhat, with the plug-and-play big spenders.
 
I just made my move and got a 3070 at $300. Used, of course, but with warranty seals intact and 6 more months of warranty coverage. So I gave up 20% of the performance for 50% of the price.
Sure, but you're still stuck with 8GB of VRAM. That's going to cripple the hell out of that card.
Nice. Mine was roughly $420, but at the end of 2022. Good card in these crazy times of inflated prices.
But not a good card in these crazy times of inflated VRAM requirements.
Every new launch makes me appreciate my 6950 XT more.
I feel the same way about my RX 6800 XT.
And since I got the Asus TUF OC edition that comes with 2x 8-pin PCIe power, no need to change the PSU or get funky adapters.
Well at least you got that.
TBH, what I don't like is how, even when some AMD cards have good value, it all boils down to Nvidia having DLSS and AMD not, and we're supposed to pretend that FSR doesn't exist or is so inferior that it is not even worth mentioning.
I agree, but it really doesn't matter because it'll be years before this card needs to use DLSS. With my RX 6800 XT, I didn't buy it for FSR, because I know that by the time I might need FSR, it's going to be incredible. It's why I always say "Buy hardware for the HARDWARE, because the software can always be created, improved or modified after the fact but the hardware cannot." A card dies with the same amount of VRAM that it's born with, but FSR can go through several iterations.
Thank you for the review, Steve. Great work, useful advice too. And indeed, the 4070 is the best price/performance of the new 40-series cards Nvidia offers nowadays, even counting what we've seen from AMD; I mean, they still haven't released any new mid-range video card.
I know, eh? I was saying the same thing.
I made a similar move too.
I was prepared to wait for AMD's and Nvidia's new-gen video cards to settle down.
2 days ago I found a great offer from one of my suppliers, a new ASUS TUF RX 7900 XTX OC for "only" 840 euros, so I took it instantly :)
Good call! I wish I had your supplier! :laughing:
You guys are funny... it is a 3080 with 12GB of VRAM, 2 years later, at a $600 MSRP...

This is not a 90/100 GPU, it is a joke. So much for saying that less than 16GB of VRAM is now an issue, just to give this card 90/100.

The fact that you can buy a 6950XT for about the same price, is making this GPU DOA.
Didn't you know? They're generally not allowed to give GeForce cards scores less than 80. If you look over the past two generations, this has been true (with ONE exception, the RTX 3060 8GB).
You have a valid point; it's just that in today's video card market, the 4070 is perhaps one of the best offers we can buy. In some markets it is already hard to find the 6950 XT, and the top new-gen video cards from both Nvidia and AMD are still overpriced in my opinion.
The TechSpot score is quite overhyped, though that is just my personal opinion.

BTW, if it's OK with the TechSpot staff, I'd like to propose that everybody who is willing give their personal score for the card, so we can see, like a poll, what the forum users' average score would be. And no drama; let's keep it civilized.

I'd give the 4070 85 points.
Let the forum poll begin.
For less than 16GB of VRAM at $600USD, no more than 70. This card is already outclassed by the RX 6950 XT and has 25% less VRAM than the RX 6800. It's not impressive at all.
Are you kidding me? Besides the 4090, this whole lineup from Nvidia has been unimpressive.

If you put the MSRPs of the 4000 series in perspective, it is a rather abysmal showing from Nvidia.

Just add the whole not-enough-VRAM problem on most of Nvidia's SKUs and you have all the ingredients to skip every SKU besides the 4090. The 4080 would be interesting, but it is $300-400 overpriced.
I'm thinking that there must be some kind of short-circuit in Markham because there haven't even been any leaks about the upcoming Radeons, let alone any announcements. It doesn't matter though because there are enough people out there who will buy nVidia no matter what. Remember that people who buy nVidia don't care about value, no matter how much they whine about it, because they keep doing the same thing over and over.
Man, I remember when that $600 price range used to land you a top-end tier card. Now it's only able to land you a mid-range card... $650 eight years ago (the cost of a 980 Ti) is the same as about $822 today.

Maybe I'm really starting to realize that this might not be a hobby I continue in if pricing continues to climb and climb like it has been.
Well, in the last-gen, $650 DID get you a top-end tier card, the RX 6800 XT. Radeons make this hobby A LOT easier to continue because they make you immune to nVidia's pricing.
I'll just do what I can do and vote with my wallet - this generation of GPUs (from Nvidia and AMD) is not impressive enough to warrant spending my money on their crap. Maybe next gen will be a proper improvement.
I honestly don't care what next-gen is like. I intend to keep my RX 6800 XT for several generations.
It's a bit of a shame that Nvidia didn't stick with the die usage scheme it followed with Ampere, namely using the top-tier chip for the 4090 and 4080.
It's not a shame, it's how nVidia works.
"In an ideal world, the GeForce RTX 4070 would be a $500 product"

In an ideal world, it would be free, wouldn't it?

Are we considering inflation here? I used an inflation calculator to compare the prices of the last five generations of xx70 cards, anchoring off the $329 970 from 2014:

970 $329
1070 $373
2070 $470
3070 $456
4070 $470

Since the 2070 five years ago, MSRPs for this series have been rather stable after accounting for inflation.
Oh jeez, not THIS crap again! If "inflation" had ANYTHING to do with it, we'd see CPU prices skyrocketing as well because they're both made of silicon. Except, oh yeah, that hasn't happened.

An examination of AMD's Ryzen pricing from 2017-2022 shows that there's really no excuse:

2017:
Ryzen 5 1600X - $249
Ryzen 7 1700X - $399
Ryzen 7 1800X - $499

2018:
Ryzen 5 2600X - $229
Ryzen 7 2700X - $329

2019:
Ryzen 5 3600X - $249
Ryzen 7 3700X - $329
Ryzen 7 3800X - $399

2020:
Ryzen 5 5600X - $299
Ryzen 7 5800X - $449

2022:
Ryzen 5 7600X - $299
Ryzen 7 5700X - $299
Ryzen 7 7700X - $399

So, between 2017 and 2022, with some slight ups and downs, the Ryzen 5 has increased by $50 and the Ryzen 7 x7xx has remained exactly the same. Meanwhile, between 2017 and 2020, the price of the Ryzen 7 x8xx decreased by $50.

Inflation my posterior! It's just greed. Anyone who thinks otherwise is incredibly naive.
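Just to put numbers on that comparison, here's a quick sketch of the tier pricing above next to cumulative US inflation over the same window. The ~21% inflation factor for 2017-2022 is my own approximate assumption; the MSRPs are the launch prices listed above:

```python
# Compare Ryzen tier launch pricing (2017 vs 2022) against cumulative inflation.
tiers = {
    "Ryzen 5 (x600X)": (249, 299),   # 1600X (2017) -> 7600X (2022)
    "Ryzen 7 (x700X)": (399, 399),   # 1700X (2017) -> 7700X (2022)
}
inflation_2017_to_2022 = 1.21        # assumed cumulative US CPI factor

for tier, (msrp_2017, msrp_2022) in tiers.items():
    change = msrp_2022 / msrp_2017 - 1
    print(f"{tier}: {change:+.0%} actual vs "
          f"{inflation_2017_to_2022 - 1:+.0%} from inflation alone")
```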
Anyways, it is what it is and we are at their mercy.
Your post is perfectly sound and I agree with all but this part of it. Only people who buy GeForce cards are at nVidia's mercy. I chose my path long ago and couldn't be happier about it because I'm NOT at their mercy. No amount of gimmickry will ever get me to buy a GeForce card. It's the only way to really be happy because it's the only way to really not care. ;)
Thanks for the review! I think I'll just get a used 3080 10GB for $400ish and save myself $200ish.
Good luck with that 10GB of VRAM. Have you still not learned that buying a card with less than 12GB is just begging for trouble? Save yourself even more and get an RX 6800 XT. At least that card will last a good long time because of its 16GB. There's a reason that reviewers everywhere are saying to get something with at least 12GB if not 16. It's the same reason that I said the RTX 3080 was a terrible buy from day one.

I don't know who I feel worse for: AMD, for being so far behind Nvidia in RT performance, or Nvidia, for having dedicated RT cores and still sucking at it.
I don't feel bad for AMD because RT sucks no matter which card you use.
RT is still in the gimmicky stage and if none of the three dGPU manufacturers can improve upon it, it's just going to continue to be the bastardized red-headed step-child in the room - sure he's there, but most people are going to ignore him or verbally abuse him every chance they get and right now that is exactly what RT is.
I think that RT will never be anything more than a gimmick because we're already seeing its successor, path tracing, being showcased. RT will just be another footnote that nVidia used to soak people. I never needed it before and I sure as hell don't need it now.
The 4070 and 4070 Ti will age as badly as the 3080 10 GB. Within 2 years at the latest, we will see them starve for VRAM, especially if you enable ray tracing and frame generation.
Yeah, but people don't buy GeForce cards because they care about value. To them, it's like owning a BMW. It's overpriced as hell but it makes them feel oh so special! :laughing:
On the one hand this card sounds like a similar price and performance to the 3080 and 6800 from last generation.
That's actually a huge problem because the value is supposed to increase exponentially from generation to generation, not stay the same.
On the other, its place within the overall industry context feels very different to me. The prior gen cards were premium cards, offering near the top of available performance for their time, and aligned with the start of a new console generation. (Even as a PC gamer I care about console lifecycles because I believe a lot of game development is strongly influenced by console capabilities, even for games that ship on PC.) If you wanted a premium card nearly three years ago, those cards seemed like they fit the bill at an appropriate price.
I agree with you, and they were. Then the Ethereum hit the fan, and nVidia realised the truth: there are a crap-tonne of people who will pay ANYTHING for a video card in a green box, and that's why we are where we are.
The 4070 does not fit in the same place today. You are still paying premium class dollars, but you are getting nowhere near the top available performance, and you are buying in at what may be closer to the mid-point of this console generation.
It's because nVidia is trying to enact a paradigm shift to higher prices for their benefit, not ours. AMD isn't helping but AMD got shafted by consumers for so long that I really can't blame them.
Some people will need a card today and if they need this level of performance, that's where they are. For people who have more choice this feels like a pass and wait for future offers situation.
Make no mistake Brucey, nobody NEEDS this level of performance. Most games that I've seen look glorious and play amazingly well at 1080p. We're just a bunch of spoiled brats with our 1440p and 2160p gaming dreams. That's why I decided to use my RX 6800 XT until I literally couldn't anymore before buying anything new. Either the GPU will be too weak, the VRAM will be insufficient or the card will die before I buy a new one (and I have an RX 5700 XT as a backup card).
 
The 4070 does not fit in the same place today. You are still paying premium class dollars, but you are getting nowhere near the top available performance, and you are buying in at what may be closer to the mid-point of this console generation.

The PS4 enjoyed a 7-year run; the PS5 is only at 2.5 years. Yes, there will probably be a Pro version at some point, but hardly anyone actually buys those. So you're probably looking at 4 to 5 years before a new console launches, and we'll probably be on RTX 60 / RX 9000 series GPUs by then. (That is, if the world still exists and the economy hasn't completely collapsed; I'm not convinced either is a given.) I think it's still pretty safe to buy a new GPU and get plenty of use out of it before the next console generation. Honestly, I think this console generation might even be longer than the last, because we're 2.5 years in and there really have not been many games that justify the purchase of either.
 
Nice review, Steve. I'm impressed with the power efficiency of the 4070. Hopefully NVidia will continue that with the 4060. No chance I'm buying in at $600.
 
I'm confused.
"We're talking Radeon 6800 XT / GeForce RTX 3080 levels of performance for $600 - not exactly mind-blowing stuff after about two and a half years - but it's a $50 to $100 discount when comparing launch MSRP pricing and it's not like the crypto bubble didn't happen"

How is that a 90/100 product rating?
The best thing about the 4070 is the energy efficiency. If you took 3 of the 13 games out of the list, you would notice that on average it barely matches the 3080 10GB. This product should be matching and beating the 3080 Ti at a minimum, in all games and scenarios, to be awarded a 90/100.

Sorry, but at $600 this is still a stinker, and any gamer considering buying it is part of the problem. Buy it at $500, sure, but at $600 Nvidia is laughing all the way to the bank. Do not buy this product.
 
Anyway, my conclusion is: know yourself and know your usage going forward. So know who you are and what you will use the card for, and learn to tweak settings to keep up, somewhat, with the plug-and-play big spenders.

A shorter way of saying all of that is: if you don't get suckered into marketing, pick the GPU in your price range with the most VRAM and the best raster performance (i.e. price/performance).

4K means nothing unless you have 16GB or more of VRAM.
 