I understand what the author is trying to say, but it’s pretty pointless to discuss what someone should have bought two years ago. That has no bearing on their purchase decision “today”.
I actually appreciate that he's explicitly talking about the useful-life portion of the value equation. Too many reviews leave this out. It's an important concept to understand. Without it, you might think that a new card today, at roughly similar price and performance to a last-gen model, is an equivalent deal, or that any given card purchased near the end of its generation is worth the same as it was at the beginning of its generation. But it's not at all: you're paying the same money for two fewer years of remaining useful life. If you convert the purchase price into an estimated equivalent monthly subscription price, you'll see it can make for a very substantial price difference.
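To put rough numbers on that idea, here's a minimal back-of-the-envelope sketch. The purchase price and useful-life figures below are entirely hypothetical, chosen only to illustrate how the effective monthly cost changes when the same money buys fewer remaining months of useful life:

```python
# Rough "equivalent monthly subscription" math for a GPU purchase.
# All numbers are hypothetical, purely to illustrate the idea: the same
# sticker price buys far fewer months of remaining useful life when the
# card is bought late in its generation.

PRICE = 500              # hypothetical purchase price, in dollars
USEFUL_LIFE_YEARS = 6    # assumed useful life measured from the card's launch

def monthly_cost(price, years_since_launch):
    """Spread the purchase price over the card's remaining months of useful life."""
    remaining_months = (USEFUL_LIFE_YEARS - years_since_launch) * 12
    return price / remaining_months

print(f"Bought at launch:       ${monthly_cost(PRICE, 0):.2f}/month")  # ~$6.94
print(f"Bought two years later: ${monthly_cost(PRICE, 2):.2f}/month")  # ~$10.42
```

The exact numbers obviously depend on how long you expect drivers and game requirements to keep a card viable, but the gap between the two scenarios is the point.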
I'm with a 3060 Ti and I'm fine with it. I think manufacturers will try to make most cards unusable by encouraging new games to demand more VRAM, so most 10- to 40-series cards will be very restricted even though the chip itself is still capable. I bet the 50-series will only have 8 GB on the 5050 variants and will come with 12-16 GB for all other mainstream versions, with a minimum of 16 GB for the 60-class cards. That way it's guaranteed that most 10- to 40-series owners will feel the need to upgrade, even if the chip is not that good.

Maybe for a handful of games going the "Crysis" marketing route, but if there's any upside to the abominations that are microtransactions, loot boxes, etc., increasingly even in paid games, it's that publishers have a big incentive to make sure their titles run well, so that gamers will keep playing on the hardware they actually have.
Regarding "useful life", Nvidia still supports Maxwell, an architecture that's about 9 years old now, but it's likely on the chopping block in the next couple years. I think what saved it from Kepler's fate is that it and Pascal have a pretty similar architecture. Ampere and Ada Lovelace are also fairly close other than DLSS3 and improved RT perf in the latter.I actually appreciate that he's explicitly talking about the useful life portion of the value equation. Too many reviews leave this out. It's an important concept to understand. Without it, you might think that a new card today at roughly similar price and performance to a last gen model is an equivalent deal. Or that that any model card purchased near the end of its generation is worth the same as it was at the beginning of its generation. But it's not at all, you're paying the same money for two less years of remaining useful life. If you convert the purchase price to an estimated equivalent monthly subscription price you'll see it can make for a very substantial price difference.
It may not be a good time to upgrade a GPU, but it's a really good time for anyone still rocking Gen 1/2 AM4 to step up their CPU, as well as fill up those DDR4 and M.2 slots. PCIe 4.0 SSDs now carry less of a price premium over their predecessors.
It also seems that this article is written on the premise of someone upgrading a 12-month-old PC. Most people, however, are coming from much older machines than that, so 2021/22-gen GPUs are a reasonable upgrade path from Polaris/Vega or Pascal etc. "Upgrade" in PC terms has generally referred to a minor component swap to extend service life, as opposed to a system replacement.
Yes, I agree with you there, with regards to this statement in the article: "In essence, if you were planning to upgrade from a GPU that's now a generation or two old for roughly the same price, you would have been better off doing so 1-2 years ago, or instead continue to wait for another year or two."

That was pretty much smack in the late part of the crypto boom, and GPUs were at 150-200% of RRP, if not more (especially Nvidia). I know because I *had* to buy an RTX 3060 12GB in June 2022 to actually get some work done with graphics editing and video rendering, and was able to claim it as a work expense. By then the Ethereum hype had died off and GPU prices were beginning to drop, but they were still around 20-30% above RRP. RTX 30-series cards didn't really approach RRP until Q4 2022. So yeah, it's not really fair to tell people they should've done something back then.
Really it's just GPU upgrades that are pointless at the moment. I'm waiting to see if there are any Black Friday deals to go from my 1700 to a 5800X3D.

There are new mid-tier X3D CPUs out now; you might want to check how well they perform compared to the 5800X3D, unless you don't want to upgrade your motherboard and RAM.
Simply put: we're at the point where increasing GPU performance by adding more shaders or ramping up clocks is over. We're about to hit "peak computing" due to the increasing manufacturing costs of moving to smaller process nodes.
Got a 3070 in early 2021 and hoping the 5070 is at least a 2x jump with at least 16GB VRAM; otherwise I may wait for a discounted 4080 or 7900 XTX. But I wanna keep TDP below 250 watts as much as possible.

It's definitely possible. I have my 4090 using 250 watts in Vermintide 2 at max settings, 4K, with vsync and frames capped at 120 Hz on my CX. The benefit is threefold: less energy, less heat output, and you still get the same latency benefit as if my frames had doubled. Maybe even fourfold in the summer, when you don't have to run the AC just to keep playing.
I feel the pain in MFS2020 at 3840x1600 (no surprise there), but all other games? Nope, it runs everything fine.

My 1070 Ti ran everything I play just fine at 4K. Granted, all I really play is ESO, EvE and Age of Empires II. I'd probably still have it if it hadn't died, but I'm happy with my 6700 XT.