Going From 20 to 40: A Graphics Card Upgrade Tale

Yes, it would have been useful, but this was my money, not a review-loan situation. If I actually had a 3080, I definitely wouldn't be upgrading it to a 4070 Ti.
Ideally nobody should be upgrading every gen. It's a huge waste of cash. Most shouldn't even upgrade every two gens. You've got use cases that benefited, but I'd bet many consumers who just play games don't. Only now is Pascal starting to struggle with newer games (that are unoptimized garbage), and likely optimized games will become too heavy in a few years, by which time any RTX 5000 series card will be a massive upgrade.
 
Ideally nobody should be upgrading every gen. It's a huge waste of cash. Most shouldn't even upgrade every two gens. You've got use cases that benefited, but I'd bet many consumers who just play games don't. Only now is Pascal starting to struggle with newer games (that are unoptimized garbage), and likely optimized games will become too heavy in a few years, by which time any RTX 5000 series card will be a massive upgrade.
On the other hand, not upgrading for 3-4 generations makes your experience much less desirable, to the point where you could completely miss out on some features.
I feel like people who used to upgrade every gen will slowly move to an every-third-gen upgrade due to high costs.
 


I'm a Pascal and/or Turing user looking to upgrade and will not pay this "new price". I didn't pay Ampere prices and I'm not going to pay Ada prices either. I'm waiting for the video card price bubble to pop.
Worst-case scenario for me is to not buy at all, or to buy a used Ampere card.

Thing is, waiting for a used card could be problematic as well. A great example: it took me purchasing three 3060 Ti cards to actually get one that wasn't a dud. The mining market will still have an impact for years to come. Sure, it has been lessened to a point, but it will still be prevalent. You have those who lie, saying they are selling off a card that was *mildly* used for production and gaming, even though it becomes obvious it was mined, especially once you remove the heatsink and see the destroyed thermal compound. Then there are also the counterfeits being *refurbed* to appear as new, etc.

The first 3060 Ti I bought used lasted me less than 24 hours. I took off the heatsink and the compound was cooked. The second card lasted a couple of weeks before dying; that one was somehow worse for the compound, yet lasted me a little longer. It wasn't until the third used one that I hit a winner: a Dell OEM card pulled from a system six weeks prior to my purchase, and the seller had all of the information to prove it. If it wasn't a purchase for my son, with him providing half of the funds, I wouldn't have put up with the headache, especially after that second card gave up, and would have just sucked it up and bought new.
 
Thanks for the article. A decent performance upgrade over the years.

Speaking of years, I tend to compare 'computer years' to dog years instead of human or automobile years, since the tech becomes obsolete so fast.
 
Well, it certainly shows that unless you're going 'all in' and getting a 4090, upgrading from any 30xx-class card is a total waste of money.

Only those with a 20xx-class card, or older, should even be contemplating paying these new greed-driven prices that Nvidia has shoved down everyone's throats.
My 2080 Ti is still doing what I need. I had thought of upgrading, but for the games I'm playing it's fine.
 
I was hoping to get a new AMD graphics card, but the prices just killed them for me. The 4070 Ti is a good compromise on price-to-performance compared to all the other cards currently on the market. I just couldn't allow myself to spend too much more than the card it replaces, the 1080 Ti.
 
Not my problem they jacked up the price of the x70 series so much, then. Sorry, I'm the one who decides what is a good price or not, not them.
Really? So you have insider info on what their manufacturing costs are, and have adjusted for global inflation to reach your conclusion?

What's "good" to you and reality don't always match up. I paid $50 more for my 12GB 3080 than I did for a 1080Ti in 2017. Double the performance. No complaints.

BTW: The GTX 980Ti was $649 in 2015
 
I am so glad to see a review that understands AI and rendering are part of the equation. I went from a 1660 Super to a 3060 12GB due to budget (read: Wife) and am very happy with the improvements in Maya, Premiere, Photoshop and Daz which are my main uses. I also tinker in AI/ML assisted robotics on the side. I would love to see more articles like this so I can start planning my next upgrade.

My only question is: Do you find the 16GB main RAM limiting? I had 32GB and upped it to 64GB and want to add 64GB more in the two open slots. 32GB was a noticeable bottleneck for me and I am hoping 128GB will give me at least a bit of a bump over the 64GB I upped to when I did the GPU upgrade.

Hey, no matter what, HAVE FUN and keep creating. Good for the brain and for the heart! ;)
 
I am so glad to see a review that understands AI and rendering are part of the equation. I went from a 1660 Super to a 3060 12GB due to budget (read: Wife) and am very happy with the improvements in Maya, Premiere, Photoshop and Daz which are my main uses. I also tinker in AI/ML assisted robotics on the side. I would love to see more articles like this so I can start planning my next upgrade.
I wouldn't have done the upgrade if I only used my PC purely for gaming (nor would I have 4K monitors, for that matter). But Blender alone made it worth every penny.

My only question is: Do you find the 16GB main RAM limiting? I had 32GB and upped it to 64GB and want to add 64GB more in the two open slots. 32GB was a noticeable bottleneck for me and I am hoping 128GB will give me at least a bit of a bump over the 64GB I upped to when I did the GPU upgrade.
It is a problem, especially when experimenting with rendering engines, but for whatever reason, my system's motherboard just doesn't like all four DIMM slots filled. At some point, I may consider replacing the 2x8GB setup with 2x16GB, but that'll have to wait for a while.
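
If it helps with planning the RAM upgrade, one rough way to see whether system memory really is the bottleneck is to log usage while a heavy render runs. A minimal Python sketch using the psutil package (the sampling interval and decimal-GB units are just arbitrary choices here, not anything specific to Maya or Blender):

```python
# Minimal RAM-usage logger: run alongside a render to see how close the
# system gets to exhausting physical memory. Requires: pip install psutil
import time
import psutil

LOG_INTERVAL_S = 5   # arbitrary sampling interval
peak_used_gb = 0.0

try:
    while True:
        mem = psutil.virtual_memory()
        used_gb = (mem.total - mem.available) / 1e9
        peak_used_gb = max(peak_used_gb, used_gb)
        print(f"used: {used_gb:5.1f} GB ({mem.percent:4.1f}% of "
              f"{mem.total / 1e9:.0f} GB), peak so far: {peak_used_gb:5.1f} GB")
        time.sleep(LOG_INTERVAL_S)
except KeyboardInterrupt:
    print(f"Peak RAM used during the run: {peak_used_gb:.1f} GB")
```

If the peak stays comfortably below the installed capacity, extra DIMMs won't change much; if it pins near 100% (or the OS starts paging), the upgrade is easier to justify.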
 
Really? So you have insider info on what their manufacturing costs are, and have adjusted for global inflation to reach your conclusion?

What's "good" to you and reality don't always match up. I paid $50 more for my 12GB 3080 than I did for a 1080Ti in 2017. Double the performance. No complaints.

BTW: The GTX 980Ti was $649 in 2015
Sorry, inflation's not double and I'm not going to simp for a company posting higher-than-ever margins.
Sorry for your loss.
 
Sorry, inflation's not double and I'm not going to simp for a company posting higher-than-ever margins.
Sorry for your loss.
Your math is off: paying $50 more than I did in 2017 is not "double". What is double is the performance. Well worth it for me. Sheesh - we're not talking about a $50K Harley here.

I don't care what Nvidia's profit was on the sale. I wanted the performance, and I got it for a good price, comparatively speaking. Hardly a "loss" to gain 100% more performance from my GPU for a minimal premium, nearly six years later.

"Value" is subjective. If one feels that the price is worth the benefits, then it is a good value to them. I now have a sweet 12GB OC'd 3080 which will serve me well for the next 4 years at 3440x1440p. At $800 (then subtract the $300 I got for my old 1080Ti on eBay), it will cost me 34 pennies a day, before electricity.

Yeah, I can swing that. :joy:
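
For anyone who wants to sanity-check the 34-pennies-a-day figure, a quick back-of-the-envelope sketch; the four-year service life and the eBay resale value are just the assumptions quoted above:

```python
# Rough cost-per-day for the GPU swap described above (all figures from the post).
purchase_price_usd = 800       # 12GB RTX 3080
resale_of_old_card_usd = 300   # 1080 Ti sold on eBay
service_life_years = 4         # assumed usable lifespan at 3440x1440

net_cost = purchase_price_usd - resale_of_old_card_usd
cost_per_day = net_cost / (service_life_years * 365.25)
print(f"Net cost: ${net_cost}, roughly ${cost_per_day:.2f} per day")  # ~$0.34/day
```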
 
Neeyik... did you ever consider an RTX 6000?
The $6799 Ada RTX 6000? Well, I considered the fact that it's not available in the UK and it's eight times more expensive than what I wanted to spend, but other than those aspects, no, I didn't consider it at all.
 
Always fun to look back, but why is 4K used here? None of these cards are really for 4K gaming (without DLSS/FSR), IMO. 1440p would make more sense. Even Nvidia said the 4070 Ti is aimed at very good performance at 1440p, not 4K.

It will work "alright" for 4K in some games, sure..

Nvidia is really gimping the bus width this generation. The 4060 is going to be miserable if a 128-bit bus is used... :/

Maybe we will see a 4060 Ti with a 192-bit bus after the 4070 has launched.
 
Always fun to look back, but why is 4K used here? None of these cards are really for 4K gaming (without DLSS/FSR), IMO. 1440p would make more sense. Even Nvidia said the 4070 Ti is aimed at very good performance at 1440p, not 4K.
Because I don't like how 1440p looks on either of my 4K monitors.

It will work "alright" for 4K in some games, sure..
In the vast majority of the games I play, 4K is perfectly fine with the 4070 Ti. Where it's not, the problem is easily resolved by applying upscaling.
 