Nvidia RTX 3080 prototype photographed (with an absurd design)

Buying a new car is never a cost-effective solution, which is why buying a 3+ year-old, well-specced vehicle is nearly always the way to go if you value money. Cars are never an investment; they are a tool. And buying one on credit tends to mean you are spending money you don't have. The last thing someone should do is put a video card on credit.

The Nvidia flagship cards, while nice, are nearly never worth it from a value perspective. They are for people who want the best, nothing more. They make great second-hand cards for other buyers down the road.

I still rock a GTX 1080. Why? Because even at 3440x1440 it still plays every game with great results. Why upgrade when video games have yet to improve visually? Mid-to-high-end cards like the RTX 2080 and RTX 2070 have been the clear choice for higher-end video cards for the last year or more.

So many people still rock GTX 1080 and 1080 Ti cards simply because games just haven't improved visually much over the last few years. I'm more interested in the video cards coming a year or more after the next-gen consoles drop.


You and I clearly aren't working with the same budgets or philosophy. The only people pushing for someone to buy used cars are used car salesmen. I don't even want the smell of the last guy in my cars - nor would I buy a used GPU - although I depend on those who buy used to buy my used GPUs and my used cars.

When you say the Nvidia GPU isn't "worth it"...

I consider being at the top of performance charts to be worth it.

I consider the peace of mind of having the best to be worth it.


When Techspot posts a performance test and I already know I'm in the 99th percentile, I consider that to be "worth it".

When I fire up DCS or one of these high-res texture packs and can run it in maximum detail without ever worrying...I consider that "worth it".

Gotta pay to play.
 
Don't know why the design should be labeled absurd... Because it isn't the same design used for the last 13 years? Looks like a blower-style cooler with some intake reinforcement...
 
Gotta pay to play.

Clearly.

I don't think most gamers care about being at the top of the performance charts, and yearly upgrades carry a high cost for overall little reward.

I buy for long-term value, as I try to keep any extra cash going toward my F.I.R.E. - or in my case, R.E.

Having fun toys is great, but we all have budgets we like to keep. I just bought a 2016 MX-5 with 37k miles because I like to save money when I can. I think the best recommendation for anyone is to buy in your price range; if you need to put it on a credit card, don't buy it. Even an RTX 2080 is more than the average Joe needs. With kids I don't have as much time to dedicate to video games as I once did, but my GTX 1080 still does a great job on every game I've played.

I plan on upgrading in the next year or two, depending on the games that come out. It will be AMD or Nvidia - it just depends on which is the bigger bang for my buck. Zen 3 will be calling my name when it comes out, as my old i7 is really starting to show its age running my VMs.
 
That's not a particularly good example to validate this point, given that they don't have the same TDP or the same cooler. The latter also has base/boost clocks of 1605/1905 MHz while the former's are 1156/1471 MHz; and the Vega uses HBM, which is packaged directly around the GPU, whereas the Navi doesn't. They're not manufactured by the same company, and the nodes aren't revisions of each other.

One could use the same website's data on the Vega 64 and Radeon VII as a counter to the argument that 7nm is automatically hotter than a larger node: the VII has the same TDP, but has higher clocks (core and memory), more transistors, and twice the HBM surrounding the GPU. It does, of course, have a much larger cooler.

The actual picture is far more complex than just a case of 'xxx nm is smaller than yyy nm so the temp will be higher' - current leakage levels, number of metal layers, substrate composition, logic density, cache amount and distribution within the die all play significant roles, and that's before one starts to factor in operating voltages, clock speeds, and so on. It's not a given that Ampere graphics cards will have higher temperatures than Turing, by any means.
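The Vega 64 vs Radeon VII comparison can be made a bit more concrete with a rough power-density calculation - same ballpark TDP, but a much smaller die on 7 nm means more heat per square millimetre. This is just a back-of-the-envelope sketch; the TDP and die-area figures are approximate public specs, not numbers from this thread:

```python
# Rough power-density comparison: similar TDP on a smaller die
# concentrates heat, even though total power barely changes.
# Figures are approximate public specs (TDP in watts, die area in mm^2).
cards = {
    "Vega 64 (14 nm)":   {"tdp": 295, "die_mm2": 495},
    "Radeon VII (7 nm)": {"tdp": 300, "die_mm2": 331},
}

for name, spec in cards.items():
    density = spec["tdp"] / spec["die_mm2"]  # watts per mm^2
    print(f"{name}: {density:.2f} W/mm^2")
```

Even so, as noted above, power density alone doesn't decide the temperature you read off a sensor - cooler size, leakage, and clock/voltage tuning all shift the result.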


Those are all great points. I'd add that the node variant a particular die is built on also matters to how efficient that die is.
 
I tend to look at computer equipment as an investment in exactly the same way I look at buying cars. If you're going to spend $700 or $800 on something, why not just go ahead and spend the extra $300 or $400 to buy something even better?

It's not as if I am asking you to double your money and spend $2,000 on an RTX Titan.

The future-proofing you get from buying it "right" the first time means that if the next generation isn't terribly compelling, you can hold out a generation.

It's like buying a car: if you get all of the major options - moonroof, navigation, heated and cooled seats, safety tech, etc. - then you have something you can hold onto longer if you decide the next replacement model isn't necessarily worth going into more debt for.

As for the cost: you have to pay to play!!!
WHAT?!? "If you're going to spend $700, then you may as well spend another $400 and buy even better"?!?! It's OK for you rich kids to spend willy-nilly, but suggesting an extra ~60% is flippant in the extreme. Not everyone has money to splash around like you seem to have. I for one - and I'm not exactly on the breadline - would have to take out a two-year loan to spend $700/$800 on a GPU, and 'just' another $400 would be totally and completely out of the question. You should think before making such remarks.
 


Tell you what: I'm not gonna argue.

I'm just gonna buy it.

End of conversation.
 
The drawing posted is wrong. The fans' blades are attached to an outer ring, and one fan has reversed blades.

It looks like the first fan's heatsink and pipes cool the GPU, while the second fan and its surrounding heatsink and fins cool the memory and VRM.

It's a suck-blow design: it pulls fresh air in from the ends of the card, internally across the GPU cooler (think OEM Intel CPU cooler), and exhausts the air right out near the GPU and out the back of the case.
Not wrong at all - the blades ARE attached to the outer ring to keep them stabilized and stop fluttering, and the reversed blades are also for resonance reduction, since the two fans rotate in opposite directions.
 