Thank you, sir, you made me laugh more than the comedy specials on Netflix. Now I know I'm wasting my breath.
A. How the heck is it the exact same principle when games become MORE GPU demanding over time? Take the most powerful CPU of today and run it with a 560: guess what, it's going to be the same pile of crap no matter what CPU you throw at it, mainly because the GPU is too weak. If you want to see more useless 4K benchmarking of a card aimed at 1080p, because at some point you might want to hook a 4K panel up to it, Google it; no one in their right mind who spends $250 on a GPU thinks, "Hey, I wonder how it does at 4K, since it barely spits out 60 fps at 1080p." Why on God's green Earth do you think 720p tests are used? Newsflash: to eliminate the GPU dependency. What bottleneck are you trying to expose by running a higher resolution, other than the GPU itself? What's your opinion on CPUs performing the same at higher resolutions? How would a GPU test at 4K make any difference for a 1080p card? You are implying that testing two different parts with different roles in the same test is a good idea... it's not.

As for VRAM: you might think that having more VRAM is a good thing if you want to go to a higher resolution, and while that may be true in part, you have to take into account whether the chip can actually push that amount of VRAM. If that weren't a thing, we would all be gaming on 8800 GTs with 12 GB of VRAM... Both of these cards could have 240 GB of HBM and they still wouldn't be able to use it. That's why there's little to no difference between the 580 4GB and the 8GB version at higher resolutions. Does it help? Yes. Does it make the whole experience better? Hell no.
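To make the resolution point concrete, here's a minimal sketch of the reasoning (Python; every number is invented purely for illustration, not a benchmark of any real hardware): a frame can't finish faster than its slowest stage, the CPU's share is roughly resolution-independent, and only the GPU's share grows with pixel count.

```python
# Toy bottleneck model: a frame takes as long as its slowest stage.
# All numbers below are made up for illustration only.
RESOLUTIONS = {"720p": 1280 * 720, "1080p": 1920 * 1080, "4K": 3840 * 2160}

def frame_time_ms(cpu_ms, gpu_ms_per_mpix, resolution):
    """CPU work barely changes with resolution; GPU work scales with pixels."""
    gpu_ms = gpu_ms_per_mpix * RESOLUTIONS[resolution] / 1_000_000
    return max(cpu_ms, gpu_ms)

for res in RESOLUTIONS:
    fast = 1000 / frame_time_ms(cpu_ms=5.0, gpu_ms_per_mpix=4.0, resolution=res)
    slow = 1000 / frame_time_ms(cpu_ms=9.0, gpu_ms_per_mpix=4.0, resolution=res)
    print(f"{res}: fast CPU ~{fast:.0f} fps, slow CPU ~{slow:.0f} fps")
```

With these made-up numbers the two hypothetical CPUs land around 200 vs 111 fps at 720p, nearly tie at 1080p, and both sit at ~30 fps at 4K because the GPU is the only thing left being measured; that's exactly why 720p runs are used for CPU testing and why a 4K run tells you nothing new about a 1080p card.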
B. Please educate me more on how a CPU becomes the bottleneck faster than a GPU does. Here's a hint: those APIs you quoted, and clearly have no clue about at this point, tend to lift the weight off the CPU and shift it onto the GPU. Here's another "empty statement": which part will need upgrading sooner, in, say, a year, for a title to still perform OK? Also, please educate me on how an Ivy Bridge part still doesn't bottleneck the heck out of a 1080 Ti.
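Same deal for the API point; here's a hedged back-of-the-envelope sketch (the per-call costs and core counts are assumptions for illustration, not measured driver numbers): older APIs funnel every draw call through one driver-heavy thread, while DX12/Vulkan-style APIs let the engine record work across cores at a much lower per-call cost, which is the "lifting the weight off the CPU" part.

```python
# Toy CPU-submission model: how long the CPU side of a frame takes as the
# per-draw-call overhead and the number of usable cores change.
# All figures are invented for illustration only.
def cpu_frame_ms(draw_calls, per_call_us, usable_cores):
    return draw_calls * per_call_us / 1000 / usable_cores

old_style = cpu_frame_ms(draw_calls=5000, per_call_us=10, usable_cores=1)  # one driver thread
new_style = cpu_frame_ms(draw_calls=5000, per_call_us=4, usable_cores=4)   # work spread over cores

print(f"old-style API: ~{old_style:.1f} ms of CPU per frame (caps you near {1000/old_style:.0f} fps)")
print(f"new-style API: ~{new_style:.1f} ms of CPU per frame (caps you near {1000/new_style:.0f} fps)")
```

The exact figures don't matter; the point is that the CPU ceiling moves way up once the API stops making one core do all the submission work, so the GPU runs out of steam long before a decent CPU does.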
C. Saving face? Hardly. It was plastered all over the tech sites, no one could miss it. Furthermore, no one brought up the cut-down version; the whole debate started with the naming convention being "hard to get", but then again I'm wasting my breath since I can't convince you otherwise. If you can't follow a conversation, stop having it (this is me being condescending). All sarcasm aside, as long as it's listed somewhere, everything is fair game. All you have to do is read: the specs are on the manufacturer's website and the retailer's website, take your pick. You don't need a degree in computer science to figure out that if a number is higher (CUDA cores), it should perform better.
D. "Adoring nVidia" this would be the time that I would really need to be condescending. Here's the deal I don't care if BOSCH is making my GPU as long as it fits my needs, this wasn't the subject to begin with anyway, stop moving the goal posts. The 3.5 GB was a fiasco and guess what? they got sued and lost
, that was deceptive marketing according to the court, they paid up, this is a whole different scenario it's not even remotely close to what you're implying. As long as your point is thrown out of court it means nothing.
E. Dude, seriously... Does the box say 6GB? Yes, it does. Deal with it; you know what you're buying, it's common sense.
Last but not least, the average non-tech-savvy consumer's judgement is way simpler than you might think: the more expensive the part, the better the performance. That's why people buy i9s and Threadrippers and play Minecraft on them...