That doesn't even make sense for 3 reasons:
1) HD 5800/6900 cards can make money mining, which means their value should be at least as high, if not higher, since they can make $30-40 every month mining. That pays off a used card rather quickly, especially if electricity is cheap where you are.
Nope. Bitcoin rig owners usually aim for the highest hash rate they can get. I know more than a few Bitcoiners who update their rig with every new architecture. If you were aware of the process, you would realize that power usage is a critical factor in realized profit - hence the constant upgrades, as well as...
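To put some numbers on it, here's a minimal profitability sketch - every figure in it (hash rates, revenue per MH/s, board power, electricity prices) is a hypothetical placeholder, not a measurement:

```python
# Rough GPU mining profitability sketch (all numbers are hypothetical).
# Point: power draw, not just hash rate, determines realized profit.

HOURS_PER_MONTH = 24 * 30

def monthly_profit(hashrate_mhs, revenue_per_mhs, power_watts, usd_per_kwh):
    """Gross monthly mining revenue minus the monthly electricity bill."""
    revenue = hashrate_mhs * revenue_per_mhs
    electricity = (power_watts / 1000.0) * HOURS_PER_MONTH * usd_per_kwh
    return revenue - electricity

# Hypothetical cards: (hash rate in MH/s, board power in watts).
cards = {"older card": (300, 220), "newer card": (350, 160)}

for name, (mhs, watts) in cards.items():
    for price in (0.08, 0.20):  # cheap vs. expensive electricity, $/kWh
        profit = monthly_profit(mhs, 0.12, watts, price)
        print(f"{name}: {mhs} MH/s @ {watts} W, ${price:.2f}/kWh -> ${profit:.2f}/month")
```

Same revenue assumption for both cards, but the more efficient one keeps a much larger share of it once the electricity bill is paid - which is exactly why serious miners keep upgrading.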
Videocards don't wear out, only the moving parts like the fans do.
Absolute garbage. For example, bad design caused HD 5970 failures (VRM placement plus lack of heatsinks); add a bad fan profile and crappy airflow and you can add the HD 4870 and 4890 cards to the equation.
Maybe you should take some time to read the feedback from Bitcoiners.
And while you're at it, maybe check out some thermographs:
(HD 4870s)
(HD 5970)
(GTX 590)
3) Cards with 2GB of VRAM such as the 6950/6970 are far preferable to the 570 simply because the latter already runs out of VRAM in games such as Max Payne 3, GTA IV, Civilization V, Shogun 2, etc.
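As an aside, that VRAM claim is easy to sanity-check with rough arithmetic - the resolution, MSAA level, and buffer counts below are illustrative assumptions, and textures and geometry come on top of this:

```python
# Rough render-target VRAM sketch (all settings are illustrative assumptions).

def render_target_mb(width, height, msaa=4, color_buffers=3, bytes_per_px=4):
    """Approximate MiB used by multisampled color targets plus depth/stencil."""
    pixels = width * height
    color = pixels * bytes_per_px * msaa * color_buffers
    depth = pixels * 4 * msaa  # 32-bit depth/stencil, also multisampled
    return (color + depth) / (1024 ** 2)

for w, h in [(1920, 1080), (2560, 1600)]:
    print(f"{w}x{h} with 4xMSAA: ~{render_target_mb(w, h):.0f} MiB in render targets alone")
```

Pile high-resolution textures on top and a 1.25GB card can run short well before a 2GB one does.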
Nvidia cards hold their value in the resale market for precisely the same reason that Nvidia cards generally hold a price premium in the retail market. Have you never noticed that Nvidia cards are generally more expensive than AMD/ATi cards at the same performance point? Of course you have - you just made the same observation (post #16). The argument you're making is immaterial to why AMD cards are cheaper second-hand than Nvidia cards; now you're just trying to justify that they shouldn't be.
Your argument doesn't hold water since cards such as the 5970 and 6990 have higher resale value than dual-GPU cards from NV, specifically because of bitcoin mining
More rubbish.
HD 5970.......
HD 6990....
GTX 590.... and maybe we should include the GTX 690, since a few seem to be popping up in the resale market
The most logical explanation is that there are just more 6900 series cards available for sale in the used market
Might be a contributing factor, but GTX 560 Ti/570/580 sales are in the same ballpark (see verified-owner Newegg reviews and the Steam hardware survey, for example - month-to-month usage is probably a better indicator than overall percentage). And of course, the biggest roadblock in your argument is that the absolute numbers don't tally with what you're saying. Here's Mercury Research's discrete desktop (as opposed to IGP, APU and workstation/HPC) graphics share breakdown for Q2 2011. Note the units sold in both the $200-300 and $300+ brackets.
The story is much the same going into 2012 (Investor Village, among others, carries the info).
What wears them out is overvoltage/electron migration. The transistor is just a switch. You don't wear it out by using it.
I thought you just said:
Videocards don't wear out, only the moving parts like the fans do.
And overvoltage - or, more typically, fluctuating voltage/current in a failing circuit - is a product of failing voltage regulation, and/or a corroded/broken choke, and/or a failing solder joint. None of these are "moving parts" (nor are failing VRAM ICs, another prevalent cause of card failure, for that matter). And while some cards do fail due to fan failure, a greater percentage seem to fail from design flaws, aging components, and, most importantly, the vendor pushing the extreme upper limits of the specification - hence failure rates tend to be higher as the specification/performance price point increases. You seem to be contradicting yourself.
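For reference, the standard model for electromigration lifetime is Black's equation, MTTF = A * J^-n * exp(Ea/kT); the quick sketch below uses illustrative constants (activation energy, current-density exponent, temperatures) rather than data for any particular GPU:

```python
# Black's equation for electromigration: MTTF = A * J**-n * exp(Ea / (k*T)).
# Ea and n below are illustrative textbook-style values, not GPU-specific data.
import math

K_EV = 8.617e-5  # Boltzmann constant in eV/K

def mttf_ratio(j_scale, temp_c_base, temp_c_new, ea_ev=0.9, n=2.0):
    """MTTF_new / MTTF_base after scaling current density and temperature.

    j_scale is new/baseline current density; the prefactor A cancels out.
    """
    t_base, t_new = temp_c_base + 273.15, temp_c_new + 273.15
    arrhenius = math.exp((ea_ev / K_EV) * (1.0 / t_new - 1.0 / t_base))
    return j_scale ** -n * arrhenius

# Hypothetical overvolted card: 15% more current density, hotter VRM-adjacent die.
print(f"relative lifetime: {mttf_ratio(1.15, 70, 85):.2f}x")  # ~0.21x
```

No fans involved: just more current and more heat taking years off the silicon.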