Nvidia GeForce GTX 980 final specifications reportedly detailed

This is amazing, I just can't wait to get my GTX 970 model. With less power usage, I don't need to change my PSU. :)
 
I'm sold... 780 Ti performance with 4 gigs at a $500 price point, with more wiggle room for OC, vs. a 780 Ti with 3 gigs at a $650 price point (reduced because of the 980 release, lol). More cost effective. How many people have the ability to buy top tier? There's a much larger market for mid tier, so many more people get to feel top-tier performance. They haven't even released the top tier yet, but wait for AMD's counter.

Meanwhile I get my brand new 980 for $500, give my son my 770, and use his 750 Ti in a new HTPC, vs. buying a top-tier $1000 card where the kid gets no hand-me-down upgrade (and that isn't fiscally realistic for the vast majority) and no other manufacturer gets my $ for an HTPC build. How about all the bitcoin miners? Anyone think of this, or just have tunnel vision?

How about a similar comparison with the new Pentium G3258 unlocked for $70: my kid's build has this, and therefore a higher quality case, cooler, fans, and OC mobo; we clocked it to 4.5 GHz and he sees 90 smooth fps in the most demanding scenarios of our favorite game for sub-$800 = more $ for quality, price-performance products for everyone in the house. This is clearly about fiscal responsibility, I admire it, and therefore Nvidia gets my money!
 
Sure, I get the lure of performance per watt, and I think it's cool and neat; however, some of you are suckers. Like the guy above who wrote a book: all that math, and you come up with $100 after 5 years and are concerned? That's having your priorities out of whack. How about you get a job and somehow make an additional $100? Don't sweat it, you have 5 years to get there! And, by the way, the AMD rig heats my house in the summer and that's bad? What about winter? Also, I see that since it uses so much less power, it'll OC very well, like that 750 card they put out a while back. That was an OC beast; actually the worst part of that card is that it won't SLI, but I digress. In the end it all comes back to the fact that if you are the sort of guy who is really concerned with his power bill, you shouldn't be buying expensive cards to run in expensive computers to fool around with games. Your power bill is no good, but you can't stop gaming. Maybe there should be a 12-step program.
Oh yeah, the EU did it. I didn't know Nvidia even sold computers. Sweet!
 
Yeah, right. What was the selling point of AMD GPUs again? They're cheap.
OK, their GPUs might have performance rivalling Nvidia's; however, they don't outperform them. So the only actual reason you would consider an AMD GPU is either because you believe in AMD, in which case reason doesn't actually play any role in the decision and we can stop here, or, the more logical reason, because AMD GPUs simply cost less. It might be cool to have a $400 GPU boasting the same level of performance as a $600-700 GPU, but you aren't considering that most of the money goes not into buying your rig but into using it.
Just an example: a rig with an AMD FX-9370 (the highest AMD CPU I know of, 220 W) + an R9 295X2 (500 W).
Leaving out other parts, this currently costs about $1220 and consumes 720 W. I don't know what energy might cost where you live, but here in Germany we're paying €0.30/kWh, so let's use $0.35. Using your rig every day for, let's say, 6 hours, you use 0.720 kW × 6 h = 4.32 kWh. So you're paying about $1.51 each day, which makes about $550 a year. So after 3 years, you've paid about $430 more in power bills than you paid for the rig itself.
Now the same with an Intel i7-5960X and two Nvidia 980s in SLI. (The actual performance of both the CPU and the GPUs should be a bit, if not much, higher than the AMD rig's, but let's assume they perform the same.)
$1000 for the CPU (140 W) and $1200 for the GPUs ($600 each; 350 W total, 175 W each). Same calculation as before:
490 W = 0.49 kW × 6 h = 2.94 kWh per day. 2.94 kWh × $0.35/kWh ≈ $1.03/day. That makes about $375/year. So after about 6 years, you've paid the rig's price again in power bills.
Let's compare: $1220 plus $550/year vs. $2200 plus $375/year. After 5 years, you've paid about $3970 for the AMD rig and $4075 for the Intel/Nvidia rig.
That still leaves a difference of about $100; however, as the AMD rig uses more power, it also produces more heat, which results in higher cooling costs. So I guess after 5 years the AMD and Intel/Nvidia rigs have cost you about the same.
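For anyone who wants to check the math, here's a minimal sketch of this cost-of-ownership arithmetic in Python (the prices, wattages, 6 h/day usage, and $0.35/kWh rate are just the assumptions from this post, not measured figures):

```python
# Total cost of ownership sketch: hardware price plus electricity,
# using the assumptions from the post (6 h/day at full load, $0.35/kWh).
ENERGY_PRICE = 0.35   # $/kWh, rough conversion of EUR 0.30/kWh
HOURS_PER_DAY = 6

def yearly_energy_cost(watts):
    """Electricity cost per year for a rig drawing `watts` at load."""
    kwh_per_day = watts / 1000 * HOURS_PER_DAY
    return kwh_per_day * ENERGY_PRICE * 365

def total_cost(price, watts, years):
    """Purchase price plus electricity over `years`."""
    return price + yearly_energy_cost(watts) * years

# AMD FX-9370 (220 W) + R9 295X2 (500 W), ~$1220
amd = total_cost(1220, 220 + 500, years=5)
# Intel i7-5960X (140 W, $1000) + 2x GTX 980 in SLI (2x 175 W, $1200)
intel = total_cost(2200, 140 + 2 * 175, years=5)

print(f"AMD rig after 5 years:   ${amd:,.0f}")    # ~ $3,980
print(f"Intel/Nvidia rig:        ${intel:,.0f}")  # ~ $4,080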
However, if you say money isn't an issue when using a high-end rig, I seriously don't know why you're actually using an AMD product.

And by the way, looking at this, you may now understand why AMD's color is red, while Nvidia's color is green.
No, that is not true; there are more reasons to choose Nvidia or AMD than just "it's cheaper to use AMD". AMD GPUs at the top end have generally lasted better and done better at higher resolutions, thanks to the extra VRAM compared to Nvidia, especially recently. On top of that, in multi-GPU setups AMD has scaled better in games, especially at higher-than-normal resolutions, not to mention Eyefinity has better support than Nvidia Surround. AMD has also continuously claimed the title of most powerful dual-GPU card recently, with the exception of the 7990 vs. the 690: while the 7990 performed better at stock, the 690 seemed to do better overall, especially when factoring in overclocking.

So do not say "the only reason to buy AMD is because they're cheap", because that is not true. Each side has its pros and cons, but the fact is there are major reasons to buy from either side, and it all depends on what you want.
 
He also forgot that:
a.) said PC with said settings won't automatically max out power consumption; if it is only used for lighter tasks, it will obviously use less.
b.) said TDP does not mean average consumption: "AMD's PowerTune and Nvidia's GPU Boost technologies introduce significant changes to loading [...]" (Tom's Hardware, R9 295X2 review, page 12). Both technologies aim to improve power consumption, resulting in often noticeable deviations when measured over time.

That being said, Maxwell still seems more efficient, but the difference is not as big as implied, especially when the improvements of GCN 1.2 (Tonga) are also taken into consideration.
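To illustrate point b.), here's a toy sketch: averaging the actual draw over a session comes out well below the TDP, because boost and power-management tech move the load around. The sampled wattages below are invented for the example, not measurements:

```python
# Toy illustration: TDP is a limit, not an average.
# The per-minute wattage samples are made up for this example.
samples_w = [410, 365, 390, 320, 405, 280, 375, 340]  # hypothetical draw
tdp_w = 500  # board power limit (the R9 295X2 figure used above)

average_w = sum(samples_w) / len(samples_w)
print(f"average draw: {average_w:.0f} W vs TDP {tdp_w} W")
# -> average draw: 361 W vs TDP 500 W
```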
 
Are there people who really care about their electricity bills? If so, then they won't be looking at high-end cards anyway, as those are too expensive.

As for benchmarkers, I agree that there must be some sort of pointless criterion for them to compare against; in this case it seems to be power draw. (IMHO, anyway.)

Go go AMD, and don't be bothered by those who compare their electricity bills, as true enthusiasts don't care about performance per watt. Having read this, I look forward to the AMD 3xxx series more than ever.

Good day, electricians.

Are you really so daft?
Aside from the obvious overclocking headroom, let's say that these GPUs use 100 watts less than your average high-end GPU, and let's say Nvidia sells 10,000 of these. That's a difference of 1,000,000 watt-hours (1 MWh) per hour at max load, averaged out over every user. Let's say the average user games (with the card at max load) for roughly 3.5 hours per day; that's 3,500,000 watt-hours saved in a day, or 1,277,500,000 watt-hours (about 1,278 MWh) saved over the period of one year.

So, does overclocking headroom and less demand for energy sound horrible to you, or do you just fail to realize that small differences (in this case quite a large one) stack up?
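If you want to sanity-check those numbers, here's the same aggregate arithmetic as a short Python sketch (the 100 W delta, 10,000 units sold, and 3.5 h/day are the assumptions stated above, not real sales data):

```python
# Aggregate energy savings sketch, using the post's assumptions.
WATTS_SAVED_PER_CARD = 100   # assumed delta vs an average high-end GPU
CARDS_SOLD = 10_000          # assumed number of units
GAMING_HOURS_PER_DAY = 3.5   # assumed full-load hours per user per day

wh_per_hour = WATTS_SAVED_PER_CARD * CARDS_SOLD      # 1,000,000 Wh/h
wh_per_day = wh_per_hour * GAMING_HOURS_PER_DAY      # 3,500,000 Wh/day
wh_per_year = wh_per_day * 365                       # 1,277,500,000 Wh/year

print(f"{wh_per_year / 1e6:,.1f} MWh saved per year")
# -> 1,277.5 MWh saved per year
```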
 
"But do take into consideration I read somewhere that the EU will be implementing strict power usage on computers in the future maybe that is why nvidia also made this move."

Lol, that's what we call technological improvement: more efficient than before.


"There are 28 member-states in the European Union, and the EU economy constitutes 20-23% of global GDP"

20-23% of global GDP... I don't see any connection here. Oh, did you mean those numbers also reflect that the EU constitutes 20-23% of Nvidia's global sales too? :D
 
The lower power draw of the Nvidia 900-series cards will appeal to indoor growers who strive to maximize efficiency while minimizing heat.
 