Nvidia GeForce GTX 980 final specifications reportedly detailed

By Scorpus · 33 replies
Sep 16, 2014
  1. In the lead-up to the imminent launch of the Nvidia GeForce GTX 900 series, final specifications for the two top-end cards - the GTX 980 and GTX 970 - have been leaked thanks to VideoCardz.com.

  2. Are there people that really care about their electricity bills? If so, they won't be looking at high-end cards anyway, as those are too expensive.

    As for benchmarkers, I agree that there must be some sort of pointless criteria for them to compare against; in this case it seems to be power draw. (IMHO anyway)

    Go go AMD, and don't get bothered by those who compare their electricity bills, as true enthusiasts don't care about performance per watt. Having read this I look forward to the AMD 3xxx series more than ever.

    Good day electricians
  3. Seventh Reign

    Seventh Reign TS Booster Posts: 131   +65

    You are not the brightest bulb in the lamp. Lower power consumption means more headroom to overclock, which means more performance: a card that draws less power can be fed more power, making it faster. It also produces less heat, which helps it last longer.
  4. surely lol
    I won't even start.
  5. tomkaten

    tomkaten TS Maniac Posts: 222   +143

    Your comment shows just fine, but you've got some anger management issues as far as I can tell...

    For the last time, more power draw equals more heat. More heat equals more noise. Some people tolerate a lot of noise, some don't, YMMV. But we can all see AMD struggling to manage the heat on the R9 290 and up. So none of your personal feuds matter here, because people will always LOVE technologies that bring a better performance/watt ratio.
  6. Looks like another AMD fanboy.
    It's all about a better performance/watt ratio... having the same or more performance with less power being used is what it's all about. This applies to most things.

    Good luck with your 100 W lightbulbs around your house. I will stick with my 4 W SMD LEDs that give the same amount of light; maybe not as warm, but at 96 W less usage I can deal with a slightly less warm light.
  7. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Not exactly; it still comes down to the card and its design (its power delivery system, for instance) before you can just start pumping more electricity into it. Just because a card uses less energy than its predecessor does not mean you can pump it back up to the same draw for performance boosts.

    Either way, the GTX 980 sounds pretty cool with that hefty boost clock. I am shocked they are going that high (plus, with the way Boost works, it just means the clocks could go higher). I am anxious to see the actual performance in games, as that will determine how great this is going to be and should then change the market up.
  8. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 9,715   +3,696

    Thank you! I'm sure it would have been more BS.

    The others are right, the electric bill is not the only concern. Although I do see your point with a high-end card: if anyone is going to buy one, they more than likely will not care about the electric bill. However, as others have mentioned, heat and noise are concerns. The electric bill just tops off the cons, after reviewing the other cons. You are trying to review the icing on the cake without reviewing the cake.
  9. Nero7

    Nero7 TS Maniac Posts: 273   +104

    Damn, that 970. With less power consumption one might be able to use a 500 W instead of a 600 W power supply, or fit in a dedicated PhysX card like the 750.

    Intriguing. Might exchange my Gainward 770 Phantom 4 GB for a Gainward 970 Phantom with, hmm... 8 GB? Pair it with a 750 and see how it goes xD Of course not now, but in the foreseeable future.
  10. robb213

    robb213 TS Maniac Posts: 330   +100

    True, but it's safe to assume it will be the case in most circumstances. Or if we're going by past trends alone, which you, me, and most people here know off the top of our heads, it's the expected outcome. Doesn't hurt to be optimistic when it's a realistic wish :)

    I've heard a few rumors that the 900s will be utilizing cheaper/shoddier VRMs, which would definitely lead to a lack of stability when pumping more electricity in. Hopefully, that isn't the case.
  11. Experimentongod

    Experimentongod TS Maniac Posts: 269   +111

    Looks like I will get away with keeping my 6+ year old 550 W PSU? Insane. Guess I'll have to wait for it to blow up. Great unit.
  12. amstech

    amstech IT Overlord Posts: 1,936   +1,101

    Please include 4K performance charts! PPPPPPLLLLLLEEEAAASSEEEE.
  13. Puiu

    Puiu TS Evangelist Posts: 2,654   +1,093

    TDP doesn't mean much on its own; actual power draw is usually quite different from the TDP. We'll have to wait for official numbers showing how much power it uses at idle and under load.
  14. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 9,715   +3,696

    Do all games support 4K? Or would the charts be limited to those games that do?
  15. Someone still lives with their parents...
  16. robb213

    robb213 TS Maniac Posts: 330   +100

    Pretty much all games coming out now do, or at least they do with tweaks. The main problem as I still see it is field of view (lack of control and improperly applied methods).

    If you want to give yourself a rundown of recent games and their standings, WSGF is a good start.
  17. Anyway, fanboys or not, that's the point as far as performance per watt goes.
    The Titan 2 is coming and it will be a beast; let's pray for 4096 CUDA cores, 8 GB of RAM and a 1024-bit bus :D

    Funny thing about less power: you guys miss the DX12 concept; it will use less power than DX11.

    What I want from Nvidia is a Tegra K1 board that can plug into a PCI Express slot with its own memory and a custom PC interface, run Android, and run in conjunction with current GPUs.
  18. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Well, most would at least scale to 4K, but some have trouble with proper support for it. One of the few games I can think of that brags about 4K is Ryse: Son of Rome, the PC edition being a native 4K game (or at least designed with 4K in mind).
  19. Also forgot to mention: if Nvidia were to make a card that uses as much power as previous-gen cards, the performance increase would be 2 to 3 times that of previous generations.

    But do take into consideration that I read somewhere the EU will be implementing strict power-usage limits on computers in the future; maybe that is also why Nvidia made this move.

    Long story short, you miss the point of the performance-per-watt increase from Nvidia: they show that they can create a beast of a card without having to make one.

    And don't forget, with Pascal coming in the years ahead with vertically stacked memory, we will be looking at 16 GB and 24 GB GPUs.
  20. I'll wait for the official reveal on Thursday to decide if I'll upgrade from my 780 Ti. If I'm impressed, I'll buy a 980 and sell my 780 Ti on eBay, but if I'm not impressed, my eye is on the R9 295X2, which is only $999.99 right now. If I buy the R9 card, I'll keep my 780 Ti and just have a hybrid Nvidia-AMD supercomputer!
  21. tmach

    tmach TS Rookie

    It also means being able to get a somewhat lower wattage PSU, which frees up a little cash for other things.
  22. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 9,715   +3,696

    I fail to see how that is relevant. I doubt Nvidia will limit their cards worldwide based on one country's regulations. If the EU does this, then they can restrict sales to only the 970 or, worse, the 960. Not to mention the question of policing system builders that configure SLI/CrossFire setups. The regulation will probably only affect pre-built systems. Still, AMD would be the one hurting the most if this comes to pass.
  23. Fallere825

    Fallere825 TS Rookie

    Yeah, right. What was the selling point of AMD GPUs again? They're cheap.
    OK, their GPUs might have performance rivaling Nvidia's; however, they don't outperform them. So the only actual reason you would consider an AMD GPU is either that you believe in AMD, in which case reason doesn't actually play any role in the decision and we can stop here, or, more logically, that AMD GPUs simply cost less. It might be cool to have a $400 GPU boasting the same level of performance as a $600-700 GPU, but you aren't considering that most of the money goes not into buying your rig but into using it.
    Just an example: a rig with an AMD FX-9370 (the highest AMD CPU I know of, 220 W) + R9 295X2 (500 W).
    Leaving out other parts, this currently costs about $1,220 and consumes 720 W. I don't know what energy costs where you live, but here in Germany we pay €0.30/kWh, so let's use $0.35/kWh. Using your rig every day for, let's say, 6 hours, you use 0.72 kW x 6 h = 4.32 kWh per day. So you're paying about $1.51 each day, which makes about $550 a year.
    Now the same with an Intel i7-5960X and two Nvidia 980s in SLI. (The actual performance of both the CPU and GPUs should be somewhat, if not much, higher than the AMD rig, but let's assume they perform the same.)
    $1,000 for the CPU (140 W) and $1,200 for the GPUs ($600 each; 175 W each, 350 W total). Same calculation as before: 490 W -> 0.49 kW x 6 h = 2.94 kWh per day. 2.94 kWh x $0.35/kWh = about $1.03/day, which makes about $375/year. So after about six years, you've paid as much in power bills as you paid for the rig itself.
    Let's compare: $1,220 plus $550/year vs. $2,200 plus $375/year. After 5 years, you've paid about $3,970 for the AMD rig and about $4,075 for the Intel/Nvidia rig.
    That still leaves a difference of about $100; however, as the AMD rig uses more power, it also produces more heat, which results in higher cooling costs. So I guess after 5 years the AMD and Intel/Nvidia rigs have cost you the same.
    However, if you say money isn't an issue when using a high-end rig, I seriously don't know why you're actually using an AMD product.

    And by the way, looking at this, you may now understand why AMD's color is red, while Nvidia's color is green.
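    The running-cost arithmetic above can be double-checked with a short script. It is only a sketch of the commenter's back-of-the-envelope math: the wattages, 6 hours/day of use, and the $0.35/kWh price are the comment's own assumptions, not official figures.

    ```python
    # Rough annual electricity cost for a rig, per the comment's assumptions.
    def annual_cost(watts, hours_per_day, price_per_kwh, days=365):
        # Convert watts to kilowatts, scale by daily hours to get kWh/day,
        # then multiply by price and number of days.
        kwh_per_day = watts / 1000 * hours_per_day
        return kwh_per_day * price_per_kwh * days

    # AMD rig: FX-9370 (220 W) + R9 295X2 (500 W) = 720 W total
    amd = annual_cost(720, 6, 0.35)    # about $552/year
    # Intel/Nvidia rig: i7-5960X (140 W) + 2x GTX 980 (175 W each) = 490 W
    intel = annual_cost(490, 6, 0.35)  # about $376/year
    print(round(amd), round(intel))
    ```

    With those inputs the script gives roughly $552/year for the AMD rig and $376/year for the Intel/Nvidia rig, matching the figures in the comment.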
  24. *facepalm* Did you really just refer to the EU as a single country? There are 28 member-states in the European Union, and the EU economy constitutes 20-23% of global GDP.
  25. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 9,715   +3,696

    Aside from my lack of knowledge of EU internals, you've still strengthened my point.
