Nvidia GeForce GTX 980 final specifications reportedly detailed

Scorpus


In the lead-up to the imminent launch of the Nvidia GeForce GTX 900 series, final specifications for the two top-end cards, the GTX 980 and GTX 970, have been leaked courtesy of VideoCardz.com.

The GTX 980 flagship will reportedly come with 2,048 CUDA cores across 16 Maxwell streaming multiprocessor (SMM) units, up from the previously rumored 1,920. It'll also feature 4 GB of GDDR5 memory on a 256-bit bus, providing 224 GB/s of bandwidth, while the card's TDP is listed at around 175 W.
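As a sanity check on the leaked figures, GDDR5 peak bandwidth follows directly from bus width and effective data rate. The sketch below assumes a 7 Gbps effective memory clock, which is not stated in the leak but is what a 256-bit bus needs to hit the quoted 224 GB/s:

```python
# Sanity-check the leaked GTX 980 memory bandwidth figure.
# The 7 Gbps effective GDDR5 data rate is an assumption, not from the leak.

def gddr5_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8) * effective data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr5_bandwidth_gbps(256, 7.0))  # -> 224.0, matching the leaked spec
```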

The TDP is especially interesting, considering it's much lower than the previous generation. The Kepler-based GTX 780 with its GK110 GPU had a TDP of 250 W, so if these specifications are accurate, we're looking at a 75 W drop in TDP in switching to a Maxwell architecture on the same 28nm fabrication node.

In comparison, AMD's Radeon R9 290X has a TDP of 290 W, so if the GTX 980 really does outperform it, it'll be even more impressive when you consider the TDP is over 100 W lower.

The GTX 970 will come with 13 SMMs, equating to 1,664 CUDA cores. It has the same memory specifications as the GTX 980, and has a TDP of just 148 W, down 82 W from the 230 W GTX 770. Again, it's looking like Nvidia will have a significant performance-to-power advantage over AMD with their Maxwell architecture.

It won't be too long until we know for sure how the GTX 900 series performs, and what the final specifications for the cards really are. Nvidia is expected to launch their new series of GPUs at the end of this week.


 
Are there people that really care about their electricity bills? If so, then they won't be looking at high-end cards anyway, as those are too expensive.

As for benchmarkers, I agree that there must be some sort of pointless criteria for them to compare against; in this case it seems to be power draw. (IMHO anyway)

Go go AMD, and don't get bothered by those who compare their electricity bills, as true enthusiasts don't care about performance per watt. Having read this, I look forward to the AMD 3xxx series more than ever.

Good day, electricians
 
Are there people that really care about their electricity bills? If so, then they won't be looking at high-end cards anyway, as those are too expensive.

As for benchmarkers, I agree that there must be some sort of pointless criteria for them to compare against; in this case it seems to be power draw. (IMHO anyway)

Go go AMD, and don't get bothered by those who compare their electricity bills, as true enthusiasts don't care about performance per watt. Having read this, I look forward to the AMD 3xxx series more than ever.

Good day, electricians

You are not the brightest bulb in the lamp. Lower power consumption means more headroom to overclock, meaning more performance. A card that draws less power can be fed more power, making it faster. It also produces less heat, making it last longer.
 
You are not the brightest bulb in the lamp. Lower power consumption means more headroom to overclock, meaning more performance. A card that draws less power can be fed more power, making it faster. It also produces less heat, making it last longer.

Surely, lol.
I won't even start.
 
Your comment shows just fine, but you've got some anger management issues as far as I can tell...

For the last time, more power draw equals more heat. More heat equals more noise. Some people tolerate a lot of noise, some don't, YMMV. But we can all see AMD struggling to manage the heat on the R9 290 and up. So none of your personal feuds matter here, because people will always LOVE technologies that bring a better performance/watt ratio.
 
Are there people that really care about their electricity bills? If so, then they won't be looking at high-end cards anyway, as those are too expensive.

As for benchmarkers, I agree that there must be some sort of pointless criteria for them to compare against; in this case it seems to be power draw. (IMHO anyway)

Go go AMD, and don't get bothered by those who compare their electricity bills, as true enthusiasts don't care about performance per watt. Having read this, I look forward to the AMD 3xxx series more than ever.

Looks like another AMD fanboy.
It's all about a better performance/watt ratio... having the same or more performance with less power being used is what it's all about... this applies to most things.

Good luck with your 100 W light bulbs around your house. I will stick with my 4 W SMD LEDs, which give the same amount of light. Yes, maybe not as warm, but at 96 W less usage I can deal with a less warm light.
 
You are not the brightest bulb in the lamp. Lower power consumption means more headroom to overclock, meaning more performance. A card that draws less power can be fed more power, making it faster. It also produces less heat, making it last longer.
Not exactly; it still comes down to the card and its design (like its power delivery system, for instance) before you can just start pumping more electricity into it. Just because a card uses less energy than its predecessor does not mean you can pump in up to the same amount for performance boosts.

Either way, the GTX 980 sounds pretty cool with that hefty boost clock. I am shocked they are going that high (plus, with the way Boost works, it just means the clocks could go higher). I am anxious to see the actual performance in games, as that will determine how great this is going to be and should then shake up the market.
 
Surely, lol.
I won't even start.
Thank you! I'm sure it would have been more BS.

The others are right; the electric bill is not the only concern. Although I do see your point with a high-end card: if anyone is going to buy one, they more than likely will not care about the electric bill. However, as others have mentioned, heat and noise are concerns. The electric bill just tops off the cons, after reviewing the other cons. You are trying to review the icing on the cake without reviewing the cake.
 
Damn, that 970. With less power consumption, one might be able to use a 500 W instead of a 600 W power supply. Or fit in a dedicated PhysX card like the 750.

Intriguing. Might exchange my Gainward 770 Phantom 4 GB for a Gainward 970 Phantom with, hmm... 8 GB? Pair it with a 750 and see how it goes xD Of course not now, but in the foreseeable future.
 
You are not the brightest bulb in the lamp. Lower power consumption means more headroom to overclock, meaning more performance. A card that draws less power can be fed more power, making it faster. It also produces less heat, making it last longer.
Not exactly; it still comes down to the card and its design (like its power delivery system, for instance) before you can just start pumping more electricity into it. Just because a card uses less energy than its predecessor does not mean you can pump in up to the same amount for performance boosts.

Either way, the GTX 980 sounds pretty cool with that hefty boost clock. I am shocked they are going that high (plus, with the way Boost works, it just means the clocks could go higher). I am anxious to see the actual performance in games, as that will determine how great this is going to be and should then shake up the market.
True, but it's safe to assume it will be the case in most circumstances. Or if we're going by the laws of physics alone, which you, me, and nearly everyone here know off the top of our heads, it's the expected trend. Doesn't hurt to be optimistic when it's a realistic wish :)

I've heard a few rumors that the 900s will be utilizing cheaper/shoddier VRMs, which would definitely lead to a lack of stability when pumping more electricity in. Hopefully, that isn't the case.
 
TDP doesn't mean much. Actual power draw is usually very different from the TDP. We'll have to wait for official numbers showing how much power it uses at idle and under load.
 
Do all games support 4K? Or would the charts be limited to those games that do?
Pretty much all games coming out now do, or at least they do with tweaks. The main problem, as I still see it, is field of view (lack of control and improperly applied methods).

If you want to give yourself a rundown of recent games and their standings, WSGF is a good start.
 
Anyway, fanboys or not, the point is how far performance per watt goes.
The Titan 2 is coming and it will be a beast; let's pray for 4,096 CUDA cores, 8 GB of RAM and a 1,024-bit bus :D

Funny thing about less power: you guys miss the DX12 concept; it will use less power than DX11.

What I want from Nvidia is a Tegra K1 PCB that can plug into a PCI Express slot, with its own memory and a custom PC interface, to run Android in conjunction with current GPUs.
 
Do all games support 4K? Or would the charts be limited to those games that do?
Well, most would at least scale to 4K, but some have trouble with proper support for it. One of the few games I can think of that brags about 4K is Ryse: Son of Rome; the PC edition is a native 4K game (or at least designed with 4K in mind).
 
Also, I forgot to mention: if Nvidia were to make a card that uses as much power as previous-gen cards, the performance increase would be 2 to 3 times that of previous generations.

But do take into consideration that I read somewhere the EU will be implementing strict power-usage limits on computers in the future; maybe that is also why Nvidia made this move.

Long story short, you miss the point of the performance-per-watt increase from Nvidia: they show that they can create a beast of a card without having to make one.

And don't forget, with Pascal coming in the years ahead with vertically stacked memory, we will be looking at 16 GB and 24 GB GPUs.
 
I'll wait for the official reveal on Thursday to decide if I'll upgrade from my 780 Ti. If I'm impressed, I'll buy a 980 and sell my 780 Ti on eBay, but if I'm not impressed, my eye is on the R9 295X2, which is only $999.99 right now. If I buy the R9 card, I'll keep my 780 Ti and just have a hybrid Nvidia-AMD supercomputer!
 
You are not the brightest bulb in the lamp. Lower power consumption means more headroom to overclock, meaning more performance. A card that draws less power can be fed more power, making it faster. It also produces less heat, making it last longer.

It also means being able to get a somewhat lower wattage PSU, which frees up a little cash for other things.
 
But do take into consideration I read somewhere that the EU will be implementing strict power usage on computers in the future maybe that is why nvidia also made this move.
I fail to see how that is relevant. I doubt Nvidia will limit their cards worldwide based on one country's regulations. If the EU does this, then it can restrict sales to only the 970, or worse, the 960. Not to mention the question of policing system builders that configure SLI/CrossFire. The regulation will probably only affect pre-built systems. Still, AMD would be the one hurting the most if this comes to pass.
 
Are there people that really care about their electricity bills? If so, then they won't be looking at high-end cards anyway, as those are too expensive.

As for benchmarkers, I agree that there must be some sort of pointless criteria for them to compare against; in this case it seems to be power draw. (IMHO anyway)

Go go AMD, and don't get bothered by those who compare their electricity bills, as true enthusiasts don't care about performance per watt. Having read this, I look forward to the AMD 3xxx series more than ever.

Good day, electricians

Yeah, right. What was the selling point of AMD GPUs again? They're cheap.
OK, their GPUs might have performance rivalling Nvidia's; however, they don't outperform them. So the only actual reason you would consider an AMD GPU is either because you believe in AMD, in which case reason doesn't actually play any role in the decision and we can stop here. The more logical reason is that AMD GPUs simply cost less. It might be cool to have a $400 GPU boasting the same level of performance as a $600-700 GPU, but you don't consider that the most money goes not into buying your rig but into using it.
Just an example: a rig with an AMD FX-9370 (the highest AMD CPU I know of, 220 W) + R9 295X2 (500 W).
Leaving out other parts, this currently costs about $1,220 and consumes 720 W. I don't know what energy might cost where you live, but here in Germany we're paying €0.30/kWh, so let's use $0.35/kWh. Using your rig every day for, let's say, 6 hours, you use 0.72 kW × 6 h = 4.32 kWh per day. So you're paying about $1.51 each day. That makes about $550 a year; after 3 years, your power bills already exceed what you paid for the rig.
Now the same with an Intel i7-5960X and two Nvidia GTX 980s in SLI. (The actual performance of both the CPU and GPUs should be a bit, if not much, higher than the AMD rig, but let's assume they perform the same.)
$1,000 for the CPU (140 W) and $1,200 for the GPUs (350 W total, 175 W and $600 each). Same calculation as before:
490 W × 6 h = 2.94 kWh per day. 2.94 kWh × $0.35/kWh = $1.03/day. That makes about $375/year, so it takes about 6 years of power bills to match what you paid for the rig.
Let's compare: $1,220 plus $550/year vs. $2,200 plus $375/year. After 5 years, you've paid about $3,970 for the AMD rig and $4,075 for the Intel/Nvidia rig.
That still leaves a difference of about $100; however, as the AMD rig uses more power, it also produces more heat, which results in higher cooling costs. So I guess after 5 years the AMD and Intel/Nvidia rigs have cost you the same.
However, if you say money isn't an issue when using a high-end rig, I seriously don't know why you're actually using an AMD product.

And by the way, looking at this, you may now understand why AMD's color is red while Nvidia's is green.
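The cost comparison in the comment above can be sketched as a small script. The hardware prices, wattages, 6 hours/day usage and $0.35/kWh rate are the commenter's own assumed figures, not measured numbers:

```python
# Rough total-cost-of-ownership comparison from the comment above.
# All prices, wattages and the $0.35/kWh rate are the commenter's assumptions.

def running_cost_per_year(watts: float, hours_per_day: float = 6,
                          price_per_kwh: float = 0.35) -> float:
    """Yearly electricity cost in dollars for a given sustained draw."""
    return watts / 1000 * hours_per_day * 365 * price_per_kwh

def total_cost(hardware_price: float, watts: float, years: float) -> float:
    """Hardware price plus cumulative electricity cost over the given period."""
    return hardware_price + running_cost_per_year(watts) * years

amd   = total_cost(1220, 220 + 500, years=5)  # FX-9370 + R9 295X2
intel = total_cost(2200, 140 + 350, years=5)  # i7-5960X + 2x GTX 980 in SLI

print(round(amd), round(intel))  # -> 3979 4078: roughly a $100 gap after 5 years
```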
 
I fail to see how that is relevant. I doubt Nvidia will limit their cards worldwide based on one country's regulations. If the EU does this, then it can restrict sales to only the 970, or worse, the 960. Not to mention the question of policing system builders that configure SLI/CrossFire. The regulation will probably only affect pre-built systems. Still, AMD would be the one hurting the most if this comes to pass.

*facepalm* Did you really just refer to the EU as a single country? There are 28 member-states in the European Union, and the EU economy constitutes 20-23% of global GDP.
 