Nvidia GeForce GTX 1080 announced: Faster than two GTX 980s in SLI

Scorpus

TechSpot Staff

At a special event today, Nvidia announced their new flagship graphics card: the GeForce GTX 1080. The card uses a new Pascal GPU built on a 16nm FinFET process, as well as GDDR5X memory from Micron.

Nvidia says they spent several billion dollars developing Pascal, and the result is a massive GPU capable of powering today's and tomorrow's games at ultra quality settings. It's also energy efficient thanks to improvements in the way Nvidia delivers power to the GPU, making Pascal the company's most efficient architecture yet.

According to Nvidia, the GTX 1080 is "a whole lot" faster than the Titan X, and faster than two GTX 980s in SLI, while consuming far less power. Nvidia's chart shows the GTX 1080 drawing around 180W, compared to the 165W TDP of the GTX 980, with performance around 30% faster than the Titan X.

An on-stage demo had the GTX 1080's core running at 2.1 GHz on air cooling, at a temperature below 70°C. While these aren't the card's stock clock speeds, it shows just how overclocking-friendly the 1080 will be.

The stock clock speeds for the GTX 1080 are 1607 MHz with a boost of 1733 MHz, across 2,560 CUDA cores. This suggests the card uses a GP104 core rather than the larger GP100 GPU found in the Tesla P100, which ships partially disabled with 3,584 CUDA cores.
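For reference, those figures imply peak single-precision throughput just under 9 TFLOPs, using the standard assumption that each CUDA core retires one fused multiply-add (two FLOPs) per clock. A quick back-of-the-envelope check:

```python
# Peak FP32 throughput estimate: cores * boost clock * 2 FLOPs (one FMA) per clock.
# The 2 FLOPs/core/clock factor is the usual FMA assumption, not an Nvidia quote.
cuda_cores = 2560
boost_clock_hz = 1733e6  # 1733 MHz boost

peak_tflops = cuda_cores * boost_clock_hz * 2 / 1e12
print(f"GTX 1080 peak FP32: {peak_tflops:.2f} TFLOPs")  # -> 8.87 TFLOPs
```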

The memory system features 8 GB of GDDR5X with an effective clock speed of 10 GHz on a 256-bit bus, providing 320 GB/s of bandwidth.
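The quoted bandwidth follows directly from those memory specs: a 10 GHz effective data rate across a 256-bit (32-byte) bus. A quick sanity check, assuming the quoted effective clock:

```python
# Memory bandwidth = effective data rate * bus width in bytes
effective_rate_hz = 10e9  # 10 GHz effective (GDDR5X)
bus_width_bits = 256

bandwidth_gb_s = effective_rate_hz * (bus_width_bits / 8) / 1e9
print(f"GTX 1080 memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # -> 320 GB/s
```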

As the GTX 1080 is a Pascal-based card, gamers will get to enjoy some of Nvidia's new technologies, the best of which is Simultaneous Multi-Projection. This technology has two huge benefits: no more fish-eye lens effect in multi-monitor gaming environments, and faster VR gaming due to improved viewport efficiency.

The GTX 1080 will retail for $599 in its basic edition, and $699 for the "Founders Edition", although it's not completely clear what benefits the Founders Edition brings. The card will be available worldwide from May 27th.

There will also be a GTX 1070 available for $379 ($449 for the Founders Edition), which Nvidia says is faster than a GTX Titan X, packing 6.5 TFLOPs of performance and 8 GB of GDDR5 memory. The GTX 1070 will be available from June 10th.


Tommy Lee

TS Rookie
Oh, you gotta be kidding me!!!!! I gotta pay more attention to new GPU news. I just shelled out $650 for an EVGA 980 Ti. Darnit.
You can always sell it

I have 980s in SLI, ha, what do you think I'm going to do? It's messed up, though, that they're going to bring out Ti's like 5 months after.
 

3volv3d

TS Addict
Oh, you gotta be kidding me!!!!! I gotta pay more attention to new GPU news. I just shelled out $650 for an EVGA 980 Ti. Darnit.
You can always sell it

I have 980s in SLI, ha, what do you think I'm going to do? It's messed up, though, that they're going to bring out Ti's like 5 months after.
So wait 5 months? That way you will be getting the card that has the bugs ironed out, the full vram rather than the card missing a GB at the top end.

No one has patience anymore. I'd definitely consider upgrading my 970 to one of these once I figure out what monitor will go nicely with it.
 

StrikerRocket

TS Enthusiast
I will definitely *wait* until the actual cards come out and the bugs get ironed out! I know better than to jump as soon as a manufacturer whistles! Looks *very* promising, that's granted, but as the saying goes, "wait and see"... Upgrading my GTX 970 to one of these might be a good thing though; I'll "recycle" my "old" 970 into my other PC and put this one in the dual Xeon I just finished building... Darnit, Techspot, you are putting pressure on my wallet!! :p
 

Luay

TS Enthusiast
The fine print at the bottom of the spec page says DP 1.2 certified, DP 1.3/1.4 ready. An optimist would say that DP 1.4 specifications have been released, but no certification is available yet.

The worst case would be like current 4K monitors: rendering textures in 4K, or upscaling HD video to 4K, all without being able to display HDR.

It's not all Nvidia's fault. There aren't any HDR monitors on sale yet, and the HDR TVs available today have slow processing (40~50 ms).
 

deemon

TS Addict
If anything, Intel should really take notes on what a proper upgrade looks like, not some shabby "maybe +5%" ****.

Also, we can all agree that GPU-tier price inflation, with its +$50 for both tiers (GTX *80: $550 => $600 and GTX *70: $330 => $380), is bigger than we'd normally like to see. Yes, they are more powerful and whatnot... but still, they are the same respective tiers, just newer and better versions.
 

VitalyT

Russ-Puss
From nVidia website:
1 - 7680x4320 at 60 Hz RGB 8-bit with dual DisplayPort connectors or 7680x4320 at 60 Hz YUV420 8-bit with one DisplayPort 1.3 connector.
2 - DisplayPort 1.2 Certified, DisplayPort 1.3/1.4 Ready.
3 - Recommendation is made based on PC configured with an Intel Core i7 3.2 GHz processor. Pre-built system may require less power depending on system configuration.
What the hell is "DisplayPort 1.3/1.4 Ready"? Ready for what? DisplayPort 1.4 specification was finalized a while ago. Are we back to the old DP1.2 again?
 

mbrowne5061

TS Evangelist
From nVidia website:
1 - 7680x4320 at 60 Hz RGB 8-bit with dual DisplayPort connectors or 7680x4320 at 60 Hz YUV420 8-bit with one DisplayPort 1.3 connector.
2 - DisplayPort 1.2 Certified, DisplayPort 1.3/1.4 Ready.
3 - Recommendation is made based on PC configured with an Intel Core i7 3.2 GHz processor. Pre-built system may require less power depending on system configuration.
What the hell is "DisplayPort 1.3/1.4 Ready"? Ready for what? DisplayPort 1.4 specification was finalized a while ago. Are we back to the old DP1.2 again?
The cards themselves might not be certified just yet. Nvidia expects them to be certified since they meet the DP 1.4 spec, but they haven't passed certification yet - so they say "Ready" instead of "Certified".
 

hahahanoobs

TS Evangelist
OMG the girl in the audience at the GTX event is HILARIOUS!

Jensen
-...the GTX 1080 for $599 MSRP
Girl
-OMG!
-WHAT!!
-WHAT!!!
-1080 WHAT?!
-I CAN AFFORD THAT!
-I CAN AFFORD THAT!!!

Jensen
-But I have more.
Girl
-WHAT?!
-WHAT?!
Jensen
-GTX 1070
Girl
-WHAT!?!
-WHAT??!!
-10-7 WHAT?!!?
 

Puiu

TS Evangelist
What else do you primarily use GTX cards for?
They did mention that these big numbers only apply in some "special" situations (like VR), so it doesn't provide 2x better FPS in 99% of games.
So to answer your question, numbers like that are just false advertising if you don't add the asterisk at the end.
I can also say that my table is 2x better than yours: you know what a table is used for, but what does the 2x refer to?
 

dividebyzero

trainee n00b
"Founders Edition"?
You mean the "You're gonna get screwed even harder when the 1080 Ti comes out" edition?
Nvidia sells direct to the consumer. The Founders Editions will be reference cards clocked higher than stock. EVGA in particular makes serious coin off clock-bumping reference cards (and their ACX-cooler brethren) and then selling the Superclocked variants at a premium (usually ~10% over reference MSRP). Nvidia is just following suit.
they did mention that these big numbers are only for some "special" situations (like VR). so it doesn't provide 2x better FPS in 99% of the games.
True. SMP will play a sizable role in those numbers. If the rumoured 160 TMU count for GP104 is true, I'd expect performance between it and the 980 Ti (and Titan X) to be fairly close at 4K with HD texture packs applied. Once you overclock both cards, the GTX 1080 should exert a degree of control - probably considerable - in anything less than the most demanding of situations. I don't think many people - myself included - were expecting the new card to be able to break 2100 MHz core on air.
From nVidia website:
What the hell is "DisplayPort 1.3/1.4 Ready"? Ready for what? DisplayPort 1.4 specification was finalized a while ago. Are we back to the old DP1.2 again?
I wouldn't be surprised if Nvidia holds off on full certification until its monitor vendors either make a better market distinction between G-Sync and adaptive sync, or it gives vendors time to phase out G-Sync models if any distinction is lost on the buying public.
 

BMfan

AMD brings out numbers and most people say "Yeah, whatever"; Nvidia brings out numbers and it's "Oh my God".
 