Nvidia GeForce GTX 1080 Review: The Mad King of GPUs

Steve

Posts: 3,044   +3,153
Staff member

Things used to move further and faster back in the early days of GPUs. In fact, things moved pretty quickly until just four years ago, when we made the transition from the 40nm design process used by Nvidia’s Fermi architecture to the 28nm process, still used today. This extended development cycle has had AMD and Nvidia squeezing the absolute most out of the 28nm design process.

But as good as the Titan X and Fury X are, it is time to move on in the quest for greater efficiency and even greater performance. First to market with a true next generation GPU is Nvidia. Codenamed Pascal, this latest architecture promises big things and could very well be the biggest step we've seen in recent years.

The new GeForce GTX 1080 is faster, built using the 16nm design process and packed with GDDR5X memory. It promises to put away the Titan X while consuming less power than the 980 Ti. We put this and other bold claims to the test.

Read the complete review.

 
Great article! The Witcher benchmarks are what I wanted to see as I'm getting the new expansion when it comes out at the end of the month. It looks like the 1080 will do very well over my 780 at 1440p. So very excited. I don't mind the Founders Edition because I really like the look of it, even if I do have to pay a premium.
 
Score 100/100?

- AMD's Async shaders implementation is superior compared to Nvidia's.
- This card uses GDDR5X memory, so will the first HBM2 card receive at least 150/100?
- DirectX 12 support is about the same level as Maxwell; again, AMD is miles ahead.

Giving something a perfect score is just ridiculous as it leaves no room for improvements, some of which already exist or soon will. Nvidia fanboy score spotted.
 
- Well, the first point is correct.

- The card does use GDDR5X, which is the best reliably available memory at present. That's how reviews work: you review against what exists today, not what is coming in the future.

- How many games even support DX12? No doubt Nvidia was working hard getting this out, and now they have the easier job of adding better support and drivers for DX12 gaming.

Just my thoughts. It's aggressive pricing - let's be honest, they are kicking their $1,000 cards out of the market and saying buy a 1080. That's ballsy, and you get great 4K performance. These are the releases that companies should be creating, yet we normally get the 10% increments for 75% more cost. This is a positive move all round and, at the current time of writing, is worthy of the 100/100.
 
Can't wait to see what the custom-cooled and factory-overclocked versions can do!
Asus, EVGA, Gigabyte etc. all have awesome coolers that should bring the temps way down.
Interested to see what the fastest out-of-the-box card can hit.
 
There will be a few 8K products later this year with support for DisplayPort 1.4. It will be interesting to see how the card performs under maximum load.
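For scale, the pixel math alone shows why 8K would be such a punishing load (a quick sketch, nothing card-specific assumed here):

    # pixels per frame at common resolutions
    pixels_8k  = 7680 * 4320   # 33,177,600
    pixels_4k  = 3840 * 2160   #  8,294,400
    pixels_fhd = 1920 * 1080   #  2,073,600

    print(pixels_8k / pixels_4k)   # 4.0  -> 8K pushes 4x the pixels of 4K
    print(pixels_8k / pixels_fhd)  # 16.0 -> and 16x the pixels of 1080p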
 
- 256-bit GDDR5X may be the best right now, but 384-bit or 512-bit GDDR5 should be even better (see the quick math at the end of this post). So using your logic, let's assume this card used 28nm instead of 16nm. 28nm would be the "best available", so it would not lower the score? Despite that "best available" 28nm tech being over 4 years old? This also answers your comment about "we normally get the 10% increments for 75% more cost": Nvidia and AMD stuck with 28nm way too long.

- If I spend $700 on a graphics card, I expect it to still be good after some time. Or should these cards be tested with only DirectX 9 games because DirectX 11 games are still very few compared to DX9 games?

As for the last section: this card shows nothing impressive. Even using the same architecture and just switching from 28nm to 16nm should give much better results. This is the FIRST 16nm card, and if it receives 100/100, what should be given to the second batch of 16nm cards?

Just making a product a little better than the previous one is very far from perfect. Also, this card undeniably has some features missing, which alone makes a perfect score unjustified.
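A quick back-of-the-envelope check on that bandwidth point, using the commonly quoted memory speeds (so treat the numbers as approximate):

    # peak bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)
    def bandwidth_gbs(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gbs(256, 10))  # GTX 1080: 256-bit GDDR5X @ 10 Gbps -> 320 GB/s
    print(bandwidth_gbs(384, 7))   # GTX 980 Ti / Titan X: 384-bit GDDR5 @ 7 Gbps -> 336 GB/s
    print(bandwidth_gbs(512, 6))   # R9 390X: 512-bit GDDR5 @ 6 Gbps -> 384 GB/s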
 
Was thinking of getting a 1080 Ti, but if the 1080 runs at over 80°C in use, I wonder if there will even be a Ti model. Well, I can wait as everything still runs on ultra, but it will be interesting to see.
 
Sure it's a nice card, but is it worthy of a perfect 100 score? I very much doubt it. Personally the GTX 1070 interests me more: you're not going to sacrifice much of anything, and the price more than makes up for the sacrifices, but even that card wouldn't be a perfect 100.
 
I'm not trolling, honestly, but what would you consider a 100/100? There is only so much these companies can do. This is a huge leap in terms of the chip's 16nm process. OK, granted, the styling of the card looks like every other GTX card, but the way they design cards these days you don't even see the card once it's installed.

There were some little extras too, not just the extra power. There is now MUCH better support for multiple displays, so there should be less tearing and blurring, and better handling of the orientation of surround screens. There is also Ansel, Super Resolution, VRWorks updates, Simultaneous Multi-Projection (this is really cool), Async Compute, new memory compression, HDR for gaming and encoding, and more...

So add all of these extras to the biggest leap in raw card power in at LEAST 3 years, and they deserve some kudos.

Also, I was curious, what would you have given it out of 100, and why?
 
I simply cannot believe you have omitted one of the most important benchmarks in your review - VR performance!
Why are there no VR benchmarks when Nvidia has claimed that Pascal performs especially well here with its Simultaneous Multi-Projection, and VR is pretty much the future of gaming?

Also, I agree we desperately need more DX12 tests.
I am disappointed with Pascal's performance here and would like to know whether we can expect much better in future games and/or future drivers.
 
Poor AMD fanboys. They gave Nvidia some mid-range competition for 6+ months. Now it's back to the bottom of the heap again.

3 to 4 times cheaper than the Titan X with better performance; if that doesn't give you a perfect score, I don't know what does.
 
Very disappointing overall. The 1080 is what I expected a 1080 Ti to be and costs as much. The 1070 is what I expected the 1080 to be, but with old GDDR5 memory, and costs as much...

A 1060 Ti with GDDR5X, come on, please...

Does the 1080 fully support DX12 in all its variations?
 
Poor AMD fanboys. They gave Nvidia some mid-range competition for 6+ months. Now it's back to the bottom of the heap again.

3 to 4 times cheaper than the Titan X with better performance; if that doesn't give you a perfect score, I don't know what does.

P.S. If anyone comments on the performance of a game in alpha on a DirectX version that's not even released yet... sad grasp.

AMD fanboys: "Oh, on the new Pokemon Island adventures coming out in 2018, AMD wins!! Uh uh uh, even though the new 1080 smokes AMD on 100% of existing games, the .000000000001% of games that haven't hit the shelves yet, particularly 1 game, shows it might maybe maybe maybe be better after they release DirectX 12!"

The Titan X is not 3 or 4 times more expensive, not even twice as much as the Founders Edition; bizarre comment.
And anyway, you need to compare the GTX 1080 with the 980 Ti, not the Titan X.
100/100 means perfection, or zero faults, yet an operating temp of 80°C is certainly less than perfect, as is the $100 premium for the Founders Edition, so a perfect score is nonsensical.

As for your DX12 comment: you think it is a sad grasp to want more DX12 tests because the only results we have so far are disappointing, and instead we should follow your wisdom and just assume DX12 performance will be wonderful without any evidence at all? That truly is sad.
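Going by the launch MSRPs (the widely reported $999 Titan X and $599/$699 GTX 1080 figures, so roughly):

    # launch prices in USD
    titan_x       = 999
    gtx_1080_fe   = 699   # Founders Edition
    gtx_1080_msrp = 599   # partner-card MSRP

    print(round(titan_x / gtx_1080_fe, 2))    # ~1.43x the Founders Edition price
    print(round(titan_x / gtx_1080_msrp, 2))  # ~1.67x the base MSRP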
 
Saying this card is a "little better" is by far an understatement and you know it. Compared to every other available GPU it's faster and at a reasonable price point, while consuming less power, hence the perfect score... You're just hating for the sake of hating because you can, or for another reason most people can see without me having to point it out. Perhaps save the perfect score for the board partners' designs that don't come at a price premium, but those are not available yet, so they can't be considered. I'm sure when they do come out they will also get perfect scores, and once again you will be upset about it, right?
 
The 1080 is about 2x as powerful per watt and 30% faster at the same price point as the 980 Ti. Overall efficiency of the card is much better than current Maxwell cards. This is why the reviewer gave the card a 100.
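A rough sanity check on that, using the rated TDPs (180 W for the 1080, 250 W for the 980 Ti) and an assumed ~30% average performance lead; on paper it works out closer to ~1.8x per watt, so "about 2x" is in the right ballpark:

    # approximate perf-per-watt ratio, GTX 1080 vs GTX 980 Ti
    perf_ratio = 1.30               # ~30% faster on average (assumed)
    tdp_1080, tdp_980ti = 180, 250  # rated board power in watts

    perf_per_watt_ratio = perf_ratio * (tdp_980ti / tdp_1080)
    print(round(perf_per_watt_ratio, 2))  # ~1.81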
 
So, by your logic, NOTHING should ever get a 100/100, because something better is ALWAYS coming down the road?

At this moment, the 1080 is a 100/100 contender. It is faster than any other single GPU, often by a huge margin, and does so while sipping juice compared to even Maxwell. It has a framebuffer big enough to satisfy gaming needs for the next few years, and it overclocks very well considering the limitations of its cooler. Performance should be even better with a nicer heatsink or water cooling. If being the king of GPU land for the foreseeable future, while being more efficient than everything else on the market in FPS/$, perf/watt, and pure performance alone, isn't enough to get a 100/100 score, then nothing is, and the rating system would be broken at that point.
 