Nvidia GeForce GTX 1080 Review: The Mad King of GPUs

I'm not terribly interested in how people too lazy to educate themselves should be given special dispensation.

Those "too lazy" people are quite a large group. I'm not interested in tablet users either, but looking at how many sites are designed for tablets, quite many are.

Their averaged scores are aggregated from sites that vary from the valid to little more than blogs and poorly disguised marketing. Again, there's no cure for stupid.

There's no cure for stupid, true, but if everything is made for the stupid ones, stupidity will eventually increase.

So you'd like the score to better cater to people who can't be bothered to spend a few minutes researching their prospective purchase, while also catering to their Twitter/ADHD tendencies by eliminating the need to read the review. I hope I never have to see an era where enthusiast sites have to dumb down content to Twitter-sized word bites liberally sprinkled with emojis for those whose attention strays after half a dozen words.
Personally, I hope these people just bypass TechSpot and move on to sites fully geared for 2 minute attention spans and a low level of mental acuity.

Those who read the review do not actually need any rating. For those who just check the rating, that rating is everything. So: the article for enthusiasts, the rating for the lazy. Or perhaps even a separate short conclusion section for the lazy ones?

Your hope is admirable but still doomed. Many think they "know" something after they've checked the score on some UltraHighTech enthusiast site. After that, they think they're gurus.

I have a system for that. It's called adding to their knowledge base when I can. I'll provide further reading links and maybe a synopsis of the information. I'd prefer to offer information rather than have the site kowtow to the lowest level of interest.

What if they have no time/interest to read further links? Most people are not interested in video cards the way you and I are. We still have to get along with those people, so why make it harder?

So the site should better tailor their articles to people who can't be bothered reading said articles, so these people, who can't be bothered researching their possible hardware buy, will feel... what exactly? A 95/100 is also pretty much a must buy. You think these people are then going to go back and re-read the entire article just to see if the missing 5% impacts their purchase? They couldn't be bothered in the first instance.
I have no sympathy for anyone who makes a substantial purchase without researching it beforehand. Anyone who relies on snippets from a single review (let alone its final distilled score) and blames anyone but themselves for the outcome AND is too stupid to return the item if dissatisfied really deserves everything they get.

95/100 means it's not perfect, so there must be some reason why it's NOT perfect. At least someone will check where that 5/100 was lost. Most won't, but again: while there is no cure for stupid, it doesn't mean everything should be done for the stupid ones.

I have no sympathy either, but the fact is that most people in the world are stupid when it comes to video card knowledge. And while I'm not saying articles should be made for them, that big group should be considered when making articles.

You missed the point. The AMD lead is less than half the average it posts in other games (aggregated) at 4K. What I posted was the best case scenario for AMD, bearing in mind 4K's love of high texture fill rates. If I were to choose the middle ground, where Nvidia's TAUs weren't limiting their cards and AMD's cards were likewise unaffected by raster op inefficiencies at much lower resolutions, the difference is more noticeable.
jc3_2560_1440.png

Looking more carefully, it seems that GTX 970 SLI loses to a single GTX 970, so I would not draw any big conclusions from that one. AMD cards are generally better at high resolutions, true. Just Cause 3 is also quite a crappy game and not a good one to draw performance conclusions from. The main problem is that DX12 games are still very rare.

These things are cyclic. Fermi was a compute-centric architecture, and still stands as one of the best archs for compute efficiency. Yes, during Fermi's reign (and GT200 before it) Nvidia fanboys paid no attention to perf/watt. But you know who held it as paramount? AMD fanboys. When Evergreen arrived, perf/watt and perf/mm² (a newly important metric that arrived overnight) were the sole points of interest. As soon as AMD pushed "always on" compute and wattage climbed with the GCN architecture, perf/watt suddenly became irrelevant to AMD fanboys.
It cuts both ways. Always has.

I don't really remember AMD fanboys ever hyping power efficiency as much as Nvidia fanboys did when Maxwell came out.

Nope. AMD's R&D couldn't sustain multiple developments and AMD had too many irons in the fire. Console development, an expensive-to-run logic layout business (since sold to Synopsys), a poorly thought-out attempt at making a splash in the ARM server architecture market, and very likely a substantial ongoing investment in HBM integration which began at least 5 years ago... not to mention a long-running APU/CPU architecture development.
If you want to distill AMD's woes down to a single point, it is their management's lack of strategic planning, goal setting, and a reliance upon being reactive rather than proactive in the industry. Too busy trying to imitate those more successful, but putting little thought into how to achieve goals and the actual returns on investment and time (see the SeaMicro acquisition for a prime example) once a course of action is embarked upon.

The ARM server architecture market is mostly a backup plan in case ARM really becomes strong in servers. AMD can do it, Intel cannot. AMD has already integrated HBM into a GPU, Nvidia has not. Remembering the many manufacturing problems of recent years, that is something important. HBM2 can also finally resolve the APU's memory bandwidth problems.

SeaMicro's idea looked good but doesn't seem to be working right now, so AMD pulled the plug. For a while at least.

That is one interpretation, but I don't think it is correct. As a serial upgrader myself, the best time to sell old hardware is just before the new series arrives. You still recoup a reasonable amount of your original purchase cost and can use the funds to offset the new purchase.

The GTX 980 Ti launched just a year ago. And while it was made on very old 28nm tech, it was very expensive. It's generally quite funny that GPU prices basically went higher as the technology became more obsolete. While I agree that selling old hardware is best done before new tech arrives, we are still talking about a product released one year ago. I would expect a $700 card to last much longer than one year. For many it seems not to. It's not hard to claim that most GTX 980 Ti buyers did not listen to those who said 16nm cards would be much better, and now they are in panic mode. In any case, the GTX 980 Ti ranks very high on (or even tops) the "worst video card buys ever" list.

Not just DX11 software. If that were solely the case, why did it take Nvidia to publicize frame pacing, ShadowPlay, GeForce Experience, and a host of other software that AMD has eagerly tried to adapt to its own uses? I'm guessing that Ansel and MSP will also find themselves with AMD analogues in the not too distant future.

Those are just software-side extras that are quite easy to copy if they become successful. Not all software features are successful, and developing them costs something.

I've been hearing a near-constant stream of this marketing since Raja Koduri claimed that Polaris and 14nm were well ahead of Pascal and 16nm...

The overall target is still "console-class gaming on a thin-and-light notebook."

Nvidia's 16nm offerings for console-class gaming and notebooks are coming when?

...Yet Nvidia have demonstrated the largest non-Intel GPU in the world on 16nm with series production underway (over 4,500 pre-sold at $10K apiece), have the GTX 1080 reviewed and a week from retail availability, the volume-market GTX 1070 basically ready to go (holding it back obviously a marketing strategy), and the mass-market GP106 due to arrive in a month.
:rolleyes:

And the GTX 1080 is not for the mainstream market, it's for the enthusiast/high-end market. GP106 will likely compete with Polaris 10, but still, where are Nvidia's 16nm low-end offerings? It also takes much more capacity to launch a big volume of notebook and low/mid-range chips than to launch one high-end card with very limited availability.
 
There's no cure for stupid, true, but if everything is made for the stupid ones, stupidity will eventually increase.
The exact scenario you are pushing. Trying to boost your self-esteem by lowering the bar?
Or perhaps even a separate short conclusion section for the lazy ones?
Conclusion: "Read the f*cking review if you plan on buying hardware. Can't be bothered? Your funeral". Cut and paste into every review conclusion for people who don't actually care about the review
Your hope is admirable but still doomed. Many think they "know" something after they've checked the score on some UltraHighTech enthusiast site. After that, they think they're gurus.
And some people actually learn something new (not you naturally). If imparting knowledge on a tech site is doomed, why are you trying so hard to spread your gospel?
What if they have no time/interest to read further links?
Then they continue to do whatever they usually do. Hopefully it doesn't include spending money on hardware that they don't have a clue about...but if they do, I certainly don't care
Most people are not interested in video cards the way you and I are. We still have to get along with those people
Do we? I don't. If someone wants to discuss graphics cards but 1. doesn't have a clue about them, and 2. doesn't care to learn, then I don't feel disposed to engaging with them.
95/100 means it's not perfect, so there must be some reason why it's NOT perfect. At least someone will check where that 5/100 was lost
And that person will very likely just look at the benchmark graphs because that's way easier than reading.
Looking more carefully, it seems that GTX 970 SLI loses to a single GTX 970, so I would not draw any big conclusions from that one.
I'm beginning to think the reason you are championing the videocard challenged is because you are one.
Just Cause 3 is also quite a crappy game
It's just the first example of CR's use. AotS is an even crappier game; does that preclude its features ever being used again? :rolleyes:
I don't really remember AMD fanboys ever hyping power efficiency as much as Nvidia fanboys did when Maxwell came out.
See, that's why I have trouble taking you seriously. The GTX 480 comment threads were an absolute sh1tfest. TPU was mild compared to the mouthbreathers at OCN. You won't see anyone yukking it up in these threads who posted anything in the R9 290X threads - a card that runs hotter and uses more power than the GTX 480.
The ARM server architecture market is mostly a backup plan in case ARM really becomes strong in servers. AMD can do it, Intel cannot.
Oops. More stuff you don't know. Intel have had an ARM architectural license since the DEC settlement.
AMD has already integrated HBM into a GPU, Nvidia has not.
You really love shilling, don't you. Nvidia's P100 HBM2-equipped DGX-1 was demonstrated by QuantaCT at GTC in January.
[images: Nvidia Tesla P100 (GP100) photos]


SeaMicro's idea looked good but doesn't seem to be working right now, so AMD pulled the plug.
Wrong. AMD's Seattle development was too slow. Companies like Cavium (Thunder-X), Broadcom (Vulcan), and Applied Micro (X-Gene) destroyed AMD's whole program. So much for AMD's claim that they'd hold 25% of the ARM server market. More hollow boasts and the whole SeaMicro acquisition written off.
Your shilling is getting worse.
In any case, the GTX 980 Ti ranks very high on (or even tops) the "worst video card buys ever" list.
Spoken like a true fanboy. For a year the custom overclocked 980 Tis were pretty much the top dogs in the GPU world. Even now, a high-clocked 980 Ti is competitive with the latest cards.
Nvidia's 16nm offerings for console-class gaming and notebooks are coming when?
Woo hoo, more trolling. What has that got to do with Koduri's assertion that 14nm is months ahead of 16nm? Hint: nothing.
Where are Nvidia's 16nm low-end offerings?
The same place ALL AMD's 14nm GPUs are. Unreleased.

I'm done here. You've demonstrated a stunning lack of knowledge, a predilection for shilling for AMD, and general trolling behaviour. How anyone can bleat on about Nvidia not having HBM when the whole tech world saw the demonstrations and the 4,500-card order placed for the Large Hadron Collider is beyond me. Even non-tech heads know what the LHC is.
 
The exact scenario you are pushing. Trying to boost your self-esteem by lowering the bar?

Generally I try to promote less stupid alternatives to Facebook and such.

Conclusion: "Read the f*cking review if you plan on buying hardware. Can't be bothered? Your funeral". Cut and paste into every review conclusion for people who don't actually care about the review

This website seems to be designed for tablets. I thought only casual users used such things. So casual users were taken into consideration when designing this website. Articles should take them into consideration too.

And some people actually learn something new (not you naturally). If imparting knowledge on a tech site is doomed, why are you trying so hard to spread your gospel?

Trying to learn is OK, but not everyone realizes that learning something takes much more than reading a few lines and checking some pictures from a single article.

Then they continue to do whatever they usually do. Hopefully it doesn't include spending money on hardware that they don't have a clue about...but if they do, I certainly don't care

But I care, because what sells now will be sold in the future. There is no denying that the 16:10 aspect ratio is awesome compared to 16:9 for everything other than watching videos. Where can I find a reasonably priced, modern 27-inch 16:10 display? Nowhere. That is the problem. Stupid buyers can eventually decide what is available and what is not.

Do we? I don't. If someone wants to discuss graphics cards but 1. doesn't have a clue about them, and 2. doesn't care to learn, then I don't feel disposed to engaging with them.

We do. Underpowered consoles sell well = console games sell well = more and more PC games are crappy console ports. What is bought now will be available in the future. And what is not...

And that person will very likely just look at the benchmark graphs because that's way easier than reading.

A score is, however, easier and faster to check than many benchmark graphs.

It's just the first example of CR's use. AotS is an even crappier game; does that preclude its features ever being used again? :rolleyes:

Their features will be used again, but neither of those games necessarily gives an indication of how cards will rank in future games.

I'm beginning to think the reason you are championing the videocard challenged is because you are one.

That clearly shows why the game is not so good. However, why test an SLI setup with a game that does not support it?

This is a good example of what I said. You provided just two pictures as benchmarks, which show GTX 970 SLI to be slower than a single GTX 970. Now, I do a little more than most: I copy the picture address and Google it. I get this page https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/19.html where that picture is taken from.

Not a single mention that the game does not support SLI. Then I Google "Just Cause 3 SLI" and finally it says that this game does not support SLI. So even reading the article is not enough to reveal this, and many people will think that SLI is generally slower than a single card.

See, that's why I have trouble taking you seriously. The GTX 480 comment threads were an absolute sh1tfest. TPU was mild compared to the mouthbreathers at OCN. You won't see anyone yukking it up in these threads who posted anything in the R9 290X threads - a card that runs hotter and uses more power than the GTX 480.

IIRC the GTX 480 was among the first cards to exceed the PCI Express 300-watt limit. So it caused a WTF effect, because it was thought that 300 watts really was a hard limit. Years later, over-300-watt cards became more common and the shock effect was gone.

Oops. More stuff you don't know. Intel have had an ARM architectural license since the DEC settlement.

And? Intel dominates the server CPU market with x86-based chips, while AMD has a very poor share right now. So while nobody really cares if AMD offers both ARM and x86-based chips, Intel offering ARM chips would mean its x86 market share would shrink and leave more room for competitors. It would also mean that Intel=x86 no longer applies.

So while Intel could theoretically make an ARM chip for servers, in practice right now they really cannot.

You really love shilling, don't you. Nvidia's P100 HBM2-equipped DGX-1 was demonstrated by QuantaCT at GTC in January.

And that can be bought from where? AMD has sold many products with HBM memory; Nvidia has only sampled. That product is supposed to be available over a month from now. So does Nvidia have any HBM products available right now? I think not. Also, this thread is about consumer graphics cards, so server offerings can be ignored.

Wrong. AMD's Seattle development was too slow. Companies like Cavium (Thunder-X), Broadcom (Vulcan), and Applied Micro (X-Gene) destroyed AMD's whole program. So much for AMD's claim that they'd hold 25% of the ARM server market. More hollow boasts and the whole SeaMicro acquisition written off. Your shilling is getting worse.

It's been known for some time that AMD wanted x86 and ARM to share the same socket. That project, Skybridge, was later abandoned, and that was because GlobalFoundries had problems. Skybridge and SeaMicro were strongly connected, so better to blame GlobalFoundries for that one.

Spoken like a true fanboy. For a year the custom overclocked 980 Tis were pretty much the top dogs in the GPU world. Even now, a high-clocked 980 Ti is competitive with the latest cards.

For one year, right. Repeat: one year. It's quite expensive to sell a one-year-old card for about $200 less than you bought it for. Even worse if it was bought last Christmas. That makes about 180 days and about $200: over one dollar per day. Pretty expensive rental, I would say :D

Woo hoo, more trolling. What has that got to do with Koduri's assertion that 14nm is months ahead of 16nm? Hint: nothing.

So learn to read. What does it say there:

- High end GPU in 2016
- Overall target is "console-class gaming on a thin-and-light notebook."

"We believe we're several months ahead of this transition, especially for the notebook and the mainstream market.

Now, where are Nvidia's offerings for notebooks and the mainstream? Even rumours of them? No. So AMD seems to be months ahead.

The same place ALL AMD's 14nm GPUs are. Unreleased.

Planned release when? AMD is releasing theirs at Computex soon. Nvidia?

I'm done here. You've demonstrated a stunning lack of knowledge, a predilection for shilling for AMD, and general trolling behaviour. How anyone can bleat on about Nvidia not having HBM when the whole tech world saw the demonstrations and the 4,500-card order placed for the Large Hadron Collider is beyond me. Even non-tech heads know what the LHC is.

Jesus. This thread is about PCI Express video cards and you are talking about some server stuff that is not even available :D
 
Jesus. This thread is about PCI Express video cards and you are talking about some server stuff that is not even available :D
Hahahaha. A comment like that, hahahaha, and you keep talking about AMD. Ahahaha.

I guess you are also ignoring that this is an nVidia thread. And talking about a product of the same architecture is not as far off topic as bringing up a whole different manufacturer. Don't get me wrong, I like a good nVidia/Intel vs. AMD debate. But your stance on what is off topic... lol
 
Hey all, I follow this YouTube channel and I trust this guy with his hardware news/evaluations. I believe he's informative and unbiased. He just made a video about performance and some potentially important information/opinions on Nvidia's new flagship GPU, the GTX 1080. I hope you find it useful. Enjoy the video. https://youtu.be/myDYnofz_JE
 
Hey all, I follow this YouTube channel and I trust this guy with his hardware news/evaluations. I believe he's informative and unbiased. He just made a video about performance and some potentially important information/opinions on Nvidia's new flagship GPU, the GTX 1080. I hope you find it useful. Enjoy the video.

It is pretty easy to sit back and pass judgement once the dust has settled. Even more so when you don't actually have the product yourself.

A couple of key points to note here. We actually test in a case, the Corsair Carbide 600C, and no single test takes less than 4 minutes. All up, it isn't possible to complete a single game test in under 9 minutes; I would say 15 mins per game seems realistic. I stand by our results and the test methods. I agree that the Founders Edition version isn't great value, particularly when the board partner cards will cost $100 less. I thought that point was made clear.

EDIT: He might want to do some of his own testing before pointing the finger, I am not seeing throttling anything like what was shown in the video.

As for the boost clocks, the card is rated for a boost clock speed of 1733MHz, after 30mins of gaming in any of the games we tested I haven't seen it dip below 1785MHz for more than a second, it often runs closer to 1.8GHz in fact. I have no idea why computerbase.de's card was running so slow, but I would really like to hear what others are seeing.
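
If anyone wants to log what their own card is doing rather than argue from screenshots, here is a minimal sketch of the sort of polling I mean, assuming Python and nvidia-smi are available (the one-second interval, 30-minute window and log filename are arbitrary choices for illustration, not our actual test tooling):

```python
import csv
import subprocess
import time

# Poll nvidia-smi once per second and log the SM clock and temperature,
# so sustained boost behaviour can be checked over a long gaming session.
QUERY = ["nvidia-smi",
         "--query-gpu=clocks.sm,temperature.gpu",
         "--format=csv,noheader,nounits"]

with open("boost_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "sm_clock_mhz", "temp_c"])
    start = time.time()
    while time.time() - start < 30 * 60:  # 30-minute session
        out = subprocess.check_output(QUERY, text=True).strip()
        clock_mhz, temp_c = (v.strip() for v in out.split(","))
        writer.writerow([round(time.time() - start), clock_mhz, temp_c])
        time.sleep(1)
```

Graphing sm_clock_mhz against elapsed_s makes any throttling pattern obvious at a glance.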
 
It is pretty easy to sit back and pass judgement once the dust has settled. Even more so when you don't actually have the product yourself.

A couple of key points to note here. We actually test in a case, the Corsair Carbide 600C, and no single test takes less than 4 minutes. All up, it isn't possible to complete a single game test in under 9 minutes; I would say 15 mins per game seems realistic. I stand by our results and the test methods. I agree that the Founders Edition version isn't great value, particularly when the board partner cards will cost $100 less. I thought that point was made clear.

EDIT: He might want to do some of his own testing before pointing the finger, I am not seeing throttling anything like what was shown in the video.

As for the boost clocks, the card is rated for a boost clock speed of 1733MHz, after 30mins of gaming in any of the games we tested I haven't seen it dip below 1785MHz for more than a second, it often runs closer to 1.8GHz in fact. I have no idea why computerbase.de's card was running so slow, but I would really like to hear what others are seeing.
Hey, no offense to you or your hard work! I visit your site often and I like your reviews. I shared this because it offers a different opinion, just that. Though I do not think Nvidia was perfectly honest in their presentation. They showed the 1080's temps much lower than in the real world (I checked this against different reviews), almost implying that any card would hit 2GHz+ at decent temps. Still, tests by different reviewers show somewhat different temperature results. In your case, 1.8GHz is not a bad boost clock, but if it hits 83-84 degrees at that point, this product has little to no overclocking headroom.
 
Wow, those numbers speak for themselves. Time to retire my GTX 770. The 100 score seems realistic. Who would have thought that 1440p resolution would be in our hands right now at that price.

Prepare for the influx of worried AMD Fury users.
AMD is going to have to restructure their pricing aggressively if they plan to sell another Fiji graphics card this year. The GTX 1080 Founders Edition costs just 11% more and is some 30% faster. Not just that, but once overclocked the new GTX 1080 can be up to 60% faster than the Fury X, while AMD's GPU is renowned for its poor overclocking headroom.
 
EDIT: He might want to do some of his own testing before pointing the finger, I am not seeing throttling anything like what was shown in the video.

As for the boost clocks, the card is rated for a boost clock speed of 1733MHz, after 30mins of gaming in any of the games we tested I haven't seen it dip below 1785MHz for more than a second, it often runs closer to 1.8GHz in fact. I have no idea why computerbase.de's card was running so slow, but I would really like to hear what others are seeing.

Tom's also saw huge drops after only 4 minutes -

[image: Tom's Hardware GTX 1080 clock-rate chart]


PCGH saw a similar story.

[image: PCGH clock-rate chart]



It is pretty easy to sit back and pass judgement once the dust has settled. Even more so when you don't actually have the product yourself.

Yes you're right, I don't need to worry about Nvidia taking away my marketing bucks and free cards by telling the truth about their card. It's very liberating knowing that you can judge tech on its merits. That means I'll be buying the cards to test myself and getting true consumer samples - maybe consider doing that yourself every so often and see how it lines up with your press sample? Could make a good article at least.

Note I wasn't "pointing the finger" at your article, more the stupid score. A perfect 100 for this card is ludicrous under any circumstances.
 
I have since spent hours monitoring the operating frequency and I rarely see my card drop below 1785MHz in any game. Testing of course in a computer case as we always do.

As I said it isn’t even possible for us to benchmark a single game in under 4 minutes, 15mins would be the absolute fastest time I could complete a game test.

Yes you're right, I don't need to worry about Nvidia taking away my marketing bucks and free cards by telling the truth about their card. It's very liberating knowing that you can judge tech on its merits. That means I'll be buying the cards to test myself and getting true consumer samples - maybe consider doing that yourself every so often and see how it lines up with your press sample? Could make a good article at least.

Are you really insinuating that we aren't telling the truth? I am so sick of people claiming reviewers are all corrupt and take money from companies such as Nvidia in return for a glowing review. This really pisses me off! I have been doing this for 15 years now and in that time I have never even been offered money in exchange for a review. I have been heavily pressured to change reviews (scores and conclusions) and we have never once done so. In fact, our relationship with Nvidia, for example, is quite rocky as we have often called them out over things like their GameWorks program, which has not gone over well.

We will also be testing production cards once available, I suspect the Founders Edition will perform exactly as ours has. If not, we will be the first to tell you about it. TechSpot has a long history of providing honest and accurate CPU and GPU reviews.

From Tom’s Hardware
“It offers a substantial step up from GM204 and an impressive boost compared to the former flagship GeForce GTX 980 Ti. In fact, across the eight real-world games we benchmarked today, GeForce GTX 1080 averages 34%-higher frame rates than the 980 Ti at 3840x2160.”

We found the GTX 1080 to be 26% faster than the 980 Ti at 4K.

From Anandtech
“Translating this into numbers, at 4K we’re looking at 30% performance gain versus the GTX 980 Ti and a 70% performance gain over the GTX 980, amounting to a very significant jump in efficiency and performance over the Maxwell generation.”

Again we found the GTX 1080 to be 26% faster than the 980 Ti at 4K.
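
To be clear about where such figures come from: an "X% faster" number is just the ratio of the two cards' average frame rates, so the spread between our 26%, Anandtech's 30% and Tom's 34% comes down to game selection and settings. A quick sketch with made-up fps values, purely for illustration:

```python
def pct_faster(new_fps: float, old_fps: float) -> float:
    """Percent by which new_fps exceeds old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical 4K averages, illustration only: 63 vs 50 fps reads as
# "26% faster", while 67 vs 50 fps reads as "34% faster".
print(f"{pct_faster(63.0, 50.0):.0f}%")  # 26%
print(f"{pct_faster(67.0, 50.0):.0f}%")  # 34%
```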

You focused on the data from Computerbase.de in your video and yet their conclusion appears every bit as positive as mine.

“The Nvidia GeForce GTX 1080 is an impressive high-end graphics card without significant weaknesses.” – They also suggested that partner boards will be even better, just as I did.

“Away from the cooling system, the GeForce GTX 1080 (Founders Edition) is currently by far the best graphics card on the market.”

Finally, if you are only taking issue with our perfect score then fine. Out of interest, would a 95 or 90 have felt better? If we can't ever give a score of 100 (which we hadn't done until now) should it be out of 90 then?
 
I have since spent hours monitoring the operating frequency and I rarely see my card drop below 1785MHz in any game. Testing of course in a computer case as we always do.

As I said it isn’t even possible for us to benchmark a single game in under 4 minutes, 15mins would be the absolute fastest time I could complete a game test.

Many other sites do warm up periods lasting longer than 15 minutes.

Are you really insinuating that we aren't telling the truth? I am so sick of people claiming reviewers are all corrupt and take money from companies such as Nvidia in return for a glowing review. This really pisses me off! I have been doing this for 15 years now and in that time I have never even been offered money in exchange for a review. I have been heavily pressured to change reviews (scores and conclusions) and we have never once done so. In fact, our relationship with Nvidia, for example, is quite rocky as we have often called them out over things like their GameWorks program, which has not gone over well.

No what I'm saying is that it's liberating to be able to fully judge a product on its merits, without fear of reprisals.

For example, I'm sure you heard all about Phoronix being blacklisted?
http://www.phoronix.com/scan.php?page=news_item&px=GTX-1080-Embargo-Lift

I'm sure you heard all about HardOCP's issue with AMD not sending cards recently?

I do find it amusing that you got so defensive over that comment however. YOU were the one who said it was easy to pass judgement when you don't have the product yourself. Yes it is. It is easy to look objectively at the data when I have zero fear of reprisals. I don't need to second guess what might happen if I don't.

We will also be testing production cards once available, I suspect the Founders Edition will perform exactly as ours has. If not, we will be the first to tell you about it. TechSpot has a long history of providing honest and accurate CPU and GPU reviews.

In other words you've already decided that you're going to get the same results even after I pointed out that plenty of sites don't with their samples? Shouldn't you be waiting on actual results before making that assumption, especially given what I've just shown you?

Or maybe you think your testing is so much better than Tom's, PCGH and ComputerBase, and that they all must be doing it wrong?

Finally, if you are only taking issue with our perfect score then fine. Out of interest would a 95 or 90 have felt better? If we can’t ever give a score of 100 (which we hadn’t done before) should it be out of 90 then?

Yes, like I said, the issue was with your ludicrous 100 score - like this is the best graphics card ever made. If you're going to make statements like that then get a brass neck and expect to be called out on it.
 
Many other sites do warm up periods lasting longer than 15 minutes.

When benchmarking I quickly move from one game to the next; there isn't a cool-down period. Certainly not one long enough to allow the card to reach its idle temp. As you have seen, these cards get up to temp very quickly, and that isn't something exclusive to the 1080 reference card. I often see cards reach their threshold in the game menus while confirming all the quality settings.

Still, based on this new evidence, I will make sure there is a 15-minute warm-up period before conducting any benchmarks. That said, I would like to make it clear that I have spent the last day re-testing with the card operating at the 82-degree limit and have found the exact same results. If there were an issue with our results I would, without question, amend our review.
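
In script terms that change is nothing more than a soak pass ahead of any timed run; a minimal sketch, where run_game_load() and run_timed_benchmark() are hypothetical stand-ins for whatever actually drives the game:

```python
import time

WARM_UP_SECONDS = 15 * 60  # soak until the card settles at its thermal limit

def run_game_load():
    """Hypothetical hook: keep the GPU under game load (not measured)."""
    time.sleep(1)

def run_timed_benchmark():
    """Hypothetical hook: the actual measured benchmark pass."""
    pass

# Warm up first so the timed pass reflects steady-state boost clocks,
# not the higher clocks of a cold card.
start = time.time()
while time.time() - start < WARM_UP_SECONDS:
    run_game_load()

run_timed_benchmark()
```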

No what I'm saying is that it's liberating to be able to fully judge a product on its merits, without fear of reprisals.

For example, I'm sure you heard all about Phoronix being blacklisted?
http://www.phoronix.com/scan.php?page=news_item&px=GTX-1080-Embargo-Lift

I'm sure you heard all about HardOCP's issue with AMD not sending cards recently?

I do find it amusing that you got so defensive over that comment however. YOU were the one who said it was easy to pass judgement when you don't have the product yourself. Yes it is. It is easy to look objectively at the data when I have zero fear of reprisals. I don't need to second guess what might happen if I don't.

We review products on their merits and aren’t concerned with reprisals. AMD, Nvidia and Intel have all excluded us from launches in the past due to reviews we refused to change, so be it. We take the hit and wait for them to get over it. Why wouldn’t I be defensive when someone calls my integrity into question?

In other words you've already decided that you're going to get the same results even after I pointed out that plenty of sites don't with their samples? Shouldn't you be waiting on actual results before making that assumption, especially given what I've just shown you?

Or maybe you think your testing is so much better than Tom's, PCGH and ComputerBase, and that they all must be doing it wrong?

I haven’t decided anything. I have the product in hand to examine, I have over a decade of experience, and I don’t feel that the production version of the Founders Edition will be any different. Of course it could be, and we aim to find out, as I said.

Tom's, PCGH and ComputerBase all got very similar results to mine; in fact, as I pointed out, their margins over the GTX 980 Ti were actually slightly larger.

Also, as shown in my previous post, Computerbase.de said “the GeForce GTX 1080 (Founders Edition) is currently by far the best graphics card on the market”. A bit like my ludicrous 100 score suggesting this is the best graphics card ever made :S

Yes, like I said, the issue was with your ludicrous 100 score - like this is the best graphics card ever made. If you're going to make statements like that then get a brass neck and expect to be called out on it.

If it is just the score, then that is fine. You seemed to be declaring our review bad or dishonest in your video, which was my issue. The results are accurate, they line up with other sites, and the conclusion reads similarly to those from the sites you listed. This isn’t a bad review by any means.

Finally, here are some examples that have landed us in hot water with the company in question…

Testing DirectX 11 vs. DirectX 12 performance with Stardock's Ashes of the Singularity
https://www.techspot.com/review/1081-dx11-vs-dx12-ashes/
“Nvidia overstates DX12 support”

Project CARS Benchmarked: Graphics & CPU Performance
https://www.techspot.com/review/1000-project-cars-benchmarks/page6.html
“Nvidia: Kepler (old) vs. Maxwell (new)”

The Witcher 3: Wild Hunt Benchmarked: Graphics & CPU Performance
https://www.techspot.com/review/1006-the-witcher-3-benchmarks/page7.html
“Nvidia's HairWorks plays an important role in making The Witcher 3 stand out visually, but it's that same feature that spoils performance to a halt. When enabled, hair quality is impressive, but expect a 20% drop in average frame rates and a shocking 60% reduction in minimum fps.”

“We'd love to see The Witcher 3 using AMD's TressFX and chances are Nvidia's own hardware would run faster, especially Kepler-based cards.”

Batman: Arkham Knight Benchmarked: Graphics & CPU Performance
https://www.techspot.com/review/1022-batman-arkham-knight-benchmarks/
“However for AMD the problem is that like other recent releases such as The Witcher 3 and Project CARS, Batman: Arkham Knight is laced with Nvidia's performance crushing GameWorks features including PhysX clothing and destruction on both PC and consoles. There are a few features that can only be enabled on Nvidia GPUs, such as interactive paper debris and interactive smoke/fog.”

There are many more examples but I am sure you get the idea.
 
Steve, I am not questioning your integrity - I only have an issue with the 100 score. It's not helpful and really just creates hype for a mediocre product. I'm not talking about the chip, I'm talking about the card, which is what you were all reviewing.

100 means perfect. You then went and basically said it wasn't perfect due to the price, so that alone should have been enough to drop the score. The price is a joke, and you know that better-cooled, cheaper cards are coming soon as well.

Yes I know I have hindsight that you didn't have while doing the review - but when you look back at this yourself in a few months I'm sure you'll wonder wtf you were smoking at the time as well.
 
Scores are subjective. I was part of the decision to score the GTX 1080 100/100 and stand by it. We don't do that often. Does it mean the product is "perfect"? Given the intended market and within the context of the review (there is a full conclusion that goes with the score, you know) I think so, or at least it gets as close as we've come in a very long while.

We have a 15+ year track record reviewing components and have no issues with people disagreeing with our conclusions or when the occasion arises, recognizing our mistakes.

I think Steve has gone above and beyond in responding to users' feedback and inquiries. Whether it's testing methodology or otherwise, we happen to learn a lot from our readers and we are bound to discover new things or miss something entirely, but that doesn't seem to be the case here.
 
"The GeForce GTX 1080 might only be 16% faster than the Titan X in GTA V, but that margin makes all the difference. "

No, it doesn't. It still averages BELOW 60fps. You still need G-sync to avoid stuttering, which means either a more expensive monitor or a new one altogether.

"We now have a single GPU graphics card capable of ~60 fps at 4K with frame dips never dropping below the 50 fps mark."

"~60fps" is not what we want. Nobody here wants to settle for between 50-70fps. That's not the goal. What we want is a solid, consistent 60fps. This card isn't up to it. It's as close as you can get, but it still isn't quite there.

Honestly, I have no idea why people are so keen to over-represent these cards. It's already an excellent card - you don't need to misrepresent its capabilities by fudging the numbers and distorting the performance.
 