Nvidia releases GeForce GTX 1070 specifications

Polaris 10 perhaps? To be seen of course. This is way too expensive to be a game changer.
How did I guess you'd point to something AMD.

To confirm, your idea of something game changing is something cheap?
I just checked, the definition of "game changing" is to do something out of the ordinary in a very positive way.

So I'll explain why the 1080 is "game changing" compared to at least the last few generations of Nvidia's GPUs (Maxwell, Kepler, etc., all the way back to the 480).

So,
Between the 480 and the 580, about 10fps difference in Crysis Warhead (1080p)
Between the 580 and 680, about 12fps difference in Crysis 2 (1080p)
Between the 680 and 780, about 6fps difference in Crysis 3 (1080p)
Between the 780 and 980, about 11fps difference in Crysis 3 (though this was run at 1440p, since Techspot didn't do a 1080p bench, unless I'm just blind?)

Now onto the meat of it all:
Between the 980 and 1080, about 31fps difference in Crysis 3 (done at 1440p as well)

Do you see where I'm going with this? In the last 6 years this is the biggest jump we've seen in performance. I'm sure I could go back even further and find it's been an even longer time since we've seen this sort of performance boost.
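Tallied up, those deltas look like this (a quick sketch only; the numbers are the approximate figures quoted above, and the game/resolution differ per entry, so these are raw fps jumps, not like-for-like percentages):

```python
# Approximate generation-to-generation fps jumps quoted above.
# Game and resolution vary per entry, so only the raw deltas are compared.
jumps = {
    "480 -> 580": 10,   # Crysis Warhead, 1080p
    "580 -> 680": 12,   # Crysis 2, 1080p
    "680 -> 780": 6,    # Crysis 3, 1080p
    "780 -> 980": 11,   # Crysis 3, 1440p
    "980 -> 1080": 31,  # Crysis 3, 1440p
}

# Pick the generation with the largest raw fps gain.
biggest = max(jumps, key=jumps.get)
print(f"Biggest jump: {biggest} (+{jumps[biggest]}fps)")
# Biggest jump: 980 -> 1080 (+31fps)
```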

Not only has performance gotten dramatically higher, but power usage hasn't increased and neither has the price (unless you want a reference card). It's also the first time we've seen a new x80-series card beat the last generation of Ti and/or Titan cards by a considerable margin.

This was out of the ordinary in a very positive way, therefore a game changer.
 
How did I guess you'd point to something AMD.

To confirm, your idea of something game changing is something cheap?
I just checked, the definition of "game changing" is to do something out of the ordinary in a very positive way.

A "game changer" needs to sell a lot, and cheap cards sell much more than expensive ones. Or it should contain something very special. The GTX 1080 is neither of those.

Polaris can be both if it brings a VR-capable card to the masses cheaply.

So I'll explain why the 1080 is "game changing" compared to at least the last few generations of Nvidia's GPUs (Maxwell, Kepler, etc., all the way back to the 480).

So,
Between the 480 and the 580, about 10fps difference in Crysis Warhead (1080p)
Between the 580 and 680, about 12fps difference in Crysis 2 (1080p)
Between the 680 and 780, about 6fps difference in Crysis 3 (1080p)
Between the 780 and 980, about 11fps difference in Crysis 3 (though this was run at 1440p, since Techspot didn't do a 1080p bench, unless I'm just blind?)

Now onto the meat of it all:
Between the 980 and 1080, about 31fps difference in Crysis 3 (done at 1440p as well)

Do you see where I'm going with this? In the last 6 years this is the biggest jump we've seen in performance. I'm sure I could go back even further and find it's been an even longer time since we've seen this sort of performance boost.

No, I don't see where you are going with this. I put my own list here:

GTX480 is 40nm part
GTX580 is 28nm part
GTX680 is 28nm part
GTX780 is 28nm part
GTX980 is 28nm part
GTX1080 is 16nm part

See where 28nm changes to 16nm?

Not only has performance gotten dramatically higher, but power usage hasn't increased and neither has the price (unless you want a reference card). It's also the first time we've seen a new x80-series card beat the last generation of Ti and/or Titan cards by a considerable margin.

This was out of the ordinary in a very positive way, therefore a game changer.

It was out of the ordinary because it's 16nm tech vs 28nm tech. Just doing a Maxwell-architecture chip on 16nm would be enough for the things you said. In fact the GTX 1080 is more of a huge disappointment than a game changer. A card that's just 30% better, on manufacturing tech that's at least 60% better, is much less than what is possible right now. Of course Nvidia has no reason to make as good a card as they could, because this way they get more money. But because of that, and the fact that Pascal is essentially Maxwell with very few improvements, the GTX 1080 is far from a game changer.
 
A "game changer" needs to sell a lot, and cheap cards sell much more than expensive ones. Or it should contain something very special. The GTX 1080 is neither of those.
Then it won't be cheap unless Nvidia have a reason to price it low. Since there is virtually no competition, why would they price it low?
It is still, by definition, a game changer. Just because you like everything incredibly cheap doesn't change this fact.

Polaris can be both if it brings a VR-capable card to the masses cheaply.
In the last few years, how often has AMD failed to live up to expectation? How often do they release WHQL certified drivers? How much hotter must they get to squeeze any more performance out? How much extra power must they take? Don't get me wrong, Polaris might be very good, as you said:
Just doing a Maxwell-architecture chip on 16nm would be enough for the things you said.
So I guess we'll see Polaris getting the same sort of performance as the 1070/1080 because it's Fiji XT with 16nm?

GTX480 is 40nm part
GTX580 is 28nm part
GTX680 is 28nm part
GTX780 is 28nm part
GTX980 is 28nm part
GTX1080 is 16nm part

See where 28nm changes to 16nm?
Yes I do, and you've just proven my point: between the GTX 480 and GTX 580 was actually one of the least substantial jumps in performance on my list, yet that's a jump from 40nm to 28nm? I guess we don't get massive performance gains by simply jumping down a few nanometers, do we?

Of course Nvidia has no reasons to make as good card they could because this way they get more money. But because of that and the fact that Pascal is essentially Maxwell with very little improvements, GTX1080 is far from game changer.
Not quite, though if they actually had some competition it would help...

Very few improvements? They dropped to 16nm, they used GDDR5X for the first time, it uses less power than the last generation, and it outperforms a $1000 Titan X by a considerable margin. Also, re-read my last post on the performance jump. How can you be impressed with the 970 when it literally didn't beat any performance metric? Do you own one by any chance?
 
Then it won't be cheap unless Nvidia have a reason to price it low. Since there is virtually no competition, why would they price it low?
It is still, by definition, a game changer. Just because you like everything incredibly cheap doesn't change this fact.

Unless it's low priced, it won't be widely adopted, and so by my definition it cannot be a game changer. Virtual reality was some kind of game changer 20 years ago. Too high a price means it's not a game changer yet...

In the last few years, how often has AMD failed to live up to expectation? How often do they release WHQL certified drivers? How much hotter must they get to squeeze any more performance out? How much extra power must they take? Don't get me wrong, Polaris might be very good, as you said:

So I guess we'll see Polaris getting the same sort of performance as the 1070/1080 because it's Fiji XT with 16nm?

AMD has rarely failed to live up to expectations; Nvidia much more often. WHQL is irrelevant: there are many examples of Nvidia's (and AMD's) WHQL drivers that have had fatal bugs like broken fan control. One reason AMD chips are hotter is the fact that they offer many more features. One explanation for Maxwell's so-called "energy efficiency" is the fact that it lacks many DX12 features or barely meets them.

Polaris will be targeted at the GTX 1070 range at most, or a bit lower, because Vega will aim at the high end and use HBM2 memory. Also, AMD uses 14nm tech. And while the GTX 1080 was aimed at the high end, Polaris is not.

Yes I do, and you've just proven my point: between the GTX 480 and GTX 580 was actually one of the least substantial jumps in performance on my list, yet that's a jump from 40nm to 28nm? I guess we don't get massive performance gains by simply jumping down a few nanometers, do we?

There was an error in my list: the GTX 580 was a 40nm part, while the GTX 680 is 28nm. Manufacturing problems meant that the GTX 580 was what the GTX 480 should have been. Also, the GTX 1080 is the only card of those that uses something other than GDDR5 memory. By the way, those benchmarks you linked use different drivers, which makes a difference.

Comparing the GTX 1080 and GTX 980 Ti: both have (almost) the same architecture and nearly the same transistor count. Despite the GTX 1080 having an almost 1.5x higher core clock speed, its power consumption is about 35% lower and, most importantly, its die size is only 314mm² while the GTX 980 Ti is 601mm².

So an imaginary "600mm² 16nm GTX 980 Ti" would be around 1.9 (die size) × 1.5 (core clock) ≈ 2.85x as fast, with power consumption only about 40% higher. So the manufacturing process alone does make huge performance gains. The reason for this big jump is the fact that 20nm parts were cancelled by both Nvidia and AMD, so this is like jumping over one generation.
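That estimate is just two ratios multiplied together (a back-of-the-envelope sketch; it assumes performance scales linearly with both die area and core clock, which is optimistic for real chips):

```python
# Back-of-the-envelope scaling for an imaginary "600mm^2 16nm GTX 980 Ti".
# Assumes performance scales linearly with die area and core clock,
# which is optimistic but fine for a rough comparison.
die_980ti_mm2 = 601   # GTX 980 Ti die size, 28nm
die_1080_mm2 = 314    # GTX 1080 die size, 16nm
clock_ratio = 1.5     # GTX 1080 core clock vs GTX 980 Ti, roughly

die_ratio = die_980ti_mm2 / die_1080_mm2   # ~1.91
perf_scale = die_ratio * clock_ratio       # ~2.87x a stock GTX 980 Ti
print(f"die ratio ~{die_ratio:.2f}, estimated speedup ~{perf_scale:.2f}x")
```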

Not quite, though if they actually had some competition it would help...

Very few improvements? They dropped to 16nm, they used GDDR5X for the first time, it uses less power than the last generation, and it outperforms a $1000 Titan X by a considerable margin. Also, re-read my last post on the performance jump. How can you be impressed with the 970 when it literally didn't beat any performance metric? Do you own one by any chance?

AMD has higher manufacturing capacity (Samsung + GF vs TSMC), so they aim at the mid-range first.

Dropping to 16nm is not any sort of architectural improvement, and it only seems like a very big improvement because the 20nm parts were cancelled. GDDR5X is the same thing: not an architectural improvement, just using memory that is now available. Using less power than the previous generation is expected because of 16nm. The fact that it beats a $1000 Titan proves that 28nm parts were way overpriced, just as I and many others said a long time ago.

This performance jump comes mainly from the manufacturing technology. It does not fix many weaknesses present in the Maxwell architecture, so I'm not impressed, as the performance gains come almost automatically.

I have never been impressed with GTX970.
 
And so by my definition it cannot be a game changer.
So you're changing the definition of "game changer" to suit your needs? Right...
AMD has rarely failed to live up to expectations; Nvidia much more often.
Right...
the GTX 1080 is the only card of those that uses something other than GDDR5 memory.
But you said they hadn't made any changes since Maxwell except it's gone "16nm" and that was it? Interesting...
WHQL is irrelevant.
Of course it is. It's not like loads of not very tech savvy people go looking for driver updates and tend to steer clear of "beta".
those benchmarks you linked use different drivers, which makes a difference.
You're absolutely right, the drivers would favor the older card with more mature drivers. But since this is the same situation with the 1080, it's a moot point.
One explanation for Maxwell's so-called "energy efficiency" is the fact that it lacks many DX12 features or barely meets them.
Fantastic, good for AMD. Let's give everyone some features that aren't in use by nearly any game developer, and let's make our GPUs slower than the competition's so no one will want to use them anyway. Sounds like a sound proposition. Real smart of AMD: "Ah! Our GPUs are a bit crap but we'll pack them full of currently useless tech anyway, it's not like we'll release much better cards which will actually handle future games properly down the line at all...".
Polaris will be targeted at the GTX 1070 range at most, or a bit lower, because Vega will aim at the high end and use HBM2 memory. Also, AMD uses 14nm tech. And while the GTX 1080 was aimed at the high end, Polaris is not.
What you mean is, AMD hasn't had anything to compete with Nvidia since the 980 Ti was released, and they still won't have anything to compete. Yeah, why not, let's all wait for Vega; I'm sure Nvidia will just sit idle and not be readying a 1080 Ti or some such magic.

Am I the only one who feels like we are constantly waiting on AMD to deliver something? Processors, GPUs. It's always "look what we have coming", but by the time they release anything the competition has already released something better.
Comparing the GTX 1080 and GTX 980 Ti: both have (almost) the same architecture and nearly the same transistor count. Despite the GTX 1080 having an almost 1.5x higher core clock speed, its power consumption is about 35% lower and, most importantly, its die size is only 314mm² while the GTX 980 Ti is 601mm².

So an imaginary "600mm² 16nm GTX 980 Ti" would be around 1.9 (die size) × 1.5 (core clock) ≈ 2.85x as fast, with power consumption only about 40% higher. So the manufacturing process alone does make huge performance gains. The reason for this big jump is the fact that 20nm parts were cancelled by both Nvidia and AMD, so this is like jumping over one generation.
Literally the first decent reply you've ever given. And once again, this has been the same with the last couple of generations, the 780Ti / 980Ti were different chips and had the extra performance and bigger dies.

All you're moaning about is the fact Nvidia didn't release the 1080 Ti first; that is literally all you're complaining about. You are so upset by this you simply cannot fathom that the 1080 is a masterpiece compared to any of its predecessors for the last (at least) 6 years, even when evidence is thrown in your face. You're trying to convince yourself it's all "because 16nm" even when node changes in the past have been proven not to make this much of an impact.
I have never been impressed with GTX970.
*clears throat* I'll just quote yourself back to you
How is this card a game changer? How was the GTX 970 a game changer (except it was falsely advertised as a 4GB card and still does not support async shaders)?
You literally tried to use it as an argument about "game changers" earlier in this thread...
 
Unless it's low priced, it won't be widely adopted, and so by my definition it cannot be a game changer.
You don't seem to know how the industry works then....
I have never been impressed with GTX970.
Yet Nvidia sold the best part of a million of them in the first four months of it being available
Yet it is comfortably the most popular card on the Steam hardware survey despite costing $300 or more for the bulk of its life
Yet it is the main contributor to Nvidia's record year for revenue and profit which directly translates into their increased R&D spend
Yet it was also the main contributor to AMD's discrete desktop graphics market share plummeting to an all-time low of 18.1%, leading directly to AMD's average sale price dipping under $29 per GPU, their graphics revenue falling like a rock, and very likely their R&D budget decreasing still further.
Polaris 10 perhaps? To be seen of course. This is way too expensive to be a game changer.
Whatever you say chief. Considering this is a thread about the GTX 1070 you seem determined to turn it into an AMD thread at every turn. Between the constant shilling of Polaris and your compunction to use the phrase "async compute" as often as possible, I'm starting to wonder if you aren't Roy Taylor.
Polaris will be targeted at the GTX 1070 range at most, or a bit lower, because Vega will aim at the high end and use HBM2 memory. Also, AMD uses 14nm tech. And while the GTX 1080 was aimed at the high end, Polaris is not.
Thanks for the unconnected list of AMD marketing bullet points. 14nm LPP and 16nm FF+/FFC are both effectively half-nodes of 20nm. 16nm appears to clock higher (Polaris, by AMD's own figures, will have moderate clocks of 1350MHz max). The GTX 1080 isn't Polaris's competition; GP106 is.
As I noted in another thread, I've been hearing this 14nm AMD marketing for a while. How 14nm and AMD are ahead of Nvidia and 16nm, blah blah blah.
"We believe we're several months ahead of this transition, especially for the notebook and the mainstream market," said Koduri. "The competition is talking about chips for cars and stuff, but not the mainstream market."
Yet the simple truth is TSMC has Fab 16B up and running, turning out GPUs up to 610mm², with two SKUs (one high volume) ready for retail, while Apple recently pulled the plug on Samsung producing A10s, and GloFo isn't even confirmed for AMD GPU production. Their appalling track record with new processes should hardly fill anyone with confidence even if they were; after all, AMD is fighting a lawsuit stemming from GloFo's previous overoptimistic ramp/yield claims.
I can't believe a GTX 1080 is upwards of 100% faster than a GTX 980 and up to 60% faster than a Fury X when overclocked, and without water and HBM! I still think it's a dream...
If you are in a dream now, just wait until EVGA's GTX 1080 Hybrid arrives. The AIO watercooled cards with a beefier power delivery and board power limit are predicted to be able to achieve 2500MHz clockspeed
You know... I was waiting and really excited for the 1070 spec release. I thought it wasn't going to be far behind the 1080 but it's not as impressive anymore. Nvidia have also held back on lots of potential development they could have added on the 1080, why are they holding back on adding more cores?
To avoid what happened with the GTX 970 and GTX 980. A mildly factory overclocked 970 reaches GTX 980 level performance. The GTX 970 gutted 980 sales because of this. Nvidia has widened the performance gap to avoid a repeat.
 
So you're changing the definition of "game changer" to suite your needs? Right...

There is no exact and widely accepted definition of "game changer". Provide proof if there is.


Exactly. Like the 3.5GB GTX 970 and async shader support still missing after 10 months or so.

But you said they hadn't made any changes since Maxwell except it's gone "16nm" and that was it? Interesting...

Read more carefully.

Of course it is. It's not like loads of not very tech savvy people go looking for driver updates and tend to steer clear of "beta".

Given the problems with WHQL certified drivers, it's not that important.

You're absolutely right, the drivers would favor the older card with more mature drivers. But since this is the same situation with the 1080, it's a moot point.

Newer Nvidia drivers favor new Nvidia cards; the Kepler architecture suffers. Anyway, drivers can easily make a 40% or even bigger difference.

Fantastic, good for AMD. Let's give everyone some features that aren't in use by nearly any game developer, and let's make our GPUs slower than the competition's so no one will want to use them anyway. Sounds like a sound proposition. Real smart of AMD: "Ah! Our GPUs are a bit crap but we'll pack them full of currently useless tech anyway, it's not like we'll release much better cards which will actually handle future games properly down the line at all...".

No developer will ever use new features unless they are supported on GPUs. Not everyone changes graphics cards very often, so wide support for features increases the chance those features become widely adopted. Just look how long the Xbox 360 and PS3 were around before their successors came.

What you mean is, AMD hasn't had anything to compete with Nvidia since the 980 Ti was released, and they still won't have anything to compete. Yeah, why not, let's all wait for Vega; I'm sure Nvidia will just sit idle and not be readying a 1080 Ti or some such magic.

Am I the only one who feels like we are constantly waiting on AMD to deliver something? Processors, GPUs. It's always "look what we have coming", but by the time they release anything the competition has already released something better.

As manufacturing capacity is very limited at the start, Nvidia and AMD can basically sell every new-tech GPU they make. So why should they compete with each other right now? And because AMD has much higher manufacturing capacity, they aim for the mid-range first. This all makes sense.

By the time AMD releases Polaris, Nvidia won't have any 16nm cards against it in the same price range, because Nvidia aimed at the high end first and AMD did not.

Literally the first decent reply you've ever given. And once again, this has been the same with the last couple of generations, the 780Ti / 980Ti were different chips and had the extra performance and bigger dies.

All you're moaning about is the fact Nvidia didn't release the 1080 Ti first; that is literally all you're complaining about. You are so upset by this you simply cannot fathom that the 1080 is a masterpiece compared to any of its predecessors for the last (at least) 6 years, even when evidence is thrown in your face. You're trying to convince yourself it's all "because 16nm" even when node changes in the past have been proven not to make this much of an impact.

Once again, on the 1080 the manufacturing process makes most of the difference. Even Nvidia does not deny it.

even when node changes in the past have been proven not to make this much of an impact.

Provide better proof on this one; your previous proofs need much more evidence. If we look at Polaris, it's likely that Polaris will not achieve the same performance as the Fury X. And so it's proven that the 14nm tech AMD is using is in fact worse than 28nm tech, because the Fury X is faster than Polaris :)

*clears throat* I'll just quote yourself back to you

You literally tried to use it as an argument about "game changers" earlier in this thread...

How was the GTX 970 a game changer (except it was falsely advertised as a 4GB card and still does not support async shaders)?

See the question mark? I asked how the GTX 970 was a game changer.
 
You don't seem to know how the industry works then....

There are many technologies that are potential game changers, but a high price means they are not.

Yet Nvidia sold the best part of a million of them in the first four months of it being available
Yet it is comfortably the most popular card on the Steam hardware survey despite costing $300 or more for the bulk of its life
Yet it is the main contributor to Nvidia's record year for revenue and profit which directly translates into their increased R&D spend
Yet it was also the main contributor to AMD's discrete desktop graphics market share plummeting to an all-time low of 18.1%, leading directly to AMD's average sale price dipping under $29 per GPU, their graphics revenue falling like a rock, and very likely their R&D budget decreasing still further.

Now we know that the GTX 970 was falsely advertised as a 4GB card and its DirectX 12 support was quite poor. Also, after that 4GB fiasco was revealed, many GTX 970s were returned. So the good sales figures were not because the card was good, but rather because many buyers thought they were buying something else. Also, that GPU was designed for 20nm, and it's not free to redesign it for 28nm.

And now we have proof that 16nm is much better than 28nm, so buying a GTX 970 was not such a wise buy after all. Something many people said years ago. The GTX 980 Ti is an even better example of this: tons of used GTX 980 Tis are on sale because the GTX 1080 is much better, and GTX 980 Ti owners realized their card is expensive old tech. Clear proof that Nvidia buyers are ready to waste lots of money on old tech. Probably AMD thought AMD users are not so stupid, and so did not release a new architecture on 28nm tech. That may sound rough, but the truth sometimes hurts.

Whatever you say chief. Considering this is a thread about the GTX 1070 you seem determined to turn it into an AMD thread at every turn. Between the constant shilling of Polaris and your compunction to use the phrase "async compute" as often as possible, I'm starting to wonder if you aren't Roy Taylor.

So when the Polaris 10 thread is up, no discussion about Nvidia's offerings at the same price range? Right.

Thanks for the unconnected list of AMD marketing bullet points. 14nm LPP and 16nm FF+/FFC are both effectively half-nodes of 20nm. 16nm appears to clock higher (Polaris, by AMD's own figures, will have moderate clocks of 1350MHz max). The GTX 1080 isn't Polaris's competition; GP106 is.
As I noted in another thread, I've been hearing this 14nm AMD marketing for a while. How 14nm and AMD are ahead of Nvidia and 16nm, blah blah blah.

True, those 14nm/16nm processes are not really 14nm and 16nm, but they're still better than 20nm. And since 20nm was totally cancelled, some difference had to come from somewhere. AMD's Polaris is not aimed at the high end, so there is no need to clock it high. Also, 1080 clocks tend to be more in the 1700 MHz range, and even then the GPU is quite hot.

Yet the simple truth is TSMC has Fab 16B up and running, turning out GPUs up to 610mm², with two SKUs (one high volume) ready for retail, while Apple recently pulled the plug on Samsung producing A10s, and GloFo isn't even confirmed for AMD GPU production. Their appalling track record with new processes should hardly fill anyone with confidence even if they were; after all, AMD is fighting a lawsuit stemming from GloFo's previous overoptimistic ramp/yield claims.
 
There is no exact and widely accepted definition of "game changer". Provide proof if there is.
Can I save Burty117 twenty seconds of Googling? Thanks!
Definition of game changer
  1: a newly introduced element or factor that changes an existing situation or activity in a significant way
Now check my last post rather than just ignoring anything that conflicts with your views
Exactly. Like the 3.5GB GTX 970 and async shader support still missing after 10 months or so.
Another async mention - your overlords will be pleased

And because AMD has much higher manufacturing capacity, they aim for the mid-range first. This all makes sense.
Provide proof. The GTX 1060 (GP106, 1280 cores, 128-bit) is due in July. How does that translate to AMD having "much higher manufacturing capacity"?
By the time AMD releases Polaris, Nvidia won't have any 16nm cards against it in the same price range, because Nvidia aimed at the high end first and AMD did not.
The GP106 cards are set for $149-229 price points. Do you know when Polaris is due for retail availability? You must, to keep making these claims. Can we get some proof? Even AMD's Computex announcement points to Polaris "updates" and 7th-gen APU "launches".
Provide better proof on this one.
Practice what you preach
See question mark? I asked how GTX970 was game changer.
Already answered. Stop trolling.
There are many technologies that are potential game changers, but a high price means they are not.
Bollocks. Example G80.
You just said:
There is no exact and widely accepted definition of "game changer". Provide proof if there is.
...and now you are setting the rules for what constitutes a game changer. You can't even be consistent in your trolling. You really are a waste of everyone's time
Also, after that 4GB fiasco was revealed, many GTX 970s were returned.
Nope. Very few were returned as a percentage, and those that were were resold. The GTX 970 issue was more an issue to AMD fanboys without the card, than owners of it.
So the good sales figures were not because the card was good, but rather because many buyers thought they were buying something else.
That is just plain stupid. If that were the case, why is the GTX 970 still gaining share in Steam's Hardware survey and still topping Amazon's best sellers list? Duh
And now we have proof that 16nm is much better than 28nm, so buying a GTX 970 was not such a wise buy after all.
The same can be levelled at any recent release. The R9 380 is a pig compared to other 28nm GPUs. It looks positively pathetic against the new cards. Don't see you singling out that turd.
Tons of used GTX 980 Tis are on sale because the GTX 1080 is much better, and GTX 980 Ti owners realized their card is expensive old tech.
Already previously explained. Yet more trolling. Time to move on.
Probably AMD thought AMD users are not so stupid, and so did not release a new architecture on 28nm tech. That may sound rough, but the truth sometimes hurts.
Yeah, it couldn't be that AMD couldn't afford to, as the entire industry is well aware. And it still doesn't explain how the Fury cards are now basically EOL and will likely have the shortest lifespan of any modern GPU
Also, 1080 clocks tend to be more in the 1700 MHz range, and even then the GPU is quite hot.
With stock fan profile and a reference cooler. Most people will just set a custom profile and have higher clocks
 
Can I save Burty117 twenty seconds of Googling? Thanks!
Definition of game changer
  1: a newly introduced element or factor that changes an existing situation or activity in a significant way
Now check my last post rather than just ignoring anything that conflicts with your views

And who made that definition? Also, that "significant way" also includes availability. "We have the ultimate game-changing product here in the labs. However, we only have one piece and will not produce another for five years." Does it change the situation? Yes. In a significant way? No.

Provide proof. The GTX 1060 (GP106, 1280 cores, 128-bit) is due in July. How does that translate to AMD having "much higher manufacturing capacity"?

Source? I checked, and new rumours say it will have a 256-bit bus. And still AMD is expected to release a much wider lineup.

The GP106 cards are set for $149-229 price points. Do you know when Polaris is due for retail availability? You must, to keep making these claims. Can we get some proof? Even AMD's Computex announcement points to Polaris "updates" and 7th-gen APU "launches".

Can I get some proof about GP106 then? Even 1080 is not yet available.

Already answered. Stop trolling.

Stop writing BS. I did NOT say the GTX 970 is a game changer. I asked WHY it is.

Bollocks. Example G80.

You just said:

...and now you are setting the rules for what constitutes a game changer. You can't even be consistent in your trolling. You really are a waste of everyone's time

Of course because

1. there is no single exact definition of a game changer
2. in the case of GPUs, a game changer needs to be widely adopted to be a game changer

Another definition for game changer here:

a person or thing that dramatically changes the course, strategy, character, etc., of something:

Now tell me why your "definition" is right and that one is wrong? Current consoles' GPUs are much more of a game changer than the GTX 1080 will ever be, because consoles sell much more.

Nope. Very few were returned as a percentage, and those that were were resold. The GTX 970 issue was more an issue to AMD fanboys without the card, than owners of it.

A product that can be returned for a refund even if used means there has to be something wrong with it, right?

That is just plain stupid. If that were the case, why is the GTX 970 still gaining share in Steam's Hardware survey and still topping Amazon's best sellers list? Duh

Perhaps those people do not know that they are buying a 3.5GB part? Even Nvidia does not say anything about it: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications

The same can be levelled at any recent release. The R9 380 is a pig compared to other 28nm GPUs. It looks positively pathetic against the new cards. Don't see you singling out that turd.

What do you mean by "pig"? It's still a much better buy than the GTX 960.

Already previously explained. Yet more trolling. Time to move on.

Truth hurts, it seems. Buying a GTX 980 Ti was undeniably a waste of money, so no wonder there aren't any arguments against me.

Yeah, it couldn't be that AMD couldn't afford to as the entire industry is well aware. And it still doesn't explain how he Fury cards are now basically EOL, and will likely have the shortest lifespan of any modern GPU

AMD could afford to bring Fury and 3xx series. So that argument does not apply. Fury cards were first using HBM memory and because HBM2 is coming quite soon, no wonder they have short lifespan. In this case short lifespan is not big problem as AMD has no direct replacement for them.

With stock fan profile and a reference cooler. Most people will just set a custom profile and have higher clocks

Then temps and that "very important" power consumption will get higher.
 
And who made that definition? Also, that "significant way" also includes availability.
It's a dictionary definition. You're now claiming to know more about the English language than Merriam-Webster.
Availability? The GTX 970 sold in millions of units and is the single most popular card on Steams Hardware survey.
TROLL
Source? I checked out new rumours that say it will have a 256-bit bus. And still AMD is expected to release a much wider lineup.
Very definitive
Can I get some proof about GP106 then?
Rumours seem good enough for you. I'll throw in some pictures as well
Stop writing BS. I did NOT say the GTX 970 is a game changer. I asked WHY it is.
And I told you I had already provided the answer. You even quoted it in post #33. Do you have learning difficulties? If so, I can cut and paste into an easy to follow jpeg
You just said:...and now you are setting the rules for what constitutes a game changer.
Nope. I didn't set any rules. I just provided a dictionary definition. Having trouble reading through the red mist?
Now tell me why your "definition" is right and that one is wrong? Current consoles' GPUs are much more of a game changer than the GTX 1080 will ever be
1. I wasn't talking about the GTX 1080. I used the G80 (first unified shader architecture), and the GTX 970 as examples. If you read what is written rather than making the whole conversation up in your head, it will make better sense.
2. I also never said anything about consoles. Game changers come in many varieties. I didn't make a case for one being better than another - again that's your internal dialogue rather than real life.
Product that can be returned for refund even if used means there has to be something wrong in it, right?
No, you troll. Right of return is a consumer right in many countries. It still doesn't negate the fact that the GTX 970 is the most popular SKU on the Steam Hardware survey and Nvidia made a mint selling them leading to an increased R&D budget. What part of this very simple factual statement are you having trouble grasping?
What do you mean by pig? It's still a much better buy than the GTX 960.
That's not saying much.
Fury cards were the first to use HBM memory, and because HBM2 is coming quite soon, no wonder they have a short lifespan
Apologist
 
It's a dictionary definition. You're now claiming to know more about the English language than Merriam-Webster .

Availability? The GTX 970 sold in millions of units and is the single most popular card on Steam's Hardware Survey.
TROLL

Here's another definition

a person or thing that dramatically changes the course, strategy, character, etc., of something.

Now, is that more right than your version?

I did not talk about GTX 970 there.

Very definitive

You have better info?

Rumours seem good enough for you. I'll throw in some pictures as well

They just guess that is GP106.

And I told you I had already provided the answer. You even quoted it in post #33. Do you have learning difficulties? If so, I can cut and paste into an easy to follow jpeg

I have difficulties, yes. Now, what existing thing did the GTX 970 really change in a significant way?

Did it contain something awesome? In a negative way, yes.
Sold well? Nvidia's discrete sales were better than AMD's before the GTX 970 too.
The Steam Hardware Survey it changed, that's true, but very little.
Nvidia made revenue before the GTX 970, and it did not significantly raise revenue.
AMD's discrete cards were already going downhill.

So it really didn't change anything existing in a significant way. Just making a bit more profit is not really significant without something like revolutionary technology.

Nope. I didn't set any rules. I just provided a dictionary definition. Having trouble reading through the red mist?

The problem is that I found another dictionary definition.

1. I wasn't talking about the GTX 1080. I used the G80 (first unified shader architecture), and the GTX 970 as examples. If you read what is written rather than making the whole conversation up in your head, it will make better sense.
2. I also never said anything about consoles. Game changers come in many varieties. I didn't make a case for one being better than another - again that's your internal dialogue rather than real life.

1. I could say the same for you about many parts.

2. If A is more game-changing than B, then B is less likely to be game-changing.

No, you troll. Right of return is a consumer right in many countries. It still doesn't negate the fact that the GTX 970 is the most popular SKU on the Steam Hardware survey and Nvidia made a mint selling them leading to an increased R&D budget. What part of this very simple factual statement are you having trouble grasping?

I troll? Then tell me why so many are selling their GTX 980 Ti? They should just return it and get their money back. Much more profit than selling it, right? But wait a minute, is the GTX 980 Ti known to have a similar (or worse) defect to the GTX 970's? No? Ah, there goes the masterplan.

Getting more revenue from a successful product, and increased revenue leading to an increased R&D budget, are pretty much expected. If that is enough to be a game changer, then almost every profitable product is one.

That's not saying much.

Not so bad a GPU then, as it beats its direct competitor, excluding power efficiency.

Apologist

Still waiting for Nvidia's consumer-class video cards with HBM memory.
 
I think we should give up on this guy, he's just trolling pure and simple, I was hoping he could be a bit of a laugh with the amount of sarcasm in my last post but apparently not. He's just going full troll instead.
 
I think we should give up on this guy, he's just trolling pure and simple, I was hoping he could be a bit of a laugh with the amount of sarcasm in my last post but apparently not. He's just going full troll instead.

Or perhaps you ran out of arguments, so you say others are trolling. Try to figure out something new.
 
Or perhaps you ran out of arguments, so you say others are trolling. Try to figure out something new.
I didn't have anything left to say; your responses were so painful to read once I deciphered them. All I can make out of everything you've said is:

A) You're too poor to spend any real money on GPUs
B) You're seriously jealous of 980 Ti owners
C) You're really upset Nvidia didn't release the 1080 Ti first
D) Even though you don't plan on buying one (see point A), you love AMD
 
I didn't have anything left to say; your responses were so painful to read once I deciphered them. All I can make out of everything you've said is:

A) You're too poor to spend any real money on GPUs
B) You're seriously jealous of 980 Ti owners
C) You're really upset Nvidia didn't release the 1080 Ti first
D) Even though you don't plan on buying one (see point A), you love AMD

A) I usually buy GPUs with a good price/performance ratio.
B) Not at all; they wasted money on old technology.
C) Not at all, but for the reasons I provided the GTX 1080 just does not deserve a 100/100 rating.
D) Perhaps I'll buy one when I get a better monitor.
 
A) I usually buy GPUs with a good price/performance ratio.
B) Not at all; they wasted money on old technology.
C) Not at all, but for the reasons I provided the GTX 1080 just does not deserve a 100/100 rating.
D) Perhaps I'll buy one when I get a better monitor.
What GPU do you currently have?
 
I think we should give up on this guy, he's just trolling pure and simple, I was hoping he could be a bit of a laugh with the amount of sarcasm in my last post but apparently not. He's just going full troll instead.
With you there. I just cut loose of a similar trolling/shilling diatribe in the GTX 1080 thread. Guy doesn't have the attention span to stay on track, and changes tack as soon as his argument falls apart. This thread is just as bad - just check out these delusions
Sold well? Nvidia's discrete sales were better than AMD's before the GTX 970 too.
This is an annotated market share graph I put together for Beyond3D (since appropriated by more than a few other sites) using figures from JPR and Mercury Research. I think the increase in market share after the introduction of the GTX 980/970 is self-explanatory.
emvQALG.png

If it isn't self explanatory on its own, the units shipped are easily extrapolated from JPR's quarterly reports.
Q2 2014 (Quarter previous to the GTX 980/970's introduction) : Nvidia discrete desktop units sold: 7.13 million
Q3 2014 (GTX 980/970 introduction) : 8.93 million
Q4 2014 : 9.424 million
Q1 2015: 8.76 million
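For anyone who wants to check the launch-quarter jump rather than eyeball the graph, here's a quick sketch of quarter-over-quarter growth using only the JPR figures quoted above (the quarter labels and numbers are taken straight from that list):

```python
# Nvidia discrete desktop units shipped (millions), per the JPR figures above.
units = [
    ("Q2 2014", 7.13),   # quarter before the GTX 980/970 launch
    ("Q3 2014", 8.93),   # GTX 980/970 introduced
    ("Q4 2014", 9.424),
    ("Q1 2015", 8.76),
]

# Percentage change versus the previous quarter.
growth = {}
for (_, prev), (cur_q, cur) in zip(units, units[1:]):
    growth[cur_q] = round((cur - prev) / prev * 100, 1)

print(growth)  # the launch quarter shows roughly a 25% jump
```

That works out to about +25% in the GTX 980/970's launch quarter, against single-digit moves in the quarters either side of it.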

Steam Hardware survey : Of the 6.13% combined GTX 980/970 share of cards, the GTX 970 has a 5-to-1 advantage (5.10% to 1.03%).

That's the thing about trolls - they don't possess much knowledge and are too lazy to do any fact checking - much easier to just make "facts" up as they go along.
Nvidia made revenue before the GTX 970, and it did not significantly raise revenue.
While some other facts are so easy to access, it makes you wonder why anyone would be stupid enough to believe they'd get away with passing off fiction to people who actually follow the industry. Since Seeking Alpha requires registration, here's a synopsis
On Wednesday, Nvidia reported record fourth quarter revenues of $1.25 billion, up 9 percent. Profits, meanwhile, rose by a healthy 31 percent to $193 million. Revenue from the company’s GeForce business jumped 38 percent as the company’s “Maxwell” based products were snapped up by gamers: the GeForce GTX 980 and 970 on the high end, and the new 960 addressing the 1080p “sweet spot” of mainstream gaming...
Profit up much higher than revenue obviously indicating that profit was driven by high margin GPUs.
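The margin inference is easy to sanity-check with a little arithmetic, backing out the prior-year quarter from the growth rates in the synopsis (revenue up 9% to $1.25B, profit up 31% to $193M; the derived prior-year figures are my own back-calculation, not reported numbers):

```python
# Reported quarter, in $ millions.
revenue_now, profit_now = 1250.0, 193.0

# Implied prior-year quarter, backed out from the reported growth rates.
revenue_prev = revenue_now / 1.09
profit_prev = profit_now / 1.31

# Net margin before and after, as percentages.
margin_prev = round(profit_prev / revenue_prev * 100, 1)
margin_now = round(profit_now / revenue_now * 100, 1)

print(margin_prev, margin_now)  # margin expands year over year
```

Net margin moves from roughly 12.8% to 15.4%, which is consistent with profit growth being driven by higher-margin parts.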

...and some stuff, well the phrase "comedy gold" comes to mind
AMD's discrete cards were already downhill
The ones AMD continues to rely upon two years later.....
I don't think that is actually borne out in performance reviews, but I look forward to using your quote and making sure you get due credit - should make you very popular

Anyhow, a couple of overnight Pascal-related items: GP106 is 256-bit and inbound, which fits with the July launch timeframe the company mentioned, and
The GTX 1080M will soon be with us. Going to be some potent gaming laptops showing up by the looks of it.
 
With you there. I just cut loose of a similar trolling/shilling diatribe in the GTX 1080 thread. Guy doesn't have the attention span to stay on track, and changes tack as soon as his argument falls apart. This thread is just as bad - just check out these delusions

This is an annotated market share graph I put together for Beyond3D (since appropriated by more than a few other sites) using figures from JPR and Mercury Research. I think the increase in market share after the introduction of the GTX 980/970 is self-explanatory.
emvQALG.png

If it isn't self explanatory on its own, the units shipped are easily extrapolated from JPR's quarterly reports.
Q2 2014 (Quarter previous to the GTX 980/970's introduction) : Nvidia discrete desktop units sold: 7.13 million
Q3 2014 (GTX 980/970 introduction) : 8.93 million
Q4 2014 : 9.424 million
Q1 2015: 8.76 million

That's the thing about trolls - they don't possess much knowledge and are too lazy to do any fact checking - much easier to just make "facts" up as they go along.

Now, returning to that game-changing definition:

a newly introduced element or factor that changes an existing situation or activity in a significant way

Now I'll just stick with that. Which of the following scenarios changes an existing situation in a significant way?

A) Company makes profit Q1 30 million and Q2 40 million.
B) Company makes profit Q1 1 million and Q2 1.5 million.
C) Company makes profit Q1 -1 million and Q2 +0.5 million

Make a graph if it helps to illustrate which of the above scenarios is the most significant change.
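To make the comparison concrete, here's a small sketch of absolute versus percentage change for the three hypothetical scenarios above (the figures are the made-up ones from the scenarios, not real company data):

```python
# Hypothetical (Q1 profit, Q2 profit) pairs in $ millions, from scenarios A-C.
scenarios = {"A": (30.0, 40.0), "B": (1.0, 1.5), "C": (-1.0, 0.5)}

# Absolute change in dollars.
abs_change = {k: q2 - q1 for k, (q1, q2) in scenarios.items()}

# Percentage change is only meaningful from a positive base; C swings from
# a loss to a profit, which no single percentage captures.
rel_change = {k: round((q2 - q1) / q1 * 100, 1) if q1 > 0 else None
              for k, (q1, q2) in scenarios.items()}

print(abs_change)   # A has the largest absolute move
print(rel_change)   # B has the largest percentage move
```

Which one counts as "significant" depends entirely on the yardstick: absolute dollars favour A, percentage growth favours B, and C is a swing from loss to profit that neither measure captures cleanly.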

While some other facts are so easy to access, it makes you wonder why anyone would be stupid enough to believe they'd get away with passing off fiction to people who actually follow the industry. Since Seeking Alpha requires registration, here's a synopsis

Profit up much higher than revenue obviously indicating that profit was driven by high margin GPUs.

Too bad Nvidia launched two products that day, so it's hard to say whether it was only the GTX 970 or the GTX 970 and GTX 980 combined.

...and some stuff, well the phrase "comedy gold" comes to mind

The ones AMD continues to rely upon two years later.....
I don't think that is actually borne out in performance reviews, but I look forward to using your quote and making sure you get due credit - should make you very popular

For the desktop PC market, Nvidia has only discrete chips available (perhaps some mobile chips end up in desktops, but that's essentially the situation). AMD has almost replaced very low-end cards with APUs, so the more APUs AMD sells, the fewer discrete cards it sells. It's therefore expected that AMD's discrete graphics card share will remain low, because Nvidia has very few integrated GPU solutions available for the PC.
 
Don't know about it beating a Titan X or 980 Ti in every test. That's a 20% cut in cores plus slower clocks and memory. Looks like it'll be about 30% slower than the 1080 overall, putting it in line with the 980 Ti. Still, 980 Ti performance for $370 isn't bad, but the 1080 actually looks more attractive now.
Just because the cores are fewer and slower doesn't necessarily equate to lower performance. If the GTX 1070 were a Maxwell card, I would be right there beside you with my doubts. But Pascal is a whole new ball game in terms of both architecture and firmware/APIs. DX12 and Vulkan are very much "smarter not harder" APIs, and so is the Pascal architecture.

If the GTX 1080 benchmarks are anything to go by, I expect the 1070 to perform on par with or slightly below a Titan X or 980 Ti - depending on the scenario - when it is released. But as DX12 and Vulkan begin to show up in more games and benchmarks, you're going to see these Pascal cards begin to trounce Maxwell cards.
 