Surprise! AMD shares internal Fury X benchmarks ahead of review embargo

wiyosaya

TS Evangelist
The red lines look good; however, considered as a percentage, the best the Fury X manages over the 980 Ti is approximately 20 percent. To me, these cards look like they are running pretty much even, except for the few games where the Fury X shows an obvious improvement. Based on AMD's leaked benchmarks the Fury X is the better card; given the source of the benchmarks, that is no surprise.

That said, I'll wait until the independent benchmarks come out to see what the real story is.
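That ~20 percent figure is just the standard relative-uplift calculation; here's a quick sketch with hypothetical FPS numbers (illustrative only, not taken from AMD's slides):

```python
# Hypothetical frame rates -- illustrative only, not from AMD's deck
fury_x_fps = 45.0
gtx_980_ti_fps = 37.5

# Relative uplift of the Fury X over the 980 Ti, as a percentage
uplift = (fury_x_fps - gtx_980_ti_fps) / gtx_980_ti_fps * 100
print(f"Fury X uplift: {uplift:.1f}%")  # prints "Fury X uplift: 20.0%"
```

Note the baseline matters: a 20% lead for the Fury X over the 980 Ti is the same frame-rate gap as the 980 Ti being ~17% behind the Fury X.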
 

Burty117

TechSpot Chancellor
Does anyone know what the system specs were? Intrigued to see if they used an Intel CPU :p

On another note this will make the 980Ti drop in price a little bit and AMD can have the "fastest GPU in the world" moniker for a while. I'd say that's a good thing, means AMD still have a bit of competitiveness with Nvidia.

In the meantime I will patiently wait until next year before I think about replacing my 780. Instead, I might look into investing in a G-Sync screen this year...
 

Guest

Also, it will be very interesting to see how well the Fury and Fury X overclock... since the 980 Ti is essentially a factory-overclocked 980, it seems plausible the Fury X will pull even further ahead of the 980 Ti once testers get their hands on them for a few days of play. I think for a first release of a product featuring a brand new memory architecture, the Fury cards are impressive. Once Windows 10 releases and we see some games with DX12, things should heat up again all over... AMD and Nvidia are both claiming some serious performance improvements under DX12... should be a fun summer for GPUs
 

Squid Surprise

TS Evangelist
On another note this will make the 980Ti drop in price a little bit and AMD can have the "fastest GPU in the world" moniker for a while.
Let's not forget that the 980Ti is nVidia's SECOND most powerful GPU... the TitanX beats both of these - although it will be interesting to see real-world benchmarks between the 3 cards, as well as SLI/Crossfire numbers...

Obviously, pricewise, the TitanX is not comparable - but we can't give AMD the performance crown either...
 

Burty117

TechSpot Chancellor
On another note this will make the 980Ti drop in price a little bit and AMD can have the "fastest GPU in the world" moniker for a while.
Let's not forget that the 980Ti is nVidia's SECOND most powerful GPU... the TitanX beats both of these - although it will be interesting to see real-world benchmarks between the 3 cards, as well as SLI/Crossfire numbers...

Obviously, pricewise, the TitanX is not comparable - but we can't give AMD the performance crown either...
Well no actually, if you look at the chart (I'm just going to pick out Assassin's Creed: Unity) this Fury got 37-38fps @ 4K... exactly the same as the Titan X.

This is the same with Battlefield 4 and a few others.

A few games though it actually beats the Titan X - Thief, Hitman: Absolution and The Witcher 3, for example. It's not as clear cut as you'd like it to be, I'm afraid, and to be honest, the Titan X is $1,000; I just couldn't call it the performance king when a $650 AMD card can beat it in quite a few games and match it in others.

But as always it will be best to wait for a proper review from TechSpot, as we don't know the spec of the test system used, and AMD are naturally going to make their own card look good...
 

dividebyzero

trainee n00b
Does anyone know what the system specs were? Intrigued to see if they used an Intel CPU :p
AMD almost always use an Intel system for comparative benchmarking. The Fury X press deck lists the systems in the appendices. If you scroll down the PR to the end of the article (Appendix 2), you'll find the test system details.


On another note this will make the 980Ti drop in price a little bit and AMD can have the "fastest GPU in the world" moniker for a while. I'd say that's a good thing, means AMD still have a bit of competitiveness with Nvidia.
A lot will depend upon how the Fury X is received by the enthusiast crowd, since they are the buying market. If the card beats a stock 980 Ti by a few percentage points, then Nvidia won't be too perturbed, since AIB OC models (EVGA Classified/Classy Kingpin, MSI Lightning, Galaxy HOF, Gigabyte G1 Gaming, Asus Matrix) should still eclipse the Fury. It is my understanding that AMD won't allow the HBM memory to be overclocked, and AMD's own press deck shows modest OC numbers in relation to the 1400-1500+MHz clocks (and attendant performance) of the custom 980 Ti's.

I'm personally hoping that the Fury X does a little better than AMD's numbers. It would give Nvidia a shove towards releasing a full-GPU GTX 900 card - basically a 6GB Titan X with higher clocks, non-reference designs, and a higher input power envelope at a more budget-friendly MSRP.
In the meantime I will patiently wait until next year before I think about replacing my 780. Instead, I might look into investing in a G-Sync screen this year...
Might be wise. I suspect this round of cards will devalue very quickly once the 14nm/16nm cards roll out. Word is that the big Pascal GPU (GP100) has taped out, and AMD's next big GPU can't be far away, so if you can hold for ~12 months you could see a 70-100% increase in performance, in addition to a better implementation of DX12 pipeline features and (more than likely) DisplayPort 1.3 output.
 

Guest

"Also, it will be very interesting to see how well the Fury and Fury X overclock... since the 980 Ti is essentially a factory-overclocked 980, it seems plausible the Fury X will pull even further ahead of the 980 Ti once testers get their hands on them for a few days of play."

Actually, the 980 Ti is related to the Titan X, not the 980. I also really doubt the Fury X beats it in everything like the graph above shows... Where are the Metro Redux benchmarks? I think you'll find the Fury X falls behind in more than a few games
 

Guest

At least they are willing to admit it on the box instead of lying to their customers.
You sound like a fanboy.
 

Guest

They are 4GB of RAM short of the promise they made, but they are 10 miles ahead in their Red vs Green B.S.

I like green colours much better though :)
At least they are willing to admit that on the box instead of lying (cough, cough 3.5gb).
 

Guest

Damn, I wish this card had 8GB; I'm keeping my eye on the Fury X. Oh well, I'll have to wait until Sapphire releases one :D
 

Squid Surprise

TS Evangelist
Well no actually, if you look at the chart (I'm just going to pick out Assassin's Creed: Unity) this Fury got 37-38fps @ 4K... exactly the same as the Titan X.

This is the same with Battlefield 4 and a few others.

A few games though it actually beats the Titan X - Thief, Hitman: Absolution and The Witcher 3, for example. It's not as clear cut as you'd like it to be, I'm afraid, and to be honest, the Titan X is $1,000; I just couldn't call it the performance king when a $650 AMD card can beat it in quite a few games and match it in others.

But as always it will be best to wait for a proper review from TechSpot, as we don't know the spec of the test system used, and AMD are naturally going to make their own card look good...
It ties and beats the TitanX in a few selected benchmarks... and loses the rest... And I think it's fairly safe to assume that in just about every benchmark they DIDN'T publish, it loses as well...

That's not to say it isn't a great card - it might very well be... but, money aside, the PERFORMANCE crown goes to the TitanX....

Price/performance.... we'll find that out in a few days whether it's the Ti or the Fury....
 

Darth Shiv

TS Evangelist
Surprise! These are the exact same charts that have been circulating on the internet for a while now.

What does not impress me at all about this Fury X GPU is:
1. eats up a lot of space in the case due to the water cooling bit
- you essentially have a 2-slot card + a radiator (which can be problematic with CPU cooler towers) + tubing to handle
- not sure why they didn't go for a standard water block...

2. since both 980 Ti and Fury X are priced similarly, power consumption can become a deciding factor
- here the 980 Ti takes the cake due to its high efficiency
- note: this is based on how the 390X performs (same GCN 1.2 arch) + the very large die-size and TDP of the Fury X

3. only 4GB of VRAM can hinder performance in games that are VRAM *****s (especially on 4K res!)
- this will have to be tested in a real scenario however, but I do remember people with 980's complaining about it before where their FPS just took a nose-dive due to VRAM limitation in 4K resolutions

4. same GCN 1.2 architecture, just scaled out to a bigger chip

5. no HDMI 2.0 support
HDMI 2.0 support generally shouldn't affect people. Just use DisplayPort.

4GB is a little disappointing considering everyone is going to look at 4K when thinking of this card. Like the 980Ti, it'll do 1440p well so hopefully 4GB is enough for that.

The cooling solution - high TDP. Don't like that so much but if the cooling solution means it is very quiet, I can handle that ;)
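Whether 4GB is "enough" at 4K invites a quick back-of-the-envelope check. The frame buffer itself is tiny even at 4K; the real VRAM pressure comes from textures, shadow maps, and intermediate render targets, which is why real-world testing is the only way to settle it. A minimal sketch, assuming plain triple-buffered RGBA8 targets (a simplification; real engines use many more buffers at various formats):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Raw footprint of triple-buffered RGBA8 render targets, in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

for label, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f"{label}: {framebuffer_mib(w, h):.0f} MiB")
```

Even at 4K that works out to under 100 MiB, so the 4GB debate is really about game assets, not the display buffers themselves.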
 

Guest

"HDMI 2.0 support generally shouldn't affect people. Just use DisplayPort"
4K TV, with sound on same cable?
No HDMI 2.0 is ridiculous!
 

Guest

Yes, HDMI 2.0 does matter to some people.

Simply because monitor makers have utterly failed to recognize the potential of 4k in larger screen formats, such as 40" to 50" range. This size has the same pixel density as 20"-25" 1080p monitors, eliminating any and all scaling issues, and providing a much larger, eye filling spectacle, at a pixel density which is generally more useful than 28" 4k monitors.

Meanwhile the 4k TV makers started out doing a horrible job regarding "input lag". So the market was completely barren of large format 4k gaming-capable displays.

However, a few TV makers have realized this niche, and started to make some 4k screens with very fast response times. Meanwhile the monitor makers have not really budged. So for larger format 4k (where you are likely to get more actual visual benefit out of the big resolution) HDMI 2.0 is currently preferred.
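The pixel-density claim above checks out with a quick calculation: a 40" 4K panel has exactly the density of a 20" 1080p panel (double the resolution in each axis, double the diagonal). Screen sizes here are just the examples from the post:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'40-inch 4K:    {ppi(3840, 2160, 40):.1f} PPI')  # prints "40-inch 4K:    110.1 PPI"
print(f'20-inch 1080p: {ppi(1920, 1080, 20):.1f} PPI')  # prints "20-inch 1080p: 110.1 PPI"
```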
 

Guest

using a TV as a monitor is more ridiculous
It's only ridiculous if the tv has high input lag.
some do.
some don't.

If the input lag is good, a screen is a screen is a screen, the fact that it comes with a remote doesn't make it a bad monitor.

But KNOW the input lag, if you're going the tv route.
 

Burty117

TechSpot Chancellor
It ties and beats the TitanX in a few selected benchmarks... and loses the rest... And I think it's fairly safe to assume that in just about every benchmark they DIDN'T publish, it loses as well...

That's not to say it isn't a great card - it might very well be... but, money aside, the PERFORMANCE crown goes to the TitanX....

Price/performance.... we'll find that out in a few days whether it's the Ti or the Fury....
It doesn't "lose" the rest though?! Unless you're blind and cannot read simple graphs? It's £350 cheaper, yet (currently, according to leaked graphs) it CAN keep up with a card that hits quadruple figures ($1000) in price. I'd call that a win for AMD there..

Again, not to say it's a good or bad card... But currently it's a pretty good card for the price, which means us Nvidia guys might be able to get the 980Ti at a considerable discount (assuming Fury is awesome, of course)....

AMD almost always use an Intel system for comparative benchmarking. The Fury X press deck lists the systems in the appendices. If you scroll down the PR to the end of the article (Appendix 2), you'll find the test system details.



A lot will depend upon how the Fury X is received by the enthusiast crowd, since they are the buying market. If the card beats a stock 980 Ti by a few percentage points, then Nvidia won't be too perturbed, since AIB OC models (EVGA Classified/Classy Kingpin, MSI Lightning, Galaxy HOF, Gigabyte G1 Gaming, Asus Matrix) should still eclipse the Fury. It is my understanding that AMD won't allow the HBM memory to be overclocked, and AMD's own press deck shows modest OC numbers in relation to the 1400-1500+MHz clocks (and attendant performance) of the custom 980 Ti's.

I'm personally hoping that the Fury X does a little better than AMD's numbers. It would give Nvidia a shove towards releasing a full-GPU GTX 900 card - basically a 6GB Titan X with higher clocks, non-reference designs, and a higher input power envelope at a more budget-friendly MSRP.
Agreed, I really hope nvidia pull something like that off... It would be the last greatest GDDR5 GPU kind of thing...

Come on AMD... I haven't (personally) bought a GPU from you since the 4000 series, but if you can beat the 980Ti I will be so happy! :)

Might be wise. I suspect this round of cards will devalue very quickly once the 14nm/16nm cards roll out. Word is that the big Pascal GPU (GP100) has taped out, and AMD's next big GPU can't be far away, so if you can hold for ~12 months you could see a 70-100% increase in performance, in addition to a better implementation of DX12 pipeline features and (more than likely) DisplayPort 1.3 output.
Ahahaha! I knew it had to be an Intel CPU, thanks for confirming :) man I can't wait for Pascal... Will be very interesting to see what they pull out the hat :)
I didn't think it was worth changing just yet...
 

Darth Shiv

TS Evangelist
"HDMI 2.0 support generally shouldn't affect people. Just use DisplayPort"
4K TV, with sound on same cable?
No HDMI 2.0 is ridiculous!
Input lag, TV inbuilt speakers. Bleh. How many people are going to use 4K TVs as monitors with the TV speakers? It's not the majority...

For 4K gaming, you need a $650USD graphics card minimum. If you are going to that effort, you should get a gaming grade 4K monitor. One that has decent refresh rate and low input lag. That rules out 4K TVs.

So if you're not gaming, what are you doing? Streaming 4K? 4K Blu-rays? And you are using the TV's inbuilt speakers? Hah...
 

DaveBG

TS Maniac
It's only ridiculous if the tv has high input lag.
some do.
some don't.

If the input lag is good, a screen is a screen is a screen, the fact that it comes with a remote doesn't make it a bad monitor.

But KNOW the input lag, if you're going the tv route.
And what about the 30Hz limit? Most TVs do not bother with 60Hz at 4K either.
TVs are no match for monitors in many ways.
 

Squid Surprise

TS Evangelist
It doesn't "lose" the rest though?! Unless you're blind and cannot read simple graphs? It's £350 cheaper, yet (currently, according to leaked graphs) it CAN keep up with a card that hits quadruple figures ($1000) in price. I'd call that a win for AMD there..
That chart is compared to the 980Ti.... NOT the TitanX... The TitanX beats this card - And I'm not arguing about whether the card to buy is the Fury or the Titan (we won't know until real benchmarks appear anyways!), I'm simply saying that AMD can't claim the performance crown...

The Ti is the better value over the Titan - the Fury will almost certainly be a better value than the Titan as well... possibly even better than the Ti - we'll see once it's released...

But if money is no object - the TitanX is the card to buy... barring unforeseen results in REAL benchmarks - but I don't think the Fury will be beating the TitanX in any but a few select games - we'll see in a few days...
 

darksayian

TS Rookie
Surprise! These are the exact same charts that have been circulating on the internet for a while now.

What does not impress me at all about this Fury X GPU is:
1. eats up a lot of space in the case due to the water cooling bit
- you essentially have a 2-slot card + a radiator (which can be problematic with CPU cooler towers) + tubing to handle
- not sure why they didn't go for a standard water block...
{It takes up no more space than any other card, as the block sits on top of the card - unless perhaps the case is too shallow.}

2. since both 980 Ti and Fury X are priced similarly, power consumption can become a deciding factor
- here the 980 Ti takes the cake due to its high efficiency
- note: this is based on how the 390X performs (same GCN 1.2 arch) + the very large die-size and TDP of the Fury X
{Fury X uses less power than a 970, and outperforms Titan X, read up}

3. only 4GB of VRAM can hinder performance in games that are VRAM *****s (especially on 4K res!)
- this will have to be tested in a real scenario however, but I do remember people with 980's complaining about it before where their FPS just took a nose-dive due to VRAM limitation in 4K resolutions

{4GB of HBM is more than enough, as it is not GDDR5; it feeds faster and with greater throughput. It outperforms all other cards at 4K, and is still faster at 5K. The frame buffer only becomes problematic beyond 8K, and at that point the Titan outperforms it, being able to hold the entire frame buffer. Who has the disposable income to play at 8K and above? 1% of the 1%. Nobody is targeting that market yet}

4. same GCN 1.2 architecture, just scaled out to a bigger chip

{and outperforms all other cards}

5. no HDMI 2.0 support

{Perhaps one monitor on the market supports HDMI 2.0. Who cares}

Some things I do like:
1. HBM technology
2. can fit in smaller cases due to the short length of the PCB (mostly attributed to HBM usage)

{Depends on the case depth; however, this contradicts the argument you attempted in your first point}

One thing I am very concerned about: operating temperature.

{The air-cooled version, which has the same performance as the 980 Ti, stays cooler than the 980 Ti, as it does not build up as much heat due to HBM being stacked on an interposer with the GPU. So, does the 980 keep you awake at night in fear?}

Yes, I know, it uses water cooling, however it is based on the same inefficient (when compared to Maxwell) GCN 1.2 arch, scaled-out to a larger die size AND the HMB memory is on the GPU.
Based on the above, I am expecting to see operating temperatures of around 80-90C, but I hope this won't be the case in reviews.
{You seem to lack proper information on this topic. Although it is still GCN 1.2, it is completely reorganized on the die. It packs more compute cores and more stream processors. Although it is a larger GPU, it is far more efficient than anything AMD or Nvidia have produced to date, as it does not have to incorporate individual buses to each module of GDDR. Less heat builds up, as the extra current required to overcome resistance is massively reduced. It really is that drastic}

Downplay it all you want - this is all prepped for DX12. This is Lisa Su's game now. She sold the idea to MS and they bought it. DX12 is repackaged Mantle that works on every player's gear, but AMD has the most densely packed hardware in terms of number of available resources. Intel is even redesigning, because in the DX12 environment the Bulldozer cores from 2 years ago are outperforming the newest i7s. The i7-980X actually becomes the fastest processor again in DX12. Strange, huh?