The Best Graphics Cards 2016: TechSpot's top picks for every budget

Yes, but the RX 480 is considerably more expensive. What part of the word "budget" are you not understanding? The RX 480 costs more and is not in the price category of the 1060 3GB.
But the RX470 is, right?

If you want to play that game, the 1060 6GB is the same price as the RX480 and it wipes the floor with it...
Didn't say it didn't. As said before, picking the 1060 6GB over the RX 480 is justified. Picking the 1060 3GB over the RX 470 is not.

Ah, you are correct, I missed that, my bad. I am at work to be fair, just a little bored and wasn't fully concentrating.
Also at work here...

I haven't lied at all:

Is the RX480 more expensive than the 1060 3GB = Yes
Is the 1060 3GB about as good as or better than an RX470 = Yes
Does the 1060 3GB use less power than an RX470 = Yes

None of that is a lie. What's "disgusting" is your inability to read a chart...
LOL. Where on the charts do they talk about price and power? This article states that "There is the argument that the 3GB GTX 1060 will run out of steam down the track due to its limited memory buffer, but we don’t feel that will become an issue". ComputerBase has clearly shown it is already an issue. But you people are deliberately throwing dust at every argument to draw attention away from that. If it was AMD you would be all over their 3GB, but since it's nVidia, we have to ignore it. Just like it's been ignored that the 3GB actually has fewer shader units than the 6GB. Doesn't matter that they give it the same name, performance is close enough anyway, right...?

I don't have anything new to add because there is nothing new to add. I had to repeat myself because of your inability to respond with anything relevant.
No. You have to repeat yourself because you don't want to talk about the 3GB already being a limit, as clearly shown.

But it's not, the 1060 3GB beats the RX470 in pretty much all the reviews I could be bothered to Google. It even trades blows with the RX480 in certain games.
Uhuh... And how many of those reviews only look at averages? How many tested frame times like they're supposed to be tested? Newsflash, only ComputerBase did it. Guru3D had a few but didn't go that much in depth.

I never said it wasn't? Let me check... Nope, definitely didn't say that...
And yet you've never admitted it either despite all evidence to the contrary.

How upset would you be right now if I was to tell you what GPU I was running a couple of years ago?
I don't care what GPU you are running or were running. The argument I have is that 3GB is a limit, and an RX 470 4GB is a better choice. That is it. I don't participate in the fanboyism nonsense that's so prevalent here.

I didn't spin it as a bad thing, I was explaining to all those users on forums such as these who spout nonsense such as "IT DOESN'T HAVE MOAR VRAM! IT MUST BE CRAP!" that you have single-handedly linked them all to a performance benchmark that proves it's complete rubbish and they might as well save the money and get the 4GB model. I guess AMD has the same train of thought as Nvidia had with its older mobile GPUs. Just add more VRAM because a higher number is better, right?
You have no idea what you're talking about. 8GB for the RX 480 is not necessary right now. 4GB obviously is. Yet, 4GB on the R9 290 is a limit, thus the 8GB on the R9 390 is helpful. Those cards did not have compression. There are multiple things at play here, rather than just the memory number.
But obviously I can't expect more from you, when all you want to look at is price, average fps and power consumption.
 
Did I hurt someone's feelings? It looks like my post was taken down. (Insert immature insulting rant here) LoL I just said, don't feed the hardreset. I think you would have to be touched in the head to buy a GTX 1060 3GB.
No... you just spouted nonsense for no reason despite the article telling you why the 1060 3GB was actually an excellent buy...
 
No... you just spouted nonsense for no reason despite the article telling you why the 1060 3GB was actually an excellent buy...
A $200 GPU that's already showing signs of having a VRAM bottleneck. I have seen what happens to low-VRAM video cards over a short period of time: they start to stutter and performance goes to **** as games require more VRAM. Even the game consoles have 8GB of RAM to work with.
 
A $200 GPU that's already showing signs of having a VRAM bottleneck. I have seen what happens to low-VRAM video cards over a short period of time: they start to stutter and performance goes to **** as games require more VRAM. Even the game consoles have 8GB of RAM to work with.
Game consoles don't have 8GB of VIDEO RAM to work with - it's shared with the CPU, and they don't even come close to utilizing all of it for GPU tasks...

And extra RAM is only useful if the card has the power to use it... we've already seen that the 480 with 4gb and 8gb are virtually identical - which means that the extra 4gb is basically a gimmick (and an excuse to charge a bit more)... in fact, the initial 4GB cards were actually 8GB cards!!! They simply had modified firmware to trick the PC into thinking it had 4GB...

For its price, and for how it performs now, the 3GB 1060 is an excellent buy... if you are looking to the future, you should be spending more money - 1070 or better....
 
OC'd 980 Ti as good as a 1080? NO- this is a myth by jealous 980 Ti owners.
I'm a 780 SLI owner. The 1080 I replaced them with was 0-5% faster in well scaled games. It lost performance in synthetic benchmarks. It was only a large increase in performance in games with poor SLI scaling (or no support at all, obviously).

Not in the mood to go grabbing the stuff I read/watched comparing overclocked 980 Ti performance to a 1080.

Go look at Futuremark's hall of fame. The only GPUs in it are 980 Ti's and Titan XP's. The 1080 is nowhere to be found, and will never be.

Edit: 1080 is now found, along with other 1000 series cards. Futuremark's site was bugged (seriously, I only found ONE other benchmark record for the 1080.. I now found 70,000+).
 
Last edited:
While on vacation I've entered computer shops in Barcelona and London and new cards from AMD were nowhere to be found. There is such big demand that nobody even expects prices to be at MSRP level, so Steven is right about price brackets. It's true that the GTX 1080 has no competition, but in my opinion the GTX 1060 3GB does not even compare to the RX470 8GB, and besides power consumption, the R9 Fury Nano is a way better option than the GTX 1070 mini, and the future will show that. After some reading, my conclusion is that the wave of Nvidia-supporting articles was triggered by the fact that Nvidia is losing market share to AMD.
"Nvidia is losing market share to AMD"

Well you'd better let Steam know about this, because according to their hardware survey, the percentage of Nvidia/AMD users was 51.4%/28.5% on 03/2015 and 57.57%/24.44% on 08/2016.

Of course, Steam only has 125 million users, so I'm sure AMD fanboys will brush off this little fact.

http://store.steampowered.com/hwsurvey
It's not my opinion, check this link: <https://jonpeddie.com/press-releases/tags/tag/gpu>. I don't believe they're lying or inflating numbers to favor AMD. It's about buyers, not users of a certain site or players of a certain game. My opinion was about the latest surge in promoting Nvidia at all costs, even in disregard of evidence.
 
It's not my opinion, check this link: <https://jonpeddie.com/press-releases/tags/tag/gpu>. I don't believe they're lying or inflating numbers to favor AMD. It's about buyers, not users of a certain site or players of a certain game. My opinion was about the latest surge in promoting Nvidia at all costs, even in disregard of evidence.
That doesn't say WHERE Nvidia is losing market share... I would assume it's in the budget and all-in-one area.... once firm numbers are out, we'll see... market share also doesn't necessarily mean PROFITS... the high-end is where the profits are - the low-end is for revenue...
 
I'm a 780 SLI owner. The 1080 I replaced them with was 0-5% faster in well scaled games. It lost performance in synthetic benchmarks. It was only a large increase in performance in games with poor SLI scaling (or no support at all, obviously).

Not in the mood to go grabbing the stuff I read/watched comparing overclocked 980 Ti performance to a 1080.

Go look at Futuremark's hall of fame. The only GPUs in it are 980 Ti's and Titan XP's. The 1080 is nowhere to be found, and will never be.
Ok, well then, since you quoted it, lets go check Futuremark out:

1x 780 Score: 3700~
1x 980Ti Score: 6500~
1x 1080 Score: 8500~

Now lets look at the SLI setups:

2x 780 Score: 6000~
2x 980Ti Score: 12000~
2x 1080 Score: 14000~

Now I'm not good with numbers, but it looks like the 1080 is right on the money.
You went from two 780's to one 1080. Correct?
You reckon you only get a 5% increase in performance? Well the numbers are there, that's pretty much what you should have been expecting.

Consider this though, the 1080 is using 30 watts less than a single 780, you had 2 780's. So you're using around 250 watts less power?
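A quick sanity check of that power math, sketched with the figures from this post (the 780's wattage here is an assumption chosen only to match the "30 watts less" claim above, not a measured draw):

```python
# Rough power comparison based on the figures quoted in the post above.
# gtx_780_watts is an assumed value for illustration.
gtx_780_watts = 220
gtx_1080_watts = gtx_780_watts - 30   # the post says the 1080 draws ~30 W less

sli_watts = 2 * gtx_780_watts         # two 780s in SLI
savings = sli_watts - gtx_1080_watts  # swapping SLI 780s for one 1080

print(savings)  # 250 -> roughly the "250 watts less" figure in the post
```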

Edit: Just to confirm, in the Hall of Fame there are plenty of 1080's in there. 3 way SLI 1080's beating 4 way 980Ti's sounds about right considering the scores as well.
 
I'm a 780 SLI owner. The 1080 I replaced them with was 0-5% faster in well scaled games. It lost performance in synthetic benchmarks. It was only a large increase in performance in games with poor SLI scaling (or no support at all, obviously).

Not in the mood to go grabbing the stuff I read/watched comparing overclocked 980 Ti performance to a 1080.

Go look at Futuremark's hall of fame. The only GPUs in it are 980 Ti's and Titan XP's. The 1080 is nowhere to be found, and will never be.
"The 1080 is nowhere to be found, and will never be."

Are you just pulling this stuff out of the air?

I provided a link to a very legit and thorough review of oc'd 980 Ti vs oc'd 1080, and the results speak for themselves- the 1080 is clearly superior to the 980 Ti. EVERY review of the 1080 by respected tech sites shows the same result: the 1080 is 30-35% faster.

Futuremark?? Well, two things on that: 1) Who buys a GPU to "win" benchmarks? I buy mine for real-world gaming performance. 2) I looked at Futuremark (3DMARK11) and the 1080 scores a 30440, while the 980 Ti is at 24640, which is- by the way- 120 points under the 1070's score of 24760. So what are you talking about?

What's even more confusing is why someone who had 780 SLI (and presumably never owned a 980 Ti) and now owns a 1080 is saying the 980 Ti is better. If you're bitter that the 1080 isn't blowing away TWO 780's, I'd say you didn't do your homework before making the switch.

Stop trolling.
 
Last edited:
"The 480 is just as strong as the 1060"

Not really. It's better at Vulkan (that's 4 games) and SOME DX 12 titles, but according to pcgamer's review, on average over 16 titles the 1060 averages 82.7 fps at 1080p while the 480 comes in at 74.7. The biggest victory for the RX 480 was 11 fps, while in one game the 1060 won by 40 fps.

The 1060 is clearly the faster card. The icing on the cake is that it also OCs better, is cooler and quieter, and is more efficient. The RX 480's price was supposed to be its big selling point, but a quick check on Newegg shows the 1060 6GB starting at $249 and the cheapest RX 480 8GB at $259 (both in stock). The extra 2GB of VRAM is a marketing gimmick, as the 480 isn't powerful enough to utilize it anyway.
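For what it's worth, the gap between those two averages works out to about 10%, a quick sketch using the pcgamer figures quoted above:

```python
# Percentage lead of the 1060 over the 480, from pcgamer's 16-game 1080p averages.
gtx_1060_avg_fps = 82.7
rx_480_avg_fps = 74.7

lead_pct = (gtx_1060_avg_fps / rx_480_avg_fps - 1) * 100
print(f"{lead_pct:.1f}%")  # 10.7%
```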

Hahaha someone can't read trends. So the 1060 is 10% stronger NOW while costing 10% more (Stock vs Stock) - what a victory. Considering how vastly better AMD cards age this is an easy decision for anyone with a brain. That is unless you like doing stuff like paying $650 for a 780 so it can lose to a 280X in a year.

This is before we even start talking about the AIB 480's with 8-pins that have been proven to beat stock 1060's for the same price.
 
Hahaha someone can't read trends. So the 1060 is 10% stronger NOW while costing 10% more (Stock vs Stock) - what a victory. Considering how vastly better AMD cards age this is an easy decision for anyone with a brain. That is unless you like doing stuff like paying $650 for a 780 so it can lose to a 280X in a year.

This is before we even start talking about the AIB 480's with 8-pins that have been proven to beat stock 1060's for the same price.
Age better? How do AMD cards do that? Are they like wine?

Are you comparing reference cards to partner cards? How do the 8 pin partner 1060's then compare to the reference 480?

Sorry for all the questions I just wanted a little clarification.
 
Hahaha someone can't read trends. So the 1060 is 10% stronger NOW while costing 10% more (Stock vs Stock) - what a victory. Considering how vastly better AMD cards age this is an easy decision for anyone with a brain. That is unless you like doing stuff like paying $650 for a 780 so it can lose to a 280X in a year.

This is before we even start talking about the AIB 480's with 8-pins that have been proven to beat stock 1060's for the same price.

First of all, RIGHT NOW, the 6GB 1060 doesn't cost 10% more than the 480... they're the same... check out Newegg....And the 3GB is substantially cheaper...

Second of all, there is no real proof that an AMD card "ages better" than an Nvidia card... and no, I don't accept links to sketchy forums as evidence...

Third of all, even if previous AMD cards somehow do age better than previous Nvidia cards - that doesn't prove that the 480 will age better than the 1060!!

There's a reason Steve chose Nvidia cards at every price bracket other than the lowest tier... Anyone buying an AMD card that isn't tremendously discounted is making a mistake...
 
Hahaha someone can't read trends. So the 1060 is 10% stronger NOW while costing 10% more (Stock vs Stock) - what a victory. Considering how vastly better AMD cards age this is an easy decision for anyone with a brain. That is unless you like doing stuff like paying $650 for a 780 so it can lose to a 280X in a year.

This is before we even start talking about the AIB 480's with 8-pins that have been proven to beat stock 1060's for the same price.
"Hahaha someone can't read trends. So the 1060 is 10% stronger NOW while costing 10% more"

"the AIB 480's with 8-pins that have been proven to beat stock 1060's for the same price"

Wow, you're all over the place here. Is the 1060 10% stronger-as you wrote, or has the 480 been proven to beat it- as you also wrote? Does the 1060 cost 10% more- as you stated, or is it the same price- which you also stated? And why are you comparing an AIB 480 with a stock 1060?

"Hahaha" (rolls eyes) you should think things through a little more before you actually post. Also, I deal in the here and now, not what "the future holds" - you know, the AMD fanboy's stock argument. As of today the 1060 is the faster card- unless you plan to only play Doom for the rest of your life. Something tells me you might...
 
ddferrari, maybe you can cut out the personal remarks and concentrate on the technical issues. Thank you.
Afternoon @mailpup, Have you thought about becoming a YouTube Hero? I hear they're hiring ;)
*This post was made in jest, DO NOT take it as a personal attack on you, your loved ones or indeed your very soul.
 
Last edited:
First of all, RIGHT NOW, the 6GB 1060 doesn't cost 10% more than the 480... they're the same... check out Newegg....And the 3GB is substantially cheaper...
The 3GB card is not the same GPU as the 6GB card. It has 128 fewer shaders than the 6GB version. It's disgusting and dishonest to have two different GPUs under the same name and pretend that the only difference is the memory. That alone is enough reason to boycott the 3GB, if you have any dignity.

Second of all, there is no real proof that an AMD card "ages better" than an Nvidia card... and no, I don't accept links to sketchy forums as evidence...
Some obvious things are never enough for some people. You can have a flying saucer crash into a person's house and that person will still claim there is no proof of aliens at all, because he refuses to accept the obvious. You can present a bajillion things to prove that the earth is round, and a flat-earther will still say there is no proof. Same thing goes with the longevity of cards. But I'm glad you inferred in advance that you will try to dismiss everything. Saves me from wasting my time.

Third of all, even if previous AMD cards somehow do age better than previous Nvidia cards - that doesn't prove that the 480 will age better than the 1060!!
It's not proof, but everything points in that direction. Do you know the difference between deductive and inductive reasoning? Science mainly uses inductive reasoning. All the major theories rely on it. Guess what. By those same principles, the RX 480 will age better than the GTX 1060. But hey. Don't mind me. In two years when it really is the case, I'll watch you repeat the same things for the next cards... There is no proof that AMD ages better, and even if there was, there is no proof that the current ones will.

There's a reason Steve chose Nvidia cards at every price bracket other than the lowest tier... Anyone buying an AMD card that isn't tremendously discounted is making a mistake...
Anyone that buys a 3GB card is making an even bigger mistake. Either buy the GTX 1060 6GB, or go for the RX 470/480 if you want smooth performance. The GTX 1060 3GB is for people that want nice averages and are incapable of detecting stutters in their games. Doesn't matter what Steve chose. Steve is still just a guy writing articles. The facts are the facts.

And I also have to wonder... Why is there such a huge gap? $200 to $400 is a huge gap. I just bought my R9 Fury for $300. To me there was no better value than that. Or aren't we allowed to list cards from previous generations?
 
The 3GB card is not the same GPU as the 6GB card. It has 128 fewer shaders than the 6GB version. It's disgusting and dishonest to have two different GPUs under the same name and pretend that the only difference is the memory. That alone is enough reason to boycott the 3GB, if you have any dignity.

Sorry Nvidia offended you... it's not like they cost the same amount of money... I find it more dishonest when a company rebrands the same card and gives it a different name... but AMD would never do that....

Some obvious things are never enough for some people. You can have a flying saucer crash into a person's house and that person will still claim there is no proof of aliens at all, because he refuses to accept the obvious. You can present a bajillion things to prove that the earth is round, and a flat-earther will still say there is no proof. Same thing goes with the longevity of cards. But I'm glad you inferred in advance that you will try to dismiss everything. Saves me from wasting my time.

So you admit you have no proof.... your analogy fails, as I can PROVE it when a flying saucer crashes into someone's house... there will be photos, video, wreckage, etc... You can't prove that AMD cards age better because they don't!!

It's not proof, but everything points in that direction. Do you know the difference between deductive and inductive reasoning? Science mainly uses inductive reasoning. All the major theories rely on it. Guess what. By those same principles, the RX 480 will age better than the GTX 1060. But hey. Don't mind me. In two years when it really is the case, I'll watch you repeat the same things for the next cards... There is no proof that AMD ages better, and even if there was, there is no proof that the current ones will.

So we're in agreement.... AMD cards don't age better than Nvidia cards... this isn't some vague theory - this is something that can easily be proved with empirical evidence. The reason you don't have any proof is because there ISN'T ANY.

Anyone that buys a 3GB card is making an even bigger mistake. Either buy the GTX 1060 6GB, or go for the RX 470/480 if you want smooth performance. The GTX 1060 3GB is for people that want nice averages and are incapable of detecting stutters in their games. Doesn't matter what Steve chose. Steve is still just a guy writing articles. The facts are the facts.

Really? Facts are facts? This from the person who hasn't provided any facts?!?!? FACTS are that the 3GB Nvidia 1060 delivers adequate performance - and for its price, better performance than any other card... hence its win in its price bracket...

And I also have to wonder... Why is there such a huge gap? $200 to $400 is a huge gap. I just bought my R9 Fury for $300. To me there was no better value than that. Or aren't we allowed to list cards from previous generations?

Actually, I agree with this last point... I suppose new cards are the only ones recommended in this article... Saying that, the 1060 6GB performs about on par with a Fury and costs a fair amount less...
 
Sorry Nvidia offended you... it's not like they cost the same amount of money... I find it more dishonest when a company rebrands the same card and gives it a different name... but AMD would never do that....
Lol "offended". Don't be ridiculous. It doesn't matter that they don't cost the same amount of money. Neither do the RX 480 4GB and 8GB. Imagine if the RX 470 was also called an RX 480. That is what nVidia is doing with the GTX 1060.
As for the rebrands, there is no problem with rebrands if the cards are either placed in a lower tier (like HD 7970 to 280X), or have received a sufficient bump in base frequency to warrant the name change (R9 290 to R9 390). If it's combined with a drop in MSRP it's even better. You only dislike it because AMD did it more in recent years. If it was AMD naming the RX 470 and RX 480 the same, and nVidia doing the rebranding, you would be arguing for the complete opposite... And we all know it.

So you admit you have no proof.... your analogy fails, as I can PROVE it when a flying saucer crashes into someone's house... there will be photos, video, wreckage, etc... You can't prove that AMD cards age better because they don't!!
I didn't admit anything. You admitted you will dismiss anything anyway. You know. The only proof you should need is that AMD's cards had full FL12_0 support since March 2013 onwards, while nVidia's cards only do from the 900 series in Sep 2014... But yeah... My analogy is incorrect... Right...?

So we're in agreement.... AMD cards don't age better than Nvidia cards... this isn't some vague theory - this is something that can easily be proved with empirical evidence. The reason you don't have any proof is because there ISN'T ANY.
Obviously what I just told you went way over your head. Guess this is where that part of the conversation ends.

Really? Facts are facts? This from the person who hasn't provided any facts?!?!? FACTS are that the 3GB Nvidia 1060 delivers adequate performance - and for its price, better performance than any other card... hence its win in its price bracket...
Keep believing that.

Actually, I agree with this last point... I suppose new cards are the only ones recommended in this article... Saying that, the 1060 6GB performs about on par with a Fury and costs a fair amount less...
LOL Yeah right... GTX 1060 on par with a Fury...? I wanna smoke what you're smoking. Even on TechPowerUp's performance charts, the Fury beats the GTX 1060 at all resolutions. Take in the DX12 and Vulkan games, and the Fury matches a GTX 1070... Even in GameWorks titles like the Witcher III and Rise of the Tomb Raider, the Fury beats the GTX 1060. Even in GTAV which is a CPU heavy game, while AMD is known for their CPU overhead issue, the Fury beats the GTX 1060... And the higher the resolution, the bigger the difference.
 
Last edited:
Lol "offended". Don't be ridiculous. It doesn't matter that they don't cost the same amount of money. Neither do the RX 480 4GB and 8GB. Imagine if the RX 470 was also called an RX 480. That is what nVidia is doing with the GTX 1060.
As for the rebrands, there is no problem with rebrands if the cards are either placed in a lower tier (like HD 7970 to 280X), or have received a sufficient bump in base frequency to warrant the name change (R9 290 to R9 390). If it's combined with a drop in MSRP it's even better. You only dislike it because AMD did it more in recent years. If it was AMD naming the RX 470 and RX 480 the same, and nVidia doing the rebranding, you would be arguing for the complete opposite... And we all know it.

Really? I'm not an Nvidia fanboy... I believe the pot is calling the kettle black... Nvidia sells 2 different cards and calls them both the 1060.... yes, it's a bit confusing... but they are different prices - common sense should dictate that the cheaper one isn't going to be as good...

What seems more dishonest to me is what AMD did with the 480...They sold 2 cards that were EXACTLY THE SAME.... yet they charged more for the 8gb one - even though the 4GB one WAS the 8gb card - just with different firmware to trick the computer into thinking it had 4GB... This seems honest to you? I feel sorry for the suckers who bought an 8GB card at a premium thinking it was superior...

I didn't admit anything. You admitted you will dismiss anything anyway. You know. The only proof you should need is that AMD's cards had full FL12_0 support since March 2013 onwards, while nVidia's cards only do from the 900 series in Sep 2014... But yeah... My analogy is incorrect... Right...?

No... I said I will dismiss "evidence" that cites sketchy forums... you bring me REAL evidence from reputable sources and we can talk... but you can't... because there isn't any.... Nvidia didn't support FL12 until 2014... So what? The majority of games STILL don't use it... That doesn't mean an older AMD card ages better.... it just means it has support for something it isn't powerful enough to utilize...


LOL Yeah right... GTX 1060 on par with a Fury...? I wanna smoke what you're smoking. Even on TechPowerUp's performance charts, the Fury beats the GTX 1060 at all resolutions. Take in the DX12 and Vulkan games, and the Fury matches a GTX 1070... Even in GameWorks titles like the Witcher III and Rise of the Tomb Raider, the Fury beats the GTX 1060. Even in GTAV which is a CPU heavy game, while AMD is known for their CPU overhead issue, the Fury beats the GTX 1060... And the higher the resolution, the bigger the difference.

The difference, at most, is about 10-15%.... usually about even or within 5%...and the Fury STILL costs about 20% or more higher... For price/performance, I think the 1060 6GB wins hands down... plus, it's a newer card, with more memory than the Fury, so maybe it will "age better"... heheheheh....

I know you don't like doing this... but I'll provide you with some benchmarks that actually prove what I'm saying...

http://hwbench.com/vgas/geforce-gtx-1060-vs-radeon-r9-fury
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-6GB-vs-AMD-R9-Fury/3639vs3509

Notice it also uses half the power....
 
Really? I'm not an Nvidia fanboy...
Could've fooled me...

I believe the pot is calling the kettle black... Nvidia sells 2 different cards and calls them both the 1060.... yes, it's a bit confusing... but they are different prices - common sense should dictate that the cheaper one isn't going to be as good...
No... That is not how it works. If a card is named the same it should have the same chip with the same number of cores. Period.

What seems more dishonest to me is what AMD did with the 480...They sold 2 cards that were EXACTLY THE SAME.... yet they charged more for the 8gb one - even though the 4GB one WAS the 8gb card - just with different firmware to trick the computer into thinking it had 4GB... This seems honest to you? I feel sorry for the suckers who bought an 8GB card at a premium thinking it was superior...
Honest or not, it was the 4GB users winning an additional 4GB, not the 8GB users losing it.

No... I said I will dismiss "evidence" that cites sketchy forums... you bring me REAL evidence from reputable sources and we can talk... but you can't... because there isn't any.... Nvidia didn't support FL12 until 2014... So what? The majority of games STILL don't use it... That doesn't mean an older AMD card ages better.... it just means it has support for something it isn't powerful enough to utilize...
Right... So that 2013 card, the HD 7970, now named as the 280X is completely useless under DX12 and Vulkan that use new features... Right...?

[chart: Doom, 1920×1080 Vulkan benchmark results]
Oh... It can still perform quite nicely.. What a shock.. And lookie lookie where its nVidia counterpart is, the GTX 780... Oh boy...

The difference, at most, is about 10-15%.... usually about even or within 5%...and the Fury STILL costs about 20% or more higher... For price/performance, I think the 1060 6GB wins hands down... plus, it's a newer card, with more memory than the Fury, so maybe it will "age better"... heheheheh....
LOL yeah right... Considering the Fury already matches a GTX 1070 in certain scenarios, the 1060 will definitely not age better, nor is it the better option. But believe whatever you want to. And considering the Fury is now $280, it's an even better choice;
https://www.amazon.com/Sapphire-Rad...=2025&creative=165953&creativeASIN=B0196LWL3W

I know you don't like doing this... but I'll provide you with some benchmarks that actually prove what I'm saying...

http://hwbench.com/vgas/geforce-gtx-1060-vs-radeon-r9-fury
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-6GB-vs-AMD-R9-Fury/3639vs3509
Oh.. And what did I just do above, or a few pages back? I guess I should assume you are only able to see benchmarks that suit your agenda. Once again you prove to be a dishonest agenda-pusher...
Funny how your first link actually recommends the Fury. And oh. You only have to look at the prior benchmark to conclude which card actually has more life in it...
As for your second link, there's a huge difference in the number of users that benchmarked each card. But they do show the Fury is superior in almost everything....

Notice it also uses half the power....
Yeah great... I guess it's good for your bragging rights. The power difference will matter squat per year anyway.
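For anyone curious whether the power gap really does "matter squat" per year, a back-of-the-envelope yearly cost; every number below (wattage gap, gaming hours, electricity price) is an assumption for illustration, not a figure from this thread:

```python
# Ballpark annual electricity cost of a power-draw difference between two cards.
watt_gap = 100        # assumed load-power difference in watts
hours_per_day = 3     # assumed daily gaming time
usd_per_kwh = 0.12    # assumed electricity price

kwh_per_year = watt_gap / 1000 * hours_per_day * 365
annual_cost = kwh_per_year * usd_per_kwh
print(round(annual_cost, 2))  # ~13.14 USD per year under these assumptions
```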

In two years when the GTX 1060 can run almost nothing, and the Fury still performs great, we'll talk again. I'm done wasting my time now.
 
No... That is not how it works. If a card is named the same it should have the same chip with the same number of cores. Period.

Honest or not, it was the 4GB users winning an additional 4GB, not the 8GB users losing it.

Really? Why? Why does the same name have to mean the same core/chip but not the same memory? Especially when performance makes a difference with more/less RAM? And 4GB "winning" an extra 4GB of RAM doesn't seem very fair to me...

Right... So that 2013 card, the HD 7970, now named as the 280X is completely useless under DX12 and Vulkan that use new features... Right...?

[chart: Doom, 1920×1080 Vulkan benchmark results]
Oh... It can still perform quite nicely.. What a shock.. And lookie lookie where its nVidia counterpart is, the GTX 780... Oh boy...

Wow... are you related to HardReset? Yes, we all know AMD loves Doom... let's try a benchmark not including Doom... like maybe... the SEVERAL benchmarks included in my last post? They're not quite as shiny are they... but maybe a bit more objective?

I think from now on it will be a simple matter to spot an AMD fanboy just by seeing when they cite a Doom benchmark as evidence...

LOL yeah right... Considering the Fury already matches a GTX 1070 in certain scenarios, the 1060 will definitely not age better, nor is it the better option. But believe whatever you want to. And considering the Fury is now $280, it's an even better choice;
https://www.amazon.com/Sapphire-Rad...=2025&creative=165953&creativeASIN=B0196LWL3W

Still more expensive than the 1060 6GB... but getting better I suppose... and your "certain scenarios" are Doom I assume...... See above, fanboy... And again, you're assuming it's somehow going to "age better".... still no evidence for this other than Doom...

I'm glad you're "done with this", as I'm thinking this is just like arguing with Hardreset - pointless.....
 
Really? Why? Why does the same name have to mean the same core/chip but not the same amount of memory? Especially when the amount of RAM makes a real difference to performance? And 4GB users "winning" an extra 4GB of RAM doesn't seem very fair to me...
With the amount of RAM you know what you get, because guess what, IT'S LISTED EVERYWHERE how much there is. If you call two chips the same but they are different, you don't know what you're getting. It's purposely deceiving the customer.


Wow... are you related to HardReset? Yes, we all know AMD loves Doom... let's try a benchmark not including Doom... like maybe... the SEVERAL benchmarks included in my last post? They're not quite as shiny are they... but maybe a bit more objective?

I think from now on it will be a simple matter to spot an AMD fanboy just by seeing when they cite a Doom benchmark as evidence...
What's wrong with Doom? Honestly? It is one of the best-programmed games out there. It is not sponsored by either of the two parties, and they obviously spent their time optimizing it for the new API. What better metric is there for future-proofing than a new API that has been put to serious use? GameWorks titles? A 6+ year old API like DX11/OpenGL? Doom is a perfect example, because they optimized for both OpenGL and Vulkan. No other developer did that, other than Oxide for Ashes of the Singularity. You were all dismissing that game too, until nVidia caught up, and suddenly you start including it in benchmarks... Typical hypocrisy...

Then there's AMD cards leading their nVidia counterparts in most DX12 titles... But I guess that's not evidence either...

So... I was right. You'll dismiss any evidence presented, so I simply shouldn't waste my time. And then you call me a fanboy... Grow up... But as a parting gift, here;

[Benchmark image attachment: codbo3_3840_2160.png — Call of Duty: Black Ops III, 3840×2160]


Still more expensive than the 1060 6GB... but getting better I suppose... and your "certain scenarios" are Doom I assume...... See above, fanboy... And again, you're assuming it's somehow going to "age better".... still no evidence for this other than Doom...

I'm glad you're "done with this", as I'm thinking this is just like arguing with Hardreset - pointless.....
Not touching this. Talk to me in two years and we'll see who was right.
 
With the amount of RAM you know what you get, because guess what, IT'S LISTED EVERYWHERE how much there is. If you call two chips the same but they are different, you don't know what you're getting. It's purposely deceiving the customer.

Yet selling a card with 8GB of RAM and calling it 4GB is honest? Again, it's pretty easy to realize that the cheaper card isn't going to be as good... anyone paying for the 1060 3GB and being angry it isn't as fast as the 6GB version isn't very bright... But someone paying MORE for the 8GB 480, then finding out that the 4GB is EXACTLY THE SAME ought to be pretty angry...

I agree, neither is fully honest, but I'll take the Nvidia example over the AMD one any day...


What's wrong with Doom? Honestly? It is one of the best-programmed games out there. It is not sponsored by either of the two parties, and they obviously spent their time optimizing it for the new API. What better metric is there for future-proofing than a new API that has been put to serious use? GameWorks titles? A 6+ year old API like DX11/OpenGL? Doom is a perfect example, because they optimized for both OpenGL and Vulkan. No other developer did that, other than Oxide for Ashes of the Singularity. You were all dismissing that game too, until nVidia caught up, and suddenly you start including it in benchmarks... Typical hypocrisy...

Because Doom is ONE GAME.... and it's the way it's programmed, not necessarily the new-API support, that makes AMD run so much better on it. In order to see if results are indicative of future performance, we need MULTIPLE benchmarks from MULTIPLE games... and the funny thing is, pretty much every other benchmark favours Nvidia...

I love that you added Ashes of the Singularity to your argument... Ashes was supposed to be the poster child for AMD and how superior their cards were GOING TO BE in the future... Then it turned out that Nvidia just had to do some work on their drivers - now they outperform AMD on this game! If we are looking to how cards will perform in the future, this would tell us that Nvidia cards will be the superior choices!!

Then there's AMD cards leading their nVidia counterparts in most DX12 titles... But I guess that's not evidence either...

So... I was right. You'll dismiss any evidence presented, so I simply shouldn't waste my time. And then you call me a fanboy... Grow up... But as a parting gift, here;

[Benchmark image attachment: codbo3_3840_2160.png — Call of Duty: Black Ops III, 3840×2160]


Not touching this. Talk to me in two years and we'll see who was right.

Wishing English was your first language, as this statement doesn't say what you want it to... I'm assuming you mean that AMD outperforms Nvidia in most DX12 titles... There are fewer than a dozen, and Nvidia leads in plenty... AMD often leads in cases where neither card can actually play the game: getting 24FPS instead of 20FPS is an irrelevant result!! Neither is playable!!!

Your last benchmark is typical of your so-called "evidence".... only the top 3 cards can play that game at those settings - and BARELY... yes, the 1060 loses to the 480... by 4FPS.... BUT NEITHER CAN PLAY THE GAME!!!!! Once we tone down the resolution (you shouldn't be playing games at 4k with a mid-tier card anyways), you will note that the 1060 blows it away...
 