AMD's Crimson ReLive Edition driver update brings a host of gaming improvements and new features

The power consumption complaints about Polaris are extremely exaggerated. 50W is a light bulb's worth of difference, and it doesn't really matter in real-world scenarios. A full system with the RX 480 Nitro+ OC 8GB uses less than 300W. There's no real additional power supply cost compared to a GTX 1060, so in the end, it's a moot point. And as long as your card remains cool and quiet enough, it's merely an excuse.
Also... Comparing AIB overclocked cards with reference cards of the competitor to determine power consumption differences is completely dishonest and amounts to comparing apples and oranges.

Even if you believe the GTX 1060 is faster by 5% - 10%, there is really no reason to go for one over the RX 480 at this point if they're similarly priced. Why?
- RX 480 can do crossfire while the GTX 1060 cannot
- RX 480 + FreeSync is very affordable, GTX 1060 + G-sync is not.
- Freesync reduces the need for the 5% - 10% performance advantage that the GTX 1060 offers
- With the Crimson ReLive update, ShadowPlay is no longer a selling point for the GTX cards
- Power consumption is also less relevant with Radeon Chill, if you see a 50W difference as an issue.

Even an RX 470 is arguably a better buy than a GTX 1060 at this point.
 
It seems that you like to ignore what people say, and you didn't even check the links I gave you. The Witcher 3 tests you have were done with a reference RX 480. If you had checked what I gave you, you would see both the reference card (which loses) and the Nitro+ AIB card (which wins).

I don't get it, do you have some sort of Nvidia fetish to be this aggressive in defending them? You have newer benchmarks, done on the 5th of December using newer drivers, with both AIB and reference cards. Use those as a reference and add the 2% for the ReLive driver (from TechPowerUp's tests). Otherwise all of your arguments are thin and not grounded in reality.

1) I own both Nvidia and AMD cards. I simply call out bullsh1t when I see it. You have repeatedly and conveniently failed to point out that you are comparing a 120W STOCK GTX1060 with a 200+ watt OVERCLOCKED RX480. Using NightAntilli's logic, comparing a stock card vs a competitor's overclocked card is completely dishonest - especially since you failed to mention these details in your post. Assuming it was not your intention to be dishonest, if you actually bothered to go through your own Hardwarecanucks links, you'd know that the factory overclocked RX480 only beats the stock 1060. The stock 1060 beats the stock 480, and the overclocked 1060 beats the overclocked RX480. And the EVGA factory overclocked version in this test isn't even clocked that high - there is much more overclocking headroom left compared to the factory overclocked RX480.

http://images.hardwarecanucks.com/image//skymtl/GPU/GTX-1060-UPDATE/GTX-1060-UPDATE-91.jpg

2) What you are doing is CHERRY PICKING different benchmarks from two websites and cross-applying fps gains from one website to the other - that makes zero sense. The Hardwarecanucks benchmarks used certain max settings that showed a narrower difference between the RX480 and the 1060. The TechPowerUp benchmarks used different settings that showed a greater difference (the 1060 having a 10-12% fps advantage). The 1060 from TechPowerUp also got 3-4 fps more in Witcher 3 than the 1060 from Hardwarecanucks at 1440p, so it's clear they're using different settings. Thus, you can't cherry pick your ReLive fps gain statistics from TechPowerUp's tests (which use their own settings and show larger performance differences and larger gains) and then apply them to Hardwarecanucks' gaming benchmarks (which use completely different settings that show smaller differences).
It only makes sense to compare TechPowerUp's ReLive fps gains (with their settings) to TechPowerUp's own gaming benchmarks (again, with their settings).
 
Comparing AIB overclocked cards with reference cards of the competitor to determine power consumption differences is completely dishonest and amounts to comparing apples and oranges.
You mean like in Pulu's post (which you liked) where he was comparing a REFERENCE GTX1060 with a FACTORY OVERCLOCKED RX480 and didn't mention that little detail? It's hypocritical of you to now say comparing different versions is dishonest - at least I gave enough details for people to judge for themselves and mentioned both reference and OCed versions of both the 1060 and the 480.

If you read my post, I specifically pointed out the power consumption of both the reference GTX1060 and the REFERENCE RX480, and the power consumption of both the factory overclocked 1060s and the factory OCed RX480s. I didn't make an ambiguous and misleading claim such as "a GTX1060 uses 70 watts less than an RX480."
I made detailed and accurate claims, such as "a stock GTX1060 uses 70 watts less than an overclocked RX480... and a factory overclocked GTX1060 uses the same energy as a stock RX480." Go look up the HardOCP power consumption tests again. I gave FOUR power consumption figures, one for each card type, so folks can see for themselves.

The power consumption complaints about Polaris are extremely exaggerated. 50W is a light bulb's worth of difference, and it doesn't really matter in real-world scenarios. A full system with the RX 480 Nitro+ OC 8GB uses less than 300W.

1) You apparently didn't read the links I provided. The HardOCP test showed power consumption EXCEEDED 300 watts during gaming scenarios when using a Gigabyte RX480. This is not even "peak or maximum power" consumption, which should be higher than gaming consumption.

http://www.hardocp.com/article/2016...0_g1_gaming_vs_msi_gtx_1060_x/16#.WErO6rIrKpp
2) It matters when it comes down to 1x 6-pin vs 2x 6-pin or 1x 8-pin connectors. Again, most factory OCed 1060s come with only 1x 6-pin. The EVGA Superclocked version reviewed by Hardwarecanucks only has 1x 6-pin. Virtually all, if not all, factory OCed RX480s require 2x 6-pin or 1x 8-pin. So it makes a difference for your PSU.
3) And a 50-70 watt difference is actually quite a lot of power when you compare it relative to the reference stock TDPs of 120W (stock 1060) and 150W (certain OCed 1060s/stock RX480) and relative to other components. The additional 50-70 watts of an OC RX480 is basically the wattage of adding another CPU, and it's 33%+ more power/heat for the RX480 (rough math below). That plays a factor in card size (you can't have small form factor cards like the 6-7 inch 1060s), cooling solution (more fans and higher fan speeds = more noise), and HTPC builds (it is more difficult to fit an OCed 480 in most HTPC/small form factor cases).
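Quick back-of-the-envelope on that "33%+" figure, using the same ballpark wattages quoted above (illustrative only, not new measurements):

```python
# Rough sketch only; the wattages are the ballpark figures quoted above
# (120 W stock GTX1060, 150 W stock RX480, ~50-70 W extra for an OC RX480),
# not measurements of my own.
stock_1060 = 120
stock_480 = 150
extra_oc_480 = 50          # low end of the 50-70 W range

rel_increase = extra_oc_480 / stock_480
gap_vs_1060 = (stock_480 + extra_oc_480) - stock_1060

print(f"OC RX480 vs stock RX480:  +{rel_increase:.0%} power/heat")        # ~33%
print(f"OC RX480 vs stock GTX1060: ~{gap_vs_1060} W gap")                 # ~80 W
print(f"that gap alone is {gap_vs_1060 / stock_1060:.0%} of the 1060's TDP")
```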
- Power consumption is also less relevant with Radeon Chill, if you see 50W difference as an issue.
The TechPowerUp review states that Radeon Chill causes some stuttering in certain gaming scenarios, since the GPU is basically downclocking itself and has to boost the clocks back up. Not recommended in multiplayer shooters. And I assume it would be problematic for gaming benchmarks because it may adversely affect the GPU's minimum (and possibly average) fps.
Even if you believe the GTX 1060 is faster by 5% - 10%, there is really no reason to go for one over the RX 480 at this point if they're similarly priced. Why?
- RX 480 can do crossfire while the GTX 1060 cannot
- RX 480 + FreeSync is very affordable, GTX 1060 + G-sync is not.
- Freesync reduces the need for the 5% - 10% performance advantage that the GTX 1060 offers
- With the Crimson ReLive update, ShadowPlay is no longer a selling point for the GTX cards
Those are some of your more valid points. However, CrossFire/SLI isn't that common (and the general rule anyway is to get the best card you can afford instead of worrying about dual-card solutions), and both FreeSync and G-Sync are new tech that have yet to become mainstream. The FreeSync monitors I see on Newegg are $150-$200 more expensive than a regular monitor of the same size/resolution. If you have an extra $150-$200 to blow, you might as well just get a GTX1070 instead of the 1060/480 and stick with a regular monitor. I don't know how many people cared about AMD's Raptr (now ReLive) or Nvidia's ShadowPlay. And with DX12 allowing the mixing of different brands and types of GPUs, traditional CrossFire/SLI is becoming obsolete.

The question is, how much do people care about small card form factor/power consumption/heat vs how much they care about CrossFire in the future/FreeSync being cheaper than G-Sync. I'm thinking most people care more about the former than the latter, which makes the 1060 a better buy. But if they care more about the latter, then yeah, the RX480 will be better.
 
Minimums have reportedly improved with Radeon Chill rather than gotten worse in many instances. The cases where they get worse are generally when the Chill feature drops the framerate below what the minimum would be without Chill.

Stuttering is... Intended with Radeon Chill. To quote TechPowerUp, which you yourself reference:
you can definitely see some stuttering in those movements across your screen. Once you start interacting with the game, the stutters will instantly disappear and you will be back to your maximum framerate.

Also... Things are not always as they seem... Maybe an off-topic example, but we all know the GTX 1070 and GTX 1080 are faster than the Fury X. Yet if you look at recordings, the Fury X looks smoother in many cases, despite the lower framerate. Look here for example:

Small form factor is an argument for the GTX 1060. But then again, so is a Fury Nano. We have to remain focused on what we're talking about at any given point. That point is, power consumption is not really an issue.
Even if you game 24/7, 365 days a year, the difference on your bill is going to be at most around $5 a month. Most of us can't even game 8 hours a day, so in the end the cost is going to be much less than $20 a year.
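Rough numbers for that, assuming a ~50 W gap and an example electricity price of $0.12/kWh (the price is just an assumption for illustration, plug in your own rate):

```python
# Back-of-the-envelope cost of an extra ~50 W while gaming.
# The $0.12/kWh price is an assumed example, not a quoted figure.
extra_watts = 50
price_per_kwh = 0.12  # USD, assumption

def monthly_cost(hours_per_day: float) -> float:
    kwh_per_month = extra_watts / 1000 * hours_per_day * 30
    return kwh_per_month * price_per_kwh

print(f"24/7 gaming: ~${monthly_cost(24):.2f} per month")      # ~$4.32
print(f"8 h per day: ~${monthly_cost(8) * 12:.2f} per year")   # ~$17
```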
If it uses 300W vs 250W, in the end you still need at least a quality 400W PSU either way. No one is buying a 300W PSU nowadays.
As long as your card doesn't throttle, it's a non-issue.
As long as the noise is not too high, it's a non-issue.
The Gigabyte is likely the card that uses the most power out of all of them, because its VRMs are so close to each other that they get hotter than on any other card. And more heat = more resistance = more power required.
I used this as a reference for not using more than 300W, and I was clearly referring specifically to the Nitro+ OC:
http://www.kitguru.net/components/graphic-cards/zardon/sapphire-rx-480-nitro-oc-4gb-8gb-review/30/

You want small form factor? Your choice is GTX 1060 or Fury Nano, basically. Obviously the Nano is way more power hungry than a GTX 1060, so power consumption is again not the main issue when deciding on small form factor... The Nano does throttle, yes... And it would be better if it didn't, yes. But even with its throttling, its performance is better than the GTX 1060's in many cases. But if quiet and cool is your thing, you might be better off with the GTX 1060 anyway for small form factor...

I agree that a single card solution is better than multi-card solutions. The option for more than a single card is still a positive though, especially since Crossfire scaling is superior to SLI nowadays. The option to buy one GPU this year, and put another one in next year can be an attractive one to some...

FreeSync... I just bought a 29" UltraWide IPS FreeSync curved monitor for $250 last week, new... Yes, the range is small, but 40 Hz - 75 Hz is good enough for my needs. Much better than hoping that my card never drops below 60fps. And if we look at the number of monitors that have FreeSync, one can already consider it mainstream. So no, you're incorrect about FreeSync prices, but it depends on what you want really... FreeSync has budget options and high-end options, and you can get a 144Hz FreeSync monitor for $200:
https://www.amazon.com/Acer-XF240H-...1306330&sr=8-2&keywords=Acer+freesync+monitor

Yeah, you might be better off buying the GTX 1070 instead of a new monitor, but if you have to buy a new monitor anyway, FreeSync is definitely something to consider.

But I can agree with the last statement. Depending on what people want, they can choose what suits them best. I don't presume to think that I know what people prefer, but if someone does not need a small form factor, I'd recommend them the RX 480 over the GTX 1060.
 
Minimums have reportedly improved with Radeon Chill rather than gotten worse in many instances. The cases where they get worse are generally when the Chill feature drops the framerate below what the minimum would be without Chill. Stuttering is... Intended with Radeon Chill. To quote TechPowerUp, which you yourself reference: you can definitely see some stuttering in those movements across your screen. Once you start interacting with the game, the stutters will instantly disappear and you will be back to your maximum framerate.
I don't know if these minimum improvements take into account the stuttering, because otherwise unplayable microstuttering won't be reflected in the minimum fps. E.g. if the card renders 1 frame in the first 9/10 of a second and 59 frames in the last 1/10 of that second, the per-second minimum fps reading will still be 60 fps, but the microstutter will be extremely noticeable and unplayable for that timeframe. (Hopefully this doesn't cause problems in first person shooters where you're camping in a corner or sniping and not really "interacting" with anything, since you still need smooth, consistent frame times/zero microstutter to make those split-second decisions.)
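A toy illustration of that point (made-up frame times, just to show how a per-second counter hides a single huge frame):

```python
# Hypothetical frame pacing: one 900 ms frame, then 59 fast frames.
# A per-second fps counter still reports 60 fps for that second.
frame_times_ms = [900] + [100 / 59] * 59

total_seconds = sum(frame_times_ms) / 1000
fps_counter = len(frame_times_ms) / total_seconds
worst_frame_ms = max(frame_times_ms)

print(f"fps counter for that second: {fps_counter:.0f} fps")
print(f"worst single frame: {worst_frame_ms:.0f} ms "
      f"(feels like {1000 / worst_frame_ms:.1f} fps)")
```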

The problem with the 1070 in that 1070/1080 vs Fury X video is the same issue - DX12 microstuttering in Total War: Warhammer. I think they might have fixed it (or should have fixed it) by now, since the game came out at the end of May and the video was made in June.
Small form factor is an argument for the GTX 1060. But then again, so is a Fury Nano. We have to remain focused on what we're talking about at any given point. That point is, power consumption is not really an issue. ...You want small form factor? Your choice is GTX 1060 or Fury Nano, basically. Obviously the Nano is way more power hungry than a GTX 1060, so power consumption is again not the main issue when deciding on small form factor... The Nano does throttle, yes... And it would be better if it didn't, yes. But even with its throttling, its performance is better than the GTX 1060's in many cases. But if quiet and cool is your thing, you might be better off with the GTX 1060 anyway for small form factor...
That is true. But for folks who are running small HTPCs with minimal fans, the less heat/power consumption the better. For folks who don't have that constraint, you are correct that the 1060 loses its advantages.

FreeSync... I just bought a 29" UltraWide IPS FreeSync curved monitor for $250 last week, new... Yes, the range is small, but 40 Hz - 75 Hz is good enough for my needs. Much better than hoping that my card never drops below 60fps. And if we look at the number of monitors that have FreeSync, one can already consider it mainstream. So no, you're incorrect about FreeSync prices, but it depends on what you want really... FreeSync has budget options and high-end options, and you can get a 144Hz FreeSync monitor for $200:
https://www.amazon.com/Acer-XF240H-...1306330&sr=8-2&keywords=Acer+freesync+monitor
Yeah, you might be better off buying the GTX 1070 instead of a new monitor, but if you have to buy a new monitor anyway, FreeSync is definitely something to consider.
I didn't realize some of them are on sale for $200 and under now. I will agree that if you're looking to get a new monitor, a cheaper FreeSync monitor certainly holds advantages at this point compared to G-Sync monitors.

But I can agree with the last statement. Depending on what people want, they can choose what suits them best. I don't presume to think that I know what people prefer, but if someone does not need a small form factor, I'd recommend them the RX 480 over the GTX 1060.
I'll agree to that. I still hold the opinion that the GTX1060 is superior even if SFF isn't the limiting factor, but in the scenarios you've mentioned, the RX480 would be the better option - it's situational.
 
From the TechPowerUp examination of the new ReLive drivers, it seems the majority of recent games only gain ~1% fps or less. The majority of the larger fps gains seem to be in much older games (BF3, BF4) or in games that came out as buggy/unoptimized console ports (e.g. the Batman games).
What are you talking about? I've seen graphs that don't even show the percentages on them. Here you can't be misled, since you have big numbers in front of you. (And the asterisk just says that the tests were done with the pre-release driver, not the final one.) These look to be presentation slides. If you've ever made something like this, you'll know that you always exaggerate the sizes when you have small percentages so that the differences can be seen clearly. It just doesn't look good otherwise.
Just because these graphs aren't nearly as misleading as the terrible AMD Zen performance graphs that show double the bar height with no percentages doesn't mean these graphs are not misleading. They're just "less" misleading than the blatantly misleading graphs without percentages. They are meant to be misleading because the graphs purposely distort the scale. So folks who don't read the numbers or the fine print (a lot of folks) will be misled.
...it just doesn't look good otherwise.
Yes, that's why it is a marketing trick to mislead people into thinking there is a bigger difference than there actually is.
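A quick toy example of how much a truncated axis can inflate a small gain (the fps numbers and axis floor are made up for illustration):

```python
# Made-up numbers: a ~2% fps uplift drawn on a bar chart whose y-axis
# starts at 58 fps instead of 0.
baseline_fps, relive_fps = 60.0, 61.2
axis_floor = 58.0

real_gain = relive_fps / baseline_fps - 1
apparent_gain = (relive_fps - axis_floor) / (baseline_fps - axis_floor) - 1

print(f"real gain:                {real_gain:.0%}")       # ~2%
print(f"apparent bar-height gain: {apparent_gain:.0%}")   # ~60%
```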

It's not the size that matters, it's how you use it. A good rule of thumb that applies in many areas of life.
 
1) I own both Nvidia and AMD cards. I simply call out bullsh1t when I see it. You have repeatedly and conveniently failed to point out that you are comparing a 120W STOCK GTX1060 with a 200+ watt OVERCLOCKED RX480. Using NightAntilli's logic, comparing a stock card vs a competitor's overclocked card is completely dishonest - especially since you failed to mention these details in your post. Assuming it was not your intention to be dishonest, if you actually bothered to go through your own Hardwarecanucks links, you'd know that the factory overclocked RX480 only beats the stock 1060. The stock 1060 beats the stock 480, and the overclocked 1060 beats the overclocked RX480. And the EVGA factory overclocked version in this test isn't even clocked that high - there is much more overclocking headroom left compared to the factory overclocked RX480.

http://images.hardwarecanucks.com/image//skymtl/GPU/GTX-1060-UPDATE/GTX-1060-UPDATE-91.jpg

2) What you are doing is CHERRY PICKING different benchmarks from two websites and cross-applying fps gains from one website to the other - that makes zero sense. The Hardwarecanucks benchmarks used certain max settings that showed a narrower difference between the RX480 and the 1060. The TechPowerUp benchmarks used different settings that showed a greater difference (the 1060 having a 10-12% fps advantage). The 1060 from TechPowerUp also got 3-4 fps more in Witcher 3 than the 1060 from Hardwarecanucks at 1440p, so it's clear they're using different settings. Thus, you can't cherry pick your ReLive fps gain statistics from TechPowerUp's tests (which use their own settings and show larger performance differences and larger gains) and then apply them to Hardwarecanucks' gaming benchmarks (which use completely different settings that show smaller differences).
It only makes sense to compare TechPowerUp's ReLive fps gains (with their settings) to TechPowerUp's own gaming benchmarks (again, with their settings).

1. I never once mentioned power consumption because it's retarded. Both cards draw too little for people to even care about it. If you care about a few watts, then maybe a gaming PC is not what you want.

2. I am the one cherry picking benchmarks? The only thing I did was pick the latest ones that actually use AIB cards for both companies. And as I told you before, you only looked at benchmarks that only have reference AMD cards, which is frankly very sly of you. I'm starting to think you did it intentionally and are just acting like you don't know.
Different benchmarks with different settings... if TechPowerUp really did use modified settings for the games they tested, then shame on them. EDIT: I went and read the fine print: "All games are set to their highest quality setting unless indicated otherwise." - so yeah, your argument here is irrelevant. Both benchmarks use max settings.
The only difference is the benchmark run itself - you can't expect them to have the same scripted sequences if the game doesn't offer a benchmark tool.

FYI, TechPowerUp is not the only one reporting these types of gains with the new drivers. You can also check user benchmarks; the majority say they got a boost in the games they usually play or in benchmark tools like 3DMark.

Long story short: the RX 480 is the recommended buy since it offers similar DX11 performance and works better in DX12/Vulkan. You also get FreeSync support, which is significantly cheaper than G-Sync. You can go for Nvidia if you plan on playing older games and don't care about future games (1 or 2 years down the line), or if it's cheaper in your country.
 
Benchmarks? If that's what matters to you, what a waste of a card... I only care that my games play well enough for me to enjoy them. If you pay an extra $300 for a 10% boost, you are not a good capitalist.

Video Chipset: AMD Radeon HD 7950/R9 280

Works great
 
All I know is that I get vertical noise lines when a game is loading or when it loads the next part of the map. For example, when I leave my garage in GTA V, there's a black screen from the garage to the street, and I get the vertical noise lines.
 