The Best Graphics Cards: What's the Best GPU for Your Budget?

Hah, this jebaited guy is so out of date; Steam already accounted for internet cafe hardware in its April 2018 hardware survey. I already gave the link above. PUBG just kinda destroyed AMD's market share, really.
Steam changed the algorithm so that internet cafes are not so prevalent, but not the way it gathers data. A single PC can still get multiple surveys if you have multiple Steam accounts on it. My guess is that they now ask for the survey less often on the same system so that it doesn't get counted 20-40 times.

PUBG certainly didn't make tens of millions of people buy Nvidia GPUs in such a short time; it just made more people play on the same PCs (be it at home, at school, at friends' houses or at internet cafes), and Fortnite did the same. AMD's market share did drop, but not by that huge a margin. AMD has been trending up as people start to lose interest in these games, especially in PUBG's case.
 

Well, at its peak PUBG had 3.1 million concurrent players and 50 million copies sold on PC as of June 2018. Granted, hackers were buying multiple copies, but the player base should still be more than 30 million. Couple that with the mining era, when RX 580s were selling for $500-plus, and no wonder AMD's Steam market share dropped.
I guess Steam could generate a unique hardware ID for each machine to avoid multiple entries; Overwatch has been using hardware IDs to ban hackers.
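Purely as an illustration of that idea (this is not how Valve actually does it, and the identifier fields below are my own assumptions), a survey backend could hash a few hardware identifiers into a stable machine fingerprint and then count each fingerprint only once:

```python
# Hypothetical sketch: deduplicate survey submissions by a per-machine hardware ID.
# The identifier fields and hashing scheme here are assumptions, not Valve's method.
import hashlib

def machine_id(cpu_model: str, gpu_model: str, mac_address: str, disk_serial: str) -> str:
    """Hash a few hardware identifiers into one stable machine fingerprint."""
    raw = "|".join([cpu_model, gpu_model, mac_address, disk_serial])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def dedupe_submissions(submissions: list[dict]) -> list[dict]:
    """Keep only the first survey submission seen per machine fingerprint."""
    seen: set[str] = set()
    unique = []
    for sub in submissions:
        fid = machine_id(sub["cpu"], sub["gpu"], sub["mac"], sub["disk_serial"])
        if fid not in seen:
            seen.add(fid)
            unique.append(sub)
    return unique

# Two accounts on the same internet-cafe PC collapse to a single entry:
subs = [
    {"account": "user_a", "cpu": "i5-8400", "gpu": "GTX 1060", "mac": "AA:BB", "disk_serial": "X1"},
    {"account": "user_b", "cpu": "i5-8400", "gpu": "GTX 1060", "mac": "AA:BB", "disk_serial": "X1"},
]
print(len(dedupe_submissions(subs)))  # -> 1
```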
 
The manufacturing cost is exactly the same - the 5700 chip has just failed specific binning tests that the 5700 XT passed.
The 5700 actually has much better performance-per-watt than the 5700 XT, so I don't think the 5700 is merely a failed-bin 5700 XT. Usually, when something is only a failed bin or cut-down version of something else, its performance-per-watt stays the same or gets worse.

For example, the cut-down, core-disabled triple-core Athlon and Phenom CPUs had almost the same power consumption and TDP as the quad cores most of the time (even at comparable clock speeds) and worse performance-per-watt than the quads. The GTX 1070 actually had slightly better performance-per-watt than the GTX 1060.
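For reference, "performance per watt" here just means average frame rate divided by board power; a toy comparison with made-up placeholder numbers (not measurements of any of the cards above) looks like this:

```python
# Toy perf-per-watt comparison. The fps/watt figures are illustrative placeholders,
# not benchmark results for the cards discussed above.
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

cards = {
    "full die (hypothetical)":     {"fps": 100.0, "watts": 225.0},
    "cut-down die (hypothetical)": {"fps": 90.0,  "watts": 180.0},
}
for name, d in cards.items():
    print(f"{name}: {perf_per_watt(d['fps'], d['watts']):.3f} fps/W")
# With these placeholder numbers the cut-down part comes out slightly more
# efficient (0.500 vs 0.444 fps/W), which is the pattern being described.
```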
 
Efficiency and OC headroom come down to the chip lottery. What he should have said is that AMD disables parts of the GPU that fail during manufacturing. For example, the 5700 XT has 2560 shaders while the 5700 has only 2304, and there are other differences too.
 

Think of it this way: out of 100 dies from a wafer (figuratively speaking), only 70 are working dies. AMD does its binning and 50 of them pass as 5700 XT, while the remaining 20 become 5700s (due to higher leakage or a lower frequency/Vcore ceiling). 5700 dies either get laser-cut or just get a BIOS that disables a portion of their cores (so cross-flashing to a 5700 XT is possible).

Now the 5700 XT has about 11% more cores than the 5700 (2560 vs 2304), so the 5700 XT should perform roughly that much better at the same efficiency as the 5700; however, AMD decided to overclock the hell out of the 5700 XT, so efficiency is lost (1.2 V Vcore vs the 5700's 1.0 V). There's very little OC headroom left in the 5700 XT now, much like Zen 2. I think AMD imposed limits in the 5700 and 5700 XT BIOSes to keep the 5700 from getting too close to 5700 XT performance and to keep the 5700 XT from frying itself. Nvidia actually said that running their chips at 1.093 V Vcore could kill them within a year:
https://digiworthy.com/2017/03/30/overvolting-nvidia-pascal-gpus/
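To make that 100-die example concrete, here's a toy simulation of the split; the thresholds and distributions are invented purely for illustration and have nothing to do with AMD's real process data:

```python
# Toy binning sketch for the "100 dies per wafer" example above.
# All thresholds and distributions are invented for illustration only.
import random

random.seed(0)

def bin_die(max_freq_mhz: float, leakage: float) -> str:
    if max_freq_mhz < 1780 or leakage > 1.25:
        return "scrap"            # non-functional or too far out of spec
    if max_freq_mhz >= 1850 and leakage <= 1.1:
        return "5700 XT"          # hits the higher clock/voltage target
    return "5700"                 # works, but only at the lower-tier spec

dies = [(random.gauss(1900, 150), random.gauss(1.0, 0.15)) for _ in range(100)]
counts = {"5700 XT": 0, "5700": 0, "scrap": 0}
for freq, leak in dies:
    counts[bin_die(freq, leak)] += 1

print(counts)  # the resulting split for this particular random draw
```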
 
The 5700 actually has much better performance-per-watt than the 5700 XT, so I don't think the 5700 is merely a failed-bin 5700 XT.
Binning isn't just done for die defects - each chip is tested across a range of voltages, operating frequencies, and thermal characteristics. The results are then statistically grouped (which is where the terms "bin" and "binning" come from) and the chips are sold on those results. Yes, the RX 5700 has two workgroup processors (equating to 4 CUs) disabled; however, it is only clocked around 7% lower than the XT while the TDP is 20% down - this strongly suggests it is a properly binned die, rather than just the same chip as in the XT set up differently.
 

Or the 5700 XT is pushed out of its optimal parameters. The last 5% of clock speed on my 2080 Ti requires about 22% more power to reach (1995 MHz at 0.983 V Vcore vs 2070 MHz at 1.093 V). Remember that the 5700 runs at 1.0 V Vcore and the 5700 XT runs at 1.2 V Vcore at stock settings.
This is the basis for Nvidia's Max-Q technology:
[Image: Max_Q_4.png]
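As a rough sanity check on those numbers, a first-order dynamic power model (P roughly proportional to f x V^2, ignoring static/leakage power entirely) using the figures quoted above gives:

```python
# First-order dynamic power model: P ~ f * V^2 (ignores leakage/static power).
# Frequencies and voltages are the figures quoted in the post above.
def relative_power(freq_mhz: float, vcore: float, base_freq_mhz: float, base_vcore: float) -> float:
    return (freq_mhz / base_freq_mhz) * (vcore / base_vcore) ** 2

# 2080 Ti example: 1995 MHz @ 0.983 V vs 2070 MHz @ 1.093 V
print(f"{relative_power(2070, 1.093, 1995, 0.983):.2f}x")  # ~1.28x power for ~3.8% more clock

# 5700 (1.0 V) vs 5700 XT (1.2 V) stock Vcore, looking at voltage alone:
print(f"{(1.2 / 1.0) ** 2:.2f}x")  # ~1.44x dynamic power just from the voltage bump
```

It's only a ballpark model, but it shows why the last few percent of clock speed costs so much power.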
 
That's certainly a possibility, given that Navi's attributes would be under a lot of initial scrutiny and Navi 10 is being used for just two products. Taking your TU102 example, that's used in just 4 products (TU104 is used in 10; TU106 in 8) and they're all top-end models, so there's little scope to use it in a lower spec model (a low bin TU102 would just overlap with a good TU104).
 
After about 90 fps I don't care about fps anymore. I would love to see Nvidia cards with extra hardware just for gore quality and fully destructible environments. Maybe with settings that go from off, low, med, high, to end-of-the-world lol. With extra power connectors for the extra hardware on the cards. That would be cool.

I don't know, though; maybe that stuff is more about the game than the hardware, but it would be very cool to see the card makers sell cards that do actual stuff like that, which is more fun to me personally.

LEDs that I could plug into the front of desktop cases, connected to the cards and ramping up with card performance in colors like RGB memory sticks, would be fun also. I figure this would take different parts of the industry working together to do hardware things like these, but it would make the cards way more fun. Otherwise it's just the same small step differences on the ladder from mid to upper cards, which stops me from wanting to buy and puts me to sleep.
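The software half of that LED idea is already doable today. Here's a rough sketch, assuming an Nvidia card with nvidia-smi on the PATH; the actual LED output is left as a hypothetical set_led_color stub since real control would go through whatever lighting SDK the case or strip vendor provides:

```python
# Sketch: poll GPU utilization via nvidia-smi and map it to an RGB colour ramp.
# set_led_color() is a hypothetical stub - real LED control would go through
# a vendor lighting SDK, not this placeholder.
import subprocess
import time

def gpu_utilization_percent() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

def load_to_rgb(load: int) -> tuple[int, int, int]:
    """Green at idle, fading to red at full load."""
    load = max(0, min(100, load))
    return (int(255 * load / 100), int(255 * (100 - load) / 100), 0)

def set_led_color(rgb: tuple[int, int, int]) -> None:
    print(f"LED -> {rgb}")  # placeholder for a real lighting SDK call

while True:
    set_led_color(load_to_rgb(gpu_utilization_percent()))
    time.sleep(1)
```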
 
I don't know what the point of tampering with the hardware survey would even be, lol. Anyways, Steam adjusted how it registers hardware entries to account for internet cafes a while ago.

https://www.tomshardware.com/news/steam-hardware-survey-cpu-gpu,37007.html




This is where every GPU review has been lying to you. Check the list of Steam games with the highest concurrent player counts:
https://store.steampowered.com/stats/
If you go by every game in that list, only in Rainbow Six does the RX 580 outperform the GTX 1060. Every GPU review only cares about AAA games that actual gamers care little about. TechSpot also deliberately left several massively played games (GTA V, No Man's Sky, PUBG, etc...) out of their GTX 1060 vs RX 580 benchmarks. If you were a budget gamer, would you buy a budget GPU to play every new AAA game, or a popular MMO?
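One way to formalize that argument is to weight each game's result by how many people actually play it instead of averaging a flat AAA suite. The fps and player numbers below are made-up placeholders just to show the arithmetic, not real benchmark or player-count data:

```python
# Player-count-weighted comparison of two cards across a game list.
# All fps and player figures here are invented placeholders, not measurements.
games = [
    # (name, concurrent_players, fps_card_a, fps_card_b)
    ("Popular MMO/BR title",  900_000, 95, 90),
    ("Esports shooter",       500_000, 240, 250),
    ("New AAA single-player",  40_000, 62, 58),
]

def weighted_avg_fps(fps_index: int) -> float:
    total_players = sum(g[1] for g in games)
    return sum(g[1] * g[fps_index] for g in games) / total_players

print(f"Card A: {weighted_avg_fps(2):.1f} fps (player-weighted)")
print(f"Card B: {weighted_avg_fps(3):.1f} fps (player-weighted)")
# A flat average over AAA titles only can easily flip which card "wins".
```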

Anyways, current dedicated GPU market share:

What's the point in testing new GPUs with old game engines? Anyone playing PUBG and NMS has already bought their GPU, so why would they be the ones most interested in new GPUs?
 
Lowering prices too much gives customers the impression that you are selling an inferior product.
Well, it is an inferior product when you consider the overall execution, reliability, stability, and feature set of Nvidia's software, and usually its hardware. This has been pretty common PC gaming knowledge for a long time.
That doesn't mean AMD is bad; they make awesome GPUs.

I can't understand why AMD is so low in market share (Steam survey). I mean, where PC gamers usually buy their GPUs is the mid-range. The GTX 1060 is a good card, but not great considering that the RX 580 offers about the same for less.
Last time I checked, AMD didn't even have 20% of the GPU market share, and that is pretty low considering it's just AMD and Nvidia in custom PC builds right now.
People know what they're getting when they buy an AMD GPU, especially Steam gamers, who tend to be knowledgeable users.
I've owned several over the years; you deal with more bugs/issues and a less polished overall package.
You get killer performance for the price, and AMD needs to stick to that mentality.
 
Probably because it's just not worth buying a Radeon VII anyway (and it was rather debatable whether or not it was worth it when it came out) - compared to the 2080, it has a higher power requirement (and thus generates more heat), it's more expensive (lowest Newegg price is $699 vs $525), and in our testing, we found it slower at 1440p and 4K in 34 out of 37 benchmarks.
 
Interesting, no mention of Radeon VII despite it being released this year. It outperforms the RTX 2080 in most benchmarks.
The Radeon VII was a flop.
ExtremeTech said:
Gamers hoping that AMD would bring a GPU to market that shook up the status quo or at least forced Nvidia to lower its prices are going to be disappointed (again).
As I and many review sites have stated, if AMD had priced the Radeon VII $50-$100 lower, it would have done much better. That being said, even at $600 I don't think it would have changed the outcome.
 
I am definitely considering a 2080 Super for my next build. I love my 2080 Ti FTW3 Hybrid, but I don't see the need for it yet.

Glad to see Nvidia has the high end on lockdown.

Why are you glad, and how is it a good thing?

A monopoly is never a good thing. Competition encourages better pricing, which in turn benefits us consumers.
 
Anyone who owns a Radeon VII is maxing out all current games.
Don't see how they are flopping lol.
Because those people could have spent less and bought a 2080, which performs faster, produces less heat, makes less noise, uses less power, and has more features. The card's performance is good, but that doesn't mean it was a good buy.
 
The problem is the Radeon VII was basically sold at cost. Between the HBM and the 7nm process, AMD didn't make much margin on those cards. If AMD had gone for $599 or $549, it would have been actively losing money on each card.
 

But the RTX 2080 has only 8 GB of memory, and that will be a limiting factor sooner rather than later; plus the Radeon VII is within 5% of RTX 2080 performance, and the cooler isn't that loud. As for the power, who cares about an extra 80 watts?
 
The Radeon VII makes far more sense for what it really is: a rebadged Radeon Instinct MI50. $700 for a highly capable OpenCL compute accelerator is an absolute steal.

In my previous post, I mistakenly included refurbished prices for the 2080 - it turns out they're not that much cheaper than a VII, with Newegg listing $670 for the cheapest 2080 not on sale. On that basis, I wouldn't recommend either: if it were my money, I'd be looking at an RTX 2070 Super and saving myself nearly $200. It's pretty close in performance to both a 2080 and a VII.
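Putting rough numbers on that, using the prices above, a 2070 Super at roughly $500 (implied by the "nearly $200" savings), and ballpark relative-performance indices of my own (not benchmark data):

```python
# Rough value-for-money comparison using the prices from the post above.
# The perf_index values are ballpark assumptions (2080 = 100), not benchmark data.
cards = {
    "RTX 2080 (Newegg, not on sale)": {"price": 670, "perf_index": 100},
    "Radeon VII":                      {"price": 699, "perf_index": 97},
    "RTX 2070 Super":                  {"price": 500, "perf_index": 94},
}

for name, c in cards.items():
    print(f"{name}: {c['perf_index'] / c['price'] * 100:.1f} perf per $100")
# 2080: 14.9, VII: 13.9, 2070 Super: 18.8 - the cheaper card wins on perf-per-dollar.
```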
 