AMD erodes more of Intel's CPU share in Steam Survey, GTX 1060 remains top graphics card

midian182

Staff member
In a nutshell: Steam’s hardware survey for November has dropped, showing that AMD is still chipping away at Intel’s once insurmountable lead in the CPU space. Elsewhere, the GTX 1060 keeps its long-held spot as the most popular video card, though its lead is decreasing, and the RTX 2060 is cementing its place as the most common Turing GPU among Steam users.

It’s no secret that AMD has been closing the gap on Intel in the Steam Survey for some time now. Team Red took more than 25 percent of the CPU share in September; two months later, that figure stands at 26.91 percent. As recently as June 2019, Intel was sitting pretty with an 82 percent share.

AMD has Ryzen to thank for taking the fight to its long-time rival. The processors have been squeezing out Intel’s offerings since their arrival in 2017 and have improved with each generation. With the 7nm Ryzen 5000 series proving a fantastic choice for work and play (provided you can find one) and the 5nm Zen 4 set for release next year, expect AMD’s share to increase at a faster pace. Meanwhile, Intel has its Rocket Lake processors arriving in the first half of 2021, though they use a variant of Sunny Cove backported to the 14nm process.

AMD’s video card share among Steam users remains pretty flat, having increased from 14.8 percent in June last year to just 16.5 percent in November. And while the Radeon RX 6000 series will doubtless sell well (we loved the RX 6800 and RX 6800 XT), AMD is coming up against some monstrous competition in the form of Nvidia’s Ampere, particularly the new RTX 3060 Ti, a $400 card that outperforms the RTX 2080 Super.

AMD’s highest-placed video card in the Steam Survey is the RX 580 in tenth place, holding a 2.14 percent share. The GTX 1060 stays on top with 10.6 percent, though its lead keeps declining, while the RTX 2060 (3.51 percent) and RTX 2070 Super (2.29 percent) are the only 20-series cards in the top ten.

Elsewhere, more people are opting for 16GB as their preferred amount of system RAM, and despite QHD (and higher) resolution monitors getting cheaper, the number of users with 1920 x 1080 displays went up 0.57 percent last month.


 
Shock: the cards that were in the £200-250 price range are at the top of the Steam survey... Not.

Honestly, when will Nvidia and AMD realise there's a limit to what you can charge for mid-range performance? It's almost like they want to kill the PC gaming industry.
 
Shock the cards which were the £200-250 price range are the top of the steam survey... Not.

Honestly when will Nvidia and AMD just realise there's a limit you can charge for mid range performance, it's almost like they want to kill the PC gaming industry.

I don't think they are killing anything. They are just taking advantage of ultra-enthusiasts while us normal folks with a head on our shoulders say “meh, I’ll grab a second-hand 1070 for $180 or buy a new 1660 for about $225”

Problem is, that price/performance range hasn't improved for 3+ years. Only the top tier is getting price/performance increases and creating a log jam in the new and second hand market at the 1080/2060/5700 and higher performance tier.
 
Shock the cards which were the £200-250 price range are the top of the steam survey... Not.

Honestly when will Nvidia and AMD just realise there's a limit you can charge for mid range performance, it's almost like they want to kill the PC gaming industry.
You still don't have to spend more than that for mid-range. Prices for top hardware are high indeed, but even the $400 RTX 3060 Ti that just released is meant for 4K (it averages 64 FPS at 4K across the 18 games TechSpot tested).

Considering the vast majority of people play at 1080p, you can get a much cheaper GPU and run everything problem-free. Yet not many talk about this because it's not as interesting. And don't get tricked into thinking you need an upgrade because of a currently poorly optimized game like AC Valhalla.
 
AMD's price hikes on their CPUs and GPUs are a serious mistake, and these charts show why. Their products still don't command enough market share or mindshare to completely abandon their previously successful tactic of competing on price as well as performance (for CPUs, anyway). The RX 6000-series GPUs shouldn't be priced so high because ATi's feature set is so much thinner than nVidia's, and while I may not care about ray tracing, I'm not everyone. I'm sure that to a lot of people these things matter. This is especially true with things like DLSS 2.0 (which is, let's face it, pretty awesome).

AMD's growth has been mostly about giving better value than Intel at every price point. The majority of the marketplace doesn't know enough to care about core numbers and clock speeds. They just want a computer that's reasonably fast for their uses and isn't expensive. AMD had that lock, stock and barrel through Ryzen, Ryzen+ and Ryzen 2. They also had great value with the RX 5000-series.

The way things are now, I'd be about as eager to buy an RDNA2 card as I was to buy a Vega or Radeon VII (as in, not at all). It was exciting as hell when we saw the performance numbers and realised that for the first time since the R9 Fury, there was an ATi card that would be battling for the top spot.

AMD needs to remember what made the first Ryzen CPUs so attractive. The fact that they were charging about half of what Intel wanted for their 8-core CPUs is what made people crazy for Ryzen. The i7-6900K was priced at about $1,300USD and if AMD had priced the R7-1800X at $1,000USD, it wouldn't have sold because it would only have been a slightly better deal. This is essentially what AMD has done with the RX 6000-series and that's why people are so disappointed by it. I expect the growth on the CPU chart to slow down considerably and I expect no growth at all on the GPU chart.
 
Comparing used-market or two-year-old "new" card pricing to brand-new cards is just plain dumb.
I dont think they are killing anything. They are just taking advantage of ultra enthusiasts while us normal folks with a head on our shoulders say “meh, I’ll grab a second hand 1070 for $180 or buy a new 1660 for about $225”

Problem is, that price/performance range hasn't improved for 3+ years. Only the top tier is getting price/performance increases and creating a log jam in the new and second hand market at the 1080/2060/5700 and higher performance tier.

You do realize Nvidia makes only a few bucks on their lower-end cards, right? What do you want them to do, spend hundreds of millions to produce cards they make almost nothing on? Common sense.
 
I dont think they are killing anything. They are just taking advantage of ultra enthusiasts while us normal folks with a head on our shoulders say “meh, I’ll grab a second hand 1070 for $180 or buy a new 1660 for about $225”

Problem is, that price/performance range hasn't improved for 3+ years. Only the top tier is getting price/performance increases and creating a log jam in the new and second hand market at the 1080/2060/5700 and higher performance tier.
This is exactly what I've been saying. Hardware reviewers have been in apologist mode with that "the price is higher but the card is faster" BS. The performance per dollar available in the market isn't supposed to stagnate; it's supposed to go up every generation. If that weren't true, the relative performance of a modern PC compared to the original IBM PC would give it a seven-figure price. In fact, the price has gone DOWN millions of percent, especially when inflation is taken into account.

This is mostly about nVidia using smoke and mirrors with their part numbers to increase their prices by about 20% with nobody noticing and to make it permanent. However, it's also about AMD happily following along with what nVidia has done, trying to make themselves look better than nVidia by "only" charging $1,000 for their already most overpriced item.
 
You do realize Nvidia makes less than a few bucks on their lower end cards right? What do you want them to do, spend hundreds of millions to produce cards they make almost nothing on? Common sense.
What you posted is the opinion of someone who hasn't been around long enough to really know how this industry normally works, the way that it has managed to work for decades.

You're the kind of person that nVidia hopes for, the one that falls for their marketing because you don't know what came before and how the pricing was structured over time. You also don't seem to realise that the pricing of the top-end cards affects the entire product stack. Spending (as you say) "hundreds of millions to produce cards they make almost nothing on" (a notion so nonsensical that I couldn't help but crack up when I read it) is exactly what they've been doing for longer than many adults today have been alive. Oh wait, since that's impossible because they'd be out of business if they did, they have not done that. This proves what you said to be completely wrong because nVidia still exists.

You know, common sense.

The truth is that nVidia's margins have always been sky-high which is why their current value is well over $121,000,000,000USD. If they were just "making a few bucks on their lower end cards", the cards that they sell far more of than any other market segment, then they wouldn't be where they are today.

You know, more common sense.

From 2008 to 2013, nVidia's cards ending in 80 launched at $500. That's five straight years without raising prices but still making enough profit to keep growing into the corporation that bought ARM this year. Then they started to get greedy, but knew the average consumer wouldn't notice if they increased the price by only $50 per generation. You just proved that it worked, and it worked well.

Just google the following:
"GTX 280 MSRP" - You'll see that it was $500* (June 2008)
>There was no GTX 300-series<
"GTX 480 MSRP" - You'll see that it was $500 <-no hike (March 2010)
"GTX 580 MSRP" - You'll see that it was $500 <-no hike (November 2010)
"GTX 680 MSRP" - You'll see that it was $500 <-no hike (March 2012)
"GTX 780 MSRP" - You'll see that it was $500 <-no hike (May 2013)
>The GTX 880 was supposed to have an MSRP of $500 but nVidia instead jumped straight to the 980 which was given a $50 price hike (Sept 2014)
"GTX 1080" - You'll see that it was $600 <- another $50 hike (May 2016)
"RTX 2080" - You'll see that it was $700 <- another $50 hike (September 2018)

*Originally, nVidia wanted the MSRP of the GTX 280 to be $650, but ATi gut-punched them with the Radeon HD 4850, 4870 and 4890, which offered the same or better performance for $200USD less. This forced nVidia to drop the price of the GTX 280 by $150.

So, between 2008 and 2013, the prices remained stable at $500. I can assure you that "poor little nVidia" wasn't hurting AT ALL in this time period. Their stock price just kept rising and rising, along with the money that they had on-hand. This is because the cost of manufacturing tech goes down much faster than inflation can raise it as the level of tech goes up. That means that prices should never, ever rise. The performance per dollar should be constantly going up not remaining the same.

Nevertheless, nVidia raised their prices by 40% between 2014 and 2018, a period only 80% as long as the one in which prices remained stable. Then, when someone calls them on it, they have you, their shining white knight (always too young to know better), coming to defend "poor nVidia"'s honour. You do this despite the fact that nVidia is screwing YOU just as hard as everyone else.

Now you know what common sense actually is, an oxymoron.
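For what it's worth, the percentage arithmetic can be sanity-checked in a few lines of Python, taking the MSRP list above at face value (the dollar figures are the post's own, not independently verified MSRPs):

```python
# Sanity check of the price-hike arithmetic, using the MSRPs listed above.
msrp_2008 = 500   # GTX 280 launch price per the post
msrp_2018 = 700   # RTX 2080 launch price per the post

# Total increase from the $500 era to the RTX 2080.
hike_pct = (msrp_2018 - msrp_2008) / msrp_2008 * 100
print(f"Total hike: {hike_pct:.0f}%")  # Total hike: 40%

# Length of the hike era (2014-2018) relative to the stable era (2008-2013).
hike_years = 2018 - 2014      # 4 years of $50-per-generation hikes
stable_years = 2013 - 2008    # 5 years of flat $500 pricing
print(f"Hike era is {hike_years / stable_years:.0%} as long as the stable era")
```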
 
What you posted is the opinion of someone who hasn't been around long enough to really know how this industry normally works, the way that it has managed to work for decades. ...
Well said. I remember getting an ATI 4850 in 2008 and being blown away by how much of an upgrade it was over my 8600 GT, which had cost around the same price only 18 months earlier. GPU tech was literally doubling every few years while, as you said, staying similarly priced throughout. I feel the market has stagnated since 2016, and I'm just shocked at how much people still want for four-year-old graphics cards, especially GeForce cards: £120 for a 1060 or £180+ for a 1070. It's almost as if Nvidia products are being treated like iPhones.

I bought a used RX 470 4GB for £65 last year after my 7870 XT 2GB died following six years of use (recommended by TechSpot in 2013 as the best-value mid-range 1080p card). I bought used for the first time because I wanted to wait and see what next-generation cards would come out that were better than the PS5/XSX, and so far I'm disappointed. I'm really tempted to just buy a console and call it a day for PC gaming.
 
AMD's price hikes on their CPUs and GPUs are a serious mistake and these charts show why. Their products still don't command enough marketshare or mindshare to completely abandon their previously successful tactic of competing on price as well as performance (for CPUs anyway).

AMD's growth has been mostly about giving better value than Intel at every price point. The majority of the marketplace doesn't know enough to care about core numbers and clock speeds. They just want a computer that's reasonably fast for their uses and isn't expensive. AMD had that lock, stock and barrel through Ryzen, Ryzen+ and Ryzen 2. They also had great value with the RX 5000-series.

What about the Athlon and Athlon XP? They absolutely destroyed the Pentium III and Pentium 4 while being much cheaper; anyone had to be insane or very misinformed not to go AMD in those days! In fact, the only reason Intel remained alive during that period was probably its much stronger marketing and mindshare among less computer-literate people, plus OEM deals.

I always thought AMD should have invested a lot more in marketing during the era when they had much better products than Intel for cheaper. The 2000s were a time when TV, magazine and newspaper ads still mattered a lot more than today, and I can count on my fingers the AMD ads I saw across all those media combined. Intel ads, on the other hand, I saw all the time.

Anyway, I've seen people, even here in the TechSpot comments, arguing in defense of AMD's current pricing strategy, claiming the company isn't about to "make the same mistake" it did during the 2000s (trying to one-up Intel and Nvidia on price). The reasoning is that this strategy left AMD in a bad spot for most of the 2010s, lacking the R&D funds to develop better products. That makes sense, but even if it's true, AMD must have something to pull both enthusiasts and the general public away from Intel, and without pricing they have nothing if they intend to follow Intel's and Nvidia's current price/performance as a standard.
 
What you posted is the opinion of someone who hasn't been around long enough to really know how this industry normally works, the way that it has managed to work for decades. ...
Nvidia, Intel and AMD are all just big American tech corporations that don't give a damn about you and only want your money. The only reason AMD (or any of them) has ever offered good value for money is that they couldn't sell you a product faster than the other companies' products; they didn't do it out of kindness.

These companies are all as bad as each other, and I pity anyone dumb enough to believe one is “screwing you” any more or less than another. It's just that one is able to get away with screwing you more than the other, until the other company gets the better product and the tables turn.
 
What about the Athlon and Athlon XP, they absolutely destroyed Pentium 3 and Pentium 4 while being much cheaper, anyone had to be insane or very misinformed to not go AMD those days! ...
Here's the reality of it: was the Athlon 64 FX worth paying more for? If you wanted the best performance at the time, absolutely! This is why I started off talking about market share and mindshare (funny how one is two words and the other is one, eh? LOL). Did the Athlon 64 FX have the performance that *****-slapped the Pentium 4? Of course it did, and all things being equal it should have dominated. But not all things were equal, and that's still true today.

This is about brand-power and AMD has never had it. They still don't have it based on their (lack of) market share and mindshare. Regardless of developments in the last three years with Ryzen, most people still own Intel machines and most people aren't enthusiasts who even know what AMD is. I've actually encountered people on TECH FORUMS who were worried that AMD CPUs and GPUs were somehow different, like buying a Mac. Never underestimate the stupidity of the average human.

One of AMD's biggest mistakes was putting the ATi brand into retirement. It's stupid because internally, the "Radeon Graphics Group" is referred to as ATi, and the legal name of the "Radeon Graphics Group" is still "ATi Technologies". AMD underestimated how much people find familiar brand names reassuring when they're set to spend a considerable amount of money. People knew who and what ATi graphics were because ATi had been the video card market leader for decades, but nobody knew who or what AMD graphics were.

I actually have a really good analogy for this. If you're looking to get a new washer and dryer and are looking to spend $1,000 or more, you're probably going to want a brand-name that you recognise, right? If you see brands like Whirlpool, Bosch, Maytag, Samsung, Kenmore or LG, you're probably not going to question the quality because you recognise those names. You'll probably be willing to pay the higher price for them even though you may know nothing about washers or dryers except for what they do and how to operate them.

What you may not know is that Whirlpool owns Maytag and Sears just sells re-badged Whirlpools under the Kenmore name. Despite this, you probably wouldn't be willing to pay as much for a Kenmore as you would for a Maytag or Whirlpool even though the machines themselves may be identical. That's because Maytag and Whirlpool have had brilliantly-made marketing campaigns over the years and having machines with that brand just feels better than machines branded Kenmore, or even other Whirlpool-owned brands like Inglis or Amana.

When you're not familiar with the industry or the market, right or wrong, BRAND MATTERS, because people are naturally reassured by the familiar (even if those Michelin X-Ice tires cost an extra $100 just because they say Michelin). Since there's nothing more complicated than a computer, Intel and nVidia's brand power ensures that nobody would be willing to pay more for the "off-brand" AMD. This is where AMD is screwing themselves: they must first cease to be the "off-brand" before they can command a price premium like they're trying to.

Now, you and I know the tech industry, so we see it through different eyes. I used the washer and dryer example because that's something completely different where we do think about the brand we're buying. I think that's the best way for us to understand the people who aren't computer geeks like us. :D
 
What you posted is the opinion of someone who hasn't been around long enough to really know how this industry normally works, the way that it has managed to work for decades. ...

You really rocked that guy and his bad opinion.
 
It's stupid because internally, the "Radeon Graphics Group" is referred to as ATi and the legal name of the "Radeon Graphics Group" is still "ATi Technologies".
Not quite - ATi Technologies ULC in Canada is a division within the Radeon Technologies Group, doing ASIC and layout design. Any unique work/research done by them is patented under AMD, not ATi.
 
Well said, I remember getting an ATI 4850 in 2008 and being blown away with how much of an upgrade it was compared to my 8600GT which cost around the same price with only 18 months between them. Literally GPU tech was doubling every few years and like you said remaining similar priced through out.
I remember those days as well. I worked at Tiger Direct at the time and was privy to all of the horrible things that Intel and nVidia did to screw people over. When I saw the HD 4870 matching or exceeding the GTX 260, I got an XFX model (I liked the ATi goalie mask logo throwback to the Radeon 9600 Pro). I bought it because, as a Tiger Direct employee, I got stuff at cost. The card was like $300 but I paid about $150 for it. At the time, USD and CAD were either at par or the Canadian dollar was higher. What a deal that was!
I feel the market has stagnated since 2016 and I'm just shocked at how much people still want for four-year-old graphics cards, especially the GeForce cards: people want £120 for a 1060 or £180+ for a 1070. It's almost as if Nvidia products are being treated like iPhones.
It's nVidia's brand power. When people are spending a decent chunk of money, they become risk-averse to a fault and only want what's familiar. Intel has been the same way with their CPUs not going down in price despite being several generations old. Most people are not very knowledgeable about things and are willing to pay a good chunk of money extra for that peace of mind that they get from a tire that says "Michelin" or "Bridgestone", a washing machine that says "Maytag" or "Whirlpool" and for tech items that say "Intel", "nVidia", or "ASUS". That's just human nature and since most people aren't intelligent enough to research what they're buying, they rely on a brand that has done well for them in the past or is recommended by people they know and trust.
I bought a used RX 470 4GB for £65 last year after my 7870 XT 2GB died following six years of use (recommended by TechSpot in 2013 as the best-value mid-range 1080p card). I bought used for the first time because I wanted to wait and see whether the next generation of cards would be better than the PS5/XSX, and so far I'm disappointed. I'm really tempted to just buy a console and call it a day for PC gaming.
You want to know something funny? I bought a pair of PowerColor Radeon HD 7870 XTs from NCIX back in the day. I remember thinking how stupid AMD was for calling it the HD 7870 XT instead of the HD 7930 (since it had a Tahiti GPU like the HD 79xx cards). The problem was that they crashed hard, and immediately, just from me enabling Crossfire. We tried three sets of cards and none of them behaved any differently. AMD had offered their "Never Settle" bundle and cut the price of the HD 7970 because they wanted to replace it with the R9 280X (the re-badged HD 7970).

I ended up getting two Gigabyte Radeon HD 7970 Windforce cards with six AAA games (Saints Row IV, Hitman: Absolution, Deus Ex: Human Revolution, Far Cry 3, Tomb Raider and Sleeping Dogs) for $600 CAD. Six AAA games (and we know they're all awesome) and two HD 7970s for $600 CAD? That's the kind of video card pricing that makes the PC gaming industry grow. Also, the HD 7970s Crossfired without issue.

Where are the deals like that today? The last time I saw a good card selling for ~$300 CAD was the Sapphire Radeon R9 Fury Nitro+ OC Edition (what a mouthful!) that somehow popped up on Newegg in the middle of the mining craze. It was a far better upgrade from the HD 7970 than the RX 580 was, and for less money (a LOT cheaper, too!).

That was four years ago... and nothing since. I tell ya, nVidia and AMD will cause the PC gaming market to collapse because of their greed and the stupidity of the average consumer. It's that stupidity that lets them get away with it.
 
Not quite - ATi Technologies ULC in Canada is a division within the Radeon Technologies Group, doing ASIC and layout design. Any unique work/research done by them is patented under AMD, not ATi.
That doesn't change the fact that their legal name is "ATi Technologies". Trust me, I live about a half-hour's drive away from the Markham location (ATi's former head office) by car and I personally know a few people who work there.

Sure, ATi is owned by AMD, but the "ATi Technologies" name was neither dissolved nor ignored. It's like how NAPA in Canada is legally called "UAP Inc." instead of "NAPA Canada". On AMD's website, the Canadian locations are both called "ATi Technologies ULC" while ALL of their other locations are called either "Advanced Micro Devices" or simply "AMD (insert location here)". Why do you think only those two are called something different? Because they ARE something different.
And of course the patents all go to AMD, AMD does still own them. I don't ever remember saying otherwise. I said that they are considered a unique entity called "ATi" under the AMD umbrella, and they are.

Remember, when AMD bought ATi, they adopted ATi's red colour, not the other way around! :D
 
That doesn't change the fact that their legal name is "ATi Technologies".
Is there evidence for the Radeon Technologies Group using ATi Technologies as their legal name?

I know people who work in RTG too, but none of them refer to it as ATi. That said, none of them are based in the Canadian offices, which may be why they don't.
 
You really rocked that guy and his bad opinion.
Well, someone had to. He had no idea what he was talking about and he was being arrogant to someone who did. Since I'm old and have been building PCs since the 80s (and have therefore seen it all) I figured I was qualified. I actually had to try pretty hard to mind my manners because there's nothing more annoying than a "know-nothing know-it-all".

I think I was as civil as could be considering that I was basically informing him that not only was he wrong, but that he was being an arrogant anus. I mean come on, he actually said "common sense" to Zack when Zack was 100% right. That's insulting Zack's intelligence when Zack was clearly the more intelligent of the two. People like that generally need the truth pounded into their skulls with evidence backing up everything that anyone can vet (like telling them to google the MSRPs). If you try to argue against something like that, the rest of the community realises that you're a know-nothing troll and either blocks you or just ignores every post that you make.

On the other hand, if you take your licks, admit that you were wrong and apologise, that shows tremendous strength of character. Someone who does that earns my respect immediately because I know that it's hard to do. I've publicly eaten my share of crow as well, but I learned from it to always be 100% certain of what I'm about to say.

I check my facts thoroughly before making posts. It's the best habit to have because nothing I say is BS and therefore, I can never lose an argument since I always have proof of what I say. It's really not that hard and it ensures that my contributions to the discussion, while not always positive, will always be helpful. That big post is a perfect case in point. Can you imagine how great forums could be if everyone just got their facts straight before posting made-up garbage? :D
 
Well, someone had to. He had no idea what he was talking about and he was being arrogant to someone who did. Since I'm old and have been building PCs since the 80s (and have therefore seen it all) I figured I was qualified. I actually had to try pretty hard to mind my manners because there's nothing more annoying than a "know-nothing know-it-all".

I think I was as civil as could be considering that I was basically informing him that not only was he wrong, but that he was being an arrogant anus. I mean come on, he actually said "common sense" to Zack when Zack was 100% right. That's insulting Zack's intelligence when Zack was clearly the more intelligent of the two. People like that generally need the truth pounded into their skulls with evidence backing up everything that anyone can vet (like telling them to google the MSRPs). If you try to argue against something like that, the rest of the community realises that you're a know-nothing troll and either blocks you or just ignores every post that you make.

On the other hand, if you take your licks, admit that you were wrong and apologise, that shows tremendous strength of character. Someone who does that earns my respect immediately because I know that it's hard to do. I've publicly eaten my share of crow as well, but I learned from it to always be 100% certain of what I'm about to say.

I check my facts thoroughly before making posts. It's the best habit to have because nothing I say is BS and therefore, I can never lose an argument since I always have proof of what I say. It's really not that hard and it ensures that my contributions to the discussion, while not always positive, will always be helpful. That big post is a perfect case in point. Can you imagine how great forums could be if everyone just got their facts straight before posting made-up garbage? :D

With all due respect mate, you’re coming across as extremely arrogant and ignorant, especially as your posts have been riddled with really, really basic errors.

Do you want me to list them? Besides the fact that the GTX 880 never existed... I'm not quite sure how you supposedly googled the MSRP of a product that never existed.

But to shatter your whole thesis, notice how nVidia started jacking up their prices right as AMD stopped competing in the high end?

You talk about brand image etc, but you don’t seem to get the concept of a “halo product”. The idea that just having an amazing top tier product helps sell the lower end. There’s a reason people buy awful overpriced low-end Mercedes cars...

Stop acting like everyone is wrong and you’re the almighty. Your posts get several basic concepts and just straight up facts wrong. Check yourself.
 
With all due respect mate, you’re coming across as extremely arrogant and ignorant, especially as your posts have been riddled with really, really basic errors.

The post was made by someone who had only been on TechSpot for two days... just putting that out there. I mean, people are allowed to be wrong, as I sit here with my four-year-old GTX 1080 and old i7, lol.

Pick your battles, mate. I mean, "common sense"... worst argument ever. Common sense wasn't even common in Thomas Paine's time... *cough* Calvinism. For as Aristotle said, all the virtues will be present when the one virtue, practical wisdom, is present. The only things truly common are death, taxes and (sexy game) hysteria!

So let's keep this constructive and agree to stop buying ridiculously priced new cards until we see something actually worth paying for. Think of it like a MeToo movement for NVIDIA... lol. What's everyone's opinion on the RTX 3060 Ti, assuming it's available?

From what I can tell, over the last couple of years monitors are finally, after almost a decade, improving in more substantial ways at the same price point. How do you think that market compares to GPUs? My personal observation is that it's actually comparable; it's just unfortunate NVIDIA is not giving us more VRAM on cards like the 3070 compared to the four-year-old 1070. Is there not an argument to be made that Pascal's improvements were simply too good for the mainstream gamer at 1080p?

I think there's a good reason, for instance, that people still drive older cars when they're reliable, comfy, cheap and easy to repair, with AC and cruise control... what more do you want, really? More electronics with contacts that fray, and gears and double clutches and components for minor efficiencies that die as soon as the warranty is up? It just doesn't add that much more enjoyment; some people even prefer manuals. I still think the 1989 Nissan Z/Fairlady is one of the most badass cars to drive, and it had all the above comforts to boot. Unless you're made of money, I suppose, and can afford Porsches and Teslas.
 
I'm a moderate AMD fan, but it seems Nvidia's mid-range RTX 3060 Ti is just the best-buy option, with the x060 series sweeping the competition again. And by "competition" I don't mean just AMD, but even some of Nvidia's own cards, such as the RTX 3070. Over the long period of my computer enthusiasm I've had several Nvidia and AMD (or ATI) graphics cards, and I was mostly satisfied with all of them, but I'm pretty sure the 3060 Ti will be my next gaming renderer. Unless AMD pulls off a miracle.
 
Is there evidence for the Radeon Technologies Group using ATi Technologies as their legal name?
He's nearly correct. When AMD (through their own Canadian ULC) acquired ATI, Inc., they formed an unlimited liability corporation known as ATI Technologies ULC, which is a wholly-owned foreign subsidiary of AMD. You can see it (along with many other subsidiaries they use for similar tax, liability, and related purposes) on their SEC filings.
 
I check my facts thoroughly before making posts.

Evidently not thoroughly enough, since you got the launch date of the GTX 980 wrong (It was September 2014, not June 2015) and included a GTX 880 (which doesn't exist). Also you said the price of the RTX 2080 was $50 higher than the 1080 when it was actually $100.
 