AMD's market share continues to collapse, now sitting at a troubling 18 percent

Really seems like bad timing to release a comparison. They just came out with their lineup, what, a week ago?
 
Re-branding cards from 2012 for 2015 was a pretty big problem; adding more memory to an already aged GPU is not the way to win back your customers, it only pushes them away in search of something newer and actually better. AMD has been slowly killing itself off in both its GPU and CPU divisions. Anyone who knows anything will be building with an Intel CPU and Nvidia GPU right now, unless you're on a budget, and even that's not much of an excuse. Go ahead and cry, AMD fanboys: my first-gen i7 is still faster than anything AMD has released in the last 6 years. GPU-wise I was a supporter of the ATI side of things, and my 5870s served me long and well, but I have been unimpressed with every GPU AMD has released since, with the exception of Fury, and that's too little too late and too niche a market, nor does it make logical sense when held up against Maxwell's efficiency. As nice as it is to have two companies competing to keep the market fair, that is only worth anything when the two companies can actually compete with each other. For AMD to compete in either the CPU or GPU division they'll need to kill off the other, and right now both are dismal failures in my opinion.
Go ahead and cry, AMD fanboys? You should be the one crying: if AMD collapses in the GPU sector, the GPU market will go the way of CPUs, with no more performance improvements because there will be no competition! Nvidia will just focus on power efficiency because mobile is where the money's at. I've only owned Nvidia GPUs since the 8800 GT, but even I know that if there's no competition, we're all screwed.
 
I'm going to repeat the observation another guy made in the Forbes article: the Fury X was released just 6 days before the end of the quarter, with scarce availability.

Let's look at AMD's console chip market share...
That's actually a very good point, especially since console gaming outnumbers PC gaming (unfortunately). Let's just hope AMD's next generation of CPUs is as good as they've been hyping it up to be. I miss the days of the Athlon X2s wiping the floor with Pentium 4s. Ever since Conroe it's been downhill for AMD. I'd love to see the red team make a comeback, but until then I'm sticking with Intel and Nvidia.

What? Last time I heard, PC gamers had surpassed 700 million. All the last-gen consoles sum up to barely 40 million.
700 million, and 80% of them are using integrated graphics.
 
Go ahead and cry, AMD fanboys? You should be the one crying: if AMD collapses in the GPU sector, the GPU market will go the way of CPUs, with no more performance improvements because there will be no competition! Nvidia will just focus on power efficiency because mobile is where the money's at. I've only owned Nvidia GPUs since the 8800 GT, but even I know that if there's no competition, we're all screwed.

Your argument is that NVIDIA will abandon the desktop sector because there's no competition? So I guess they will stop improving their products, giving people no reason to replace their older cards?

That's a tired argument that doesn't hold water. AMD hasn't been competitive in the GPU or CPU market except on price for a while, yet Intel and NVIDIA continue to develop new processes and fabrication, investing billions in some cases. It's true mobile has become a focus, but there's nothing to fear there either: the profits from the mobile sector fund the improvements in the smaller desktop performance sector.

If your concern were increased margins and prices in the desktop sector without competition, that might be valid, but the way I see it NVIDIA has been setting the price of graphics cards in the market for a while now. When they release a line, AMD responds with price cuts. There have been a few times where NVIDIA has had to respond, but not nearly as often as the other way around. Worrying that 950-level performance will jump to Titan prices, though, is silly; people will not tolerate gouging.
 
....[ ].....That's a tired argument that doesn't hold water. AMD hasn't been competitive in the GPU or CPU market except on price for a while, yet Intel and NVIDIA continue to develop new processes and fabrication, investing billions in some cases. It's true mobile has become a focus, but there's nothing to fear there either: the profits from the mobile sector fund the improvements in the smaller desktop performance sector.
Well, maybe, but your extension throws good money after bad. Why throw profits into an unprofitable sector of your business when you could make shareholders happy instead? Arguably, "it's not the size of the dog in the fight, but the size of the fight in the dog". Any CEO can get wrapped up in a competitive size war as easily as anyone else. If company "X" goes forward on the sole premise that "we've got to be bigger than company "Y"", there's trouble brewing for them.

If, as seems to be happening, the desktop market flattens out, I'd wager the server market never will. Intel's Xeon pretty much owns that. Where is AMD in all this?

If your concern were increased margins and prices in the desktop sector without competition, that might be valid, but the way I see it NVIDIA has been setting the price of graphics cards in the market for a while now. When they release a line, AMD responds with price cuts.
At some point, people simply run out of money, or need the money they have elsewhere. I have seen businesses raise prices when they were failing in the past. They raise prices to cover their debt, and it only hastens their demise.

There have been a few times where NVIDIA has had to respond, but not nearly as often as the other way around. Worrying that 950-level performance will jump to Titan prices, though, is silly; people will not tolerate gouging.
I beg your pardon, they certainly will. Very often, "the first kid on the block to have the next shiny new tech toy" feels honored and obligated to pay too much for "the next big thing".

Plus, if there's only one company selling a particular product, how could you know, or in any way be able to prove, whether you're being gouged or not? :confused:

The reverse of this is exactly what happened in my state. Independent companies were allowed to "sell" electricity, so you were no longer tied to your local utility for both supply and commodity. The first big cold snap, people who changed their electricity suppliers, lured away by bottomed-out "variable rates", got the shock of their lives when their January electric bills went from about $300.00 under the mean, nasty old utility to about $800.00 under the new "friendly suppliers". It seems it costs almost three times more to generate electricity in January. Right.
 
Well, maybe, but your extension throws good money after bad. Why throw profits into an unprofitable sector of your business when you could make shareholders happy instead? Arguably, "it's not the size of the dog in the fight, but the size of the fight in the dog". Any CEO can get wrapped up in a competitive size war as easily as anyone else. If company "X" goes forward on the sole premise that "we've got to be bigger than company "Y"", there's trouble brewing for them.

If, as seems to be happening, the desktop market flattens out, I'd wager the server market never will. Intel's Xeon pretty much owns that. Where is AMD in all this?

At some point, people simply run out of money, or need the money they have elsewhere. I have seen businesses raise prices when they were failing in the past. They raise prices to cover their debt, and it only hastens their demise.

I beg your pardon, they certainly will. Very often, "the first kid on the block to have the next shiny new tech toy" feels honored and obligated to pay too much for "the next big thing".

Plus, if there's only one company selling a particular product, how could you know, or in any way be able to prove, whether you're being gouged or not? :confused:

The reverse of this is exactly what happened in my state. Independent companies were allowed to "sell" electricity, so you were no longer tied to your local utility for both supply and commodity. The first big cold snap, people who changed their electricity suppliers, lured away by bottomed-out "variable rates", got the shock of their lives when their January electric bills went from about $300.00 under the mean, nasty old utility to about $800.00 under the new "friendly suppliers". It seems it costs almost three times more to generate electricity in January. Right.

Throwing out platitudes that have little relevance seems misplaced. Desktop graphics cards are profitable for NVIDIA, or the board would have the company moving in a new direction instead of investing billions in shrinking the fab process.

It's very easy to determine performance increases without competition from AMD. Sites like AnandTech, Tom's Hardware, HardOCP, etc. measure previous-generation hardware against current-gen, showing the gains regardless of manufacturer. If the gains aren't there, it's obvious.

I honestly cannot decode the rest - I highly doubt Intel and Nvidia are worried about more than improving performance and efficiency in order to get customers to upgrade aging, power-hungry hardware. The pool of 82% is much larger than the customers they could take from the competition's 18%.
 
Throwing out platitudes that have little relevance seems misplaced. Desktop graphics cards are profitable for NVIDIA, or the board would have the company moving in a new direction instead of investing billions in shrinking the fab process.
Since Nvidia's market share is well established, I'm uncertain how you concluded I was talking about that company. My observations were more along the lines of AMD's dilemma. As Intel gains more and more traction in the market, AMD's position becomes less tenable, in that the money simply isn't going to be there for investment.

It's very easy to determine performance increases without competition from AMD. Sites like AnandTech, Tom's Hardware, HardOCP, etc. measure previous-generation hardware against current-gen, showing the gains regardless of manufacturer. If the gains aren't there, it's obvious.
So you're saying websites do product testing. Wow, is that ever deep.

I honestly cannot decode the rest - I highly doubt Intel and Nvidia are worried about more than improving performance and efficiency in order to get customers to upgrade aging, power-hungry hardware. The pool of 82% is much larger than the customers they could take from the competition's 18%.
Well, when you replace a fifty-cent 100-watt incandescent light bulb with a ten-dollar LED bulb, it takes quite a while to get your money back. Besides, even with shrinking die sizes, Intel's new CPUs have TDP ratings as high as, or higher than, their previous generation. (I must have read one of those tests you were talking about. Dunno how that happened.)
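To put rough numbers on that bulb payback (a back-of-the-envelope sketch; the electricity price, wattages, and daily usage below are my own illustrative assumptions, not figures anyone in this thread gave):

[CODE]
# Rough LED-vs-incandescent payback estimate; every input here is an assumption.
PRICE_PER_KWH = 0.12                # assumed electricity price, $/kWh
HOURS_PER_DAY = 2                   # assumed daily use of the bulb
EXTRA_UPFRONT_COST = 10.00 - 0.50   # LED price minus incandescent price, $
WATTS_SAVED = 100 - 14              # 100 W incandescent swapped for a ~14 W LED

kwh_saved_per_day = WATTS_SAVED / 1000 * HOURS_PER_DAY
dollars_saved_per_day = kwh_saved_per_day * PRICE_PER_KWH
payback_days = EXTRA_UPFRONT_COST / dollars_saved_per_day

print(f"Payback: about {payback_days:.0f} days (~{payback_days / 365:.1f} years)")
[/CODE]

Under those assumptions it works out to a bit over a year; lighter use stretches it to several years, heavy use shrinks it to months. The point is only that the payback depends entirely on the usage you assume.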

So, it's not really like the gaming community is fixated on energy savings either, or those behemoth 1000 watt PSUs would be off the market.

I'll try to explain my earlier statements in a way more easily accessible to you. A lack of competition doesn't necessarily guarantee price gouging, nor does a competitive marketplace necessarily prevent it. "Whatever the traffic will bear" is a time-honored cornerstone of capitalism.

I'm a cat fancier too, BTW. Your avatar makes me wonder if you're "Bobcat V 2.0". And no, I have no intention of explaining that.
 
...[ ]...After visiting the link, I think "V 2.0" will be obvious.
Oh well, at least we've hopped genus. "Bobcat" = Lynx rufus, whereas "ocelot" = Leopardus pardalis.

And this is just a story I associate with this particular cat; it's completely out of context here.(*) The ocelot was the "once upon a time" felid pet of choice of young, newly minted starlets. Of course this was in the '60s, before the animal became a protected species. It usually provoked a rash of paparazzi photos, as one of these "babes" would be walking the cat with a leash and rhinestone collar while she was wearing a leopard-skin coat, or some other freshly killed animal....

And I just researched this, as I didn't want to leave the story unsubstantiated, but even the famous (or is it infamous?) artist Salvador Dali kept a pet ocelot!

(*) (He says that, but does he mean it?)


[Attached image: 13.jpg]


That's ^^^^ a real leopard, Panthera pardus. Take note of her coat there, kitty; you could be next.. :eek:
 
Since Nvidia's market share is well established, I'm uncertain how you concluded I was talking about that company. My observations were more along the lines of AMD's dilemma. As Intel gains more and more traction in the market, AMD's position becomes less tenable, in that the money simply isn't going to be there for investment.

So you're saying websites do product testing. Wow, is that ever deep.

Well, when you replace a fifty-cent 100-watt incandescent light bulb with a ten-dollar LED bulb, it takes quite a while to get your money back. Besides, even with shrinking die sizes, Intel's new CPUs have TDP ratings as high as, or higher than, their previous generation. (I must have read one of those tests you were talking about. Dunno how that happened.)

So, it's not really like the gaming community is fixated on energy savings either, or those behemoth 1000 watt PSUs would be off the market.

I'll try to explain my earlier statements in a way more easily accessible to you. A lack of competition doesn't necessarily guarantee price gouging, nor does a competitive marketplace necessarily prevent it. "Whatever the traffic will bear" is a time-honored cornerstone of capitalism.

I'm a cat fancier too, BTW. Your avatar makes me wonder if you're "Bobcat V 2.0". And no, I have no intention of explaining that.

Try being clearer next time. A few criticisms -

Efficiency in PSUs is not determined by power output - a 1500 watt PSU can be more efficient than a 300 watt PSU. It's wasteful to buy one if all that power is not needed.
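To make that concrete (a minimal sketch; the load and draw figures are hypothetical, chosen only to show that rated capacity never enters the efficiency calculation):

[CODE]
# Efficiency is simply DC power delivered divided by AC power drawn from the wall.
# The unit's rated capacity (300 W, 1500 W, whatever) doesn't appear in that ratio.
def efficiency(dc_load_watts: float, ac_draw_watts: float) -> float:
    return dc_load_watts / ac_draw_watts

# Two hypothetical units feeding the same 250 W system (made-up numbers):
big_1500w = efficiency(dc_load_watts=250, ac_draw_watts=272)    # ~92%, loafing along
small_300w = efficiency(dc_load_watts=250, ac_draw_watts=305)   # ~82%, near its limit

print(f"1500 W unit: {big_1500w:.0%}  vs  300 W unit: {small_300w:.0%}")
[/CODE]

It can just as easily go the other way at very light loads, which is the wasteful part: a huge unit idling at a few percent of its rating tends to be less efficient than a right-sized one.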

Power efficiency usually leads to less heat in computer components, and less heat to deal with usually leads to quieter, longer-lasting rigs. For people with disposable income who build PCs it's a very valid concern, especially when the difference in price is not that great.

I never stated that a lack of competition will drive gouging; I was saying it could be argued to be a concern. I don't consider it to be one, as there are too many independent outlets making their money reviewing hardware performance for anyone to pull that off.

I see another platitude, I guess that's your shtick.

As to the Bobcat - an ocelot is a small wild cat that can resemble a dwarf leopard; it's also in character names in the Metal Gear series. Rex is the Latin word for king. That makes me king of the ocelots and unaware of whoever you are referring to.
 
Let's look at AMD's console chip market share...
That's a good point, but it will be interesting to see where the next-gen consoles go. Intel's CPUs are already well placed, and if they can keep making headway with their integrated GPUs, AMD could be in a spot of bother there next gen.

They are getting steamrolled in the discrete GPU and CPU markets. Personally, I think AMD having strong GPU tech will help them maintain their APU market share.
 
Try being clearer next time. A few criticisms -
Spare me.

Efficiency in PSUs is not determined by power output - a 1500 watt PSU can be more efficient than a 300 watt PSU. It's wasteful to buy one if all that power is not needed.
And who exactly was talking in terms of PSU efficiency? Certainly not me.

Power efficiency usually leads to less heat in computer components, and less heat to deal with usually leads to quieter, longer-lasting rigs. For people with disposable income who build PCs it's a very valid concern, especially when the difference in price is not that great.
You actually think you know it all, but in reality you're either stating the patently obvious or reciting by rote. Me, if I were all wrapped up in a heat issue, I'd upgrade to water cooling and/or hang a couple more fans on the junk. That's in lieu of dropping a couple of grand on a nil-improvement type of transaction.

I see another platitude, I guess that's your shtick.
I don't know about "my shtick", but "platitude" seems to be "your only big word".

As to the Bobcat - an ocelot is a small wild cat that can resemble a dwarf leopard; it's also in character names in the Metal Gear series. Rex is the Latin word for king. That makes me king of the ocelots and unaware of whoever you are referring to.
No, it makes you another clown passing through with paranoid delusions of cat-hood.

Since you were unwilling or unable to follow the link Mr. Cooley provided to "Bobcat", I'll thoughtfully post it for you again: https://www.techspot.com/community/members/bobcat.191980/

Do I need to explain who "Salvador Dali" was, while I'm here?

(I referred to him in my previous post also.) A little criticism here: be more specific.

And I really am sorry you couldn't pick up, in my other post, on the analogy between free-market electricity suppliers gouging in an open market and computer part suppliers, or suppliers of whatever, doing the same.
 
Ocelot, don't you love when people post stuff about you without any evidence to back it up? Don't worry about him, check his other posts to see this is just his MO... (watch him slam me later :))

The article, on the other hand, was simply trying to say that AMD's market share has declined... drastically... and while there will still certainly be innovation on the GPU side, AMD or no AMD, I think there is definitely the threat of Nvidia not stressing quite as much over it if they lack competition...

You can look at the Intel side and see some worrying corroboration... When AMD was true competition years back, you saw CPUs improving fairly rapidly... now that they are no longer really competing (at least at the high end), you see fairly modest improvements, mainly in power use...

COULD Intel do better? We can never prove anything either way... but I'd suspect that if AMD was marketing something on par with the 5960X right now, Intel might have tried to make Skylake and Broadwell a bit faster...
 
Ocelot, don't you love when people post stuff about you without any evidence to back it up? Don't worry about him, check his other posts to see this is just his MO... (watch him slam me later :))
Oh, stand back, "big squid" is here.

The biggest problem you have is confusing your own opinion with fact. I freely admit my content is heavily editorialized, and I have absolutely no intention of changing it to patronize the "troll of the month club", of which you seem to have summarily appointed yourself president.

And BTW, are you really as technically knowledge-impaired as this post would indicate?
Yeah... but unless the Feds come in mid-watch... the file would be gone from your HD by the time you were raided... do browser history records constitute enough proof to convict?
Temporary files are removed the same way standard files are deleted by Windows: the header is removed, but the information remains and can be retrieved. Now, I expect if you quick-stream another pirated movie and your computer decides to use the same space on the drive, then the file would not be retrievable. But then you have residual files of the new movie to contend with. If you catch my drift, you have to overwrite a file with as much or more data than is extant on the drive from the previous file.
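For what it's worth, here's a minimal sketch of that overwrite-before-delete idea in Python (illustrative only, and assuming a single zero-fill pass makes the point; SSDs with wear leveling and copy-on-write filesystems don't behave this simply):

[CODE]
import os

def overwrite_then_delete(path: str, chunk_size: int = 1024 * 1024) -> None:
    """Overwrite a file's bytes with zeros, flush to disk, then unlink it.

    Plain deletion only drops the directory entry; the old data stays on the
    drive until something else happens to reuse that space.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        remaining = size
        while remaining > 0:
            step = min(chunk_size, remaining)
            f.write(b"\0" * step)
            remaining -= step
        f.flush()
        os.fsync(f.fileno())   # make sure the overwrite actually hits the drive
    os.remove(path)
[/CODE]

Usage would just be overwrite_then_delete("some_temp_file.bin") - the file name there is obviously hypothetical.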

Then there's always the "private browsing" setting to eliminate browser history, but I suspect those files are treated the same way.

You're welcome.
 
Um... wrong thread buddy... had you read the previous posts, you might have understood.... but then again, why bother... mods, can you delete both of these?
 
Um... wrong thread buddy... had you read the previous posts, you might have understood.... but then again, why bother... mods, can you delete both of these?
I think even in a legal sense, when someone such as yourself comes blasting in, running their mouth about "cranky's bad prior acts", it's altogether fitting that that person get confronted with their own prior stupidity. "You opened the door", as they say on those TV courtroom dramas.

So, cry to the mods, because you don't want that brought to light, and because you don't want to admit you're not doing very well at "handling me".
 
You can look at the Intel side and see some worrying corroboration... When AMD was true competition years back, you saw CPUs improving fairly rapidly... now that they are no longer really competing (at least at the high end), you see fairly modest improvements, mainly in power use...

COULD Intel do better? We can never prove anything either way... but I'd suspect that if AMD was marketing something on par with the 5960X right now, Intel might have tried to make Skylake and Broadwell a bit faster...
The problem Intel is having is that they, unlike yourself, have realized that "Moore's Law" wasn't handed down on stone tablets.

There is a point at which our ability to fashion fabrication equipment has to fail, simply by virtue of the pathway sizes involved. You even run into problems where molecular sizes become a constraint.

Now, I've said this before: you, catboy, and a gross of your friends here couldn't walk into a fab and walk out with a working 386, and every bank on the planet likely has the sense not to give you the money to try.

As for the rest of your post here, it consists of a bunch of "well, maybe", "we'll have to see", "they should", and so forth. It's not even as good as an opinion, just worthless blather...
 
Tom's readers/posters are just as biased as everyone else. The test was not rigged; G-Sync is superior, and there are many tests comparing the two that support these results.
It was also available sooner.

I agree that G-Sync is better right now. I don't think Tom's readers being biased invalidates the point that Tom's did not conduct the test in the best manner. FreeSync should be the better tech down the road.
 
AHAHAHAHAHAHAH samsung is such crap it would be the crappiest crap company making crap ever. Some decent high end phones but that doesn't help you make better on gpu market exept if you want to make mobilephone gpu.
You know nothing, Jon Snow!
OK, but @deemon, you have to admit: with his syntactic abandon, Mr. Alabama has cornered most of the market here on "rustic charm", regardless of content.
 
I think some people miss the whole point about competition.

Sure, GPU capabilities will continue to grow even if Nvidia is the only game in town. HOWEVER, competition is what keeps prices low. If there were nothing to compete with, the 980 Ti wouldn't be priced where it is. Sure, it would still exist, but it might cost 1.5-2x as much, just because they can. Everyone would be buying more budget-minded cards, and that $100-200 price range would jump up to $200-400 for the latest flavors. THAT is what competition brings: a competitive marketplace with competitive prices.

Right now Nvidia has found the sweet spot to undercut AMD for profitability and is taking customers from them (among other reasons). I myself still run a 690, looking to a pair of 980s. I used to run AMDs. To me it's just raw performance, but I'm one of the few percent, and I still wait a while. But I'd be looking at paying significantly more if there weren't competition.
 
I think some people miss the whole point about competition.

Sure, GPU capabilities will continue to grow even if Nvidia is the only game in town. HOWEVER, competition is what keeps prices low. If there were nothing to compete with, the 980 Ti wouldn't be priced where it is. Sure, it would still exist, but it might cost 1.5-2x as much, just because they can. Everyone would be buying more budget-minded cards, and that $100-200 price range would jump up to $200-400 for the latest flavors. THAT is what competition brings: a competitive marketplace with competitive prices.
You're missing a very important point. Price gouging happens every day in an "open market"; its given name is Apple. Apple turns in obscene profit margins, given its scale. The reasons for that are simple: people allow them to do it, and they've found a way to circumvent competition by way of human stupidity and vanity. Their advertising is brightly colored, musically annoying, and prevails on mental defectives of all ages to simply buy Apple, for no other reason save the fact that Apple tells them to. To wit, Apple's latest iPhone pitch slogan:

"If it's not an iPhone, it's not an iPhone"

IMO, it doesn't get much dumber, more insulting, or more inane than that. Yet, millions of people scamper to their wireless carrier, credit cards in hand, begging to be ripped off for the latest offering.

Right now Nvidia has found the sweet spot to undercut AMD for profitability and is taking customers from them (among other reasons). I myself still run a 690, looking to a pair of 980s. I used to run AMDs. To me it's just raw performance, but I'm one of the few percent, and I still wait a while. But I'd be looking at paying significantly more if there weren't competition.
No, you'd be looking at more money spent, because you're simply too addicted not to pay what they ask. If nobody bought the latest and greatest, it wouldn't matter what the latest mighty ripoff cost; it would sit on the store shelves until the price came down.

To put it more simply, addictions always cost more money than you have. It's why the government controls utilities (or at least attempts to do so).

I honestly couldn't care less what you would have to pay for the latest and greatest graphics card. Unless you're a digital animation or FX artist, it's something you should easily be able to live without.

Grow a pair, throw the tea in the harbor.
 