dividebyzero,
"Desktop yes. Mobile no. Performance per watt rules in mobile, and what is insignificant in desktop power usage tends to translate into some expensive heatsinking on a notebook."
It's not that simple. NV got 300 design wins in the mobile sector before a single GTX600 chip was sold to OEMs. The question is why AMD's team didn't get some of those design wins when they launched Tahiti as early as Dec 2011-Jan 2012. It sounds like execution is a key factor here too. AMD didn't actively promote or market its mobile HD7000 series; I bet its sales people did a lousy job trying to sell low-end HD7000 parts into notebooks. When NV won the Apple contract, for example, GTX600 was a spec on a roadmap, nowhere near close to launch.
Also, please explain why AMD suddenly took a huge decline in market share in Q3 but not in Q1-Q2. In fact, in Q2 AMD gained market share against NV, which throws your entire theory out of the window.
"AMD increased its market share to 40.3%, Nvidia's market share slipped but still retains a large majority at 59.3%. Nvidia got off to a slow start in Q2 and cited supply constraint as the main reasons for the decline"
http://www.techpowerup.com/171198/G...nally-Down-from-Last-Quarter-Reports-JPR.html
Sounds like NV simply executed far better in Q3 than in Q1-Q2, even though they were already shipping superior performance/watt GTX600 cards in Q2. Why didn't AMD lower prices to OEMs, knowing NV's supply issues would be resolved in Q3? All of this points to poor execution and strategy, not just an inferior performance/watt product. If NV was supply constrained, those OEMs needed more product than NV could deliver. AMD's management should have stepped in and stolen market share; instead, it looks like they fell asleep.
"Years of providing the halo product -from G80, G92, GT200, GF100 and GF110, and catering for the gamer user base are going to require AMD to execute for more than one top-of-the-heap generation, and a sizeable ongoing investment in Gaming Evolved."
I guess unlike most consumers who buy GPUs, I am not brand loyal, which is why I don't swallow AMD or NV marketing. Your entire post seems to justify why NV cards sell well from a brand/marketing point of view, and to argue that even if NV loses a generation, its strong marketing and brand save it. I don't disagree with that notion, but it also shows how many of NV's customers are like sheep (similar to Apple's). I've followed PC tech and forums for a while, and people who buy AMD/ATI cards often have no problem switching to NV and back to AMD/ATI and back to NV. But it is NV fans that keep waiting to give NV their $$ (this generation is a perfect example), or who downplay NV's underwhelming delivery because they "love" the brand.
-- When the HD7970 was 20% faster than the GTX580 for $100 more, it was called overpriced. Yet when the GTX680 lost the performance crown in June and cost more than 7970s, I rarely heard the same sentiment about NV's overpriced cards.
-- When NV took 6 months to roll out its sub-$300 lineup, it was magically forgiven for being late. In technology, being late generally means you are a follower, not a market leader. The only companies that get away with launching late and still selling, despite not necessarily having better products, are those with a huge number of brand fanboys who would buy anything that firm makes -- Honda, Toyota, Apple, etc. If AMD launched 6 months late, they might as well close up shop in the GPU race. NV launches 6-8 months late, no problem... a lot of its customers wait like sheep. Most people who buy AMD cards won't wait for AMD cards; they'll just buy NV.
-- Why does NV get a pass for removing voltage control from high-end after-market GTX670/680 cards? If AMD did that, enthusiasts would be extremely displeased. NV does it, and people defend NV because "they had to do it to reduce RMA costs", etc. Oh RLY?
-- Tons of double standards this generation. When the GTX400 series overclocked really well, performance/watt was never a consideration. A GTX460 @ 925MHz exceeded HD5870 power consumption, but it was respected for its amazing bang-for-the-buck. An HD7950 OCed to 1100MHz uses about as much power as a GTX680/HD7970, and yet its overclocking is now downplayed by NV users as "luck of the draw", etc.
-- Remember the Fermi generation? NV improved drivers by more than 10% over the 6-8 months following the March 2010 release. When they did this, many gamers said "WOW, NV's driver team is awesome, they extracted so much more performance out of a new architecture." When AMD does the same thing with GCN --> "AMD's driver team sucks! This level of performance should have been there on day 1," etc. Again, another double standard from NV fanboys. It's funny for me to read all this because when I got my 3 Fermi cards, I fully expected NV to add 10%+ over 6+ months, and I got it, knowing Fermi was a brand new architecture.
Most importantly, NV users never acknowledge that NV got taken to the cleaners this round. They keep pointing to excuses like market share, NV's profits, etc. You pointed out 4 consecutive generations of NV leadership since G80. The GTX680 is not a continuation of that, but I don't see many guys who bought NV the last 4 gens criticizing NV for under-delivering. Why? I wanted a 60-70% faster GTX580 and what I got was a 35% faster version instead. To get more performance, I had to go AMD, and I don't sit there defending NV. NV users stick by NV even when they lose. I guess more power to NV for nurturing such a sticky customer base.
"Unfortunately for AMD, that doesn't seem like changing anytime soon. Enduro still suffers from utilization issues- and of course, hybrid Crossfire doesn't work with GCN cards. The rebadging seems to have started early (HD 8550M in this case) -although like the rebadged GT7xx Nvidia cards, this is more down to the OEM's than AMD/Nvidia"
Agreed. The rebadging is really awful from both firms, lately worse from AMD in the mobile space. It sounds like a repeat of the HD7000 mobile launch: terrible execution/marketing.
"That's actually a subjective argument as you noted below. Nvidia sells at its price so the SKU's stay at that price. As a company Nvidia sells at higher ASP's for smaller dies, and lower performance. A $299 GTX 680 would basically cut AMD off at the knees- and given the yields and Q3 earnings call that would be eminently possible with the smaller die strategy."
Ya, but another way to look at it: NV ripped us off just as badly as AMD did. Where is the real GTX680? They couldn't deliver it. GK100? MIA. GK110? Impossible, as they are only now ramping up volumes of K20/K20X chips. NV couldn't release anything bigger than the GTX680 due to yield and wafer capacity issues, not because they purposely held it back for the next generation. They got away with selling a GTX660Ti successor for $500. Guess which company took the blame for overpricing GPUs this generation? AMD, not NV. Another double standard. Let's see: within 7 months of the $649 GTX280 launching, NV delivered the GTX285 for $350, and AMD launched a $259 HD4890 just 9 months after the GTX280. The GTX280 lost nearly $400 in value in 9 months. Talk about being ripped off. But instead people talk about how the HD7970 was a rip-off, despite it still being the fastest single GPU 10 months later.
"Another situation that won't change. Just wait for AAA titles to show up on graphics intensive game engines that support TXAA. [link] ...and I'm guessing that things should start getting quite nasty if Gaming Evolved titles don't support the feature"
I actually disagree with that article. TXAA looks very blurry even in those small pics of COD: BO2. TXAA is yet another NV marketing tactic. MSAA+FXAA provides superior texture quality; TXAA makes PC games look like console titles because it blurs the textures (see The Secret World as well). TXAA is even worse than the FXAA/MLAA filters, but I guess new gamers think MSAA is "old/antiquated" when in reality the new FXAA/MLAA/TXAA filters were created to let weak systems have anti-aliasing, which is especially useful for consoles -- see the PS3's blur-fested Black Ops 2 using NV's new filter. It doesn't look nice. Again, I can't see how you spin TXAA as some superior NV feature. Enthusiasts who buy $300+ GPUs go MSAA, or with higher-end GPUs, the downsampling route. The idea that you can get 8xMSAA+FXAA image quality with 4xTXAA is nonsense. You cannot, because TXAA blurs the image, defeating the purpose of AA and of buying a flagship GPU for PC gaming on a 2560x1440/1600 monitor. Who wants to buy a high-end GPU and then blur the entire screen?
"8800GTX, 8800 Ultra, GTX 280, GTX 285, GTX 480, and GTX 580 were all the single GPU performance kings of their generation with ATI/AMD only claiming the title when Nvidia were late to market."
The GTX480 was late to market by 6 months; the HD5870 had the market all to itself for 6 months. You just keep reinforcing the same story from NV buyers -- they always come up with excuses for why NV is allowed to be late, allowed to remove enthusiast features, and allowed to under-deliver in performance or performance/watt. They spin anything negative about NV into a positive. The GTX680 came out strong and then lost the performance crown; it's that simple. Why didn't NV release a 1267MHz GTX685? Why did NV not get criticized for failing to drop prices or add game bundles on GTX670/680 cards after AMD added 3 free games?
I agree with you completely that GK110 (or some GK200 variant) will retake the performance crown. But even if NV loses or overprices its cards, its loyal customers will still buy its products.
BTW,
"From the second quarter to the third, JPR says Nvidia's PC graphics shipments grew 28.3% in desktops and 12% in notebooks. Intel suffered 7% and 8.6% declines in those same markets, respectively, while AMD saw a 2% decline on the desktop and a 17% decline in notebooks."
http://techreport.com/news/23959/jpr-nvidia-gained-in-pc-graphics-last-quarter
So as you said correctly, AMD is getting hammered in the mobile sector.