TechSpot

JPR: Nvidia weathers turbulent PC market with 20% quarterly growth

By Matthew
Nov 26, 2012
  1. Jon Peddie Research (JPR) has published its third quarter statistics for the graphics processor market, showing weak seasonal performance with shipments down 1.45% quarter-over-quarter and 10.8% year-over-year. The underwhelming results were largely blamed on rocky PC shipments...

     
  2. dividebyzero

    dividebyzero trainee n00b Posts: 4,950   +731

    While the fanboys big up the GK104 vs Tahiti fight, GK106/107 stealthily slips the knife in.

    Q4 will be more telling. Q3 had Nvidia's new releases pitted against AMD refreshes; the next quarter will rely on graphics already extant in the channel - the crippled HD 7870 LE and pro GK110 notwithstanding. AMD's slide in the mobile arena seems worse than expected, given that any APU-powered notebook with switchable graphics counts as two graphics sales units.
     
  3. Blue Falcon

    Blue Falcon TS Enthusiast Posts: 156   +44

    The irony is that while GK104 (GTX670/680) are good competitors for the HD7970/7970 GHz, GK106/107 are complete turds. The GTX660Ti gets destroyed by the HD7950 V2 (stock or OC vs. OC), the GTX660 costs more than the HD7870 for slower performance, the HD7850 1GB and 2GB versions dominate the GTX650Ti 1/2GB by 40% in performance, and the HD7770 offers better value than the GTX650.

    As you noted, where NV got killed was mobile, and that's understandable since Enduro is nowhere near as good as Optimus. Frankly, AMD's HD7000 mobile launch was a mess of re-badged crap on the low end with no strong design wins anywhere else.

    On the desktop, though, this is easily the worst showing from NV since GeForce 7. AMD has them beaten at all price levels up to $600 on performance, price/performance, overclocking and game bundles. The main reason is that NV markets its products much better, such as sending out cream-of-the-crop after-market GTX660/660Ti cards that boost like crazy out of the box - e.g., the MSI Power Edition - while AMD sent out vcore-boosted reference cards that ran hotter and louder than their after-market 7800/7950 versions. Having PhysX in Borderlands 2 must have helped a ton as well, as it was one of the best and most popular shooters this year.

    AMD's issues are not performance or price, but likely perception and features (no PhysX). It's clear they are brand related; perception from early spring is hurting their products too. Look at TechSpot - it took them months to get up to speed on the performance increases of the HD7000 line that were being reflected by so many other websites. When consumers believe NV makes a faster card, they automatically think the GTX650/650Ti/660/660Ti are probably better than AMD's cards in the same price range. The irony is that the GTX680 lost the performance crown as early as June 2012, but most consumers don't check recent reviews...
     
  4. Cota

    Cota TS Enthusiast Posts: 521   +8

    ^ Wants to buy an Nvidia for the 1st time ever.

    During my hardcore gaming years I was devoted to ATI, but nowadays they just seem to have hired geniuses to build their overwhelming list of revisions and models - it's just wrong.

    I personally think that buying the topmost video card is a mistake, since most of the almost-top cards are capable of running almost anything, and the time between upgrades of those cards is about the same. BUT the jump that Nvidia made has me caught between a rock and a hard place.

    I currently have an ATI 5770 and bought a 6850 for my brother a while ago, but for me it's only Nvidia from now on.
     
  5. dividebyzero

    dividebyzero trainee n00b Posts: 4,950   +731

    Desktop yes. Mobile no. Performance per watt rules in mobile, and what is insignificant in desktop power usage tends to translate into some expensive heatsinking in a notebook. Moreover, the percentage of cards sold in the $300 bracket (for desktop) is basically insignificant on the balance sheet.
    And as the saying goes: "One swallow does not make a summer." What you are seeing is Nvidia's investment in brand paying off - much the same as the brand kept Nvidia in the game during the mid-2007 to mid-2008 period. Years of providing the halo product - from G80, G92, GT200, GF100 and GF110 - and catering for the gamer user base mean AMD will have to execute for more than one top-of-the-heap generation, plus make a sizeable ongoing investment in Gaming Evolved.
    Unfortunately for AMD, that doesn't seem likely to change anytime soon. Enduro still suffers from utilization issues - and of course, Hybrid CrossFire doesn't work with GCN cards. The rebadging seems to have started early (the HD 8550M in this case), although like the rebadged GT7xx Nvidia cards, this is more down to the OEMs than AMD/Nvidia.
    That's actually a subjective argument, as you noted below. Nvidia sells at its price, so the SKUs stay at that price. As a company, Nvidia sells at higher ASPs for smaller dies and lower performance. A $299 GTX 680 would basically cut AMD off at the knees - and given the yields and the Q3 earnings call, that would be eminently possible with the smaller-die strategy. This way both AMD and Nvidia make money - I'm also guessing that the relationship between the two companies is somewhat more cordial than the fanboy wars being fought in forums. It doesn't take a genius to fill in the blanks. What you're seeing is as much AMD's (lack of) marketing prowess as anything else, and years of the AMD ethos of spending 99% of the effort getting a product to launch, then 1% on marketing and supporting it. (E.g., six months of Enduro woes played out at NBR, Anand, Rage3D, Tom's, and likely another six months to get it working properly... just in time for the HD 7000s to go EOL.)
    Another situation that won't change. Just wait for AAA titles to show up on graphics-intensive game engines that support TXAA: 8xMSAA+FXAA image quality for a 4xMSAA performance hit... and I'm guessing that things should start getting quite nasty if Gaming Evolved titles don't support the feature.
    AMD/ATI have always lagged in game-oriented graphics features. The NV control panel had global game profiles back in the FX5200 days, and of course "The Way It's Meant to Be Played" is probably indelibly etched in gamers' minds. If ATI (and later AMD) had kept GiTG going and actually invested heavily in game development back in 2004-05, they'd probably have close to equal footing in brand allegiance.
    One SKU doesn't make a dynasty unless it's truly groundbreaking - and Tahiti isn't, not in the way that G80 was, for instance (or RV770, for that matter). The 8800GTX, 8800 Ultra, GTX 280, GTX 285, GTX 480 and GTX 580 were all the single-GPU performance kings of their generation, with ATI/AMD only claiming the title when Nvidia was late to market. You won't change that perception in one generation, and if GK110 sees the light of day as a GeForce card, I'd say AMD's halo reign probably ends in March.
     
    Burty117 likes this.
  6. Thank you dividebyzero. I enjoyed reading your post!
     
  7. valentyn0

    valentyn0 TS Rookie

    AMD fanboy here, psst !
     
  8. Blue Falcon

    Blue Falcon TS Enthusiast Posts: 156   +44

    Cota,

    I don't understand your point at all. What do you mean by "AMD hired geniuses to build their overwhelming list of revisions and models, its just wrong"?

    First of all, GPU generations have almost always had mid-cycle refreshes. NV had them too: GeForce 3 --> GeForce3 Ti 500, GeForce 5900U --> 5950U, GeForce 6800U --> 6800UE, 7800GTX 256MB --> 7900GTX, 8800GTX --> 9800GTX+, GTX280 --> GTX285, GTX480 --> GTX580.

    Similarly, ATI refreshed their generations in the past too, not just at the top but mid-level too. 9700Pro --> 9800Pro, X800Pro --> X850Pro, X1800XL --> X1950, 2900Pro --> 3850, HD4850 --> HD4860, HD5850 --> revision + HD6950.

    Releasing the mid-cycle HD7950 V2 and HD7970 GHz is actually the norm, not the outlier. Not only that, but AMD distributed the BIOSes to review websites, and existing owners can download and flash them for free. If the flash doesn't work, there is the dual-BIOS failsafe. That's a nice value-add that costs consumers nothing.

    Also, how does a company releasing more modern SKUs mean you suddenly view them negatively? It sounds like some kind of excuse not to buy their products.
     
  9. Blue Falcon

    Blue Falcon TS Enthusiast Posts: 156   +44

    valentyn0,

    Funny that I get called an AMD fanboy despite owning NV cards for many, many generations. During the Fermi generation I went with an overclocked Tri-SLI setup. Why? Because the GTX400 series had more VRAM for mods, overclocked better, scaled better with overclocking, had superior tessellation performance and had great DX11 performance in games like Hawx 2, Lost Planet 2, etc. I chose the GTX400 despite its inferior performance/watt compared to the HD5000 series. It offered a plethora of enthusiast features, including voltage control that allowed incredible percentage overclocks on air on GTX470/480 cards (the GTX460 was a star in this regard, often overclocking from 675MHz to 925-950MHz). You can call me a fanboy all you want, but all I am a fanboy of is enthusiast features and gaming performance, and NV lost this round for me, which is why I switched back to AMD. Let's look at the facts.

    -- HD7970 handles performance with high-MSAA much better (check)
    -- HD7970 handles higher resolutions and mods in games much better (Skyrim + ENB) (check), plus more VRAM for lower price
    -- Like GTX400 series, HD7900 series overclocks better and scales better with overclocking (check)
    -- Like GTX400 series, HD7900 series performs better in the most demanding titles (Metro 2033, Crysis games, Arma games, BF3, Witcher 2, and 2012 titles with latest DX11 effects such as global illumination and contact hardening shadows/HDAO -- that's your Hitman, Sleeping Dogs, Dirt Showdown, Alan Wake, etc. -- all work faster on HD7900 series as well).

    So if you are going to call me an AMD fanboy, then you'd have to call me an NV fanboy last round. The HD7970 is like the GTX480/580 of last generation: it performs faster in the most demanding titles, overclocks better, works better with modded textures, has more VRAM and scales better with overclocking, but unlike the GTX480/580, it costs less than competing GTX680 cards. And as I mentioned previously, HD7970 cards make money bitcoin mining. So not only did I buy the faster cards this generation, but they have fully paid for themselves. I got the best performance and it cost me nothing after 10 months of ownership, and with the latest drivers there is no contest. Who is the fanboy exactly?
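    The pays-for-itself claim is easy to sanity-check with a back-of-the-envelope calculation. A minimal sketch, where every figure (card price, daily mining revenue, daily power cost) is an illustrative assumption, not a measured value:

```python
# Rough payback-period estimate for mining on a GPU.
# All inputs are illustrative assumptions, not measurements.
def payback_months(card_cost, revenue_per_day, power_cost_per_day):
    """Months until net mining income covers the card's purchase price."""
    net_per_day = revenue_per_day - power_cost_per_day
    if net_per_day <= 0:
        return float("inf")  # the card never pays for itself
    return card_cost / (net_per_day * 30)  # ~30 days per month

# Hypothetical 2012-era numbers: a $450 card earning ~$2.00/day,
# spending ~$0.60/day on electricity.
print(f"Payback in about {payback_months(450, 2.00, 0.60):.1f} months")
```

    With those made-up inputs the card covers its cost in roughly 10.7 months, which is at least in the same ballpark as the 10-month ownership timeline claimed above.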

    Did you also forget NV locked out voltage control on Kepler products, even on the MSI Lightning and EVGA Classified - a complete 180° from the Fermi days? At any point in time I buy the best GPU, and Fermi delivered last round; this round the GTX680 offers less of everything other than PhysX and gimmicky TXAA compared to the HD7970. I'll switch in two seconds to a GTX780 if it offers the enthusiast features and performance I expect from a flagship product, regardless of brand.
     
  10. Blue Falcon

    Blue Falcon TS Enthusiast Posts: 156   +44

    dividebyzero,

    "Desktop yes. Mobile no. Performance per watt rules in mobile, and what is insignificant in desktop power usage tends to translate into some expensive heatsinking on a notebook."

    It's not that simple. NV got 300 design wins in the mobile sector before a single GTX600 chip was sold to OEMs. The question is why didn't AMD's team get some of those design wins when they launched Tahiti as early as Dec 2011-Jan 2012? It sounds like execution is a key factor here too. AMD didn't actively promote or market its mobile HD7000 series. I bet its sales people did a lousy job of trying to sell the low-end HD7000 series into notebooks. When NV won the Apple contract, for example, GTX600 was a spec on the roadmap, nowhere near close to launch.

    Also, please explain why AMD suddenly took a huge decline in market share in Q3 but not in Q1-Q2. In fact, in Q2, AMD gained market share against NV, which throws your entire theory out the window.

    "AMD increased its market share to 40.3%, Nvidia's market share slipped but still retains a large majority at 59.3%. Nvidia got off to a slow start in Q2 and cited supply constraint as the main reasons for the decline"

    http://www.techpowerup.com/171198/G...nally-Down-from-Last-Quarter-Reports-JPR.html

    Sounds like NV just executed far better in Q3 than they did in Q1-Q2, although they were already shipping superior performance/watt GTX600 cards in Q2. Why didn't AMD lower prices to OEMs, knowing that NV's supply issues would be resolved in Q3? All these things point to poor execution and strategy, not just an inferior performance/watt product. If NV was supply constrained, that means those OEMs needed more product than NV could deliver. AMD's management team should have stepped in and stolen market share; instead, it looks like they fell asleep :)

    "Years of providing the halo product -from G80, G92, GT200, GF100 and GF110, and catering for the gamer user base are going to require AMD to execute for more than one top-of-the-heap generation, and a sizeable ongoing investment in Gaming Evolved."

    I guess unlike most consumers who buy GPUs, I am not brand loyal, which is why I don't eat up AMD or NV marketing. It seems your entire post is trying to justify why NV cards sell well from a brand/marketing point of view, and that even if NV loses a generation, its strong marketing and brand save it. Yeah, I don't disagree with that notion, but it also shows how many of NV's consumers are like sheep (similar to Apple's). I have followed PC tech and forums for a while, and people who buy AMD/ATI cards often have no problem switching to NV and back to AMD/ATI and back to NV. But it is NV fans that keep waiting to give NV their money (this generation is a perfect example), or often downplay NV's underwhelming delivery because they "love" the brand.

    -- When the HD7970 was 20% faster than the GTX580 for $100 more, it was overpriced? Yet when the GTX680 lost the performance crown in June and cost more than 7970s, I rarely heard the same sentiment about NV's overpriced cards.

    -- When NV took 6 months to roll out its sub-$300 lineup, it was magically forgiven for being late. In technology, being late generally means you are a follower, not a market leader. The only companies that get away with launching late and still selling, despite not necessarily having better products, are those with a huge number of brand fanboys who would buy anything that firm makes -- Honda, Toyota, Apple, etc. If AMD launched 6 months late, they might as well close up shop in the GPU race. NV launches 6-8 months late, no problem... a lot of its customers wait like sheep. Most people who buy AMD cards won't wait for AMD cards; they'll just buy NV :)

    -- Why does NV get a pass for removing voltage control from high-end after-market GTX670/680 cards? If AMD did that, enthusiasts would be extremely displeased. NV does it, people defend NV because "they had to do it to reduce RMA costs to themselves, etc." Oh RLY?

    -- Tons of double standards this generation. When the GTX400 series overclocked really well, performance/watt was never a consideration. A GTX460 @ 925MHz exceeded HD5870 power consumption, but it was respected for its amazing bang-for-the-buck. An HD7950 OCed to 1100MHz uses about as much power as a GTX680/HD7970, and yet its overclocking is now downplayed by NV users as "luck of the draw," etc.

    -- Remember the Fermi generation? NV improved drivers by >10% over the 6-8 months after release in March 2010. When they did this, many gamers said "WOW, NV's driver team is awesome; they extracted so much more performance out of a new architecture." When AMD does the same thing with GCN --> "AMD's driver team sucks! This level of performance should have been there on day 1," etc. Again, another double standard from NV fanboys. It's funny for me to read all this, because when I got my 3 Fermi cards I fully expected NV to add 10%+ over 6+ months, and I got it, knowing Fermi was a brand-new architecture.

    Most importantly, NV users never acknowledge that NV got taken to the cleaners this round. They keep pointing to excuses like market share, NV's profits, etc. You pointed out 4 consecutive generations of NV leadership since G80. The GTX680 is not a continuation of that, but I don't see many guys who bought NV the last 4 generations criticizing NV for under-delivering. Why? I wanted a 60-70% faster GTX580 and what I got was a 35% faster version instead. To get more performance I had to go AMD, and I don't sit there defending NV. NV users stick by NV even when they lose. I guess more power to NV for nurturing such a sticky customer base.

    "Unfortunately for AMD, that doesn't seem like changing anytime soon. Enduro still suffers from utilization issues- and of course, hybrid Crossfire doesn't work with GCN cards. The rebadging seems to have started early (HD 8550M in this case) -although like the rebadged GT7xx Nvidia cards, this is more down to the OEM's than AMD/Nvidia"

    Agreed. The rebadging is really awful from both firms, lately worse from AMD in the mobile space. Sounds like a repeat of the HD7000 mobile launch. Terrible execution/marketing.

    "That's actually a subjective argument as you noted below. Nvidia sells at its price so the SKU's stay at that price. As a company Nvidia sells at higher ASP's for smaller dies, and lower performance. A $299 GTX 680 would basically cut AMD off at the knees- and given the yields and Q3 earnings call that would be eminently possible with the smaller die strategy."

    Yeah, but another way to look at it: NV ripped us off just as badly as AMD did. Where is the real GTX680? They couldn't deliver it. GK100? MIA. GK110? Impossible, as they are only now ramping up volumes of K20/K20X chips. NV couldn't release anything bigger than the GTX680 due to yield and wafer-capacity issues, not because they purposely held it back for the next generation. They got away with selling a GTX660Ti successor for $500. Guess which company took the blame for overpricing GPUs this generation? AMD, not NV. Another double standard. Let's see: within 7 months of the $649 GTX280 launching, NV delivered the GTX285 for $350, and then AMD launched a $259 HD4890 just 9 months after the GTX280. The GTX280 lost nearly $400 in value in 9 months. Talk about being ripped off. But instead people talk about how the HD7970 was a rip-off, despite it still being the fastest single GPU 10 months later.

    "Another situation that won't change. Just wait for AAA titles to show up on graphics intensive game engines that support TXAA. [link] ...and I'm guessing that things should start getting quite nasty if Gaming Evolved titles don't support the feature"

    I actually disagree with that article. TXAA looks very blurry even in those small pics of COD: BO2. TXAA is yet another NV marketing tactic; MSAA+FXAA provides superior texture quality. TXAA makes PC games look like console titles, as it blurs the textures (see The Secret World as well). TXAA is even worse than the FXAA/MLAA filters, but I guess new gamers think MSAA is "old/antiquated" when in reality the new FXAA/MLAA/TXAA filters were created to give weak systems anti-aliasing, which is especially useful for consoles -- see the PS3's blur-fested Black Ops 2 using NV's new filter. It doesn't look nice. Again, I can't see how you spin TXAA as some superior NV feature. Enthusiasts who buy $300+ GPUs go MSAA, or for higher-end GPUs they go the downsampling route. This idea that you can get 8xMSAA+FXAA image quality with 4xTXAA is nonsense: TXAA blurs the image, defeating the purpose of AA and of buying a flagship GPU for PC gaming on a 2560x1440/1600p monitor. Who wants to buy a high-end GPU and then blur their entire screen?

    "8800GTX, 8800 Ultra, GTX 280, GTX 285, GTX 480, and GTX 580 were all the single GPU performance kings of their generation with ATI/AMD only claiming the title when Nvidia were late to market."

    The GTX480 was late to market by 6 months; the HD5870 had the market all to itself for 6 months. You just keep reinforcing the same story from NV buyers -- they always come up with excuses for why NV is allowed to be late, is allowed to remove enthusiast features, and is allowed to under-deliver in performance or performance/watt. They spin anything negative about NV into a positive. The GTX680 came out strong and then lost the performance crown; it's that simple. Why didn't NV release a 1267MHz GTX685? Why did NV not get criticized for not dropping prices or adding game bundles on GTX670/680 cards after AMD added 3 free games?

    I agree with you completely that GK110 (or some GK200 variant) will retake the performance crown. But even if NV loses or overprices its cards, its loyal customers will still buy its products.

    BTW,

    From the second quarter to the third, JPR says Nvidia's PC graphics shipments grew 28.3% in desktops and 12% in notebooks. Intel suffered 7% and 8.6% declines in those same markets, respectively, while AMD saw a 2% decline on the desktop and a 17% decline in notebooks.
    http://techreport.com/news/23959/jpr-nvidia-gained-in-pc-graphics-last-quarter
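    As a sketch of how those growth rates translate into share, you can apply the quoted quarter-over-quarter percentages to hypothetical Q2 unit bases (the equal 100-unit bases below are invented purely for illustration; only the growth figures come from the quoted report):

```python
# Apply JPR's quoted desktop quarter-over-quarter growth rates to
# hypothetical Q2 shipment bases. The 100-unit bases are made up;
# only the percentage changes come from the quoted report.
q2_units = {"Nvidia": 100.0, "Intel": 100.0, "AMD": 100.0}
desktop_growth = {"Nvidia": 0.283, "Intel": -0.07, "AMD": -0.02}

q3_units = {v: q2_units[v] * (1 + g) for v, g in desktop_growth.items()}
total = sum(q3_units.values())
for vendor, units in sorted(q3_units.items()):
    print(f"{vendor}: {units:.1f} units -> {100 * units / total:.1f}% share")
```

    Even with equal (made-up) starting bases, a +28.3% quarter set against -2% and -7% quarters shifts several points of share toward Nvidia, which is the mechanism behind the headline numbers.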

    So, as you said correctly, AMD is getting hammered in the mobile sector.
     
  11. dividebyzero

    dividebyzero trainee n00b Posts: 4,950   +731

    AMD couldn't sell air to a drowning man - this is not news, so no great mystery. You're also confusing desktop with mobile GPUs. Chelsea, Heathrow and Wimbledon (the mobile Cape Verde and Pitcairn) didn't launch until late April this year (so Q2 ship-for-revenue), and Tahiti and Pitcairn were also supply constrained through Q1 thanks to the slow ramp of TSMC's 28nm node.
    And you don't think that Cupertino got advance information regarding Kepler? That Nvidia, in actively pursuing a huge OEM, wouldn't have shown Apple the capability of Kepler using boards built from the first batch of risk wafers? GK104 taped out around August 2011. Tape-out to production of A1 silicon would be six weeks; add 2-3 weeks for testing and validation, which brings you to October/November 2011... and of course the first rumours of Apple switching to Nvidia were posted in November 2011. Would you consider this a coincidence?
    Apple aren't in the business of playing favourites. Choosing Nvidia would have come down to simple economics - price per board, and performance per watt. Lower power consumption means cheaper and less esoteric cooling, lower failure rate, and probably most importantly, longer battery life.
    Not really. Mobile Cape Verde and Pitcairn ramped in Q2 (April), Pitcairn desktop launched in mid-March (close to Q2), Cape Verde launched mid-February. With TSMC's ramp issues, that puts the bulk of AMD's new line-up in late Q1 through Q2.
    For Nvidia, GK107 and the N13E-GS1 both launched in Q2, but GK107 was supply constrained initially and GK104 mobile was a Q3 release. Likewise GK104 desktop, while a late-March release, suffered from shortages that weren't alleviated until near the end of Q2. GK106 - the high-volume mainstream GPU - also shipped for revenue in Q3. Add in the fact that AMD aren't in the business of sustained marketing, and I don't see how my theory gets "thrown out the window". Nvidia were always going to get design wins, since it's a dead cert that Dell, Acer, Asus, Lenovo and HP, amongst others, got advance previews of Kepler in the same way that Apple did. If you hold to the argument that AMD's problems stemmed from a junior-league marketing campaign, I wouldn't completely disagree... but it doesn't explain how AMD were able to sell in Q1/Q2.
    Or maybe they just had more to sell thanks to TSMC's improving yields and the introduction of GK107, GK106, and pro GK104/GK110 graphics. I can think of two contracts, for instance, that combined for close to 30,000 units (Amazon's K10 order and ORNL's Titan), and that's without taking into account contracts to upgrade the significant portion of 2050/2070-equipped systems.
    I didn't say it was an all-or-nothing scenario. AMD marketing/sales have a large part to play (especially in OEM relations), in addition to Nvidia's model range. I could speculate on the "why" of AMD's failure to capitalize... but that would quickly run into a book-sized piece.
    Even worse is the fact that Nvidia were thrashing AMD in the mobile market with mostly Fermi-based parts (630M/635M/670M/675M).
    You're contradicting yourself. Most people buy what is put in front of them. Most people buy prebuilts from OEMs. You've just asserted that AMD can't get meaningful and sustained contracts with OEMs... and you're wondering why people buy Nvidia GPUs.
    BTW: This article is financial in nature - is it any wonder that my take on it is from a sales and marketing viewpoint?
    Sorry. Don't care. Excepting trolls, people who frequent tech sites are generally interested in performance over brand - so you aren't alone. The vast majority of hardware buyers don't have clue one about what's inside - and don't care. Most enthusiasts will switch between brands depending on circumstance. Since the nineties, I and a lot of enthusiasts have owned products from the bulk of the 50 or so graphics manufacturers in the marketplace. The trolls seem more numerous because they make the most noise (remove the multiple-posting trolls and "Guests" and see what the landscape looks like), and I've given up counting the number of times "Nvidia drivers kill cards" surfaces every time a new Forceware driver is released. Both fanboy camps are as bad as each other (e.g. the pre-Bulldozer trash talk - and since you mentioned owning Fermi cards, maybe check out this thread).
    And you maybe haven't noticed the AMD fanboys that defend Bulldozer because it wins a couple of obscure benchmarks, or the cries of "Just wait for Piledriver... Steamroller... Excavator" - a whole generation of people that eschewed outright performance for efficiency, then defended a large-die Tahiti because of its hashing ability? I guarantee you that I could find hypocrisy in both fan bases (all three if you count Intel).
    They didn't. Nvidia didn't want to face large-scale RMAs from board vendors running the cards out of spec - notably EVGA's EVBot. Nvidia told AIBs that they could offer voltage control, but the AIBs would then be responsible for RMAs on cooked cards instead of forwarding RMAs back to Nvidia. This is less about RMAs than the fallout in forums and Newegg reviews dragging the brand down.
    AMD doesn't need to do it because of PowerTune, which cannot be disabled - which leads to some piss-poor OC results in some cases. What's the point of enabling 1.3V or more if the board throttles when it reaches max board power?
    As for the rest of the NV-fanboy diatribe: you have a selective memory and are basically moving into fanboy territory. I wouldn't have to look hard to find counters to every point you've outlined.
    GK110 was never designed to be a GeForce card; Tesla and a Quadro follow-up are its markets. A good hint for the layman: when was the last time an Nvidia GPU appeared as Tesla/Quadro before GeForce? (Answer: never.) GeForce will happen once binning produces a tranche of dies unsuitable for WS/HPC. Full 15-SMX and/or high-leakage GPUs will be binned for desktop, since 225-235W is the effective limit for a pro board (if you want to use it in an existing rack or WS enclosure), and that has to include the 30W or so required for the larger VRAM component.
    Sorry, but that's bullsh!t. GK104 taped out in August last year; GK110 taped out in late January. Quick math will tell you that GK110 was on time:
    Tape out to production of risk silicon = 6 weeks
    Testing validation = 2-3 weeks
    Tape out to production of A1 silicon = 6 weeks
    Testing/validation = 2-3 weeks
    Commercial production of A1 silicon, binning, packaging, shipping = 12 weeks
    Total: 28-30 weeks. ORNL received its first batch of K20X in the first week of September (30 weeks from tape-out).
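    The schedule above is simple enough to total mechanically; a quick sketch (stage names and week ranges taken straight from the list):

```python
# Sum the tape-out-to-shipping schedule laid out above.
# Each stage carries a (min, max) duration in weeks.
stages = [
    ("tape-out to production of risk silicon", (6, 6)),
    ("testing/validation", (2, 3)),
    ("tape-out to production of A1 silicon", (6, 6)),
    ("testing/validation", (2, 3)),
    ("commercial production, binning, packaging, shipping", (12, 12)),
]
low = sum(lo for _, (lo, _) in stages)
high = sum(hi for _, (_, hi) in stages)
print(f"Total: {low}-{high} weeks")  # matches the quoted 28-30 weeks
```

    The only slack in the whole pipeline is the two 2-3 week validation windows, which is where the 28 vs. 30 week spread comes from.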
    I've gamed with TXAA and pretty much every other form of AA, and I disagree. The difference isn't startling, I'll grant, but it shows promise - my opinion. I also game at 2560x1440 (27"), for the record. I think I could find more than a few AMD fanboys singing the praises of MLAA if I had an extra minute or so.
    Yep, almost as bad as those delusional Bulldozer cheerleaders who regaled everyone about how the wait was going to be worth it.
    Now, you've managed to turn a post about GPU sales into a tirade on the unreasonableness of Nvidia fanboys. Personally, I just ignore them - unless I'm after a little sport - and I don't generally rant about fanboys or hold a manufacturer responsible for their actions.
    I suggest you start a new thread if it affects you that much. I can provide you with some juicy posts from fanboys of every stripe for you to be appalled at - including those of the Cult of Charlie Demerjian -who seem to exist only as a rallying point for trolls and id1ots, and for the transformation of vendor agnostic users into diehard Nvidia defenders.
     
     
  12. Great post man! You know your stuff.
     
  13. I agree, Guest! To be that long it must be REALLY authoritative.
     

