Weekend tech reading: Will the GTX 650 Ti be a dud at $170?

By Matthew
Sep 23, 2012
  1. Specifications for the next graphics card in Nvidia's Kepler-based lineup, the GTX 650 Ti, have leaked. Last week saw the introduction of two new Kepler graphics cards from Nvidia with the GeForce GTX 650 and GTX 660. Those cards filled...

  2. TomSEA

    TomSEA TechSpot Chancellor Posts: 2,231   +314

    Nice collection of articles (again). I too was wondering what niche nVidia is trying to fill with their GTX 650 announcement. Looks like they have all their bases covered with what they currently have. Looking at the pricing/performance comparison, I'll bet they don't sell many of those.

    And 50 years since the Jetsons' first show?? Crazy... But that was a fun read.
  3. dividebyzero

    dividebyzero trainee n00b Posts: 4,700   +586

    The same niches that the GTX 560 / 560 Ti presently occupy. The 560 represents Nvidia's highest-volume gaming part and only now seems to be nearing EOL. The new GK 106 cards will be tasked with taking AMD on in the volume mainstream market against the HD 7850 and 7870, albeit at a reduced overall performance level; but then, Nvidia have historically maintained a price premium over AMD cards (usually due to brand awareness and software environment).
  4. The entire GTX 6xx series is a dud really. We all know the part now known as the 680 was originally intended as a 560 Ti replacement until Nvidia realised how far behind AMD were, so instead it was branded as a high-end card (which it clearly isn't compared with the generational jumps of the past) and the rest of the range got skewed downwards, making the low-end cards completely worthless. Anything less than a 660 and you might as well just stick with IGP.
  5. Blue Falcon

    Blue Falcon Newcomer, in training Posts: 143   +36

    Guest,

    Actually, we don't know that. This theory was widely proposed on many forums across the Internet and many of us believed it at first. However, later on many people realized this theory has too many holes to make sense.

    For starters, if you had actually listened to NV's CEO on earnings calls, he discussed 28nm wafer prices and wafer capacity constraints, which forced NV to choose between locking in 300+ notebook contracts and launching the desktop line-up on time. They went with the notebook contracts -- prioritizing laptops for Kepler, and thus delaying the rest of the Kepler desktop line-up by 6 months.

    Secondly, NV publicly announced as early as Spring 2012 that they would be launching K10 and K20 Tesla parts, and that 150,000 pre-orders of such parts had already been placed. These orders went to professional companies and industries that would want to use these parts as soon as possible. We knew then that K20 wouldn't show up until Q4 2012. Now we even know a more accurate date: December 2012. Now why would NV make its corporate clients wait more than 6-7 months before they get the K20 part they ordered?

    There are actually many logical reasons why NV didn't launch GK110, and none of them need have anything to do with the HD7970.

    1) A 500-600mm^2 die on a 28nm wafer means each GK110 eats the wafer area of two GK104 chips NV could otherwise sell. Since the GTX680 sells for $499 and the GTX690 sells for $999, NV would have needed to sell GK110, "the real GTX680", for $1,000 at least. It would be even more, since 500-600mm^2 chips have worse yields than 294mm^2 chips (a rough yield sketch follows after point 4). Both the CEO of NV and the CEO of AMD went on record to say that node shrinks cost more money and it is becoming more difficult to maintain current prices without passing costs on to consumers. The alternative is to let the node mature and, in the first generation, keep die sizes smaller to maintain your margins.

    2) When you are wafer constrained and already committed to 300+ corporate contracts, you do not have excess manufacturing capacity to launch 500mm^2-die GPUs, because you have other obligations to meet. NV knew as early as Q4 2011 that wafer capacity at 28nm was going to be an issue at TSMC.

    3) A 500-600mm^2 28nm chip probably would have meant near-Fermi levels of power consumption, since the 28nm node was too new. NV already suffered a 6-month delay with the GTX480 by trying to launch a massive chip on a brand-new 40nm node, and it was a disaster for them. I bet they didn't want to repeat the delays and huge power consumption problems. It made sense to focus on performance/watt since that's what consumers asked for. JHH said so himself during the launch of the GTX690 in front of an audience.

    4) GK110 was simply unmanufacturable for most of this year in sufficient volumes. This is probably the most reasonable assessment. How do we know? Because not a single K20 Tesla card has yet shipped to customers who paid $3,000+ for each. NV does not want to make these types of clients wait and is doing everything possible to get those K20 cards out on time. And they won't be out until late 2012.
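
    As a rough back-of-the-envelope sketch of point 1, assuming a 300mm wafer and an illustrative defect density (neither figure is from NV or TSMC; this is just a simple zero-defect yield model):

        import math

        WAFER_DIAMETER_MM = 300
        WAFER_AREA_MM2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
        DEFECTS_PER_CM2 = 0.3  # illustrative defect density for a young node

        def dies_per_wafer(die_area_mm2):
            # crude estimate: wafer area / die area, ignoring edge losses
            return WAFER_AREA_MM2 / die_area_mm2

        def yield_fraction(die_area_mm2):
            # Poisson model: probability a die catches zero defects
            return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)

        for name, area in [("GK104 (~294mm^2)", 294), ("GK110-class (~550mm^2)", 550)]:
            gross = dies_per_wafer(area)
            good = gross * yield_fraction(area)
            print(f"{name}: ~{gross:.0f} candidates, ~{good:.0f} good dies "
                  f"({yield_fraction(area):.0%} yield) per wafer")

        # With these made-up numbers the big die returns roughly a quarter as
        # many good dies from the same fixed-cost wafer, so per-die cost rises
        # much faster than die area alone would suggest.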

    It seems like NV had to use GK104 out of necessity, in terms of performance/watt, wafer capacity at TSMC, and profit-margin strategy. NV also knew that the GTX580 was about 15-20% faster than the HD6970. This meant that NV had less pressure to deliver a performance increase: if they just increased performance 30-35%, AMD would have needed to increase theirs by 45-50% to match (rough numbers sketched below). This is exactly what happened.
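
    A quick sanity check on that compounding, using the lower ends of the quoted ranges (ballpark review figures, not exact benchmarks):

        # normalise to HD 6970 = 1.0
        gtx580 = 1.15            # GTX 580 ~15% ahead of HD 6970
        kepler = gtx580 * 1.30   # NV adds a ~30% generational gain
        amd_gain_needed = kepler - 1.0
        print(f"AMD needs roughly a {amd_gain_needed:.0%} uplift over its own "
              f"previous part just to pull level")   # about 50%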

    Now that the 28nm node has matured over the last 10 months, we could see some version of GK110 as the GTX780 next year.

    I wouldn't say the GTX 6xx series is a dud, since it accomplished everything NV set out to do and it competes fairly well against the HD7000 series, although being late by 6-7 months with the sub-$300 line-up was not a favorable outcome for gamers.

    Look at Borderlands 2; even a GTX660 runs that game well. With most games being console ports, this generation could easily have been skipped by GTX570/580 owners anyway.
  6. TL;DR

    I have an associate at Nvidia; his version of events is that GK110 was pushed back after it was realised that GK104 would do fine. That trumps whatever it is you're conjecturing in your lengthy post.
  7. dividebyzero

    dividebyzero trainee n00b Posts: 4,700   +586

    Long post. Some truth, some supposition, and some info taken at face value from the rumour mills. Tech Report reported GK 110 Tesla ready for initial shipping some months back. Both Cray and ORNL are reporting initial shipments:
    [ORNL press release two weeks ago]
    As for "unmanufacturable for most of this year"... GK 110 is actually on schedule. Tape-out for the chip is generally agreed to have been late January this year. Bearing in mind that production of professional co-processors mirrors that of commercial GPUs, with the exception of a more stringent validation process, you're looking at around 9 months, assuming only 1 or 2 revisions. So: tape-out to production of A1 risk wafers (6 weeks), followed by 2-3 weeks of testing, followed by a further 6 weeks for production of A2 revision silicon, 2-3 weeks of testing, and the usual 12 weeks for commercial production, die packaging (die cutting, heatspreader attachment), binning/validation, card assembly/testing and card packaging... total time 28-30 weeks. Around 32 weeks have elapsed between tape-out in late January and first shipments, which would indicate that GK 110 is both on schedule and, like the other Kepler GPUs, required little revision from the initial A1 silicon (i.e. no major revision, no base metal respin).
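
    To make that schedule arithmetic explicit (the stage lengths are the typical figures assumed in the paragraph above, not an official Nvidia timeline):

        # (stage, weeks_min, weeks_max), per the estimates above
        stages = [
            ("tape-out to A1 risk wafers", 6, 6),
            ("A1 testing", 2, 3),
            ("A2 revision silicon", 6, 6),
            ("A2 testing", 2, 3),
            ("production, packaging, binning, assembly", 12, 12),
        ]
        total_min = sum(lo for _, lo, _ in stages)   # 28 weeks
        total_max = sum(hi for _, _, hi in stages)   # 30 weeks
        print(f"tape-out to first shipment: {total_min}-{total_max} weeks")
        print("weeks elapsed since the late-January tape-out: ~32")
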
    Sounds as though your associate is placed closer to the custodial/janitorial end of the Nvidia chain of command.
    GK 110 and GK 104 basically follow separate timelines. GK 110 has been first and foremost a math co-processor (ECC memory support, 1:3 rate double precision, wide memory bus) and has been in development, with contracts signed, for some time (examples here: February 2011 and October 2011), while GK 104 is primarily a workstation/consumer GPU. GK 110 as a direct competitor to GK 104 (single or dual) is largely predicated on salvage GPUs lacking the functionality of the full 15 SMXs that the Tesla K20 would require. A consumer GK 110 (say, GTX 780) would simply be a way to harvest GPUs that cannot be validated for pro use (manufacturing defects and/or voltage requirements): an afterthought that has the ability to gain PR/marketing points for Nvidia.

    How many GK 104 cards would you have to sell to offset GK 110 contracts? An off-the-shelf Tesla K20 is listed at $3,199. HPC versions will be more expensive still. ORNL have orders totalling 14,592 units, NCSA's Blue Waters needs 3,000+, each Aurora Tigon requires 256... and that doesn't take into account OEM workstations using Maximus or smaller clusters built by the same OEMs... or the large number of HPC installations that will likely upgrade previous-generation components.
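
    For a sense of scale on that question, using the list prices quoted in this thread (gross revenue only; margins, OEM discounts and the higher HPC pricing are ignored):

        K20_LIST_PRICE = 3199    # off-the-shelf Tesla K20 list price
        GTX680_PRICE = 499       # GTX 680 launch price
        ORNL_UNITS = 14592       # ORNL order quoted above

        ornl_revenue = K20_LIST_PRICE * ORNL_UNITS        # ~$46.7M gross
        equivalent_gtx680s = ornl_revenue / GTX680_PRICE  # ~93,500 cards
        print(f"ORNL order alone: ~${ornl_revenue / 1e6:.1f}M, "
              f"roughly {equivalent_gtx680s:,.0f} GTX 680s in gross revenue")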
  8. You're missing the point that the GK110 came after the decision to brand GK104 as high-end. Prior to that (when it was still the GK100) it wasn't jury-rigged to be a Tesla part.
  9. dividebyzero

    dividebyzero trainee n00b Posts: 4,700   +586

    And what has that got to do with your supposition that GK 110 production was postponed because of how the GK 104 performed?
    GK 110 contracts had NOTHING to do with GK 104, in any guise. GK 110 contracts exist solely for K20-based Tesla and have done for the best part of two years. Show me, and the rest of the forum, where GK 110/Tesla K20 was pushed back because the K10 (with its 1:24 rate double precision, only partial ECC support, and low bandwidth) is so superlative.
    Name a GK 110 contract where, subsequent to GK 104's supposedly stellar showing, the vendor switched to GK104-based Tesla K10s. The GK 104-based K10 isn't a replacement for the K20. The K10 is a moneymaker for number crunchers who have no need for double-precision workloads or ECC, i.e. uses outside of HPC, such as pairing K10s with Quadros for workstation/visualization.
  10. cliffordcooley

    cliffordcooley TechSpot Paladin Posts: 5,082   +1,184

    After the GTX 650 vs GTX 660 left such a huge gap in performance, I was wondering if they were going to release a few cards in between to cover the performance scale. The GTX 650 Ti and GTX 660 SE were the two cards that came to mind.
  11. Sigh, are you being deliberately obtuse, dividebyzero? Or being pedantic about codenames, or what, exactly? To make it as clear as possible: Nvidia had two chips in development, one mainstream which became the GK104 and one flagship/HPC which became the GK100 and later the GK110. After AMD's underwhelming competition, the GK104 was instead branded as high end while the GK100 was re-purposed as the GK110 and geared exclusively for the HPC market, with the GTX 690 now being considered "good enough" as the Kepler flagship.

    If you think we've seen the last of GK1x0 as a desktop card then you're massively out of touch; it's simply been pushed back until it's needed, depending on what AMD comes up with. You only need to look at the TDP, thermals, dimensions, etc. of the GK104 for it to be blatantly obvious this wasn't originally intended as a top-end part like the GF110 was. You seem to be fairly oblivious to the fact that the GF110 had ECC, better double-precision floating-point performance, etc., similarly to the GK1x0 (its natural successor).
  12. Archean

    Archean TechSpot Paladin Posts: 5,989   +66

  13. dividebyzero

    dividebyzero trainee n00b Posts: 4,700   +586

    GK 110 was always intended as a pro part geared for HPC. Forum fanboys were the only ones making noises about GK 110 launching as a desktop card alongside GK 104. Feel free to prove otherwise.
    Your argument makes no sense if you consider the timeline:
    The HD 7970 was reviewed, via leak and then officially, in December 2011, yet you would have people believe that Nvidia, upon realizing that their in-production GK 104 had enough horsepower to combat Tahiti, made an arbitrary decision to pull GK 110 as a desktop card, even though only four weeks elapsed between Tahiti's performance becoming known and GK 110 wafers being laid down.
    Moreover, you expect people to believe that prior to Tahiti's performance becoming known, Nvidia had decided upon launching GK 110 parts as desktop cards even though: 1. TSMC's 28nm capacity and yield were low, 2. Nvidia needed every available GK 110 die to fulfill HPC contracts, and 3. Nvidia would rather use GK 110 dies to sell $3k+ pro co-processors than $1k gaming cards.
    ePeen for the fanboys. The GK 110 parts will likewise be so, to a certain extent. Nvidia probably don't even cover expenses on GTX 690 production (production run vs. R&D and extra binning); GK 110 would likely fall into the same category: enough of a production run to keep the card included in review comparisons for PR and marketing... at least until existing pro contracts are fulfilled and the 28nm process becomes more mature.
    Comprehension fail on your part.
    Desktop cards based on high-end compute GPUs always eventuate. It simply becomes a matter of stockpiling GPUs that don't meet the binning process for the pro SKUs and/or excess inventory once the pro market contracts have been filled... and I never said otherwise. I would also pretty much guarantee that the desktop cards will not feature a fully functional die (2880 cores/15 SMX).
    And I never said it was. My posting was concerned solely with the GK 110 and its timeline. You're the one that seems to think that the GK 110 has been shelved/delayed because of the GK 104, which is blatantly untrue given the development of the part. You can bleat on ad infinitum about the GK 100 being re-jigged into the GK 110, but the fact remains that if the GK 100 existed at all*, it was cancelled long before AMD's Tahiti performance was known, and its cancellation could easily have been due to a change in architecture, lessons learnt from the process node, the need to add further compute functionality (Hyper-Q, for example), or any number of other variables.
    Which has precisely nothing to do with GK 110, other than the fact that the GPU carries on the compute feature set that started with the G80. If you're trying to convince me that the GK 104 wasn't/isn't seen as a high-end compute card then I think I've already covered that.

    Unless you can provide some supporting evidence (and no, random musings from an average Joe forum poster don't count), I'd suggest you take your trolling elsewhere.

    * Show me an instance where "GK 100" was actually ever assigned to an Nvidia chip (a PDF link or official slide is fine). People assumed that a big-die part was called GK100 because the series is GK1xx, and because the previous architecture was GF100.
     
  14. TL;DR

    I can see you're getting upset and quite desperately want to be right, so I'll leave it at that though. Like I said, I heard this from the horse's mouth, but believe whatever you like and rationalise it to yourself however you can, sweetheart.
  15. dividebyzero

    dividebyzero trainee n00b Posts: 4,700   +586

    You might want to peruse Dave Kanter's article at RWT on the same, as well as his article on Intel's near-threshold (for transistor switching) voltage tech.
    _________________________
    I thought that might be the case. Our esteemed Guest posters usually have difficulty when asked to provide support for their claims.
    Increasing your technical knowledge base via a domesticated animal...I think I just won a bet.
    Will do. My hit rate on graphics tech is usually pretty good, so I won't lose too much sleep over the permutations.
    xoxoxo
    cue inane response in 3....2....1...
  16. The less I know about your bets involving domesticated animals the better, I think. Seems you have worse things to be losing sleep over than graphics tech. ;)
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,700   +586


    Latest update/rumour would tend to refute Guest's "insider knowledge". Colour me surprised...
    GK110 reserved for HPC use. The GTX 780 (and presumably 770, 760 Ti) to use what appears to be a new GPU, probably modified from the GK 104. Likely an additional memory controller or two (320- or 384-bit) and an increased core count. Also likely that the new GPU won't simply be a couple of extra SMXs tacked onto the existing eight of GK 104 (a 40 ROP / 160 TMU part seems lopsided).
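
    The "lopsided" comment follows from how Kepler's units scale; a minimal sketch, using GK104's published per-SMX and per-memory-controller counts (the 10 SMX / 320-bit configuration itself is pure speculation):

        TMUS_PER_SMX = 16        # GK104: 8 SMX -> 128 TMUs
        ROPS_PER_64BIT_MC = 8    # GK104: 4 MCs (256-bit) -> 32 ROPs

        def config(smx_count, mem_controllers):
            # derive bus width, TMU and ROP counts from the unit mix
            return {
                "bus_bits": mem_controllers * 64,
                "TMUs": smx_count * TMUS_PER_SMX,
                "ROPs": mem_controllers * ROPS_PER_64BIT_MC,
            }

        print("GK104 as shipped :", config(8, 4))    # 256-bit, 128 TMU, 32 ROP
        print("8+2 SMX, 320-bit :", config(10, 5))   # 160 TMU but only 40 ROP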

