Weekend tech reading: Will the GTX 650 Ti be a dud at $170?

Matthew DeCarlo


Nvidia's GTX 650 Ti makes too many compromises for a $170 games-oriented graphics card
Specifications for the next graphics card in Nvidia's Kepler-based lineup, the GTX 650 Ti, have leaked. Last week Nvidia introduced two new Kepler graphics cards, the GeForce GTX 650 and GTX 660. Those cards filled in the massive void between the nearly-integrated-graphics-equivalent $99 GT 640 and the GTX 660 Ti at $300. Both cards were alluring for gamers, offering decent budget gaming performance for the price. ExtremeTech

Nixeus NX-VUE27 27" monitor: high resolution for the masses
The price model for 27" IPS displays has been turned on its head recently by imported models from Korea that you can buy on eBay. Selling for as little as $350, these are stripped-down models that lack inputs beyond DVI, have no OSD, have very minimal stands, and often come with very little in the way of support. They also use A- grade panels, where tolerances for stuck pixels and uniformity errors are not as strict as they are with the A or A+ panels used in most displays. AnandTech

Quantum cryptography: yesterday, today, and tomorrow
Quantum cryptography is one of those amazing tools that came along before anyone really asked for it. Somehow there are companies out there selling very high-end, "provably secure" cryptography gear, all based on fundamental principles of quantum mechanics. Yet, despite the technology being fundamentally unbreakable, there have been quite a few publications on more-or-less practical ways for Eve to eavesdrop on people whispering quantum sweet-nothings in darkened rooms. Ars Technica

Branded for life
Wherever Joe Tamargo goes, people stare at his forearms. He likes it that way. Years ago, Tamargo, a resident of Rochester, New York, auctioned off space on his arms, transforming himself into a human billboard. "When I tell them the story, they're like, 'Yo, that's pretty cool. I'm going to check out those websites,'" Tamargo, 38, says of people who see him in public. "And then they get there and there's nothing on the website." Tamargo is not just a walking advertisement. He's a walking advertisement for businesses that no longer exist. BuzzFeed

The Internet? We built that
Who created the Internet and why should we care? These questions, so often raised during the Bush-Gore election in 2000, have found their way back into the political debate this season -- starting with one of the most cited texts of the preconvention campaign, Obama's so-called "you didn't build that" speech. "The Internet didn't get invented on its own," Obama argued, in the lines that followed his supposed gaffe. "Government research created the Internet so that all the companies could make money off the Internet." The NY Times

50 years of the Jetsons: why the show still matters
It was 50 years ago this coming Sunday that the Jetson family first jetpacked their way into American homes. The show lasted just one season (24 episodes) after its debut on Sunday, September 23, 1962, but today "The Jetsons" stands as the single most important piece of 20th century futurism. More episodes were later produced in the mid-1980s, but it's that 24-episode first season that helped define the future for so many Americans living today. Smithsonian

Exclusive: A peek inside NASA's Global Hawk hangar
Hurricane researchers are enjoying some new capabilities this month thanks to a pair of unmanned Global Hawk aircraft capable of flying for up to 30 hours at a time. Aircraft and satellites have long been used to study and observe hurricanes. The airplanes with pilots and researchers on board are capable of making measurements and getting up close, but only for a maximum of about 10 hours. Wired (also: Royal Observatory picks best astronomy photos of the year)

Microsoft deploys college-kid cool in Windows 8 apps race
Microsoft Corp. is forging a future in tablets with students like Joe Shapiro, a computer science major who had never crafted applications for the company's Windows software until he won an internship to try it out. Shapiro, a senior at Brown University, spent the summer in the Foundry paid internship program at Microsoft's New England Research & Development Center, known as NERD. Businessweek

Will Apple's tacky software-design philosophy cause a revolt?
By now it's almost inevitable given the company's track record: No matter what Apple unveils tomorrow at the Yerba Buena Center (an iPad Mini? iPhone 5?), pundits will herald the company for its innovative thinking and bold hardware design. But the elephant in the room will be Apple's software, which many inside the company believe has evolved for the worse in the last few years. Fast Company

How much tech can one city take?
Last year, when Mayor Ed Lee heard that Twitter was planning to move its headquarters out of San Francisco and down to the peninsula, he quickly consulted with his digital experts -- his two daughters, Brianna, 27, and Tania, 30. Was the company important enough to make a top priority? "Of course it's important, Daddy!" they told him. "We tweet all the time. You have to keep them in town." San Francisco Magazine

Now Facebook wants you to grass up friends not using their real name
My friend Paul Bernal, an IT, IP and media law researcher specialising in privacy and autonomy, has a very good blog post on what seems at first glance to be a crazy move from Facebook in its ongoing war on pseudonyms. Facebook has, according to multiple independent reports, started asking friends to snitch on friends not using their real names on Facebook. Computerworld UK

Blizzard is "looking at free-to-play" for Starcraft 2 multiplayer Blizzard is considering ways to implement a free-to-play model in its wildly popular real-time strategy game Starcraft 2. When asked about Starcraft 2 going free-to-play at a Valencia eSports Congress panel, lead designer Dustin Browder responded that Blizzard is "looking at free to play as an option for the multiplayer," according to a report at PCGamesN. Eurogamer

Microsoft Project Austin: A new Windows 8 note-taking app inspired by Courier
The app allows users to add pages to a notebook, delete or move them, use digital ink to write or draw and add photos. Notes created in Austin can be shared with other Windows 8 apps, like e-mail and SkyDrive. Users can choose different types of "paper" and view the pages in a variety of ways, including leafing through them like a paper book. ZDNet

Meet the new boss: big data
When looking for workers to staff its call centers, Xerox Corp. used to pay lots of attention to applicants who had done the job before. Then, a computer program told the printer and outsourcing company that experience doesn't matter. The software said that what does matter in a good call-center worker -- one who won't quit before the company recoups its $5,000 investment in training -- is personality. The WSJ

News image via Shutterstock


 
Nice collection of articles (again). I too was wondering what niche nVidia is trying to fill with their GTX 650 announcement. Looks like they have all their bases covered with what they currently have. Looking at the pricing/performance comparison, I'll bet they don't sell many of those.

And 50 years since the Jetsons first show?? Crazy... But that was a fun read.
 
I too was wondering what niche nVidia is trying to fill with their GTX 650 announcement. Looks like they have all their bases covered with what they currently have.
The same niches that the GTX 560 / 560 Ti presently occupy. The 560 represents Nvidia's highest-volume gaming part and only now seems to be nearing EOL. The new GK 106 cards will be tasked with taking AMD on in the volume mainstream market against the HD 7850 and 7870, albeit at a reduced overall performance level - but then, Nvidia have historically maintained a price premium over AMD cards (usually due to brand awareness and software environment).
 
The entire GTX 6xx series is a dud really. We all know the part now known as the 680 was originally intended as a 560 Ti replacement until Nvidia realised how far behind AMD were, so instead it was branded as a high-end card (which it clearly isn't in comparison to the jumps in previous generations) and the rest of the range got skewed downwards, making the low-end cards completely worthless. Anything less than a 660 and you might as well just stick with IGP.
 
Guest,

Actually, we don't know that. This theory was widely proposed on many forums across the Internet, and many of us believed it at first. However, later on many people realized it has too many holes to make sense.

For starters, if you had actually listened to NV's CEO on earnings calls, he discussed 28nm wafer prices and wafer capacity constraints, which forced NV to choose between the desktop line-up and locking in 300+ notebook contracts. They went with the latter -- prioritizing laptops for Kepler, and thus delaying the rest of the Kepler desktop line-up by 6 months.

Secondly, NV publicly announced as early as Spring 2012 that it would be launching K10 and K20 Tesla parts, and that 150,000 pre-orders for such parts had already been placed. These orders went to professional companies and industries that would want to use these parts as soon as possible. We knew then that K20 wouldn't show up until Q4 2012. Now we even know a more accurate date - December 2012. Now why would NV make its corporate clients wait more than 6-7 months before they get the K20 parts they ordered?

There are actually many logical reasons why NV didn't launch GK110, and none of them need have anything to do with the HD7970.

1) A 500-600mm^2 die on a 28nm wafer costs at least as much to produce as 2x GK104 chips (see the rough sketch after this list). Since the GTX680 sells for $499 and the GTX690 sells for $999, NV would have needed to sell GK110 -- "the real GTX680" -- for $1,000 at least. It would be more, though, since 500-600mm^2 chips have worse yields than 294mm^2 chips. Both the CEO of NV and the CEO of AMD went on record to say that node shrinks cost more money and it is becoming more difficult to maintain current prices without passing the costs on to consumers. The alternative is letting the node mature and, in the 1st generation, lowering die size to maintain your margins.

2) When you are wafer-constrained and already committed to 300+ corporate contracts, you do not have excess manufacturing capacity to launch 500mm^2-die GPUs because you have other obligations to meet. NV knew as early as Q4 2011 that wafer capacity at 28nm was going to be an issue at TSMC.

3) A 500-600mm^2 28nm chip probably would have meant near-Fermi levels of power consumption since the 28nm node was too new. NV already suffered a 6-month delay with the GTX480 by trying to launch a massive chip on a brand-new 40nm node, and it was a disaster for them. I bet they didn't want to repeat the delays and huge power consumption problems. It made sense to focus on performance/watt since that's what consumers asked for. JHH said so himself during the launch of the GTX690 in front of an audience.

4) GK110 was simply unmanufacturable for most of this year at sufficient volumes. This is probably the most reasonable assessment. How do we know? Because not a single K20 Tesla card has yet shipped to customers who paid $3,000+ apiece. NV does not want to make these types of clients wait and is doing everything possible to get those K20 cards out on time. And they won't be out until late 2012.
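On point 1, a back-of-envelope dies-per-wafer and yield calculation makes the cost gap concrete. This is just a minimal sketch using the textbook Poisson yield model; the wafer size, die areas and defect density below are illustrative assumptions, not NV or TSMC figures:

```python
# Rough sketch of why big dies cost disproportionately more on a new node.
# All numbers are illustrative assumptions, not NV or TSMC data.
import math

WAFER_DIAMETER_MM = 300
DEFECT_DENSITY = 0.4   # defects per cm^2 -- assumed high, for an immature node

def dies_per_wafer(die_area_mm2):
    # Standard approximation: gross wafer area minus edge losses.
    r = WAFER_DIAMETER_MM / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2):
    # Poisson yield model: Y = exp(-area * defect_density)
    return math.exp(-(die_area_mm2 / 100.0) * DEFECT_DENSITY)

for name, area in [("GK104 (294mm^2)", 294), ("GK110-class (550mm^2)", 550)]:
    candidates = dies_per_wafer(area)
    good = candidates * yield_fraction(area)
    print(f"{name}: {candidates} candidates/wafer, ~{good:.0f} good dies")
```

With these assumed numbers, a 550mm^2 die yields roughly a fifth as many good chips per wafer as a GK104-sized die, so the per-die cost gap is far worse than the 2x that the raw areas suggest.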

It seems like NV had to use GK104 out of necessity, in terms of performance/watt, wafer capacity at TSMC, and profit-margin strategy. NV also knew that the GTX580 was about 15-20% faster than the HD6970. This meant NV had less pressure to deliver a performance increase: if they just increased performance 30-35%, AMD would have needed to increase theirs by 45-50% to match them. This is exactly what happened.
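Those percentages compound, which is easy to misread. A quick check using the low end of the figures quoted above (the post's estimates, not measured benchmarks):

```python
# GTX 580 starts ~15% ahead of the HD 6970 (low end of the quoted 15-20%).
gtx580_vs_hd6970 = 1.15
# NV then gains 30% generation-on-generation (low end of 30-35%).
nvidia_new = gtx580_vs_hd6970 * 1.30
# AMD's required gain over its own HD 6970 just to pull level:
print(f"AMD needs +{(nvidia_new - 1) * 100:.0f}% over the HD 6970")  # ~+50%
```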

Now that the 28nm node has matured over the last 10 months, we could see some version of GK110 as the GTX780 next year.

I wouldn't say the GTX 6xx series is a dud, since it accomplished everything NV set out to do and it competes fairly well against the HD7000 series, although being 6-7 months late with the sub-$300 line was not a favorable outcome for gamers.

Look at Borderlands 2: even a GTX660 runs that game well. With most games being console ports, this generation could easily have been skipped by GTX570/580 owners anyway.
 
TL;DR

I have an associate at Nvidia; his version of events is that GK110 was pushed back after it was realised that GK104 would do fine. That trumps whatever it is you're conjecturing in your lengthy post.
 
We knew then that K20 wouldn't show up until Q4 2012. Now we even know a more accurate date - December 2012....GK110 was simply unmanufacturable for most of this year at sufficient volumes. This is probably the most reasonable assessment. How do we know? Because not a single K20 Tesla card has yet shipped to customers who paid $3,000+ apiece.
Long post. Some truth, some supposition, and some info taken at face value from the rumour mills. Tech Report reported GK 110 Tesla ready for initial shipping some months back. Both Cray and ORNL are reporting initial shipments:
Oak Ridge National Laboratory (ORNL) has received a handful of the GPUs that will power their upcoming Titan supercomputer. Speaking with the Knoxville News Sentinel last week, Jeff Nichols, the lab's scientific computing chief, confirmed that 32 Kepler processors from NVIDIA have been installed in the "development platform". ORNL expects to receive roughly 1,000 more Tesla K20s this week.
[ORNL press release two weeks ago]
As for "unmanufacturable for most of this year"....GK 110 is actually on schedule. Tape out for the chip is generally agreed upon as being late January this year. Bearing in mind that production of professional co-processors mirrors commercial GPU's with the exception of a more stringent validation process, you're looking at around 9 months- assuming only 1 or 2 revisions. So tape out to production of A1 risk wafers (6 weeks) followed by 2-3 weeks of testing, fiollowed by a further 6 weeks for tape out and production of A2 revision silicon, 2-3 weeks testing, and the usual 12 weeks for commercial production, die packaging ( die cutting, heatspreader attachment), binning/validation, card assembly/testing and card packaging...total time 28-30 weeks....32 weeks have elapsed since tapeout in late January and first shipments- which would indicate that GK 110 is both on schedule, and like the other Kepler GPU's required little revision from the initial A1 silicon ( I.e. no major revision, no base metal respin)
I have an associate at Nvidia; his version of events is that GK110 was pushed back after it was realised that GK104 would do fine. That trumps whatever it is you're conjecturing in your lengthy post.
Sounds as though your associate is placed closer to the custodial/janitorial end of the Nvidia chain of command.
GK 110 and GK 104 basically follow separate timelines. GK 110 has been first and foremost a math co-processor (ECC memory support, 1:3-rate double precision, wide memory bus) and has been in development (and contracts signed) for some time (examples: February 2011 and October 2011), while GK 104 is primarily a workstation/consumer GPU. GK 110 as a direct competitor to GK 104 (single or dual) is largely predicated on salvage GPUs lacking the functionality of the full 15 SMXs that the Tesla K20 would require. A consumer GK 110 (say, GTX 780) would simply be a way to harvest GPUs that cannot be validated for pro use (manufacturing defects and/or voltage requirements) - an afterthought that has the ability to gain PR/marketing points for Nvidia.

How many GK 104 cards would you have to sell to offset GK 110 contracts? An off-the-shelf Tesla K20 is listed at $3,199. HPC versions will be more expensive still. ORNL have orders totalling 14,592 units, NCSA's Blue Waters needs 3,000+, and each Aurora Tigon requires 256... and that doesn't take into account OEM workstations using Maximus, smaller clusters built by the same OEMs, or the large number of HPC installations that will likely upgrade previous-generation components.
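As a rough illustration of the money at stake (unit counts and list prices as cited above; this compares revenue only, ignoring margins and HPC pricing premiums, so treat it as a lower bound):

```python
# Revenue implied by just two of the GK 110 contracts named above.
K20_LIST_PRICE = 3199    # USD, off-the-shelf Tesla K20
GTX680_PRICE = 499       # USD, GK 104 flagship

orders = {"ORNL Titan": 14592, "NCSA Blue Waters": 3000}
tesla_revenue = K20_LIST_PRICE * sum(orders.values())
print(f"Tesla K20 revenue: ${tesla_revenue:,}")                   # ~$56.3M
print(f"GTX 680s to match: {tesla_revenue / GTX680_PRICE:,.0f}")  # ~112,800
```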
 
You're missing the point that the GK110 is post-decision to brand GK104 as high-end. Prior to that (when it was still GK100) it wasn't jury-rigged to be a Tesla part.
 
You're missing the point that the GK110 is post-decision to brand GK104 as high-end.
And what has that got to do with your supposition that GK 110 production was postponed because of how the GK 104 performed?
GK110 was pushed back after it was realised that GK104 would do fine.
GK 110 contracts had NOTHING to do with GK 104 - in any guise. GK 110 contracts exist solely for K20-based Tesla and have done for the best part of two years. Show me - and the rest of the forum - where GK 110/Tesla K20 was pushed back because the K10 (with its 1:24-rate double precision, only partial ECC support, and low bandwidth) is so superlative.
Name a GK 110 contract that, subsequent to GK 104's supposed stellar showing, caused the vendor to change to GK104-based Tesla K10s. The GK 104-based K10 isn't a replacement for the K20. The K10 is a moneymaker for number crunchers who have no need for double-precision workloads or ECC - i.e. uses outside of HPC, such as allying K10s with Quadros for workstation/visualization.
 
After the GTX 650 vs GTX 660 provided such a huge gap in performance, I was wondering if they were going to release a few cards in between to cover the performance scale. The GTX 650 Ti and GTX 660 SE were the two cards that came to mind.
 
Sigh, are you being deliberately obtuse, dividebyzero? Or being pedantic about codenames, or what, exactly? To make it as clear as possible: Nvidia had two chips in development, one mainstream, which became the GK104, and one flagship/HPC, which became the GK100 and later the GK110. After AMD's underwhelming competition the GK104 was instead branded as high-end, while GK100 was re-purposed as the GK110 and geared exclusively for the HPC market, with the GTX 690 now being considered "good enough" as the Kepler flagship.

If you think we've seen the last of GK1x0 as a desktop card then you're massively out of touch; it's simply been pushed back until it's needed, depending on what AMD comes up with. You only need to look at the TDP, thermals, dimensions, etc. of the GK104 for it to be blatantly obvious this wasn't originally intended as a top-end part like the GF110 was. You seem to be fairly oblivious to the fact that GF110 had ECC, better double-precision floating-point performance, etc., similarly to the GK1x0 (its natural successor).
 
After AMD's underwhelming competition the GK104 was instead branded as high-end, while GK100 was re-purposed as the GK110 and geared exclusively for the HPC market
GK 110 was always intended as a pro part geared for HPC. Forum fanboys were the only ones making noises about GK 110 launching as a desktop card alongside GK 104. Feel free to prove otherwise.
Your argument makes no sense if you consider the timeline:
The HD 7970 was reviewed, via leaks and official reviews, in December 2011, yet you would have people believe that Nvidia, upon realizing that their in-production GK 104 had enough horsepower to combat Tahiti, made an arbitrary decision to pull GK 110 as a desktop card - even though only four weeks elapsed between Tahiti's performance becoming known and GK 110 wafers being laid down.
Moreover, you expect people to believe that, prior to Tahiti's performance becoming known, Nvidia had decided upon launching GK 110 parts as desktop cards even though: 1. TSMC's 28nm capacity and yield were low, 2. Nvidia needs every available GK 110 die to fulfill HPC contracts, and 3. Nvidia would rather use GK 110 dies to sell $3K+ pro co-processors than $1K gaming cards.
with the GTX 690 now being considered "good enough" as the Kepler flagship
ePeen for the fanboys. The GK 110 parts will be likewise, to a certain extent. Nvidia probably don't even cover expenses on GTX 690 production (production run vs. R&D and extra binning); GK 110 would likely fall into the same category. Enough of a production run to keep the card included in review comparisons for PR and marketing... at least until existing pro contracts are fulfilled and the 28nm process becomes more mature.
If you think we've seen the last of GK1x0 as a desktop card then you're massively out of touch
Comprehension fail on your part.
Desktop cards based on high-end compute GPUs always eventuate. It simply becomes a matter of stockpiling GPUs that don't meet the binning process for the pro SKUs and/or excess inventory once the pro-market contracts have been filled... and I never said otherwise. I would also all but guarantee that desktop cards will not feature a fully functional die (2880 cores/15 SMXs).
You only need to look at the TDP, thermals, dimensions, etc. of the GK104 for it to be blatantly obvious this wasn't originally intended as a top-end part like the GF110 was
And I never said it was. My posting was concerned solely with the GK 110 and its timeline. You're the one who seems to think that the GK 110 has been shelved/delayed because of the GK 104 - which is blatantly untrue given the development of the part. You can bleat on ad infinitum about the GK 100 being re-jigged into the GK 110, but the fact remains that if the GK 100 existed at all*, it was cancelled long before AMD's Tahiti performance was known, and its cancellation could easily have been due to a change in architecture, lessons learnt from the process node, the need to add further compute functionality (Hyper-Q, for example), or any number of other variables.
You seem to be fairly oblivious to the fact that GF110 had ECC, better double-precision floating-point performance, etc., similarly to the GK1x0 (its natural successor).
Which has precisely nothing to do with GK 110, other than the fact that the GPU carries on the compute feature set that started with the G80. If you're trying to convince me that the GK 104 wasn't/isn't seen as a high-end compute card, then I think I've already covered that:
GK 110 and GK 104 basically follow separate timelines. GK 110 has been first and foremost a math co-processor (ECC memory support, 1:3-rate double precision, wide memory bus) and has been in development (and contracts signed) for some time (examples: February 2011 and October 2011), while GK 104 is primarily a workstation/consumer GPU.

Unless you can provide some supporting evidence (and no, random musings from the average Joe forum poster don't count), I'd suggest you take your trolling elsewhere.

* Show me an instance where "GK 100" was ever actually assigned to an Nvidia chip (PDF link or official slide is fine). People assumed that the big die was called GK100 because the series is GK1xx, and because the previous architecture was GF100.
 
TL;DR

I can see you're getting upset and quite desperately want to be right, so I'll leave it at that though. Like I said, I heard this from the horse's mouth, but believe whatever you like and rationalise it to yourself however you can, sweetheart.
 
Nice collection of articles Matthew, but I think Anand's "Making Sense of the Intel Haswell Transactional Synchronization eXtensions" would have been a good addition as well.
You might want to peruse Dave Kanter's article at RWT on the same, as well as his article on Intel's near-threshold (for transistor switching) voltage tech.
_________________________
...so I'll leave it at that though
I thought that might be the case. Our esteemed Guest posters usually have difficulty when asked to provide support for their claims.
Like I said, I heard this from the horse's mouth
Increasing your technical knowledge base via a domesticated animal...I think I just won a bet.
but believe whatever you like and rationalise it to yourself however you can, sweetheart.
Will do. My hit rate on graphics tech is usually pretty good, so I won't lose too much sleep over the permutations.
xoxoxo
cue inane response in 3....2....1...
 
The less I know about your bets involving domesticated animals the better, I think. Seems you have worse things to be losing sleep over than graphics tech. ;)
 
I have an associate at Nvidia.... If you think we've seen the last of GK1x0 as a desktop card then you're massively out of touch; it's simply been pushed back until it's needed
The latest update/rumour would tend to refute Guest's "insider knowledge". Colour me surprised...
GK110 reserved for HPC use. The GTX 780 (and presumably 770, 760 Ti) to use what appears to be a new GPU, probably modified from the GK 104. Likely an additional memory controller or two (320- or 384-bit) and an increased core count. Also likely that the new GPU won't simply be a couple of extra SMXs tacked onto the existing eight of the GK 104 (a 40 ROP/160 TMU part seems lopsided).
 