Nvidia said to be readying $899 GK110-based GeForce Titan card

Matthew DeCarlo

Nvidia's GK110 graphics chip

Ahead of the launch of AMD's Radeon HD 8000 series, which will presumably kick off with a single-GPU flagship leading the charge, Nvidia is reportedly hoping to steal some of its rival's thunder by releasing a new card that will slot between today's GeForce GTX 680 and the dual-GPU GTX 690.

According to several sources speaking with SweClockers, Nvidia's newcomer will arrive late next month for $899 as the GeForce Titan -- a neat reference to the Titan supercomputer built by Cray at the Oak Ridge National Laboratory, which comprises 18,688 nodes, each equipped with an Nvidia Tesla K20X GPU.

Instead of using the GTX 600 series' GK104 GPU, the Titan will be armed with the GK110, which powers Nvidia's enterprise-class Tesla range, though the Titan's chip will have at least one SMX unit disabled from the top configuration, leaving it with 2688 CUDA cores (still over a thousand more than the GTX 680 has).
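
For context, here's a rough sketch of where those core counts come from; the 192-cores-per-SMX figure and the GTX 680's 8-SMX/1536-core configuration are known Kepler specs, while the 14-SMX Titan count is just the rumor quoted above:

```python
# Rough sketch of the rumored core counts, assuming Kepler's 192 CUDA cores per SMX.
CORES_PER_SMX = 192

gk110_full = 15 * CORES_PER_SMX      # 2880 cores on a fully enabled GK110
titan_rumored = 14 * CORES_PER_SMX   # 2688 cores with one SMX disabled (as rumored)
gtx_680 = 8 * CORES_PER_SMX          # 1536 cores on the GK104-based GTX 680

print(f"Rumored Titan: {titan_rumored} cores, "
      f"{titan_rumored - gtx_680} more than the GTX 680's {gtx_680}")
```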

It's also said that the Titan will have a clock rate of 732MHz (200 to 300MHz lower than the GTX 680 and 690), while its 6GB of GDDR5 VRAM will run at 5.2GHz on a 384-bit bus (50% wider than the GK104 offers). All told, the card will supposedly be 15% slower and at least 10% cheaper than the GTX 690.
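
As a quick sanity check on the quoted memory figures (which, per the update below, are actually the Tesla K20X's), memory bandwidth follows directly from bus width and effective data rate; the GTX 680 numbers are its published specs:

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s.
def bandwidth_gbs(bus_bits: int, effective_gtps: float) -> float:
    return bus_bits / 8 * effective_gtps

print(f"Quoted 384-bit @ 5.2GHz effective: {bandwidth_gbs(384, 5.2):.1f} GB/s")   # ~250 GB/s
print(f"GTX 680, 256-bit @ 6.0GHz effective: {bandwidth_gbs(256, 6.0):.1f} GB/s") # ~192 GB/s
```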


Assuming those figures are accurate, such a price/performance gap would probably be justifiable when you consider that the GTX 690 has a 300W TDP, while the Titan should consume less given that the Tesla K20X is rated at 235W -- not to mention the reduced hassle of not dealing with an SLI-based card.

Update: As noted by dividebyzero in the comments, the specs struck above are for the Tesla K20X, while the GeForce Titan's configuration unfortunately hasn't been revealed yet. For what it's worth, the estimate that the new card will be roughly 15% slower than the GTX 690 still seems to be relevant.


 
Matthew, just a correction to the story:
It's also said that the Titan will have a clock rate of 732MHz (200 to 300MHz lower than the GTX 680 and 690), while its 6GB of GDDR5 VRAM will run at 5.2GHz
Those specifications are for the 235-watt Tesla K20X -- the original SweClockers story is just recounting the spec sheet, and doesn't speculate on a GeForce card's numbers. Probably safe to assume that a GeForce version would have less need for such a restrictive power limit when the PCI SIG allows for 300W. My guess: 850-900MHz, 5.8-6GHz (effective) memory, and a fully enabled die (but with the native 1/3 FP64 rate hobbled).
 
Thanks DBZ. I actually noticed that the specs lined up with the K20X's but I didn't think twice about it -- perhaps because Google's translation didn't make it particularly clear that they were strictly speaking about the Tesla product. Anyhow, I'll update the post :).
 
Why would someone want this over a 690? SuperBiiz is selling the 690 for $984.99 with free shipping. Personally I wouldn't buy either, for the obvious reason that by the time I can't play a game on ultra with my 680, a new graphics card will be out that beats the 690 at around the $450 price point -- saving $100 and getting a better card than the 690. But IF I were going to buy a card, I would rather spend the extra $85 for a 15% increase! Which is kind of a big difference considering the difference between a 580 and a 680 is about 30%. So on cost vs. performance the 690 would still be better. The lower wattage is nice though. 6GB is definitely overkill considering NO game even uses 4GB, and the only game I know of that uses close to 2GB is GTA IV; Hitman might as well, since it uses a lot of memory I believe.
 
Why would someone want this over a 690?
1. Single GPU -- less hassle with SLI scalability (or SLI profiles, for that matter).
2. Remember the crappy performance in AMD Gaming Evolved titles where compute functionality shows up the GK104-based Keplers? A GK110 wouldn't have those problems.
3. The 690 is still a 2GB-per-GPU card, so...
6GB is definitely overkill considering NO game even uses 4GB, and the only game I know of that uses close to 2GB is GTA IV; Hitman might as well, since it uses a lot of memory I believe.
It's not unreasonable to assume that someone dropping $900 on a graphics card might just want to use it for multi-monitor gaming. Here's the vRAM usage for BF3 (bear in mind that the GTX 680s are 2GB per card):
[Chart: BF3 vRAM usage across resolutions]

So cost vs. performance...
Rarely has any meaning with a halo-orientated part. You could say the GTX 690 has a better perf/$ metric, although I'd go out on a limb and say that the increased bandwidth and framebuffer of a GK110 part would come into play at higher resolutions and higher levels of in-game image quality. And if perf/$ is the overriding metric, then a couple of GTX 670/660 Tis are a better bet (from an Nvidia GPU PoV) than a 690 in any case.

[source for table]
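
To put rough numbers on that perf/$ point, here's a quick sketch using the figures floating around this thread -- the rumored $899 price, the "roughly 15% slower than a GTX 690" estimate, and the ~$985 street price quoted above. It only illustrates the metric; none of it is benchmark data:

```python
# Toy perf-per-dollar comparison using the rumored/quoted numbers in this thread.
cards = {
    "GTX 690 (street price)": {"price": 985, "relative_perf": 1.00},
    "GeForce Titan (rumored)": {"price": 899, "relative_perf": 0.85},  # ~15% slower
}

for name, card in cards.items():
    perf_per_dollar = card["relative_perf"] / card["price"] * 1000
    print(f"{name}: {perf_per_dollar:.2f} relative performance per $1,000")
```

On those numbers the 690 still comes out slightly ahead per dollar, which is consistent with the point above that perf/$ isn't what a halo part is sold on.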
 
Huh, that's interesting; my GTX 670 runs the Gaming Evolved titles (well, the two I've played) as well if not better, but maybe that's because I only game at 1600p. Honestly... not trying to be a smartass.
 
@dividebyzero
True, but according to some sites the beast is due about a year after the GTX 690 -- late Q2 2013, or around the same time as the HD 8000 series. I couldn't wait for this beast, so I bought the GTX 690 at launch; thank God I didn't hold my breath.
 
Huh, that's interesting; my GTX 670 runs the Gaming Evolved titles (well, the two I've played) as well if not better, but maybe that's because I only game at 1600p. Honestly... not trying to be a smartass.

Well, maybe "crappy" performance is an overstatement on my part, but just remember that AMD's-tailored-for-GCN DiRT:Showdown leverages compute for global illumination and advanced lighting calculations- and that's pretty much why this happens...

[Chart: DiRT: Showdown benchmark results]


Now, if you were AMD management, and you knew that your competitor had an Achilles heel in the form of compute-based image quality, what are the chances that you might exploit this in Gaming Evolved titles and every integrated benchmark from those games?
As far as Nvidia are concerned, launching the GK110 as a GeForce board -- even if it's overpriced (and it will be, to stop production being diverted from high-priced Tesla/Quadro parts) and in short supply -- ensures that 1. AMD doesn't clock up some easy pickings in the PR benchmark wars, and 2. the card that sits atop most charts won't be a Radeon -- unless it's a chart depicting performance per watt, dollar, or mm², three metrics that Nvidia couldn't care less about with this card, and three metrics AMD has been quick to downplay with Tahiti (and likely will again with Curacao).
 

Recommend me a card. I have two Asus GTX 560 Ti 2GB cards with a U2711. I'm complaining about the noise they make when I play FPS games, and the frame rates also aren't as fast at 2560x1440.
 
Recommend me a card...
Probably not a lot in the single-card market. Not even a 7970 GHz Edition is going to be a major step up. Depending on where you're shopping, I'd say that a couple of non-reference 7950s would be the best bet for bang-for-buck; scaling should generally be good. Single card/single GPU, you're probably limited to a 7970 for some sort of future-proofing. A non-reference, factory-OC'd card goes for around $350 (about the same as a GTX 670) in the U.S.
6GB? They are making cards for 4K TVs/monitors.
With a 384-bit memory bus, it's either 3GB or 6GB (without going asymmetrical), and 3GB doesn't really have the wow factor -- not when a slew of GTX 670/680s are now 4GB.
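
For anyone wondering why the options are specifically 3GB or 6GB, here's the rough arithmetic; the 2Gbit GDDR5 chip density is my assumption based on parts common at the time, not something stated in the thread:

```python
# Why a 384-bit bus implies 3GB or 6GB: GDDR5 chips use 32-bit interfaces, so the
# bus width fixes the chip count, and chip density (assumed 2Gbit) fixes capacity.
BUS_WIDTH_BITS = 384
CHIP_INTERFACE_BITS = 32
CHIP_DENSITY_GBIT = 2  # 2Gbit GDDR5 = 256MB per chip

chips = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS   # 12 chips
normal_gb = chips * CHIP_DENSITY_GBIT / 8       # 12 x 256MB = 3GB
clamshell_gb = 2 * normal_gb                    # two chips per channel = 6GB

print(f"{chips} chips -> {normal_gb:.0f}GB normally, {clamshell_gb:.0f}GB in clamshell mode")
```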
 
dividebyzero do you speak Swedish natively? (I'm not trying to be a smartass)
As I read the article originally a few days ago, my understanding was that the GeForce version would have one of the 15 SMX units disabled, resulting in 2688 CUDA "cores".
But now that I check the PDF you linked, that's the spec for the K20X card -- very intriguing!
And so, rereading the article, I must say I agree with you: the "fully fledged" GPU has 2880 cores, but Nvidia disabled one SMX unit in the K20X card for a total of 14 SMX units.
This raises the question of whether the consumer variant will have one further SMX unit disabled; I mean, they must have a lot of chips left over from K20X production that they could not use for die harvesting, since all GeForce cards are based on GK104 and not GK110.

P.S.: This article goes very well with this newspost: http://www.anandtech.com/show/6421/...-299k-amd-x86-cores-and-186k-nvidia-gpu-cores
And this one too: http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last
 
dividebyzero do you speak Swedish natively?
Nope, just a couple of online translators and a little extrapolation. The chances of Nvidia releasing a GeForce GK110 at the clocks of the passively cooled K20X approach 0%. The Tesla (and Quadro) parts are binned for long-term stability and GPU core voltage. Stability isn't as critical for GeForce cards, nor is the 225-235W envelope per GPU that most server rack units are specced for.
With more relaxed power constraints and less need to bin GPUs for low power, the GeForce card is seldom core-clocked lower than its Tesla/Quadro counterpart, and (almost?) never clocked lower on vRAM.
...the "fully fledged" GPU has 2880 cores but nVidia disabled one SMX unit in the K20X card for a total of 14 SMX units.This begs the question if the consumer variant will have one further SMX unit disabled, I mean they must have allot of chips leftover from the production of K20X cards that they could not use for die harvesting, since all Geforce cards are based on GK104 and not GK110
Three possible scenarios:
1. A full 15-SMX GK110 falls outside of the power envelope, in which case those dies could be binned for GeForce. There's also a possibility that the 15th SMX is built-in redundancy to mitigate any yield problems with a large die. Fully enabled GK110s could be seen as a bonus, which might fit in with the lack of GTX naming for the GeForce card.
2. The GeForce cards are further salvage parts, as you hypothesise. In that case, I'd also expect a 12-SMX part if yields are that bad. I'm also going to assume that Nvidia will launch a GK110-based Quadro that could well be a candidate for the 13- (and 12-?) SMX parts.
3. There are 100% functional GK110s, but they are being held back for an (as yet) unannounced model.

Without knowing yields, it's tough to extrapolate which is most likely, but given the probable die size and Nvidia's prioritizing of OEMs, they couldn't have been that bad. Initial shipments of the K20X started in early September, and at least 18,688 14-SMX parts (93% functional) were supplied in a two-month period, since ORNL's Cray Titan ran a Linpack benchmark in early November. If Nvidia can produce nearly 20K GK110s that are 87% (K20) and 93% (K20X) functional, what are the chances that there are some 100% functional units as well as 13-SMX parts? And if there are 15-SMX enabled GPUs, where are they going if not GeForce cards? Stockpiling for a future model seems pointless, since yields are generally going to improve with every tranche of wafers produced, which should keep the channel supplied. Much as Nvidia would like it not to be, there is a finite market for math co-processors -- and even if every GPGPU HPC cluster is replacing Fermi boards with Kepler, I'm not sure the market is that large.
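
As a back-of-the-envelope check on those figures (my own arithmetic, not from the original post), here's how the 87%/93% functional numbers and the implied supply rate fall out; the ~9-week window is an assumption based on the early-September to early-November dates above:

```python
# Back-of-the-envelope check: functional percentage per Tesla SKU, plus the
# implied K20X supply rate for ORNL's Titan over an assumed ~9-week window.
TOTAL_SMX = 15  # fully enabled GK110

for name, smx in [("K20", 13), ("K20X", 14)]:
    print(f"{name}: {smx}/{TOTAL_SMX} SMX enabled = {smx / TOTAL_SMX:.0%} functional")

boards, weeks = 18_688, 9  # early September to early November 2012
print(f"Implied supply rate: ~{boards / weeks:,.0f} K20X boards per week")
```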

Either way, we won't have long to wait. About five minutes after these boards go to OEMs for validation, we should start seeing benchmarks pop up on Chinese tech sites.
 
15% slower than a GTX 690, but it's a single GPU! That's powerful! Wohoo!
 
Powerful and only 235W -- I am sold, but the price is up there. I would pay $600-700, not $900.
 
Wow, 6GB of VRAM, and I thought my 4GB 680 was slight overkill even for my 2560x1600 display. Talk about a slap in the face to those who just bought a 690...

Why is it a slap in the face? The card is supposed to be 15% slower! Anyone with a 690 still has a faster video card and bragging rights!
 
Wow, 6GB of VRAM, and I thought my 4GB 680 was slight overkill even for my 2560x1600 display. Talk about a slap in the face to those who just bought a 690...

Why is it a slap in the face? The card is supposed to be 15% slower! Anyone with a 690 still has a faster video card and bragging rights!

Because it is a single GPU with 6GB of RAM, meaning fewer driver issues, no microstutter, and better compatibility with games. It's generally advised to get the single fastest GPU over two slower ones in SLI. For extreme resolutions, every bit of RAM counts in the long run too.
 
Agreed on the " driver issues, no microstutter, and better compatibility with games"

If you were a 690 owner, would you downgrade to this card to avoid these issues?

"For extreme resolutions, every bit of RAM counts in the long run too."

I agree with this to a certain extent, but there is no such thing as the long run with computers; a year or two after this card is released, there will be a GPU out that is faster and costs less.
 
:( Guess I will wait for Maxwell in 2014 at the price of this new card... unless this card goes down in price $$. My GTX 580 will have to hold on till then.
 
2. Remember the crappy performance in AMD Gaming Evolved titles where compute functionality shows up the GK104 based Keplers? A GK110 wouldn't have those problems.
I'd like more info on that, please; I'd love to read an article on it if possible.
 
Agreed on the " driver issues, no microstutter, and better compatibility with games"

If you were a 690 owner you would downgrade to this card to avoid these issues?

"For extreme resolutions, every bit of RAM counts in the long run too."

Agree with this to a certain extent there is no such thing as the long run with computers, a year or two after this card is released there will be a gpu out that is faster and for less money.
Don't forget, the highest SLI possible with a 690 is two cards. So go 4x this GPU and it will be ~240% faster than a single 690.
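
For what it's worth, the rough math behind that ~240% figure looks like this; it's a sketch assuming the rumored "roughly 15% slower than a GTX 690" estimate and ideal four-way scaling, which real SLI setups won't reach:

```python
# Sketch of the "~240% faster" estimate, assuming one Titan ~= 85% of a GTX 690
# (the rumored figure) and perfect scaling across four cards.
titan_vs_690 = 0.85
four_titans = 4 * titan_vs_690  # ideal 4-way scaling

print(f"Four Titans ~= {four_titans:.1f}x a single GTX 690, "
      f"or ~{four_titans - 1:.0%} faster")  # ~3.4x, i.e. ~240% faster
```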
 