Nvidia GeForce Titan: supercomputer GPU power for the 1%

Julio Franco

TechSpot Editor
Staff member
Cray's XK7 Titan supercomputer is powered by no fewer than 18,000 Nvidia Tesla K20X GPUs, which Nvidia proudly says contribute greatly to making Titan the world's fastest supercomputer. Today, the company is presenting a revised version of the graphics...

[newwindow="https://www.techspot.com/news/51680-nvidia-geforce-titan-supercomputer-gpu-power-for-the-1.html"]Read more[/newwindow]
 

ghasmanjr

TS Booster
Mother of God. Balls to next-gen consoles, I'm investing my cash into one of these monsters.
I was about to say "Mother of God". Let me finish it off with the pic: http://i0.kym-cdn.com/photos/images/original/000/199/693/disgusted-mother-of-god.png?1321272571
I'm not sure this will be worth it. I've been on the fence for a while between getting a 690 and selling my 680, or just buying another 680. I heard this was coming out, but I'm not buying this for $900 when the 690 is only an extra $100 (likely less after the annual price cuts). I'm passing on this one. I'll get my future three-screen setup powered by something more affordable.
 
  • Like
Reactions: Littleczr

Jad Chaar

Elite Techno Geek
Don't get too excited, y'all; people are saying that two 670s can get better performance for cheaper.
 

mevans336

TS Enthusiast
Don't get too excited, y'all; people are saying that two 670s can get better performance for cheaper.
Yes, but as someone who has two 660 Tis in SLI, dealing with SLI is a nightmare compared to a single card. As cmbjive pointed out, I wish I had waited. I would have dropped an extra $300 for this card, especially for the monster double-precision (FP64) performance.

Anandtech said:
To put all of this in perspective, on paper (and at base clocks), GTX 680 can offer just shy of 3.1 TFLOPS of FP32 performance, 128GTexels/second texturing throughput, and 32GPixels/second rendering throughput, driven by 192GB/sec of memory bandwidth.

Titan on the other hand can offer 4.5 TFLOPS of FP32 performance, 187GTexels/second texturing throughput, 40GPixels/second rendering throughput, and is driven by a 288GB/sec memory bus.

This gives Titan 46% more shading/compute and texturing performance, 25% more pixel throughput, and a full 50% more memory bandwidth than GTX 680.
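For anyone checking the quoted math, those percentages fall straight out of the spec numbers above. A quick sketch (my own, not from the thread; the FP32 figure lands at ~45% here only because the quoted TFLOPS values are rounded):

```python
# GTX 680 vs. Titan specs as quoted above (base clocks).
gtx680 = {"fp32_tflops": 3.1, "texture_gtexels": 128,
          "pixel_gpixels": 32, "bandwidth_gbs": 192}
titan = {"fp32_tflops": 4.5, "texture_gtexels": 187,
         "pixel_gpixels": 40, "bandwidth_gbs": 288}

def pct_gain(new, old):
    """Relative improvement of `new` over `old`, as a whole percentage."""
    return round((new / old - 1) * 100)

for spec in gtx680:
    print(f"{spec}: +{pct_gain(titan[spec], gtx680[spec])}%")
# texturing: +46%, pixels: +25%, bandwidth: +50%; FP32 comes out at
# +45% from these rounded figures vs. Anandtech's 46%.
```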
 

Skidmarksdeluxe

TS Evangelist
It's all very nice at the moment, but I can never imagine myself shelling out the kind of money this card will demand. I'd rather wait until Nvidia releases the GeForce 800 series, then purchase a high/mid-range card for a fraction of the price.
I can never quite get my head around someone willing to waste money on such a frivolous item (or two) that'll only carry bragging rights for about a year. As nice as this card is, it just won't be worth it. My opinion only.
 

TomSEA

TechSpot Chancellor
"I can never quite get my head around someone willing to waste money on such a frivolous item (or two) that'll only have bragging rights for about a year."

Mmmm...I couldn't care less about bragging rights. I just like to play games like Far Cry 3 and the new Bioshock at the highest settings on a 2560 x 1440 monitor without any hassles. Computer gaming is a major source of recreation for me, and just like anyone else who spends money on their form of recreation, I do the same.

My tax return is due and it comes up to a little over $1,000. I'm seriously considering getting one of these.
 

amstech

IT Overlord
I'm a big fan of single-GPU cards, so I like this one.
But unless you're running Eyefinity/3D Surround, or you want to max every game with one GPU at 1440p/1600p, this GPU could be overkill.
"I just like to play games like Far Cry 3 and the new Bioshock at the highest settings on a 2560 x 1440 monitor without any hassles. Computer gaming is a major source of recreation for me, and just like anyone else who spends money on their form of recreation, I do the same."
Bioshock and Far Cry 3? :lol: A single GTX 660Ti/7950 could run those games maxed at 1440p/1600p at 60FPS. Buying a GTX Titan for those games at 1440p would be a gigantic waste of money, like the fools who bought a 690 for 1080p.
 

TomSEA

TechSpot Chancellor
"A single GTX 660Ti/7950 could run those games maxed at 1440p/1600p at 60FPS."

Well smart guy, guess what? I'm running two GTX 660 Tis in SLI and have Far Cry 3 maxed out, and it sounds like a 737 engine taking off during some of the scenes. Plus there can be occasional choppiness. Try reading some of the benchmarks in the reviews before you mouth off about what a waste this would be.
 
  • Like
Reactions: Littleczr

amstech

IT Overlord
I'm running two GTX 660 Tis in SLI and have Far Cry 3 maxed out, and it sounds like a 737 engine taking off during some of the scenes.
You should have done more research on GPUs; it sounds like you made a poor choice. I love my Windforce 3X 670, overclocked to the hills and quiet as hell.

Try reading some of the benchmarks in the reviews before you mouth off about what a waste this would be.
Calm down, young padawan, you will learn in time.
 

TomSEA

TechSpot Chancellor
LOL...do some research? You really are a bright one, aren't you? I always go out and buy expensive cards without doing any research. The reviews showed that the GTX 660 Tis I own actually outperformed some of the GTX 670s at the time, and they were the highest-rated "bang for your buck" cards at the time. In fact, it's nothing more than a GTX 670 with a 192-bit bus. Read it and weep, Mr. Expert:

http://www.guru3d.com/articles_pages/msi_geforce_gtx_660_ti_power_editon_oc_review,1.html

http://benchmarkreviews.com/index.php?option=com_content&task=view&id=967&Itemid=72
 
  • Like
Reactions: veLa

slh28

TechSpot Paladin
With AMD seemingly not releasing anything for a while, the $999 price tag could be here to stay :(
 

bielius

TS Addict
So much hate here!

Well, I just managed to buy myself a nice 7950 and I'm really enjoying those FPSes. No upgrades for me for the next few years...
 
  • Like
Reactions: whiteandnerdy

Littleczr

TS Guru
"A single GTX 660Ti/7950 could run those games maxed at 1440p/1600p at 60FPS."

Well smart guy, guess what? I'm running two GTX 660 Tis in SLI and have Far Cry 3 maxed out, and it sounds like a 737 engine taking off during some of the scenes. Plus there can be occasional choppiness. Try reading some of the benchmarks in the reviews before you mouth off about what a waste this would be.
Yes to everything. I'm running two GTX 560 Tis at 2560x1440 and the cards get hot, and I agree it sounds like a jet engine taking off. I am also contemplating using my tax refund to buy one of these.
 
  • Like
Reactions: UNKNOWN9122

dividebyzero

trainee n00b
A quick roundup from around the interwebz:
PCGH measured power draw at 214 watts under load... it also seems pretty quiet.


A lot of overclocking headroom, if Hilbert's 1176 MHz core (a 40% overclock) is any indication. Since OC/boost is tied directly to the thermal envelope, the $999 card is likely to benefit from a $100-150 waterblock.
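That 40% figure checks out against stock clocks; a one-line sanity check (my assumption: the Titan's 837 MHz base clock, which isn't stated in the thread):

```python
# Assumed (not stated above): Titan's stock base clock is 837 MHz.
base_mhz, oc_mhz = 837, 1176
overclock_pct = (oc_mhz / base_mhz - 1) * 100
print(f"overclock: {overclock_pct:.1f}%")  # ~40.5%, matching the quoted 40%
```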
 

9Nails

TechSpot Paladin
Did Nvidia just announce a card without any silly numbers or letters? It's just "Titan", and that's all there is to the name?!

I must say: I'm quite proud of them for that!

I imagine at some lunch table they had this discussion and said, "It's the best card we make. There's nothing better, so we don't need to do all the silly letters and numbers and such. Just call it Titan and be done with it!"
 

Skidmarksdeluxe

TS Evangelist
"A single GTX 660Ti/7950 could run those games maxed at 1440p/1600p at 60FPS."

Well smart guy, guess what? I'm running two GTX 660 Tis in SLI and have Far Cry 3 maxed out, and it sounds like a 737 engine taking off during some of the scenes. Plus there can be occasional choppiness. Try reading some of the benchmarks in the reviews before you mouth off about what a waste this would be.
I dunno. To me it sounds more like a DC-10 landing, but it could be the reverse thrust that's fooling me.
 

soldier1969

TS Booster
This card alone already embarrasses the next gen of consoles a year away from release, lol. Time to upgrade my two 3GB 580s to one of these bad boys. I game at 2560 x 1600.
 
G

Guest

Time to upgrade my Nvidia GF110 GPU to a GK110 GPU! ;)

The real Fermi architecture

GF110 GPU

The real Kepler architecture

GK110 GPU

Everything else is just cut down!