Nvidia quietly launches Titan Black graphics card, rolls out new GeForce drivers

February 18, 2014, 2:45 PM

Nvidia’s flagship Titan graphics card quietly received a refresh today with the launch of the Titan Black. I say quietly because Nvidia didn’t even publish a press release on the matter (a blog post got the job done), instead electing to focus its attention on the new Maxwell architecture.

Nevertheless, the new Titan Black is here and it features modest updates across the board (no pun intended). The core clock is up from 837MHz to 889MHz, the boost clock is now 980MHz versus 876MHz on the original Titan, and the 6GB of 384-bit GDDR5 memory now operates at 7GHz effective instead of 6GHz. The stream processor count is a tad higher too, at 2880 versus 2688 last year.
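For what that memory bump means in practice, the peak bandwidth gain is easy to work out (a quick back-of-the-envelope sketch: bandwidth = bus width in bytes × effective data rate):

```python
# Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * effective rate
def peak_bandwidth_gbs(bus_width_bits: int, effective_rate_ghz: float) -> float:
    return (bus_width_bits / 8) * effective_rate_ghz

original_titan = peak_bandwidth_gbs(384, 6.0)  # 288.0 GB/s
titan_black = peak_bandwidth_gbs(384, 7.0)     # 336.0 GB/s
print(original_titan, titan_black)
```

So the faster memory alone lifts peak bandwidth from 288GB/s to 336GB/s, roughly a 17 percent gain.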

The card is available starting today for $999, the same price the original Titan launched at.

In related news, Nvidia also rolled out new GPU drivers. Version 334.89 WHQL is said to boost performance by as much as 19 percent in F1 2013, up to 18 percent in Sleeping Dogs, up to 16 percent in Hitman: Absolution and up to 15 percent in Company of Heroes 2.

The new drivers also include new SLI profiles for Assassin's Creed Liberation HD, Assassin's Creed: Freedom Cry, Deus Ex: Human Revolution Director's Cut and The Crew. Additionally, Nvidia has included new 3D Vision profiles for Shadow Warrior, The Stanley Parable, Walking Dead 2, World Rally Championship 4, LEGO Marvel Super Heroes, and Far Cry 3 Blood Dragon.

To download just choose the drivers for your platform:




User Comments: 46

GhostRyder GhostRyder said:

Well I lost my bet on the price...

Kinda overpriced for what it is; in reality you might as well just stick with the 780 Ti, since 3GB is more than enough for even the higher end of the spectrum. Maybe we will see some 4GB 780 Ti cards in the future from EVGA or the like that will be priced better.

Kinda surprised they just slipped this card into the market with nothing even being talked about.

Truthfully I'm waiting to see what the supposed 790 is going to be (780 X2 or 780 Ti X2), because since 290Xs are still on the rise for miners, it might be a good idea to just sell them off and grab a 790 or two, or a pair/trio of 780 Tis.

JC713 JC713 said:

I wonder how OCable the card is now with the boosted clocks. I imagine the 8+6 pin will bottleneck it. @dividebyzero any ideas?

GhostRyder GhostRyder said:

I wonder how OCable the card is now with the boosted clocks. I imagine the 8+6 pin will bottleneck it.

It's just a 780 Ti with 3GB more VRAM on it, essentially, minus a few alterations that make it the Titan. 1200MHz will probably still be around the normal peak.

JC713 JC713 said:

It's just a 780 Ti with 3GB more VRAM on it, essentially, minus a few alterations that make it the Titan. 1200MHz will probably still be around the normal peak.

True.

Guest said:

The Titan is the most gorgeous card I have ever seen to date. Pure class IMO

BlueDrake said:

The Titan is the most gorgeous card I have ever seen to date. Pure class IMO

Funny to say that, given that a lot of the 700 series share the design. Obviously various 700 series cards from third-party vendors will have OCs and their own unique cooling designs, but not all.

2 people like this | dividebyzero dividebyzero, trainee n00b, said:

Maybe we will see some 4GB 780 Ti cards in the future from EVGA or the like that will be priced better.

You won't ever see a 780 Ti with an asymmetrical memory configuration. Clamshell mode using 16-bit I/O on the backside memory ICs is too low.

And no, there won't be any 6GB 780 Tis either. The larger framebuffer is one of the main segmentation factors. It is also the reason that you won't see a 12GB Titan - 12GB is reserved for Tesla.

Kinda surprised they just slipped this card into the market with nothing even being talked about.

Not really. The card isn't aimed at gamers - at least not 99.99% of them. If you want/need a 6GB framebuffer then you're likely using the machine for visualization (Blender etc.), and people using pro cards are well aware of the release schedule, as well as frequenting their own specialist forums.

I wonder how OCable the card is now with the boosted clocks. I imagine the 8+6 pin will bottleneck it. @dividebyzero any ideas?

8+6 pin doesn't really bottleneck the 780 Ti unless you're looking at extreme overclocking. The bigger drawback would be the reference cooler causing downclocking - although that is fairly minor in relation to some other reference blower/shroud designs. Much depends upon what the actual boost clock is (it is usually somewhat higher than the minimum advertised).

As for overclock potential - maybe slightly less than the 780 Ti, since the extra 3GB of vRAM will take 15-20W of the power budget. You're probably looking at about 1100-1150MHz if staying within the Nvidia voltage guideline.

As an idea of what is achievable, Hardware.info clocked FOUR Titan Blacks to 1089MHz (base), 1180MHz (boost), and 7200MHz (effective memory). Here's their quad-SLI Titan Black review.

amstech amstech, TechSpot Enthusiast, said:

No one admits more incompetence than the person who talks about the Titan being overpriced due solely to its gaming performance.

GhostRyder GhostRyder said:

You won't ever see a 780 Ti with an asymmetrical memory configuration. Clamshell mode using 16-bit I/O on the backside memory IC's is too low.

And no, there won't be any 6GB 780 Ti's either. The larger framebuffer is one of the main segmentation factors. It is also the reason that you wont see a 12GB Titan - since 12GB is reserved for Tesla.

You say that, but that's only an assumption; unless Nvidia themselves force a no, the choice is up to the vendor to release such a card. We have already seen cards with double the memory.

No one admits more incompetence than the person who talks about the Titan being overpriced due solely to its gaming performance.

No one admits more fanboyism than someone justifying 100 bucks per extra gigabyte of GDDR5 RAM and an increased single precision and double precision performance

1 person liked this | dividebyzero dividebyzero, trainee n00b, said:

You say that, but that's only an assumption; unless Nvidia themselves force a no, the choice is up to the vendor to release such a card. We have already seen cards with double the memory.

Well no, it isn't. Both Nvidia and EVGA's Jacob Freeman have already made it pretty clear. Seems like every five minutes someone asks the question and we go through the whole 6GB unicorn rumourfest again. [link] - some assumed the new SKU would be a 780 Ti variant, and I can see how some people might not want to let it go.

EVGA's product manager, Jacob Freeman, commenting on one of the many "Will it happen?" threads (in this case some mocked-up box art concept):

...there are no plans to ever bring it to market. In fact it never existed in an actual product form

Noone admits more fanboism than someone justifying 100 bucks per extra gigabyte of GDDR5 ram, increased double single precision and double precision performance

Maybe you should head over to the Blenderartists, VRAY, MentalRay, and Autodesk forums and tell them they don't know what they're doing (Bear in mind the newer Blender builds are further optimized for Titan/Titan Black so don't base your admonishment on the old builds). Titan is pushed pretty hard as a CUDA dev tool as well which ties in with the interest in Blender, VRAY etc.

As you should be able to see, the Titan's framebuffer and FP64 rate give a considerable speed-up over the 780 Ti.

and a change to 1/3 FP32.

FP32 is full rate (1:1), just as it is for every GPU made in the last nine or so years. I believe the ratio you're looking for is 1:3 for FP64.
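To put rough numbers on that ratio, here's a quick sketch using the core count and base clock from the article (the 2 ops/clock FMA factor is the usual convention; actual throughput depends on boost behaviour):

```python
# Theoretical FP32 throughput: cores * 2 ops/clock (fused multiply-add) * clock
def fp32_gflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz

fp32 = fp32_gflops(2880, 0.889)  # ~5121 GFLOPS at the 889MHz base clock
fp64 = fp32 / 3                  # 1:3 FP64 rate -> ~1707 GFLOPS
print(round(fp32), round(fp64))
```

That lines up with the roughly 5.1 TFLOPS single precision / 1.7 TFLOPS double precision figures Nvidia quotes for the Titan Black, versus the 780 Ti's default 1:24 FP64 rate.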

GhostRyder GhostRyder said:

Well no, it isn't. Both Nvidia and EVGA's Jacob Freeman have already made it pretty clear. Seems like every five minutes someone asks the question and we go through the whole 6GB unicorn rumourfest again. [link] - some assumed the new SKU would be a 780 Ti variant, and I can see how some people might not want to let it go.

OK, that's one; now what about MSI, Gigabyte, PNY, and Asus, all of which have done double-RAM variants of popular Nvidia cards? So unless Nvidia puts its foot down and says no, they have just as much chance as anyone.

As you should be able to see, the Titan's framebuffer and FP64 rate give a considerable speed-up over the 780 Ti.

Yes, in a few CUDA development tools, as I mentioned above; that does not justify the price either way. A few changes/add-ons to the same card, all for a nominal 300 extra dollars. It's just a Boss Hogg-style scam...

FP32 is full rate (1:1), just as it is for every GPU made in the last nine or so years. I believe the ratio you're looking for is 1:3 for FP64.

I edited that over an hour before you posted; I noticed the typo...

Guest said:

Dear techspot, we demand a review about Titan Black as soon as possible.. please hear our plea :)

1 person liked this | dividebyzero dividebyzero, trainee n00b, said:

OK, that's one; now what about MSI, Gigabyte, PNY, and Asus, all of which have done double-RAM variants of popular Nvidia cards?

Name one company that has a 6GB 780, let alone a 6GB 780 Ti? There isn't one. There won't be one unless AMD allows vendor 8GB 290Xs and Nvidia changes its stance. What the AIBs want is immaterial; they will abide by Nvidia's edict to protect the market - a lucrative market MSI, PNY, and Asus also benefit from.

And BTW, why would any vendor destroy their own Titan/Titan Black markets? They wouldn't.

Show me one 6GB GTX 780 or 780 Ti, or any verified vendor statement saying that one is due for release. You're arguing for the sake of it. The 780 has been in the market for 10 months - not one single 6GB card while the Titan remains in the product stack. That should be a clue.

Yes, in a few CUDA development tools, as I mentioned above; that does not justify the price either way.

As I said, maybe you should go tell all the visualization forums all about it. I'm pretty certain that they would come to the same conclusion that I have- namely that you don't have a clue what you're talking about.

VitalyT VitalyT said:

I wouldn't spend that much money on a video card at this point, because DisplayPort 1.3 is due to be released within 2-3 months, which will open a significant new chapter for video cards with proper support for 4K and up to 8K resolutions. Expect corresponding updates from both Nvidia and AMD to use the new protocol and to bump up video memory further, possibly doubling it on the high-end cards.

GhostRyder GhostRyder said:

Name one company that has a 6GB 780, let alone a 6GB 780 Ti? There isn't one. There won't be one unless AMD allows vendor 8GB 290Xs and Nvidia changes its stance. What the AIBs want is immaterial; they will abide by Nvidia's edict to protect the market - a lucrative market MSI, PNY, and Asus also benefit from.

And BTW, why would any vendor destroy their own Titan/Titan Black markets? They wouldn't.

Show me one 6GB GTX 780 or 780 Ti, or any verified vendor statement saying that one is due for release. You're arguing for the sake of it. The 780 has been in the market for 10 months - not one single 6GB card while the Titan remains in the product stack. That should be a clue.

Has every vendor said that they are not creating one? Nope, therefore it's more than possible, just like how PowerColor made the Devil 13 dual-GPU card way before the 7990 (which was not announced till late), or how there's a 6GB 7970 card. Plenty of opportunity for one unless Nvidia firmly says no and makes the companies not do it.

As I said, maybe you should go tell all the visualization forums all about it. I'm pretty certain that they would come to the same conclusion that I have- namely that you don't have a clue what you're talking about.

How about you get a clue? You don't know what you're talking about. Defending a $1k card because they add an extra set of RAM and adjust the card/add a feature or two into the mix - which apparently takes you over an hour of digging for information to justify. Pointing to things like CUDA and Blender and saying that justifies the price hike - because Nvidia can repurpose the same GPU with a few changes in feature set and value it at $1k - is just trying to justify spending the money.

Also, I'm apparently not the only one who shares the viewpoint that the Titans and Titan Black are just a joke for extra cash. I can find an extra use for anything. Should we talk about how the half-priced AMD card is almost 3/5 as good at many of the features you claim, even the non-CUDA ones in your post? Or should we talk about how you can mine twice as much with a 290X as with a 780 Ti - would that justify it being 150 bucks more? No, it doesn't. I guess AMD should just throw 4 more gigabytes of RAM on a card, change the double and single precision up, then charge an extra 300 bucks, because apparently that's justifiable. You can spam extra feature sets all you want; it does not justify the price of a card called the GTX Titan Black.

1 person liked this | dividebyzero dividebyzero, trainee n00b, said:

I wouldn't spend that much money on a video card at this point, because DisplayPort 1.3 is due to be released within 2-3 months, which will open a significant new chapter for video cards with proper support for 4K and up to 8K resolutions. Expect corresponding updates from both Nvidia and AMD to use the new protocol and to bump up video memory further, possibly doubling it on the high-end cards.

It's old tech in any case. All Maxwell GPUs - including the just-released GTX 750/750 Ti - feature Dynamic Parallelism and Hyper-Q... two features previously reserved for GK110 only.

Plus, if you follow the breadcrumbs...

Exhibit A: Nvidia announced at the official launch that GM106 (28nm) follows shortly, followed by GM206/204 - likely replacements for the GTX 780/770/760.

Exhibit B: This article regarding TSMC (http://focustaiwan.tw/news/atod/201312050035.aspx) - including this titbit:

A senior TSMC executive revealed recently that the company will begin 20nm production in the first quarter of 2014, contributing to the company's revenue in the following quarter.

Industry sources said TSMC's 20nm production capacity has been booked up with orders from industry giants including Apple Inc., Qualcomm Inc., Xilinx, Altera, Supermicro, NVIDIA, MediaTek and Broadcom Corp.

Which gels with the rumour mill talk that this present generation of cards might not last long in the spotlight.

Has every vendor said that they are not creating one? Nope, therefore it's more than possible, just like how PowerColor made the Devil 13 dual-GPU card way before the 7990....

Nvidia isn't AMD.

Defending a 1k card because they add an extra set of ram and adjust the card/add a feature or two in the mix which apparently takes you over an hour to find information to justify

Whose "defending" and "justifying"? I am pointing out that people buy the card. A simple browse of the render forums shows that this is the case. Since this seems beyond you, here's a forum members Octane renderer

As usual you're (badly) interpreting what I've said rather than reading what I've said....as usual...getting all upset and defensive...as usual

I can find an extra use for anything, should we talk about how the half priced AMD card is almost 3/5 as good at many of the features...[rant snipped]

It might have slipped by you, but this is a thread about an Nvidia card- specifically the GTX Titan Black. Feel free to post in the appropriate thread all you know about AMD cards - I'll give it all the due consideration it deserves...assuming I have a spare 90 seconds and any interest.

GhostRyder GhostRyder said:

Which gels with the rumour mill talk that this present generation of cards might not last long in the spotlight.

Which would point even more to the fact that the Titan Black is not worth near that much, especially if the current gen gets phased out very soon.

Nvidia isn't AMD.

Really would not have guessed...Just because they are different companies does not mean the practices always have to be different.

Whose "defending" and "justifying"? I am pointing out that people buy the card. A simple browse of the render forums shows that this is the case. Since this seems beyond you, here's a forum members Octane renderer

As usual you're (badly) interpreting what I've said rather than reading what I've said....as usual...getting all upset and defensive...as usual

As per usual you're trying to shove your opinions down people's throats, which is a very common thing you do. You also have a tendency, when someone disagrees with you, to start referring to them as lesser beings. You're saying the card's price is justified because of "insert reason here", which is defending the product.

Anyone can justify the price of a card by saying it has "insert certain feature with cool name" and throwing a different price tag on it. The R9 290X can mine great and excels at OpenCL; well, by the logic here we should slap a higher price tag on it because certain features make it worth that much more. How about an extra 300 dollars to cover more RAM and those "extra" features, and let's change some of the priorities? No - having said features is part of it, and saying it's worth more would not justify AMD going "well, we should charge an extra 300 bucks because it's better at this set of features". The fact that the Titan exists with the GTX branding is to line wallets. It's a hybrid card; it's not directly the GTX side (well, actually it mostly is) and it's not fully the workstation side.

It's an attempt to fool gamers who don't spend time looking up gaming benchmarks, or who don't look at enough of them, into thinking that this card is the best thing ever invented. The extra features are just tacked on to make it appeal to the other end of the spectrum for more uses and to justify the price; it sure as heck does not cost them much more to make than the 780 or 780 Ti.

Many people share my opinion, and an opinion is an opinion.

1 person liked this | Skidmarksdeluxe Skidmarksdeluxe said:

When all is said and done I think you've gotta have rocks in your head to even consider blowing $1000 on a card just to play games. I could find a lot more useful things to waste that kind of moolah on.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

As per usual you're trying to shove your opinions down people's throats, which is a very common thing you do.
A graphics card being worth more because it does more is not an opinion.

When all is said and done I think you've gotta have rocks in your head to even consider blowing $1000 on a card just to play games.
That's just it, the Titan is not a gaming card. DBZ has been trying to tell everyone this every time he posts about the Titan. He has commented several times that the Titan is a computing card. It may not be worth shit with Bitcoins, but that is not the only computing project that needs processing power.

dividebyzero dividebyzero, trainee n00b, said:

You're saying the card's price is justified because of "insert reason here", which is defending the product.

Nope. You got it wrong again. Quelle surprise.

I haven't said anything about the card's price. Not. One. Thing.

Neither am I defending the product. Comprehension fail on your part.

What I stated is that people who have a use for the card, buy the card. It is actually a rather simple concept.

Bleating on about feature sets, price, or what AMD is doing doesn't change the fact that the Titan finds a home with people who have a use for it. Argue as much as you want, but companies don't put out hardware with the expectation that it won't sell.

I have 4500+ posts here, and I have not criticised a single piece of hardware chosen by anyone here - namely, because everyone's needs are individual to them: application usage, pricing, availability, personal preference, resell, ego, ad nauseam.

You on the other hand continue to post in a forum thread regarding a product you don't like and won't consider buying - and judge any potential buyer into the bargain.

That is rather sad.

It may not be worth **** with Bitcoins, but that is not the only computing projects that need processing power.

It will never be a "go to" for mining, but cudaMiner is making some progress. Reducing the cores per compute unit is definitely helping in that regard, as the 750 shows.

(ExtremeTech review)

GhostRyder GhostRyder said:

Nope. You got it wrong again. Quelle surprise.

I haven't said anything about the card's price. Not. One. Thing.

Neither am I defending the product. Comprehension fail on your part.

What I stated is that people who have a use for the card, buy the card. It is actually a rather simple concept.

Bleating on about feature sets, price, or what AMD is doing doesn't change the fact that the Titan finds a home with people who have a use for it. Argue as much as you want, but companies don't put out hardware with the expectation that it won't sell.

I have 4500+ posts here, and I have not criticised a single piece of hardware chosen by anyone here - namely, because everyone's needs are individual to them: application usage, pricing, availability, personal preference, resell, ego, ad nauseam.

You on the other hand continue to post in a forum thread regarding a product you don't like and won't consider buying - and judge any potential buyer into the bargain.

That is rather sad.

No, the only sad thing is that you're the first one to start an argument every time something does not follow your beliefs. I said I don't think the price is justified, and I am entitled to that belief. The saddest person here is the one who starts every argument because I don't share your biased opinion.

People will justify buying anything, that's why there are games costing 50k and cars for 300k or more. Whether or not it's worth it is an opinion.

P.S. Don't lie, you have critiqued almost every AMD card release, including the 7990.

dividebyzero dividebyzero, trainee n00b, said:

P.S. Don't lie, you have critiqued almost every AMD card release, including the 7990.

Do you have some kind of learning disability?

I have 4500+ posts here, and I have not criticised a single piece of hardware chosen by anyone here

Who's talking about the hardware? I was referring to the user base.

It's an attempt to fool gamers who don't spend time looking up gaming benchmarks, or who don't look at enough of them, into thinking that this card is the best thing ever invented.

Aside from the obvious issue that most people dropping a $1000 on a graphics card probably know full well what they're in for, there is also the fact that many people don't even use the card for gaming. I'd actually dare to say that the average GTX Titan buyer might be somewhat more knowledgeable than most given the professional uses the card is put to.

But yeah, carry on misreading and yakking.

captaincranky captaincranky, TechSpot Addict, said:

Since my best VGA at the moment, is Intel's HD-4000 IGP in my Core-i3, I wouldn't even be allowed to troll in this thread, would I......? :oops:

dividebyzero dividebyzero, trainee n00b, said:

Since my best VGA at the moment, is Intel's HD-4000 IGP in my Core-i3, I wouldn't even be allowed to troll in this thread, would I......? :oops:

The bar has been set pretty low, I think you're safe!

At least your posting wouldn't cause me to wince at the grammar, punctuation, and logic.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

I hope my participation doesn't, cause you much pain.

dividebyzero dividebyzero, trainee n00b, said:

I hope my participation doesn't, cause you much pain.

Luckily, your "Incoherent Rambling" quotient is commendably negligible

Guest said:

IMHO, if you've got $1000 to spend on a video card (for gaming), add some more cash and pick two 780s.

As for me, compared with a Quadro 6000, the $1k Titan Black could be a solution for people (with a limited budget) who need that large memory and can utilize its CUDA cores.

Skidmarksdeluxe Skidmarksdeluxe said:

A graphics card being worth more because it does more is not an opinion.

That's just it, the Titan is not a gaming card. DBZ has been trying to tell everyone this every time he posts about the Titan. He has commented several times that the Titan is a computing card. It may not be worth **** with Bitcoins, but that is not the only computing project that needs processing power.

Nevertheless, it'll mainly be used as a gaming card, but that's the purchaser's prerogative. I don't know what all the fuss is about whether it's a computing card or a gaming card; it's just too damn expensive to warrant purchasing at any rate.

GhostRyder GhostRyder said:

Since my best VGA at the moment, is Intel's HD-4000 IGP in my Core-i3, I wouldn't even be allowed to troll in this thread, would I......? :oops:

Don't worry as long as you don't share an opinion you should be safe from Mr. Opinion Killer

Nevertheless, it'll mainly be used as a gaming card, but that's the purchaser's prerogative. I don't know what all the fuss is about whether it's a computing card or a gaming card; it's just too damn expensive to warrant purchasing at any rate.

Thank you!

Do you have some kind of learning disability?

Do you? Because clearly sharing an opinion warrants three paragraphs and pages of insults on a tech forum.

Whose talking about the hardware? I was referring to the user base

Bad punctuation; by your logic I must have made you distraught. I am so sorry for that!

Aside from the obvious issue that most people dropping a $1000 on a graphics card probably know full well what they're in for, there is also the fact that many people don't even use the card for gaming. I'd actually dare to say that the average GTX Titan buyer might be somewhat more knowledgeable than most given the professional uses the card is put to.

Did I go onto said forums and tell people they are inferior and foolish for buying such a card? No, I only shared an opinion that it is too expensive to warrant a purchase in my eyes, even with a few different features. Whether or not they want to spend the money is up to them; I won't put a lock on someone's pocketbook. But I do have a problem with things like this because, like I stated earlier, it's falsely labeling this a "pure gaming card".

I hope my participation doesn't, cause you much pain.

Oh relax @cliffordcooley, we're all friends here.

dividebyzero dividebyzero, trainee n00b, said:

IMHO, if you got $1000 to spent for vga card (for gaming), add some more cash then pick two 780s.

As for me, compared with Quadro 6000, Titan black 1k card could be a solution to people (with limited budget) who need that large memory and utilize its CUDA cores

That's pretty much the rationale behind the sales. If you're into gaming then a cheaper 780 is the way to go. If you do some pro work + some gaming then you might buy a Titan (or a midrange card + a Quadro). If your workload is pro orientated then you probably buy more than one Titan (and maybe a Quadro for Viewport). When you consider how many cards go into render rigs, individual sales to benchmarkers and gamers start to look fairly insignificant.

(Tyan FT72B 4U w/ 8 GTX Titans. Workload: Octane).

Eight Titans work out 20% cheaper than two Quadro K6000s.
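That figure checks out against list prices (a quick sanity check; the ~$5,000 per K6000 is an assumed street price, not a number from this thread):

```python
titan_rig = 8 * 999        # eight Titan Blacks at $999 MSRP
quadro_pair = 2 * 5000     # two Quadro K6000s (assumed ~$5,000 each)
saving = 1 - titan_rig / quadro_pair
print(f"${titan_rig} vs ${quadro_pair}: {saving:.0%} cheaper")
```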

1 person liked this | amstech amstech, TechSpot Enthusiast, said:

Graham, I am sick of your respectful informative posts that make perfect sense. Stop, or I will start ripping you for being an Nvidia fan boy.

God damnit, I am the only one allowed to be a fanboy on this site and get away with it.

:p

Ahhh hahahaha

dividebyzero dividebyzero, trainee n00b, said:

Graham, I am sick of your respectful informative posts that make perfect sense. Stop, or I will start ripping you for being an Nvidia fan boy.

God damnit, I am the only one allowed to be a fanboy on this site and get away with it.

:p

Ahhh hahahaha

I'd never go against the word of "Bob" Dobbs !

1 person liked this | theBest11778 theBest11778 said:

Simply put, the Titan is meant for hobbyists, enthusiasts, and small businesses. Hobbyists who like to dabble in 3D rendering will find $1000 a steal. Enthusiasts don't care about money; they just want the biggest, fastest thing there is regardless of cost. Small businesses can take advantage of decent hardware at a much cheaper price compared to the Quadro line. Nvidia's not dumb, nor is Intel with their $1000 Extreme CPUs. There's a market for it - why not take advantage? If the original Titan didn't sell, this one wouldn't exist. Business 101.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Business 101.
Precisely!

I know I'm not gonna spend 1K on the card, but then I don't have a use for it either. I also know I'm not going to belittle those who do have a use, whether it be professional or hobbyist oriented. If someone is willing to spend 100K for a rare Nintendo cartridge, why would I frown upon someone spending 1K for a card they may actually be able to make a living with?

dividebyzero dividebyzero, trainee n00b, said:

Simply put, the Titan is meant for hobbyists, enthusiasts, and small businesses. Hobbyists who like to dabble in 3D rendering will find $1000 a steal.

Sometimes it just comes down to checking all the boxes. Nvidia knows that professional workstation/co-processor cards are already designed around a hardware standard. Existing rackmount units are constrained by power supply limits (2 x 1200-1350W) for full-height add-in cards, so the maximum number of GPUs is limited by the 1800-2000W budget as well as by the (up to) eight PCI-E slots afforded by the motherboard. It's no real surprise that the Titan has pretty much the same power draw specification as any top-tier pro card, whether it be Quadro, Tesla, or FirePro.

No need to shell out big bucks for pro cards when some applications (most CG render engines) don't use double precision, and if you're filling server racks with 8, 16, 24, 32+ etc. boards at a time to build a render farm, you're not only saving cash, but getting the maximum workload return whilst staying within the form factor.

GhostRyder GhostRyder said:

Quoted from Us.Hardware.info

"In addition to gaming benchmarks we also ran various professional GPGPU benchmarks on a single GTX Titan Black and compared the results with the original Titan's, the GeForce GTX 780 Ti and the Radeon R9 290X. Lest we forget: this is the purpose this card was primarily designed for."

GPGPU Applications Benchmarks

GPGPU Benchmarks, Double Precision

The benchmarks look abysmal...

Another quote:

"However, these days the first and only real purpose to buying a Titan is GPGPU, either because you want to run specific, professional Cuda applications or because of a particular need for FP64 precision. In that case, Titans are an affordable alternative to Tesla cards. Speaking of affordable alternative: in regular GPGPU applications, AMD's Radeon R9 290X tends to deliver better performance at a significantly lower price than the Titan. Our findings are confirmed by the popularity of AMD's offering among Litecoin miners and the likes."

[link]

"There is a handful of CUDA-optimized titles where the Titan does really well. Blender, 3ds Max, and Octane all show Nvidia's single-GPU flagship with a commanding lead over the GeForce GTX 680 and prior-gen 580. Bottom line: if you're thinking about using a Titan for rendering, check the application you're using first."

It will be interesting to see how much of an improvement there is in these areas, of course.

On a different note, the new GeForce driver is great; I got some nice performance gains in Far Cry for whatever reason. It's much more stable on my laptop, getting around 60FPS at medium settings.

2 people like this | dividebyzero dividebyzero, trainee n00b, said:

Quoted from Us.Hardware.info....[snip]

Well, I thought it was common knowledge that Nvidia's OpenCL implementation was lacking.

But as Tom's Hardware noted (the link you've attributed to Anandtech)

Quoted from....[snip]

[link]

"There is a handful of CUDA-optimized titles where the Titan does really well. Blender, 3ds Max, and Octane all show Nvidia's single-GPU flagship with a commanding lead over the GeForce GTX 680 and prior-gen 580. Bottom line: if you're thinking about using a Titan for rendering, check the application you're using first."

Octane (see previous posts), Blender (see previous posts), V-Ray (see previous posts), and iRay are all optimized for CUDA, and are the pre-eminent CG renderers. AMD obviously doesn't do CUDA, which is why AMD cards aren't included in the benchmark comparisons for those render engines.

So...

Even though on paper the performance of some of the high end AMD GPU's such as the Radeon HD 7990 match the NVIDIA GTX Titan, software support for OpenCL GPU accelerated rendering is limited. As a result, tests have shown slow performance compared to Nvidia CUDA technology

From the Blender FAQ

Currently NVidia with CUDA is rendering faster. There is no fundamental reason why this should be so---we don't use any CUDA-specific features---but the compiler appears to be more mature, and can better support big kernels. OpenCL support is still being worked on and has not been optimized as much, because we haven't had the full kernel working yet

and...

OpenCL support for AMD/NVidia GPU rendering is currently on hold. Only a small subset of the entire rendering kernel can currently be compiled, which leaves this mostly at prototype. We will need major driver or hardware improvements to get full cycles support on AMD hardware. For NVidia CUDA still works faster

AMD - The immediate issue that you run into when trying OpenCL, is that compilation will take a long time, or the compiler will crash running out of memory. We can successfully compile a subset of the rendering kernel (thanks to the work of developers at AMD improving the driver), but not enough to consider this usable in practice beyond a demo.

NVidia hardware and compilers support true function calls, which is important for complex kernels. It seems that AMD hardware or compilers do not support them, or not to the same extent.

I might also note that the large movie CG design houses tend towards Nvidia for the same reason. Weta Digital (3 hours down the road from me) uses an Nvidia render farm, as does Pixar, amongst others...and of course, being CUDA optimized, supercomputers run Blender pretty well.

In short, while Nvidia's OpenCL performance pretty much sucks, CUDA doesn't, and is faster and more stable than OpenCL whether it's running on Nvidia or AMD hardware.

And lastly, from an independent study (PDF)

In our tests, CUDA performed better when transferring data to and from the GPU. We did not see any considerable change in OpenCL's relative data transfer performance as more data were transferred. CUDA's kernel execution was also consistently faster than OpenCL's, despite the two implementations running nearly identical code.

CUDA seems to be a better choice for applications where achieving as high a performance as possible is important. Otherwise the choice between CUDA and OpenCL can be made by considering factors such as prior familiarity with either system, or available development tools for the target GPU hardware.

And....what a difference a few months make...

AMD's Radeon R9 290X tends to deliver better performance at a significantly lower price than the Titan.

Titan's MSRP is $999, the 290X is $699 for the out of stock cards, and hitting $900 for those in stock.

Looks like the "significant difference" is less significant by the day. Although I guess the silver lining is that at least the cards you can't buy are significantly cheaper than those you can.

amstech amstech, TechSpot Enthusiast, said:

CUDA has turned into a masterpiece, and Borderlands 2 shut the mouths of the PhysX haters/naysayers. Now I just need to sell my U3011 and get a G-Sync monitor.

You've got to give it to AMD though: their presence is vital for advancement and competition in some of the newer technology. Even being able to battle Nvidia on certain fronts is nothing to be ashamed of.

GhostRyder GhostRyder said:

I might also note that the large movie CG design houses tend towards Nvidia for the same reason. Weta Digital (3 hours down the road from me) uses an Nvidia render farm, as does Pixar, amongst others...and of course, being CUDA optimized, supercomputers run Blender pretty well.

In short, while Nvidia's OpenCL performance pretty much sucks, CUDA doesn't, and is faster and more stable than OpenCL whether it's running on Nvidia or AMD hardware.

And lastly, from an independent study (PDF)

It still depends on the application; only a select few applications use CUDA, and those are for the most part rendering applications.

These are the main four I hear about:

Lux (OpenCL)

Cycles (CUDA)

Octane (CUDA)

V-Ray RT (OpenCL and CUDA)

Though I have seen these used before:

Redshift (CUDA)

Arion (CUDA)

Thea Presto Engine (CUDA)

FurryBall 4.5 (CUDA)

Brigade (CUDA)

Now, as far as V-Ray is concerned, the OpenCL version is a beta and unstable. The Lux one is not bad from what I've seen and heard, and there's one called ratGPU (OpenCL), but I've heard that died. It's a moot point anyway, because CUDA-based software dominates the rendering category.

So the point is that for this one main use the card has a decent price-to-performance ratio, but for everything else it's way overpriced. Unless your sole purpose is rendering, there are many better alternatives out there, which is the problem...

Gaming = 780ti

Compute and OpenCL = 290X

Rendering = Titan Black

Every card at the top end can have one task it does well; that alone doesn't justify the price.

However, that small area is not much to justify the price; the original Titan was also $1K. When the Titan was released there was nothing even close except in the dual-GPU spectrum (GTX 690 and later HD 7990), which, while still expensive, was all that and a bag of chips (well, except for some of the compute, of course). That's why $1K for a 6GB gaming beast, rendering monster, and mediocre compute GPU made more sense: it was one of a kind for a good couple of months. This card has tons of alternatives in every category except rendering, and most of those people probably already have a Titan and are not going to give it up for a 5-10% performance difference.

Titan's MSRP is $999, the 290X is $699 for the out of stock cards, and hitting $900 for those in stock.

Looks like the "significant difference" is less significant by the day. Although I guess the silver lining is that at least the cards you can't buy are significantly cheaper than those you can.

No it's not; the 290X's MSRP is $549.99, and that's what it launched at. The sellers (in America, mind you; there's an article about it...) are the ones who chose to raise the price because of supply and demand. Amazon has the XFX Double D for as low as $599.99, and some of the 290s are at the $449.99 threshold for MSI (albeit that one is out of stock at the moment, but orderable, meaning you can lock in that price). It's still up from its launch price, but it's come way down; there are a few retailers trying to squeeze said $900 from people, but there are cheaper alternatives.

dividebyzero dividebyzero, trainee n00b, said:

It still depends on the application; only a select few applications use CUDA, and those are for the most part rendering applications.

Some notable additions:

Autodesk 3ds Max (CUDA and OpenCL)

Mental Ray (CUDA) and iRAY (CUDA raytracing)

OptiX and SceniX (both require professional drivers; install a cheap Quadro alongside the Titan to enable them)

finalRender (CUDA)

Indigo (CUDA and OpenCL)

Blender (CUDA and OpenCL)

The thing about CUDA is that the plug-ins are compatible across a wide range of applications. Nvidia is first and foremost a software company.

Now, as far as V-Ray is concerned, the OpenCL version is a beta and unstable; however, the Lux one is not bad from what I have seen

V-Ray (CUDA) is stable. SmallLux is the usual choice for the hobbyist using AMD hardware since it works as advertised, but it doesn't scale too well from what I've seen.

Every card can have one task (At least at the top end) that they do well, does not make it justified on price. However that small area is not much to justify, Titan (Original) had its price at 1k

If the only other viable solution is more than $1k then the price is justified. Even if you limit the workload to CG rendering (and dismiss distributed computing and a host of other GPGPU applications and benchmarkers), that still encompasses:

Filmmakers and animators (also includes hobbyists who like to game in their spare time)

3D artists (sculpture, installation, CG)

Architects

Advertisers and Marketing agencies

Design houses (everything from interior, furniture, motorsport, custom fabrication, renovators, etc.) and pretty much anyone who uses CAD and wants to present a virtualized render of the final design.

Now, these may well seem like niche markets, and they are. The GTX Titan isn't anything other than niche.

This card has TONS of alternatives to every category except rendering and most of those people probably already have Titan and are not going to give it up for a 5-10% performance difference.

You're assuming that the virtualization and other CG market is static. It is not. A quick look at the quantity of CG content in movies, TV, and advertising should provide an inkling that the market is expanding. If you need some numbers to back that viewpoint up, then...

And CG hardware (not to be confused with graphics card sales)

BTW: The Tyan render racks are popular enough for Tyan to build a custom motherboard with four PLX 8747 lane extenders to enable full PCI-E 3.0 bandwidth. At $17,000 a rack it is actually very cost-effective. Using seven Tesla K20s (plus one cheap Quadro for video out) would triple the cost.
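To put rough numbers on that (a back-of-the-envelope sketch: the $17,000 populated-rack figure is from above, while the 8-GPU capacity and the per-card street prices are my assumptions, not quotes):

```python
# Back-of-the-envelope cost comparison: Titan render rack vs. Tesla build.
# The $17,000 populated-rack figure is from the post; all other numbers
# are assumed for illustration.
TITAN_RACK_TOTAL = 17_000        # Tyan rack populated with GeForce Titans
TITAN_PRICE = 1_000              # ~$999 MSRP per Titan
RACK_GPU_COUNT = 8               # assumed number of boards per rack
TESLA_K20_PRICE = 3_500          # assumed street price per Tesla K20
CHEAP_QUADRO_PRICE = 200         # cheap Quadro for video out (assumed)

# Strip out the GPUs to estimate the bare rack/host cost, then re-price
# the same chassis with seven Teslas plus the Quadro.
rack_base = TITAN_RACK_TOTAL - RACK_GPU_COUNT * TITAN_PRICE
tesla_build = rack_base + 7 * TESLA_K20_PRICE + CHEAP_QUADRO_PRICE

print(f"Titan rack:  ${TITAN_RACK_TOTAL:,}")
print(f"Tesla build: ${tesla_build:,} ({tesla_build / TITAN_RACK_TOTAL:.1f}x)")
```

The exact multiple depends entirely on the Tesla street price you assume; the point is only that swapping Titans for Teslas multiplies the GPU line item several times over while the render output stays the same.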

GhostRyder GhostRyder said:

V-Ray (CUDA) is stable. SmallLux is the usual choice for the hobbyist using AMD hardware since it works as advertised, but it doesn't scale too well from what I've seen.

You didn't read; I said "Now as far as V-Ray is concerned it's a beta and unstable (OpenCL, mind you)." I never said the CUDA portion was unstable; it's the only version used, except for experimentation on OpenCL.

If the only other viable solution is more than $1k then the price is justified. Even if you limit the workload to CG rendering (and dismiss distributed computing and a host of other GPGPU applications and benchmarkers), that still encompasses:

Well, this would be just fine if a card nearly half the price didn't meet most of those needs on equal ground or better... When we were talking about the original Titan, and there was nothing else in that category, it made much more sense and fit a wider area. This version is only slightly bumped from its predecessor and has become dated in many of the areas it once dominated, yet it stays in the same price bracket.

Now, these may well seem like niche markets, and they are. The GTX Titan isn't anything other than niche.

You're correct: it's a small portion of people who really benefit from the power of the Titan. However, with the Titan Black that area has dwindled even further because its value has gone down significantly. Gamers will choose the 780 Ti over the Titan Black in most cases; the original Titan has already sold in significant enough numbers (you have even posted pictures of machines with multitudes of those cards inside) that this is not something that will be bought to replace the current ones; and the compute area is still more closely dominated by AMD at a lower price.

You're assuming that the virtualization and other CG market is static. It is not. A quick look at the quantity of CG content in movies, TV, and advertising should provide an inkling that the market is expanding. If you need some numbers to back that viewpoint up, then...

CG has become a fundamental part of TV and movies; you are correct in that regard. But the people who wanted something like this, or an extreme render setup, bought the 5%-different Titan over the past six months. This one wasn't hyped up for a reason, and that's pretty apparent when you compare the two and see how dated the tech has become in many of the aspects it once had under its belt. Professional uses are still apparent, but the group of people who need this, especially given the $1K base asking price, is going to be very small.

I'm just going to stick with this quote from US.Hardware

"However, these days the first and only real purpose to buying a Titan is GPGPU, either because you want to run specific, professional Cuda applications or because of a particular need for FP64 precision. In that case, Titans are an affordable alternative to Tesla cards. Speaking of affordable alternative: in regular GPGPU applications, AMD's Radeon R9 290X tends to deliver better performance at a significantly lower price than the Titan. Our findings are confirmed by the popularity of AMD's offering among Litecoin miners and the likes."

It's got a category it's good at, but that is nowhere near the territory the original Titan started in and dominated. That's the last I care to say about this; I am not the only one who has come to the conclusion that its price is steep for what it is. Unless you want to use it for straight-up CUDA-based apps, there are better-priced alternatives all around, from its own side and others.

1 person liked this | dividebyzero dividebyzero, trainee n00b, said:

You didn't read; I said "Now as far as V-Ray is concerned it's a beta and unstable (OpenCL, mind you)." I never said the CUDA portion was unstable; it's the only version used, except for experimentation on OpenCL.

And I was merely clarifying that the CUDA port is stable. Please don't nitpick.

Well, this would be just fine if a card nearly half the price didn't meet most of those needs on equal ground or better...

You realize that the Titan (and its Quadro/Tesla brethren) is a success in CG in large part due to the 6GB framebuffer. The larger the framebuffer, the more complex the scene can be, the larger the frame can be, and the more easily the scene can be rendered.

Now, can you provide me with the name of the card "nearly half the price" that has a 6GB framebuffer?

Sure, a cheaper card with a smaller framebuffer will do the job... eventually. But cost-effectiveness is a product of performance per watt and time to completion of the task.

If you believe that framebuffer is not a consideration, then you're mistaken; as I said earlier, this difference is a fairly crucial market segmentation tool that Nvidia have employed (AMD has also jumped on this bandwagon with the 12GB S10000, a board that is basically an HD 7990, yet no 12GB 7990 has been or will ever be made).

When we were talking about the original Titan, and there was nothing else in that category, it made much more sense and fit a wider area. This version is only slightly bumped from its predecessor.

Doesn't matter. There are new systems being built every day. What do you think these new systems are going to be fitted with: the old Titan (likely EOL in any case) or a faster Titan Black at the same price?

However, with the Titan Black that area has dwindled even further because its value has gone down significantly.

No it hasn't. The graphs and all the literature point to a growing market in CG, and while framebuffer size is king (along with code compatibility for the fastest render), the Titan/Titan Black rules by default.

Don't believe a framebuffer can have that much effect on GPU render efficiency?

The Quadro K6000 is clocked at least 12% lower than the GTX 780 (most renderers peg the GPU at or near 100%), has 25% more shaders and the same bandwidth/memory frequency, yet its speed-up over the 780 is on the order of 300%. The difference? 12GB of framebuffer per GPU versus 3GB for the 780.
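A quick sanity check on those numbers (a sketch using only the rough figures quoted above) shows that the spec sheet alone cannot explain the gap:

```python
# Naive throughput estimate from clocks and shader counts alone, using
# the rough figures above: K6000 clocked ~12% lower than the GTX 780,
# with ~25% more shaders and the same memory bandwidth.
gtx780 = 1.00
k6000 = (1.00 - 0.12) * 1.25   # clock deficit x shader advantage

print(f"Spec-sheet K6000 advantage: {k6000 / gtx780:.2f}x")
# Roughly 1.1x from raw specs, versus ~3x observed in GPU renders; the
# remainder is down to the 12GB vs 3GB framebuffer, since a scene that
# spills out of VRAM falls off a performance cliff.
```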

Gamers will choose the 780 Ti over the Titan Black in most cases; the original Titan has already sold in significant enough numbers (you have even posted pictures of machines with multitudes of those cards inside) that this is not something that will be bought to replace the current ones

I think the gaming aspect of sales for the Titan Black has already been covered and is not in dispute. As far back as Post #8 I said

The card isn't aimed at gamers - at least not 99.99% of them.

and the compute area is still more closely dominated by AMD at a lower price.

Again, it depends upon workload. You certainly don't see many AMD GPU render farms.

CG has become a fundamental part of TV and movies; you are correct in that regard. But the people who wanted something like this, or an extreme render setup, bought the 5%-different Titan over the past six months.

So, even though CG hardware is predicted to grow by US$5 billion a year, and the Titan is one of the most cost-effective solutions, you don't see anyone else buying it from now on? Well, everyone is entitled to their opinion, I guess. Maybe I'll just add new posts whenever a new Titan-based render system goes live. If you're right, this thread will die a death; if not, I guess it will get a few bumps.

Professional uses are still apparent, but the group of people who need this, especially given the $1K base asking price, is going to be very small.

The unit sales for any $1K card are very low.

GK110's successor (GM200) isn't slated to arrive until early 2015.

If your workload thrives on CUDA and requires a large framebuffer then GK110 is the choice by default. This is a prime example of Nvidia's hand-in-glove approach to market segmentation, hardware/software ecosystem, and strategic marketing. It is also the reason that Nvidia continue to produce revenue, and why they own 80+% of the professional graphics business.

If the GPU were going to be quickly supplanted, they wouldn't have introduced the K6000, K40, and Titan Black. Nvidia need sales of these SKUs to recoup the investment of bringing them to market.

Manufacturers don't tend to replace a new top-tier GPU/board within a year of its introduction, and especially not when the GPU is linked to professional cards. If you think that this is an Nvidia-only trait, I'd ask you where the Hawaii-based FirePro cards are.

Now, as far as I can see, your main problem with the card seems to stem from how it is being marketed. You point to an OEM's advertising:

But I do have a problem with things like this, because as I stated earlier it's falsely labeling this a "Pure Gaming Card".

Now, I ask you: if you had a product that competes in a number of arenas, why wouldn't you direct ad campaigns at every demographic it could possibly appeal to?

Secondly, I'd query why this is so abhorrent to you, yet you openly support a company whose PR and marketing have just as bad a record (or worse), including a continuing string of suspect claims about their CPUs, ranging from the completely spurious Barcelona marketing to the guerrilla marketing used to pre-sell 900 series chipset boards and Bulldozer.

Personally, I don't see much ethically wrong with marketing a product if it does what is claimed, even if it is expensive. I do see something ethically wrong in marketing fictitious products and deliberately overestimating performance to pre-sell a product.

Guest said:

@/0

totally agree with your points (no way I'll not)

Just a question:

What about the Mac Pro?

Why not nVidia inside atm?

thanx

dividebyzero dividebyzero, trainee n00b, said:

@/0

totally agree with your points (no way I'll not)

Just a question:

What about the Mac Pro?

Why not nVidia inside atm?

thanx

From my understanding, price is a major factor: both the price of the end product and AMD's willingness to customize the cards for Apple.

The FirePro D500 is basically an HD 7870XT (Tahiti LE) with the full Tahiti 384-bit bus width. No strict analogue exists in AMD's standard FirePro line, but it sits between the W7000 ($800 at Newegg / $720 cheapest found) and the W8000 ($1400 at Newegg / $1214 cheapest found). The D700 upgrade for the Mac Pro is a custom W9000 ($3400 at Newegg / $3241 cheapest found).

A quick look at the Mac Pro configuration page shows that you can upgrade from dual D500s to dual D700s for $600. That would be over a $4000-5000 difference if buying the standard FirePro workstation cards. In effect, AMD are selling professionally QA'ed boards to Apple at little more than (and probably less than, factoring in Apple's profit margins) consumer graphics prices. The only upside of a contract like this is the PR linking the brand to Apple.
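As a rough sanity check on that gap, using only the Newegg prices above (and treating the D500-to-D700 step as roughly a W8000-to-W9000 step, which is my approximation):

```python
# Retail price gap for the Mac Pro's dual-GPU upgrade, using the Newegg
# numbers above. Mapping D500 ~ W8000 and D700 ~ W9000 is an
# approximation for illustration only.
W8000_NEWEGG = 1_400
W9000_NEWEGG = 3_400
APPLE_UPGRADE = 600              # Apple's dual D500 -> dual D700 price

retail_gap = 2 * (W9000_NEWEGG - W8000_NEWEGG)   # two cards upgraded

print(f"Retail FirePro gap: ${retail_gap:,}")
print(f"Apple charges:      ${APPLE_UPGRADE:,}")
```

Even with a conservative mapping, the retail gap dwarfs what Apple charges, which is the "sweetheart deal" point in a nutshell.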

Gars Gars said:

Sorry for posting as a guest (about the Mac Pro).

I've read a lot about rebranding etc. and had some misconceptions about it.

I was looking for the main reason: AMD vs Nvidia (or FirePro vs Kepler/Tesla).

You covered a lot but missed Apple's point of view.

dividebyzero dividebyzero, trainee n00b, said:

I was looking for the main reason: AMD vs Nvidia (or FirePro vs Kepler/Tesla).

You covered a lot but missed Apple's point of view.

Apple have always maintained the same strategy with their ODM/OEM partners. Get the component as cheap as possible, and don't let one vendor get too comfortable.

Apple have a record of alternating between Nvidia and AMD, which is likely a deliberate strategy to keep them in line.

As to why FirePro over Tesla/Quadro? Why not? Apple have their own software ecosystem that doesn't really leverage CUDA to any great extent for the Mac's target audience (Final Cut excepted, AFAIK). As far as I'm aware, Adobe is also pushing OpenCL over CUDA these days. If Apple are leaning toward OpenCL performance, then the FirePro makes sense, as does AMD's willingness to cut Apple a sweetheart deal on pricing and semi-custom SKUs.

Well, the pointless VRAM race is on

9th March: Sapphire show an 8GB 290X...

24th March: EVGA leak 6GB 780/780Ti

The vendor specific AIB GDDR5 race!

Well, I didn't expect any AIB would be deranged enough to think an 8GB gaming graphics card was a marketable proposition, but I guess once it arrived the response was predictable.

Name one company that has a 6GB 780, let alone a 6GB 780 Ti. There isn't one, and there won't be one unless AMD allows vendor 8GB 290Xs and Nvidia changes its stance.
