Luckily, your "Incoherent Rambling" quotient is commendably negligible.

I hope my participation doesn't cause you much pain.
Nevertheless it'll mainly be used as a gaming card, but that's the purchaser's prerogative. I don't know what all the fuss is about whether it's a computing card or a gaming card; it's just too damn expensive to warrant purchasing at any rate.

A graphics card being worth more because it does more is not an opinion.
That's just it: the Titan is not a gaming card. DBZ has been trying to tell everyone this every time he posts about the Titan. He has commented several times that the Titan is a computing card. It may not be worth **** with Bitcoins, but those are not the only computing projects that need processing power.
Don't worry; as long as you don't share an opinion, you should be safe from Mr. Opinion Killer.

Since my best VGA at the moment is Intel's HD-4000 IGP in my Core-i3, I wouldn't even be allowed to troll in this thread, would I...?
Thank you!

Nevertheless it'll mainly be used as a gaming card, but that's the purchaser's prerogative. I don't know what all the fuss is about whether it's a computing card or a gaming card; it's just too damn expensive to warrant purchasing at any rate.
Do you? Because clearly sharing an opinion warrants three paragraphs and pages of insults on a tech forum, apparently.

Do you have some kind of learning disability?
Bad punctuation: by your logic, I must have made you distraught. I am so sorry for that!

Who's talking about the hardware? I was referring to the user base.
Did I go onto said forums and tell people they are inferior and foolish for buying such a card? No, I only shared an opinion on it being too expensive to warrant a purchase in my eyes, even with a few different features. Whether or not they want to spend the money is up to them; I won't put a lock on someone's pocketbook. But I do have a problem with things like this, because as I stated earlier it's falsely labeling this a "Pure Gaming Card".

Aside from the obvious issue that most people dropping $1,000 on a graphics card probably know full well what they're in for, there is also the fact that many people don't even use the card for gaming. I'd actually dare to say that the average GTX Titan buyer might be somewhat more knowledgeable than most, given the professional uses the card is put to.
Oh relax, @cliffordcooley, we're all friends here.

I hope my participation doesn't cause you much pain.
That's pretty much the rationale behind the sales. If you're into gaming then a cheaper 780 is the way to go. If you do some pro work + some gaming then you might buy a Titan (or a midrange card + a Quadro). If your workload is pro oriented then you probably buy more than one Titan (and maybe a Quadro for the viewport). When you consider how many cards go into render rigs, individual sales to benchmarkers and gamers start to look fairly insignificant.

IMHO, if you've got $1,000 to spend on a VGA card (for gaming), add some more cash and pick two 780s.
As for me, compared with the Quadro 6000, the $1K Titan Black could be a solution for people with a limited budget who need that large memory and want to utilize its CUDA cores.
I'd never go against the word of "Bob" Dobbs!

Graham, I am sick of your respectful, informative posts that make perfect sense. Stop, or I will start ripping you for being an Nvidia fanboy.
God dammit, I am the only one allowed to be a fanboy on this site and get away with it.
Ahhh hahahaha
Precisely!

Business 101.
Sometimes it just comes down to checking all the boxes. Nvidia knows that professional workstation/co-processor cards are already designed around a hardware standard. Existing rackmount units are constrained by power supply limits (2 x 1200-1350W) for full height add-in cards, so the maximum number of GPUs is limited by the 1800-2000W budget as well as the (up to) eight PCI-E slots afforded by the motherboard. It's no real surprise that the Titan has pretty much the same power draw specification as any top-tier pro card, whether it be Quadro, Tesla, or FirePro.

Simply put, Titan is meant for hobbyists, enthusiasts, and small businesses. Hobbyists that like to dabble in 3D rendering will find $1,000 a steal.
Well, I thought it was common knowledge that Nvidia's OpenCL implementation was lacking.

Quoted from Us.Hardware.info....[snip]
Octane (see previous posts), Blender (see previous posts), V-Ray (see previous posts), and iRay are all optimized for CUDA, and are the pre-eminent CG renderers. AMD obviously doesn't do CUDA - and thus you see the AMD cards not included in the benchmark graph comparisons for those render engines.

Quoted from....[snip]
Or there's also AnandTech's review of the Titan, since the Black review is not out yet.
"There is a handful of CUDA-optimized titles where the Titan does really well. Blender, 3ds Max, and Octane all show Nvidia's single-GPU flagship with a commanding lead over the GeForce GTX 680 and prior-gen 580. Bottom line: if you're thinking about using a Titan for rendering, check the application you're using first."
From the Blender FAQ:

Even though on paper the performance of some of the high end AMD GPUs such as the Radeon HD 7990 matches the NVIDIA GTX Titan, software support for OpenCL GPU accelerated rendering is limited. As a result, tests have shown slow performance compared to Nvidia CUDA technology.
and...

Currently NVidia with CUDA is rendering faster. There is no fundamental reason why this should be so - we don't use any CUDA-specific features - but the compiler appears to be more mature, and can better support big kernels. OpenCL support is still being worked on and has not been optimized as much, because we haven't had the full kernel working yet.
OpenCL support for AMD/NVidia GPU rendering is currently on hold. Only a small subset of the entire rendering kernel can currently be compiled, which leaves this mostly at prototype. We will need major driver or hardware improvements to get full Cycles support on AMD hardware. For NVidia, CUDA still works faster.
AMD - The immediate issue that you run into when trying OpenCL is that compilation will take a long time, or the compiler will crash running out of memory. We can successfully compile a subset of the rendering kernel (thanks to the work of developers at AMD improving the driver), but not enough to consider this usable in practice beyond a demo.
NVidia hardware and compilers support true function calls, which is important for complex kernels. It seems that AMD hardware or compilers do not support them, or not to the same extent.
In our tests, CUDA performed better when transferring data to and from the GPU. We did not see any considerable change in OpenCL’s relative data transfer performance as more data were transferred. CUDA’s kernel execution was also consistently faster than OpenCL’s, despite the two implementations running nearly identical code.
CUDA seems to be a better choice for applications where achieving as high a performance as possible is important. Otherwise the choice between CUDA and OpenCL can be made by considering factors such as prior familiarity with either system, or available development tools for the target GPU hardware.
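To make the "true function calls" point above concrete, here is a minimal CUDA sketch (my own illustration, not taken from the Blender docs or the quoted study; the function names are made up for the example). Complex render kernels like Cycles lean on the compiler's ability to handle real device-side function calls rather than forcing everything to be inlined into one giant kernel:

```cuda
// Hypothetical example: a __device__ helper called from a __global__ kernel.
__device__ float lambert(float n_dot_l) {
    // Simple diffuse shading term, purely illustrative.
    return fmaxf(n_dot_l, 0.0f);
}

__global__ void shade(const float* n_dot_l, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = lambert(n_dot_l[i]);  // a true function call on the device
}
```

The OpenCL compilers of the era, as the quotes note, struggled with exactly this kind of structure once the kernel grew large.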
Titan's MSRP is $999; the 290X is $699 for the out of stock cards, and hitting $900 for those in stock.

AMD's Radeon R9 290X tends to deliver better performance at a significantly lower price than the Titan.
I might also note that the large movie CG design houses tend towards Nvidia for the same reason. Weta Digital (3 hours down the road from me) uses an Nvidia render farm, as does Pixar, amongst others...and of course, being CUDA optimized, supercomputers run Blender pretty well.
In short, while Nvidia's OpenCL performance pretty much sucks, CUDA doesn't, and CUDA is faster and more stable than OpenCL whether OpenCL is running on Nvidia or AMD hardware.
And lastly, from an independent study (PDF):
No it's not; the MSRP of the card is $549.99, and that's what it was released at. The sellers (in America, mind you; there's an article about it...) are the ones who chose to raise the price because of supply and demand. Amazon has the XFX Double D for as low as $599.99, and some of the 290s are at the $449.99 threshold for MSI (albeit that one is out of stock at the moment, but orderable, meaning you can lock in that price). Still up from its launch price, but it's come way down, though there are a few retailers trying to squeeze said 900 bucks from people; there are cheaper alternatives.

Titan's MSRP is $999, the 290X is $699 for the out of stock cards, and hitting $900 for those in stock.
Looks like the "significant difference" is less significant by the day. Although I guess the silver lining is that at least the cards you can't buy are significantly cheaper than those you can buy.
Some notable additions:

Still depends on the application; only a select few CUDA applications, which are for the most part all rendering applications.
V-Ray (CUDA) is stable. SmallLux is the usual choice for the hobbyist using AMD hardware since it works as advertised, but doesn't scale too well from what I've seen.

Now as far as V-Ray is concerned, it's a beta and unstable (OpenCL, mind you); however, the Lux one is not bad from what I have seen.
If the only other viable solution is more than $1K then the price is justified. Even if you limit the workload to CG rendering (and dismiss distributed computing and a host of other GPGPU applications and benchmarkers), that still encompasses:

Every card can have one task (at least at the top end) that it does well; that does not make it justified on price. However, that small area is not much to justify; the Titan (original) had its price at $1K.
You're assuming that the virtualization and other CG market is static. It is not. A quick look at the quantity of CG content in movies, TV, and advertising should provide an inkling that the market is expanding. If you need some numbers to back that viewpoint up, then...

This card has TONS of alternatives in every category except rendering, and most of those people probably already have a Titan and are not going to give it up for a 5-10% performance difference.
You didn't read. I said "Now as far as V-Ray is concerned, it's a beta and unstable (OpenCL, mind you)." I never said the CUDA portion was unstable; it's the only version used except for experimentation on OpenCL.

V-Ray (CUDA) is stable. SmallLux is the usual choice for the hobbyist using AMD hardware since it works as advertised, but doesn't scale too well from what I've seen.
Well, this would be just fine if a card nearly half the price did not meet most of those needs on equal ground or better... When we were talking about the original Titan and there was nothing in that category, it made much more sense and fit a wider area. This version is only slightly bumped from its predecessor and has become dated in many of the areas it once dominated, but still keeps the same price bracket.

If the only other viable solution is more than $1K then the price is justified. Even if you limit the workload to CG rendering (and dismiss distributed computing and a host of other GPGPU applications and benchmarkers), that still encompasses:
You're correct, it's a small portion of people that really benefit from the power of the Titan. However, with the Titan Black that area has dwindled even further, because its value has gone down significantly. Gamers will choose the 780 Ti over the Titan Black in more cases; the Titan has already sold in significant enough numbers (you have even posted pictures of machines with multitudes of those cards inside) that this is nothing that will be bought to replace the current ones; and the compute area is still more closely dominated by AMD at a lower price.

Now this may well seem like niche markets - and they are. The GTX Titan isn't anything other than niche.
CG has become a fundamental part of TV and movies; you are correct in that regard. But the people who already wanted something like this, or wanted an extreme render setup, bought the 5%-different Titan over the past 6 months. This one was not hyped up for a reason, and that's pretty apparent when you compare the two and how the tech has become dated in many of the aspects it once had under its belt. Professional aspects are still apparent, but the group of people who need this, especially given the $1K base asking price, is going to be very small.

You're assuming that the virtualization and other CG market is static. It is not. A quick look at the quantity of CG content in movies, TV, and advertising should provide an inkling that the market is expanding. If you need some numbers to back that viewpoint up, then...
And I was merely pointing out that the CUDA port was stable, to clarify. Please don't nitpick.

You didn't read. I said "Now as far as V-Ray is concerned, it's a beta and unstable (OpenCL, mind you)." I never said the CUDA portion was unstable; it's the only version used except for experimentation on OpenCL.
You realize that the Titan (and its Quadro/Tesla brethren) is a success in CG in large part due to the 6GB framebuffer. The larger the framebuffer, the more complex the scene can be, the larger the frame can be, and the easier the scene can be rendered.

Well, this would be just fine if a card nearly half the price did not meet most of those needs on equal ground or better...
Doesn't matter. There are new systems being built every day. What do you think these new systems are going to be fitted with - the old Titan (likely EOL in any case) or a faster Titan Black at the same price?

When we were talking about the original Titan and there was nothing in that category, it made much more sense and fit a wider area. This version is only slightly bumped from its predecessor.
No it hasn't. The graphs and all the literature point to a growing market in CG, and whilst framebuffer is king (along with code compatibility for the fastest render) the Titan/Titan Black rules by default.

However, with the Titan Black that area has dwindled even further, because its value has gone down significantly.
Gamers will choose the 780 Ti over the Titan Black in more cases; the Titan has already sold in significant enough numbers (you have even posted pictures of machines with multitudes of those cards inside) that this is nothing that will be bought to replace the current ones.

I think the gaming aspect of sales for the Titan Black has already been covered and is not in dispute. As far back as Post #8 I said:
The card isn't aimed at gamers - at least not 99.99% of them.
Again, it depends upon workload. You certainly don't see many AMD GPU render farms.

...and the compute area is still more closely dominated by AMD at a lower price.
So, even though CG hardware is predicted to grow by US$5 billion a year, and the Titan is one of the most cost effective solutions, you don't see anyone else buying it from now on? Well, everyone is entitled to their opinion, I guess. Maybe I'll just add new posts whenever a new Titan-based render system goes live. If you're right, this thread will die a death; if not, I guess the thread will get a few bumps.

CG has become a fundamental part of TV and movies; you are correct in that regard. But the people who already wanted something like this, or wanted an extreme render setup, bought the 5%-different Titan over the past 6 months.
The unit sales for any $1K card are very low.

Professional aspects are still apparent, but the group of people who need this, especially given the $1K base asking price, is going to be very small.
Now, I ask you: if you had a product that competes in a number of arenas, why wouldn't you direct ad campaigns at every demographic it could possibly appeal to?

But I do have a problem with things like this, because as I stated earlier it's falsely labeling this a "Pure Gaming Card".
From my understanding, it seems like price is a major factor: both the end product and AMD's willingness to customize the cards for Apple.

@/0
Totally agree with your points (no way I'll not).
Just a question:
What about the Mac Pro?
Why not nVidia inside atm?
thanx
Apple have always maintained the same strategy with their ODM/OEM partners: get the component as cheap as possible, and don't let one vendor get too comfortable.

I was looking for the main reason - AMD vs Nvidia (or FirePro vs Kepler/Tesla).
You pointed out a lot, but missed the Apple point of view.
Name one company that has a 6GB 780, let alone a 6GB 780 Ti? There isn't one. There won't be one unless AMD allows vendor 8GB 290Xs and Nvidia changes its stance.