bioflex said:
"Seriously, I don't really care about all the technical numbers here; all I care about is whether that card would be worth the money being spent on it... and here's to wishing AMD takes the crown for best GPU performance."

They will. For about two weeks, until Nvidia releases their dual-GPU card. Then we all wait for the next gen. In terms of multi-GPU, the only time ATI ever came out on top was with the HD 5970.

princeton said:
"They will. For about two weeks, until Nvidia releases their dual-GPU card. Then we all wait for the next gen. In terms of multi-GPU, the only time ATI ever came out on top was with the HD 5970."

That would only be the case if the people at AMD are stupid or confident enough to release the card before Nvidia does, because after that Nvidia would keep DELAYING and DELAYING theirs until it beats the HD 6990. AMD should think a little more about strategy; Nvidia is showing it is much better at that.

Well played! Yeah, they should delay like they did with the GTX 400 series; that worked out great for them.

"I don't know if it is better to buy one of those monster$ that will last 5 years being the best, or a half-price card that will be the best for, I don't know, 3 years. But I don't see any game in the future that will use more than a GTX 480, only Crysis 2."

Crysis 2 won't be the be-all-and-end-all game that some seem to think it will be (imo). Once game devs start making better use of DX11 features (S.T.A.L.K.E.R. 2 on CryEngine 3 comes to mind), realistic effects, ambient occlusion, more pervasive physics (destructible/interactive environments, etc.) and more widespread use of tessellation, along with higher driver/game levels of MLAA/SSAA/TrSSAA (for example), should keep the graphics market ticking over for the foreseeable future... and once DX11 has run its race, there's always DirectX 12 and ray tracing.

"People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors."

Not to mention ~99% of hospitals and health facilities for radiography (X-ray/CT/MRI tomography), audiology, and numerous other branches of medicine... but how does that relate to the HD 6990? Or are you just trolling?
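
For anyone wondering what "CUDA cores doing compute" actually looks like, here's a minimal sketch of a GPGPU kernel. Everything in it (the kernel name, the gain factor, the data) is made up for illustration; it isn't taken from any real medical or editing package:

    #include <cstdio>
    #include <cuda_runtime.h>

    /* Illustrative only: a made-up brightness/gain adjustment over a
       block of image samples -- the sort of embarrassingly parallel
       work that GPGPU compute gets used for. */
    __global__ void scaleSamples(float *samples, float gain, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per sample
        if (i < n)
            samples[i] *= gain;
    }

    int main(void)
    {
        const int n = 1 << 20;                      // ~1M samples
        float *d_samples;
        cudaMalloc(&d_samples, n * sizeof(float));  // device buffer
        cudaMemset(d_samples, 0, n * sizeof(float));

        int threads = 256;
        int blocks = (n + threads - 1) / threads;   // enough blocks to cover all n
        scaleSamples<<<blocks, threads>>>(d_samples, 2.0f, n);
        cudaDeviceSynchronize();                    // wait for the kernel to finish

        cudaFree(d_samples);
        puts("kernel finished");
        return 0;
    }

Each thread touches exactly one sample, which is why hundreds of small cores can chew through this kind of workload far faster than a CPU.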
"AMD has Eyefinity (a fad),"

Highly unlikely......

"and time after time, new Catalyst driver releases offer very very very little in adding performance."

Patently untrue. The only negatives I think you can lay at AMD's graphics drivers are the lack of legacy support, the (up until recently) prehistoric profile system, and a smaller team of code writers. Crossfire and general gaming performance are for the most part very good. (I run both SLI and CFX.)

"AMD's GPUs and CPUs are cheaper because they have no choice."

You obviously never tried to buy an HD 5870 or HD 5970 some time between October 2009 and November 2010. CPUs, on the other hand, are more likely priced that way because: 1. the process they use (45nm) is ancient, so the tooling and R&D costs were amortized some time ago; and 2. they need to maintain market share (see the recent drops in GTX 460 pricing for a comparison).

"Jensen said it himself BEFORE Fermi was out... 'There is no safety net at nVIDIA.'"

And just how much market share, mindshare, and revenue did that ethos cost Nvidia when the GT212 failed to materialize and GF100 (Fermi) was hurriedly pressed into action as a desktop card, which was never Nvidia's original intention?

"AKA, go big or go home.."

Nvidia ended up doing both..........

"AMD has yet to get off the couch if you ask me."

I'm definitely thinking tr....oll.