Leaked: ATI Radeon HD 6990 specifications

November 24, 2010, 6:17 PM
The AMD-Nvidia war is about to go dual GPU. Following the leak of AMD Radeon HD 6970 benchmark numbers, a slide from what appears to be a presentation on the AMD Radeon HD 6990 (codenamed Antilles) has leaked, courtesy of user Gast on the 3DCenter forums.

If the slide is legitimate, and it appears to be, the graphics card will integrate 3840 stream processors and carry 4GB of GDDR5 memory clocked at 4.80GHz. It will be equipped with two DVI-I and three Mini DisplayPort connectors. Power consumption will be 300W under load and around 30W at idle. AMD's next-generation dual-chip flagship will pair two Cayman GPUs with 1920 stream processors per chip. The card will deliver 6.0 trillion floating point operations per second (TFLOPS) of single-precision performance, or up to 1.5 TFLOPS of double-precision performance.
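
For a quick sanity check, peak shader throughput is stream processors x 2 FLOPs (one multiply-add) per clock x core clock; working backwards from the slide's 6.0 TFLOPS figure implies a core clock near 780MHz. A minimal sketch of that arithmetic (the implied clock is our inference, since the leak doesn't quote clock speeds):

    # Back-of-the-envelope check on the leaked figures. The implied core
    # clock is inferred from the 6.0 TFLOPS number; it is not in the leak.
    shaders = 3840         # two Cayman GPUs x 1920 stream processors
    flops_per_clock = 2    # one multiply-add per stream processor per cycle

    implied_clock_hz = 6.0e12 / (shaders * flops_per_clock)
    print(f"Implied core clock: {implied_clock_hz / 1e6:.0f} MHz")  # ~781 MHz

    # Cayman executes double precision at 1/4 the single-precision rate,
    # which matches the quoted figure: 6.0 / 4 = 1.5 TFLOPS.
    print(f"Double precision: {6.0 / 4:.1f} TFLOPS")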

If it were released today, Antilles would become the top-performing graphics card in the world. Nvidia is rumored to be delaying its dual-GPU GTX 590 in order to implement further improvements so that it can once again beat AMD's offering (Nvidia currently holds the crown for single-GPU performance with its GeForce GTX 580). Antilles is slated to ship in the first quarter of next year, though pricing has yet to be announced.





User Comments: 62

madboyv1, TechSpot Paladin, said:

I'm not the only one that is excited that Nvidia is back in the game and that the GPU war is off to a fresh start, am I?

Legendle2007 said:

I don't think Nvidia was ever out of the game.... the HD 5000 series was good but they were only able to match Nvidia's level. For the most part, Nvidia has been winning the GPU war.

Wagan8r said:

madboyv1 said:

I'm not the only one that is excited that Nvidia is back in the game and that the GPU war is off to a fresh start, am I?

Nope. I'm pumped about it too! I can't afford the high-end cards, but I'm ready for the trickle down effect to kick in.

bioflex said:

seriously i don't really care about all the technical numbers here, all i care about is whether that card would be worth the money being spent on it... and here's to wishing amd takes the crown for best gpu performance

Guest said:

Well everyone is entitled to their view of history........

dividebyzero, trainee n00b, said:

The bandwidth would imply that the card is using the older 5Gbps GDDR5 chips.

HD 6990 = 307.2 GB/sec

HD 5870 = 153.6 GB/sec x 2 GPUs = 307.2 GB/sec

And it's definitely using a 256-bit memory bus.
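
For anyone checking the arithmetic, GDDR5 bandwidth is simply the effective data rate multiplied by the bus width; a minimal sketch, assuming the 256-bit bus is per GPU as the doubling above implies:

    # GDDR5 bandwidth = effective data rate (GHz) x bus width (bits) / 8.
    # Assumes a 256-bit bus per GPU, as the doubling above implies.
    def bandwidth_gb_per_s(effective_clock_ghz, bus_width_bits):
        return effective_clock_ghz * bus_width_bits / 8

    per_gpu = bandwidth_gb_per_s(4.8, 256)  # the leaked 4.80GHz memory clock
    print(per_gpu)       # 153.6 GB/s, the same as a single HD 5870
    print(per_gpu * 2)   # 307.2 GB/s aggregate across both Cayman GPUs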

Cueto_99 said:

Both GTX 590 and AMD 6990 specs are impressive to say the least... As impressive as their future retail prices will be... But it's good to see both companies going for the top spot again! My good wishes to AMD.

princeton said:

bioflex said:

seriously i don't really care about all the technical numbers here, all i care about is whether that card would be worth the money being spent on it... and here's to wishing amd takes the crown for best gpu performance

They will. For about 2 weeks until Nvidia releases their dual gpu card. Then we all wait for the next gen. In terms of multi gpu the only time ati ever came out on top was the hd 5970.

Adhmuz, TechSpot Paladin, said:

Now am I the only one looking at this and thinking Nvidia is going to have a hard time matching or beating this card without something that consumes 400+ watts of power? By the looks of it, ATI is pairing up two 6970s to make this beast. The 6970 is going to be ATI's flagship single-GPU card, with power consumption in the realm of 200 watts, so I imagine the pair is being downclocked to hit the 300 watt number they posted. The 6970 is rumored to be 10% faster than a GTX 480, making it 10-20% slower than the GTX 580. If by some amazing feat of technological ingenuity Nvidia can put two of those 250 watt chips onto a single PCB without lowering their specs then they'll have it, but I honestly don't see that happening. With heat and power consumption in mind it's going to be very close, and I wouldn't be surprised if the two cards end up within 5% of each other.
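
The downclocking guess is plausible on paper: dynamic power scales roughly with frequency times voltage squared, and voltage tends to track frequency, so a modest clock cut buys a disproportionate power saving. A rough sketch of that rule of thumb (the f x V^2 relation is a generic approximation, and the 200W-per-GPU baseline is the comment's estimate, not a confirmed spec):

    # Rule of thumb: dynamic power P ~ f * V^2, and V roughly tracks f,
    # so P ~ f^3 when clock and voltage drop together. The 200W-per-GPU
    # baseline is an estimate from the comment above, not a confirmed spec.
    budget_w = 300.0
    two_gpu_draw_w = 2 * 200.0

    power_ratio = budget_w / two_gpu_draw_w    # 0.75: a 25% power cut needed
    clock_ratio = power_ratio ** (1 / 3)       # ~0.91 under the f^3 rule
    print(f"Clocks needed: ~{clock_ratio:.0%} of stock")  # ~91%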

Evanspec said:

NVIDIA and AMD are going at it hard now. I've only been involved with computer building for about a year and a half (I haven't built one yet), but this is the hardest they've been fighting each other since I got interested (which was when the GTX 200 vs HD 4xxx cards were the flagships). They are fighting for the top name and the top performance, and we, the consumers, get to sit down and watch the sparks fly. It'll only make prices drop and performance soar. A toast to rivalry!

dividebyzero, trainee n00b, said:

@Adhmuz

Bear in mind that the HD 6990 need not actually adhere to the 300W PCI spec in its entirety. The card is supposed to have an integrated power limiter that monitors power draw with every GPU clock cycle and will dynamically adjust power usage according to which shader blocks are active at any given time. Since only a "power virus" such as OCCT or FurMark actually causes the whole GPU+VRAM to be active at once, the card's theoretical maximum draw can exceed 300 watts while the limiter ensures it never actually breaches that figure - hope I explained that well enough to be understood.

The GTX 580 can call on a similar feature, although the AMD solution (hardware) seems a much better alternative to nvidia's driver-based solution.
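
As a mental model only (AMD hasn't published the mechanism, and the real limiter lives in hardware), the behaviour described above amounts to a small feedback loop:

    # Illustrative sketch of a PowerTune-style limiter, not AMD's actual
    # implementation: estimate draw from which blocks are active each
    # interval, and throttle only when the estimate would breach the cap.
    POWER_CAP_W = 300.0

    def adjust_clock(clock_mhz, estimated_draw_w, stock_clock_mhz, step_mhz=5):
        """Step the clock down while the power estimate exceeds the cap,
        and let it recover toward stock when there is headroom."""
        if estimated_draw_w > POWER_CAP_W:
            return max(clock_mhz - step_mhz, 0)
        return min(clock_mhz + step_mhz, stock_clock_mhz)

    # Only a "power virus" load lighting up every shader block plus the
    # VRAM pushes the estimate over the cap; typical games never throttle.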

@Evanspec

This is pretty minor stuff compared with the Nvidia G80 (8800GTX/Ultra) v ATI R600 (HD 2900XT) battle of 2007. The next process node (28nm: nvidia's Kepler vs. AMD's Southern Islands), due in the latter half of next year, is shaping up to be a doozy though.

fpsgamerJR62 said:

Time to bring out the heavyweights in the GPU wars. Here's a challenge to Nvidia and AMD. Let's see who can offer gamers the lowest price on a next-gen dual-GPU card.

Guest said:

I think everyone is forgetting something. It isn't a single GPU, it is a single PCB.

Guest said:

This sounds like fanboy logic or a Nvidia employee statement, lol. Nvidia lost a lot of ground/customers to the 4xxx & 5xxx series. The 4xxx ATI/AMD cards never beat them as far as benches go, but they were a way better value (cost vs performance); as for the 5xxx, they destroyed Nvidia in value... & benches were 50/50 (Nvidia only benched higher in Nvidia-backed games). Now as far as the 580, 580x2 and the 6990 go... the 5970 costs $50.00 less than the 580 and is 30%-40% faster (yes I know it's 2 GPUs... but the fact is the old 5970 is a WAY better value). The 6990 will cost about $650 when it first comes out and the 580x2 will be $900.00... for $250.00 more it needs to be a lot better, & I don't think it will be. The point is, if Nvidia doesn't get their price point/cost down & value up, they are going to die a slow death. I admit I am a value bandwagoner; I will jump onboard whatever is doing the best at the moment within cost, and most people are like this. Anyhow, you said they were never out... I don't think they're even back in yet... the 580's value is not there... As of right now, I do plan on getting 2x 6970 right after X-mas (might hold out for the 6990), because it's time to upgrade for me and this will be the best VALUE (still running 3x 4850s (P22,000 3DMark Vantage)).

dividebyzero, trainee n00b, said:

As of right now, I do plan on getting 2x 6970 right after X-mas

The money might be better spent on an education.

Guest said:

There's a lot of fighting going on.

One thing I'd like to know is quad-CrossFire scaling with AMD's latest drivers.

indiangamer said:

Yeah!! Now I am ready for a $1000 upgrade......

edison5do said:

princeton said:

They will. For about 2 weeks until Nvidia releases their dual gpu card. Then we all wait for the next gen. In terms of multi gpu the only time ati ever came out on top was the hd 5970.

That would be if the people at AMD are stupid or confident enough to release the card before Nvidia does, because after that Nvidia would be DELAYING and DELAYING IT until they get to be better than the HD 6990. AMD should think a little bit more about strategy, 'cause Nvidia is really showing that they are much better at that.

Guest said:

Yeah, they should delay like they did with the GTX 400, it was great for them.

:S

I don't know if it is better to buy one of those monster$ that will be the best for 5 years, or buy a half-price card that will be the best for, I don't know, 3 years. But I don't see any game in the future that will use more than a GTX 480, only Crysis 2.

Johny47 said:

Looks great, now all AMD have to do is make drivers that these new cards deserve for once =/

dividebyzero, trainee n00b, said:

Yeah, they should delay like they did with the GTX 400, it was great for them.

:S.

Well played!

I don't know if it is better to buy one of those monster$ that will be the best for 5 years, or buy a half-price card that will be the best for, I don't know, 3 years. But I don't see any game in the future that will use more than a GTX 480, only Crysis 2.

Crysis 2 won't be the be-all-and-end-all game that some seem to think it will be (imo). Once game devs start making better use of DX11 features -S.T.A.L.K.E.R. 2 (CryEngine3) comes to mind- realistic effects, ambient occlusion, more pervasive physics (destructible/interactive environments etc.), and more widespread use of tessellation along with higher driver/game levels of MLAA/SSAA/TrSSAA (for example) should keep the graphics market ticking over for the foreseeable future... and once DX11 has run its race, there's always DirectX 12 and ray tracing.

Jurassic4096 said:

People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors. Also PhysX, great SLi scaling, etc.

AMD has Eyefinity (a fad), and Crossfire. Havok physics is software driven (owned by Intel), and time after time, new Catalyst driver releases offer very very very little in adding performance. AMD's GPU's and CPU's are cheaper because they have no choice. Why do you think the lesser known brands at your supermarket are cheaper than the big brands? Why would it be any different with silicon?

Jensen said it himself BEFORE Fermi was out... "There is no safety net at nVIDIA."

AKA, go big or go home. AMD has yet to get off the couch if you ask me.

dividebyzero, trainee n00b, said:

People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors.

Not to mention ~99% of hospitals and health facilities for radiography (X-ray/CT/MRI tomography), audiology and numerous other branches of medical discipline... but how does that relate to the HD 6990? ...or are you just trolling?

AMD has Eyefinity (a fad),

Highly unlikely......

and time after time, new Catalyst driver releases offer very very very little in adding performance.

Patently untrue. The only negatives I think you can lay at AMD's graphics drivers are lack of legacy support, the (up until recently) prehistoric profile setting and a smaller team of code writers. Crossfire and general gaming applications are for the most part very good. (I run both SLI and CFX)

AMD's GPU's and CPU's are cheaper because they have no choice.

You obviously never tried to buy an HD 5870 or 5890 some time between October 2009 and November 2010. CPUs, on the other hand, are more likely priced due to 1. the fact that the process they use (45nm) is ancient - the tooling and R&D costs were amortized some time ago - and 2. the need to maintain marketshare (see recent drops in GTX 460 pricing for a comparison).

Jensen said it himself BEFORE Fermi was out... "There is no safety net at nVIDIA."

And just how much marketshare, mindshare and revenue did that ethos cost nvidia when the G212 failed to materialize and GF100 (Fermi) was hurriedly pressed into action as a desktop card -which was never nvidia's original intention?

AKA, go big or go home.

nvidia ended up doing both..........I'm definitely thinking tr....

AMD has yet to get to get off the couch if you ask me.

....oll

peas said:

Legendle2007 said:

I don't think Nvidia was ever out of the game.... the HD 5000 series was good but they were only able to match Nvidia's level. For the most part, Nvidia has been winning the GPU war.

What bizarro universe have you been living in this past year?

red1776, Omnipotent Ruler of the Universe, said:

jurassic4096 said:

People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors. Also PhysX, great SLi scaling, etc.

AMD has Eyefinity (a fad), and Crossfire. Havok physics is software driven (owned by Intel), and time after time, new Catalyst driver releases offer very very very little in adding performance. AMD's GPU's and CPU's are cheaper because they have no choice. Why do you think the lesser known brands at your supermarket are cheaper than the big brands? Why would it be any different with silicon?

Jensen said it himself BEFORE Fermi was out... "There is no safety net at nVIDIA."

AKA, go big or go home. AMD has yet to get off the couch if you ask me.

How's the benefits package over there at Nvidia? These are some seriously ignorant comments.

Regenweald said:

If you actually think about it, the whole 'Nvidia holding back their dual-GPU card waiting for Antilles' idea is pretty silly. The 480 was a power management/performance failure, and your competitor has moved 20 million+ DX11 GPU parts in comparison to what? You had nothing to answer the 5970, and a moderately successful upper mid-range part in the 460. Now, with the chance to stamp your name on dual-GPU board performance, you say 'nah, we'll wait and see what AMD does'. That sounds like fear and doubt in the performance of your own product.

dividebyzero, trainee n00b, said:

@Regenweald

I think I speak for most tech enthusiasts when I say PLEASE p*ss off back to troll-land. We already have our quota of mush-brained, regurgitated fanboi-speak. Besides, I hear Charlie's proctology exam only needs your nose to begin commencement... please don't keep him waiting... and take jurassic4096 with you.

Maybe, with two hemispheres of brain available, your combined efforts will result in a posting that's not fatuous and unoriginal, and actually adds something to the discussion.

Regenweald said:

LOL divide, I didn't realize you were so emotional over this stuff... but you still get an A for effort, creativity and... colourfulness. As for tech enthusiasts, a true one would look at all aspects of a product, and in the case of a GPU, at more than fps, before worshiping a product. In the case of the entire Fermi generation, performance has not been proportional to the sheer power demands of the cards. But hey, what would the internets be without an insult or two on a comment board, eh?

dividebyzero, trainee n00b, said:

As for tech enthusiasts, a true one would look at all aspects of a product, and in the case of a GPU...

A true tech enthusiast would see that the thread is about the HD 6990.

A true tech enthusiast would likely post regarding possible performance of said card.

A true tech enthusiast might speculate on its design and feature set.

A true tech enthusiast might also speculate on its introduction date and pricing.

A troll will use the thread to blather on about another company's products.

QED

Regenweald said:

From the article:

Nvidia is rumored to be delaying its dual-GPU GTX 590 in order to implement further improvements so that it can once again beat AMD's offering (Nvidia currently holds the crown for single-GPU performance with its GeForce GTX 580)

regenweald said:

If you actually think about it, the whole 'Nvidia holding back their dual-GPU card waiting for Antilles' idea is pretty silly. The 480 was a power management/performance failure, and your competitor has moved 20 million+ DX11 GPU parts in comparison to what? You had nothing to answer the 5970, and a moderately successful upper mid-range part in the 460. Now, with the chance to stamp your name on dual-GPU board performance, you say 'nah, we'll wait and see what AMD does'. That sounds like fear and doubt in the performance of your own product.

I guess since you put QED at the end of this one... I admit... defeat? It's late, enough with this now, you win...

dividebyzero, trainee n00b, said:

@Regenweald

Sweet... the article mentions the GTX 580 and 590 (?) as a parting thought/aside in an article about the HD 6990, so that warrants some rambling about how crappy the GTX 480 is, how fantastic the HD 5970 is, and some faint praise for the 460... in fact, your posting doesn't include one word about either the HD 6990 or the upcoming (?) nvidia card.

So basically it's nvidia-bashing using only a tangentially connected product... much the same as employed by the green-tinged mouth-breathers when they wax lyrical about the late, hot and underperforming R600.

Note: if using quotes, it's probably best if those quotes somehow help your argument.

Guest said:

AMD's 5870 and 5970 beat NVIDIA out of the park. You can't beat 3200 stream processing units. AMD beats NVIDIA on raw power and on rendering graphics.

Guest said:

The Cray Jaguar supercomputer runs on AMD processing units and could run a lot faster and more efficiently if it used AMD GPU's as well. China's supercomputer may be faster but it cannot sustain speeds for very long.

dividebyzero, trainee n00b, said:

The Cray Jaguar supercomputer runs on AMD processing units and could run a lot faster and more efficiently if it used AMD GPU's as well.

IF? WTF is IF? Daydreaming 101.

The XT5 isn't configured for using GPGPU... big pity really, since 37,376 Opteron 2435s obviously isn't the way forward, at a guess.

What does a near-obsolete six-core CPU have to do with a graphics card discussion?

Newsflash, fanboy: ALL supercomputers are ranked on theoretical throughput - INCLUDING Jaguar.

China's supercomputer may be faster but it cannot sustain speeds for very long.

Of course, that's why they built it... the local toy shop was out of Meccano.

The Tianhe-1A is so obviously lacking in performance that the U.S. Dept. of Defense has made Intel and nvidia prime contractors, along with MIT and Sandia, for a Xeon/nvidia GPGPU supercomputer... best you contact all of them and tell them they're doing it wrong.

red1776, Omnipotent Ruler of the Universe, said:

Guest said:

The Cray Jaguar supercomputer runs on AMD processing units and could run a lot faster and more efficiently if it used AMD GPU's as well.

IF? WTF is IF? Daydreaming 101.

The XT5 isn't configured for using GPGPU... big pity really, since 37,376 Opteron 2435s obviously isn't the way forward, at a guess.

What does a near-obsolete six-core CPU have to do with a graphics card discussion?

I keep tellin' ya Chef, there should be a quote hall of fame/shame... holy mackerel.

dividebyzero, trainee n00b, said:

That should be the subject of TS's next giveaway.

Prizes for finding the:

"Best, Worst, Most Ill-advised, Most ludicrous, Best/worst predictive (might need to disable the Edit function for that one) comments posted on TS treasure hunt" ?

...think of all those page clicks !

ruben1992 said:

jurassic4096 said:

People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors. Also PhysX, great SLi scaling, etc.

AMD has Eyefinity (a fad), and Crossfire. Havok physics is software driven (owned by Intel), and time after time, new Catalyst driver releases offer very very very little in adding performance. AMD's GPU's and CPU's are cheaper because they have no choice. Why do you think the lesser known brands at your supermarket are cheaper than the big brands? Why would it be any different with silicon?

Jensen said it himself BEFORE Fermi was out... "There is no safety net at nVIDIA."

AKA, go big or go home. AMD has yet to get off the couch if you ask me.

Yeah because the GTX 480 was such a success and the HD 5870 was a giant fail. OH WAIT IT'S THE OTHER WAY AROUND! Also the thing you brought up about the drivers just isn't true anymore and Crossfire scaling is nearly on par with SLI.

Guest said:

I still go with ATI instead because the price is affordable, ATI can withstand much higher temps and can live longer, while Nvidia's prices are *****, horrible, and can't be afforded by middle-class people..

bluebob951 said:

Getting off the couch

People also seem to forget that nVIDIA's CUDA cores are actually used by professionals, end users, scientists, servers, and video and picture editors. Also PhysX, great SLi scaling, etc.

AMD has Eyefinity (a fad), and Crossfire. Havok physics is software driven (owned by Intel), and time after time, new Catalyst driver releases offer very very very little in adding performance. AMD's GPU's and CPU's are cheaper because they have no choice. Why do you think the lesser known brands at your supermarket are cheaper than the big brands? Why would it be any different with silicon?

Jensen said it himself BEFORE Fermi was out... "There is no safety net at nVIDIA."

AKA, go big or go home. AMD has yet to get off the couch if you ask me.

Then I guess AMD is just like you--won't get off the couch. Luckily, nobody was asking you. When it comes to graphics cards, it's all about opinion. You might like NVIDIA, I might like AMD, my Fiancee might not even care, but it all depends on the person. I think we have to respect that everyone has their own opinions.

dividebyzero, trainee n00b, said:

You might like NVIDIA, I might like AMD, my Fiance might not even care

Hopefully he leans towards VIA/S3 -just in the interest of fair competition

And congratulations- I hope you and your future husband will be very happy together.

/fiancé = male betrothed, fiancée = female betrothed -FYI TS readers

Guest said:

Dividebyzero and Regenweald:

If I post this after your bed time, will you respond before or after school?

http://www.tomshardware.com/reviews/hqv-2-radeon-geforce,284.html#xtor=RSS-182

That's what else is important to some of us.... picture quality.... (who wants to fap to pixelated women?(or girls in your case))

If you look through the archives, in almost any given year ATI (now AMD) has had better quality rendering of graphics, at the expense of speed.

Who cares how many frames a second your GPU can do if it fails to actually render crystal clear images for every frame????

I, for one, will take mediocre frame rates with stunning visual accuracy over mega-inflated frame rates with pixelated blobs dancing around the screen.

By mediocre, I mean acceptably fast.... more precisely: at least 30FPS, and within a monitor's actual ability to display each and every frame.

So as long as I can display anything I want, games, movies, or fap material, at an enjoyable frame rate I will be happy with the higher accuracy of my AMD/ATI cards.

One last thing: You can't always equate price to quality.... it just doesn't work for everything marketed.

Google to see if others agree and why.....

http://www.google.com/search?q=sometimes+cheaper+is+better

I hope you consider these things and remember: You can lead a troll to the truth, but you can't make him believe it.....

Eric W

dividebyzero, trainee n00b, said:

@Guest #44 (Eric W)

I suggest you get the card that makes you happier fapping - you seem well versed in its application. Don't put your back out.

I'm sure you get a nice little stipend for plugging HQV, but it may have escaped your notice that the HD 6990 isn't being aimed at the HTPC market. Rest assured, if you were asking for a suitable card with which to view your .mpg girlfriend, I'd most assuredly point you towards an HD 5750 or similar.

Being a gaming-orientated card, it's probably best to focus on gaming in this case...

3DCenter.org

Oh and here's another article by Tom's Hardware:

[link]

and...

HT4U

and again, and...

ComputerBase

and...

ABT

Leaving aside the banding/shimmering due to LoD mapping settings in the AMD driver, you'll note that ALL the sites - as well as a myriad of others who have tested - have concluded that there is virtually no difference between the image quality of either manufacturer's cards.

And here, lastly, is exactly the same viewpoint expressed by one of nvidia's detractors (if not the No. 1).

Guest said:

If it's anything like the 5970 (or hopefully a lot better) I'd gladly buy one.

dividebyzero, trainee n00b, said:

Looks like Kyle got a gold star from AMD - Quelle surprise.

Obviously the way to AMD's heart is through benchmarking in such a way as to make the amount of onboard VRAM the main differentiator... unless of course you're measuring one vendor's 1GB cards*...

*Note that [H] dropped one half of their review benchmarks (maximum playable settings) for this one product review.

red1776, Omnipotent Ruler of the Universe, said:

There was a distinct gameplay experience difference between the Radeon HD 6970 and Radeon HD 6990

I would think there would be, yes... The whole thing was weird.

It was a decent review of the 6970, I guess.

Going over to S/A to see if Charlie got his sample.

dividebyzero, trainee n00b, said:

I'm picking that last line is pure sarcasm...at least it should be.

S|A won't get a sample to review simply because there is no gain to be had for AMD. Sites like S|A, Rage3D and Beyond3D are so unashamedly AMD-centric (check the number of AMD and GlobalFoundries employees that post there) that there is no cachet to be had with a preview/review. Not a hell of a lot of difference from, say, giving the SLI Zone forums an nvidia card for the same purpose... basically you'll get a PR-slanted review, and if it isn't, the suspicion of the majority of people (that are aware of these sites) will be to dismiss them on the basis of past articles. Kyle and Brent on the other hand have a reputation for, if not impartiality, then certainly a love of high-end tech (especially Eyefinity/Surround gaming in Kyle's case) - so it's all the more disappointing that they seem to be tailoring their benchmarking to fall into a company line. Hence the HD 6950 2GB v 1GB review where the graphics results don't match the conclusion they reached.

Looking at the framerate results in the review you would be hard-pressed to recommend the 2GB version of the card over the 1GB - something I'm sure AMD are quite keen on publicising - when [H]'s usual max playable settings would have shown something more in line with this:

[link]

...rather than the more friendly 4xAA setting they chose to use on this isolated occasion.

My guess is that Kyle got special dispensation for toeing the company line with recent reviews and editorials. I would go so far as to guarantee that no other mainstream site has the same favoured status - that is to say, I fully expect no other AMD-sanctioned "previews".

red1776, Omnipotent Ruler of the Universe, said:

Going over to S/A to see if Charlie got his sample.

I'm picking that last line is pure sarcasm...at least it should be.

Yes, it was.

I didn't expect Charlie to get anything except possibly the more specific set of architectural slides.

I am just hugely surprised to see Kyle up to this; maybe I shouldn't be, but I am.

Not to mention that it was so restricted and full of "we can't reveal... right now" that it was more maddening than anything. It appeared that they (AMD and [H]) were doing all this 'orchestrated' benching to answer charges that the thing even existed, which is rather odd considering the way they are positioning themselves as having a chance to take the single-card performance crown. Possible, I guess (in certain benches), with the 300W equalizer they are working with.

dividebyzero, trainee n00b, said:

Seems a little odd that AMD didn't take the opportunity to "launch" the card at CeBIT, rather than what looks like, at first glance, preferential treatment for one tech site - it's not as if March 1st is that far away.

I'm pretty certain that if the roles were reversed (i.e. Tom's publishing some nebulous "benchmarks" of the GTX 590, for instance, or Anand getting the heads-up on a new Intel architecture) the forums would have been aflood with charges of the site being a PR lapdog, which is precisely why (IMO) AMD are using Kyle and Brent as their unofficial mouthpiece.

Very strange timing, very strange execution. 99% PR fluff, 1% actual info -and that's being generous. One "bench" on a demo using non-DX11 features at a resolution picked to show the best side of the card.

I await with keen interest the actual reviews....and which sites get to publish them at launch.

red1776, Omnipotent Ruler of the Universe, said:

Yup, like I said before, everything about it is just weird.

I await with keen interest the actual reviews....and which sites get to publish them at launch.

Care to take a shot at who gets the launch samples?

...or better yet, who doesn't?

