AMD Radeon HD 6990 Review

March 10, 2011, 4:42 AM
AMD introduced its first Radeon HD 6000 series graphics card last October, when we reviewed the mid-range Radeon HD 6870. Since then AMD has opened up about its GPU roadmap and the cards that were soon coming to market. The high-end Radeon HD 6970 and HD 6950 arrived late last year, while the dual-GPU flagship of the series, code-named Antilles, was expected shortly after. Coincidentally (or not), both AMD and Nvidia took a few months longer than expected to show their hardcore dual-GPU graphics cards, with the former making the first move to finally unveil the Radeon HD 6990.
Having looked at most of the previous generation Crossfire and SLI products, we are certainly looking forward to seeing what AMD has in store for us with this dual-GPU monster. Read the complete review.




User Comments: 43

Mizzou said:

Good review. $700 is pretty steep, even though this card churns out impressive frame rates across the board. Now we wait for Nvidia's answer to see who will claim the performance crown this time around.

fpsgamerJR62 said:

That's one big graphics card! How does one even install a foot-long graphics card in a regular ATX case? Put two of these monsters in a quad CrossFire configuration and you'll probably need to run a dual-PSU setup as well.

princeton said:

I don't even think it'll fit in my Antec 900 Two. That's definitely a monster card.

Guest said:

There's no problem with space in my case, but the price and power consumption are extremely high.

I will wait for the next HD 7000 series, which will use a 28 nm process and be much more efficient.

Kibaruk, TechSpot Paladin, said:

The power efficiency, and the fact that you don't need an extremely bulky case or a CrossFire-capable mobo, mean you can set it up as you please; for me at least that's $100 well spent. (And considering you have the money to spend on this, it's not a crazy buy.)

stewi0001 said:

I love how my graphics card is not even in the comparison tests, lol! Yea, $700... I do not care for that price; you could almost buy a whole new machine, especially with how video cards turn over as fast as cell phones. What surprises me the most is how inefficient the hardware still is, since they still have to put massive heatsinks on them. Overclocked hardware I understand, but it's just a thought.

PinothyJ said:

I can only imagine PowerJack - [link] - sales are going to go through the roof with those physical specifications! It'll be interesting to see what other manufacturers can do to reduce that massive size and weight.

Impressive though?

Guest said:

I kind of regret buying my GTX 570 SLI setup after looking at this monster

herpaderp said:

@Guest

I hope you're joking

Omnislip said:

If nVidia are able to come close to matching this price, I'm sure their performance will blow this one away... Look how high the 580 is!

Staff
Per Hansson, TS Server Guru, said:

Now if only there were, I don't know, actually games made to take advantage of these cards?

-Yea I know, silly idea...

It's nice to find we can finally play Crysis at full detail in 2560x1600 (without antialiasing, of course).

For all other games we need to crank it up to 4x or 8x MSAA before we get the FPS down to around 100.

Yea, I can really see the need for these cards, really I do...

-Fires up Crysis 2 (console version?) on his 3-year-old 8800GTS 512MB and watches it play just fine

dividebyzero, trainee n00b, said:

This card, like Crossfired HD 6970s/6950s or SLI'ed GTX 580s/570s, isn't aimed at 2560x1600 res. These setups are aimed at two specific groups:

1. Multi-monitor gamers, 5040x1050 and up (people who would buy the card).

2. Marketing hype and fanboy fodder (people who won't buy this card).

Pretty simple strategy, really. Top the benchmarks with the übercard and watch the masses gobble up the entry-level and mainstream versions - the illusion of performance by association.

"Win on Sunday, Sell on Monday" -Bob Tasca

@herpaderp

I think you'll find that "Guest" is a fully paid-up member of the local Flamebait union - trust me, I've studied them in petri dishes while doing my dissertation as a Trollatrician.

One thing the card has going for it is that it must be built like a tank. Tweaktown managed to clock the core to 1000MHz... the only downside (apart from increased power draw) seems to be that the card's fan needs to ramp to 100%... and 90.2dB. Vendor non-reference cooling might be advantageous in this case.

Classic Rock said:

I'd like to see a benchmark of two of these in Crossfire on the new 1155 platform, just to see if the x8 bandwidth of the PCIe lanes causes a performance hit.
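For reference, here's a back-of-the-envelope sketch (Python) of the theoretical ceilings involved, assuming the 1155 platform's PCIe 2.0 links with the standard 5 GT/s signalling rate and 8b/10b encoding:

```python
# Rough usable PCIe 2.0 bandwidth per direction.
# 5 GT/s per lane; 8b/10b encoding leaves 8 of every 10 bits as payload.
GTRANSFERS_PER_LANE = 5.0      # PCIe 2.0 signalling rate, GT/s
ENCODING_EFFICIENCY = 8 / 10   # 8b/10b line code

def pcie2_bandwidth_gb_s(lanes: int) -> float:
    """Payload bandwidth in GB/s for a PCIe 2.0 link of the given width."""
    return lanes * GTRANSFERS_PER_LANE * ENCODING_EFFICIENCY / 8  # bits -> bytes

print(f"x16: {pcie2_bandwidth_gb_s(16):.1f} GB/s per direction")  # 8.0
print(f" x8: {pcie2_bandwidth_gb_s(8):.1f} GB/s per direction")   # 4.0
```

Whether halving that headroom actually costs frames is exactly what the requested benchmark would show; these are theoretical ceilings, not measured throughput.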

Guest said:

If I had that kind of money, I'd buy two Twin Frozr III 6950's, flash them to 6970s, then Crossfire them. Or just spring for the 6970 editions. The 6990 and 590 will be beasts, but I dunno where the usage is practical outside of extreme benchmarking...

Regenweald.

St1ckM4n said:

What happened to the 'overclock switch'?

red1776, Omnipotent Ruler of the Universe, said:

If nVidia are able to come close to matching this price, I'm sure their performance will blow this one away... Look how high the 580 is!

You are leaving out one very important part of the equation, Omni. The power usage/cooling/PCIe spec/warranty are all acting as the great equalizer here. You can damn well bet that AMD would have ramped up two full and complete 6970s if they could simply ignore these issues (they did to an extent). The same goes for Nvidia: two full-blown 580s is pushing 500W, so they will have some limiting factors at work as well.

tacobfm said:

STOP HATING TECHSPOT U KNOW THESE ARE COOL.

But yeah, its performance is what it should be and I think the price is justifiable - for now.

Guest said:

Well, to be completely honest, the performance is mind-blowing, but I have to say it's still overpriced. I'd rather buy two 6970's, or just buy two 6950's and flash them - saves money and squeezes out more performance. Also, the noise factor seems to pull me back too.

However, I did find the power consumption and temperatures to be not half bad for its performance, but still not enough to make me buy it. I mean, $700? Really? Did AMD forget it had the 6970 or something? $650 or $600 is more realistic, and would make this card completely worth it. I'm eager to see nVidia's GTX 590; if they can pull off around the same performance with a cheaper price tag I'll be incredibly impressed - if not, I'll just stick to the two 6970's.

This card is really only appealing to people with only one PCIe x16 slot open (and if they can run this card without bottlenecking it, I don't see how they only have one slot open in the first place...), or to people who want to say "1 h4ve t3h m0sT l337 c4rd".

....Or to people who like big numbers....

princeton said:

Guest said:

Well, to be completely honest, the performance is mind-blowing, but I have to say it's still overpriced. I'd rather buy two 6970's, or just buy two 6950's and flash them - saves money and squeezes out more performance. Also, the noise factor seems to pull me back too.

However, I did find the power consumption and temperatures to be not half bad for its performance, but still not enough to make me buy it. I mean, $700? Really? Did AMD forget it had the 6970 or something? $650 or $600 is more realistic, and would make this card completely worth it. I'm eager to see nVidia's GTX 590; if they can pull off around the same performance with a cheaper price tag I'll be incredibly impressed - if not, I'll just stick to the two 6970's.

This card is really only appealing to people with only one PCIe x16 slot open (and if they can run this card without bottlenecking it, I don't see how they only have one slot open in the first place...), or to people who want to say "1 h4ve t3h m0sT l337 c4rd".

....Or to people who like big numbers....

I can't think of any boards that support a CPU that won't bottleneck this card yet only have one slot. Almost every board can do x8/x8, so this product is really just so AMD can claim the fastest-card crown (until Nvidia releases theirs).

Staff
Julio Franco, TechSpot Editor, said:

What happened to the 'overclock switch'?

Thanks for the feedback. Added to the end of the review:

Update - Dual-BIOS support: Some of you noticed we didn't mention one of the Radeon HD 6990's unique features. <a href="http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6990/Pages/amd-radeon-hd-6990-overview.aspx#4">Dual-BIOS support</a> can be toggled via a physical switch on the card, selecting between the factory-supported 375W BIOS (tested throughout this review) and an "Extreme Performance" BIOS that boosts the core clock from 830MHz to 880MHz and in the process raises the power ceiling to a staggering 450W. What you should know: the performance difference was negligible when overclocking the card.

Leeky said:

Well, let's see what nVidia brings to the table now.

Whoever wins this contest gets the millions of dollars that ride on the back of the "I got X because they make the world's best GPU" race.

As I understood it though, the HD 5970 decimated all competition, even XF GPUs, but this one is no better than two separate GPUs of the same type - so have we really progressed, when someone could achieve better results using two of the same HD 6970s in XF?

dividebyzero, trainee n00b, said:

The present situation is not greatly removed from the HD 5970 [link], even though the latter is ostensibly running the same GPUs as the HD 5870. The difference being that the 5970 runs at 5850 clocks (725MHz as opposed to 850MHz).

The 6990 vs 6970 CFX comparison is a little more complicated, in that the GPUs in the 6990 are binned for substantially lower voltage (1.12V vs 1.17V), and the 6990 uses the "old" 5Gbps vRAM, which demands less voltage, as well as a slightly lower core/shader clock. But essentially the difference in performance between the 6990 and CF'ed 6970s is much smaller than it was between the previous cards - mostly due to the similarity in core/shader/memory clocks. A certain percentage of that difference can also be attributed to driver immaturity.
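To put that memory gap in numbers - a minimal sketch, assuming the 256-bit memory bus both Cayman cards share and the commonly quoted 5.5 Gbps data rate on the 6970 (the 5.0 Gbps 6990 figure is from the post above):

```python
# Peak GDDR5 bandwidth = per-pin data rate x bus width / 8 (bits -> bytes).
BUS_WIDTH_BITS = 256  # assumed: both Cayman cards use a 256-bit memory bus

def bandwidth_gb_s(data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given per-pin data rate."""
    return data_rate_gbps * BUS_WIDTH_BITS / 8

print(f"HD 6990 @ 5.0 Gbps: {bandwidth_gb_s(5.0):.0f} GB/s per GPU")  # 160
print(f"HD 6970 @ 5.5 Gbps: {bandwidth_gb_s(5.5):.0f} GB/s")          # 176
```

That ~9% deficit lines up with the "10% lower in memory" figure cited further down the thread.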

Leeky said:

Thanks DBZ.

So do you feel the performance of the HD6990 will increase as the drivers mature then?

I definitely see the reason to order one, especially if you're stuck with one GPU due to case, PSU, or motherboard restrictions - besides, a single GPU means much easier maintenance than dual GPUs, even if multi-GPU has come a very long way in the last couple of years.

I guess I need to hurry up and decide which one I'm going for - I've been holding out for months now, waiting for the next one to arrive, but I'm beginning to realise that this way of thinking just means I'm sat still while everything else runs rings around me.

Guest said:

Does the 6990 still consume 600 watts while I read emails? If so, please allow users to power down to 25 watts for reading email or browsing. I have nothing against hardcore gamers with outrageous demands for power on a temporary basis. It is still less than having them drive around in a car!

dividebyzero, trainee n00b, said:

@Leeky

I don't think you'll see any big performance increases with more mature drivers. For all the hoopla surrounding driver releases, they never deliver sizeable gains across the board. What I think you will see is more consistent performance relative to crossfired 6970s. The 6990 will never beat the dual-card setup simply because it is clocked 6% lower in core/shader and 10% lower in memory. These lower resources seem to be mitigated slightly by the better performance offered by the 6990's internal bridge chip over the conventional Crossfire connector - I would stress that this is my observation and not an established fact, although the Crossfire ribbon connector does have potential for bandwidth limitation.

Dual cards still have more downside than upside as far as I'm concerned.

Driver support tails off considerably quicker for dual-GPU cards.

If a game has a SLI/Crossfire driver glitch, then switching out of multi-GPU mode is painless if you're using two discrete cards.

Dual-GPU cards traditionally have higher RMA and failure rates. They are more complex, and have to handle higher voltages and heat output. A cracked solder joint on a duallie, for instance, presents a bigger user problem than the same scenario on one of a pair of single-GPU cards.

Dual-GPU cards carry a price premium for the given performance - partially due to manufacturing complexity, partly due to the binning process needed to identify suitable GPUs, and partly to keep customer demand down, since duallies aren't money-spinners and an AIB will make more profit from selling two single-GPU cards.

With regard to this particular dual-GPU card... it has the potential to become a white elephant if AMD and their board partners don't clarify its warranty status. You will note that it isn't being offered for sale in a number of large markets where you would expect it to be sold - Scan in the UK and Newegg in the U.S. being prime examples. AMD or the AIBs will simply have to eat any losses incurred - and that includes system failures due to over-current draw - or recall the card and lock the BIOS down to 830MHz/375W, which should be its death knell, since anyone buying this card is certainly looking at overclocking and benchmarking.

Economically (I'll use Overclockers.co.uk for costings since they have most of the UK's 6990 stock), a like-for-like comparison - tallied below - works out as follows:

HD 6990 is £557.99 inc VAT (cheapest price on the site)

or 2 x HD 6970 (at £264.98 inc VAT ea.) £529.96

or 2 x HD 6950 2GB (at £209.99 inc VAT ea.) £419.98
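For convenience, the totals and savings (a trivial tally of the prices listed above, inc VAT):

```python
# Quick tally of the Overclockers.co.uk prices listed above (GBP, inc VAT).
hd6990      = 557.99
hd6970_pair = 2 * 264.98   # 529.96
hd6950_pair = 2 * 209.99   # 419.98

print(f"2 x HD 6970 saves {hd6990 - hd6970_pair:.2f}")   # 28.03
print(f"2 x HD 6950 saves {hd6990 - hd6950_pair:.2f}")   # 138.01
```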

The 6990 can be overclocked to ~920-950MHz core and 5200-5400 memory before noise levels start to play a significant factor. Tweaktown managed 1000MHz core at the expense of 100% fan speed and 90dB - a noise level that probably exceeds most developed nations' guidelines for preventing industrial deafness.

Dual 6970's can generally clock higher - the caveat being that stacking the reference cards without a spare slot between them raises temps considerably, so you want either three-slot spacing between the PCIe x16 slots, or at least a card with non-reference cooling as the primary.

Dual 6950's can of course be unlocked to 6970 specification, although they can lack overclocking headroom in comparison with the "real" 6970 - but for a £100+ saving you can't have everything.

captaincranky, TechSpot Addict, said:

Since at one point in this discussion it was mentioned that more "mature" drivers don't always equate to a performance increase, I thought I'd throw this tidbit into the mix.

A video driver discovered by Windows 7 for my Intel IGP (GMA-4500) actually raised the Windows Experience Index from 3.5 to 4.3 (Aero) and 4.5 (business & gaming)!

This isn't the driver that's supposed to be installed with the board (Gigabyte Intel G41), but at least it works way better than Gigabyte's proprietary offering.

I was going to quickly rush out and buy Crysis, until I realized that multiplying by zero nets a lower product than dividing by zero. That said, it is less of a conundrum.

The monitor is brighter now, thankfully, since the Dell 23" IPS that I've got connected to it doesn't have much brill to give away in the first place. (300 cd/m²)

It's not possible to crossfire two GMA-4500s, is it? Should I have asked that in the video forum?

dividebyzero, trainee n00b, said:

Sorry cap'n, crossfiring GMA-4500s (or any Intel IGP) is a non-starter.

I'm glad to hear that Intel are hard at work on their driver revisions. If they are still churning out conventional IGP drivers that increase functionality to such a visible extent, then that bodes well for both the older platforms and on-die graphics. Hopefully the mobile sector is included, since that area seems to be a blind spot for most manufacturers.

I should have added that my musings on the 6990 were almost wholly devoted to gaming and the Almighty Frames Per Second. It was in fact all I could muster not to head off on a tangent by including UVD3/DXVA issues and clock rates, microstuttering, performance per watt, or the sacrilege of questioning the use of $700+ graphics cards with single TN-panel screens and gaming resolutions.

Leeky said:

Thanks DBZ, an excellent, informative post as always, mate.

grvalderrama said:

What I'm really expecting is for Nvidia to launch their GTX 590 so that the 6990 lowers its price tag!! Now, that should be taken into account when the "GTX 590 beats the $hit out of the 6990". Hopefully, AMD will lower its prices, hehe...

Eddo22 said:

grvalderrama said:

What I'm really expecting is for Nvidia to launch their GTX 590 so that the 6990 lowers its price tag!! Now, that should be taken into account when the "GTX 590 beats the $hit out of the 6990". Hopefully, AMD will lower its prices, hehe...

Good luck. If they throw two 580's together, yes, they probably will outperform the 6990, but they will also use more power, likely run just as hot, and have a REALLY staggering price tag of $800-$900!

I'm guessing it will be two 570's. That would be the only way they could compete imo.

I also wanna say I don't particularly care for this review. Pretty much every review I've seen of the GeForce 580 showed it was significantly slower than the old Radeon 5970. Techspot may be right in their reviews about the two cards, but it should also be mentioned that a lot of those games have been tweaked for Nvidia cards.

Also, Techspot claims the OC switch is negligible. Other sites show up to a 30fps increase in some situations! Also seeing 87°C as the top temp under major stress. Should be lower in actual gaming. Would be interesting to know if there is any fps to be gained with a faster CPU.

Imo the GeForce 570 is where it's at. You can get them SLI'ed for around $600 and they run pretty cool.

Granted, Nvidia's control panel being inferior and the lack of tweaking options make the GeForce less desirable.

dividebyzero, trainee n00b, said:

Also, Techspot claims the OC switch is negligible. Other sites show up to a 30fps increase in some situations!

You got some review links to back up that bs? ...and no, I don't mean some bs 1280x1024 benches at suicide-run clocks.

Tech Power Up .....mmmm no, not a single benchmark...

Anand.....and no, again -not even a 4fps difference in any bench...

Guru 3D.....no again...not even if you further OC the card to 955/5600...

how about Tech Report?....no!, I'm starting to see a trend...

OC3D...no....KitGuru...nada...Hexus...lucked out again...HardWare France...nope

Better head back to fanboy school and finish the course.

Also seeing 87°C as the top temp under major stress. Should be lower in actual gaming.

Yeah, the HD 6990 is a veritable refrigerator... of course, everyone knows thermographs aren't the way to go to measure temps. Ha, amateurs!

Would be interesting to know if there is any fps to be gained with a faster CPU.

Would also be interesting to know how many trolls spend significant periods of time grasping at straws... I guess we'll both remain in a state of suspense... except your musing might have been answered by KitGuru using a six-core i7 @ 4.33GHz... oh well, at least one of us has an answer.

Imo the GeForce 570 is where it's at. You can get them SLI'ed for around $600 and they run pretty cool.

Yeah...about the same level of performance as two unlocked HD 6950's for $500.

Granted, Nvidia's control panel being inferior and the lack of tweaking options make the GeForce less desirable.

When has Catalyst allowed the user to set user-defined game profiles?

I have Catalyst and RadeonPro set up on one machine and Nvidia's CP on another. The best and most streamlined functionality doesn't include red in its GUI. But each to their own... maybe you can take advantage of the "superior" AMD utilities when you install a couple of crossfired 6950's.

red1776, Omnipotent Ruler of the Universe, said:

Eddo, for god's sake... tap the mat... tap the mat!

Eddo22 said:

dividebyzero said:

Also, Techspot claims the OC switch is negligible. Other sites show up to a 30fps increase in some situations!

You got some review links to back up that bs? ...and no, I don't mean some bs 1280x1024 benches at suicide-run clocks.

Tech Power Up .....mmmm no, not a single benchmark...

Anand.....and no, again -not even a 4fps difference in any bench...

Guru 3D.....no again...not even if you further OC the card to 955/5600...

how about Tech Report?....no!, I'm starting to see a trend...

OC3D...no....KitGuru...nada...Hexus...lucked out again...HardWare France...nope

Better head back to fanboy school and finish the course.

You'd better re-read what I said, since you obviously didn't understand it... particularly the last sentence of that passage. BTW, I'm using a GeForce 570 OC right now.

Would be interesting to know if there is any fps to be gained with a faster cpu.

Would also be interesting to know how many trolls spend significant periods of time grasping at straws...I guess we'll both remain in a state of suspense...except your musing might have been answered by Kitguru using a six-core i7 @ 4.33GHz...oh well, at least one of us has an answer

Thanks for the link. You don't have to be an ***** about it.

Imo the GeForce 570 is where it's at. You can get them SLI'ed for around $600 and they run pretty cool.

Yeah...about the same level of performance as two unlocked HD 6950's for $500.

...and also a good chance of voiding your warranty and reducing the life of your card. Bad choice imo.

Granted, Nvidia's control panel being inferior and the lack of tweaking options make the GeForce less desirable.

When has Catalyst allowed the user to set user defined game profiles?

I have Catalyst and RadeonPro set up on one machine and Nvidia's CP on another. The best and most streamlined functionality doesn't include red in its GUI. But each to their own... maybe you can take advantage of the "superior" AMD utilities when you install a couple of crossfired 6950's.

I'm sure you just love how the page scrolls up to the top after changing a setting. I find it irritating myself.

Are you referring to the 'Manage 3D settings' panel? ATI Tray Tools can do that plus a heck of a lot more. As far as I can see, Nvidia owners have no such program that can compare; RivaTuner pretty much sucks in comparison.

Leeky said:

I have to say, having been a long-time ATI fan, that my current nVidia GPU and its software are somewhat irritating.

I much prefer the cleaner, more fluid software that you use to alter settings on the ATI/AMD cards.

red1776, Omnipotent Ruler of the Universe, said:

When has Catalyst allowed the user to set user-defined game profiles?

You must be talking about something completely different. I have an individual game profile (core/mem speed, 3D settings, how many GPUs, etc., etc.) for every game I own, set, held, and activated from within CCC for as far back as I can remember.

dividebyzero, trainee n00b, said:

You must be talking about something completely different. I have an individual game profile (core/mem speed, 3D settings, how many GPUs, etc., etc.) for every game I own, set, held, and activated from within CCC for as far back as I can remember.

I was thinking more along the lines of game-launch settings based on the .exe.

I tend to get inundated by customers who can't get their heads around having to do the whole profile/save/macro-hotkey thing. As for forcing AA in some OpenGL titles - forget it.

While I haven't spent a lot of time in CCC2, I don't think I've noticed any 'select from installed .exe' type setting. Am I missing something here? If you can post back with a quick step-by-step on setting customizable game profiles that launch automatically upon .exe initialization, I'd say my ISP would be eternally grateful.

I tend to use RadeonPro and/or Tray Tools. Getting a customer to update the CCC driver in a timely fashion is bad enough - likely helped by the new update prompt feature/Steam - let alone third-party utilities.

While we're on the subject of the driver: any guess on how long it's going to be before AMD decides to fix the cursor sticking in the top right-hand corner of the screen? It doesn't affect me in most games, but playing World in Conflict and Soviet Assault it's a real pita.

As I said in my earlier post...

But each to their own....

***********************

You'd better re-read what I said since you obviously didn't understand it
You got some review links to back up that bs? ...and no, I don't mean some bs 1280x1024 benches at suicide-run clocks.
Also, Techspot claims the OC switch is negligible. Other sites show up to a 30fps increase in some situations!

Really? I thought it was self-explanatory. According to your original quote, you are expounding that an HD 6990 at the 880MHz setting is capable of framerates up to 30fps higher than stock. I said that statement is bs. What's to understand? I don't see you posting any of these "up to 30fps difference" review links.

BTW, I'm using a GeForce 570 OC right now.

Good for you, eddo! I would never have guessed, judging by the GTX 570 plug in your first post. Pretty sly on your part.

...ATI Tray Tools can do that plus a heck of a lot more. As far as I can see, Nvidia owners have no such program that can compare; RivaTuner pretty much sucks in comparison.

nHancer (think of it as ATI Tray Tools for Nvidia cards...because strangely enough, that's exactly what it is.)

red1776, Omnipotent Ruler of the Universe, said:

I was thinking more along the lines of game-launch settings based on the .exe.

I tend to get inundated by customers who can't get their heads around having to do the whole profile/save/macro-hotkey thing. As for forcing AA in some OpenGL titles - forget it.

While I haven't spent a lot of time in CCC2, I don't think I've noticed any 'select from installed .exe' type setting. Am I missing something here? If you can post back with a quick step-by-step on setting customizable game profiles that launch automatically upon .exe initialization, I'd say my ISP would be eternally grateful.

I tend to use RadeonPro and/or Tray Tools. Getting a customer to update the CCC driver in a timely fashion is bad enough - likely helped by the new update prompt feature/Steam - let alone third-party utilities.

Okay, this is what I was talking about:

[link]

While we're on the subject of the driver: any guess on how long it's going to be before AMD decides to fix the cursor sticking in the top right-hand corner of the screen? It doesn't affect me in most games, but playing World in Conflict and Soviet Assault it's a real pita.

Are you kidding? I think that is now officially regarded as a feature!

I don't have that happen in any game, but I see it on all the forums... I can't even remember how far back that one goes, if it's the same as the "oversized cursor" thing. I guess I have been lucky and dodged that one.

dividebyzero, trainee n00b, said:

I've seen videos of the mutant cursor but I've never had to deal with it, and no customer has ever mentioned it. The sticking cursor happens just often enough to make me homicidal. If it happened all the time, I'd get used to making sure that the cursor didn't travel to that corner... but of course it's intermittent.

Yeah, the CCC preset. Personally I could live with it, but some customers end up going into meltdown mode when they try to set up a few macros and hotkeys. I wouldn't find it a particularly delightful experience on just the games/benches I use. A quick perusal of my CF rig's .exe's shows: UT 2004 (Out of Hell mod), Metro 2033, ET:QW, GTA 4, GTA:SA (textures out to around 500%+ of original), CFS3, Crysis Warhead, CoD:MW, Stalker:CoP, Stalker:ShoC Naradnaya Soljanka, Stalker ShoC Oblivion Lost Ult., Stalker:Clear Sky Complete, Far Cry 2, World in Conflict, World in Conflict:Soviet Assault, BFBC2 and an ongoing full-conversion mod of GTA:VC. I recently purged the machine of CoD:WaW, Dead Space, Singularity, Cryostasis, L4D2 and Cataclysm (which I've never actually played - not my bag). Add in FRAPS, CoreTemp, EasyTune, Kombuster, Afterburner, Memset, Everest, Prime95, 3DMark, Furmark, CPU-Z, GPU-Z and a few macros for auto-complete and spreadsheet apps, and it can get crowded pretty fast.

Want to know what kind of people can't navigate a bucketload of macros/hotkeys? My customers who, upon taking delivery of a perfectly set-up machine, decide to move all the sliders as far as they can. Even when you show them the error of their ways, they don't know what they are supposed to be seeing.

Oops, sorry, that pic is a little hard to read - here's the relevant bit. It shouldn't take too long to spot the problem! BTW, GPU-Z reported VDDC as 140°C and 142°C.

red1776, Omnipotent Ruler of the Universe, said:

Want to know what kind of people can't navigate a bucketload of macros/hotkeys? My customers who, upon taking delivery of a perfectly set-up machine, decide to move all the sliders as far as they can. Even when you show them the error of their ways, they don't know what they are supposed to be seeing.

And it gets worse than that.....

Maybe it's just my weird slant on things, but I think that if you have someone else build your high-test gamer, you should probably stay the hell out of the BIOS. I would say that 1 out of 5 go straight there, and a number of them have ****** things up to the point of not being able to get back into the BIOS. ...Why!? ...Why!?

dividebyzero, trainee n00b, said:

I think it simply comes down to being unable and unwilling to put in the hard yards in setting up the system. Most of my customers realize that they can auto-overclock, and most also realize that it isn't conducive to long-term system stability (or longevity) - but when they see me spend 30-100 hours fine-tuning the system for overclock and low voltage, trying out different RAM timings, cooling orientations/combinations etc., I think they get a little overwhelmed but still want to put their mark on the machine - if you know what I mean. Some sort of proprietary statement.

Consequently I get emails from customers who push the boundaries a little further and are elated that the system boots into the OS. All good, as I always leave a little performance on the table. Unless I'm water cooling or using a chiller, I never push a system to 100% if it's going into the hands of someone who will seldom if ever open the panels to clean the damn thing - unfortunately, a little success makes the budding uberclocker somewhat reckless.

BTW, that HD 5970 is basically toast. I told the owner to remove the card and give it a blast of contact cleaner every once in a while to keep the cooling chamber clear of blockages. Not only did he neglect that chore, he also cooked off the VRMs. The screenshot is a Furmark run I did AFTER I backed the clocks down - he had them at the stops on the right-hand side.

I believe he was trying for the Minnesotan Overclocking Technique

[link] :wave:

red1776, Omnipotent Ruler of the Universe, said:

I believe he was trying for the Minnesotan Overclocking Technique

....ummm helloooo!

Minnesota winter night: -23°C..... air exhausted from the case via the 0.20V GPU OC'ing method: 103°C.... really Chef!.... I expect more from you. :p

How about this here critter? I came across it today.

[link]

dividebyzero, trainee n00b, said:

Are you using it? (The app, not the GPU as a space heater!)

red1776, Omnipotent Ruler of the Universe, said:

Not yet, just found it. I am going to see if I can patch CF for FM 1.8 (the multi-GPU version would never run more than 2 GPUs).
