GPU TDP database

red1776

I found this over at Geeks3D.com. Handy if you are putting together a new system, or thinking of adding a second card.


NVIDIA


Graphics Card Model TDP (Watts)
Quadro 6000 204 source
Quadro 5000 152 source
Quadro 4000 142 source
Quadro 2000 62 source
Quadro 600 40 source
Quadro FX 5800 189 source
Quadro FX 4800 150 source
Quadro FX 3800 108 source
Quadro FX 1800 59 source
Quadro FX 580 40 source
Quadro FX 380 34 source
Quadro FX 380 LP 28 source
Quadro 400 32 source
GeForce GTX 590 365 (power limiter ON) source
ASUS ROG MATRIX GTX 580 Platinum 370 (peak 3D, OC) source
GeForce GTX 580 330 (peak 3D, OC) source
GeForce GTX 580 280 (peak 3D) source
GeForce GTX 580 244 (power limiter ON) source
GeForce GTX 570 240 (peak 3D, OC) source
GeForce GTX 570 217 (peak 3D) source
GeForce GTX 570 219 source
GeForce GTX 560 Ti 260 (peak 3D, OC) source
GeForce GTX 560 Ti 205 (peak 3D) source
GeForce GTX 560 Ti 170 source
GeForce GTX 560 190 (peak 3D, OC) source
GeForce GTX 560 150 source
GeForce GTX 550 Ti 116 source
GeForce GTX 480 260 (peak 3D) source: Geeks3D test
GeForce GTX 480 250 source
GeForce GTX 470 220 source
GeForce GTX 465 200 source
GeForce GTX 460 160 (1GB) or 150 (768MB) source
GeForce GTX 460 SE 140 source
GeForce GTS 450 106 source
GeForce GT 440 (retail) 81 (peak 3D) source
GeForce GT 440 (retail) 65 source
GeForce GT 440 (OEM) 56 source
GeForce GT 430 49 source
GeForce GT 420 50 source
GeForce GT 340 69 source
GeForce GT 330 75 source
GeForce GT 320 43 source
GeForce 315 33 source
GeForce 310 30.5 source
GeForce GTX 295 289 source
GeForce GTX 285 204 source
GeForce GTX 280 236 source
GeForce GTX 275 219 source
GeForce GTX 260 182 source
GeForce GTS 250 150 source
GeForce GTS 240 120 source
GeForce GTS 150 141 source
GeForce GT 240 69 source
GeForce GT 230 65 source
GeForce GT 220 58 source
GeForce GT 130 75 source
GeForce GT 120 50 source
GeForce G 210 35 source
GeForce 210 30.5 source
GeForce 205 30.5 source
GeForce G100 35 source
GeForce 9800 GX2 197 source
GeForce 9800 GTX+ 141 source
GeForce 9800 GTX 140 source
GeForce 9800 GT 105 source
GeForce 9600 GSO 105 source
GeForce 9600 GT 96 or 52 source
GeForce 9500 GT 50 source
GeForce 9400 GT 50 source
GeForce 8800 Ultra 175 source
GeForce 8800 GTX 145 source
GeForce 8800 GTS 512 135 source
GeForce 8800 GT 105 source
GeForce 8800 GS 105 source
GeForce 8600 GTS 71 source
GeForce 8600 GT 43 source
GeForce 8600 GS 43 source
GeForce 8500 GT 40 source
GeForce 8400 GT 38 source
GeForce 7950 GX2 110 source
GeForce 7900 GTX 84 source
GeForce 7800 GTX 81 source
GeForce 7600 GT 35 source
GeForce 7300 GS 16


ATI Radeon


Graphics Card Model TDP (Watts)
FirePro V9800 225 source
Radeon HD 6990 375 (@ 830MHz) or 450 (@ 880MHz) source
Radeon HD 6970 256 (Peak 3D, OC) source
Radeon HD 6970 250 (PowerTune +20%) source
Radeon HD 6950 252 (Peak 3D, OC) source
Radeon HD 6950 200 (PowerTune +20%) source
Radeon HD 6870 200 (peak 3D, OC) source
Radeon HD 6870 151 source
Radeon HD 6850 127 source
Radeon HD 6790 150 source
Radeon HD 6670 72 (peak 3D) source
Radeon HD 6670 66 source
Radeon HD 6570 60 (GDDR5) or 44 (DDR3) source
Radeon HD 5970 294 source
Radeon HD 5870 X2 376 source
Radeon HD 5870 188 source
Radeon HD 5850 151 source
Radeon HD 5830 175 source
Radeon HD 5770 108 source
Radeon HD 5750 86 source
Radeon HD 5670 61 source
Radeon HD 5650 60 source
Radeon HD 5570 42.7 source
Radeon HD 5550 40 source
Radeon HD 5450 19.1 source
Radeon HD 4890 190 source
Radeon HD 4870 X2 286 source
Radeon HD 4870 157 source
Radeon HD 4850 X2 230 source
Radeon HD 4850 114 source
Radeon HD 4830 110 source
Radeon HD 4770 80 source
Radeon HD 4670 70 source
Radeon HD 4650 55 source
Radeon HD 4550 25 source
Radeon HD 4350 20 source
Radeon HD 3870 X2 190 source
Radeon HD 3870 105 source
Radeon HD 3850 X2 140 source
Radeon HD 3850 75 source
Radeon HD 2900 XT 215 source
Radeon HD 2900 GT 150 source
Radeon HD 2600 XT 45 source
Radeon HD 2400 XT 25 source
Radeon X1900XTX 135 source
Radeon X1800XT 113 source
Radeon X1800XL 70
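
If you want to turn these TDPs into a rough PSU estimate -say, when adding that second card- here's a quick back-of-the-envelope sketch in Python. The CPU/board overhead and the headroom factor are my own ballpark assumptions, not anything from the Geeks3D data:

```python
# Rough PSU sizing from the card TDPs in the table above: sum the cards,
# add CPU and rest-of-system overhead, and leave some headroom. PSU ratings
# are DC output, so efficiency only affects what you draw at the wall, not
# the rating you need. Overhead and headroom values are ballpark assumptions.
def psu_estimate(card_tdps, cpu_tdp=130, rest_of_system=75, headroom=1.25):
    dc_load = sum(card_tdps) + cpu_tdp + rest_of_system
    return dc_load * headroom

# Example: adding a second GTX 570 (219w each, from the table above)
print(f"Suggested PSU: ~{psu_estimate([219, 219]):.0f}w")  # ~804w
```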
 
They seem to be missing a few:
GTX 560 Ti (OEM -cut down GTX 570) 210w
GTX 260 (216SP, 55nm) 171w
GT 545 (GDDR5) 105w
GT 545 (DDR3) 70w
GT 530 50w
9800GT Green Ed. 75w
8800GTS 640MB (112 SP) 150w
8800GTS 640MB/320MB (96SP) 143-146w
GT 140 105w
9600GSO 512MB (G94) 90w
GT 520 29w
G 315 33w
8600GT (GDDR3) 47w
G 405 25w
G 205 30.5w


HD 4860 130w
HD 4730 110w
HD 6450 18w (DDR3/625MHz core), 27w (GDDR5/750MHz core)
HD 2900 Pro 150w
HD 3690/3830 75w
HD 2600XT (GDDR4) 50w
HD 3650 65w
HD 2600 Pro 35w
HD 3470 30w
HD 3450 25w
HD 3430 20w
HD 2400 Pro 20w
 
C'mon Chef! ...a 4860!? There's no such thing! I mean really man!

" thought I would beat someone to it.:rolleyes:

I really tried to get my hands on four of those in '09 and couldn't pull it off; I think they were a 'Japan only' release, if memory serves.
 
Yup, Asia-only release, as are the 3690 and 3830. Doesn't stop them turning up elsewhere, though.

/You like relish or catsup with your crow? :)

actually a light balsamic vinaigrette :p :D

Where was that listing when I wanted them?

I had the most complete list I had ever seen bookmarked before I rebuilt last week; can't remember where it was though. It had a ton of CrossFire/SLI configurations etc. and what was supposedly the 'economy of scale' for SLI/CFX. I will endeavor to find it again.
 
I'm not sure about the U.S., but the 4860 started being available down here in early 2010 (a few months after the 5000 series launch), although they were easily sourced from mainland China since many importers with Chinese family/connections brought them into the country. At a rough estimate I'd say that a third of all XFX branded cards here are "Chinese market only" SKUs.


If you fancy balsamic crow I would suggest combining balsamic vinegar with soft brown sugar -just enough to get the consistency of wet sand. Marinate the crow for a minimum of 20 minutes but no more than an hour, then move the crow and marinade to an ovenproof dish, add a couple of whole star anise, half a cinnamon quill, a few peppercorns and a bay leaf, then pour in enough boiling brown chicken stock to cover the bird, and tightly cover the dish with a lid or aluminium foil. Place into a preheated oven (~160C) and braise for 40-60 minutes, depending on how big the crow(s) are -ravens would obviously take closer to an hour- and how many portions of crow you intend to eat.
Remove from the oven and allow to cool while still covered. Once cool, remove the birds, defat the braising liquor if necessary, and reduce the liquid over a medium-high heat until the desired consistency is reached. Season. Serve.

Works equally well, if not better, with chicken, duck or turkey.
 
Well let's see... get it wrong and get a recipe... hmmm... hey Chef... the GTX 280 makes a lousy PhysX card! :p :haha:
 
I had the most complete list I had ever seen bookmarked before I rebuilt last week; can't remember where it was though. It had a ton of CrossFire/SLI configurations etc. and what was supposedly the 'economy of scale' for SLI/CFX. I will endeavor to find it again.

I've seen a few piecemeal charts (such as this one at Tom's) but can't say I've seen a wholly complete list anywhere on the net. I've kept a db on graphics since around 1999-2000, which I started because of the number of vendor-special non-reference cards that were around at the time (Kyros, Rendition Verite, S3 Savage, nV TNT/GeForce2 etc.), so it was handy to be able to call up which version had what particular frame buffer/core etc. Quite handy when some manufacturers weren't keeping up with the DX specification, for instance.
 
Makes for a long post!

I think these charts are averages from online tech reviews. I also think they use English language reviews only and also use previews -two strikes against in my book. A lot of the numbers are accurate -as far as individual reviews show- but I would be very dubious about trusting the whole thing.
Cases in point:
HD 2900 XTX -no such card ever existed. Some review sites at the time postulated that the 1GB version of the XT was going to be the XTX. It wasn't -it's a HD 2900 XT, whose TDP incidentally is 215w- so 240 and 270w for single card usage is a little out (especially considering the numbers for the 512 and 256MB versions).

HD 4750 -again, no such card as far as I'm aware. I think the nomenclature was put forward by Hilbert over at Guru3D; his preview was, I think, the only mention of the naming.

There also seem to be some odd figures -easy enough to arrive at when there is a limited number of reviews, but cards such as the 4850 were heavily reviewed. Not sure how a card that usually comes in under its 110w TDP can end up with 149w under peak 3D load (first graph, opposite the second "a" in "Data" of the second "Card Name (Data Quality)" on the left hand side). They also seem to have a LOT of cards exceeding their factory TDP by a factor of ~50% in some cases. The ones that take the prize must surely be the GT 330 (an OEM downclocked GTS 250) with a TDP of 75w, the GT 320 (OEM downclocked GT 240 -43 watts), and the GeForce 315 (HD 5550 class card at 33 watts), all of which Geeks3D have at 300 watts!

I'd also go out on a limb and suggest that the 160w for the HD 3850 Trinity is someone's wishful thinking.
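
Flagging those outliers wouldn't take anything sophisticated either -a minimal sketch, using only the factory TDPs and chart readings mentioned above:

```python
# Sanity check: flag aggregated readings more than 50% above factory TDP.
# The card names and wattages are just the examples discussed above.
factory_tdp = {"GT 330": 75, "GT 320": 43, "GeForce 315": 33, "HD 4850": 110}
chart_peak = {"GT 330": 300, "GT 320": 300, "GeForce 315": 300, "HD 4850": 149}

for card, tdp in factory_tdp.items():
    peak = chart_peak[card]
    over = (peak - tdp) / tdp
    flag = " <- suspect" if over > 0.5 else ""
    print(f"{card}: {peak}w vs {tdp}w factory TDP ({over:+.0%}){flag}")
```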
 
I get the range varying; I have seen the 4850 reviewed and spec'd from 110w to 130w. The puzzling thing to me is the CF/SLI numbers that show lots of energy saving when running multiple cards... I think I need to invest in some testing equipment.
 
Yeah, some of the variances are all over the place:

GTX 580: Single card 310w, SLI 627w, 3SLI 660w, 4SLI 814w
GTX 480: Single card 310w, SLI 608w, 3SLI 862w, 4SLI 1120w... single and dual are comparable, as they should be (Kyle and co. measured GTX 580 SLI at 611w, which is on the money for accepted peak usage), but the tri/quad SLI variance is way too high.
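
Divide those chart totals by the card count and the problem is obvious -a quick sketch using only the figures quoted above:

```python
# Watts per card implied by the chart's multi-GPU totals (figures above).
# A card doesn't draw a third less power just because a sibling is installed,
# so the falling tri/quad numbers look like bad data, not 'economy of scale'.
charts = {
    "GTX 580": [310, 627, 660, 814],   # 1-, 2-, 3-, 4-way SLI totals
    "GTX 480": [310, 608, 862, 1120],
}

for card, totals in charts.items():
    per_card = [t / n for n, t in enumerate(totals, start=1)]
    print(card, [f"{p:.0f}w/card" for p in per_card])

# GTX 580 ['310w/card', '314w/card', '220w/card', '204w/card']
# GTX 480 ['310w/card', '304w/card', '287w/card', '280w/card']
```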

The trouble with aggregating reviews is that most reviews report power usage either as a whole system and/or at the wall (not taking into account PSU efficiency). Add in that every site seems to use a different metric for calculating max wattage -Furmark, Crysis etc.- and there are very few reviews worth taking into account; i.e. compare Xbit's regime to the quick-and-easy approach used by most mainstream sites (page 2 for the hardware implementation), and here applied to graphics cards in isolation.
 
The award of longest ever post seen by me on TSF goes to red!

Damn I nearly lost track of time while scrolling down that post :D

And I agree with DBZ that there are so many variations in testing methodologies that things become nearly pointless. I think it would be much more appropriate if there were some consensus among tech reviewers on developing a 'standard methodology' for testing hardware. Once someone has done that standard review, they can apply their own custom methods just to differentiate themselves from others.
 
The award of longest ever post seen by me on TSF goes to red!

Damn I nearly lost track of time while scrolling down that post

Now see here, Mr CanNorZealandian! I try to be helpful to the TS community... and this is the grief I get?! :p :D

Yeah, I don't know what to make of it either. I think Chef is spot on with them being tested under different "full 3D loads", from a game to the likes of Furmark.
I have noticed that sites are beginning to dump Furmark as a standard as it's wholly unrealistic.
 
But there is still not much uniformity with regard to testing methods.

Now see here, Mr CanNorZealandian! I try to be helpful to the TS community... and this is the grief I get?!

Yup, no question about the stated fact. :grinthumb:
 
being tested under different "full 3D loads" from a game to the likes of Furmark.
It comes down to bad information gathering as far as these graphs go. For some reason, some of these cards have their max load rated on the manufacturer's PSU minimum requirement for the system. The rest are just an ad hoc collection of review numbers trawled from across the net. A case in point would be the quad SLI GTX 480 numbers: Hilbert (one of the few sites that did a quad SLI review) has 1188w listed -close enough to the 1120w average on the Geeks3D charts.

1188w x claimed PSU efficiency (88-90%) = 1046-1069 watts. Minus a 130w CPU (980X), and minus an EVGA Classified board (chipset, RAM, southbridge, and two NF200 bridge chips @ 20-25w apiece) at roughly 100-120w. So 1046-1069 minus ~230w = 816 to 839w for the cards -not that far off the GTX 580 quad SLI numbers.
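
The same arithmetic as a throwaway sketch -every figure in it is one of the assumptions above (Hilbert's wall reading, the claimed efficiency range, rough CPU and board draw):

```python
# Back-calculating card power from an at-the-wall reading.
wall_watts = 1188        # Hilbert's quad-SLI GTX 480 reading at the wall
cpu_watts = 130          # Core i7-980X, rough figure
board_watts = 100        # EVGA Classified board, low end of the 100-120w guess

for eff in (0.88, 0.90):                 # claimed PSU efficiency range
    dc_watts = wall_watts * eff          # power delivered on the PSU's DC side
    cards = dc_watts - cpu_watts - board_watts
    print(f"{eff:.0%} efficient PSU -> ~{cards:.0f}w across the four cards")
# -> ~815w and ~839w; in the same ballpark as the GTX 580 quad-SLI figure
```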
I have noticed that sites are beginning to dump Furmark as a standard as it's wholly unrealistic.
Probably due to the fact that full torture tests like Furmark, Afterburner and OCCT are now being actively throttled, either by hardware voltage limiters and/or software. This will be more problematic once cards start consistently shipping with more than one BIOS setting -it's not hard to imagine a card series that ships with an 'energy saver' BIOS setting (heavily throttled), one or more stock settings (selective throttling based on temps/voltage), and a max headroom setting. Hands up those who don't think graphics cards in the near future will be EFI enabled and preloaded with a raft of profiles? You're going to need some selling point to entice people to upgrade their GTX 580s and HD 6970s when gaming graphics have largely stagnated. Full scene tessellation (medium term) and real time ray tracing (long term) are the carrots for the upgrade stick, but that is going to need a degree of resolve from both graphics manufacturers to move game devs. How hard do you have to look to find half the graphics fan base howling about the unfairness of, or lack of need for, max tessellation in the Heaven benchmark, Lost Planet 2, Metro 2033, HAWX 2 etc.?
Ah, yes... TDP calculation...
The bigger problem comes with what you replace those tests with. If you are using whole-system power consumption then you need to take into account the vagaries of CPU core/thread loading, RAM and hard drive usage, as well as whether the application is vRAM and/or GPU intensive. Add in multi-GPU efficiency (or otherwise), the rendering mode used, and CPU, system I/O and chipset efficiency/voltage usage, and you have enough variables to make standardization close to a pipe dream... and that's without taking into account driver optimizations (or otherwise) of the apps in question.
Another case in point would be the voltages that the HD 5000 series shipped with. Since most reviews are conducted at card launch, the power usage is weighted towards the lower voltage/TDP of the initial product. With the associated problems of the GSoD, BIOSes were then flashed/reflashed with higher voltage states to correct the problem. Very few reviews would have reflected the revised idle/2D/3D power consumption, since later reviews would have moved on to non-reference/factory-overclocked cards, which in some cases are markedly more power efficient, or markedly more power hungry.

Meandering off topic, I know.
What was the topic again?
 