ATI Radeon HD 2600XT vs Nvidia GeForce 8600 GTS

By Julio Franco
Sep 12, 2007
  1. While Nvidia remains quite dominant in the high-end segment, a much larger battle is being fought to offer the best mainstream graphics product. Currently the two leading mainstream cards are the Nvidia GeForce 8600 GTS and the ATI Radeon HD 2600XT, both of which are priced well under $200.

    Today we will be comparing these two mid-range graphics cards head to head using a 512MB VisionTek Radeon HD 2600XT and a 256MB ASUS GeForce 8600 GTS. While neither product features overclocking out of the box, the Radeon has an obvious advantage in that it sports twice as much video memory. And while this may appear to be unfair, we have found that the average 512MB Radeon HD 2600XT graphics card retails for just $140, while the average price of a GeForce 8600 GTS sporting 256MB of memory is $160.

    http://www.techspot.com/review/66-radeon-hd-2600xt-vs-geforce-8600gts/

    Please leave your feedback here. Thanks!
  2. HalfHuman Newcomer, in training

    Use the latest drivers

    You could have done better by using the latest drivers for the test. ATI released Catalyst 7.9 some time ago, yet you used version 7.8. Put it in perspective: the ATI cards are newer than Nvidia's, and the new drivers continue to improve performance and quality. I'm not an ATI fanboy, since all I have is a mighty onboard Nvidia 6150 video solution, but just to make the comparison fair... :)

    One more thing I wanted to add: I believe the ATI 2600XT competes on performance and price with the 8600 GT. A power consumption test would be nice too. I know it depends a lot on the other components, starting with the power supply and motherboard, but I believe the comparison would be fair since the only difference would be the video card.
  3. Steve TechSpot Staff Posts: 1,286   +398 Staff Member

    Thanks for the feedback, and yes, we are aware that it is best to use the latest drivers when possible. However, the testing for this article was completed before the 7.9 BETA drivers were available.

    "I believe the ATI 2600XT competes on performance and price with the 8600 GT" - That may be so, but these days the 8600 GTS can be purchased for about the same amount as a 512MB HD 2600XT.
  4. HalfHuman Newcomer, in training

    I believe that, as we speak, 7.9 is final.

    About the price: you already said the 512MB of RAM isn't doing much for performance.

    Another thing I forgot to mention is the lack of very prestigious games like Quake 4, Doom 3, Half-Life 2 and Oblivion, which also serve as game engines.

    PS: don't forget about the power measurements! :)
  5. Steve TechSpot Staff Posts: 1,286   +398 Staff Member

    "I believe that, as we speak, 7.9 is final."

    I hope you do, as you have no idea when those tests took place. People are placing far too much emphasis on drivers, and I see no reason why the 7.9 drivers would shed any new light on games that we have had for quite some time now.

    "About the price: you already said the 512MB of RAM isn't doing much for performance."

    There are plenty of 512MB HD 2600XT cards and many gamers are buying them. Furthermore, it is what we were sent, and price-wise the two cards match up. Therefore we took two cards in a similar price range and compared them. I am sure there would have been far more readers up in arms if we had compared a 512MB HD 2600XT to a 256MB 8600 GT.

    "Another thing I forgot to mention is the lack of very prestigious games like Quake 4, Doom 3, Half-Life 2 and Oblivion, which also serve as game engines."

    Prey, I believe, is built on the "id Tech 4" engine just as Quake 4 and Doom 3 are; it is simply a newer game than the two you mentioned. So out of your list we missed only Half-Life 2 and Oblivion. We will take that on board, but you cannot get everything you want ;)

    "PS: don't forget about the power measurements!"

    Power measurements would have been interesting to include, and while I do believe they are relevant, I do not think they would have shaped the conclusion in any way. These cards use very little power in the grand scheme of things and can run off decent 400W power supplies. Power consumption is really more worthwhile when looking at high-end graphics cards that actually draw quite a lot of power.
  6. HalfHuman Newcomer, in training

    Maybe you should have included an 8600 GT in the comparison as well, plus a 2600 Pro and even an 8500 GT; I have seen a lot of offers on 8500 GT cards with crazy memory speeds (1400MHz) that don't have any reviews.

    I hope you understand the importance of drivers. Read the release notes for the last three Catalyst versions: they all address performance issues in the ATI 2400 and 2600 cards, and the performance increases range from a few percent to tens of percent. They also fix bugs, but that's another story.

    I must apologize for forgetting that Quake 4 is based on the Doom 3 (id Tech 4) engine, but if I'm not wrong it was modified to some extent.

    About the power consumption now... well, I have read some comparisons. Discrete video cards consume from 20W to 60W at idle and from 40W to 200W under load for a single card. The 20W figure belongs to the lesser cards with a 64-bit memory interface, little and slow memory, and a slow GPU with limited capabilities, such as the 2400, X1300 Pro, 7100GS, 7300GS and 8300GS, which are by no means gaming performers. The mid-range cards you tested start at around 40W at idle and draw 60-80W at full load. I believe that amount is not so little it can be ignored, since it sometimes accounts for half of the total system's power draw, and that is before counting power supply efficiency losses. It matters to me since my PC is on 24/7.
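    To put the 24/7 argument in concrete terms, here is a minimal back-of-the-envelope sketch of what a constant extra draw costs over a year of uptime. The 0.20 EUR/kWh electricity price is purely an assumption, and the 40W/20W idle figures are simply the rough numbers quoted in the post above:

```python
# Rough yearly energy figures for a PC running 24/7.
# Wattages are the idle-draw estimates quoted above; the
# electricity price is a hypothetical 0.20 EUR/kWh.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts: float) -> float:
    """Convert a constant power draw in watts to kWh per year."""
    return watts * HOURS_PER_YEAR / 1000

def annual_cost(watts: float, eur_per_kwh: float = 0.20) -> float:
    """Yearly electricity cost in euros for a constant draw."""
    return annual_kwh(watts) * eur_per_kwh

# A mid-range card idling at ~40W versus a low-end card at ~20W:
extra_kwh = annual_kwh(40) - annual_kwh(20)
extra_cost = annual_cost(40) - annual_cost(20)
```

    Under those assumptions, the 20W idle-draw gap between a mid-range and a low-end card works out to roughly 175 kWh, or about 35 euros, per year of continuous operation.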
  7. mailpup TS Special Forces Posts: 8,390   +205

    Perhaps you could then say something to this effect in the article. While not going into it in any detail, just mentioning this might address the concerns of those who wonder about high power consumption.
  8. Mirob TechSpot Paladin Posts: 841

    512MB is just a gimmick. If I were to buy a 2600XT it might be the 256MB GDDR3 version priced around $100, to use in an HTPC. I don't think I have ever seen a review of it. It would be nice to see how it suffers from the slower memory.

    The mid-range is such a disappointment this generation. I hope the next one will have some surprises.
  9. Julio Franco TechSpot Editor Topic Starter Posts: 6,510   +308

    We definitely have to draw the line somewhere on what games are tested, and we try to use those more recently released and popular. Oblivion is certainly a miss, but we also have Bioshock scores to share with you:
    http://www.techspot.com/article/64-bioshock-performance/

    I thought the same thing at first, but like Steve explained, our focus was on performance, which is by far more relevant when comparing the differences between these two cards. We could, and perhaps should, have mentioned power consumption, but it gets old when every review has to cover the very basics, relevant or not. Ultimately, neither card gets picked as a best buy, so there is even less reason to go into further detail ;)

    Like you said, it's nothing but a gimmick for a card in this range and at this performance level. The card we tested had the slower GDDR3 memory, so you can pretty much assume the performance we posted for our board is almost the same as a 256MB card's; perhaps a few fps of difference in a game here or there, but nothing too significant.
  10. TEAR_pUbLiUs Newcomer, in training

    I think I need to disagree.

    I just upgraded from an X1950 Pro to an HD 2600XT made by VisionTek.

    In games like Battlefield 2 and Half-Life 2: Episode 2, I have noticed incredible results. I have gained an average of 40+ fps in BF2, and in some cases up to 70+ in Episode 2.

    On the other hand, games like GRAW 2 haven't improved as much, but I have noticed an improvement.

    I do own FEAR, but haven't played it with my new 2600XT yet because I only got the card today. So many games, so little time!

    Just had to throw in my two cents after seeing it recommended to just get an X1950 Pro instead of the 2600XT. So far I don't have a single complaint. Hopefully everyone else out there deciding to upgrade has the same results.
  11. Steve TechSpot Staff Posts: 1,286   +398 Staff Member

    Amazing results, and I really have no idea why your Radeon X1950 Pro was so slow. What I do know is that the X1950 Pro is a faster graphics card than the 2600XT.

    Even in new games such as UT3...

    http://www.legionhardware.com/document.php?id=693&p=5

    Again amazing results and thank you for sharing them with us!

    Edit: I Googled this just to get some more results... here is another article backing up our claims:

    http://www.bit-tech.net/hardware/2007/08/14/radeon_hd_2600_xt_vs_geforce_8600_gt/1

    Note that the X1950 Pro is faster than the 2600XT in 90% of the tests, and in many cases significantly faster.
  12. TEAR_pUbLiUs Newcomer, in training

    I'm not sure what to say. Maybe I got a fluke X1950 Pro that was slow, but even with iTunes running in the background I can get over 100 fps at times while playing BF2 online; I even have FRAPS video to prove it. The results are no doubt astounding, so to me there is no reason not to upgrade. How likely is it that my X1950 Pro had problems? If so, I might as well return the 2600XT and keep the X1950 Pro. Haha.
  13. Hello everyone, this is my point of view and my experience, which cost me almost as much money as a new rig would these days...

    I recently sold my PowerColor X1950 Pro - bought new for 400 euros, sold broken for 12 (yes, twelve) euros with its brutal stock Arctic Cooling system on it...
    All I can say is that it was a piece of ****: expensive (I bought my first car for the same amount of money), unreliable (some games ran, some simply did not), and the card was weak, never mind its power supply needs - a brand new, branded and expensive 450W power supply died after 4 hours of intensive gaming the same day I upgraded it. Yuck!!

    On the other hand:
    I just bought a barely used MSI 8600GTS DDR3 for 18!! euros - it's January 2012. Passive cooling, red PCB, a very rare card, with shaders running at 1620MHz stock, which is more than the OC/OC2 versions from any manufacturer. I mounted 3 small fans on its radiator, and now I can play Skyrim at 1440x900 (the max for my monitor) on medium settings, or at 1280x800 on high, and I have no problems with any game all day long... 3 fans because without them the GPU temperature hit 115 degrees Celsius after about 6 minutes of playing Skyrim, but the card is absolutely brave running the newest games it was never designed for, and it fights wonderfully for its money.
    This is reality... now go and buy some Radeon for 18 euros and you will soon recall my words about a piece of **** ;)

    My system:
    Motherboard: MSI K9N Neo V2
    Chipset: nVidia nForce 520
    CPU: AMD Athlon64 X2 5000+ dual core, 2x2.6GHz
    Graphics card: MSI 8600GTS
    Chipset: nVidia G84
    RAM: Kingston 2x1GB DDR2 PC5300 667MHz in dual channel
    PSU: 350W no-name crap, beaten 8-14 hours a day under Skyrim!

    Make your own conclusion.
    Sorry for my bad English.
  14. red1776 Omnipotent Ruler of the Universe Posts: 5,867   +74

    Way to revive an ancient thread. :)
    But since you went to the trouble, what was your point?

