ATI versus Nvidia – Stop the Benching!

By andresmal
Sep 28, 2006
  1. After attending a LAN party this weekend, I have vowed never to attend another in my life. The venue was brilliant, as is to be expected from the people at Thermaltake, but instead of enjoying a day of intense gameplay, the entire afternoon was spent bickering over hardware performance. I do not believe this was a unique incident; many LAN parties suffer from a distinct lack of actual gaming.

    You might wonder what all the fuss is about. For the last few years the war between ATI and Nvidia has escalated, and it has now reached a new level of absurdity. The question of what makes a good gaming graphics card is an open one, and I do not believe there is a simple answer. It all boils down to what the user demands from the system and the type of games they prefer. In simple terms, the best graphics card is the one that can sustain the highest constant frame rate throughout a game. That raises another question: how many frames per second are sufficient? This, too, is open to debate, but my belief is that any graphics card that can hold a stable 30 frames per second at 1024 x 768 with 4x AA and high detail is more than sufficient.

    We can all agree that certain games perform better on one graphics card than another. This is down to the game programmers and is not due to any design failing on the part of the graphics card manufacturers. Programmers can sabotage the performance of a game on a specific graphics card during the development phase; the best example of this is Quake 4, where ATI scores much lower even though ATI outperforms Nvidia in almost every other title on the shelf.

    Set out below are the average frame rates from the latest and so-called greatest graphics cards money can buy. All tests were conducted at the following settings: 1024 x 768, high detail, 4x AA.



                           Sapphire   Sapphire   Sapphire    Nvidia   Nvidia    Nvidia
                           X1600XT    X1800XT    X1900XTX    7800GT   7900GT   7900GTX
    Age of Empires            21.3       53.7       83.6      42.6     57.2      64.0
    Black & White 2           14.0       42.1       46.1      25.5     33.7      42.1
    Call of Duty 2            16.1       29.1       59.4      36.0     43.5      48.5
    F.E.A.R.                  33.0       70.0       76.0      34.0     59.0      75.0
    Quake 4                   49.8       86.8       89.6      99.0    101.0     102.0
    Serious Sam 2             21.6       49.0       52.9      43.8     46.6      53.6
    3DMark 2006 (score)        825       1773       2166      1435     1860      2132

    Overall average (fps)     26.0       55.1       67.9      46.8     56.8      64.2

    (All game figures are average FPS at 1024 x 768, high detail, 4x AA; the 3DMark 2006 score is a synthetic index and is excluded from the overall average.)
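    For anyone who wants to check the arithmetic, a minimal Python sketch below reproduces the overall-average row from the six game results (the 3DMark 2006 figure is a score rather than a frame rate, so it is left out):

    ```python
    # Average FPS per card across the six game benchmarks above.
    # The 3DMark 2006 score is an index, not FPS, so it is excluded.
    results = {
        "X1600XT":  [21.3, 14.0, 16.1, 33.0, 49.8, 21.6],
        "X1800XT":  [53.7, 42.1, 29.1, 70.0, 86.8, 49.0],
        "X1900XTX": [83.6, 46.1, 59.4, 76.0, 89.6, 52.9],
        "7800GT":   [42.6, 25.5, 36.0, 34.0, 99.0, 43.8],
        "7900GT":   [57.2, 33.7, 43.5, 59.0, 101.0, 46.6],
        "7900GTX":  [64.0, 42.1, 48.5, 75.0, 102.0, 53.6],
    }

    for card, fps in results.items():
        print(f"{card:>9}: {sum(fps) / len(fps):5.1f} average FPS")
    ```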


    To wrap up the argument, we can clearly see that there is very little difference between our top contenders, the ATI X1900 XTX and the Nvidia 7900 GTX. When you take into account the 30 frames per second recommendation, neither of these graphics cards has anything to worry about.

    Clearly the best buy is the ATI X1900 XTX, and it is very cool: with its new water-cooling system it really is the dog's bollocks and will make your mates go green with envy. The second-best thing about the X1900 XTX is the price; no argument there, as it provides superb value for money.

    Clearly the main deciding factor in this case would be personal taste; all of these graphics cards, with the exception of the ATI X1600XT, will perform beautifully in most game environments.

    You might wonder what the outcome of the gameplay at the LAN party was. The winner played on an old, outdated system with a Sapphire All-In-Wonder 9600 Pro 128 MB, which only goes to prove that no matter how good your system is, it cannot cure a useless player. If you want to bench, I suggest you get a gym contract and leave the gaming to the real enthusiasts.

    Article written by: Andre Smal
    Red Dog Technologies South Africa
    http://www.reddog.co.za
    September 27, 2006
  2. Steve (TechSpot Staff)

    Nice write-up, but arguing over who has the best high-end, extremely overpriced graphics card is just a waste of time, like you said. Fortunately, when I attend LAN parties with friends it's all about the games and no one cares about the hardware. Honestly, the smartest people just go with whatever option delivers the most value at the time.

    Fanboys and those who are extremely ignorant argue over such things!

    As you said, the top contenders from ATI and Nvidia are always very close. Hell, I would happily take either a Radeon X1950XTX or a GeForce 7950GX2. When it comes to value for performance I would pick the X1900XT at the moment, but of course it comes down to your budget, and in the low-end to mid-range segment I would currently go with Nvidia. The 7300GT, 7600GS and 7600GT are all great products. At the end of the day you can't really go wrong with either camp, so WHO CARES? Pick one and game your heart out!!! After all, that's the only reason you should buy a decent ATI or Nvidia graphics card!
  3. cfitzarl (TechSpot Chancellor)

    It really depends on what games you're playing. OpenGL games benefit from ATI, and shader-heavy games benefit from NVIDIA; at least that's what I've noticed.
  4. MetalX (TechSpot Chancellor)

    Umm... I think you've got it backwards, cfitzarl, lol. The ATi cards are the ones with 48 pixel shaders (vs. 24 for Nvidia cards), and the Nvidia cards are the ones that pwn OpenGL games.
  5. Sharkfood (TechSpot Guru)

    I've found the only thing worth debating about video cards is the conditions/settings under which they are run... as well as looking at more than just "average" FPS.

    Finding a good apples-to-apples configuration set is what makes this a challenge, and very few websites have made the effort to do more than click "Go!" and then present the end scores in nice little graphs.

    Firing Squad is one of the few sites I've seen that has now adopted an "IQ" (image quality) baseline from which to benchmark video cards. I'd like to see more sites adopt this as well.

    F.E.A.R. is also a good tool, since it reports not only average FPS but also ranges and the percentage of frames within each range. Intel has also published white papers on how to properly benchmark games. Most of the figures consumers look at not only completely neglect the image quality behind them, but "average" FPS on charts does not indicate how particular hardware actually performs in games. The following image, for example, shows why a gamer would prefer a 63 fps average over a 68 fps one:
    [Image: Intel frame-rate distribution chart comparing a run averaging 63 fps with one averaging 68 fps]
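    To make the same point without the chart, here is a minimal sketch with made-up numbers (the 63/68 split is hypothetical, echoing the chart above): a run with the higher average can still be the worse experience if a chunk of its frames dips below 30 fps.

    ```python
    # Two hypothetical frame-rate traces: one steady, one spiky.
    smooth = [63.0] * 100                # a constant ~63 fps
    spiky  = [95.0] * 70 + [5.0] * 30    # averages 68 fps, but stutters badly

    def summarize(name, fps):
        below_30 = sum(f < 30 for f in fps) / len(fps)
        print(f"{name}: avg {sum(fps) / len(fps):.1f} fps, "
              f"min {min(fps):.1f} fps, {below_30:.0%} of samples under 30 fps")

    summarize("smooth", smooth)  # avg 63.0 fps, min 63.0 fps, 0% under 30 fps
    summarize("spiky", spiky)    # avg 68.0 fps, min 5.0 fps, 30% under 30 fps
    ```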

    More and more, though, gamers are becoming savvy to these shortcomings and to marketing gimmicks/driver tricks. I've noticed more people getting their hands on hardware who aren't afraid to RMA... and who look past 3DMark and fancy website graphs, paying more attention to how their hardware purchases handle and what value they add to their gaming experience. :)
  6. cfitzarl (TechSpot Chancellor)

    Oh, whoops, my mistake.
  7. MetalX (TechSpot Chancellor)

    Lol. That's alright. I personally hate all the ATi/Nvidia flame wars that are always raging on online forums.
  8. Steve (TechSpot Staff)

    That is an interesting post, the one with the image from Intel. I see where they are coming from, but it's incorrect in my experience. We always look at the low and high frame rates but report the average, as it gives the best indication of performance.

    I have never seen a system fluctuate that much from high to low; it just doesn't happen! If the average is lower, you can almost always expect the lowest and highest frame rates to be lower as well. That's just how it works in my experience.

    There is a little more to it than that, really. Sure, some sites don't post much in the way of results, but testing a graphics card in six or more games at three or more resolutions across six or more quality settings does take a lot of time. Although it can occasionally be enjoyable, it is generally quite mind-numbing :(
  9. Sharkfood (TechSpot Guru)

    Actually, using FRAPS with its histogram capability has indeed revealed many past driver revisions (usually arriving alongside new product launches) that showed very similar peaks and lows while posting artificially higher average framerates. This has been diminishing over time, and F.E.A.R. is an example of a benchmark utility that calls out such techniques.
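    For anyone curious what that kind of histogram analysis looks like, here is a rough sketch using a synthetic frame-time trace; it assumes a log of cumulative per-frame timestamps in milliseconds, roughly what a FRAPS frametimes log provides (the exact file layout may differ).

    ```python
    # Bin per-frame FPS into 10-fps-wide buckets so hitches stand out
    # instead of disappearing into the overall average.
    from collections import Counter

    def fps_histogram(frame_times_ms, bucket=10):
        hist = Counter()
        for prev, cur in zip(frame_times_ms, frame_times_ms[1:]):
            fps = 1000.0 / (cur - prev)          # instantaneous FPS for this frame
            hist[int(fps // bucket) * bucket] += 1
        return dict(sorted(hist.items()))

    # Synthetic trace: mostly ~60 fps with an ~80 ms hitch every 50 frames.
    times, t = [], 0.0
    for i in range(300):
        t += 80.0 if i % 50 == 0 else 16.7
        times.append(t)

    print(fps_histogram(times))  # {10: 5, 50: 294} -- the hitches are visible
    ```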

    Actually, it's much simpler than that. As Firing Squad has shown, they found the closest matching set of driver settings between competing products and then benchmarked with those.

    It's a well-known fact that ATI's default driver settings are a closer match to NVIDIA's drivers at their "HQ" setting than at the default setting. Something as simple as spending a few minutes comparing image quality in a couple of games before running benchmarks can yield a bit more objectivity and a closer apples-to-apples comparison. Of course, no precise solution exists... but getting even somewhat closer than default vs. default is of real value to the end consumer, especially given how close competing products have become.