DiRT 3 GPU & CPU Performance Test

By Steven on May 31, 2011, 3:30 AM

As one of the first games to take advantage of DirectX 11, Dirt 2 has been part of our graphics card benchmarks since its arrival in late 2009. Although it's been a crucial part of our testbed, Dirt 2 isn't quite as taxing as it was when the first DX11 cards arrived. With Crysis 2 disappointingly restricted to DX9 and few other knee-buckling games on the immediate horizon, we've been eagerly awaiting the next iteration of Codemasters' racing series.

The company answered our prayers last week, launching Dirt 3 for the PC, Xbox 360 and PlayStation 3. Developed with the latest (v2.0) EGO game engine, Dirt 3 is a spectacular looking racing game with some surprisingly high, but also incredibly vague recommended system requirements.

Codemasters recommends that you play with an AMD Phenom II or Intel Core i7 processor and an AMD Radeon HD 6000 series graphics card, but fails to mention specific models or anything at all from Nvidia. Meanwhile, the minimum requirements say you can scrape by with a paltry Athlon 64/Pentium D and HD 2000/GeForce 8000 class graphics.

While it's nice that gamers with five year-old machines can play Dirt 3, we're more interested in knowing what it takes to experience the game with all its visual splendor. As usual we've compiled the performance of over 20 graphics cards, all DX11 capable, at several different resolutions.

Read the complete review.




User Comments: 33

wardogz said:

Well, after reading this I was somewhat confused, so I ran the in-game benchmark myself a couple of times. It was the 'Aspen' track on both occasions and my average minimum framerates were 97.7 and 97.5 respectively, and that was only with an EVGA GTX 460 FTW 1GB, an i5 2500K CPU @ 4.2GHz, and 8GB of RAM. Somewhat different from the scores you obtained.

wardogz said:

OK, my bad. I'll always admit when I get it wrong. After discovering the discrepancies in our findings, I went back and re-checked my settings. I didn't realise the 'Ultra' settings were available (oops) and everything had been set to only 'High'. I adjusted everything I could to the highest possible (i.e. Ultra where available) and re-ran the benchmark, achieving a very different 50.1 average minimum fps. My apologies for doubting your findings earlier.

Guest said:

What resolution are the CPU-bound tests run at?

KRayner96 said:

Interesting results. I ran this benchmark the other day and with VSync on @ 1920x1200 I got an average of 60FPS (due to VSync being on, of course it couldn't be any higher). I'll test again tonight with VSync off, but I find it strange that my results are so much higher than yours considering I also had the highest level of AA enabled (whereas you had 4x).

Here are my Specs:

i5 2500K (not overclocked yet)

Asus P8P67 Motherboard

2x4GB Corsair 1666MHz RAM

MSI N560GTX-Ti Twin FROZR II - (stock settings for the card, albeit overclocked vs. a ref 560)

I used the latest beta Nvidia drivers (275.27) as well.

As mentioned, I will re-run the test tonight without VSync and will provide a screenshot as evidence.

Guest said:

Hello!

Why is the Radeon 5850 not in the list of tested graphics cards?

The 5830 and 5870 are there, so I don't see why we're missing the 5850.

After Sapphire's irresistible offer (Radeon 5850 Xtreme) I'm dying to see its performance compared to the new-gen cards...

Such a pity... :(

Peter

Guest said:

The game doesn't detect my Apollo gamepad, and the controls fail.

Guest said:

Nice one Steven,

Good to see the old 58XX is still going strong; it seems to give the newer 69XX a hard time.

Wondering how the old Penryn Quad would do with its massive 12MB cache.

The time of Dual Cores is over!

What I actually would like to see is a comparison between a console and a maxed-out PC, so at least we can see what we're spending our hard-earned bucks (or euros) on.

KRayner96 said:

Seems I made the same mistake wardogz did; I never knew the game was running on High (Automatic chose the High settings for me).

I get 56FPS with Ultra settings @ 1920x1200 and 4xAA, which is very similar to what you guys posted. The game definitely looks and runs very well.

KRayner96 said:

Mmm, with VSync on it runs just as well: 1fps less on average, and the lowest FPS difference is 2. Definitely worth leaving this setting on (as you get no 'tearing').

Guest said:

Can anyone explain why this is the only game I've seen where the 2600K isn't the best chip?

Guest said:

Sometimes newer isn't always better?

Guest said:

Great review Steven. Glad to see Eyefinity resolutions included. TechSpot is one of the few that keeps upping the ante for multi-monitor testing :)

Quick question, any idea if Catalyst 11.5b hotfix has any impact on Dirt 3 performance for AMD cards?

http://support.amd.com/us/kbarticles/Pages/AMDCatalyst115bhotfix.aspx

Guest said:

Why would these guys do a GPU test with this game over Witcher 2?

blimp01 said:

Surprising results for Sandy Bridge, I didn't even expect that. I know the Core 2s are outdated, but I might want to see a Q9550 or something in the mix next time, unless you don't have any on hand.

Matthew, TechSpot Staff, said:

A performance review of The Witcher 2 is incoming.

red1776, Omnipotent Ruler of the Universe, said:

Why would these guys do a GPU test with this game over Witcher 2?

Why wouldn't they? No, really, I'm curious why you would say that. Is The Witcher 2 supposed to be very demanding GPU-wise?

Steve, TechSpot Staff, said:

Interesting results. I ran this benchmark the other day and with VSync on @ 1920x1200 I got an average of 60FPS (due to VSync being on, of course it couldn't be any higher).

They are not as interesting as they are accurate, but I am happy that you did retest and noticed the Ultra settings.

Hello!

Why is the Radeon 5850 not in the list of tested graphics cards?

The 5830 and 5870 are there, so I don't see why we're missing the 5850.

After Sapphire's irresistible offer (Radeon 5850 Xtreme) I'm dying to see its performance compared to the new-gen cards...

Such a pity...

Peter

You know how fast the Radeon HD 5870 and 5830 are in relation to the 5850, so do you really need another card clogging up the graphs?

Mmm, with VSync on it runs just as well: 1fps less on average, and the lowest FPS difference is 2. Definitely worth leaving this setting on (as you get no 'tearing').

It's personal preference; some like to go well beyond 60fps, such as myself, and I don't seem to get the tearing either.

Can anyone explain why this is the only game I've seen where the 2600K isn't the best chip?

Unfortunately, no, I cannot. I ran the test several times and could not work it out; the 2600K is a much faster processor, so I am not sure why the game did not take to that platform.

Great review Steven. Glad to see Eyefinity resolutions included. TechSpot is one of the few that keeps upping the ante for multi-monitor testing

Quick question, any idea if Catalyst 11.5b hotfix has any impact on Dirt 3 performance for AMD cards?

[link]

It was meant to, but I tested well before that driver was released, sorry.

Why would these guys do a GPU test with this game over Witcher 2?

If the question is why, then the answer is we have. If the question is why not yet, then the answer is that the game was unfinished when released, and we waited for the first patch, which allowed us to play in full screen and boosted performance.

Why wouldn't they? No, really, I'm curious why you would say that. Is The Witcher 2 supposed to be very demanding GPU-wise?

It's actually quite incredible really, DX9 only and easily one of the best games I have ever seen (visual quality-wise). The game itself puts me to sleep though; I'm not an RPG fan.

Reiz3r said:

Mine is crashing with DirectX 11... =(

red1776, Omnipotent Ruler of the Universe, said:

Hello!

Why is the Radeon 5850 not in the list of tested graphics cards?

The 5830 and 5870 are there, so I don't see why we're missing the 5850.

After Sapphire's irresistible offer (Radeon 5850 Xtreme) I'm dying to see its performance compared to the new-gen cards...

Such a pity...

Peter

@ 1920 x 1200...

HD 5850 = 50 FPS

It's actually quite incredible really, DX9 only and easily one of the best games I have ever seen (visual quality-wise). The game itself puts me to sleep though; I'm not an RPG fan.

Not an RPG fan either, but if it's that good, I think I will give it a go.

*** Never mind... much too Dungeons & Dragons-y looking.

Guest said:

Can you please post a CPU benchmark using a Phenom II X4 @ 3.2GHz, so we can see the difference overclocking made?

KRayner96 said:

Matthew said:

A performance review of The Witcher 2 is incoming .

Awesome, I can't wait for that. I'm busy playing the game at the moment and it runs very well on Ultra settings (with UberSampling turned off, of course) @ 1920x1200. When I used Fraps it ran around 30-50FPS depending on the scene, etc.

KRayner96 said:

@Steve: The Witcher 2 definitely is very impressive. I think it's the first game that properly uses depth of field in gameplay as well as the cut scenes. Even the subtle blurring as you move around quickly is nicely done. Seeing the light filter through the trees in Chapter 1 is simply breathtaking. It's hard to explain just how good this game looks. I also love the way the scene changes as time passes, i.e. the light changes colour as the time of day changes, and this is amplified by whatever setting you are currently in: the forest, a tavern, etc.

I remember when the only game that looked this good was Crysis, and it needed DX10 for the dynamic lighting effects (sun filtering through the trees, etc.), although I know that was somewhat disproved by mods that achieved a similar effect using the DX9 codepath.

Steve, TechSpot Staff, said:

KRayner96 said:

Matthew said:

A performance review of The Witcher 2 is incoming.

Awesome, I can't wait for that. I'm busy playing the game at the moment and it runs very well on Ultra settings (with UberSampling turned off, of course) @ 1920x1200. When I used Fraps it ran around 30-50FPS depending on the scene, etc.

Our test using the Ultra settings with UberSampling saw the 560 Ti average 40fps exactly, so there you go.

KRayner96 said:

Say whaaaaat :-p

"Our test using the Ultra settings with UberSampling saw the 560 Ti..." So you got 40FPS with the 560Ti with it ON on Ultra settings?

If so, hmm, I haven't tried UberSampling since the patch or since I installed the beta drivers. I just recall wondering why I was getting 12FPS in the opening cinematic (where Geralt is being interrogated). After a quick Google I found out that UberSampling was the culprit.

If I may ask, how are you benchmarking The Witcher 2? I assume you are using FRAPs and running through a specific area multiple times?

Steve, TechSpot Staff, said:

Hell, that was a typo; what I meant to say is we used the same settings as you. So UberSampling disabled, sorry about the typo. Yes, UberSampling hammers the card down to about 10fps. We are also using Fraps to test 60 seconds of gameplay. The article is done and should be posted tomorrow, I suspect.
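For anyone curious how average and minimum FPS figures are typically derived from a frame-time capture like the 60-second Fraps runs mentioned above, here is a minimal Python sketch. The file name and log format (one cumulative millisecond timestamp per frame) are purely illustrative assumptions, not a description of TechSpot's actual tooling or of Fraps' exact output.

    # A minimal sketch of deriving average and minimum FPS from a frame-time log,
    # assuming a hypothetical file (frametimes.csv) with one cumulative timestamp
    # in milliseconds per line -- not an exact replica of Fraps' own output format.

    def fps_stats(path="frametimes.csv", window_ms=1000.0):
        with open(path) as log:
            times = [float(line) for line in log if line.strip()]

        total_seconds = (times[-1] - times[0]) / 1000.0
        average_fps = (len(times) - 1) / total_seconds  # frame intervals per second

        # Approximate minimum FPS: count frames rendered in each one-second window.
        per_second, window_start, count = [], times[0], 0
        for t in times[1:]:
            count += 1
            if t - window_start >= window_ms:
                per_second.append(count)
                window_start, count = t, 0
        minimum_fps = min(per_second) if per_second else average_fps

        return average_fps, minimum_fps

    if __name__ == "__main__":
        avg, low = fps_stats()
        print("Average: %.1f fps, minimum: %.1f fps" % (avg, low))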

KRayner96 said:

I thought as much. Looking forward to it.

Guest said:

It would have been nice to see the performance of the Phenom II X3 720, as this would unveil the true nature of how much of an impact an extra core has (and, if Dirt 3 uses the third core appropriately).

red1776, Omnipotent Ruler of the Universe, said:

I'm not sure what "unveil the true nature of how much of an impact an extra core has (and, if Dirt 3 uses the third core appropriately)" means, but it uses at least six cores, loads them evenly, and will use everything your X3 720 has. (I played it on my X3 720 machine as well.)

Dirt 3 on Phenom II x6:

Guest said:

Just a note about quality presets: you said the game doesn't have them, but if you select "Choose Preset" a pop-up menu will appear with various choices.

Guest said:

This game screams along in DX9.3 on default quality settings with just a Pentium Dual Core E5200 @ 3.8GHz and an Nvidia GeForce GTS 250 @ 1600x1200 in WinXP. Comfortably above 60fps all the time, no slowdown, no jerkiness, so you really don't need the latest and greatest hardware for Dirt 3. Looks great as well.

Guest said:

I'd have liked to see a DX10/10.1 card as well... perhaps one that meets the minimum requirement or sits somewhere in between the minimum and recommended. People with, e.g., an HD 3870 have more to worry about than someone with a 6770.

Guest said:

I have to disagree with the summary; I'm playing the game in Eyefinity with just one 6970 & an i5 2500K @ 3.3GHz at 5760x1080 with 4xAA, all settings high to very high, at 50fps.

My second PC with a 5750 & a Phenom II X3 @ 2.5GHz runs the game at 50fps at full HD 1920x1080 with settings at high. 0.o

Guest said:

I would definitely like to see Core i3 performance in the latest multithreaded games.

Please include Core i3s in your benchmarks.

Load all comments...

Add New Comment

TechSpot Members
Login or sign up for free,
it takes about 30 seconds.
You may also...
Get complete access to the TechSpot community. Join thousands of technology enthusiasts that contribute and share knowledge in our forum. Get a private inbox, upload your own photo gallery and more.