My daily rant


Per Hansson

Posts: 1,978   +234
Staff member
Yesterday I read this article on 3DMark03 over at [H]ard|OCP, and today they have posted somewhat of a follow-up, "Benchmarking Right". I have to say I agree with everything they say in there. When I visited their site again later, I found this:

NVIDIA has released the latest drivers, version 42.68, for all 3DMark03 testing and benchmarking. This driver will greatly improve your benchmark performance when running 3DMark03. You can download the driver from this link. (File size: 8.44MB) We received these drivers directly from NVIDIA, and they are available on their press FTP.

I mean, is there anything more pointless to optimize your code for? Hours of coding that could instead have been spent fixing bugs and optimizing the code for real games?!

Be sure to read the above articles and also Nvidia's own words on how they feel about 3DMark (scroll down half a page), then post your comments right below.

UPDATE: I've now read Nvidia's own lab report myself and can thus confirm that the information posted at Hardware Extreme (linked above) is authentic. This information is, however, confidential, so I cannot give you the file; don't even bother asking for it.
 
It is not a surprise that they are not happy now.

Last time, when they were leading, they didn't make any noise, because their cards were the top scorers, and driver after driver kept being released to increase that score.

Now their cards don't support DX9 and suffer a lot. Isn't this obvious?

Well, we know that NVidia is good at optimizing drivers for benchmarks; now they themselves confirm it. :grinthumb :grinthumb
 
Interesting info indeed!

However, I still believe the benchmark itself to be flawed, driver "hints" put aside...
 
I personally get a very low score.

But I don't think it is flawed all that much.

First reason I can think of: a 1GHz PC nowadays is fast enough for most people to do all their office applications, web surfing, music, programming, etc., and as some industry news says, there is no need to buy a 2.5GHz machine for that kind of work.
So most people don't really bother to upgrade their 1GHz+ machine to a 2GHz+ one.

But 3DMark03 shows the difference between a 1GHz and a 2.5GHz machine (in real office applications, you don't notice much difference).

Second reason from me: NVidia's GeForce 4 has no DX9 support at all. That means the 4th test won't be available to them (am I right?). Without this test, their score will definitely be low.
But ATI's 9x00 series has DX9 support, and this test alone is enough to make a big difference in the score.
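
To illustrate what I mean (with purely made-up weights and frame rates, since I don't know Futuremark's exact formula): the overall score is a weighted sum of the game test frame rates, and a test the card can't run simply contributes nothing:

```python
# Hypothetical sketch: how skipping a game test drags down a weighted
# overall score. The weights and fps numbers below are made up for
# illustration; they are NOT Futuremark's actual 3DMark03 formula.

WEIGHTS = {"GT1": 7.0, "GT2": 37.0, "GT3": 47.0, "GT4": 39.0}  # assumed

def overall_score(fps_by_test):
    """Weighted sum of game-test fps; missing tests contribute 0."""
    return sum(WEIGHTS[t] * fps for t, fps in fps_by_test.items())

# DX9 card: runs all four game tests, including the DX9-only GT4.
dx9_card = {"GT1": 150.0, "GT2": 30.0, "GT3": 25.0, "GT4": 20.0}

# DX8 card (e.g. a GeForce4): GT4 needs DX9, so it never runs at all.
dx8_card = {"GT1": 150.0, "GT2": 30.0, "GT3": 25.0}

print(overall_score(dx9_card))  # includes GT4's contribution
print(overall_score(dx8_card))  # otherwise same speed, far lower score
```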
 
But ATI's 9x00 series has DX9 support, and this test alone is enough to make a big difference in the score.

Not quite.

Only the R9500 and up have DX9 support.

The R8500/9100 and R9000 are DX8 cards.

All DX8 cards have a rough time of it; it's a monster of a benchmark.

:)
 
The thing is that the benchmark is not representative at all of how game engines are made.

There are so many unnecessary redraws; I'll quote Nvidia's report here:

The portion of this algorithm labeled “Skin Object in Vertex Shader” is doing the exact same skinning calculation over and over for each object. In a scene with five lights, for example, each object gets re-skinned 11 times. This inefficiency is further amplified by the bloated algorithm that is used for stencil extrusion calculation. Rather than using the Doom method, 3DMark03 uses an approach that adds six times the number of vertices required for the extrusion. In our five light example, this is the equivalent of skinning each object 36 times! No game would ever do this. This approach creates such a serious bottleneck in the vertex portion of the graphics pipeline that the remainder of the graphics engine (texturing, pixel programs, raster operations, etc.) never gets an opportunity to stretch its legs.
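
To make their arithmetic concrete (the exact pass breakdown is my own reading of the report, so treat it as an assumption): one ambient pass plus one lighting pass and one shadow-volume pass per light gives 11 skinnings for five lights, and if each shadow-volume pass pushes six times the vertices, the total vertex work comes out to the equivalent of 36 skinnings:

```python
# Sketch of the workload arithmetic in Nvidia's report. The pass
# breakdown (1 ambient pass + 1 lighting pass + 1 shadow-volume pass
# per light) is my assumption; the 6x extrusion factor is Nvidia's claim.

def skinning_passes(lights):
    """Times each object is skinned: one ambient pass, plus one
    lighting pass and one shadow-volume pass per light."""
    return 1 + lights + lights

def equivalent_skinnings(lights, extrusion_factor=6):
    """Vertex work expressed as whole-object skinnings, counting each
    shadow-volume pass at extrusion_factor x the vertices."""
    ambient_and_lighting = 1 + lights
    shadow = lights * extrusion_factor
    return ambient_and_lighting + shadow

print(skinning_passes(5))       # 11, matching the report
print(equivalent_skinnings(5))  # 36, matching the report
```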
 
Originally posted by PreservedSwine
Not quite.

Only the R9500 and up have DX9 support.

The R8500/9100 and R9000 are DX8 cards.

All DX8 cards have a rough time of it; it's a monster of a benchmark.

:)

I just meant to say that DX9 vs. DX8 makes a big difference in the score. :eek:

Yeah, only the 9500 and 9700 series have DX9 support.
 
The thing is that the benchmark is not representative at all of how game engines are made.

There are so many unnecessary redraws; I'll quote Nvidia's report here:
Nvidia is LYING!!! Please, do some research to investigate what they say; it turns out they are FLAT WRONG, which is very troubling on many levels.....

nVidia's allegations regarding 3DM2K3's vertex shader processing algorithms, which we discussed during our analysis of Test 2, are troubling. But our testing shows good scaling by test resolution, indicating that the shader tests are not purely vertex shader bound. Could there be inefficiencies in the vertex shader and stencil shadowing code? Possibly, but it isn't sufficient to totally bottleneck either GPU at the vertex processing level.
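
The logic behind that scaling check is simple: vertex-shader work doesn't change with screen resolution, but pixel work grows with the pixel count, so if the frame rate drops as the resolution goes up, the test can't be purely vertex-shader bound. Here's a rough sketch of the idea (the frame rates and the simple two-term timing model are my made-up illustration, not ExtremeTech's numbers):

```python
# Rough diagnostic: a purely vertex-bound test would run at roughly the
# same fps at any resolution; a pixel-bound one slows as pixels increase.
# Frame rates below are made-up example numbers, not real measurements.

def pixel_bound_fraction(fps_low, fps_high, pixels_low, pixels_high):
    """Crude estimate of how much of the frame time scales with pixel
    count, from timings at two resolutions (0 = vertex/CPU bound,
    1 = fully pixel bound). Assumes t = t_fixed + t_pixel * pixels."""
    t_low, t_high = 1.0 / fps_low, 1.0 / fps_high
    ratio = pixels_high / pixels_low
    # Solve the two-resolution system for the pixel-scaled term.
    t_pixel_high = (t_high - t_low) * ratio / (ratio - 1.0)
    return t_pixel_high / t_high

# Example: 1024x768 -> 1600x1200 is about 2.44x the pixels.
# 40 fps dropping to 25 fps suggests the test is mostly pixel bound.
print(pixel_bound_fraction(40.0, 25.0, 1024 * 768, 1600 * 1200))
```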

Yes, it's a synthetic benchmark MADE TO STRESS HARDWARE. If you want to see how a game performs, then simply play the game. But if you're interested in the limitations of your hardware, then run the synthetic benchie......

There seems to be some inconsistency coming from both nVidia and FutureMark. For its part, nVidia initially expressed concerns about the test methodology not reflecting "how games get made in the real world." But, taking this argument to its logical end, if a 3D benchmark tried to exactly emulate how games get made in the real world, it would be absolutely impossible to create a 3D benchmark at all.

At startup, games routinely ping the 3D driver to get a GPU ID, and once the game knows what hardware it's running on, the game engine will often set up a render state specifically for that GPU. In other words, different GPUs are made to do different amounts of work. That's the "real world" of 3D games, and because of the disparate workloads that result, it would be unfair and totally indefensible to create a benchmark that used this methodology. In other words, the render state has to be locked down so the playing field is even.
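
Just to illustrate what that looks like (a made-up sketch; the two vendor IDs are the real PCI vendor IDs, but the render paths and selection logic are purely illustrative, not any actual engine's code):

```python
# Illustrative sketch of per-GPU render paths, the "real world" practice
# described above. The paths and thresholds are invented examples.

VENDOR_NVIDIA = 0x10DE  # real PCI vendor IDs
VENDOR_ATI = 0x1002

def pick_render_path(vendor_id, max_pixel_shader):
    """Choose an engine render path from the GPU ID the driver reports.
    Different paths do different amounts of work per frame, which is
    exactly why a benchmark can't work this way."""
    if max_pixel_shader >= 2.0:
        return "dx9_path"            # full-precision shader path
    if vendor_id == VENDOR_ATI and max_pixel_shader >= 1.4:
        return "ps14_path"           # fewer passes on PS1.4 hardware
    if max_pixel_shader >= 1.1:
        return "ps11_path"           # more passes, same visual target
    return "fixed_function_path"

print(pick_render_path(VENDOR_ATI, 1.4))     # ps14_path
print(pick_render_path(VENDOR_NVIDIA, 1.1))  # ps11_path
```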
Also, did you notice Nvidia's quote here:
Finally, the choice of pixel shaders in game tests 2 and 3 is also odd. These tests use ps1.4 for all the pixel shaders in the scenes. Fallback versions of the pixel shaders are provided in ps1.1 for hardware that doesn’t support ps1.4. Conspicuously absent from these scenes, however, are any ps1.3 pixel shaders. Current DirectX 8.0 (DX8) games, such as Tiger Woods and Unreal Tournament 2003, all use ps1.1 and ps1.3 pixel shaders. Few, if any, are using ps1.4.

Think someone should let him know that Tiger Woods AND UT2K3 BOTH support PS1.4? There is virtually no performance difference between PS versions 1.1, 1.2, and 1.3. Why doesn't the test use a 1.3 version? Because the GeForce3 doesn't support it! It uses PS1.1, which performs virtually identically to 1.3, and Nvidia is slamming them for including MORE cards in the test?? PS1.4 represents a SUBSTANTIAL leap in technology over 1.1, 1.2, or 1.3, and is therefore tested as well as 1.1.

Of course ALL PS2.0+ hardware supports previous PS versions as subsets, so PS1.4 is natively supported by all PS2.0 parts, as are PS1.1, 1.2, and 1.3...
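
Which is also exactly why the fallback scheme in 3DMark03 makes sense. Here's a sketch of the idea (the shader names and pass counts are placeholders I made up): the app picks the best shader the hardware reports it can run, so PS1.4 hardware does the effect in fewer passes than PS1.1 hardware:

```python
# Sketch of pixel-shader fallback selection. Shader names and pass
# counts are placeholders; the principle is that PS1.4 hardware can
# collapse work that PS1.1 hardware must split over more passes.

# Shaders the app ships, best (highest version) first:
SHADERS = [
    (1.4, "scene_ps14", 1),  # single pass on PS1.4-capable cards
    (1.1, "scene_ps11", 2),  # fallback: same effect in more passes
]

def select_shader(max_ps_supported):
    """Return the best shipped shader the hardware can run."""
    for version, name, passes in SHADERS:
        if max_ps_supported >= version:
            return name, passes
    raise RuntimeError("no compatible pixel shader")

print(select_shader(2.0))  # ('scene_ps14', 1): PS2.0 runs PS1.4 natively
print(select_shader(1.4))  # ('scene_ps14', 1)
print(select_shader(1.3))  # ('scene_ps11', 2): GeForce3/4 fall back
```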

I mean, what is going on here? There is so much inaccurate info in that Nvidia document. They are trying to fool you, and it appears they are succeeding. Please, do some more research on 3DMark03, and you'll find it's a VASTLY improved, FORWARD-looking synthetic benchmark.

Here is another NON-BIASED review of it:

http://www.extremetech.com/article2/0,3973,888206,00.asp
 
Originally posted by Per Hansson
The portion of this algorithm labeled “Skin Object in Vertex Shader” is doing the exact same skinning calculation over and over for each object. In a scene with five lights, for example, each object gets re-skinned 11 times. This inefficiency is further amplified by the bloated algorithm that is used for stencil extrusion calculation. Rather than using the Doom method, 3DMark03 uses an approach that adds six times the number of vertices required for the extrusion. In our five light example, this is the equivalent of skinning each object 36 times! No game would ever do this. This approach creates such a serious bottleneck in the vertex portion of the graphics pipeline that the remainder of the graphics engine (texturing, pixel programs, raster operations, etc.) never gets an opportunity to stretch its legs.

What if that is the point of that test?
3dmark is supposed to test how good your card will be with future games, not current ones...
So instead of trying to figure out exactly what future games will do, they stress the vertex shader to the extreme...
Thus they'll give you an idea of how a future game with much skinning (and other stuff) will perform on your current setup...

$0.02
 
Originally posted by MrGaribaldi
What if that is the point of that test?
3dmark is supposed to test how good your card will be with future games, not current ones...
So instead of trying to figure out exactly what future games will do, they stress the vertex shader to the extreme...
Thus they'll give you an idea of how a future game with much skinning (and other stuff) will perform on your current setup...

$0.02

Not only that, but Nvidia is plainly WRONG in their conclusion...
Their PR machine is simply out of control, contacting many major hardware sites and spewing out false info.

It almost doesn't matter that Nvidia is lying, as it takes a bit of time and effort to prove their analysis completely wrong, and most users simply remember that first impression and move on.

Right now, the Nvidia PR machine is in full spin cycle...:(

If you're interested in facts, and not spin, Futuremark has responded, in detail, to Nvidia's unwarranted criticism, and takes their claims apart, in depth, one by one.

You can read it here:

http://www.futuremark.com/companyinfo/Response_to_3DMark03_discussion.pdf

It is a lengthy read, but worth it if you, for one second, are gulping down what Nvidia is pouring...:(
 
Hmmmm..... I don't remember Nvidia whining when 3DMark2001 was released, which tested DX8.1 and thus didn't reflect "current games". Maybe that was because their card was the top performer then, huh? 3DMark was a synthetic benchmark then too.

Nvidia's hypocrisy knows no bounds.
 