3DMark 06 Score Low?

Status
Not open for further replies.

thehighroad

Posts: 45   +0
Hello!

I just finished building my PC as follows:

Asus 4870 DK 1GB (stock: 750MHz GPU, 900MHz VRAM) (stock cooling temps - idle: 41 Celsius, load: 50-55 Celsius)
WD Caviar Black 500GB
2GB G.Skill DDR2 1066 RAM (Running at 1.1GHz)
Intel Core2 Quad Q8200 2.33GHz (Running at 2.40GHz)
Gigabyte GA-EP45-UD3L
Rosewill 530W PSU

Windows XP 32-bit (the reason I haven't bought 4GB ram)

My 3DMark score doesn't seem to be particularly impressive when comparing it to others' with the 4870 GPU.

3D Marks - 13046
Shader 3.0 - 5038
Shader 2.0 - 6027
CPU - 3949

Any idea if there's a bottleneck or something? Btw, in practical terms Crysis is getting some annoying FPS drops; this shouldn't be due to the 2GB of RAM...
 
Well, with an ATI 4850 I get a 13050 3DMark06 score, but my CPU is also clocked to 3.8GHz. I think you would see a nice jump if you overclocked that CPU to around 3GHz.
 
Overclocking the CPU will have the biggest effect on your 3DMark06 score; 3DMark06 is heavily CPU oriented.
The range most people try to get to is 3.2-3.6GHz to really let a GPU breathe.

Another side problem is the fact that you're running a 32-bit system. x86 can only see a max of 4GB, but in practice it's more like 3.5GB, and this includes the memory on all the components too.
When the PC starts, the system maps all the hardware first, so 1GB of video RAM plus some HDD cache and DVD drive cache starts taking away from what's left for the system memory.
With a 1GB video card it is recommended to be on an x64 system with 4GB of RAM.
If you're interested in the new Win 7, don't hold back and get it right away.
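A rough sketch of that arithmetic: devices are memory-mapped into the same 4GiB address space as RAM, so their reservations come off the top. The 1GiB video aperture plus ~0.5GiB of other device mappings used here are illustrative assumptions; the exact reservations vary by board.

```python
# 32-bit Windows can address 2^32 bytes (4GiB) total; memory-mapped
# hardware (video RAM aperture, PCI devices, firmware) is carved out of
# that same space, so it reduces the RAM the OS can actually see.
def usable_ram_gib(installed_gib, mmio_gib):
    address_space = 4.0  # 2^32 bytes = 4GiB
    return min(installed_gib, address_space - mmio_gib)

# illustrative: 1GiB video card aperture + ~0.5GiB other device mappings
print(usable_ram_gib(4, 1.5))  # only ~2.5GiB of a 4GiB kit visible
print(usable_ram_gib(2, 1.5))  # a 2GiB kit fits entirely below the hole
```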
 
It just so happens that I will be getting Windows 7 32-bit pretty much today. But yeah it has to be the 32-bit version because it is an upgrade edition (I'm getting the HUGE college student discount!!!).

In terms of the RAM dilemma, I have the 2GB of 1066 RAM (548MHz clock; 1.1GHz effective), so I pretty much opted for the fastest I could get, hoping that could compensate for the 32-bit restriction. Besides, I've seen the benchmark comparisons of 2GB vs 4GB and they're neck and neck anyway--sometimes a game will be 2 FPS faster on 2GB and another game 3 FPS faster on 4GB... the only real difference seems to be loading time.
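For anyone puzzled by the two numbers quoted there: DDR memory transfers data on both edges of the clock, so the "effective" speed is simply double the real clock. A quick sketch:

```python
# DDR ("double data rate") transfers on both clock edges, so the
# effective transfer rate is twice the actual memory clock.
def effective_rate_mhz(clock_mhz):
    return clock_mhz * 2

print(effective_rate_mhz(533))  # stock DDR2-1066: 1066 MT/s
print(effective_rate_mhz(548))  # the 548MHz clock above: 1096 MT/s, i.e. ~1.1GHz
```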

Overclocking in terms of my CPU is quite limited, as you no doubt know. I kinda knew this coming in, so I guess I'll just ignore the benchmarks lol. (Also, my CPU runs at about 64 Celsius at full load as it is.)

Anyways, I'm excited to finally see DirectX 10.1 in action, but my hopes aren't too high; the marketing pictures, I'm pretty sure, are full of it.
 
Hey man, grab a decent CPU cooler and you can get that CPU up to 3GHz no problem. Its limit is around 3.5GHz, so you would have plenty of room to work with. Also, sheer speed on RAM is not always a great thing, and I know for a fact that Crysis can use a full 4GB of RAM, so 2GB is nowhere near enough, my man.
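For context on what hitting 3GHz involves on this chip: Core 2 CPUs run at FSB clock times multiplier, and the Q8200's multiplier is locked at 7x, so the whole overclock has to come from raising the FSB. Rough numbers:

```python
# Core 2 core clock = FSB clock x multiplier. The Q8200's multiplier is
# locked at 7x, so overclocking means raising the FSB.
MULTIPLIER = 7  # Q8200's fixed multiplier

def cpu_ghz(fsb_mhz):
    return fsb_mhz * MULTIPLIER / 1000

print(cpu_ghz(333))  # stock ~333MHz FSB: ~2.33GHz
print(cpu_ghz(429))  # ~429MHz FSB needed for ~3.0GHz
```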
 
Don't take these 3DMark benches too seriously. It's been shown that they favor Nvidia products by re-routing GPU functions to the CPU when the 3DMark bench is being run.
http://www.techreport.com/articles.x/17732
They are also heavily CPU weighted, even though it doesn't show up in real-world gaming performance.
Here is my run at 3DMark06: https://www.techspot.com/vb/attachment.php?attachmentid=52362&d=1254288882
You can see that there is only a 200-point difference between my CPU score and yours; the only difference is that I have three 4850's running in tandem. I get better FPS than most i7/GTX 295 systems... but you wouldn't know it by the weighted system they use.
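For what it's worth, the weighting isn't a secret: Futuremark publishes the formula. Assuming I'm remembering the SM3.0-path version from their 3DMark06 whitepaper correctly, plugging in the OP's subscores reproduces his total:

```python
# 3DMark06 overall score (SM3.0 hardware): a weighted harmonic mean of
# the graphics and CPU subscores, per Futuremark's whitepaper.
def overall_score(sm2, sm3, cpu):
    gs = 0.5 * (sm2 + sm3)  # combined graphics score
    return 2.5 * 1.0 / ((1.7 / gs + 0.3 / cpu) / 2)

# the OP's reported subscores
print(overall_score(6027, 5038, 3949))  # ~13046.5, matching the reported 13046
```

Note how lightly the CPU term (0.3) counts against the graphics term (1.7), yet a harmonic mean still lets a weak CPU subscore drag the total down.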
 

What frequency is your Phenom II X3 running at lol?!

I really don't get their assessment of 17000 3DMarks as a "common system." From what I've seen, that score's pretty rare.

Okay, taking a closer look at similar systems shows that their CPUs have been heavily overclocked. At least I have an explanation now!
 
Hi theHighroad,
the "common system" score you see there is the average score of that configuration (the i7 with the GTX 295). You can change the configuration in that column, and it will give you the average score of the chosen configuration from the data they have. I was able to get 20K+ because of the three 4850's in CrossFire, not because of my CPU.
 
That score is about where you should be with your system specs. I'm just under 13k with SLI turned off and up around 17k with SLI on.
 
I will second that, Metal. I looked in the ORB database and 13k is about where a 4870 and a Q8200 land.
 
Hi,

I was not aware that the common system configuration could be changed; I simply thought it was what Futuremark currently considered a "gold standard" for users, which is a standard I didn't agree with. I misinterpreted it! Thanks for clearing that up.

By the way, Red1776, I am still wondering what your Phenom X3 was clocked at, considering that your CPU score exceeds my quad core's score. I do understand that your 3-way CrossFire yields your impressive 20000 marks lol.
 
Hi again Highroad,
Oh, sorry about that, you asked that previously, didn't you? At the time I ran that, I was running my X3 720 @ 3.8GHz.
 
You think that's impressive...

here ya go.
[attached 3DMark06 screenshot: slicpu374mark06700core18283.jpg]
 
red1776, your link ( http://www.techreport.com/articles.x/17732 ) is a little confusing, as you state NVIDIA is the culprit, but the title clearly states "Intel graphics drivers employ questionable 3DMark Vantage optimizations", and we're not even talking discrete GPU, it's integrated.
For one thing it's INTEL, not NVIDIA, and for another it's 3DMark Vantage, not 3DMark06.

No offense, but did you read the article you were so quick to link?

I guess you're scratching at the PhysX implementation in 3DMark Vantage and NVIDIA cards?
PhysX can be disabled in the NVCP to allow equal grounds against ATI cards as well.

And to the O.P.: Win 7 has been designed to significantly improve memory handling when you're on an x64 system with 4GB of RAM. I wouldn't even consider an x86 system in this day and age.
 
Our system's overall score climbs by 37% when the graphics driver knows it's running Vantage.
That's not all. Check out the CPU and GPU components of the overall score:

The GPU score jumps by a whopping 46% thanks to Intel's apparent Vantage optimization. At the same time, the CPU score falls by nearly 10%. Curious.

Next, we ran a perfmon log of CPU utilization during each of 3DMark's CPU and GPU component tests. Vantage takes its sweet time loading each test, so our start and end times aren't perfectly aligned for each run. However, the pattern is pretty obvious.

Ahh, first of all, you're correct, it is Intel-optimized; I misspoke. Secondly, it has nothing to do with PhysX. Intel has an optimization that kicks in when it recognizes that Vantage is being run and offloads work for the test. For supposedly being the industry standard in graphics benchmarking, I think that hardly presents a level and accurate playing field on which to determine graphics performance. And as stated in the article, it's not the first time this has happened.
My contention (started in another thread a few weeks ago) was that the results I have gotten with this configuration, and many others I was testing, both in Vantage and 06, did not at all match the real-world performance and frame rates I was getting in actual gameplay. This realization is just one of many over the last couple of years pointing to some nefarious activity going on with Futuremark that has an explicit intention.
The situation gets more complicated when one considers optimizations that specifically target benchmarks. Synthetic tests don't have user experiences to improve, just arbitrary scores to inflate. Yet the higher scores achieved through benchmark-specific optimizations could influence a PC maker's choice of graphics solution or help determine the pricing of a graphics card.

As far as your post "you think that's impressive", not sure what you're getting at. My OC is 60MHz higher... and my 06 score is 2K higher, so goody for you, I guess. I in fact do not think that my 20K score is at all impressive when I consider the actual real-world frame rates I get compared to some systems that score much higher on Vantage and 06. I rather think it sucks, to be frank, and I trust Futuremark about as far as I can throw my neighbor's wife.
 
I in fact do not think that my 20K score is at all impressive when I consider the actual real-world frame rates I get compared to some systems that score much higher on Vantage and 06. I rather think it sucks, to be frank, and I trust Futuremark about as far as I can throw my neighbor's wife.
Seconded. I ran 3DMark06 with my 8800s at 750c/1890s/2200m and got 17.3k. Bumped them up to 780c/1963s/2304m and got 16.9k with the exact same CPU/RAM clocks and the same programs running in the background. Go figure.

Anyway, my bad to have further extended a thread about a topic which has been resolved already.
 
As far as your post "you think that's impressive", not sure what you're getting at. My OC is 60MHz higher... and my 06 score is 2K higher, so goody for you, I guess. I in fact do not think that my 20K score is at all impressive when I consider the actual real-world frame rates I get compared to some systems that score much higher on Vantage and 06. I rather think it sucks, to be frank, and I trust Futuremark about as far as I can throw my neighbor's wife.

That's my point (I should have elaborated more): your score is only 2k higher with three 4850's. I was just showing how my 8800GT's in SLI and 3DMark06 produce a score that is not representative of a real-world gaming scenario; in fact, with my GTX 280's in SLI I'm getting under 20k in 3DMark06, so obviously 3DMark06 is flawed, and as you pointed out, Futuremark tests should not be treated as a standard.

So goody for me, I guess... to be frank, well yeah, it is good ;-).
 
The 8200 isn't a crazy-fast processor, given its cut-down cache. That score isn't bad, but OC it some more and you shouldn't have a problem seeing 15k or more.
 