Gigabyte GeForce GTX 570 (1280MB) Review

As much as I would like to do SLI, I can't, because I have an AMD processor running on a board with an AMD chipset. The last time I checked there were only two motherboard models left featuring an nForce 980a SLI chipset capable of handling modern AM3 processors, and neither is available where I live. So single-GPU configs are more relevant to me.
 
I own Metro 2033, and the game most certainly has a built-in benchmark; it was included with the free Ranger DLC Pack. It is in the steamapps\common\metro2033 folder and is called Metro 2033 benchmark.exe.
I've seen "Metro 2033" referred to as "the worst coded game out there".

So here's my two-part question: is this true?

If it is, then why is "Metro 2033" used as a benchmark? Shouldn't it be rewritten?

It does seem silly, forcing manufacturers to create hardware based on the ability to run poorly coded, bloated software.

Or perhaps the hardware makers should adopt this ability into their advertising: "Our VGAs are soooo fast, they'll even run this pig"?

Then we might not have every juvenile delinquent on the internet posting the same tired crap, wondering "Will this supercomputer run Crysis?" (Meanwhile, each one is thinking that's never been said before.)

(You can substitute "Metro 2033" into that last question. I'm sure the herd will catch on to using "Metro 2033" to say witty things about supercomputers eventually)... :yawn:
 
A poorly coded game - or in Metro's case, a basically straightforward game that had advanced graphics features "tacked on" (tessellation, depth of field, etc.) when the game was essentially finished* - often makes the best benchmark. A lack of optimization can then stress every component: the GPU, its interconnect (PCI-E bus) with the CPU, scheduling, the frame buffer in both vRAM and system RAM, and of course the resultant power draw, stability, and heat production and dissipation from all that activity.

*Metro 2033 was developed by part of the team that worked on the original S.T.A.L.K.E.R. game, whose X-Ray engine is also a real handful for the majority of graphics cards - partly because X-Ray was never intended to handle the amount of code that GSC managed to shoehorn into the later games (which are graphically very similar to Metro), and partly because of the sheer number of shader and signal processing options available in the in-game settings. It's probably no coincidence that Metro's 4A engine bears more than a passing similarity to X-Ray.

A well-coded game - such as most that use the more "polished" Unreal engines - usually produces framerates so high that the graphics card is essentially waiting on system limitations (with high-end cards), or is held back by a lack of frame buffer, vRAM or core speed (with lesser-specced cards), with the result that card components sit relatively idle during the frame rendering process.
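To put toy numbers on that idling (entirely made-up figures, purely to show the arithmetic):

```python
# Made-up numbers illustrating GPU idling in a system-limited game:
# if the CPU/system can only issue frames every 5 ms, a GPU that
# renders a frame in 2 ms spends most of each frame waiting.

cpu_ms_per_frame = 5.0   # assumed: system/CPU limit (~200 fps cap)
gpu_ms_per_frame = 2.0   # assumed: GPU render time per frame

fps = 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)
idle = 1.0 - gpu_ms_per_frame / max(cpu_ms_per_frame, gpu_ms_per_frame)
print(f"{fps:.0f} fps, GPU idle {idle:.0%} of each frame")  # 200 fps, 60%
```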

Probably a lot more than you wanted to know (by about three paragraphs!), but I'm sure there are others out there wondering why certain games tend to be used as benchmarks/stress tests while others are not.
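For anyone curious what a benchmark run actually boils down to numerically, here's a minimal Python sketch. The frametimes.csv input and its one-value-per-line format are assumptions for illustration - the real Metro 2033 benchmark writes its own report format:

```python
# Minimal sketch: summarize a benchmark run from per-frame render times.
# Assumes a hypothetical "frametimes.csv" with one millisecond value per
# line -- not the actual Metro 2033 benchmark output format.

def summarize(frame_times_ms):
    """Reduce raw frame times to the numbers reviews usually quote."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    min_fps = 1000.0 / max(frame_times_ms)   # worst single frame
    max_fps = 1000.0 / min(frame_times_ms)   # best single frame
    # 99th-percentile frame time: a better "smoothness" indicator
    # than min FPS, since a single hitch skews the minimum.
    p99_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    return avg_fps, min_fps, max_fps, p99_ms

with open("frametimes.csv") as f:
    times = [float(line) for line in f if line.strip()]

avg, lo, hi, p99 = summarize(times)
print(f"avg {avg:.1f} fps, min {lo:.1f}, max {hi:.1f}, "
      f"99th pct frame time {p99:.1f} ms")
```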

To answer the first question you posed... no and yes. The game can be played relatively successfully on lower graphical settings by a good percentage of gaming systems - it's not Minesweeper by any means, but it still offers a playable experience for many. Nor is the game plagued by bugs that lead to lock-ups or crashes to desktop, or by broken storylines - which I would also consider hallmarks of a poorly coded game. However... adding in the image quality settings that were late additions to the game engine and then subjecting them to 4x MSAA will very quickly escalate the card's workrate.
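To put a rough number on that escalation, here's a back-of-envelope sketch. The buffer sizes and pass count are assumptions, and real hardware compresses buffers, so treat it as an upper-bound illustration only:

```python
# Back-of-envelope: why 4x MSAA escalates the card's workrate.
# Assumed buffer sizes; real GPUs compress colour/depth, so this is
# only an upper-bound illustration.

width, height = 1920, 1200
samples = 4                 # 4x MSAA: 4 colour + 4 depth samples/pixel
bytes_per_sample = 4 + 4    # 32-bit colour + 32-bit depth

framebuffer_mib = width * height * samples * bytes_per_sample / 2**20
print(f"MSAA framebuffer: {framebuffer_mib:.0f} MiB")   # ~70 MiB

# Every full-screen effect pass (DoF, HDR tonemap, blur...) reads and
# writes buffers of this order, so the traffic multiplies quickly:
passes, target_fps = 5, 60
traffic_gbs = framebuffer_mib * 2 * passes * target_fps / 1024
print(f"~{traffic_gbs:.0f} GB/s of raw buffer traffic at {target_fps} fps")
```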

If you're a fan of horror-survival in a post-apocalyptic world full of mutants, I'd give it a go... if you don't get enough of that sort of thing going to the grocery store, that is.
 
If you're a fan of horror-survival in a post-apocalyptic world full of mutants, I'd give it a go... if you don't get enough of that sort of thing going to the grocery store, that is.
"Mutant", is that any thing like a "crack head"?

As to "grocery store", no sweat. You just wait til the gunfire dies down, then make a run for it....:rolleyes:
 
A poorly coded game - or in Metro's case, a basically straightforward game that had advanced graphics features "tacked on"
I hate to have you expound further than your "three paragraphs", Chef, but you may be able to answer this. I use a pedestrian definition of "poorly coded": if it returns disproportionately bad performance for the hardware thrown at it, I call it poorly coded. When you said "tacked on" graphics features, what exactly is happening there to cause terrible performance? Is it that the game is working with a separate coding loop (or loops) for the advanced features, and that this takes exponentially more resources than having it all in the same loop? I would really like to know, as I wouldn't know bad software code if it bit me in the ***.
 
If it returns disproportionately bad performance for the hardware thrown at it, I call it poorly coded. When you said "tacked on" graphics features, what exactly is happening there to cause terrible performance? Is it that the game is working with a separate coding loop (or loops) for the advanced features, and that this takes exponentially more resources than having it all in the same loop? I would really like to know, as I wouldn't know bad software code if it bit me in the ***.
Just get a copy of Adobe Photoshop Elements 5, then compare it to versions 6 or later. That will familiarize you with what poorly coded software is all about.

PSE 5 will import photos into its organizer about five times faster than the later versions, which won't even fully generate thumbnails on the fly. Couple that with a**h*** s*** like face recognition, which, like many other features in this program, is just poured over the top of old code like so much "adobe" mud. I wonder if that's where they got their name.

In any event, PSE is now well over a 1GB download, and is being programmed in some 3rd-world reform school.

If a game meets these basic criteria, then I'd probably brand it "poorly coded" too.
 
Just get a copy of Adobe Photoshop Elements 5, then compare it to versions 6 or later. That will familiarize you with what poorly coded software is all about.

PSE 5 will import photos into its organizer about five times faster than the later versions, which won't even fully generate thumbnails on the fly. Couple that with a**h*** s*** like face recognition, which, like many other features in this program, is just poured over the top of old code like so much "adobe" mud. I wonder if that's where they got their name.

In any event, PSE is now well over a 1GB download, and is being programmed in some 3rd-world reform school.

If a game meets these basic criteria, then I'd probably brand it "poorly coded" too.

I wondered about the Adobe programs. I purchased the Adobe Creative Suite for school and a writing position I landed, and noticed that InDesign CS5 takes 9 minutes to open on the college workstations, and several seconds to render a circle with a stroke and fill.
 
I hate to have you expound further than your "three paragraphs", Chef, but you may be able to answer this. I use a pedestrian definition of "poorly coded": if it returns disproportionately bad performance for the hardware thrown at it, I call it poorly coded. When you said "tacked on" graphics features, what exactly is happening there to cause terrible performance? Is it that the game is working with a separate coding loop (or loops) for the advanced features, and that this takes exponentially more resources than having it all in the same loop? I would really like to know, as I wouldn't know bad software code if it bit me in the ***.

I'm no software coder either, but as a "for instance": field of view (FOV) in Metro is spread over three separate configuration files (plus, from memory, an overarching .cfg file in AppData). I've had to alter each file when setting up customers' Eyefinity/Surround configurations in the past, especially with 16:10 monitors - since patched, I presume.
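For the curious, that multi-file juggling looks something like this rough sketch - the file names and the r_base_fov key below are placeholders, not the game's actual identifiers:

```python
# Rough sketch of patching one setting across several config files,
# the way Metro spreads FOV around. File names and the "r_base_fov"
# key are placeholders, not the game's actual identifiers.
import re
from pathlib import Path

CONFIG_FILES = [          # hypothetical locations
    Path("user.cfg"),
    Path("video.cfg"),
    Path("engine.cfg"),
]

def set_fov(path: Path, fov: float) -> None:
    text = path.read_text()
    # Replace e.g. "r_base_fov 50.0" with the new value.
    patched = re.sub(r"(r_base_fov\s+)[\d.]+", rf"\g<1>{fov}", text)
    path.write_text(patched)

for cfg in CONFIG_FILES:
    if cfg.exists():
        set_fov(cfg, 65.0)   # wider FOV for a 16:10 Surround setup
```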
In general, I'd look at a scenario of advanced DoF overlaid on soft particles and HDR - all then subjected to multisampled AA - so not poorly coded per se, but poorly optimized for present hardware, given that the game's IQ settings options are fairly minimal to say the least. A better options list for choosing IQ effects, rather than default/hidden values or basic switches* (AAA or 4x MSAA, for example), would go a long way toward alleviating this - although the effects seem hardcoded into the game, hence the limited user settings.
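On the "separate loops" intuition: it's roughly right in spirit. Each tacked-on effect tends to be another full-screen pass over every pixel, so the cost stacks with pass count and multiplies with sample count. A toy sketch of the idea (pure illustration, nothing like real renderer code):

```python
# Toy illustration of "tacked-on" post-processing: each late-addition
# effect is another loop over every pixel, so frame cost stacks with
# pass count and multiplies with MSAA sample count. Pure illustration,
# not real renderer code.

PIXELS = 1920 * 1200
MSAA_SAMPLES = 4

def base_pass():
    return PIXELS * MSAA_SAMPLES         # geometry shaded per sample

def fullscreen_pass():
    return PIXELS                        # post effects run per pixel

cost = base_pass()
for effect in ("soft_particles", "hdr_tonemap", "depth_of_field", "blur"):
    cost += fullscreen_pass()            # each effect: one more full loop

print(f"~{cost / PIXELS:.1f}x the work of shading the frame once")  # ~8x
```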

To my way of thinking, I would categorize as "poorly coded" any game that is unplayable at the highest in-game settings on virtually every system in existence. The game is likely to be playable (with all the bells and whistles) on a single-GPU card once we get to Southern Islands/Kepler on 28nm - about a year away... and that is not going to help sales of the game in 2010. Indeed, the game is usually found in the bargain bin, primarily because the system requirements are high (my version lists an E8300/Phenom II X2 550 and GTX 260/HD 4870 as recommended) and in large part because of the negative press the game has received over its playability - not its content (which personally I think is very good for a linear shooter).
Crysis required a GPU generation built a year after the game's launch to be fully playable at HD (or better) levels. I would hazard a guess that that game's iconic status is due more to its "unplayability" than to its sales.

* Some of the graphical options built into Metro 2033 can be seen on page 4 of this interview with Oles Shishkovtsov.
 
Just got one of these and I'm very, very happy. It replaces my 5850, which was OK; I skipped the shambles that was the 400 series.

Running on an EX58-UD4P board
i7 chip
12GB RAM
blah blah bits and pieces :)

This card has made my LED monitor really come to life - TV, games, movies, etc.!
Good to be back with Nvidia!!!!
 