The Official Half-Life 2 thread

Don't listen to those sites; they don't know what they're talking about, and even if it's true, it's only an estimate. :) With all these rumors and sites adding to them, all we can really do is wait for Valve to announce a real release date. :-/

Hopefully it won't be too long.
 
Apparently, quite a few developers are seeing the HL2 performance numbers with little surprise....but the developers themselves have been very quiet up till now...

I caught these words from the lead developer of Silent Hill over at B3D...
http://www.beyond3d.com/forum/viewtopic.php?t=7873

Yesterday I went to Mojo Reloaded (ATI, MS and Intel game developer day) at Guildford. These events are always a combination of interesting presentations and catching up with the industry gossip. This one for me was far more gossip than presentation, not that the presentations weren't interesting. My favs were the PRT/SH and non-linear post-processing streaks (I'll give a better description in a minute). As this will be up on the web for some time, it's worth stating what day yesterday was: a day after the HL2 benchmarks were released.
Gossip about the benchmarks, and the comments putting NVIDIA in a bad light at DX9 shaders, was fairly dominant throughout the day.

Obviously ATI weren't that upset to see their hardware coming out so well ahead, but what wasn't perhaps so expected was how glad everybody else was that the HL2 results matched the results most of us had already seen ourselves. Somebody on the forums (sorry, can't remember who) asked why developers seemed quite shy about stating our results. I obviously can't talk for everybody, but the answer is probably a simple case of somebody having to go first, and whoever that person/company was, they'd better be able to handle the heat it would produce.

Valve are fairly lucky. HL2 is probably the most eagerly awaited title in the business, and everybody has been doing everything to get the best results for this title. Everybody knows these guys are **** hot; they know what they're doing, and if they can't get good results, something is wrong and it's likely not to be them. Smaller developers (and that's probably everybody except id in the PC arena) don't have that luxury. If I had produced a similar performance table, the response of a lot of people would simply be that the developers (i.e. me) don't know what they're doing and obviously can't program properly. And why not? I don't have the reputation for quality that Valve or id has; they've earned the right to be trusted that they know what they're doing.

For Valve to do this shows they were really annoyed, and the fact that Microsoft issued a press release stating HL2 was the DirectX 9 benchmark also shows how annoyed they were. To get these two massively important PC games companies to make such a public condemnation means you had to do something bad; just having bad performance wouldn't have been enough.

The basic problem NVIDIA has caused has been the amount of extra work they've been requiring of everybody else. Whether it's benchmarks having to get smart and try to stop application-specific optimisations, or developers having to write extra pipelines just to get half-decent performance at the high-tech things it's meant to be good at, or MS having to upgrade the WHQL test to find spec violations: everybody has been forced to pay for NVIDIA's mistakes, and that is what has caused the anger.

But in some ways it has had good consequences; quality should go up as loopholes are closed:
Future DX specs should now be much tighter.
WHQL testing to require pixel comparison tests.
Hardware must produce an almost exact rendering of the same frame as the REFRAST.
Self-certification of WHQL, to make sure that WHQL drivers will have bug fixes applied quicker without bypassing the quality checks.
Reviewers should be less quick to use 'special' drivers provided by the IHVs or to test only under 'special' conditions.

Long term, the biggest change this year-long fiasco has caused will be to Microsoft and PC game developers. Microsoft has had to learn to protect its baby, Direct3D; before, it largely left quality and stability issues up to individual IHVs, but now it knows that its reputation is also damaged when IHVs play dodgy quality games. And us humble game developers have learnt that we have to shout sometimes to protect our games from bad decisions made by IHVs; we can't just mop up the **** when it hits us. We have to be willing and able to communicate that certain things are NOT acceptable to our customers, so don't bother doing it. If your card is crap at something, at least be honest about it earlier on; don't make us find out when our games run like a dog on your hardware, even though we're using the techniques you've been suggesting for the last year.
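One of the points in that list, the pixel comparison testing against the REFRAST, is easy to picture in code. Here's a minimal sketch in Python/NumPy of the general idea: compare a card's rendered frame to the reference rasterizer's frame with a small per-pixel tolerance. The tolerance and pass criteria below are invented for illustration; they are not the actual WHQL rules.

```python
import numpy as np

def frames_match(hw_frame: np.ndarray, ref_frame: np.ndarray,
                 per_channel_tolerance: int = 2,
                 max_bad_fraction: float = 0.001) -> bool:
    """Pass if nearly every pixel is within a small tolerance of the
    reference rasterizer's output (both (H, W, 3) uint8 RGB frames)."""
    diff = np.abs(hw_frame.astype(np.int16) - ref_frame.astype(np.int16))
    bad = np.any(diff > per_channel_tolerance, axis=-1)
    return bad.mean() <= max_bad_fraction

# Identical frames trivially pass; a frame that's uniformly off fails.
ref = np.zeros((480, 640, 3), dtype=np.uint8)
print(frames_match(ref.copy(), ref))  # True
print(frames_match(ref + 10, ref))    # False
```

The tolerance is the whole argument: tight enough to catch "optimizations" that visibly change the picture, loose enough to allow legitimate rounding differences between chips.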
 
Interesting events in the video card industry, to say the least. Anyone remember the "The Way It's Meant to Be Played" campaign? And the word that developers would be coding games especially for Nvidia cards? Well, it turns out the developers are, and they are more than a little pissed off about it.

Another giant misstep for Nvidia: they banked on being the 900 lb gorilla of the video card industry and figured that would force developers to "optimize" for their cards. What they didn't count on was ATI's success with the R300 core.

Nvidia fans can dream wistfully of new drivers that are going to make "massive" leaps for the Nvidia line of cards, but it simply isn't going to happen. Yes, Nvidia will certainly release new drivers that improve the performance of their cards in games like HL2, but it is going to be at the expense of features and image quality.

The bottom line is simply this: ATI manufactured cards to comply with the DX9 standard; Nvidia didn't.
 
Things will certainly be interesting over the next year. Nvidia gambled with their architecture, and if they had won, it would have been at the expense of ATI. Fortunately for ATI, they have some very talented engineers on board, and it will be very difficult for Nvidia to catch up for some time. The R300 core is a real piece of top engineering and has given Nvidia headaches for quite some time now. ATI is also cheaper, so all credit to their engineers for giving us gamers what we want at a decent price. Things are going to get pretty heated in the graphics market, and I hope the whole benchmark fiasco will now stop.

Some things to consider ...

ATI has always been strong in DirectX, whereas Nvidia prefers OpenGL.
Many HL2 developers (inc. Gabe Newell) are ex-Microsoft employees.
Microsoft is the developer of DirectX.
Nvidia and Microsoft fell out over the Xbox.

Nvidia's hardware is good, but architecturally different to ATI's. There may have been some dirty tricks played on Nvidia, but they probably deserved it. Expect to see better performance in other DX9 titles, and especially in OpenGL-based games.
 

And where does it say that HL2 is optimized for the Radeon cards?

You know, it's not at all abnormal for a GPU maker to bundle a new game with a new card....

Perhaps if you read up a bit more, you'll find that, according to VALVE, they spent 5 times as long and much more $$ optimizing for Nvidia....

But hey, why listen to facts if you disagree with them...:rolleyes:

As if VALVE has anything to gain by optimizing for the card that no one has... think about it. Valve wants to sell as many games as possible, and you think they'll do it by optimizing for ATI? LOLOL :knock:

Better performance in other DX9 titles? Not if they contain a lot of PS 2.0 shader ops... better wait until the NV40...
 
I agree. I think Nvidia made a gamble hoping they could influence MS and the developers toward their "standard", and they lost. And as you pointed out, if they had won their gamble, then ATI would be the one floundering.

And it is good to bear in mind that the problems Nvidia's cards have today are the result of decisions made two years ago. So hopefully Nvidia has learned from their fiascos, pulled their collective heads out of their nether regions, and we will see some decent cards from them over the next year or two.

But I think Nvidia's days as undisputed king of the hill are over; ATI made the right choices and has surged ahead. They may not stay ahead of Nvidia all the time, but I think it is going to be a much more even contest between the two from here on out, and that is good news for us consumers. Frankly, I would like to see ATI and Nvidia achieve parity on the hardware; after all, if you can't outmatch the other guy's hardware, that only leaves you one real place to compete: the price tag.
 
Originally posted by PreservedSwine
And where does it say that HL2 is optimized for the Radeon cards?
And where did I say HL2 is optimised for Radeon cards? Nowhere! That post was taken directly from another website. :) You'd have to be pretty naive to believe that Nvidia's engineers don't know what they are doing. It's architectural issues that have led to the performance difference, and nothing to do with the merits of one over the other. I think iss said it pretty well.
 
Originally posted by Nic
And where did I say HL2 is optimised for Radeon cards? Nowhere! That post was taken directly from another website. :) You'd have to be pretty naive to believe that Nvidia's engineers don't know what they are doing. It's architectural issues that have led to the performance difference, and nothing to do with the merits of one over the other. I think iss said it pretty well.

Umm, right here...


Yes, it's a link, and you quoted the title... does that mean you didn't say it? I don't wish to argue semantics, Nic... I hope you don't either...
After reading the article, nowhere does it even allude to HL2 being optimized for any card in any fashion... in fact, Valve has spent more time and resources optimizing for nVidia...

And it's nice to see that you've finally noticed the architectural issues that are currently hounding the NV3x line-up... decisions that were made by engineers... they are getting every ounce of performance out of their product, to the credit of their driver team; it's essentially the hardware itself that is limiting their PS shader performance. As far as who is to blame, well, I'm sure it matters to them, but not so much on this end... it doesn't change the fact that nV3x cards can't run DX9 shaders without some sacrifices...
 
Originally posted by PreservedSwine
... nV3x cards can't run DX9 shaders without some sacrifices ...
Unfortunately for nvidia users, it appears that their cards are compromised by software being mismatched to their hardware. I doubt that nvidia's hardware is in any way inferior, but it doesn't seem to be well matched to the DX9 specs; at least, so far that's how it appears. Maybe they'll have more luck with their shaders when OpenGL 2.0 comes along with support for 32-bit shaders. That'll leave ATI at a performance disadvantage, as their cards only support 24-bit shaders. It's still a tad premature to form judgments at this stage, so we should leave it at that.
 
Originally posted by Nic
Unfortunately for nvidia users, it appears that their cards are compromised by software being mismatched to their hardware.

Umm, yeah, any DX9 app that uses PS 2.0 shaders...
I doubt that nvidia's hardware is in any way inferior, but it doesn't seem to be well matched to the DX9 specs; at least, so far that's how it appears.
And not being able to run PS 2.0 shaders doesn't make it inferior? Reverting to PS 1.4 DX8.1 shaders doesn't make it inferior? :confused:

Maybe they'll have more luck with their shaders when OpenGL 2.0 comes along with support for 32-bit shaders. That'll leave ATI at a performance disadvantage, as their cards only support 24-bit shaders.

I think you've confused shaders and floating-point precision... In case you aren't aware, although the NV3x supports FP32, it lacks the registers to use it for gaming purposes. The hardware simply isn't there... which is why DOOM3 will run on Nvidia hardware using FP16 and FX12... as opposed to FP24 on ATI hardware. This will make the nVidia solution faster, with *some* trade-off in IQ, but probably not much, as everything in DOOM3 gets broken down to 8 bits internally...

Interesting to note that ATI will run the standard ARB2 path, needing no optimizations whatsoever, while nVidia gets its own optimized path... because according to Carmack, if the NV3x were to run the standard ARB2 path, the ATI solutions would be *much* faster...
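To see what those precision labels mean in practice, here's a toy Python/NumPy sketch running the same multiply-accumulate loop (think layered lighting math in a shader) at FP16 and FP32. NumPy has no 24-bit float, so FP16 vs FP32 merely brackets ATI's FP24; the constants and loop are invented for illustration.

```python
import numpy as np

def accumulate(dtype, steps: int = 100) -> float:
    """Run a toy multiply-accumulate loop at the given precision."""
    x = dtype(0.0)
    scale, bias = dtype(0.98), dtype(0.07)
    for _ in range(steps):
        x = x * scale + bias  # one 'pass' of lighting-style math
    return float(x)

fp32 = accumulate(np.float32)
fp16 = accumulate(np.float16)
print(f"FP32: {fp32:.5f}  FP16: {fp16:.5f}  drift: {abs(fp32 - fp16):.5f}")
```

The FP16 result drifts from the FP32 one because every intermediate value gets rounded to roughly three decimal digits; spread across a whole frame, that kind of drift is what shows up as banding or shimmer.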
 
PreservedSwine: Can you post the ATI vs Nvidia flames somewhere else? This thread is about HL2, the game. Thanks.
The jury is still out on Nvidia, so it isn't really fair to make sweeping comments at this point. Nvidia may have screwed up big time. 'Nuff said.
 
Originally posted by Nic
PreservedSwine: Can you post the ATI vs Nvidia flames somewhere else? This thread is about HL2, the game. Thanks.
The jury is still out on Nvidia, so it isn't really fair to make sweeping comments at this point. Nvidia may have screwed up big time. 'Nuff said.


It is not my intent to flame, but to make a few points... I sincerely hope I didn't offend.
It appears as though, simply by disagreeing with you on a few points, you feel I'm flaming... I was hoping we could have a discussion... that's what a forum is for, right?

On the other side of the coin, the Ti4800 may well be the best bang for the buck for HL2 out there... although it's only DX8 and HL2 is DX9, the "value" end of nVidia's cards must revert to DX8.1 or DX8 to run HL2 with decent FPS... and the Ti4800 seems a bit faster than even an FX5600U so far... certainly makes a case for holding onto that Ti4600 a little while longer; it's not every day you are presented with a reason not to "upgrade".
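For what it's worth, the fallback being described is just capability-based path selection at engine startup. Here's a hypothetical Python sketch of the shape of that logic; the function, path names, and thresholds are invented for illustration, not Valve's actual code:

```python
def pick_render_path(ps_version: float, fast_dx9_shaders: bool) -> str:
    """Hypothetical startup logic: pick the best path the card can
    actually sustain, not just the highest one it advertises."""
    if ps_version >= 2.0 and fast_dx9_shaders:
        return "dx9_ps20"    # full DX9 path (R3x0 class)
    if ps_version >= 1.4:
        return "dx8.1_ps14"  # the fallback NV3x reportedly needs
    if ps_version >= 1.1:
        return "dx8_ps11"    # GeForce3/4 Ti class
    return "dx7_fixed"       # fixed-function T&L only

print(pick_render_path(2.0, False))  # dx8.1_ps14 (FX 5600 class)
print(pick_render_path(1.3, False))  # dx8_ps11 (Ti4600/Ti4800)
```

Which is exactly why the Ti4800 comparison works out: it was never going to run the PS 2.0 path anyway, so it loses nothing by falling back.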
 
Nic

RE: Drivers helping DX9 HL2

I'm sure drivers will bump up the speed in HL2, but the issues they simply cannot overcome are hardware issues: when Nvidia has to face texture and pixel shader work in the same clock, they are at an extreme disadvantage. Which is the case in every single PS 2.0 application, like HL2...
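A back-of-the-envelope way to see why that hurts: a toy Python cost model where a pipeline either co-issues a texture fetch and a math op in the same clock, or has to serialize them. The instruction counts below are made up; only the shape of the comparison matters.

```python
def pixel_cycles(tex_ops: int, alu_ops: int, co_issue: bool) -> int:
    """Toy per-pixel cost: co-issue overlaps texture and shader math;
    a serialized design pays for both, one after the other."""
    return max(tex_ops, alu_ops) if co_issue else tex_ops + alu_ops

# A PS 2.0-ish workload: 4 texture fetches, 12 arithmetic ops per pixel.
print(pixel_cycles(4, 12, co_issue=True))   # 12 clocks
print(pixel_cycles(4, 12, co_issue=False))  # 16 clocks
```

The gap widens as shaders get longer, which is exactly the PS 2.0 case.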

As it currently stands, only certain tradeoffs can be made: gain speed in one area, but lose some detail in another... and it requires extensive coding to find the instances where the loss of detail will not be readily noticeable... The nVidia driver team has their work cut out for them; they'll be earning their paychecks in the months to come...

EDIT:

What kind of trade-offs can be expected in HL2? Well, only time will tell, but we know nVidia was quite disappointed their Det 50s didn't get benched with HL2...


nVidia claims the Det 50s have the kind of "optimizations" that will speed up the NV3x line-up for HL2...

How do the Det 50s make DX9 apps like HL2 faster? By compromising IQ... as we all (well, some of us) expected...

http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm

Not sure this is the correct forum for this... I started a thread in the Video section, but Nic directed me to this one... seems it's a little of both, HL2 and GPU...

Hope this isn't seen as a flame by some, as I know it's negative towards nVidia... just letting the facts speak for themselves... it has turned the DX9 gaming world a little upside-down... and it merits attention, IMO.
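If anyone wants to put a number on the IQ compromises that article describes, one crude way is PSNR between screenshots of the same frame taken with the two driver sets. A minimal Python/NumPy sketch; where you draw the "acceptable" line is a matter of taste:

```python
import numpy as np

def psnr(ref: np.ndarray, test: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two uint8 screenshots."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (480, 640, 3), dtype=np.uint8)
noise = rng.integers(-4, 5, ref.shape)  # stand-in for driver IQ drift
test = np.clip(ref.astype(int) + noise, 0, 255).astype(np.uint8)
print(f"{psnr(ref, test):.1f} dB")  # ~40 dB; identical frames give inf
```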
 
Re: Nic

Originally posted by PreservedSwine
...it has turned the DX9 gaming world a little upside-down... and it merits attention, IMO
And you are sure giving it enough of that. :p

I guess when you put engineers under pressure to meet deadlines, creativity and clear thinking go out the window. Nvidia look set to take a dive in sales and will probably find it hard to win back support from their loyal consumers. It's a tough market, but that's what drives manufacturers to produce better products. I don't think Nvidia will screw up like this again. :blush:
 
Hopefully not, but I wouldn't count on the problem being resolved until at least the NV40 arrives.
 
My crystal ball predicts an IT sector recovery coming soon ...

Half-Life 2 Performance Preview Part 2 - FiringSquad.com

Some quotes ...

... Serious gamers are going to want at least 512MB of RAM for optimum performance, while the hardcore will insist on a gigabyte ...

... Half-Life 2 also craves CPU performance. In many cases, we were CPU-limited with an ATI RADEON 9800 PRO and a 2.8GHz Pentium 4 (800MHz FSB) in our testing! ...

... the RADEON 9800 PRO is CPU-limited all the way to 1280x1024! Clearly you’re going to want a 3.0GHz+ or AMD equivalent for optimum performance, even if you’re running one of the mainstream cards such as the RADEON 9600 PRO ...

... City 17 clearly wants the fastest processors money can buy ...

... NVIDIA has deviated quite a bit from the specs Microsoft laid out for DirectX 9, which could be costing them now ...

... The second factor to keep in mind is the significant demand on the CPU and memory subsystems of your PC. Half-Life 2 will press both of these system components like no other game before it ...

... those of you with soft audio solutions or sound cards that lack dedicated hardware acceleration for DirectSound streams may want to upgrade your audio subsystem for Half-Life 2, as you may not want to give up precious CPU cycles for audio processing ...

... you’re going to want lots of memory for the best gaming experience with Half-Life 2 ...

... Fortunately, the wait for Half-Life 2 is almost over. This is the first title we’ve seen in quite a long time that we can see selling lots of hardware. We’re not just talking graphics cards here either. Half-Life 2 will push every component within your PC no matter how cutting edge it is. Make no mistake about it, Valve will soon own us all ...
Yes folks, it's time to check your wallets. :cool:
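As an aside, the "CPU-limited" calls in that preview follow from a simple observation: if the frame rate barely moves as the resolution climbs, the graphics card isn't the bottleneck. A tiny Python sketch of that inference; the numbers are invented, not FiringSquad's data:

```python
def looks_cpu_limited(fps_by_resolution: dict[str, float],
                      flat_tolerance: float = 0.05) -> bool:
    """If FPS stays roughly flat as resolution rises, the CPU (not
    the GPU) is the bottleneck."""
    fps = list(fps_by_resolution.values())
    return (max(fps) - min(fps)) / max(fps) <= flat_tolerance

print(looks_cpu_limited({"1024x768": 60.2, "1280x1024": 59.1}))  # True
print(looks_cpu_limited({"1024x768": 60.2, "1280x1024": 41.5}))  # False
```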
 
I'm not sure, but I don't think your integrated graphics supports Transform and Lighting. HL2 will either not run or run poorly without it. The minimum requirements on the HL2 box neglect to mention this "requirement," however.
 
I have an Athlon 64 2800+
512MB DDR400
onboard 128MB graphics (getting a new card soon)
Until I get my new graphics card, will I be able to play HL2? I know it's not supposed to run on integrated graphics, but I want to play now. Maybe I should just wait...
 
You could just try it and see. Without knowing which onboard graphics you have (there are different kinds), or the motherboard (which would tell us which onboard graphics), I couldn't venture a guess.
 