The Official Half Life 2 thread

By acidosmosis · 47 replies
Aug 30, 2003
  1. acidosmosis

    acidosmosis TechSpot Chancellor Topic Starter Posts: 1,350

    Don't listen to those sites; they don't know what they're talking about, and even if it's true, it's only an estimate :). With all these rumors and sites adding to them, all we can really do is wait for Valve to announce a real release date. :-/

    Hopefully it won't be too long.
  2. PreservedSwine

    PreservedSwine TS Rookie Posts: 325

    Apparently, quite a few developers are seeing the HL2 performance numbers with little surprise....but the developers themselves have been very quiet up till now...

    I caught these words from the lead developer of Silent Hill over at B3D...

  3. iss

    iss TechSpot Chancellor Posts: 1,994

    interesting events in the video card industry, to say the least. anyone remember the "way it's meant to be played" campaign? and the word that developers would be coding games especially for Nvidia cards? well, it turns out the developers are, and they are more than a little pissed off about it.

    another giant misstep for Nvidia: they banked on being the 900 lb gorilla of the video card industry, figuring that would force developers to "optimize" for their cards. what they didn't count on was ATI's success with the R300 core.

    Nvidia fans can dream wistfully of new drivers that are going to make "massive" leaps for the Nvidia line of cards, but it simply isn't going to happen. yes, Nvidia will certainly release new drivers that will improve the performance of their cards in games like HL2, but it is going to be at the expense of features and image quality.

    the bottom line is simply this: ATI manufactured cards to comply with the DX9 standard; Nvidia didn't.
  4. Nic

    Nic TechSpot Paladin Posts: 1,549

    Things will certainly be interesting over the next year. Nvidia probably gambled with their architecture, and if they had won, then it would have been at the expense of ATI. Fortunately for ATI, they have some very talented engineers onboard, and it will be very difficult for nvidia to catch up for some time. That R300 core was a real piece of top engineering and has given nvidia headaches for quite some time now. ATI is also cheaper, so all credit to their engineers for giving us gamers what we want at a decent price. Things are going to get pretty heated in the graphics market, and I hope that all the benchmark fiasco will now stop.

    Some things to consider ...

    ATI has always been strong in DirectX, whereas Nvidia prefers OpenGL.
    Many HL2 developers (inc. Gabe Newell) are ex-Microsoft employees.
    Microsoft is the developer of DirectX.
    Nvidia and Microsoft fell out over XBox.

    Nvidia's hardware is good, but architecturally different from ATI's. There may have been some dirty tricks played on Nvidia, but they probably deserved it. Expect to see better performance in other DX9 titles, and especially in OpenGL based games.
  5. PreservedSwine

    PreservedSwine TS Rookie Posts: 325

    And where does it say that HL2 is optimized for the Radeon cards?

    You know, it's not at all abnormal for a GPU maker to bundle a new game with a new card....

    Perhaps if you read up a bit more, you'll find that, according to VALVE, they spent 5 times as long and much more $$ optimizing for Nvidia....

    But hey, why listen to facts if you disagree with them...:rolleyes:

    As if VALVE has anything to gain by optimizing for the card that no one has....think about it. Valve wants to sell as many games as possible, and you think they'll do it by optimizing for ATI, LOLOL:knock:

    Better performance in other DX9 titles? Not if they contain a lot of PS2.0 shader ops....better wait until the NV40.....
  6. iss

    iss TechSpot Chancellor Posts: 1,994

    I agree. I think Nvidia made a gamble hoping they could influence MS and the developers toward their "standard," and they lost. and as you pointed out, if they had won their gamble, then ATI would be the one floundering.

    and it is good to bear in mind that the problems Nvidia's cards have today are the result of decisions made two years ago, so hopefully Nvidia has learned from their fiascos, pulled their collective heads out of their nether regions, and we will see some decent cards from them over the next year or two.

    but I think Nvidia's days as undisputed king of the hill are over. ATI made the right choices and has surged ahead. they may not stay ahead of Nvidia all the time, but I think it is going to be a much more even contest between the two from here on out, and that is good news for us consumers. frankly, I would like to see ATI and Nvidia achieve parity on the hardware; after all, if you can't outmatch the other guy's hardware, that only leaves you one real place to compete: the price tag.
  7. Nic

    Nic TechSpot Paladin Posts: 1,549

    And where did I say HL2 is optimised for Radeon cards? Nowhere! That post was taken directly from another website. :) You'd have to be pretty naive to believe that Nvidia's engineers don't know what they are doing. It's architectural issues that have led to the performance difference, and nothing to do with the merits of one over the other. I think iss said it pretty well.
  8. PreservedSwine

    PreservedSwine TS Rookie Posts: 325

    Umm, right here...

    Yes, it's a link, and you quoted the title...does that mean you didn't say it? I don't wish to argue semantics, Nic....I hope you don't as well....
    After reading the article, nowhere did it even allude to HL2 being optimized for any card. In fact, Valve has spent more time and resources optimizing for nVidia...

    And it's nice to see that you've finally noticed the architectural issues that are currently hounding the NV3x line-up....decisions that were made by engineers...they are getting every ounce of performance out of their product, to the credit of their driver team, but it's essentially the hardware itself that is limiting their PS shader performance. As far as who is to blame, well, I'm sure it matters to them, but it doesn't change the fact that nV3x cards can't run DX9 shaders without some sacrifices....
  9. Nic

    Nic TechSpot Paladin Posts: 1,549

    Unfortunately for nvidia users it appears that their cards are compromised by software being mismatched to their hardware. I doubt that nvidia's hardware is in any way inferior, but it doesn't seem to be well matched to DX9 specs, at least so far that's how it appears. Maybe they'll have more luck with their shaders when OpenGL 2.0 comes along with support for 32 bit shaders. That'll leave ATI at a performance disadvantage, as their cards only support 24 bit shaders. It's still a tad premature to form judgments at this stage, so we should leave it at that.
  10. PreservedSwine

    PreservedSwine TS Rookie Posts: 325

    Umm, yeah, any DX9 app that uses PS2.0 shaders......
    And not being able to run PS2.0 shaders doesn't make it inferior? Reverting to PS1.4 DX8.1 shaders doesn't make it inferior?:confused:

    I think you've confused shaders and floating point.... In case you aren't aware, although the NV3x supports FP32, it lacks the registers to use it for gaming purposes. The hardware simply isn't there...which is why DOOM3 will run on Nvidia hardware using FP16 and FX12..... as opposed to FP24 on ATI hardware. This will make the nVidia solution faster, with *some* trade-off in IQ, but probably not much, as everything in DOOM3 gets broken down to 8 bits internally...

    Interesting to note that ATI will run the standard ARB32 path, needing no optimizations whatsoever, while nVidia gets its own optimized path....because according to Carmack, if the NV3x were to run the standard ARB32 path, the ATI solutions are *much* faster....
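
    The FP16 vs. FP24/FP32 precision gap being argued above can be illustrated outside of any GPU. Below is a minimal NumPy sketch (an illustration only, not shader code; the actual NV3x/R300 pipelines differ) showing how half precision loses accuracy when small values are accumulated, the kind of thing a shader does when blending lighting or fog contributions:

```python
import numpy as np

# Accumulate a small lighting-style increment many times.
# FP16 has only 10 mantissa bits: once the running total passes 4.0,
# the gap between representable FP16 values (0.0039) exceeds twice the
# increment, so further additions round away to nothing and the sum stalls.
# FP32, with 23 mantissa bits, tracks the true total closely.
increment = 0.001
x16 = np.float16(0.0)
x32 = np.float32(0.0)
for _ in range(4096):
    x16 = np.float16(x16 + np.float16(increment))
    x32 = np.float32(x32 + np.float32(increment))

print(f"fp16 sum: {float(x16):.4f}")  # stalls near 4.0, short of the true 4.096
print(f"fp32 sum: {float(x32):.4f}")  # very close to 4.096
```

    The same effect on real hardware shows up as banding or lost detail, which is why the choice between FP16/FX12 and FP24/FP32 paths involves an image-quality trade-off.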
  11. Nic

    Nic TechSpot Paladin Posts: 1,549

    PreservedSwine: Can you post ATI vs Nvidia flames somewhere else. This thread is about HL2, the game. Thanks.
    The jury is still out on Nvidia, so it isn't really fair to make sweeping comments at this point. Nvidia may have screwed up big time. Nuff said.
  12. PreservedSwine

    PreservedSwine TS Rookie Posts: 325

    It is not my intent to flame, but make a few points...I sincerely hope I didn't offend.
    It appears that simply by disagreeing with you on a few points, you feel I'm flaming.... I was hoping we could have a discussion...that's what a forum is for, right?

    On the other side of the coin, the Ti4800 may well be the best bang for the buck for HL2 out there....although it's only DX8 and HL2 is DX9, the "value" end of nVidia's cards must revert to DX8.1 or DX8 to run HL2 with decent FPS....and the Ti4800 seems a bit faster than even an FX5600U thus far...certainly makes a case for holding onto that Ti4600 a little while longer; it's not every day you are presented with a reason not to "upgrade"
  13. PreservedSwine

    PreservedSwine TS Rookie Posts: 325


    RE: Drivers helping DX9 HL2

    I'm sure drivers will bump up the speed in HL2, but the issues they simply cannot overcome are hardware issues: when Nvidia has to face texture and pixel shader work in the same clock, they are at an extreme disadvantage. Which is the case in every single PS2.0 application, like HL2...

    As it currently stands, only certain trade-offs can be made: gain speed in one area, but lose some detail in another. It requires extensive coding to find those instances when the loss of detail will not be readily noticeable......The nVidia driver team has their work cut out for them; they'll be earning their paychecks in the months to come....


    What kind of trade-offs can be expected in HL2? Well, only time will tell, but we know nVidia was quite disappointed their DET 50's didn't get benched with HL2...

    nVidia claims the DET 50's have the kind of "optimizations" (according to nVidia) that will speed up the NV3x line-up for HL2...

    How do the DET 50's make DX9 apps like HL2 faster? By making compromises, we all (well, some of us) expect...

    Not sure this is the correct forum for this...I started a thread in the Video section, but Nic directed me to this one...Seems it's a little of both HL2 and GPU talk...

    Hope this isn't seen as a flame by some, as I know it's negative towards nVidia....just letting the facts speak for themselves...this has turned the DX9 gaming world a little upside-down.....and merits attention, IMO
  14. Nic

    Nic TechSpot Paladin Posts: 1,549

    Re: Nic

    And you are sure giving it enough of that. :p

    I guess when you put engineers under pressure to meet deadlines, creativity and clear thinking go out the window. Nvidia looks set to take a dive in sales and will probably find it hard to win back support from their loyal customers. It's a tough market, but that's what drives manufacturers to produce better products. I don't think Nvidia will screw up like this again. :blush:
  15. BrownPaper

    BrownPaper TS Rookie Posts: 407

    For all the gamers in the world, I sure do hope so.
  16. iss

    iss TechSpot Chancellor Posts: 1,994

    hopefully not, but I wouldn't count on the problem being resolved until at least the NV40 arrives.
  17. Nic

    Nic TechSpot Paladin Posts: 1,549

    My crystal ball predicts an IT sector recovery coming soon ...

    Half-Life 2 Performance Preview Part 2 -

    Some quotes ...

    Yes folks, its time to check your wallets. :cool:
  18. chocobogo_rct

    chocobogo_rct TS Rookie Posts: 93

    got my 9800 pro ready! :p
  19. Vilu Daskar

    Vilu Daskar TS Rookie


    I have an eMachines T2825 and I was wondering if I would be able to run Half-Life 2 on it???
  20. mailpup

    mailpup TS Special Forces Posts: 7,182   +469

    I'm not sure, but I don't think your integrated graphics supports Transform and Lighting (T&L). HL2 will either not run or run poorly without it. The minimum requirements on the HL2 box neglect to mention this "requirement," however.
  21. ::.Zoltan.::

    ::.Zoltan.:: TS Rookie Posts: 27

    I have an AMD Athlon 64 2800+
    512MB DDR400
    onboard 128MB graphics (getting a new card soon)
    Until I get my new graphics card, will I be able to play HL2? I know it's not supposed to run on integrated graphics, but I want to play now...maybe I should just wait..
  22. mailpup

    mailpup TS Special Forces Posts: 7,182   +469

    You could just try it and see. Without knowing which onboard graphics you have (there are different ones) or the motherboard (which would tell us which onboard graphics it uses), I couldn't venture a guess.
  23. torkuda

    torkuda TS Rookie

Topic Status:
Not open for further replies.
