ATI and Half-Life 2


Steg

OHHO!!! ATi are really rubbing their $8,000,000 worth of HL2 in NVIDIA's faces - they intend to get every penny of that 8 million back and more, I shouldn't wonder.
Clip from the new ATi advert...

Done Running?
Out of ammo?
This fall
Rely on two things
Your crowbar
and the brute force of RADEON to survive

*half life 2 logo pasted all over the place*

Now I'm a die-hard NVIDIA fan, but ATi are pushing them for all they've got - still, I'm staying with NVIDIA - if the NV40 turns out the same as the NV3x then there probably won't be an NVIDIA left to support - NOW WE NEED A SPECTACULAR NVIDIA COMEBACK! Please!

Steg
 
Well, you can't really blame ATI - they have been on the receiving end of the same thing from NVIDIA for many years. Now it's their turn.

While I prefer ATI to NVIDIA cards (at least as far as the FX series goes), I shudder to think of NVIDIA not resolving their DX9 problems by the time NV40 hits, and I really don't think they will allow that to happen.
 
I wasn't blaming them - I was just slightly shocked at the speed and aggression with which ATi put NVIDIA down - but you're right, they have put up with it for many years.
And I think the NVIDIA logo is on Doom 3? Or am I wrong? I think it is...

Steg
 
NVIDIA are a bit of a mess at the moment; anything they try is backfiring on them. This is only as regards the "enthusiast" community though - most people probably don't have a clue about all the driver issues going on, or the poor pixel shader performance, etc.
ATi, I think, has "Get in the game"; NVIDIA have "The way it's meant to be played". It will indeed be fun to see what happens in 6 months or so though - Half-Life 2 is being billed as what ATi cards were meant for, while NVIDIA are going on about NV40 being meant for Doom 3.
 
What do you guys think NVIDIA can do to fix this problem? I mean, yeah, it's only a slight dilemma for those who haven't bought a card yet... all they need to think about is, "Do I buy the Radeon, or the FX? Do I convert, or hold onto what I have used for years?"

Then you have those like me... who bought a 5900 Ultra for 500 bucks thinking it would be like all the other NVIDIA products (aside from the original 5800). Bah, do I have anything to look forward to? Or do I now have a 500 dollar video card that runs DX9 games just a tiny bit faster than my 4700 did?

I'm extremely POed that I spent 500 bucks on this card that performs like crap in all DX9 games other than Doom III... and the best "fix" NVIDIA can come up with is improved drivers that just cheat their way into giving the impression that it's performing to spec... bah.

*rips out his hair*

Think NVIDIA would ever have a recall or something of that sort?

Doubt it... but it's the only hope I have left, it seems :(
 
I'd like to see 3D cards other than Nvidia or ATi running games. It seems to be nothing but these two fighting all the time. I'm really tired of this.
 
Did you catch what John Carmack said about nVidia and HL2 performance?

Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it [nVidia] is a lot slower.

http://english.bonusweb.cz/interviews/carmackgfx.html

The deal ATI has with Valve concerns distribution, not game optimization. ATI simply runs standard DX9 apps much better than the NV3x is able to. The reasons for this are many (mainly FP registers and the number of pipelines available during pixel and shader ops).
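To put the precision side of Carmack's point in concrete terms, here's a minimal made-up ps_2_0-style sketch (the constant and registers are purely illustrative, not from any real game). Standard DX9 fragment programs run the math at full precision, which NV3x treats as FP32 and R3x0 as FP24; the "_pp" hint is the kind of "lower precision" a custom back end like Doom's can ask for, letting NV3x drop to FP16.

ps_2_0
dcl t0.xy                     // texture coordinate from the vertex shader
dcl_2d s0                     // diffuse texture sampler
def c0, 0.5, 0.5, 0.5, 1.0    // made-up tint constant

// Standard DX9 path: full precision everywhere (FP32 on NV3x, FP24 on R3x0)
texld r0, t0, s0
mul r0, r0, c0
mov oC0, r0

// The "lower precision" variant adds the _pp hint, allowing FP16 on NV3x:
//   texld_pp r0, t0, s0
//   mul_pp r0, r0, c0
//   mov_pp oC0, r0

The catch is that every shader in every game would need that second treatment, which is exactly the per-title work most developers won't do.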

nVidia fans should be much happier with the NV40... no reason to grab an FX; stick with the Ti or go ATI...
 
Carmack went on to say, though, that the GeForce FX is faster if you use its optimized paths.
Not every developer's gonna have time to create them, though. NVIDIA just need to acknowledge the issues & stop messing around with drivers tuned for benchmarks &/or that lower IQ in order to speed things up.
 
Yeah - NVIDIA had better make sure the NV40 comes out a winner, or there will be a lot of very angry fans - and two mistakes in a row would kill NVIDIA in the graphics market.
Fortunately they have a backup - they make the BEST AMD chipset (and probably soon the BEST Intel chipset), so no doubt that makes them a bit of cash :D

Steg
 
Originally posted by TS | Thomas
Carmack went on to say, though, that the GeForce FX is faster if you use its optimized paths.
Not every developer's gonna have time to create them, though. NVIDIA just need to acknowledge the issues & stop messing around with drivers tuned for benchmarks &/or that lower IQ in order to speed things up.

He said that the NV30 was faster in a Feb. 8th article, but didn't mention which one was overall faster in this most recent interview. I wouldn't assume that anything has changed; I just thought it was interesting that the rest of the lead developers think the HL2 DX9 bench results are indicative of what you can expect from the latest from ATI and nVidia when running DX9 apps... Unfortunately, changing drivers is about all nVidia can do to make the FX line competitive, even if it means sacrificing IQ. In their shoes, I would do the same thing... until the NV40, anyway...

Why can't they improve DX9 performance without sacrificing IQ? Here's a very interesting read with new information about each core from ATi and nVidia:
http://www.beyond3d.com/forum/viewtopic.php?t=8005

Check out the Diagram.

That pretty much lays out how we got to where we are today with game performance. You can see that ATi simply has more execution units per pipeline. NVIDIA also has its texture lookup shared with one of the FP ALUs in each pipeline, whereas ATi has it in separate hardware.
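Just to make that sharing point concrete, here's a made-up ps_2_0 fragment (not taken from the Beyond3D diagram). The texture fetch and the colour math below are independent work: with a separate texture address unit (the R3x0 arrangement described above) the FP ALUs can stay busy on the math while the lookup is handled elsewhere, but if the texture address calculation has to borrow one of the FP ALUs (the NV3x arrangement), that ALU's slot goes to the fetch first.

ps_2_0
dcl t0.xy                     // texture coordinate
dcl v0                        // interpolated vertex colour
dcl_2d s0                     // texture sampler
def c0, 2.0, 2.0, 2.0, 1.0    // made-up brightness constant

texld r0, t0, s0              // texture lookup - needs the texture address hardware
mul r1, v0, c0                // independent FP math - only needs an FP ALU
mul oC0, r0, r1               // combine the two results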

Actually, because of constant propagation optimization, it should execute in 1 cycle on an R3x0 (eventually). Something like:
add_sat oC0, (c0+c1-c2), v1

We are working hard on improving our current PS compiler, so that it can map PS ops to our HW in an optimal way. The current stuff is pretty simple. The HW is naturally very fast and executes well. However, it will get better. That's also why one should be careful when trying to determine our internal architecture based on shader code.
From Sireric of ATi. Apparently they are only just beginning to optimize their own shader compiler - everything you see currently is based on the simple raw calculation performance of the hardware... ("Hellbinder's" words).
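To spell out what Sireric's one-cycle claim means, here's a rough sketch (register numbers are made up, and c3 stands for a constant the driver would generate internally, not something the shader author writes). As written, the expression is three dependent ALU ops; but since c0, c1 and c2 are all constants whose values the driver knows when the shader is set, the compiler can pre-fold c0+c1-c2 on the CPU and leave a single add_sat for the hardware.

ps_2_0
dcl v1                        // interpolated vertex colour

// As the developer wrote it: three dependent instructions
add r0, c0, c1                // r0 = c0 + c1
sub r0, r0, c2                // r0 = c0 + c1 - c2
add_sat oC0, r0, v1           // output = clamp(r0 + v1, 0, 1)

// After constant propagation the driver could emit just:
//   add_sat oC0, c3, v1      // with c3 pre-folded to c0 + c1 - c2

Which is presumably why he warns against inferring the internal architecture from shader code - what the hardware actually runs may not be what you wrote.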


It's a very informative thread....
 