Nvidia slowly being defeated?


acidosmosis

Now here is a company that can turn a press conference into its own trade show. While NVIDIA had a lot more than VIA to say about their chipsets, they seemed to be avoiding the products we all came to hear about: the NVIDIA GPUs. There was little mention of any new FX cards or the current ones. NVIDIA CEO Jen-Hsun Huang made it quite clear that the graphics industry would no longer be the company's primary focus. They then went on to demonstrate a wide range of new multimedia products.

http://www.legionhardware.com/html/doc.php?id=261&p=3
 
That's really bad... We need nVidia to keep ATI on their toes and keep prices reasonable, no matter how unreasonable they may currently be. ;)
 
Nvidia owns 73% of the graphics market (according to some statistics I just read). I don't think they'll give that up easily. Even ATI are diversifying their portfolio of products, as it's not good to put all your eggs in one basket. Just look what happens when you screw up. There are new players entering the graphics market, so things will only get tougher for those that remain.
 
Originally posted by consie89
long live nvidia

I agree.

They make GREAT products, and that's not just me following the marketing hype.

The products are stable, and I've certainly found the drivers to be stable overall.

They are doing a really good job with their Linux drivers.

And a good many versions of their products aren't especially overpriced.

I hope they are here to stay for a while.
 
Originally posted by consie89
wh-hay, I've finally got someone to agree with me, that's definitely a first.

Make that 3 people ;) I sure hope Nvidia don't die; I find their products to be highly reliable and high quality. I'm sure they can come back :)
 
p66 and agissi, I too hope that nVidia will stay around for a while, as we need the competition to keep the market healthy...

Apart from that, my (dis)agreement depends on whether you include recent actions and products from nVidia, or are thinking back to the GF4 days... with the exception of the Linux drivers, which they are doing a good job with...
 
Interesting interview over at FiringSquad with Nvidia's Chief Scientist David Kirk, on DirectX 9 performance issues.

Some quotes ...

FiringSquad: One of the things that ATI has kind of said, or at least they were suggesting at Shader Day, is the fact that they can do more floating-point operations than you guys can. How would you respond to those types of statements?

Kirk: Well, I guess the first response would be, of course they would say that. But I don't really see why you or they would think that they understand our pipeline, because in fact they don't. The major issue that causes differing performance between our pipeline and theirs is that we're sensitive to different things in the architecture than they are, so different aspects of programs that may be fast for us will be slow for them and vice versa. The Shader Day presentation that says they have two or three times the floating-point processing that we have is just nonsense. Why would we do that?
FiringSquad: Could you give us specific examples of where maybe you feel you guys, you mentioned you guys can do some things better than they can, can you give us some specific examples of that?

Kirk: Well, one example is if you're doing geometric calculations with reflections or transparencies and you need to do trigonometric functions. Our sine and cosine takes two cycles; theirs takes eight cycles, or seven cycles I guess. Another example is if you're doing dependent texture reads, where you use the result of one texture lookup to look up another one. There's a much longer idle time in their pipeline than there is in ours. So it just depends on the specific shader, and I feel that the calculations I mentioned are pretty important for effects and advanced material shaders and the types of materials that people use to make realistic movie effects. So they will get used as developers get more used to programmable GPUs, and we'll have less of a performance issue with those kinds of effects.
FiringSquad: Do you feel the fact that you guys, your hardware came out later -- does that also contribute to the initial performance that's coming out in terms of the DX9 titles that have been benchmarked?

Kirk: Yeah, I would say that one of the issues is that since our hardware came out a little bit later, some of the developers started to develop with ATI hardware, and that's the first time that's happened for a number of years. So if the game is written to run on the other hardware, until they go into beta and start doing testing they may have never tried it on our hardware; it used to be the case that the reverse was true, and now it's the other way around. I think people are finding that although there are some differences, there really isn't a black-and-white, this-is-faster-that-is-slower split between the two pieces of hardware. For an equal amount of time invested in the tuning, I think you'll see higher performance on our hardware.
FiringSquad: So you do think that the initial numbers that have kind of come out really aren’t indicative of final performance.

Kirk: [Yes] I believe so. Of course, as I say, if other people are commenting on our architecture without any knowledge, I don't know why [their comments on our architecture], why do you think it would be right? You know, in our case I haven't worked with all of the developers, but the ones that I have worked with have seen marked improvements when they start to actually work with our hardware and optimize for our hardware.
FiringSquad: Do you feel that in terms of the Half-Life 2 performance numbers that were released recently… do you feel that maybe you guys were, I don't want to say given a bad rep, but maybe given an unfair deal?

Kirk: Well again, not wanting to start a flame war back and forth, my feeling is if they had issues with speed, it’s really not appropriate to say that it doesn’t run at all. (Our question had mentioned this --FS) It’s just that so far in their state of optimization it doesn’t run fast. But after we’ve had a chance to work together on [inaudible] that will be able to provide a very good game experience with Half-Life on the full GeForce FX family. There’s no question in my mind that we’ll get there, it’s just a matter of time.
FiringSquad: So would you say that it’s a software type of issue that can be corrected?

Kirk: Yeah I don’t think there’s any issue with the hardware.

Check out the rest over here ...

An Interview with NVIDIA's David Kirk
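
Kirk's point about dependent texture reads is easy to see in code: the second lookup can't even start until the first one has returned a value, so the hardware has to cover that wait somehow. Here's a minimal C++ sketch of the idea (the table names, sizes and values are made up for illustration; this isn't real shader code or anyone's actual pipeline):

#include <array>
#include <cstdio>

// Two tiny 1-D "textures" (lookup tables), values invented for illustration.
static const std::array<float, 8> indirection = {0.10f, 0.30f, 0.55f, 0.70f, 0.20f, 0.90f, 0.40f, 0.60f};
static const std::array<float, 8> colour      = {0.000f, 0.125f, 0.250f, 0.375f, 0.500f, 0.625f, 0.750f, 0.875f};

// Nearest-neighbour sample of a table at a coordinate in [0, 1].
static float sample(const std::array<float, 8>& tex, float coord)
{
    int i = static_cast<int>(coord * (tex.size() - 1) + 0.5f);
    if (i < 0) i = 0;
    if (i > static_cast<int>(tex.size()) - 1) i = static_cast<int>(tex.size()) - 1;
    return tex[i];
}

int main()
{
    float uv = 0.4f;
    float offset = sample(indirection, uv);  // first texture read
    float result = sample(colour, offset);   // dependent read: its coordinate only
                                             // exists after the first read finishes,
                                             // so it cannot be issued any earlier
    std::printf("dependent read result: %f\n", result);
    return 0;
}

On real hardware the second fetch has to wait for the first result, and Kirk's claim is simply that their pipeline hides that wait better than the competition's does.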
 
Who to trust? nVidia isn't going to say ATI has better software, and ATI aren't going to say nVidia's is better. What we need is proper independent research done by people with objective views. I think you can never trust a review or press conference about quality/performance issues unless it's done by an independent company that isn't secretly getting free beer from nVidia or ATI for saying their software is better. I bet somehow I get ridiculed for this, but never mind.
 
Well, there may be good news for nVidia. It appears they may have pulled the proverbial rabbit out of their hat. They seem to be making progress on their real-time pixel and vertex shader compiler in the 52.10 drivers. I found this tidbit at Guru3D...

I received an email from Ronald over at 3DCenter today saying that he has finished an article on the Detonator 52.10 drivers. Basically what he did was IQ and performance test the Detonator 45.23, 51.75 and 52.10 drivers against each other. We all know by now that NVIDIA has been pushing very hard on shader performance for the GeForce FX series, as it was downright disappointing compared to the competition. Although NVIDIA has not stated this, we believe they have been working on something clever: a pixel and vertex shader engine handled through a real-time compiler. First of all, this by definition is not cheating, it's optimizing the shader engine. There's nothing wrong with that. The results of the 52.10 drivers really are interesting to observe.


and here's the link...

http://www.guru3d.com/newsitem.php?id=584
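
For what it's worth, here's a rough C++ sketch of the kind of thing a driver-side, real-time shader compiler can do: rewrite the shader it's handed into an equivalent but better-scheduled form for the hardware, without changing the image it produces. The toy instruction format and the simple "hoist the texture fetches" pass below are my own invention for illustration, not NVIDIA's actual optimizer:

#include <iostream>
#include <string>
#include <vector>

// A toy shader instruction: an opcode plus its operands as text.
struct Instr {
    std::string op;    // "TEX" = texture fetch, anything else = arithmetic
    std::string args;
};

// One simple optimization pass: issue all texture fetches first so their
// latency overlaps with the arithmetic that follows. (Only valid here
// because none of the arithmetic feeds a fetch; a real compiler would
// check dependencies, re-allocate registers, fold constants, etc.)
std::vector<Instr> schedule(const std::vector<Instr>& in)
{
    std::vector<Instr> fetches, alu;
    for (const Instr& i : in)
        (i.op == "TEX" ? fetches : alu).push_back(i);
    fetches.insert(fetches.end(), alu.begin(), alu.end());
    return fetches;
}

int main()
{
    // Shader as submitted by the game.
    std::vector<Instr> shader = {
        {"MUL", "r0, c0, v0"},
        {"TEX", "r1, t0, s0"},
        {"MUL", "r2, r0, c1"},
        {"TEX", "r3, t1, s1"},
    };

    // Shader as the driver would actually run it after rescheduling.
    for (const Instr& i : schedule(shader))
        std::cout << i.op << ' ' << i.args << '\n';
    return 0;
}

That, roughly, is why 3DCenter calls this optimizing rather than cheating: the output pixels stay the same, only the instruction schedule changes to suit the hardware.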
 
Yeah - those performance increases are incredible - I hope these releases prove as successful as the 40.xx release - those were an incredible set of drivers (over a 15% performance boost across the whole range).

I'm with consie89 also: "LONG LIVE NVIDIA!"

Their products are of the highest quality, very stable, and overclock well.

I also don't want to see ATI die out (which they won't) - it pushes the market along and keeps prices down! All good things!

Steg

P.S. I've got my eye on a 5700 Ultra - fast, reasonably priced - and after the 5x.xx drivers come out it will be fast enough to play HL2 on - WOOHOO!
 