PhysX hobbled on the CPU by x87 code

How so...?
The opening sentence should be a clue:
Nvidia has long promoted its PhysX game physics middleware as an example of a computing problem that benefits greatly from GPU acceleration, and a number of games over the past couple of years have featured PhysX with GPU acceleration
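For context, the article's complaint is that the PhysX CPU path was compiled to legacy x87 floating-point instructions, which process one value at a time, instead of SSE, which handles four packed floats per instruction. This is a rough sketch of the difference (not code from PhysX or the article, just an illustrative comparison in C):

```c
#include <xmmintrin.h>  /* SSE intrinsics */

/* Scalar loop: roughly what x87-compiled code ends up doing,
   one float operation at a time. */
static void add_scalar(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

/* SSE loop: four floats per add instruction.
   Assumes n is a multiple of 4; uses unaligned loads/stores for safety. */
static void add_sse(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
}
```

Both produce identical results; the SSE version simply retires a quarter of the instructions for the arithmetic, which is why compiling physics code for x87 leaves so much CPU performance on the table.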

If you use nVidia graphics then the PhysX compute on CPU is a moot point.
On the other hand, AMD graphics owners basically flood forums telling all and sundry that PhysX is a complete waste of time. That attitude seems borne out by how little AMD has invested (beyond the usual PR notices, at least up to this point) in alternative physics engines like Havok and Bullet that are optimized for CPU computation.
So we have a situation where either AMD follows through on "Gaming Evolved" (and better game-dev funding) and PhysX becomes irrelevant (or, more likely, better optimized) in the face of AAA titles using alternative physics engines, or Gaming Evolved turns into another stillborn GITG.
Personally, I can't find a great deal to get excited about. PhysX was bought, paid for, and further developed with nVidia capital, so I don't find it overly strange that they would protect their investment, any more than AMD would allow nVidia graphics to use AVIVO (as an example).
I would also add that if it were not for nVidia's huge brand awareness and the loyalty/fanboyism garnered through TWIMTBP (something AMD only lately seems to be aware of and is attempting to match/combat), and for nVidia providing SDKs (including PhysX) and funding, then two things seem certain:
1. PC gaming would likely be a great deal poorer, and...
2. nVidia would likely be defunct as a desktop graphics company, given AMD's advances over the last two generations of cards. And does anyone want to guess the MSRP of graphics cards when a manufacturer has only Intel IGPs and S3 as competition?
 
If my memory serves me well, it's Intel that has been making the bigger investment in physics engines (i.e. Havok).

For now, though, PhysX seems overhyped, as I am unsure how much qualitative difference it makes to the visual appearance of games. Maybe in future it will be more noticeable, but that day is not here yet.

In the longer run, though, if someone asks which company is at greater risk in this market, it seems to be nVidia. Lastly, I agree: I don't want nVidia to go away either. We would all lose out, not only in monetary terms with regard to the solutions we get, but also in the speed of innovation.
 
Havok is an Intel product, but unless Intel is somehow blocking game devs and AMD from implementing it, the onus is on AMD to provide game devs with an alternative avenue to PhysX.
From my viewpoint, AMD seems to consider the job finished once it has created good hardware. Maybe the rationale is that if the hardware exists, software developers will jump in and provide the apps. This obviously reduces R&D on AMD's part, and promoting open source garners it a non-marketing, non-proprietary PR coup without significant risk. The downside is that open-source apps and standards have a tendency to move at glacial speed and to lose focus (as does a significant amount of anything "run by committee").
nVidia, on the other hand, seems just as much (if not more) a software/marketing entity as a hardware developer, and while its software is proprietary and its marketing aggressive (at the very least), I think that actually having working software (CUDA, PhysX) has allowed it to maintain a presence far beyond what the hardware alone would command.

Nvidia is definitely at risk in the market. It doesn't have (and can't get) an x86 licence (unless it somehow teams up with VIA, which is unlikely!), whereas its two competitors in the market both have CPU and GPU resources. I doubt the entry-level graphics market will exist once CPUs with on-die graphics reach respectable performance; mainstream graphics could hold out longer, depending upon OEM share; and enthusiast graphics are probably a fairly minor income earner (sales versus R&D). That leaves the HPC arena, with compute cards hobbled for desktop enthusiast graphics a la Fermi, plus SoCs (Tegra). So it's hardly surprising that nVidia fights tooth and nail to maintain its own presence in the software field.
 