Nvidia cuts PhysX if AMD card is present

October 2, 2009, 3:51 PM
Nvidia has cut the cord on folks who are using its GPUs to accelerate PhysX effects alongside an AMD card. Nvidia's newer drivers deny the ability to use hardware PhysX acceleration unless your GeForce is handling the graphics as well. A forum post on NGOHQ shows an email explanation from Nvidia's Customer Care, which lists a variety of causes for the decision, including "development expense," "quality assurance," and "business reasons."

The Tech Report notes that while this applies to GeForce 186 drivers and beyond, it only really affects people running Windows 7, as Vista's Windows Display Driver Model (WDDM) 1.0 can't run multiple GPUs with different drivers simultaneously -- as opposed to Windows 7's WDDM 1.1, which can. The site also spoke with Nvidia general manager Ujesh Desai, who cited technology and compatibility conflicts as the motive for the limitation.
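In effect, the 186+ drivers reduce PhysX eligibility to a blanket check on the installed display adapters: any non-Nvidia GPU in the system disables hardware acceleration. A rough sketch of that policy (the PCI vendor IDs are real, but the function and the adapter model are invented here for illustration; the actual logic lives inside Nvidia's closed driver):

```python
# Hypothetical sketch of the driver policy described above.
# Only the vendor IDs are real; everything else is illustrative.

NVIDIA = 0x10DE   # Nvidia's PCI vendor ID
AMD_ATI = 0x1002  # ATI/AMD's PCI vendor ID

def physx_allowed(adapter_vendors):
    """adapter_vendors: PCI vendor IDs of all installed display adapters.

    Pre-186 behavior would only require an Nvidia GPU to be present;
    the 186+ policy additionally requires that NO other vendor's GPU
    is installed.
    """
    has_nvidia = NVIDIA in adapter_vendors
    only_nvidia = all(v == NVIDIA for v in adapter_vendors)
    return has_nvidia and only_nvidia

# A lone GeForce doing graphics and PhysX: allowed.
print(physx_allowed([NVIDIA]))            # True
# Radeon for graphics plus a GeForce as a dedicated PhysX card: blocked.
print(physx_allowed([AMD_ATI, NVIDIA]))   # False
```

Note that the mixed configuration only became practical on Windows 7 in the first place, since WDDM 1.1 allows two display drivers from different vendors to run side by side while Vista's WDDM 1.0 does not.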

No matter Nvidia's intention, can you blame them?




User Comments: 40

CrisisDog said:

Not really news for me; they disabled PhysX even for the older Ageia cards a long time ago if you had anything other than an Nvidia GPU alongside the card. I went around in circles for a month before they admitted to it at third-level support. I got Mirror's Edge working with my ATI and Ageia cards only by using the very first PhysX driver that Nvidia provided under their logo. Unfortunately, that driver kit is about two years old now and it looks like it doesn't support the new Batman game. Bummer...

TorturedChaos, TechSpot Chancellor, said:

Yes I can blame them. You bought the graphics card. It's your property. You should be able to use it in any fashion you want, including running it side by side with their competitor's.

Guest said:

They own the software, they can make it work/not work with whatever they want it to.

Guest said:

Seems counterintuitive to me. Why deny themselves some extra sales?

tengeta tengeta said:

Pretty sure Microsoft was just told they couldn't make their stuff work any way they want, so no, Nvidia doesn't get a free pass.

Guest said:

"Yes I can blame them. You bought the graphics card. It's your property. You should be able to use it in any fashion you want, including running it side by side with their competitor's."

Agreed. But how many people know how to write their own custom drivers for their hardware? And how many of those who can will share their drivers with others having a multitude of system configurations?

It is only proper for a company competing with another to support only their own products. Unless you want Nvidia to charge more for their cards because they have to develop drivers and software that is compatible with their competitors?

TJGeezer said:

Guest said:

Seems counterintuitive to me. Why deny themselves some extra sales?

This is what happens when a corporation puts its own agenda ahead of its users. It could backfire on them.

It's kind of a cop-out to say they can design for any hardware they want. Yes, that's true, but note that they had to actively exclude another company's specific hardware. That's not writing code for the customer, it's writing code for the corporate agenda, and to hell with the customers.

Seems to me like an unhealthy attitude toward the customer and unwise marketing strategy. In an age of open standards, attempts to lock customers into your specific hardware demonstrate poor corporate judgment.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

*sigh*

I can't believe how falsely entitled people seem to believe they are these days...

nVidia owns PhysX outright; they paid BIG money for the entire package. That means, as a business decision, they decided that PhysX was a good fit with their company, to help them earn revenue, so they made the investment. Now, they don't want to let other graphics cards (which they cannot control) run PhysX on an unsupportable platform. If there is a problem with PhysX engine compatibility with a 3rd party card, who do you think will be the target of the whining and lamenting? Who do you think will be expected to remedy the situation? Who do you think will have to pay for the (possibly massive) support structure and reprogramming of the PhysX engine every time the other graphics card players make changes to their hardware and/or driver software? Seriously, who?

Think before you start complaining about a company "putting its own agenda ahead of its users," please. They are NOT putting an agenda before THEIR users; they are simply making sure that THEIR users (who have purchased THEIR hardware) get the priority, and the perks, for buying an nVidia product. That's part of the game: a product has to have things that give you a reason to buy it, and in this case the PhysX engine is a bonus. It is NOT by any means an inalienable right that every PC user can claim. You choose ATi, you lose out, and have to make do with the Havok engine. The end. Period. End of debate.

freythman freythman said:

God I love Nvidia. They're just so kind

Nitroburner77 said:

Vrmithrax, Mac vs PC, PS3 vs Xbox, ATI vs nVidia. You're a fanboy. If another company makes a product that supports the standard that PhysX purportedly follows, it is their choice. This is purely a matter of trying to force ATI and other graphic card users to switch to nVidia. Nothing more. Let's see what happens to PhysX when industry standards are implemented into DX11.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Nitroburner77 said:

Vrmithrax, Mac vs PC, PS3 vs Xbox, ATI vs nVidia. You're a fanboy. If another company makes a product that supports the standard that PhysX purportedly follows, it is their choice. This is purely a matter of trying to force ATI and other graphic card users to switch to nVidia. Nothing more. Let's see what happens to PhysX when industry standards are implemented into DX11.

Sorry, fail, Nitro... I use ATi exclusively, have for years, and am in love with the new 5890, but (im)patiently waiting for the dual GPU version before I grab one. So what does that do for your theories now, huh? The fact is, nVidia owns PhysX outright, and can choose when, and if, they allow whoever they want to use it. If I was nVidia, I'd be keeping it for myself to give another edge in a VERY competitive marketplace, but I would also NOT want to have tons of "your crap doesn't work" technical support issues due to 3rd party hardware incompatibilities - at least with their internal GPU products, they can handle their own internal hardware/software.

It's really basic common sense and a tiny morsel of business intelligence. What you (and those others who seem to feel entitled to PhysX no matter who owns it) seem to miss, is that MONEY WAS PAID FOR THE TECHNOLOGY. nVidia makes absolutely NOTHING on an ATi card that can run PhysX when it is sold. To give an invested technology away is absolutely, irrevocably stupid and would probably completely piss off their shareholders. Don't ever forget, this is a business, and I can virtually guarantee that if ATi was the one that owned PhysX, we'd be seeing this same argument with just the sides switched.

You guys might as well be pissing and moaning about Norton not just giving away its virus scanning technology to anyone that wants it, or Apple for making iTunes use proprietary formats that don't work flawlessly on every single personal media player on the market... Whether we like it or not, business is business, and if people can't make money, they won't BE in business. Imagine what the video card market would be like if it was just a 1 pony race (either nVidia or ATi)... You can bet that the regular MASSIVE upgrades and enhancements to GPU technology would not be coming at anything close to the breakneck speeds that they are coming now... No reason to push to win, if there's only 1 racer.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Slight edit... Make that the 5870 I'm in love with, got ahead of myself (and current technology, apparently - it appears too much caffeine can make you leap forward in time).

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Guest said:

Seems counterintuitive to me. Why deny themselves some extra sales?

Sales of what? If they support PhysX on ATi cards, they are selling their competitor's product, not their own. That would be a pretty bad business move there, don't ya think?

pgbsamurai said:

Vrmithrax your argument has a small flaw. This is not about running Physx on ATI cards. This is about running graphics on an ATI board while running physx on an nVidia board. Having both pieces of hardware on one machine. What nVidia is doing is writing code to disable physx effects running on nVidia gpu's if it detects an ATI card as well. While not supporting the hardware conflicts this might entail is understandable, flat out disabling the functionality because your competitor's hardware is present seems more like anti-competitive business practices.

Guest said:

You missed an important point. It's not support for PhysX on ATi that people are complaining about. Nvidia is not letting people run PhysX on an Nvidia card if an ATi is the primary card running the game. So yes, Nvidia are shooting themselves in the foot, in that if ATi have a faster card and people want to use it for graphics, someone who might afford to buy an Nvidia card alongside it to handle PhysX now can't. And furthermore, what makes me angrier is that people who bought their card are now being forced to either stick with them or just throw away their current card. It's unfair, in my humble opinion.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Oh, I got the basic point of it... But my point is that, from a support and compatibility standpoint, the mixing of cards and then trying to make sure a proprietary software package like PhysX would work in some mixed-pair combination could become a burden for nVidia, but nobody seems to care about that. If you have an ATi card as your primary output, and an nVidia card also on-board (which seems weird; why not go one or the other, so you can SLI or Crossfire?), then who do you think will get the flak when some incompatibility between 2 completely different GPU types with different graphics drivers doesn't process PhysX correctly? Even if you take the whole sales pitch into account, in many ways nVidia are just protecting themselves from a possible crapstorm of support nightmares. Who are we to tell them that they MUST support a 3rd party's hardware configuration, and keep up with that outside company's driver and hardware changes, as well as their own? It's their software; they should be able to choose which platform configurations they support. Or do they not get the same rights as all of the other PC hardware companies in existence? As much as I dislike some of nVidia's practices, I can't find fault in their actions in this case.

Me, I'm rooting for AMD's open-source physics engine initiative. It would eliminate all of this debate, as the playing field would be leveled.

Guest said:

I can't believe what just happened to me this morning. I got a new 6-disc CD changer installed in my truck yesterday. So today I put six of my favorite CDs in and, lo and behold, none of them would play. I called Alpine, the manufacturer of the CD changer, to see if they could help. This is what they told me:

"We apologize sir, however, Sony/BMG has implemented a new policy to help protect their investment, their music cd's will not play if there is another music cd present."

If this sounds like a joke to you, well, it is. Just like Nvidia.

Guest said:

Darnit, my Alpine was supposed to be a Sony :(

But I'm sure you all get my point.

Guest said:

If I pay 300 dollars for a video card that supports PhysX, it had better do it! Whether I have an ATI, Nvidia, or Intel graphics chip. If you have a motherboard with an Nvidia chipset, you expect it to work with an ATI card. If you buy an AMD processor, you expect it to work with an Nvidia graphics card. And if I choose to run my 300-dollar Nvidia card as just a PhysX board, then, as it is an advertised feature, it should work. Or they need to put a disclaimer on all their boards and provide compensation for those of us who already spent the money. PhysX will probably go by the wayside, since they already have issues getting game companies to support it on the GPU. OpenCL and its associated technologies should be more attractive to game developers, since it's one less licensing expense for the studio. But then again, DirectX is still here and there are licensing fees for it too.

Darth Shiv Darth Shiv said:

I hope the bad publicity bites nvidia for what they have done. They aren't really in a strong position anymore to dictate terms like this and alienating a market segment hopefully will hit them in the wallet.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Guest said:

If I pay 300 dollars for a video card that supports PhysX. It better do it! Whether I have an ATI, NVidia or Intel graphics chip.

If you pay 300 dollars for that video card, you should probably do a bit of research before plunking down those hard-earned dollars. nVidia IS PhysX now, and PhysX is embedded in their graphics processing drivers, as part of the PRIMARY video display system. You can add a second nVidia card and have it do the lion's share of the processing, but as far as I've ever seen, it was always worded this way: "Add a SECOND nVidia card to do your PhysX calculations" and such. Just because they have acquired PhysX and rolled it into their GPUs doesn't mean they ever expected people to buy an nVidia graphics card and basically use it as a newer version of an Ageia dedicated PhysX card; it was always intended to be utilized in an nVidia graphics environment. This announcement and move to limit the PhysX applications has just cemented the concept, to protect them from issues that might stem from mix-and-match sets. That, and to push sales of their cards, of course.

I tend to settle for ATi in Crossfire running the Havok engine, rather than some bastardized hybrid system where I'm using 2 different GPU types - software physics doesn't care as much what platform(s) you run it on. And as you say, PhysX will likely become obsolete in the long run, particularly if AMD's open-source physics engine initiative catches on.

Guest said:

I tend to settle for ATi in Crossfire running the Havok engine, rather than some bastardized hybrid system where I'm using 2 different GPU types - software physics doesn't care as much what platform(s) you run it on. And as you say, PhysX will likely become obsolete in the long run, particularly if AMD's open-source physics engine initiative catches on.

You're misunderstanding the scenario. I don't expect anyone to buy an Nvidia card just for PhysX alongside an ATi, but consider current owners of Nvidia cards. The card gets a bit old, it's time to upgrade, and unless you're religious about your vendor you're going to buy the best option at the best price. As it is right now, I guess that would be an ATi card, at least until the GT300 comes out; we'll see. Now, if Nvidia didn't flat-out disable support for PhysX unless it's the primary card, those owners could still use it to do PhysX instead of just throwing away a pretty expensive piece of hardware. Which is fair, and it is what you expect. As for Nvidia having to deal with mixed environments: yes, of course they do; it's part of the support they have to provide, Vrmithrax. There will always be new scenarios. What if Microsoft releases an update that breaks something in the Nvidia driver? Do they just disable the feature because, well, it's not a supported scenario? This is no different. If their driver has issues running PhysX, or anything else for that matter, unless the card is in primary mode, that's a bug that requires fixing, end of story.

This seems different only because we're talking about competitors and the same kind of hardware. Would you think the same and keep the same stance if, instead of another ATi card, the problem were a certain sound card being present? I bet not. This is no different.

I obviously don't know the details, so I can't really say, but this is one of two things in my opinion:

Either Nvidia are trying to keep a strong hold on their current customers, and using this to deter them from jumping ship,

Or they have a bug in their driver system and they don't want to fix it.

Both are unacceptable in my opinion. I have always owned an Nvidia card, but I think it's time to switch if they're going to keep playing these games. This feels like blackmail and I won't stand for it.

I understand there may be some issues, and in some cases it might be unstable, but you don't fix that by disabling everything!

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Again, I am not misunderstanding the scenario... Maybe I should boil it down to basics: If you want PhysX, you need a "pure" nVidia graphics system. If you want to throw a wildcard ATi into the mix, nVidia wants no part in any possible support nightmare scenarios, so PhysX is disabled. Period.

I've done my share of programming (including drivers), so I can attest firsthand to the fact that it is often a microscopically fine line you are walking, hoping that nobody throws a curveball at you by changing THEIR drivers in such a way that it affects you too. nVidia (and ATi, for that matter) has enough work to do just keeping up with the general trends and constant advancements, to stay in step with current performance in terms of drivers and technical issues. What most people saying "nVidia sucks" on this thread seem to not get is that nVidia is not responsible for issues that might stem from mixing and matching wildly different GPU hardware. But you all seem to think they should be. If you put an ATi card into the mix, you have now created a hybrid situation; it is no longer an nVidia graphics system, so you have voided your option to run PhysX. It's amazing to me how people just expect companies to support stupid crap that they don't NEED to support, just because those people feel they are entitled to it. Who pays for all the extra time, research, programming, and support for that endeavor? You expect nVidia to just give away any profits they might make so a small percentage of their customer base can be mollified, in case they MIGHT want to do something a certain way that is, by its very definition, a nonstandard configuration?

The sound card comparison is ridiculous, that is apples and oranges. This is a situation where you have 2 pieces of hardware that do the same job, but use 2 totally different driver sets, and have any number of possibilities for conflict between the 2 platforms.

You say they possibly have a bug in their system, but in reality it's not a bug, they just didn't waste the time to develop, adjust, and maintain their applications to benefit THEIR COMPETITORS. Yes, hardware gets dated and needs updating, but you always have a choice, and in this case your choice is to stay with nVidia if you want PhysX, or go with ATi and lose that option. It's not rocket science, but it IS business, and nVidia is playing it smart. As I have said before, I may not agree with nVidia's practices (in fact I almost never do), but this one I can't fault.

Oh, and "you don't fix that by disabling everything!" Really? Come on, they are disabling PhysX, a (currently) seldom-used piece of GPU-intensive code that has been cobbled into their driver system to run on their hardware, rather than forcing you to buy an expensive physics processing card (hey, but let's not give them any kudos for THAT part). That's it. But, by all means, over-dramatize it by saying it is "everything" and join the doomsayers... heh

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Sorry, just have to address this

now if NVidia didnt flat out disable support for physx unless its the primary card, they can still use it to do physx instead o just throwing away a pretty expensive piece of hardware.

As opposed to an expensive ATi card you might be replacing, which gives you exactly nothing extra and you would just throw away. Double standard there, I'd say.

As for Nvidia having to deal with mixed enviroments, yes of course they do, its part of the support they have to provide Vrmithrax, there will always be new scenarios, what if Microsoft release and update that breaks something in nvidia driver, they just disable the feature cause well its not a supported scenario? this is no different.

Again, this is MASSIVELY different, but nobody seems to get it. If, as you say, Microsoft releases an update that breaks something in the nVidia driver, well they will fix it - they have a team that does nothing BUT fix and update their driver on those occasions where updates and changes affect their GPU drivers and performance. But, what if MS updates something that breaks the ATi driver enough to affect the PhysX interaction? Now nVidia is responsible for fixing THEIR driver to compensate for their COMPETITOR's driver shortcomings? Or are they supposed to just take over and program ATi's drivers for them, to make sure it all works? See, there is a huge component that nVidia has absolutely no control over in this equation, which is how the ATi (as the primary graphics engine) will interact with their stuff. And, you can guarantee that if the PhysX isn't working in this situation, they will be the ones getting the griping and support requests. Even if they point back at ATi as the culprit, most people who don't understand the complexities involved won't listen and pull the "well what do they have to do with YOUR software?" card. It's a support nightmare in the making, why not just eliminate the possibility?

Guest said:

Sorry, just have to address this

As opposed to an expensive ATi card you might be replacing, which gives you exactly nothing extra and you would just throw away. Double standard there, I'd say.

Not at all. The reason I went with Nvidia is that it offered better value and more features than ATi, but certainly if I had an ATi card, I would still expect it to keep working if I plugged in an Nvidia card in turn.

As for your comment that vendors no longer have to support their product once you buy a competitor's product: I understand your point of view, but sorry, I have to disagree. I am obviously not getting a refund from Nvidia if I buy an ATi card, so how does that free them from any support obligations for their product?

Also, I disagree with your distinction between a sound card and a video card in this context, for multiple reasons.

Mind you, we're not saying we want both cards running as the primary video controller; there, I understand, there might be conflicts. What we're saying is that the driver has to give access to its underlying hardware, something it should always do no matter what. I honestly fail to see how and why they would ever conflict, but obviously I have no access to either side's technical details, so I understand there might be something I'm not thinking of, and I would have no trouble waiting for them to fix any issues there might be. But flat-out disable it?

I mean, come on, seriously: you really think that vendors have a right to refuse to support specific scenarios because maybe it could turn into a support nightmare for them? And you think that's the reasonable thing to do? And where do you draw the line? Imagine buying an expensive stereo system which, when you hook it up, doesn't work. You go back demanding they honor their warranty, and they refuse on the grounds that you have put electricity through it, and hey, you have no idea what electricity can do to components when things go wrong; we're not gonna support that! It might turn into a nightmare! I know it's a ridiculous example, but please realize it is an extension of what you're saying, and that is my whole point. Vendors are obliged to fix their issues (which, let's not forget, technically should not be there in the first place, but obviously nothing is perfect, nor am I expecting them to be), and I don't think it's fair for them to shrug off responsibility simply because it's not cost-effective to fix something. Even that is arguable, because doing this will lose them customers. I myself am seriously thinking of jumping ship, and I have bought between 10 and 15 of their cards. I'd say that keeping your customers happy is a very cost-effective thing to do. A happy customer might stick with your product even if your competitor has a little advantage over you; an angry one will jump ship even if your competitor lags a little behind.

Finally, I am not asking for something arcane here, like you seem to think. May I remind you that others do this without issue. Ever heard of a sound card that disables things if you have other sound cards in your system? Nope, it's normal (remember that most boards include onboard sound, which you don't need to disable before your sound card works). The same thing goes for GPUs: plenty of motherboards have onboard GPUs, and again you don't need to disable them before your GPU works fully. Right now I am working on a system with multiple physical GPUs, not onboard, and they both live happily together. The same goes for network cards, modems, and anything else I can think of. The only thing I can honestly think of that will not coexist with a competitor's is a processor, and that's understandable, as they're widely different.

Don't get me wrong, I understand what you're saying. I understand it can be an expense for Nvidia to fix their issues (are there any, really? I looked around and couldn't find anyone complaining of issues when running PhysX alongside an ATi card. I would really be angry if this were exclusively a business move to blackmail their customers into staying, and a pretty stupid one if so, especially if ATi gets its own physics system and it becomes widely adopted). But let's give them the benefit of the doubt: maybe it's true there are issues, and maybe it's a big expense to fix them to benefit a few customers. However, it is still their responsibility to do so, Vrmithrax, and again, no one would ever excuse a vendor of their after-sales responsibility just because a customer experiences an issue that no one else did. This is no different, in my opinion.

Guest said:

Oh, and "you don't fix that by disabling everything!" Really? Come on, they are disabling PhysX, a (currently) seldom-used piece of GPU-intensive code that has been cobbled into their driver system to run on their hardware, rather than forcing you to buy an expensive physics processing card (hey, but let's not give them any kudos for THAT part). That's it. But, by all means, over-dramatize it by saying it is "everything" and join the doomsayers... heh

Seriously, what else would you use an Nvidia card for if you use an ATi as the primary GPU? The only thing it's useful for is PhysX and GPGPU operations. You know, you just got me thinking: is it just PhysX they disabled, or also any other GPGPU operation? Because if GPGPU operations still work, I bet it's all about a (in my opinion) misguided business decision, since as far as I know PhysX is an Ageia engine implemented to use GPGPU, and it's in the GPGPU layer that I'd expect any incompatibilities to be.

So if Nvidia are being honest, I didn't dramatize in saying "everything." If Nvidia are playing around, then they disabled 50%. However, the only thing I intended to do with the card was PhysX, so for me "everything" is still correct.

Just a minor thing I wanted to clarify.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Well, as I'm reading through everything in this thread, there are obviously intelligent and well-thought-out responses to my defense of nVidia's stance on this PhysX issue. I have realized that there is a gap in clarity about the exact nature of PhysX and what nVidia offers. So, let me backtrack, and then perhaps my position will make sense.

PhysX is a physics co-processing system that has been incorporated into the nVidia driver system, intended to be handled by their GPU rather than a separate processing card (as in the Ageia system that nVidia acquired and shelved). nVidia acquired this system to incorporate into their cards, basically because it was a good fit with their strong GPUs and created a value-added situation for their customers. As things have progressed, they have even made it possible for the PhysX engine to be run on a secondary nVidia GPU in a system, through some creative bridging. Newer nVidia-based motherboard chipsets even enhance many of these bridging functions. However, this bridging and offloading of PhysX was all developed in pure nVidia graphics systems. The PRIMARY nVidia graphics driver handles most of the bridging and coordinates the PhysX handling by a chosen secondary nVidia card. This can even work in something like an SLI configuration, where you have 2 nVidia cards running the primary graphics acceleration and a 3rd card running just PhysX. Again, all nVidia. This was how it was designed. The PhysX component was always an enhancement to the nVidia graphics platform, never an advertised independent function. While they HAVE advertised that you can get a second smaller nVidia GPU to operate as a PhysX engine, this was always in the context of adding to an existing nVidia-driven graphics platform; it was NEVER advertised as a "buy an nVidia card to use as a PhysX processor in whatever kind of system you have" situation. But everyone seems to just expect that.

nVidia did a lot of work incorporating PhysX into their graphics engine. They did a lot of testing to get it to integrate and diversify within their nVidia driver system. But at no time were they ever obligated to test or develop protocols for this relatively minor component (PhysX) of their graphics package to be used in configurations other than the ones they designed for. The PhysX integration, at its core level, was designed with the assumption that the nVidia graphics drivers would be primary, and thus able to coordinate and handle the offloading and correct physics calculations and communications back into the system.

People seem to want nVidia to spend an inordinate amount of time on what is essentially an embedded side feature of their primary product, just so it will fit in with what their system might need. People seem to forget, you buy nVidia for the nVidia performance, you get PhysX as a bonus. If you are buying nVidia just for PhysX, and have not researched the limitations, then the only person to blame is yourself.

Oh, and just FYI... The following is directly from nVidia's FAQ on PhysX, and has been there for quite some time:

Can I use an NVIDIA GPU as a PhysX processor and a non-NVIDIA GPU for regular display graphics?

No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics.

And with that, I'll bow out of this... I've defended nVidia's rights to choose the format and platform for their own proprietary (and wholly owned) product features far more than any ATi user should ever have (or have to). The thing to remember is this: just because you WANT a company to do something, doesn't mean it HAS to, unless it specifically stated that it was offered. In this case, nVidia never stated that PhysX could be used on anything other than an nVidia graphics system. Case closed. On to the next debate

Guest said:

I can't believe anyone would defend Nvidia in this.

This is not an issue of asking for support for mixed vendor cards.. it's an issue of Nvidia reaching its green hand into your computer and disabling your property that you presumably paid good money for.

If the whole thing just flat out didn't work, that would be one thing. Nvidia wouldn't be obligated to make it work.. even if it would be a grand gesture of goodwill. But it DOES work. Microsoft has ensured that it will work in Windows 7, something they didn't have to do.

So you have something that works just fine and people are getting the best of both worlds: PhysX and the best graphics performance. But since the best graphics performance is now coming from ATI, Nvidia can't stand it.. so they disable the entire functionality. This isn't a support issue, it's a 'force you to buy a new Nvidia card only' issue and it reeks of desperation.

It's already backfiring. Hacked drivers are already available to enable the functionality. The code to turn off your hardware can be intercepted. Don't settle for these pathetic business practices. There is no defense here. This is not a support issue. This is not a compatibility issue. This is a cutthroat business move that hurts one person, you the consumer.. not ATI. ATI is laughing to the bank while Evergreen shines and Nvidia's Fermi sits smoldering in ruins on the lab floor.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Guest said:

I can't believe anyone would defend Nvidia in this.

This is not an issue of asking for support for mixed vendor cards.. it's an issue of Nvidia reaching its green hand into your computer and disabling your property that you presumably paid good money for.

Another complete overdramatization of what is really a minor issue. You did NOT pay good money for PhysX, you paid good money for an nVidia graphics card, which just happens to come with this handy little PhysX feature at no extra cost. If you "paid good money" for the PhysX technology, odds are you would continue to "pay good money" to stay with the platform that actually is responsible for PhysX.

If there are hacks and such to make it work, that's fine, great, super! Hack away, my friend! But, those are hacks, and now when there are issues, guess who is NOT responsible for the support backlash in an officially unsupported configuration? Yep, you guessed it, nVidia is free and clear. So they can concentrate on making newer better GPUs and trying to catch up to ATi's 5800 series, instead of holding the hands of customers who have jumped ship but still expect them to bend over backwards to support their mutiny... It's a painfully obvious case of people wanting to have their cake and eat it too, and if you can't see that, then nothing anyone says will ever dissuade you from your feelings of false entitlement.

In the long run, it won't matter anyways. I foresee the open-source physics processing initiative making this a moot point in the near future.

Guest said:

instead of holding the hands of customers who have jumped ship but still expect them to bend over backwards to support their mutiny...

Vrmithrax, I really admire your logic: sure, let every company cease support for anyone who buys any other brand, so as to not support mutinies, right :)

NVIDIA wants to lock down their product for no good reason at all (no issues reported / it works with hacked drivers; if they were afraid of a support backlash they could have just said it will not be a supported setup and refused support... it's ugly, but I could live with that, much better than taking away something I got when I bought their product... wonder how they would feel if I were to take $20 away from the $500 I had paid them in the first place). Fine, their decision. I can't do much about it; all I can do is jump ship before this trend gets worse.

Vrmithrax, you will probably think I am overdramatizing, but today they do this, what about tomorrow? Nah, it's not worth it. I would rather go with someone who embraces openness, and ATI seems more inclined to do that. I guess this really helps with that mutiny thing.

People made PhysX run on ATI cards with hacked drivers, and Nvidia is afraid it might have issues running it on its own hardware if ATI is the primary? I don't buy it... this is a marketing move, pure and simple. The funny thing is, when companies go down this road and people start jumping ship, they think they need to restrict even more and make things worse. Hope I am wrong; time will tell.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Oh, I never doubted that there was a big marketing component to this scenario... My point was, there are any number of things that COULD go wrong with a hybrid configuration, and maybe nVidia just wanted no part in the possible crap they could find themselves in. Since you went down the road of "what about tomorrow" here: what if tomorrow ATi decided to tweak their card in such a way that it screwed with PhysX, making nVidia look like they have a crap product? Seriously, you can go round and round with "what ifs," trying to determine what will happen in the future. In the here and now, nVidia wanted no part of the technical upkeep of keeping PhysX working perfectly in configurations outside of their control.

Now, on the marketing side, the points I made seem to have been missed as well. nVidia spent a lot of money purchasing PhysX, and a lot more time and money incorporating it into their GPU system to give their customers an added perk for choosing to use nVidia as a graphics platform. They did this to prevent you from having to spend $300 on a separate physics processing card. Why on earth would they want to let consumers take their work and go out to buy the cheapest card possible that can run PhysX, then throw it in a system run on the competitor's graphics system? Or reward someone for buying an ATi card and relegating the old nVidia card to a back burner as just a lowly PhysX coprocessor? Yes, it's a marketing move, designed to encourage product loyalty by rewarding it with enhancements you can't get otherwise. You know, kinda like "buy 2 of our games, get a 3rd one free" deals - you think the company giving those away would be offering a competitor's game as the bonus? Or giving you the bonus game if you only bought 1 of their games and 1 from a competitor? I know, apples and oranges there, just trying to establish that this is a product loyalty situation they are playing out. They want, in no uncertain terms, for consumers to be given a choice: use an nVidia graphics system, you get PhysX - use something else, you don't. In the end, it's their property, their license, their call.

Do I like the move? No, not really - marketing ploys usually screw at least some portion of the consumer base. But can I respect the move as smart business-wise? Yep, and I'm sure their shareholders would be mortified if the move hadn't been made, and suddenly a bunch of the cheap nVidia cards were selling at minimal profit to be thrown into ATi 5870 driven graphics systems. If you can't beat em, trip em up. Welcome to business in a tight, cut-throat marketplace.

Guest said:

nVidia spent a lot of money purchasing PhysX, and a lot more time and money incorporating it into their GPU system to give their customers an added perk for choosing to use nVidia as a graphics platform. They did this to prevent you from having to spend $300 on a separate physics processing card. Why on earth would they want to let consumers take their work and go out to buy the cheapest card possible that can run PhysX, then throw it in a system run on the competitor's graphics system? Or reward someone for buying an ATi card and relegating the old nVidia card to a back burner as just a lowly PhysX coprocessor? Yes, it's a marketing move, designed to encourage product loyalty by rewarding it with enhancements you can't get otherwise. You know, kinda like "buy 2 of our games, get a 3rd one free" deals -

I don't agree. Yes, I am sure Nvidia would rather someone buy a cheap card alongside an ATI card than just buy the ATI card... and it's definitely not an incentive scheme. If you stick with Nvidia, they are not giving you extra functionality; you get exactly what you're paying for, no more, no less. If you jump ship, they are taking away something you have and that you paid for. It's a punishment scheme for those who jump ship, not a reward for brand loyalty.

I am not a lawyer, but I think it would be illegal to do such a thing if it weren't hidden behind a "possible technical issues" excuse, right? I mean, wouldn't this go against antitrust laws? Artificially disabling your hardware if it detects a competitor's product?

Vrmithrax Vrmithrax, TechSpot Paladin, said:

I am not a lawyer, but I think it would be illegal to do such a thing if it weren't hidden behind a "possible technical issues" excuse, right? I mean, wouldn't this go against antitrust laws? Artificially disabling your hardware if it detects a competitor's product?

As far as I know, it's not illegal for a very distinct reason. If you are running an ATi card as your primary graphics system, you would be using the ATi driver as the primary graphics control for your system. PhysX is only guaranteed, authorized and licensed to run if the nVidia graphics engine is primary. If for no other reason, the nVidia driver is the one that determines physics processing loads and distributions. Great loophole? Yes, yes it is. But a completely valid one - they have no legal requirement to run their proprietary software in a competitor's graphics environment.

And, for the record, it IS a customer loyalty reward... There are positive and negative reinforcement methods of reward. In this case, nVidia chose negative reinforcement: you choose to use ATi as your graphics quarterback? That's fine, but we're taking OUR ball and going home; you're on your own. It's not exactly the most mature response, but it's their ball to take.

Guest said:

I just bought an nForce 730i mobo. It has an onboard GeForce 9300. I have a GeForce 9500 card. Now, I can put that 9500 card into the 9300 mobo and run it in Hybrid SLI, or have the 9300 handle PhysX and the 9500 handle graphics. The 9300 then becomes a discrete processing unit, completely separate from the graphics pipeline. When I bought the mobo, however, I also bought a Radeon 4650 card, which is a fair bit faster and has twice the RAM of my 9500 (and it was $40). There's no logical reason why I can't use my Radeon card and my discrete PhysX processing unit other than Nvidia being dicks. I've always used Nvidia and was hesitant about getting the 4650 just because of brand loyalty. After this, I never want to buy an Nvidia product again.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

I just bought an nForce 730i mobo. It has an onboard GeForce 9300. I have a GeForce 9500 card. Now, I can put that 9500 card into the 9300 mobo and run it in Hybrid SLI, or have the 9300 handle PhysX and the 9500 handle graphics. The 9300 then becomes a discrete processing unit, completely separate from the graphics pipeline. When I bought the mobo, however, I also bought a Radeon 4650 card, which is a fair bit faster and has twice the RAM of my 9500 (and it was $40). There's no logical reason why I can't use my Radeon card and my discrete PhysX processing unit other than Nvidia being dicks. I've always used Nvidia and was hesitant about getting the 4650 just because of brand loyalty. After this, I never want to buy an Nvidia product again.

Just to beat a dead horse: If you use the Radeon card, you are losing your Hybrid SLI and PhysX, because you will be using ATi's driver for the primary display. Your choice of primary video card has precluded using PhysX; this was your own doing (and you probably should have read the fine print and done some research if PhysX is that important to you). It is all due to the primary driver application: the PhysX code is part of that primary driver, not a secondary distinct driver set. So, for it to work properly in your cross-manufacturer hybrid situation, both ATi and nVidia video drivers would have to be working concurrently, with zero conflicts. Why does nobody seem to see how open a system like that is to conflicts, driver compatibility problems, version collisions, and upgrade nightmares? Seriously!

Guest said:

Hybrid SLI is two cards executing the EXACT same code in harmony. ATI for GPU and NV for PhysX is two separate cards doing two separate jobs. It's a poor analogy, and like most you've used, it misses the mark in one or more crucial areas. I wouldn't doubt for a moment that you work for a marketing firm, with the way you rationalize and spin.

Vrmithrax Vrmithrax, TechSpot Paladin, said:

Hybrid SLI is two cards executing the EXACT same code in harmony. ATI for GPU and NV for PhysX is two separate cards doing two separate jobs. It's a poor analogy, and like most you've used, it misses the mark in one or more crucial areas. I wouldn't doubt for a moment that you work for a marketing firm, with the way you rationalize and spin.

Ummm... I believe I'm not the one who missed the mark. Enhanced Hybrid SLI technology is what allows things like an integrated nVidia GPU and a secondary discrete nVidia graphics card to work synchronously, or offers the option of offloading PhysX onto one processor and standard graphics onto the other... Or using the integrated GPU for standard graphics and the discrete card for full-on 3D gaming processing.

ATi for GPU and nVidia for PhysX is what the topic is about: a hybrid graphics system (NOT Hybrid SLI), in which case you are using ATI's graphics driver to control your graphics output. The nVidia PhysX code was incorporated into the nVidia graphics driver; it was never intended to be run as a separate module, and is only stated as available in a pure nVidia graphics environment, in which case their GeForce Boost system for Hybrid SLI can allow one of the cards in the system to be used for PhysX calculations. Again, it is only officially available if the nVidia driver is primary, which it is not in these situations where ATi is the primary graphics card.

I'm not in marketing, I just happen to actually read and research what is going on. And, as I have stated before, I am an avid ATi user, have been for years since I jumped off the nVidia train (the horrid driver bugs made me crazy). But just because I use ATi, doesn't mean I feel I am entitled to use of nVidia's proprietary technologies just because I say so. It's their code, their product, they get to choose how it's utilized. The end.

ellulbrian ellulbrian said:

Nope. I wouldn't blame them. It's all about business, and it's commonly done in hardware (even outside IT).

fadownjoo said:

Agreed. But how many people know how to write their own custom drivers for their hardware? And how many of those who can will share their drivers with others having a multitude of system configurations?

It is only proper for a company competing with another to support only their own products. Unless you want Nvidia to charge more for their cards because they have to develop drivers and software that is compatible with their competitors

I agree with that guy.

Ju1iet said:

Besides, ATI's new "Evergreen" series has enough horsepower to handle PhysX effects smoothly. Moreover, they have Eyefinity, DX11, and better energy efficiency. Nvidia won't have any advantage over ATI if they don't do this.
