
Nvidia cuts PhysX if AMD card is present

By Matthew · 40 replies
Oct 2, 2009
  1. Not at all. The reason I went with Nvidia is that it offered better value and more features than ATi, but by the same token, if I had an ATi card I would still expect it to keep working when I plug in an Nvidia card.

    As for your comment that a vendor no longer has to support its product once you buy a competitor's product, I understand your point of view, but I have to disagree. I am obviously not getting a refund from Nvidia if I buy an ATi card, so how does that free them from any support obligations for their product?

    I also disagree with your distinction between a sound card and a video card in this context, for multiple reasons.

    Mind you, I'm not saying I want both cards running as the primary video controller; there I understand there might be conflicts. What I'm saying is that the driver has to give access to its underlying hardware, something it should always do no matter what. I honestly fail to see how or why the two would ever conflict, but obviously I have no access to either vendor's technical details, so there may be something I'm not thinking of, and I would have no trouble waiting for them to fix whatever issues there might be. But flat out disabling it?

    I mean, come on. Do you seriously think vendors have the right to refuse to support specific scenarios because they might turn into a support nightmare? And you think that's the reasonable thing to do? Where do you draw the line? Imagine buying an expensive stereo system that doesn't work when you hook it up. You go back demanding they honor their warranty, and they refuse on the grounds that you put electricity through it, and hey, you have no idea what electricity can do to components when things go wrong; they're not going to support that, it might turn into a nightmare! I know it's a ridiculous example, but please realize it is an extension of what you're saying, and that is my whole point. Vendors are obliged to fix their issues (which, let's not forget, technically should not be there in the first place, though obviously nothing is perfect and I don't expect them to be), and I don't think it's fair for them to shrug off responsibility simply because it's not cost effective to fix something. Even that is arguable, because doing this will lose them customers. I myself am seriously thinking of jumping ship, and I have bought somewhere between 10 and 15 of their cards. I'd say keeping your customers happy is a very cost-effective thing to do. A happy customer might stick with your product even if your competitor has a little advantage over you; an angry one will jump ship even if your competitor lags a little behind.

    Finally, I am not asking for something arcane here, as you seem to think. May I remind you that others do this without issue? Ever heard of a sound card that disables features if you have another sound card in your system? No; coexistence is normal (remember that most boards include onboard sound, which you don't need to disable before your sound card works). The same goes for GPUs: plenty of motherboards have onboard graphics, and again you don't need to disable it before your GPU works fully. Right now I am working on a system with multiple physical GPUs, not onboard, and they live together happily. The same goes for network cards, modems, and anything else I can think of. The only component I can honestly think of that will not coexist with a competitor's is a processor, and that's understandable, as they're widely different.

    Don't get me wrong, I understand what you're saying; I understand it can be an expense for Nvidia to fix their issues. (Are there any, really? I looked around and couldn't find anyone complaining of issues when running PhysX alongside an ATi card. I would be really angry if this is exclusively a business move to blackmail their customers into staying, and a pretty stupid one if so, especially if ATi gets its own physics system and it becomes widely adopted.) But let's give them the benefit of the doubt: maybe it's true there are issues, and maybe it's a big expense to fix them to benefit a few customers. However, it is still their responsibility to do so, Vrmithrax, and again, no one would ever excuse a vendor from its after-sales responsibility when a customer hits an issue that no one else did. This is no different, in my opinion.
  2. Seriously, what else would you use an Nvidia card for if you use an ATi card as the primary GPU? The only uses left are PhysX and GPGPU operations. You know, you just got me thinking: did they disable just PhysX, or also every other GPGPU operation? Because if GPGPU operations still work, I bet it's all about a (in my opinion misguided) business decision, since as far as I know PhysX is the Ageia engine implemented on top of GPGPU, and the GPGPU layer is where I'd expect any incompatibilities to be.

    So if Nvidia is being honest, I didn't dramatize in saying "everything." If Nvidia is playing games, then they disabled 50%. However, the only thing I intended to do with the card was PhysX, so for me "everything" is still correct.

    A minor thing that I wanted to clarify.
  3. Vrmithrax TechSpot Paladin Posts: 1,333   +280

    Well, as I'm reading through everything in this thread, there are obviously intelligent and well-thought-out responses to my defense of nVidia's stance on this PhysX issue. I have realized that there is a large gap in clarity on the exact nature of PhysX and what nVidia offers. So, let me backtrack, and then perhaps my position will make sense.

    PhysX is a physics co-processing system that has been incorporated into the nVidia driver system, intended to be handled by their GPU rather than a separate processing card (as in the Ageia system that nVidia acquired and shelved). nVidia acquired this system to incorporate into their cards, basically because it was a good fit with their strong GPUs and created a value-added situation for their customers. As things have progressed, they have even made it possible for the PhysX engine to be run on a secondary nVidia GPU in a system, through some creative bridging. Newer nVidia-based motherboard chipsets even enhance many of these bridging functions. However, this bridging and offloading of PhysX was all developed in pure nVidia graphics systems. The PRIMARY nVidia graphics driver handles most of the bridging, and coordinates the PhysX handling by a chosen secondary nVidia card. This can even work in something like an SLI configuration, where you have 2 nVidia cards running the primary graphics acceleration and a 3rd card running just PhysX. Again, all nVidia. This was how it was designed. The PhysX component was always an enhancement to the nVidia graphics platform, never an advertised independent function. While they HAVE advertised that you can get a second, smaller nVidia GPU to operate as a PhysX engine, this was always in the context of adding to an existing nVidia-driven graphics platform; it was NEVER advertised as a "buy an nVidia card to use as a PhysX processor in whatever kind of system you have" situation. But everyone seems to just expect that.
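    [Editor's note] Purely as an illustration, the gating described in the paragraph above can be sketched as follows. The function, tuple layout, and vendor strings are invented for this sketch; this is not Nvidia's actual driver logic, just the decision rule being debated in this thread: the primary driver must be Nvidia's, and PhysX then runs on a secondary Nvidia GPU if one exists, otherwise on the primary.

```python
def assign_physx_gpu(gpus):
    """Sketch of the described gating. `gpus` is a list of
    (vendor, role) tuples, where role is 'primary' or 'secondary'.
    Returns the GPU that would run PhysX, or None if unavailable."""
    primary = next(g for g in gpus if g[1] == "primary")
    if primary[0] != "nvidia":
        # A non-Nvidia primary driver means no Nvidia driver is
        # coordinating the graphics pipeline, so no PhysX offload.
        return None
    # Prefer a secondary Nvidia card as a dedicated PhysX processor
    # (the "SLI pair plus a 3rd PhysX card" case described above).
    for g in gpus:
        if g[0] == "nvidia" and g[1] == "secondary":
            return g
    # Otherwise PhysX runs on the primary Nvidia GPU itself.
    return primary

print(assign_physx_gpu([("nvidia", "primary"), ("nvidia", "secondary")]))
print(assign_physx_gpu([("ati", "primary"), ("nvidia", "secondary")]))
```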

    nVidia did a lot of work incorporating PhysX into their graphics engine. They did a lot of testing to get it to integrate and diversify within their nVidia driver system. But at no time were they ever obligated to test or develop protocols for this relatively minor component (PhysX) of their graphics package to be used in configurations other than the ones they designed for. The PhysX integration, at its core level, was designed with the assumption that the nVidia graphics drivers would be primary, and thus able to coordinate and handle the offloading and correct physics calculations and communications back into the system.

    People seem to want nVidia to spend an inordinate amount of time on what is essentially an embedded side feature of their primary product, just so it will fit in with what their system might need. People seem to forget, you buy nVidia for the nVidia performance, you get PhysX as a bonus. If you are buying nVidia just for PhysX, and have not researched the limitations, then the only person to blame is yourself.

    Oh, and just FYI... The following is directly from nVidia's FAQ on PhysX, and has been there for quite some time:
    And with that, I'll bow out of this... I've defended nVidia's right to choose the format and platform for their own proprietary (and wholly owned) product features far more than any ATi user ever should (or should have to). The thing to remember is this: just because you WANT a company to do something doesn't mean it HAS to, unless it specifically stated that the feature was offered. In this case, nVidia never stated that PhysX could be used on anything other than an nVidia graphics system. Case closed. On to the next debate :)
  4. I can't believe anyone would defend Nvidia in this.

    This is not an issue of asking for support for mixed vendor cards.. it's an issue of Nvidia reaching its green hand into your computer and disabling your property that you presumably paid good money for.

    If the whole thing just flat out didn't work, that would be one thing. Nvidia wouldn't be obligated to make it work.. even if it would be a grand gesture of goodwill. But it DOES work. Microsoft has ensured that it will work in Windows 7, something they didn't have to do.

    So you have something that works just fine, and people are getting the best of both worlds: PhysX and the best graphics performance. But since the best graphics performance is now coming from ATI, Nvidia can't stand it, so they disable the entire functionality. This isn't a support issue; it's a 'force you to buy a new Nvidia card only' issue, and it reeks of desperation.

    It's already backfiring. Hacked drivers are already available to re-enable the functionality; the code that turns off your hardware can be intercepted. Don't settle for these pathetic business practices. There is no defense here. This is not a support issue. This is not a compatibility issue. This is a cutthroat business move that hurts one person: you, the consumer, not ATI. ATI is laughing all the way to the bank while Evergreen shines and Nvidia's Fermi sits smoldering in ruins on the lab floor.
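    [Editor's note] To make the poster's distinction concrete: the complaint is not that mixed systems can't work, but that a check disables PhysX whenever a competitor's adapter is merely present. The sketch below is a toy illustration of that idea; the function and `lockout` flag are invented, not real driver code. The PCI vendor IDs, however, are real (0x10DE is Nvidia, 0x1002 is ATi/AMD). Conceptually, the hacked drivers mentioned above patch out a check of this kind.

```python
NVIDIA = 0x10DE   # Nvidia's PCI vendor ID (real)
AMD_ATI = 0x1002  # ATi/AMD's PCI vendor ID (real)

def physx_enabled(adapter_vendor_ids, lockout=True):
    """Toy model: with lockout=True (the behavior this thread is about),
    PhysX is switched off if any ATi adapter is present, even though an
    Nvidia GPU is installed and could run the physics workload."""
    if NVIDIA not in adapter_vendor_ids:
        return False  # no Nvidia hardware at all: genuinely can't work
    if lockout and AMD_ATI in adapter_vendor_ids:
        return False  # competitor detected: feature disabled by policy
    return True       # with the lockout removed, mixed systems work

print(physx_enabled([NVIDIA, AMD_ATI]))                 # False under the lockout
print(physx_enabled([NVIDIA, AMD_ATI], lockout=False))  # True once patched
```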
  5. Vrmithrax

    Another complete overdramatization of what is really a minor issue. You did NOT pay good money for PhysX, you paid good money for an nVidia graphics card, which just happens to come with this handy little PhysX feature at no extra cost. If you "paid good money" for the PhysX technology, odds are you would continue to "pay good money" to stay with the platform that actually is responsible for PhysX.

    If there are hacks and such to make it work, that's fine, great, super! Hack away, my friend! But those are hacks, and when there are issues, guess who is NOT responsible for the support backlash in an officially unsupported configuration? Yep, you guessed it: nVidia is free and clear. So they can concentrate on making newer, better GPUs and trying to catch up to ATi's 5800 series, instead of holding the hands of customers who have jumped ship but still expect them to bend over backwards to support their mutiny... It's a painfully obvious case of people wanting to have their cake and eat it too, and if you can't see that, then nothing anyone says will ever dissuade you from your feelings of false entitlement.

    In the long run, it won't matter anyways. I foresee the open-source physics processing initiative making this a moot point in the near future.
  6. Vrmithrax, I really admire your logic; sure, let every company cease support for anyone who buys another brand, so as not to support mutinies, right :)

    NVIDIA wants to lock down their product for no good reason at all (no issues have been reported, and it works with hacked drivers; if they were afraid of a support backlash, they could have just declared it an unsupported setup and refused support. That's ugly, but I could live with it, and it's much better than taking away something I got when I bought their product. I wonder how they would feel if I were to take $20 back from the $500 I paid them in the first place). Fine, it's their decision. I can't do much about it; all I can do is jump ship before this trend gets worse.

    Vrmithrax, you will probably think I am overdramatizing, but today they do this; what about tomorrow? Nah, it's not worth it. I would rather go with someone who embraces openness, and ATI seems more inclined to do that. I guess this really helps with that mutiny thing.

    People made PhysX run on ATI cards with hacked drivers, yet Nvidia is afraid it might have issues running it on its own hardware if ATI is the primary? I don't buy it; this is a marketing move, pure and simple. The funny thing is, when companies go down this road and people start jumping ship, they conclude they need to restrict things even more, making matters worse. I hope I am wrong; time will tell.
  7. Vrmithrax

    Oh, I never doubted that there was a big marketing component to this scenario... My point was, there are any number of things that COULD go wrong with a hybrid configuration, and maybe nVidia just wanted no part of the possible crap they could find themselves in. Since you went down the road of "what about tomorrow": what if tomorrow ATi decided to tweak their card in such a way that it screwed with PhysX, making nVidia look like they have a crap product? Seriously, you can go round and round with "what ifs," trying to determine what will happen in the future. In the here and now, nVidia wanted no part of the technical upkeep of keeping PhysX working perfectly in configurations outside of their control.

    Now, on the marketing side, the points I made seem to have been missed as well. nVidia spent a lot of money purchasing PhysX, and a lot more time and money incorporating it into their GPU system to give their customers an added perk for choosing nVidia as a graphics platform. They did this to prevent you from having to spend $300 on a separate physics processing card. Why on earth would they want to let consumers take their work, go out and buy the cheapest card possible that can run PhysX, and then throw it in a system run on the competitor's graphics system? Or reward someone for buying an ATi card and relegating the old nVidia card to a back burner as just a lowly PhysX coprocessor? Yes, it's a marketing move, designed to encourage product loyalty by rewarding it with enhancements you can't get otherwise. You know, kinda like "buy 2 of our games, get a 3rd one free" deals - you think the company giving those away would be offering a competitor's game as the bonus? Or giving you the bonus game if you only bought 1 of their games and 1 from a competitor? I know, apples and oranges there; I'm just trying to establish that this is a product loyalty situation that they are playing out. They want, in no uncertain terms, for consumers to be given a choice: use an nVidia graphics system, you get PhysX - use something else, you don't. In the end, it's their property, their license, their call.

    Do I like the move? No, not really - marketing ploys usually screw at least some portion of the consumer base. But can I respect the move as smart business-wise? Yep, and I'm sure their shareholders would be mortified if the move hadn't been made, and suddenly a bunch of the cheap nVidia cards were selling at minimal profit to be thrown into ATi 5870 driven graphics systems. If you can't beat em, trip em up. Welcome to business in a tight, cut-throat marketplace.
  8. I don't agree. Yes, I am sure Nvidia would rather someone buy a cheap Nvidia card alongside an ATi card than just buy the ATi card alone... and it's definitely not an incentive scheme. If you stick with Nvidia, they are not giving you extra functionality; you get exactly what you're paying for, no more, no less. If you jump ship, they are taking away something you have and that you paid for. It's a punishment scheme for those who jump ship, not a reward for brand loyalty.

    I am not a lawyer, but I think it would be illegal to do such a thing if it weren't hidden behind a "possible technical issues" excuse, right? I mean, wouldn't this go against antitrust laws? Artificially disabling your hardware when it detects a competitor's product?
  9. Vrmithrax

    As far as I know, it's not illegal, for a very distinct reason. If you are running an ATi card as your primary graphics system, you would be using the ATi driver as the primary graphics control for your system. PhysX is only guaranteed, authorized and licensed to run if the nVidia graphics engine is primary, if for no other reason than that the nVidia driver is the one that determines physics processing loads and distributions. Great loophole? Yes, yes it is. But a completely valid one - they have no legal requirement to run their proprietary software in a competitor's graphics environment.

    And, for the record, it IS a customer loyalty reward... There are positive and negative reinforcement methods of reward. In this case, nVidia chose negative reinforcement: you choose to use ATi as your graphics quarterback? That's fine, but we're taking OUR ball and going home; you're on your own. It's not exactly the most mature response, but it's their ball to take.
  10. I just bought an nForce 730i mobo. It has an onboard GeForce 9300. I have a GeForce 9500 card. Now, I can put that 9500 card into the 9300 mobo and run it in Hybrid SLI, or have the 9300 handle PhysX and the 9500 handle graphics. The 9300 then becomes a discrete processor unit, completely separate from the graphics pipeline. When I bought the mobo, however, I also bought a Radeon 4650 card, which is a fair bit faster and has twice the RAM of my 9500 (and it was $40). There's no logical reason why I can't use my Radeon card and my discrete PhysX processor unit, other than Nvidia being dicks. I've always used Nvidia and was hesitant about getting the 4650 just because of brand loyalty. After this, I never want to buy an Nvidia product again.
  11. Vrmithrax

    Just to beat a dead horse: if you use the Radeon card, you are losing your Hybrid SLI and PhysX, because you will be using ATi's driver for the primary display. Your choice of primary video card has precluded being able to use PhysX; this was your own doing (and you probably should have read the fine print and done some research if PhysX is that important to you). It is all due to the primary driver application: the PhysX code is part of that primary driver, not a separate, distinct driver set. So, for it to work properly in your cross-manufacturer hybrid situation, both ATi and nVidia video drivers would have to be working concurrently, with zero conflicts. Why does nobody seem to see how open a system like that is to possible conflicts, driver compatibility problems, version collisions and upgrade nightmares? Seriously!
  12. Hybrid SLI is two cards executing the EXACT same code in harmony. ATi for GPU and nVidia for PhysX is two separate cards doing two separate jobs. It's a poor analogy, and like most you've used, it misses the mark in one or more crucial areas. I wouldn't doubt for a moment that you work for a marketing firm, with the way you rationalize and spin.
  13. Vrmithrax

    Ummm... I believe I'm not the one who missed the mark. The enhanced hybrid SLI technology is what is used to allow things like an integrated nVidia graphics GPU and a secondary discrete nVidia graphics card to work synchronously, or have the option of offloading the PhysX into one processor and standard graphics on the other processor... Or using the integrated GPU for standard graphics and the discrete card for full-on 3d gaming processing.

    ATi for GPU and nVidia for PhysX is what the topic is about, a hybrid graphics system (NOT hybrid SLI), in which case you are using ATI's graphics driver to control your graphics output. The nVidia PhysX code was incorporated in the nVidia graphics driver, it was never intended to be run as a separate module, and is only stated as available in a pure nVidia graphics environment, in which case their GeForce Boost system for the hybrid SLI can allow one of the cards in the system to be used for PhysX calculations. Again, only officially available if the nVidia driver is primary, which it is not in these situations where ATi is the primary graphics card.

    I'm not in marketing, I just happen to actually read and research what is going on. And, as I have stated before, I am an avid ATi user, have been for years since I jumped off the nVidia train (the horrid driver bugs made me crazy). But just because I use ATi, doesn't mean I feel I am entitled to use of nVidia's proprietary technologies just because I say so. It's their code, their product, they get to choose how it's utilized. The end.
  14. ellulbrian TS Rookie Posts: 30

    Nope, I wouldn't blame them. It's all about business, and it's commonly done in hardware (even hardware unrelated to IT).
  15. fadownjoo TS Rookie Posts: 50

    Agreed. But how many people know how to write their own custom drivers for their hardware? And how many of those who can will share their drivers with others having a multitude of system configurations?

    It is only proper for a company competing with another to support only its own products, unless you want Nvidia to charge more for their cards because they have to develop drivers and software that are compatible with their competitors'.

    I agree with that guy.
  16. Ju1iet TS Rookie

    Besides, ATI's new "Evergreen" series has enough horsepower to handle PhysX effects fluidly. Moreover, they have Eyefinity, DX11 and energy efficiency. Nvidia won't have any advantages over ATI if they don't do this.