TechSpot

AMD approached Nvidia before ATI acquisition in 2006

By Leeky
Feb 24, 2012
  1. News has emerged that Advanced Micro Devices also held talks with graphics processor designer Nvidia about an acquisition before going ahead and purchasing ATI in 2006, according to former…

  2. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,631   +432

    Deep down I hope AMD finds their "Core 2". Even if it happens by accident.
     
  3. Ea80

    Ea80 TS Rookie

    The HD 7970 the fastest GPU?!
    Yes, it is faster, but I think gamers like quality... am I right? As a gamer I say yes!
    I can't see the point in a fast PC card that can't get me the same proper physics and image quality as its competitor, sorry...
     
  4. Yeah, really awesome quality from Nvidia...
    It was in 2006 and the years after that Nvidia sold a load of bad graphics chips that broke one after the other. No wonder Nvidia is now so large; they know how to profit at any cost...
     
  5. Ea80

    Ea80 TS Rookie

    I always had problems with ATI cards... glitches in the image... I can tell you Nvidia chips never gave me any problems. To each his own luck, what can I say.
    Still, you can't tell me that I'm wrong and that Nvidia doesn't give the best image quality and physics in specific games.
     
  6. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,631   +432

    I've had 7 nVIDIA cards and 4 ATi/AMD cards, and the only one I ever returned was a GTX 570. That won't stop me from trying another. Kepler is gonna decide where I go next.
     
  7. Ea80

    Ea80 TS Rookie

    I'm hating no one, I just like quality, and I don't understand why AMD GPUs don't use software like Nvidia's PhysX.
    By the way, can you help me? How can I reply to a thread like you did on mine?
     
  8. AMD should have agreed! And made Nvidia's CEO AMD's CEO.

    We would probably have a competitive AMD product right now!

    But ATI would have been out of business... I'm sure!
     
  9. Tekkaraiden

    Tekkaraiden TS Evangelist Posts: 991   +90

    I thought this was already pretty well known.
     
  10. ohsilly

    ohsilly TS Rookie

    I am not a gamer, but I want fast performance, like loading webpages full of crappy ad banners that always bump the article around until they're done. I also want fast performance for converting video formats, like from .mov to .wmv or .avi and vice versa. I still have a puny ATI X1300 from back in 2005 or so. I want the cheapest graphics card that is still fast. Any comments? Does a 6XXX or 7XXX series go faster for my uses above, or is it just for gamers? I have a PCIe x16 slot, yessir!

    I mean, do faster graphics cards help ad banners load faster on the same broadband speed than older cards like the X1300 I have now, or does it make no difference? Duh?

    I am not going to upgrade my broadband, as I think 3 meg is good enough for me. I think those crappy ad banners are just foolin' around with my old graphics card!

    Why is it that those crappy ad banners keep reminding me of the stupid Keystone Kops movies? You know, cops running around and getting run over by cars or whatnot in the old B&W flicks...
     
  11. Use an ad blocker...
     
  12. Graphics cards are not going to make video encoding any faster. We're not there just yet; it's a tough nut to crack. If you want faster video encoding, grab the fastest CPU you can get.

    Graphics cards won't help you much with ad rendering either. It's mostly done on the CPU, with some JavaScript accelerated by the GPU. Get the fastest CPU you can, and an ad-blocker to block Flash and CPU-consuming ads.

    A new graphics card will mostly help with image processing, video decoding acceleration, and gaming.
     
  13. Darth Shiv

    Darth Shiv TS Evangelist Posts: 1,620   +376

    Adobe Premiere Pro does GPU-accelerated encoding... not sure what you are talking about :p
     
  14. I'm pretty sure Photoshop supports GPU acceleration as well.
     
  15. Ultraman1966

    Ultraman1966 TS Enthusiast Posts: 85   +8

    It's debatable whether image quality is always better on Nvidia cards, but regarding PhysX: it is Nvidia's technology (after they purchased it) and they don't really want to share it with their rival AMD/ATi. Or if they did, it would come at a price. At the end of the day it would be far better for all consumers if an open-source standard were used, which is sort of what AMD has been pushing.

    I've had about 3 Nvidia cards and 4 ATi/AMD cards in the last decade, so I'm not biased toward either; I will buy whatever is best suited to my needs at the best reasonable price.
     
  16. Ea80

    Ea80 TS Rookie

    To Ultraman1966: what about SLI? It was on Nvidia's cards first, right? And after Nvidia, AMD did it on their cards. Isn't that the same as PhysX would be, if AMD did it on their cards? About a standard, you're right; I too believe an algorithm like PhysX needs to be on every card as standard :)
     
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Sounds reasonable, right? Nvidia spent capital to acquire Ageia. You wouldn't expect Nvidia to foot the bill and then give away an asset for free, would you? I didn't see AMD falling over themselves offering Eyefinity tech, or giving Nvidia their blessing to use AMD's GDDR5 memory controller when AMD were first to market with the tech. The difference is that Nvidia-centric posters don't harp on about AMD's proprietary IP.

    There's also the question of who would foot the bill for getting PhysX working on AMD cards. Would anyone see AMD giving Nvidia their drivers ahead of release to ensure compatibility with PhysX? And every time Nvidia update the PhysX engine (latest build 9.12.0213, released six days ago), are they responsible for ensuring compatibility with AMD drivers?

    AMD "push" open standards because they do software only marginally better than North Korea do democracy.
     
  18. Darth Shiv

    Darth Shiv TS Evangelist Posts: 1,620   +376

    Not sure what software you are referring to there. Anyway, that is beside the point. AMD has significant GPU market share, and the argument goes that game developers would benefit more from competition in the GPU market. To allow a more level playing field, an open physics standard would be beneficial. AMD obviously agrees with this because it helps them gain market share, as their products would be more on par with the competition with regard to features. NVIDIA doesn't, because they have the market share to lose and are happy with the situation at the moment.
     
  19. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Here's a nice recent (and current) example:
    AMD tout VCE as a significant part of their GCN architecture... yet have no functioning drivers for the feature. Devs aren't coding for VCE (maybe ArcSoft?) in part because they have no functioning driver available for the Radeon HD 7000 series with which to run/test/implement the feature... and AMD probably won't push the feature until the software library becomes available. Familiar with the phrase "Catch-22"?

    Easy enough to find similar (and more glaring) problems with the late and constantly changing roadmaps associated with AMD's Stream (now APP). If you're looking solely at gaming (a rather narrower emphasis), you still see less-than-shining examples in AMD's loss of interest in GITG, recent driver woes with Crossfire/GCN, and exactly how long was it between the HD 7970/7950 "launch" on 22nd December and the appearance of WHQL drivers?
    Sounds like an argument worthy of a special needs institution.
    Devs benefit from sponsorship by companies via SDKs, direct coding, or wads of cash.
    Gaming is not dependent upon competition in the GPU market, at least not directly. AMD and Nvidia haven't gone seriously head-to-head in a price war since 2007-08. A few minor skirmishes (GTX 460 vs HD 5830, GTX 260 vs HD 4870, etc.) aside, they have managed, by some minor miracle or otherwise, to dovetail price and performance, the exceptions being the top-dog SKU of any generation. All the present/future pricing structure does is limit uptake of new parts if 1. they are too expensive for the performance gained, and 2. the Osborne effect comes into play when a newer series is imminent.
    There are already ample physics engines in gaming aside from PhysX: Bullet, Havok (Intel), and a whole raft of game-specific physics applications.

    The fact also remains that proprietary or company-driven standards develop and mature faster than open source as a general rule. Don't agree? Then look at the development cycle of OpenGL and, lately, OpenCL.
    You know many people that base a graphics card purchase primarily on its ability to use game physics?
    It's a marketing bullet point only, as are the majority of in-game physics effects.
    With all the anti-Nvidia rhetoric, I still haven't heard who has the job of debugging and code implementation for incorporating PhysX into AMD drivers. Would you see this, hypothetically, as something the Nvidia driver team should pick up the tab for?
    TL;DR: There's no way on Earth that AMD would allow Nvidia anywhere near their driver code, and AMD have zero interest in PhysX in any case.

    Game developers have the choice of what game engine they utilize, and what, if any, physics effects they incorporate. For the smaller game studios, maybe the incentive is there to add a sponsor's codepath... so what's the alternative? No SDK and development help, and a late release (or none at all)? An open-source game or no game? I just love the pontificating from on high regarding open standards and the advancement of gaming when AMD folded their tent for 5+ years as far as providing game devs with sponsored support.

    Ever wonder why Nvidia has such entrenched gamer support even in the face of generally more expensive cards that frequently arrive late and use more power? Maybe it's a continued presence within gaming; call it mind share. You can call it marketing, pushing a proprietary standard, or any number of other money-speak catch phrases... it still amounts to Nvidia putting funding into gaming when basically no one else was interested.

    PhysX is nothing more than an added level of eye-candy. It changes nothing regarding gameplay. Nvidia's cards' CUDA ability also allows for shader effects such as bokeh filtering; it's an optional extra if you have a card that can process the workload whilst still maintaining a playable framerate. Some people would see that as an added-value feature, just as Nvidia has offered transparency antialiasing.

    So long as the additions are additions and not handicaps for the opposing team's consumers (i.e. the AA issue with Batman: AA), then I have no problem with implementing the effects, proprietary or not, and if it means the difference between a game being launched or not, that should be a no-brainer.
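    [Editor's note: for readers unfamiliar with what the engines named above (PhysX, Bullet, Havok) actually compute, here is a toy sketch of the per-frame job a physics engine does. It is a hypothetical minimal integrator, not any engine's real API; the function name, gravity constant, and time step are illustrative assumptions.]

    ```python
    # Toy rigid-body update: the kind of per-frame step a physics
    # engine performs for every simulated object. Hypothetical
    # minimal sketch, not any real engine's API.

    GRAVITY = -9.81  # m/s^2, acting on the vertical axis

    def step(pos, vel, dt):
        """Semi-implicit Euler: update velocity first, then position."""
        vel = vel + GRAVITY * dt
        pos = pos + vel * dt
        return pos, vel

    # Drop a body from 10 m and simulate one second at 60 steps/s.
    pos, vel = 10.0, 0.0
    for _ in range(60):
        pos, vel = step(pos, vel, 1.0 / 60.0)
    ```

    A real engine layers collision detection and constraint solving on top of this integration loop, which is why offloading it to the GPU (as PhysX does) mostly buys extra visual effects rather than different gameplay.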
     

