The History of the Modern Graphics Processor, Part 3: The Nvidia vs. ATI era begins

By Julio Franco
Apr 10, 2013
  1. cliffordcooley likes this.
  2. Patski

    Patski Newcomer, in training

    Have been enjoying this series of articles thoroughly. Very insightful as I was a young gamer in these times and took less of an interest in the technology and more in just playing games. Looking forward to reading the final one!
  3. highlander84

    highlander84 TechSpot Member Posts: 98   +20

    The pencil trick... Did that a few times... also with the AMD CPUs... the XP series I think... long time ago... but it did indeed work.
  4. TS-56336

    TS-56336 TechSpot Booster Posts: 571   +98

    ..And here it is, THE FUTURE.
  5. ShadowDeath

    ShadowDeath Newcomer, in training Posts: 51

    These articles have brought back some amazing memories. This was about the time I got into computers in high school. I still remember my Diamond Stealth Pro video card... I remember it vaguely... I also remember my jaw hitting the floor when nVidia bought out 3dFX....
  6. amstech

    amstech TechSpot Enthusiast Posts: 804   +201

    I remember the Radeon DDR and SDR very well. I bought an SDR; the version I got had a picture of a scorpion on the box. That's when ATI and Nvidia started their first official wars.
  7. JC713

    JC713 TechSpot Evangelist Posts: 6,717   +873

    Great read once again.
  8. For such an in-depth 4-part article, they really slouched on one of the most instrumental and disruptive chapters of the graphics industry. They watered it down so much they didn't even bother to suggest how powerful the PowerVR Kyro II was compared to the competition, instead just lumping it in with the GeForce 2 MX lineup and claiming it was only powerful enough to provide low-end performance. This is an outright falsehood. The Kyro II, of which the Hercules 4500 was a stellar seller, made up a good chunk of the low-cost gaming market for such a relatively "new" competitor, despite being an underdog and still carrying the baggage of earlier PowerVR parts, whose problems had required a specially built API just to get games running on the technology.

    The PowerVR Kyro, and most notably the Kyro II, completely shed the deficiencies PowerVR had in its previous-generation cards. Not only did it work with Direct3D and OpenGL games out of the box, without wrappers to "make" it work, it was the first card that was truly tile-based and genuinely efficient at its task. It was initially priced below the GeForce 2 MX series, and after Nvidia's price cuts it ended up costing about the same; what's worth noting, though, is that it performed roughly on par with a GeForce 2 GTS and in various scenarios managed to edge out the GeForce 2 Ultra. The Kyro II's biggest strength was that it excelled at high resolutions, avoiding the pitfalls of other 3D technology thanks to its efficient tile-based deferred rendering, which only had to work on and present what was actually visible. The other cards suffered massively as you increased the resolution; even their HyperZ tech and the various other methods of culling information that would never be seen could only do so much, and wasted a lot of resources doing it.
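    To give a rough picture of what that tile-based deferred rendering idea means in practice, here's a toy C++ sketch; the tile size, the axis-aligned "triangles" and every name in it are made up purely for illustration, and it's obviously nothing like how the actual PowerVR silicon was built:

        // Toy sketch of tile-based deferred rendering: bin primitives into small
        // screen tiles, resolve per-pixel visibility inside the tile, and only
        // then shade the single visible fragment per pixel before writing out.
        #include <algorithm>
        #include <array>
        #include <cstdint>
        #include <vector>

        struct Tri { float depth; uint32_t color; int x0, y0, x1, y1; }; // toy axis-aligned "triangle"

        constexpr int TILE = 32;           // e.g. a 32x32 pixel tile kept in fast on-chip memory
        constexpr int W = 640, H = 480;    // chosen as multiples of TILE for simplicity

        // framebuffer must hold W*H pixels.
        void renderTileBased(const std::vector<Tri>& scene, std::vector<uint32_t>& framebuffer) {
            for (int ty = 0; ty < H; ty += TILE) {
                for (int tx = 0; tx < W; tx += TILE) {
                    // 1. Bin: keep only primitives overlapping this tile.
                    std::vector<const Tri*> bin;
                    for (const Tri& t : scene)
                        if (t.x1 > tx && t.x0 < tx + TILE && t.y1 > ty && t.y0 < ty + TILE)
                            bin.push_back(&t);

                    // 2. Resolve visibility for the whole tile: nearest depth per pixel.
                    std::array<float, TILE * TILE> depth;
                    depth.fill(1e30f);
                    std::array<const Tri*, TILE * TILE> visible{};   // nullptr = background
                    for (const Tri* t : bin)
                        for (int y = std::max(ty, t->y0); y < std::min(ty + TILE, t->y1); ++y)
                            for (int x = std::max(tx, t->x0); x < std::min(tx + TILE, t->x1); ++x) {
                                int i = (y - ty) * TILE + (x - tx);
                                if (t->depth < depth[i]) { depth[i] = t->depth; visible[i] = t; }
                            }

                    // 3. Shade only what survived, then write the finished tile out once.
                    for (int y = 0; y < TILE; ++y)
                        for (int x = 0; x < TILE; ++x) {
                            const Tri* t = visible[y * TILE + x];
                            framebuffer[(ty + y) * W + (tx + x)] = t ? t->color : 0xFF000000u;
                        }
                }
            }
        }

    An immediate-mode renderer would instead shade every fragment of every triangle and let the depth buffer throw most of that work away afterwards, which is exactly the overdraw cost that grows with resolution.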

    Another notable advantage was that it was the first product to render internally at full 32-bit color even while operating in 16-bit mode for legacy games. At the time it was still common to play games that only offered 16-bit color depth, and here the Kyro II excelled, producing the best image quality in its class.

    Among other things, it was the first product to fully support working application/game profiles for applying and overriding per-game graphics settings. Additionally, it offered what was coined "free trilinear filtering" along with a very, VERY minimal hit for anisotropic filtering, something that was relatively new at the time and that cost other products a much larger share of their performance.

    The product raised a lot of hoopla over at Nvidia, which was determined to remain king in almost every market, yet found itself not just losing at the low end to what was considered a much slower, less resource-hungry and considerably less powerful product, but watching that product directly contend with its highest end. This resulted in Nvidia producing a set of slides to discredit PowerVR's product as best it could, material that was later judged to be largely false and nothing more than a PR stunt in bad taste.

    The true potential of the Kyro II was only held back by its lack of dedicated transform and lighting (T&L) hardware, which left that work to the CPU and kept the card a DX7-class GPU. A large group of newly won-over PowerVR users waited patiently for a DX8 successor to arrive, but ST Micro dismantled that idea.
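    To show roughly what "T&L on the CPU" means, here's a simplified C++ sketch of the per-vertex transform and diffuse-lighting work the host processor had to grind through every frame on a card without hardware T&L; the struct layout, matrix convention and single directional light are just assumptions made for the example:

        // Simplified per-vertex transform & lighting done on the CPU.
        #include <vector>

        struct Vec3 { float x, y, z; };
        struct Vertex { Vec3 pos, normal; Vec3 clipPos; float litIntensity; };

        // 4x4 row-major matrix times position (w assumed 1; perspective divide omitted).
        static Vec3 transform(const float m[16], const Vec3& p) {
            return { m[0] * p.x + m[1] * p.y + m[2]  * p.z + m[3],
                     m[4] * p.x + m[5] * p.y + m[6]  * p.z + m[7],
                     m[8] * p.x + m[9] * p.y + m[10] * p.z + m[11] };
        }

        void cpuTransformAndLight(std::vector<Vertex>& verts, const float mvp[16], const Vec3& lightDir) {
            for (Vertex& v : verts) {
                v.clipPos = transform(mvp, v.pos);            // transform: object space -> clip space
                float ndotl = v.normal.x * lightDir.x +       // lighting: simple diffuse (N . L) term
                              v.normal.y * lightDir.y +
                              v.normal.z * lightDir.z;
                v.litIntensity = ndotl > 0.0f ? ndotl : 0.0f;
            }
        }

    Cards with fixed-function hardware T&L, from the GeForce 256 onwards, did this per-vertex math on the GPU, so the loop above cost their CPUs essentially nothing.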

    For a product with only about a quarter of the raw power of a GeForce 2 Ultra, it went toe to toe with it quite often, properly brought game-profile creation to the market, and made both Nvidia and ATI nervous enough to jump-start better-designed, more efficient next-generation products. It's DEFINITELY not a product that deserves only an offhand mention and to be categorized as just a low-end budget card that had no impact on anything at all.
  9. GreasyMcGrimace

    GreasyMcGrimace Newcomer, in training

    My first video card was a 1 MB generic VGA. Then I got the Matrox Millennium. Starcraft flew on that. Then when that died I got a Voodoo 3000. Then an Nvidia TNT2. Then a Rage 128. Then a Radeon 9800 Pro. Half-Life 2 on that was incredible. Then a Radeon 4850. Now I've got an Nvidia 580. What's next???
  10. Geforcepat

    Geforcepat Newcomer, in training Posts: 60

    Thanks for the info.
  11. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +639

    Really? I don't actually see that claim in the article. The paragraph concerning the Kyro II reads:
    The price point suggests a card aimed at the mainstream (or lower) user, where a lower screen resolution is likely to have been employed, and as such it acquitted itself well against its competition (ultimate hardware's Kyro II review vs the MX400, here, using resolutions up to 1024x768).

    Various scenarios, I think, revolved around Serious Sam. From Anandtech's Kyro II review, for example:
    [benchmark chart from Anandtech's Kyro II review]

    Well, firstly, your post has rectified the first point, and secondly, the article did not say the Kyro was "just a low end budget card that didn't have any impact on anything at all"
    I'd hope that you would appreciate that 20,000 words aren't going to do justice to every decent graphics product, and every company, and every technology developed over the course of fifty years. The articles' main arc concentrates on how we got to the PC graphics landscape of today. If tile-based rendering had become the dominant form, then rest assured that VideoLogic and PowerVR would have more words devoted to them.
    cliffordcooley likes this.
     
  12. EEatGDL

    EEatGDL TechSpot Booster Posts: 256   +45

    In the same situation as Patski. And were those really 20 thousand words? I didn't notice; very good articles. I've read parts 1 and 3 in full, and I'll finish part 2 when I have a chance.

    Of course the article can't do justice to everyone; it's a huge market and many years, and for narrative purposes it can't stray much from the central story indicated by the title.
  13. Most of the initial reviews caught the Kyro II nowhere near its peak, it being a relatively new product. Having owned both products, I guarantee you that testing at initial release wasn't nearly as polished as things later became.

    But as I mentioned, CPU power was a massive factor in the Kyro II's overall performance versus the T&L-equipped competition; that was really the only significant consequence of the Kyro II's shortcomings.

    It's hard for reviews to cover a huge swath of games and benchmarks, and much of that performance improved significantly as PowerVR's driver development brought substantial gains over time, even though they were still relatively new to the game and tiny compared to the giants. As I said, it played ball very well with the GeForce 2 GTS quite often, and occasionally went toe to toe with the Ultra. It was relatively rare to find performance dropping down to GeForce 2 MX400 levels, though.

    The wording in the article leaves a little to be desired in terms of clarity; it reads as though the card held no significance in the market at all. And while I don't expect an in-depth analysis of each specific product, I'd say it deserves a bit more than it got.

    In fact I hope to see more PowerVR info showing up in the 4th article, considering that tile-based rendering did carry on. At least this article did mention where PowerVR went after Series 4 was dumped and Series 5 onwards was directed at portable devices.

    I'd post some of the old reviews and benchmark results showing the Kyro II's performance as it matured, but almost all of those old websites, images and data have vanished after a 10-year run. Like I said, the major performance impact came when DX8 games started to lean heavily on hardware T&L, effectively making it a requirement. It's just really unfortunate that Series 4 fell through and ST Micro dropped the ball, since PowerVR helped ratify the newer T&L/shader model for DX9 and was the first to demonstrate it, too.
  14. The AMD-ATI era... two of my favorite companies together, I never expected that back then...
  15. This article kind of disappointed me; it's not nearly as descriptive and unbiased as the previous two.

    "The 6800 GT generally bested the X800 Pro at $399"

    What?!?! No it wasn't. I specifically remember the GeForce 6800 GT being a little bit faster than the X800 Pro at any resolution; it was only with AA AND AF enabled that it fell behind in SOME games.

    I have a GeForce 6800 GT and Athlon XP 2800+ @ 2.5GHz with 1 GB so I know exactly how it performs...
  16. Mikymjr

    Mikymjr TechSpot Enthusiast Posts: 122   +7

    Interesting ^^ It's been a long while since I saw these cards again in these pictures =)
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +639

    Undoubtedly. The only problem was that a couple of months after the Kyro II launched (and the drivers offered a better experience), Nvidia and ATI also had more mature drivers...and of course, Nvidia also had the GeForce3 Ti 200 in the market:
    [GeForce3 Ti 200 benchmark charts]


    A lot of the old reviews and benchmarks are still available (you just need to know where to look), although many more reviews appeared in print media. The above charts are from the August 2001 roundup of GF3 cards at Anandtech.

    "Generally" is a generalization based on the likely usage pattern for a $400 graphics card. You dispute the use of "generally" and then say that the 6800GT was a "a little bit faster". I'd call that hair splitting. BTW, the $399 6800GT of course refers to the 256MB version of the card and not the 128MB should there be any confusion.
    a random look at 6800GT vs X800 PRO comparisons courtesy of Google
    Hot Hardware
    Hexus
    Trusted reviews
    cliffordcooley likes this.
  18. LOL, I wasn't reading it right. I thought it said the GeForce 6800 GT was generally bested BY the X800 Pro.

    whooops
     
  19. Great stuff! Can't wait for the next article where we go into the "modern era of graphics cards".
  20. Littleczr

    Littleczr TechSpot Booster Posts: 368   +68

    When 4K panels start to become popular, I think gamers will have to use SLI or Crossfire. A single card cannot keep up with the graphical demands.
  21. GreasyMcGrimace

    GreasyMcGrimace Newcomer, in training

    Don't you think those 2014-15 20nm cards with 64-bit ARM CPUs will handle 4K by then?
  22. Boilerhog146

    Boilerhog146 TechSpot Member Posts: 67

    Can't wait for the next installment. Had an old Trident ISA card in my hand today, lol, with my AMD K6/266, upgraded to a 32 MB All-in-Wonder I found for 90 bucks in a pawnshop... still have and use my 64 MB AIW... 1 gig P3 in my IBM Aptiva from 1998. Great read.
  23. Boilerhog146

    Boilerhog146 TechSpot Member Posts: 67

    My two 512 MB BFG Tech 6800 Ultras, with the t-shirt and pretty blue fan. With the exchange rate in Canada they cost me 1,299.00 plus tax, over 2,800 when all was said and done. Received them 4 days before the 7800 GTX hit announcement and shelves, ending paper launches for maybe the first time ever. Still have those.
  24. hahahanoobs

    hahahanoobs TechSpot Booster Posts: 942   +94

    "ATI bounced back with a $35.2 million profit in 2003 after posting a $47.5 million loss in 2002. A good chunk of this came from higher selling prices for the dominant 9800 and 9600 cards."

    To this day, AMD fanboys still don't get it. AMD does NOT prefer to be the "price/performance" guys. Look at the release of the 5870 as the most recent example. The card was great, nVIDIA was late, the infamous "wood screws holding the I/O plate to the PCB" Fermi GPU didn't live up to nVIDIA's expectations, and AMD capitalized on that with $500+ MSRPs for their flagship GPU. AMD lowers their prices when nVIDIA does well because they HAVE to if they want any revenue, even if it means lower profits.

    "Meanwhile, Nvidia retained 75% of the DirectX 9 value segment market, thanks to the popularity of the FX 5200."

    <3 My first video card! There's nothing like playing Doom 3 and Far Cry on Low Detail and no AA.
    *sniff*
  25. Learning a lot from these articles so thanks very much for posting them.

