TechSpot


The History of the Modern Graphics Processor, Part 4: The Coming of General Purpose GPUs

With DX10's arrival, vertex and pixel shaders came to share a large amount of common functionality, so moving to a unified shader architecture eliminated a lot of unnecessary duplication of processing blocks. The first GPU to utilize this architecture was Nvidia's iconic G80.

Four years in development and $475 million produced a 681 million-transistor, 484mm² behemoth -- first as the 8800 GTX flagship and then with cards aimed at several segments. Aided by the new Coverage Sampling anti-aliasing (CSAA) algorithm, Nvidia saw its GTX demolish every single competitor in outright performance.

The History of the Modern Graphics Processor, Part 3: The Nvidia vs. ATI era begins

With the turn of the century, the graphics industry bore witness to further consolidation. Where 3dfx was once a byword for raw performance, its strengths before its demise lay in its full-screen antialiasing image quality. By the time 2001 dawned, the PC graphics market consisted of a discrete card duopoly (Nvidia and ATI), with the two of them plus Intel supplying the vast majority of integrated graphics chipsets.

Prior to the Voodoo 5’s arrival, ATI had announced the Radeon DDR as “the most powerful graphics processor ever designed for desktop PCs.” Previews of the card had already gone public on April 25, and only twenty-four hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader).

The History of the Modern Graphics Processor, Part 2: 3Dfx Voodoo, the game-changer

Launched in November 1996, 3Dfx's Voodoo Graphics consisted of a 3D-only card that required a VGA pass-through cable from a separate 2D card to the Voodoo, which then connected to the display. Voodoo Graphics revolutionized personal computer graphics nearly overnight and rendered many other designs obsolete, including those of a vast swathe of 2D-only graphics producers.

The 3D landscape in 1996 favored S3, with around 50% of the market. That was soon to change, however: it was estimated that 3Dfx accounted for 80-85% of the 3D accelerator market during the heyday of Voodoo's reign. Nvidia would later recover with the RIVA series and eventually land its greatest success with the first GeForce graphics card.

BioShock Infinite Tested, Benchmarked

With three years having passed since BioShock 2 and the dawn of a new console generation on the horizon, BioShock Infinite has taken the opportunity to mix things up. Along with DX11 effects, folks playing on PC can look forward to higher resolution textures and a healthy range of customization over settings like anti-aliasing, texture detail and filtering, dynamic shadows, post-processing, and so on.

Our BioShock Infinite test comprises 24 DirectX 11 graphics card configurations from AMD and Nvidia, covering a wide range of prices from the affordable to the ultra-expensive.

SimCity Tested, Benchmarked

Normally when we benchmark a first-person shooter, finding a good portion of the game to test with is simply a matter of playing through until we reach a suitably demanding section. With SimCity, however, things were considerably more complex and time consuming.

A city with few sims will see graphics cards such as the GeForce GTX Titan or GTX 680 render massive frame rates because they are not (yet) being capped by the CPU. As with most simulation and strategy games, SimCity is CPU dependent, and overclocking should result in a healthy boost if needed. More inside.

Tomb Raider Tested, Benchmarked

Although this year's Tomb Raider reboot made our latest list of most anticipated PC games, I must admit that it was one of the games I was least looking forward to from a performance perspective because of previous titles' poor showing.

However, we were relieved to learn that Tomb Raider supports DirectX 11, which brings access to advanced rendering technologies such as depth of field, high-definition ambient occlusion, hardware tessellation, and super-sample anti-aliasing. Additionally, compared to the diluted console versions, the PC build offers better textures as well as AMD's TressFX real-time hair physics system.

Testing Nvidia's $1,000 Graphics Card: GeForce GTX Titan Review

The new GeForce GTX Titan carries a GK110 GPU whose transistor count has more than doubled from the GTX 680's to a staggering 7.1 billion. The part has 50% to 75% more resources at its disposal, including 2688 stream processors (up 75%), 224 texture units (also up 75%) and 48 raster operation units (a healthy 50% boost).

It's worth noting that the performance gain is estimated at "only" 25% to 50% because the Titan is clocked lower than the GTX 680. Given those expectations, it would be fair to assume that the Titan would be priced at roughly a 50% premium, but that's simply not the case. Nvidia is marketing the card as a hyper-fast solution for gamers with deep pockets, setting the MSRP at a whopping $1,000.
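The percentage gains quoted above can be sanity-checked against the GTX 680's published specifications (1536 stream processors, 128 texture units, 32 ROPs). A minimal Python sketch:

```python
# Compare the GTX Titan's resource counts against the GTX 680 baseline
# and print the percentage increase for each unit type.
gtx_680 = {"stream processors": 1536, "texture units": 128, "ROPs": 32}
titan = {"stream processors": 2688, "texture units": 224, "ROPs": 48}

for unit, baseline in gtx_680.items():
    gain = (titan[unit] - baseline) / baseline * 100
    print(f"{unit}: +{gain:.0f}%")

# stream processors: +75%
# texture units: +75%
# ROPs: +50%
```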

Crysis 3 Tested, Benchmarked

Crytek has given us another opportunity to hammer some hardware with the arrival of Crysis 3. The game is built with CryEngine 3, which has been updated with improved dynamic cloth and vegetation, better lighting and shadows, and plenty more.

Plus, PC gamers won't have to wait for graphical extras. Crysis 3 launched with high-resolution textures, DX11 support and plenty of customization options that set it apart from the diluted console builds. The result looks incredible, and we get the feeling this will prove to be the game that folks who are heavily invested in multi-GPU setups have been waiting for. Here's hoping we aren't woefully disappointed.

Triple Monitor Gaming on a Budget: SLI vs. Crossfire vs. Single High-end GPU

Considering next-gen cards are still months away, we didn't expect to bring any more GPU reviews until the second quarter of 2013. However, we realized there was a gap in our current-gen coverage: triple-monitor gaming. In fact, it's been almost two years since we last stress tested games at resolutions of up to 7680x1600.

We're going to mix things up a little this time. Instead of using each camp's ultra-pricey dual-GPU card (or the new $999 Titan), we're going to see how more affordable Crossfire and SLI setups handle triple-monitor gaming compared to today's single-GPU flagships.

Far Cry 3 Tested, Benchmarked

Like the original game, Far Cry 3 is set on a tropical island, this time found somewhere at the intersection of the Indian and Pacific Oceans. In typical TechSpot fashion, we'll be testing Far Cry 3's open world environment using 29 DirectX 11 graphics cards from AMD and Nvidia across all price ranges.

This new game is built using an advanced version of the Dunia engine called Dunia 2, which is said to feature new water rendering technology, a realistic weather system, advanced AI technology, a new animation system, realistic facial expressions, motion capture technology and global illumination -- many of which are made possible by the game's adoption of DirectX 11 and can only be experienced on the PC version.

The Best Graphics Cards: Full AMD and Nvidia GPU Comparison with Latest Drivers

After a busy year with numerous GPU releases, things had settled down for good by mid-September. And then AMD threw us a curve ball: its Catalyst 12.11 beta drivers delivered major performance gains in many popular games, such as Battlefield 3, Borderlands 2, Civilization V, Skyrim, Sleeping Dogs and StarCraft II. Around the same time, Nvidia released a beta driver of its own that claimed gains in several titles; it has since been replaced by the GeForce 310.61 update, which brought further performance enhancements.

With updated pricing and performance across the board, we figured it would be worth revisiting both companies' offerings to see where you should spend your hard-earned cash this holiday season and into early next year.

Call of Duty: Black Ops II Tested, Benchmarked

For Call of Duty fans, developer Treyarch just delivered an early Christmas present by releasing Black Ops II this week. As the ninth game in the Call of Duty franchise and the sequel to the 2010 game Black Ops, it's a title we are hoping will bring something meaningfully new to the series.

But as usual, our main concern from a performance perspective is the game engine, which has been slow to evolve over the years. The key changes this time around include a new technology called "reveal mapping" and HDR lighting. On paper, the upgrade also calls for a move to the DirectX 11 API for the PC version of the game, meaning PC gamers should enjoy better visuals than those playing the console versions.