TechSpot

GeForce Articles

Metro: Last Light Tested, Benchmarked

When Metro 2033 was released in 2010, it helped raise the PC graphics bar by making good use of the latest DirectX 11 rendering technologies. Metro: Last Light follows in its predecessor's footsteps, using a heavily customized and improved version of the 4A Engine.

Furthermore, the developer has continued to cater to loyal PC gamers, who have considerably more power at their disposal than console gamers, by delivering a visually richer gaming experience as well as a benchmark tool for measuring your system's performance.

GeForce GTX 650 Ti Boost Review, SLI Performance Tested

The GTX 650 Ti was our favorite $100-$150 graphics card last year, as it thrashed the Radeon HD 7770, its direct competitor. Then last month, AMD decided to attack the $150 price point with its new HD 7790 GPU, and Nvidia's response didn't take long to arrive.

Just a week later, Nvidia officially countered by releasing the poorly named GeForce GTX 650 Ti Boost, now the third graphics card to carry the GTX 650 name. At $170, the GeForce GTX 650 Ti Boost sits between the Radeon HD 7790 and the 7850. In terms of performance, we actually expect the GTX 650 Ti Boost to be a lot faster than the GTX 650 Ti, even though it's based on the same GK106 GPU.

The History of the Modern Graphics Processor, Part 4: The Coming of General Purpose GPUs

With DX10's arrival, vertex and pixel shaders came to share a large degree of common functionality, so moving to a unified shader architecture eliminated a lot of unnecessary duplication of processing blocks. The first GPU to utilize this architecture was Nvidia's iconic G80.

Four years in development and $475 million produced a 681 million-transistor, 484mm² behemoth -- first as the 8800 GTX flagship and then with cards aimed at several segments. Aided by the new Coverage Sampling anti-aliasing (CSAA) algorithm, Nvidia saw the 8800 GTX demolish every single competitor in outright performance.

The History of the Modern Graphics Processor, Part 3: The Nvidia vs. ATI era begins

With the turn of the century, the graphics industry bore witness to further consolidation. Where 3dfx was once a byword for raw performance, its strength before its demise lay in its full-screen anti-aliasing image quality. By the time 2001 dawned, the PC graphics market consisted of a discrete card duopoly (Nvidia and ATI), with both of them, in addition to Intel, supplying the vast majority of integrated graphics chipsets.

Prior to the Voodoo 5’s arrival, ATI had announced the Radeon DDR as “the most powerful graphics processor ever designed for desktop PCs.” Previews of the card had already gone public on April 25, and only twenty-four hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader).

The History of the Modern Graphics Processor, Part 2: 3Dfx Voodoo, the game-changer

Launched in November 1996, 3Dfx's Voodoo Graphics was a 3D-only card that required a VGA pass-through cable from a separate 2D card to the Voodoo, which then connected to the display. Voodoo Graphics revolutionized personal computer graphics nearly overnight and rendered many competing designs obsolete, pushing a vast swathe of 2D-only graphics producers out of the market.

The 3D landscape in 1996 favored S3 with around 50% of the market. That was to change soon, however. It was estimated that 3Dfx accounted for 80-85% of the 3D accelerator market during the heyday of Voodoo's reign. Nvidia would later recover with its RIVA series and eventually land its greatest success with the first GeForce graphics card.

BioShock Infinite Tested, Benchmarked

With three years having passed since BioShock 2 and a new console generation on the horizon, BioShock Infinite has taken the opportunity to mix things up. Along with DX11 effects, folks playing on PC can look forward to higher resolution textures and a healthy range of customization over settings like anti-aliasing, texture detail and filtering, dynamic shadows, post-processing, and so on.

Our BioShock Infinite test comprises 24 DirectX 11 graphics card configurations from AMD and Nvidia covering a wide range of prices, from the affordable to the ultra-expensive.

SimCity Tested, Benchmarked

Normally when we benchmark a first-person shooter, finding a good portion of the game to test is simply a matter of playing through until we reach a suitably demanding section. But with SimCity, things were considerably more complex and time-consuming.

A city with few sims will see graphics cards such as the GeForce GTX Titan or GTX 680 deliver massive frame rates because they are not being capped by the CPU (yet). As with most simulation and strategy games, SimCity is CPU-dependent, and overclocking should result in a healthy boost if needed. More inside.

Tomb Raider Tested, Benchmarked

Although this year's Tomb Raider reboot made our latest list of the most anticipated PC games, I must admit that it was one of the games I was least looking forward to from a performance perspective, given the poor showing of previous titles in the series.

However, we were relieved to learn that Tomb Raider supports DirectX 11, which brings access to advanced rendering technologies such as depth of field, high-definition ambient occlusion, hardware tessellation, and super-sample anti-aliasing. Additionally, compared to the diluted console versions, the PC build offers better textures as well as AMD's TressFX real-time hair physics system.

Testing Nvidia's $1,000 Graphics Card: GeForce GTX Titan Review

The new GeForce GTX Titan carries a GK110 GPU with a transistor count that has more than doubled from the GTX 680's to a staggering 7.1 billion. The part has 50% to 75% more resources at its disposal, including 2688 stream processors (up 75%), 224 texture units (also up 75%) and 48 raster operation units (a healthy 50% boost).

It's worth noting that the estimated performance gain is "only" 25% to 50% because the Titan is clocked lower than the GTX 680. Given those expectations, it would be fair to assume that the Titan would be priced at roughly a 50% premium, but that's simply not the case. Nvidia is marketing the card as a hyper-fast solution for gamers with deep pockets, setting the MSRP at a whopping $1,000.

Crysis 3 Tested, Benchmarked

Crytek has given us another opportunity to hammer some hardware with the arrival of Crysis 3. The game is built with CryEngine 3, which has been updated with improved dynamic cloth and vegetation, better lighting and shadows, and plenty more.

Plus, PC gamers won't have to wait for graphical extras. Crysis 3 launched with high-resolution textures, DX11 support and plenty of customization options that set it apart from the diluted console builds. The result looks incredible and we get the feeling this will prove to be the game that folks who are heavily invested in multi-GPU setups have been waiting for. Here's hoping we aren't woefully disappointed.

Triple Monitor Gaming on a Budget: SLI vs. Crossfire vs. Single High-end GPU

Considering next-gen cards are still months away, we didn't expect to publish any more GPU reviews until the second quarter of 2013. However, we realized there was a gap in our current-gen coverage: triple-monitor gaming. In fact, it's been almost two years since we last stress-tested games at resolutions of up to 7680x1600.

We're going to mix things up a little this time. Instead of using each camp's ultra-pricey dual-GPU card (or the new $999 Titan), we're going to see how more affordable Crossfire and SLI setups handle triple-monitor gaming compared to today's single-GPU flagships.