Gigabyte GeForce GTX 570 Review

By Steven Walton on December 7, 2010, 8:14 AM
Nvidia's GeForce GTX 580 may be the current king of the hill, but that could change before the end of the year when AMD launches its new Radeon HD 6900 series. With the real high-end battle still to take place, Nvidia is not sitting around waiting for AMD to strike back. Instead, it is getting ready to release its second GF110-based product. The GeForce GTX 570 should be hitting shelves shortly, featuring specifications similar to those of the GTX 480 but at a lower price of around $350.

As the GeForce GTX 470 was to the GTX 480, the GTX 570 is a cut-down version of the recently released GTX 580. Thermals and operating efficiency will remain much the same, as will the features. What has changed is the core configuration, operating frequencies, and memory bus width, all of which have been slightly reduced.
The GeForce GTX 570 is likely targeting the yet-to-be-released AMD Radeon HD 6950, which we still know little about. For now, though, let's take a closer look at the GeForce GTX 570 we have in our hands and find out how well it performs. Read the complete review.




User Comments: 34

Steelhedgehog said:

Looks very sweet. I wish I had one.

SilverCider said:

Nvidia are being brave putting this out before AMD release their newcomers! Good on them, I say. Power draw is getting much better for Nvidia now XD

killerclown96 said:

I'm gobsmacked! This is my new Christmas present.

Guest said:

Seemed a bit biased. Why was there no comparison to a 460 SLI setup? Why does he keep saying the AMD 6900 series cards are 'weeks' away when it has been known for weeks that they are out on Dec 15th?

I'm glad Nvidia and AMD keep forcing each other to produce faster/cheaper cards. I own both and always base my purchasing decisions on which has the best bang for the buck on the day I need it. I buy in the just-under-$300 range, so the 570 is a bit high for me at the moment, but it looks like a card I might be interested in come the new year. I hope AMD beats it on price, if not performance, with the 6950 next week. Keep the rivalry alive.

Anyway, just posted because of the irritation factor. It seemed like the author might have been paid to leave out or fudge a few important details about price/performance-competitive setups and AMD's release date next week.

My two bits.

madboyv1, TechSpot Paladin, said:

Guest said:

Seemed a bit biased. Why was there no comparison to a 460 SLI setup? Why does he keep saying the AMD 6900 series cards are 'weeks' away when it has been known for weeks that they are out on Dec 15th?

The only solutions that are going to give the GTX 570 a run for its money are Crossfire or SLI setups using Radeon HD 6870 or GeForce GTX 460 graphics card configurations.

They may not have been benchmarked, but they are at least mentioned. It's only a matter of looking for another review with a similar test bed to make comparisons.

And the whole week/weeks thing is basically semantics. =p

TomSEA, TechSpot Chancellor, said:

I'm with Guest. I think GTX 460 SLI should be a standard in card comparisons these days. It's become one of the most popular configurations, if not the most popular, in the history of PC gaming builds. I've seen GTX 460 cards in the $150 range now, making two of them $50 cheaper than this 570. And I'll bet dollars to donuts that the 460 SLI rig provides better numbers.

Just sayin'...

Alster37 said:

Oh, I wish it was a little cheaper; might be a stretch. It is a very powerful card though.

Xero07 said:

Nice card. Starting off with a single-card-only review makes sense to me. The majority of video card owners still use single-card solutions over SLI or Crossfire despite their increase in popularity. SLI/Crossfire reviews can come later to help establish the best options for those who would consider such a route.

Cueto_99 said:

Xero07 said:

The majority of video card owners still use single-card solutions over SLI or Crossfire despite their increase in popularity.

It's true, although I believe this trend is changing with time. As you mention, these technologies (SLI/XFire) have increased in popularity, probably because most motherboards now have at least two PCI-Express x16 slots; in fact, I rarely see a motherboard that sports only one slot, and most of those are mini-ITX... With this in mind, not only enthusiasts but also mainstream, budget-minded gamers choose to upgrade by buying another mid-range card rather than spending a fortune on a top-of-the-line card...

About the 570... hell of a good card... If I had the money and my board was SLI compatible, it would be on my Christmas list... I'll have to stick with my pair of 4830s for some time...

Johny47 said:

Good review. Except maybe for the hotter temperatures while playing games for a long time, I'm still happy with my 470 and don't regret buying one about 3 weeks after release =P

dividebyzero, trainee n00b, said:

Seemed a bit biased. Why was there no comparison to a 460 SLI setup?

I'm with Guest. I think GTX 460 SLI should be a standard in card comparisons these days.

Mmm... probably because LGA775 is still the dominant socket for the majority of gamers. Nvidia chipsets aside, do you see many SLI-capable 965P, P35, P45, X38 and X48 boards around? And what percentage of AMD chipsets are SLI capable?

Like it or not, the majority of gamers still use one physical discrete card. I'm sure both TS and various other sites will in due course cater for the non-mainstream GPU variations.

It's become one of the most popular configurations, if not the most popular, in the history of PC gaming builds?

Because you run the same setup? The GTX 460 SLI still has a long, long way to go before it reaches the iconic status that the 8800GT/8800GTS (512MB) SLI enjoys.

Why does he keep saying the AMD 6900 series cards are 'weeks' away when it has been known for weeks that they are out on Dec 15th?

You mean where it says....

...or did you miss that due to mental bandwidth saturation before getting to page thirteen?

And I'll bet dollars to donuts that the 460 SLI rig provides better numbers.

Just sayin'...

Gaming is about smooth gameplay, not numbers, Tom. From Kyle Bennett's GTX 580 vs. GTX 460 SLI review:

In all of our gameplay testing, we were surprised how well GTX 460 1GB SLI keeps up with the more expensive GeForce GTX 580. While it comes close, and at times exceeds it in framerate, it doesn't deliver the same gameplay experience.

...Kyle's Just sayin'...

red1776, Omnipotent Ruler of the Universe, said:

Xero07 said:

The majority of video card owners still use single-card solutions over SLI or Crossfire despite their increase in popularity.

That's true. The last numbers I saw had Crossfire users at under 3%; triple and quad numbers were a fraction of a percent.

TomSEA, TechSpot Chancellor, said:

LOL...you're judge and jury now divide?

I'm asking for the 460 SLI to be included because it is a dominant build in today's PC gaming rigs. Why else would you see a number of websites (including the one you referenced) doing the "460 SLI vs. X card" reviews? Go look at the literally thousands of "I have a 460 SLI rig" postings in NewEgg and TigerDirect reviews, as well as tech forums throughout the Internet. The point being that you can go the SLI route and save yourself a considerable chunk of change for near-similar performance.

Which was quite emphatically pointed out in Kyle's review (which you conveniently neglected to also cut and paste):

"If you are on a budget, and want to get into the NV Surround and/or SLI game, there is no questioning that GeForce GTX 460 1GB SLI is the best value. It provides high framerates and incredible SLI scaling. We are continued to be impressed by GeForce GTX 460 1GB SLI."

As long as you can do the 460 SLI for 300 bucks and it throws up the numbers it does, there is no reason why it shouldn't be compared to the $350-and-up cards for the benefit of bargain hunters.

Staff
Steve said:

Tom, if we still had all the cards we would have included the setup; it's hard to keep so many graphics cards on hand.

[link]

That said, if you are really keen to work out how they stack up, most of the numbers from here will make for an apples-to-apples comparison for you.

dividebyzero, trainee n00b, said:

LOL...you're judge and jury now divide?

Simply pointing out the factual numbers, Tom.

I'm asking for the 460 SLI to be included because it is a dominant build in today's PC gaming rigs.

Obviously... that's why the GTX 460 makes up 1.32% of cards in the Steam HW survey. What percentage of that 1.32% do you think are SLI'ed?

Why else would you see a number of websites (including the one you referenced) doing the "460 SLI vs. X card" reviews?

Uh, maybe because it's a combination that can be used in a system build. Using that argument is somewhat flawed, considering the same websites also review GTX 480 SLI (here, here, here, here, here, here, here, and here, to show but a few, and not including the copious 3-way and 4-way SLI reviews)... or are these dominant builds also?

Go look at the literally thousands of "I have a 460 SLI rig" postings in NewEgg and TigerDirect reviews, as well as tech forums throughout the Internet.

Because, as we all know, these postings are very reliable. I'd probably also say, going by forums/tech sites, that of the (hypothetical) 30,000 GTX 480s built, at least 50,000 are in SLI/3-way/4-way SLI rigs.

BTW, Newegg and TD are showing a total of 1,264 combined reviews.

The point being that you can go the SLI route and save yourself a considerable chunk of change for near-similar performance.

No argument there, Tom. What I am pointing out is that a very large proportion of gamers are unable to use SLI.

Which was quite emphatically pointed out in Kyle's review (which you conveniently neglected to also cut and paste):

I didn't use it, Tom, because it isn't germane to the point I'm making. See above.

As long as you can do the 460 SLI for 300 bucks and it throws up the numbers it does, there is no reason why it shouldn't be compared to the $350-and-up cards for the benefit of bargain hunters.

Kind of cuts TS out of another review down the track, wouldn't you say?

If TS also included HD 5850/6850/5870/6870 CFX (and 3-way/4-way CFX where applicable) and GTS 450/GTX 470/480/570/580 SLI (and 3-way/4-way SLI where applicable), Techspot's Steve Walton and the editorial staff could save themselves considerable time by condensing all the reviews into one, thereby being able to devote the entire front page to the really pressing issues of ongoing piracy and infiltration by Eastern Europeans and the mainland Chinese?

TomSEA, TechSpot Chancellor, said:

Understood, Steve - thanks.

Guest said:

This just confirms the death of the GTX 480.

For $50 less, not only do you get the same or, in some cases, better performance with a GTX 570, but it also runs cooler and uses less power.

But I'm also waiting to see the next AMD cards for comparison.

Otherwise, I think I might just go with the 570. It's a much better value than the 580.

DokkRokken said:

Amazing performance for the dollar here. Now here's hoping some people start selling their 470s for these, so I can get a nice SLI/Surround setup going.

Relic, TechSpot Chancellor, said:

Good review, Steve & TS. I wish you guys had a few more variations to compare against, but it's understandable that keeping all those cards around and the time requirements aren't always feasible. Curious to see what the 6950/6970 have to offer next week.

Guest said:

This just confirms the death of the GTX 480.

My thoughts exactly. I guess Nvidia just wants to put the 4xx series to rest, and they are doing a great job.

TeamworkGuy2 said:

Nice review; good job, Steven and TS.

The 570 looks like a good card, a lot better than the 470, even a little better than the 480.

compu4 said:

I own Metro 2033, and the game most certainly has a built-in benchmark. It was included with the free Ranger DLC Pack. It is in the steamapps\common\metro2033 folder and is called Metro 2033 benchmark.exe.

Staff
Steve said:

I own Metro 2033, and the game most certainly has a built-in benchmark. It was included with the free Ranger DLC Pack. It is in the steamapps\common\metro2033 folder and is called Metro 2033 benchmark.exe.

It most certainly didn't when we began benchmarking with it. Our benchmarking brief for the game was written when we first started testing with it, and since then it gets copied and pasted into every graphics card review.

We will continue to test using Fraps rather than the benchmark that 4A Games added six months after release, as we prefer this method to canned benchmarks.
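
For anyone keen to reproduce this kind of Fraps-based analysis at home, here is a minimal Python sketch. It assumes a Fraps-style frametimes log: a CSV with one cumulative millisecond timestamp per frame. The filename and column layout are assumptions for illustration, not necessarily the exact format Fraps writes. It reports average FPS plus a "1% low" figure, which speaks to the smooth-gameplay point raised earlier in the thread.

    # Minimal sketch: average FPS and "1% low" from a Fraps-style frametimes log.
    # Assumes a CSV whose second column holds cumulative per-frame timestamps (ms);
    # the real Fraps column layout may differ.
    import csv

    def load_timestamps(path):
        """Read cumulative per-frame timestamps (in ms) from the log."""
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)  # skip the header row
            return [float(row[1]) for row in reader if row]

    def summarize(timestamps_ms):
        # Per-frame times are the deltas between consecutive timestamps.
        frametimes = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
        avg_fps = 1000.0 * len(frametimes) / (timestamps_ms[-1] - timestamps_ms[0])
        # "1% low": the frame rate implied by the slowest 1% of frames,
        # a rough proxy for how smooth the game actually feels.
        worst = sorted(frametimes)[int(len(frametimes) * 0.99):]
        one_pct_low = 1000.0 / (sum(worst) / len(worst))
        return avg_fps, one_pct_low

    avg, low = summarize(load_timestamps("metro2033 frametimes.csv"))
    print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")

Two cards can post similar averages while one delivers noticeably worse 1% lows, which is why frametime logging of this sort can tell you more than a canned benchmark's single average.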

dividebyzero, trainee n00b, said:

Curious to see what the 6950/6970 have to offer next week.

Judging by the collective mood swing and pricing leaks on the web, I'd say the refresh season is a re-run of the Evergreen vs. Fermi show. That is to say...

HD 6950 < GTX 570 < HD 6970 < GTX 580 < HD 6990 (somewhat late to the party: Q1 2011)

AMD-centric sites (i.e. Beyond3D etc.) now seem fixated on Cayman "winning" on die size and performance/mm², with less "GTX 580 killer" rhetoric.

Taking into account the first-cab-off-the-rank over-exuberant pricing (fwiw, OC.UK spilled the beans on pricing a week ago: ~£350 HD 6970 and ~£250 HD 6950) and factoring in the increased cost of the boards (2GB of 6GHz GDDR5), the original estimates* circulating on the net seem more plausible by the day. If the HD 6970 were the "GTX 580 killer" that many AMD-philes/fanboys were expecting, I don't think the launch date would have slipped from 22nd November (original NDA expiry date) to 15th December (present NDA expiry date).

* roughly 380-400mm² die size, 1536 shaders (1408 for HD 6950), 900MHz core (800 for HD 6950), 32 ROPs, 96 texture units, <225W TDP (<200W for HD 6950)

Andrek said:

So is this now the fastest graphics card?

Staff
Steve said:

So is this now the fastest graphics card?

Umm, no. That would be either the GeForce GTX 580 or the Radeon HD 5970.

fpsgamerJR62 said:

As much as I would like to do SLI, I can't because I have an AMD processor running on a board with an AMD chipset. The last time I checked, there were only two motherboard models left featuring an nForce 980a chipset capable of handling modern AM3 processors, and neither is available where I live. So single-GPU configs are more relevant to me.

captaincranky, TechSpot Addict, said:

I own Metro 2033, and the game most certainly has a built-in benchmark. It was included with the free Ranger DLC Pack. It is in the steamapps\common\metro2033 folder and is called Metro 2033 benchmark.exe.

I've seen "Metro 2033" referred to as "the worst coded game out there".

So here's my two-part question: is this true...?

If it is, then why is "Metro 2033" a benchmark? Shouldn't it be rewritten?

It does seem silly, forcing manufacturers to create hardware based on the ability to run poorly coded, bloated software.

Or perhaps the hardware makers should adopt this ability into their advertising: "Our VGAs are soooo fast, they'll even run this pig"?

Then we might not have every juvenile delinquent on the internet, posting the same tired crap wondering, "Will this supercomputer run Crysis"? (Meanwhile, each one is thinking that's never been said before).

(You can substitute "Metro 2033" into that last question. I'm sure the herd will catch on to using "Metro 2033" to say witty things about supercomputers eventually)... :yawn:

dividebyzero, trainee n00b, said:

A poorly coded game - or in Metro's case, basically a straightforward game that had advanced graphics features "tacked on" (tessellation, depth of field etc.) when the game was essentially finished* - often makes the best benchmark. A lack of optimization can then stress every component: the GPU, its interconnect (PCI bus) with the CPU, scheduling, the frame buffer in both vRAM and system RAM, and of course the resultant power draw/stability/heat production and dissipation from all the activity.

*Metro 2033 was developed by part of the team that worked on the original S.T.A.L.K.E.R. game. The latter's X-Ray engine is also a real handful for the majority of graphics cards, partly because X-Ray was never intended to handle or incorporate the amount of code that GSC managed to shoehorn into the later games (which are graphically very similar to Metro), and partly because of the sheer amount of shader and signal processing options available in the in-game settings. It's probably no coincidence that Metro's 4A game engine bears more than a passing similarity to the X-Ray engine.

A well-coded game, such as most that use the more "polished" Unreal engines, usually results in framerates so high that the graphics card is essentially waiting on system limitations (with high-end cards), or in a bottleneck of frame buffer, vRAM or core speed (with lesser-specced cards), leaving card components relatively idle during the frame rendering process.

Probably a lot more than you wanted to know (by about three paragraphs!), but I'm sure there are others out there wondering why certain games tend to be used as benchmarks/stress tests while others are not.

To answer the first question you posed... no and yes. The game can be played relatively successfully on lower graphical settings by a good percentage of gaming systems - it's not Minesweeper by any means, but it still offers a playable experience for many. The game is also not plagued by bugs that lead to lock-ups or crashes to desktop, nor broken story lines, which I would also consider a hallmark of a poorly coded game. However... adding in the image quality settings that were late additions to the game engine and then subjecting them to 4x MSAA will very quickly escalate the card's workrate.

If you're a horror-survival fan of post-apocalyptic worlds full of mutants, I'd give it a go... if you don't get enough of that sort of thing going to the grocery store, that is.

captaincranky, TechSpot Addict, said:

If you're a horror-survival fan of post-apocalyptic worlds full of mutants, I'd give it a go... if you don't get enough of that sort of thing going to the grocery store, that is.

"Mutant", is that any thing like a "crack head"?

As to "grocery store", no sweat. You just wait til the gunfire dies down, then make a run for it....

red1776, Omnipotent Ruler of the Universe, said:

A poorly coded game - or in Metro's case, basically a straightforward game that had advanced graphics features "tacked on"

I hate to have you expound further than your "three paragraphs", Chef, however you may be able to answer this. I use a pedestrian definition of "poorly coded": if it returns disproportionately bad performance for the hardware thrown at it, I call it poorly coded. When you said "tacked on" graphics features, what exactly is happening there to cause terrible performance? Is it that it's working with separate coding loop(s) for the advanced features, and it takes exponentially more resources than if they were in the same loop? I would really like to know, as I wouldn't know bad software code by looking at it, even if it bit me in the ***.

captaincranky, TechSpot Addict, said:

If it returns disproportionately bad performance for the hardware thrown at it, I call it poorly coded. When you said "tacked on" graphics features, what exactly is happening there to cause terrible performance? Is it that it's working with separate coding loop(s) for the advanced features, and it takes exponentially more resources than if they were in the same loop? I would really like to know, as I wouldn't know bad software code by looking at it, even if it bit me in the ***.

Just get a copy of Adobe Photoshop Elements 5, then compare it to versions 6 or later. This will familiarize you with what poorly coded software is all about.

PSE 5 will import photos into its organizer at about a 5:1 ratio over the later programs, which won't even fully generate thumbnails on the fly. Couple that with a**h*** s*** like face recognition, which, like many other features in this program, is just poured over the top of old code, like so much "adobe" mud. I wonder if that's where they got their name.

In any event, PSE is now well over a 1GB download, and is being programmed in some 3rd-world reform school.

If a game meets these basic criteria, then I'd probably brand it "poorly coded" also.

red1776, Omnipotent Ruler of the Universe, said:

Just get a copy of Adobe Photoshop Elements 5, then compare it to versions 6 or later. This will familiarize you with what poorly coded software is all about.

PSE 5 will import photos into its organizer at about a 5:1 ratio over the later programs, which won't even fully generate thumbnails on the fly. Couple that with a**h*** s*** like face recognition, which, like many other features in this program, is just poured over the top of old code, like so much "adobe" mud. I wonder if that's where they got their name.

In any event, PSE is now well over a 1GB download, and is being programmed in some 3rd-world reform school.

If a game meets these basic criteria, then I'd probably brand it "poorly coded" also.

I wondered about the Adobe programs. I purchased the Adobe Creative Suite for school and a writing position I landed, and noticed that InDesign CS5 takes 9 minutes to open on the college workstations, and several seconds to render a circle with a stroke and fill.

dividebyzero, trainee n00b, said:

I hate to have you expound further than your "three paragraphs", Chef, however you may be able to answer this. I use a pedestrian definition of "poorly coded": if it returns disproportionately bad performance for the hardware thrown at it, I call it poorly coded. When you said "tacked on" graphics features, what exactly is happening there to cause terrible performance? Is it that it's working with separate coding loop(s) for the advanced features, and it takes exponentially more resources than if they were in the same loop? I would really like to know, as I wouldn't know bad software code by looking at it, even if it bit me in the ***.

I'm no software coder either, but as a "for instance": field of view (FOV) in Metro is spread over three separate configuration files (plus an overarching .cfg file in AppData, from memory). I've had to alter each file when setting up customers' Eyefinity/Surround settings in the past, especially with 16:10 monitors; since patched, I presume. (A sketch of that sort of multi-file edit follows at the end of this post.)

In general I'd look at the overlays of advanced DoF over soft particles and an HDR-type scenario, all then subjected to multisampled AA. So not poorly coded per se, but poorly optimized for present hardware, given that the game's IQ settings options are fairly minimal, to say the least. A better option list for choosing IQ effects, rather than default/hidden values or basic switches* (AAA or 4x MSAA, for example), would go a long way to alleviating this, although the effects seem hardcoded into the game, hence the limited user settings.

To my way of thinking, I would categorize as "poorly coded" any game that is unplayable at the highest in-game settings by virtually every system in existence. The game is likely to be playable (with all bells and whistles) by a single-GPU card once we get to Southern Islands/Kepler on 28nm, about a year away... and that is not going to help sales of the game in 2010. Indeed, the game is usually found in the bargain bin primarily because the system requirements are high (my version lists an E8300/Phenom II X2 550 and GTX 260/HD 4870 as recommended) and in large part because of the negative press the game has received over its playability, not its content (which personally I think is very good for a linear shooter).

Crysis required a GPU generation built a year after the game's launch to be fully playable at HD (or better) level. I would hazard a guess that the game's iconic status is due more to its "unplayability" than its popularity in sales.

* Some of the graphical options built into Metro 2033 can be seen on page 4 of this interview with Oles Shishkovstov.
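
As a footnote to the point above about Metro's FOV living in several configuration files, here is a hypothetical Python sketch of the kind of multi-file edit described. The file locations and the r_base_fov key name are assumptions for illustration only; the game's actual files and keys may differ.

    # Hypothetical sketch: patch a FOV value across several config files in one pass.
    # Paths and the "r_base_fov" key are assumptions, not Metro's confirmed layout.
    import re
    from pathlib import Path

    CFG_FILES = [
        Path.home() / "AppData/Local/4A Games/Metro 2033/user.cfg",  # assumed location
        # ...plus whichever other config files carry their own FOV entries
    ]
    NEW_FOV = "85."

    def patch_fov(path, new_fov):
        """Rewrite any 'r_base_fov <value>' entry to use new_fov, in place."""
        if not path.exists():
            print(f"skipping missing file: {path}")
            return
        text = path.read_text()
        patched, count = re.subn(r"(r_base_fov\s+)\S+", rf"\g<1>{new_fov}", text)
        if count:
            path.write_text(patched)
            print(f"updated {count} FOV entries in {path}")

    for cfg in CFG_FILES:
        patch_fov(cfg, NEW_FOV)

Nothing fancy, but it shows why a single in-game FOV setting would have been friendlier than hand-editing the same value in three or four places.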

Guest said:

Just got one of these, very very happy. It replaces my 5850, which was OK; I skipped the shambles that was the 400 series. Back to Nvidia, very happy!

Running on an EX58-UD4P board

i7 chip

12GB

blah blah bits and pieces :)

This card has made my LED monitor really come to life: TV, games, movies, etc.!

Good to be back with Nvidia!!!!

