Nvidia GeForce GTS 450, GeForce GTS 450 SLI and Palit GTS 450 Sonic Platinum Review

Julio Franco

Staff member
Nvidia is keen to point out that they are targeting gamers on a budget with the new GeForce GTS 450. For those wanting to upgrade to a relatively inexpensive DX11-capable graphics card and gaming on a monitor smaller than 24" (resolutions of 1280x1024 to 1680x1050), Nvidia says the GTS 450 will deliver on those fronts. As with the release of the GTX 460, card availability should be strong at launch, with factory overclocked versions of this GPU expected to be present from day one.

Read the full review at:
https://www.techspot.com/review/315-nvidia-palit-geforce-gts-450/

Please leave your feedback here. Thanks!
 
Well this isn't good. Someone correct me if I am wrong here...(Hi Chef):wave:
If this is shaping up like it appears, this all but guarantees that GPU prices are going to be sky high for a good while.
We have Nvidia still trying to get out a semblance of a complete VGA lineup. The GTX 465 is done (or at least makes absolutely no sense), and the 470 is rumored to be on its way out. That leaves the GTS 450, GTX 460 and GTX 480. The latter appeals to only a very small percentage of the market, and has its heat/power consumption challenges, whether real or perceived (personally I have built with them, and the card is a monster and works perfectly well). That leaves two cards to choose from Nvidia for the rest of the spectrum, and for the enthusiast segment, the GTX 460 is limited to 2x SLI at that.
Meanwhile, if the reports, rumors, and alleged leaked benches are accurate, Radeon is 30-60 days from releasing the Northern Islands 6000 series, which will lead performance-wise in every segment before Nvidia can get a full line of Fermi 1 non-OEM desktop VGAs assembled. This does not appear to be a scenario that is conducive to competitive pricing for quite some time to come... Rats!

**Before some visceral yahoo screams fanboy, try actually reading the content... it's about VGA pricing and competition, not brand preference. Thank you and have a great day :)**
 
:wave:Hi red

AMD and nVidia fanboys might be at each other's throats, but my guess is that relations at the corporate level are far more cordial.
As I postulated in an earlier thread, it seems that there is a carve-up of the graphics market taking place which will benefit both players. Rather than a head-to-head battle at every price point, I think that for the most part the product lines (performance) and pricing are going to continue to mesh for the foreseeable future (read: until 28nm).

AMD needs a higher average selling price (ASP) from the desktop graphics division to help cover any shortfall in CPU/workstation graphics (as well as meet repayments on the $2.4bn loan, I would imagine), while nvidia probably needs a period of stability to re-establish marketshare after the very much delayed introduction of Fermi. So we now have the situation where HD 5xxx series cards are still (for the most part) at or near launch price due to no direct competition, and GF100/104 cards are being priced to sell to 1. clear inventory, and 2. retain marketshare (which to a degree validates PhysX and, to a lesser degree, CUDA) and lay the groundwork for future releases, even if the profit line on current desktop cards is marginal... or possibly non-existent.

nvidia may not have a great range of SKUs in play - I thought that the GT 420 (OEM - already launched), GT 440, GTS 455 and GTX 475 would likely fill out the lineup for the next six months or so. What I've read so far about the GTS 450 and the rumoured performance/pricing for the HD 6000 series just reinforces my view that AMD and nvidia have managed to dovetail their product lines quite well ;)
 
Good review; performance-wise it seems to be in line with what was expected.

However, the dodging of one another in price/performance is definitely concerning as a consumer. I'm happy, though, that the GTX 460 seems to be creeping down in price, at least the 768MB versions. You can grab one for $155 on Newegg after MIR, which is pretty awesome. To me at least it doesn't make sense to even look at a 450 or 5770 when you can spend a little extra to get a much better card.
 
Hi Chef,
It certainly appears to be exactly the way you described it. I read the GTS 450 reviews here at TS and Tom's, and the 450 does not make much sense unless that's true. They could have easily unlocked another cluster or two, clocked it up 150MHz to sit right between the 5770 and the 460, and taken ATI out of the mid range.
Not good when you largely build for the "I want to play Crysis 2 but only have $700 to spend" crowd.
You think the same hand holding will continue if the NI series is the 25-30% improvement that rumor has it?
 
You think the same hand holding will continue if the NI series is the 25-30% improvement that rumor has it?
Definitely. I don't think nvidia has any more cards up its sleeve (pun intended) - or rather architecture. NI will have its time in the spotlight since the series is effectively up against yesterday's news - much the same scenario as the general hoop-la when the GTX 460 launched and AMD had no immediate answer on hand.

From a business standpoint it probably makes sense. If both camps released a series (or SKU) at the same time (and the same price/perf point), then the consumer makes a choice based on whatever empirical (or emotive) values and then has the card until the next μArch or series becomes available. Whereas if both camps stagger their releases and dovetail their ranges to meet every price point, they stand to gain a sizeable percentage on overall sales simply through constant card releases and a card-for-every-budget marketing mentality that promises the new next-best-thing for only $19.95 more than the card you already have... and who wouldn't want an extra 1, 2 or 3 fps (or to save a hypothetical $10 a year on power) while (re)playing Crysis?

I think the days of the showdown at high noon (8800 GTX/Ultra v HD 2900 XT, 9800 GTX v HD 4850 et al) will be put on hold while both companies consolidate their positions - AMD probably needs income (and higher ASPs) to tide it over while Intel's Sandy Bridge runs roughshod over the mainstream CPU market, and nvidia needs to keep its branding front-and-centre in the mainstream/enthusiast gaming market's eye - since the entry level graphics card (OEMs excepted) could largely be extinct once on-die CPU graphics become more widely adopted, and the HPC and workstation sales can't keep nvidia afloat indefinitely.

This of course is all conjecture on my part... and I'll likely be proved wrong when nvidia launches the GTX 499.99 a day after AMD launches the HD 6999.
They could have easily unlocked another cluster or two, clocked it up 150MHz to sit right between the 5770 and the 460, and taken ATI out of the mid range.
Under normal circumstances I think the GTS 450 would have been clocked close to 900MHz (core), and would have temporarily achieved exactly what you've proposed. It would also have meant AMD slashing prices on the HD 5770 directly (not a biggie, since it would clear inventory for its replacement 6000 series card) and the 5750 below that. You would then have absurdly (in the companies' eyes) cheap cards that could be Crossfired/SLI'ed to give a much better price/perf ratio than the upper mainstream/enthusiast single card offerings, which would likely necessitate price cuts in that sector too - great for the consumer, not great for NVDA's shareholders and Mubadala.
 
But for now it seems inevitable that nVidia will not be able to stay a major player in the longer run, and even considering their much touted HPC strategy, I don't see them penetrating that segment much either. Whether we like it or not, the world is run by the x86 architecture (mostly ;)), and its merging with the on-die graphics core means just one thing: whoever out there doesn't have a license to build x86 CPUs is doomed (i.e. in the graphics arena). I am reasonably sure that a company like nVidia cannot stay afloat only selling high end graphics solutions; the CGS in this scenario would simply kill them.

However, there is an opportunity out there for nVidia as well, and I think that lies in the mobile space. If they can deliver the right product at the right price point with competitive features and performance, they may well last and even prosper, because frankly most of the graphics in the mobile space are pathetic, to put it mildly.
 
You guys have done a really nice job with these recent video card reviews.

Even for a budget builder, I don't see why anyone would really want one of these when you could have a 460 for a few bucks more.
 
Please phase out those 480 & 475. They demand a nuclear power plant to operate, with lots of heat wastage. You can even boil eggs on that 480 monster. No need to buy expensive heaters now; simply turn on the 480 and it will set your house on fire :p

I am looking for a great price and performance card, which is the 5850. Wish I could get one below $200. Waiting for the price drop, after which I will buy one for myself, unless they come up with the 32 or 28nm process technology first.
 
I think some DX10 cards around that performance level should have been included, like the 9800 GTX and the GTS 250.

Good review, nonetheless.
 
@ruzveh I've used a GTX 470 and 480. Your statement is one of the most over-exaggerated things I've ever seen. They don't run nearly as hot as people claim they do. Just so you know, a GTX 460 1GB is just as good as the HD 5850 in terms of price-performance. Also, you can't boil eggs on a GTX 480. Boiling eggs requires water.
 
@Archean
I would beg to differ on a couple of (or three) points.
1. The APU (or CPU w/IGP) is not likely to threaten the gaming sector for the foreseeable future. A boon for the netbook, laptop, HTPC and “light” or “casual” gaming sectors, but GPUs still require shader pipelines for higher resolutions (1080p is fast becoming an industry standard), which leads to higher power consumption, higher transistor density and larger dies. Another point to note: Intel’s record with graphics drivers.
2. Workstation marketshare
Figures from the previous link
1.3m shipped workstation boards x 0.875 nvidia market share = 1,137,500 SKUs.
Quadro price range: ~$120 (Quadro FX 370 - budget) to $5000 (Quadro 6000 - professional), with an average selling price that exceeds most (if not all, barring triple/quad card setups) desktop graphics solutions. That's not including the Quadro Plex 7000 @ $14,500 (yet to launch) or module-based solutions such as the EVGA Quadro Plex 1000.

Point to note: whereas CUDA is now widely adopted, AMD is reliant upon OpenCL uptake (slow), Linux and OpenGL (less than stellar driver performance, a seeming lack of documentation with the Stream SDK, and long-standing bugged core functions such as clBuildProgram(), as an example). The ability to port CUDA to OpenCL, on the other hand, seems to be reasonably well established and relatively trouble-free. I’m not sure about the worldwide situation, but in this country, any person undergoing medical imaging (CT, MRI, X-ray radiography etc.), auditory mapping etc. likely has their results compiled and graphically rendered using nvidia hardware and CUDA (or in some cases Red Hat Linux) based programming.

3. HPC. If you believe that parallel computing is not the future, then GPGPU could, I suppose, be consigned to the margins. It would seem that DARPA and a few (new) supercomputer builders would beg to differ.
Note the SKU costs: Tesla C2050 ($2500), C2070 ($3999), S2050 ($12995), S2070 ($18995) - not a bad markup for what are essentially GTX 465/470-specced GF100 GPUs.

Until nvidia gets a significant sales win (as opposed to a design win) with Tegra in the smartphone market - or the SoCs achieve a wider audience - I definitely wouldn't see nvidia leaving the compute + software arena. They may (in my opinion) be squeezed out of entry level desktop graphics - but only when a significant percentage of OEM systems ship with an APU/CPU+IGP - but a shift in strategy or an evolution in computing doesn't necessitate the death of a company (remember when IBM sold desktop PCs?).

If you limit the scope of proceedings to desktop gaming (since video encode/playback should be well within the remit of Sandy Bridge/AMD Fusion), then the future could be somewhat more grim if the scenario you envision comes to pass. Like it or not, nvidia's TWIMTBP is still the pre-eminent game development program. If nvidia exits desktop gaming cards, then the program exits too. There is no way in hell that AMD's Gaming Evolved fills the gap - in fact I wouldn't be overly surprised if the program gets quietly buried like its predecessor, so that every new game is playable on the all-new, all-singing, all-dancing APU (read: console port)... but requires just that teeny, tiny bit more graphics horsepower than Intel graphics can supply for good gameplay.
Software has never been AMD's (and ATI's) strong suit. I most certainly wouldn't see them making a strong funding commitment to what is in effect a niche market (mainstream/enthusiast PC gaming). History would tend to back that view up, I think.

@ruzveh
Thanks for the laugh. It always makes me smile when someone invokes Poe's Law and parodies the non-tech-savvy. Keep up the good work.
 
Sorry for the delayed response; sometimes I do not get alerts from the threads I've commented on/subscribed to earlier (which may even result in me not responding at all to some :( ).

1. I think I've commented somewhere that if SB processors are able to provide graphics performance near/around that of 5500 series GPUs for now, that will for sure wipe nVidia out of that segment of the market to start with, and even current Intel solutions provide reasonable enough performance when it comes to video playback. Now, considering the above scenario, coupled with the pace of development/evolution in this industry, I think by 2015 (give or take a year) both Intel and AMD may be able to provide CPUs with a graphics core that performs on par or a bit better at the mid-mainstream level, and that would have a huge knock-on effect on nVidia.

2. All true, but can the workstation market alone provide enough revenue for nVidia to invest enough in R&D to stay competitive?

3. Let's consider that GPGPU is the future; that puts nVidia even more at peril, because... consider the costs of porting all x86 software to their platform, and big companies (in general you may even consider the whole corporate sector) do not like to invest heavily in new technologies that easily, as opposed to something which works for them. We hear lots of fancy talk about 'change/change management etc.' but that is all it is; even if it happens, it happens at a snail's pace.

4. I only discussed in my earlier comments situations which can put nVidia in such difficult positions. Personally, I would love them to stay, because if they are not around we will all lose in a big way; e.g. we've already seen why 5xxx series prices never came down by much.

5. In the mobile space, I think it is an open field for now. If nVidia can offer something good/competitive and can get a foothold with Tegra (at least to start with), who knows where they may end up.

Lastly, nothing is definitive in the world we live in. Remember who invented the Walkman? Sony... and who was the biggest beneficiary of this idea/invention? Apple... Who brought us the best phones earlier in the mobile arena? Nokia... and look where they are now, beaten up and bruised pretty badly by all the others, including Apple.
 
I think I've commented somewhere that if SB processors are able to provide graphic performance of near/around 5500 series GPUs for now, that will for sure wipeout nVidia from that segment of the market to start with.
That rather depends on Sandy Bridge's uptake. Much as Conroe (C2D/C2Q) became the "must have" CPU of its generation, how many people still run a Pentium D... or worse... in mid-2010? The entry-level discrete graphics segment is bound to disappear - that's a given - but the timeframe largely depends on aggressive promotion (an Intel strength) and competitive pricing (no need until AMD offers a threat to marketshare).
and even current Intel solutions are providing reasonable enough performance when it comes to video playback. .
Which is pretty much all that's promised by SB. With the on-die video transcode/encode block, video should not represent a problem (drivers excepted*¹ - see below). Gaming, on the other hand, is still going to require some semblance of DX10/10.1 (doubtful on SB) or DX11 (non-existent on SB).
Now, consider the above scenario, coupled with the pace of development/evolution in this industry, I think by 2015 (give or take a year), both Intel and AMD may be able to provide CPUs with graphics core which performs on-par or bit better in the mid-mainstream level, that would have huge knock on effect on nVidia.
Possible, given the right (or wrong) circumstances. Intel should be on Rockwell at 16nm (AMD will probably be due to release Bulldozer on 32nm... :wave: hi Red) and the power draw shouldn't be a major drawback if you were aiming at GTX 460/HD 5850 performance. That scenario presupposes that gaming will not advance to any great degree, for this level of performance to be competitive - which would effectively rule out ray tracing and/or more intensive use of antialiasing, tessellation, ambient occlusion etc.
2. All true, but can Workstation market alone provide enough revenue to nVidia to invest enough in R&D to stay competitive ?
Depends on how you look at it. nvidia has for most of its existence turned a profit. The company carries no debt as far as I'm aware. Most financially minded tech analysts seem to think that nvidia could continue to lose the ~US$100m they were down this quarter, every quarter, for a number of years before it seriously impacted the company.
R&D is not where nvidia's problems lie. The professional arena is much slower to change than the consumer one. nvidia's marketshare in workstation graphics is not likely to alter significantly for some years, since the product is acceptable and the support is highly rated. WS sales would more than cover R&D.
I would say nvidia's problem is one of inflexibility in management: the stubborn adherence to the single large, monolithic die, and an unwillingness to explore a more flexible approach. nvidia's R&D, and especially their engineers, must be top notch to have achieved what they have given the complexity, architecture and die-size parameters.
3.Lets consider GPGPU is the future, that put nVidia even more at peril, because ...... consider the costs of porting all x86 software to their platform, and big companies (in general you may even consider all corporate sector) do not like to invest heavily in new technologies that easily as opposed to something which works for them. We hear lots of fancy talk about 'change/change management etc.' but that is all what it is, even if it happens, it happens at snails pace.
My point exactly re the workstation scenario. There is no particular reason why OpenCL's uptake should be so slow... other than that CUDA is so widespread. GPGPU is obviously in the future, but it's not an either/or situation, since GPGPU is built more as an adjunct - multiple parallel simple instructions with little vectorization. It's this vectorization (AVX, SSE5 etc.) that is still going to have to be utilized on a true "core" processor (IMO).

4. I only discussed in my earlier comments situations which can put nVidia in such difficult positions, personally, I would love them to stay because if they are not around we will all loose in a big way, e.g. we've already seen why 5xxx series prices never came down by much.
nvidia is definitely in a difficult position. Regardless of that, I doubt you'll see any serious pricing competition for the foreseeable future. Either...
The status quo continues (price/performance stratification): AMD and nvidia set prices to suit themselves, and each other by extrapolation.
nvidia lose desktop marketshare and diversify (software, pro market etc.) - AMD set prices to suit themselves.
nvidia lose complete marketshare... the company is sold piecemeal, Intel buys nvidia's remaining IP*¹ and offers contracts to the better nvidia engineers. Back to a two-horse race... Intel CPU + Intel fabs + nvidia graphics, IP and drivers + Intel money v AMD + ATI graphics + GloFo. Intel and AMD set prices to suit themselves, assuming that process technology slows once transistor density/size runs headlong into the laws of physics at 11-16nm (and both chipmakers/foundries can successfully transition to EUV etc.)
5. In mobile space, I think it is open field for now, if nVidia can offer something good/competitive and can get a foothold with Tegra (at least to start with) who knows where they may end up.
nvidia will undoubtedly be selling in that space. My point is that they still need the marketplace win, which has not materialised as of now (Boxee?... PSP2?... PlayStation 4?... all touted at one stage).
Lastly, nothing is definitive in the world we live in, remember who invented the Walkman? Sony ........ and who was the biggest beneficiary from this idea/invention ? Apple .......... Who brought us the best phones earlier in the mobile arena ? Nokia ...... and look where they are now, beaten up and bruised pretty badly by all others including Apple.
Quite so. But then, I fully expect Sony and Nokia to still be alive and kicking in 2015 as well.
 
That rather depends on Sandy Bridge's uptake. Much as Conroe (C2D/C2Q) became the "must have" CPU of its generation, how many people still run a Pentium D... or worse... in mid-2010? The entry-level discrete graphics segment is bound to disappear - that's a given - but the timeframe largely depends on aggressive promotion (an Intel strength) and competitive pricing (no need until AMD offers a threat to marketshare).

Fair point, but most such users are either corporate users (I have a couple of Pentium Ds running in my own department at work) or people who use their computers for general tasks or just for email/surfing. To the second part of this argument: as most of the Pentium Ds are probably running XP (which is nearing EOL anyway), if and when the corporate sector decides to shift to Win 7, I think Intel should find significant uptake of Sandy Bridge CPUs there.

On the mobile side, it will provide adequate performance for all tasks including HD video playback, hence the need for lower end graphics solutions is just about in its death throes right now no matter how you look at it (basically we agree on this point but probably may differ on timing).

Gaming on the other hand is still going to require some semblance of DX10/10.1 (doubtful on SB) or DX11 (non-existant on SB)

The gaming crowd doesn't seem to be the target for now, IMO, so anyone who does even casual gaming would go for a discrete solution.

The status quo continues (price/performance stratification) AMD and nvidia set prices to suit themselves and each other by extrapolation

Spot on, but just to add a bit of a twist: this status quo actually looks more like fixed pricing/segmentation to me. Maybe it's inadvertent... but remember what happened with regard to RAM pricing?

My point is that they still need the marketplace win which has not materialised as of now (Boxee?...PSP2?..Playstation 4?...all touted at one stage)

They lost Boxee, which I am sure you know already. :(

Quite so. But then, I fully expect Sony and Nokia to still be alive and kicking in 2015 as well

They will, unless they totally drop the ball... they got a new CEO, so let's see what changes he brings. The last Nokia I used was the E65; it served me well and was very good in general, except for one thing... Symbian OS, which would crawl to a snail's pace for no apparent reason at times.

I wonder if they are thinking about bringing out phones with other OSes as well, although I don't see that happening. :rolleyes:

Edit:
the power draw shouldn't be a major drawback if you were aiming at GTX 460/HD 5850 performance. That scenario presupposes that gaming will not advance to any great degree, for this level of performance to be competitive - which would effectively rule out ray tracing and/or more intensive use of antialiasing, tessellation, ambient occlusion etc.

It is a moving but not an elusive goal post. Look at it this way: nVidia rebranded the 8xxx series GPUs three times over without significant hardware redesign; now if something similar happens again, the gap wouldn't open up as fast as we may otherwise think. One major advantage for Intel is that they have delivered on their Tick-Tock model; if they perhaps adopt something similar with regard to their IGP development, things could get interesting. Anyway, it's all in the future, and at this moment in time, at best we can only guess.
 
I wouldn't mind trying two GTS 450s in SLI. Sounds pretty good. I'd definitely want a good motherboard, but it seems to do fine on a decent power supply.
 
Wow, very nice article... regardless, though, I think I am a loyal Radeon customer... however, Nvidia does have its good sides.
 