Alleged ATI Radeon HD 5750 leaked, benchmarked

October 2, 2009, 6:02 PM
Just over a week after AMD unleashed its powerful ATI Radeon HD 5870 to the world and subsequently the 5850, a forum member at Mymypc.com has posted several images and benchmark numbers of the upcoming HD 5750. Codenamed Juniper LE, this is AMD's first mainstream graphics processor in the ATI Radeon HD 5000 family and is reportedly set to replace the once high-end Radeon HD 4870 with a $150 price tag.


The alleged reference model has a relatively short board and sports a dual-slot design with a teardrop-shaped cooling system slapped on it. Like the 5800 series, it features DirectX 11 support, GDDR5 memory (but on a 128-bit bus and clocked at 1150MHz), and the same port arrangement -- two DVI, one DisplayPort, and one HDMI. The card is also equipped with 1120 stream processors and a 700MHz core clock, according to GPU-Z.
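
For a rough sense of what those leaked memory specs translate to, here is a back-of-the-envelope bandwidth calculation; it assumes the reported 1150MHz GDDR5 clock and 128-bit bus hold up, so treat it as a sketch rather than an official figure:

```python
# Back-of-the-envelope memory bandwidth from the leaked specs (assumed figures, not official).
bus_width_bits = 128                 # reported HD 5750 memory bus width
memory_clock_mhz = 1150              # reported GDDR5 clock
effective_gbps_per_pin = memory_clock_mhz * 4 / 1000   # GDDR5 transfers 4 bits per clock per pin -> 4.6 Gbps

bandwidth_gb_s = effective_gbps_per_pin * bus_width_bits / 8
print(f"Estimated memory bandwidth: {bandwidth_gb_s:.1f} GB/s")   # ~73.6 GB/s

# The same memory on a 256-bit bus, as on the 5800 series, would simply double that figure.
```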

In terms of performance, the new Radeon HD 5750 manages to beat both the HD 4850 and the GTS 250 when running 3DMark06 and 3DMark Vantage's synthetic tests -- although by small margins. We'll of course have to reserve judgment until we can put this card to the test in real-world scenarios, but it seems the Radeon HD 5750 could give Nvidia's GeForce GTS 250 a run for its money when it launches (presumably) later this month.




User Comments: 35

BlindObject said:

Ew, it's ugly. And 128bit? Seriously? I'm sticking to Nvidia. Can't wait for my GTX =)

red1776, Omnipotent Ruler of the Universe, said:

Ew, it's ugly. And 128bit? Seriously? I'm sticking to Nvidia. Can't wait for my GTX =)

Yes, I usually choose my graphics cards by how pretty they are .....geez.

Captain828 said:

Seriously now... it's ugly as hell.

Pricing and performance would be interesting factors here. Also, a third-party single-slot design would make this card perfect for a more powerful HTPC.

Reloader2 said:

Nvidia fanboy eh?

Staff
Steve said:

First of all, who cares if it has a 128-bit memory bus? What does that mean? Yes, traditionally 128-bit graphics cards have been bandwidth challenged, but that does not mean the Radeon HD 5750 is slow, and it certainly does not mean it's bad value. Furthermore, it's an ATI reference card, so expect the usual suspects to come up with their own designs.

BMfan said:

[-Steve-] said:

First of all, who cares if it has a 128-bit memory bus? What does that mean? Yes, traditionally 128-bit graphics cards have been bandwidth challenged, but that does not mean the Radeon HD 5750 is slow, and it certainly does not mean it's bad value. Furthermore, it's an ATI reference card, so expect the usual suspects to come up with their own designs.

Was just going to say that.

I use the 4770 and there is nothing wrong with that card; I get good frame rates at 1920x1080 with its 128-bit bus.

With ATI using a 128-bit bus width, you have to ask why Nvidia needs 256-bit.

Badfinger said:

Short PCB wise, what's the best card right now?

My 8800GTS-512 (G92) is still pretty good, but it just barely fit.

I know a 9800-GTX for instance was too long.

I have plenty of PSU headroom with a good 650W though; it's just the size.

BrownPaper said:

This card's heatsink looks very similar to the new XFX HD 4870 cooler: [link]

BMfan said:

It is the same, just a sticker on the cover

Guest said:

'Ew, it's ugly. And 128bit? Seriously? I'm sticking to Nvidia. Can't wait for my GTX =)'

Yeah, Heaven forbid that you install an 'ugly' piece of hardware into a box that'll just sit in there for a good year or so!

freythman said:

God that thing is ugly

---agissi---, TechSpot Paladin, said:

Apparently the stickers matter to some people. Maybe if you end users had a clue about the circuitry and engineering design you'd appreciate it for what it is - not the sticker or color of the fan.

red1776, Omnipotent Ruler of the Universe, said:

Apparently the stickers matter to some people. Maybe if you end users had a clue about the circuitry and engineering design you'd appreciate it for what it is - not the sticker or color of the fan.

what agissi said ....geeeez LOL

Technochicken, TechSpot Paladin, said:

Who will be seeing the sticker anyway? Won't it be pointing down? I might look to get one of these when it is released. Does anyone know the power consumption figures for this card? I would assume it is less than a 4870 due to the 40nm architecture.

red1776, Omnipotent Ruler of the Universe, said:

....on second thought....it does look like one of those big insects that skitters across the floor when you turn on the kitchen light in the middle of the night to raid the last of the lasagna....not the real lasagna, no no, god forbid she should take the time to make anything homemade! So while you're in there trying to kill it with the broom, she's hollering at you for raising too much racket!....maybe she would prefer the damn bugs!....and you think it's dead...but oh nooooo! It just plays dead till you lift the broom, then goes on its merry way to get more of its friends to come back and crawl on you while you sleep!!! So you get the dust pan to smash the thing and the wife just hollers more! Like you're doing it for the fun factor! .......still....it does seem to get decent frame rates....I wonder how it crossfires?

BMfan said:

AHHH lasagna, I know what I am making for supper tonight.

On the thing's looks, didn't you guys know that a lot of people put their PCs on a flat table so they can look at their components while they play games?

BlindObject said:

With a nice big window on my case, along with internal lighting to show off the goods, I do care if it's ugly or not. Yeah yeah, there's a lot more to it than meets the eye, but that doesn't change the fact that it's ugly. And nope, 128-bit can still be pretty fast, but more would make a big difference. I want to upgrade my card in every way possible if I'm dropping 200-300 dollars. And looks, for me, is certainly an upgrade. =)

BMfan said:

With the card in your case you can't see the cooler properly unless you have a big window and it's above your head height.

red1776, Omnipotent Ruler of the Universe, said:

I'm just giving ya a hard time, Blind. I have a clear acrylic case, FCS; I would take ugly if it performed well though. With a $200-300 budget, this would not be your card anyway -- you wouldn't find a 128-bit bus on a card in that price range.

Rage_3K_Moiz, Sith Lord, said:

I don't really care if it looks ugly; I would probably miss a headshot if I was playing a game and staring at my PC from time to time! =P

Darth Shiv said:

BlindObject said:

Ew, it's ugly. And 128bit? Seriously? I'm sticking to Nvidia. Can't wait for my GTX =)

Honestly how many mainstream gfx card manufacturers actually use the reference heatsink design? There are always plenty of alternate arrangements.

Guest said:

From what I've been seeing, the 5750 renders approximately the same as a 4870, while the 5770 renders approximately the same as the 4890. The 4870 and 4890 both have 246-bit memory controllers, so there's nothing wrong with these cards having 128-bit controllers; it just makes them a cheaper version of the 4870 and 4890, and with the new process they will be less power hungry than the 4870 and 4890 and be DX11 compatible. I'm sure it won't be too long before you start seeing rebates or price drops as we get closer to seeing GTX 3x0 boards being released by Nvidia, and then you won't be able to get your hands on one for love or money.

Guest said:

From what I've been seeing, the 5750 renders approximately the same as a 4870, while the 5770 renders approximately the same as the 4890. The 4870 and 4890 both have 246-bit memory controllers, so there's nothing wrong with these cards having 128-bit controllers; it just makes them a cheaper version of the 4870 and 4890, and with the new process they will be less power hungry than the 4870 and 4890 and be DX11 compatible. I'm sure it won't be too long before you start seeing rebates or price drops as we get closer to seeing GTX 3x0 boards being released by Nvidia, and then you won't be able to get your hands on one for love or money.

Oops, that should say 256-bit memory controllers. My eyes are getting old.

Guest said:

Judging by those sneak peek slides, it ain't very power efficient ... 108W on the RV840 Juniper at around 180mm2, while the roughly same-performing DX10.1 RV770 chip consumes the same amount of power on a bigger die (~260mm2) and an older process node. Weird, to put it mildly. Not to mention its more-than-twice-as-fast big sister RV870 consumes ONLY 75% more power, and even 1W more at idle (vs. 17W on the RV870, which is almost twice as large a chip).

xykz said:

Who will be seeing the sticker anyway? Won't it be pointing down? I might look to get one of these when it is released. Does anyone know the power consumption figures for this card? I would assume it is less than a 4870 due to the 40nm architecture.

Unfortunately, not too much lower: 125W+ on the 4870 vs. 108W, as I tried to point out in the post above. While it's only ~70% of the RV770's chip size, that's in fact much more power consumption without any extra performance, just DX11 compliance and almost half its memory bandwidth. Not that the RV770 really needed the 103-110GB/s of memory bandwidth we could gain through OC.

red1776, Omnipotent Ruler of the Universe, said:

Unfortunately, not too much lower: 125W+ on the 4870 vs. 108W, as I tried to point out in the post above. While it's only ~70% of the RV770's chip size, that's in fact much more power consumption without any extra performance, just DX11 compliance and almost half its memory bandwidth. Not that the RV770 really needed the 103-110GB/s of memory bandwidth we could gain through OC.

Well, you're wrong xykz ....twice. The power consumption of the 4870 is 151W @ 12.6A, so the 5750 draws 30% less at load. Secondly, you said "without any extra performance, just DX11 compliance" ....well, that's kind of a big deal considering that the 5xxx line is ringing in the new generation of games.
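
As a quick sanity check on the figures traded in this exchange, here is a sketch of the arithmetic; the 12.6A figure is red1776's claim and the 108W TDP comes from the earlier comments, so these are quoted numbers rather than measurements:

```python
# Rough check of the power figures quoted in this thread (claimed numbers, not measurements).
hd4870_amps_12v = 12.6                   # current draw claimed for the HD 4870
hd4870_watts = hd4870_amps_12v * 12      # 12V rail -> ~151W, matching the 151W figure above
hd5750_watts = 108                       # TDP quoted earlier in the thread

reduction = (hd4870_watts - hd5750_watts) / hd4870_watts
print(f"HD 4870 ~{hd4870_watts:.0f}W vs. 108W -> about {reduction:.0%} less at load")  # ~29%, i.e. roughly 30%
```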

xykz said:

Well, you're wrong xykz ....twice. The power consumption of the 4870 is 151W @ 12.6A, so the 5750 draws 30% less at load. Secondly, you said "without any extra performance, just DX11 compliance" ....well, that's kind of a big deal considering that the 5xxx line is ringing in the new generation of games.

Well, we could always be wrong about the wattage, because it's a maximum TDP and I don't have measurements of my own, or an HD5770 in hand; I just refer to [link], and it says 130W, so it could be much lower than 108W for the HD5770, or just this 108W.

On the DX11 games, we still need to see developers ship some of them, and I'd bet most of these first games will be troublesome for the mainstream HD5770, just as Crysis was for most first-gen DX10 cards. So I'd rather wait for GT300 and price drops on a top performer like the HD5870, especially when it's more than twice the performance at 175% of the TDP of its little sister (RV840/HD5770). And somehow I think even these RV870 monsters will have to cope with small troubles in most DX11 games (4xAA performance drops, etc.) -- nothing new, in fact.

Guest said:

Ugly looking?

My GF picked her latest vehicle based on color and style, unfortunately not on quality and performance. That is why I keep driving her to work when her car is in getting fixed. Why would anyone not consider performance, quality, and value when choosing to upgrade a graphics card?

Staff
Julio Franco, TechSpot Editor, said:

In case you've missed it, here's TechSpot's review of the Radeon 5770, plus benchmarks of the 5750 throughout the article:

[link]

xykz said:

Good review, but something is definitely weird when it comes to idle power consumption [link]: the HD5770's 18W idle shows up as 171W, while the HD5850's <27W (the HD5870 has 27W) reflects as only 164W, so it consumes 7W less than a much smaller chip?!

And in the L4D bench you use only 2xAA/16xAF, when Nvidia has a bigger performance drop with 4xAA, and ATI's RV770 series with 16xAF active. So in fact you favor Nvidia in this so-called test, just like in the nzone Resident Evil 5 demo, where you again use only 4xAA so that the Nvidia 260, 4870 and 5770 come up virtually equal. Probably with 16xAF the scores would lean much more toward the 5770, and the 4870 would come in last of those three. And it's an Nvidia-optimized demo/game. I'd call these pretty unfair benches, not digging deeper. Why did you use 0xAF in most benches, so that the GPU comes up much more CPU throttled?

And on Crysis Warhead we saw how unoptimized the second incarnation of the Crysis game is, and how much it favors memory bandwidth and Nvidia's DX10 architecture.

Staff
Julio Franco, TechSpot Editor, said:

@xykz Our power consumption figures in the graph represent whole-system idle and load levels, not just the graphics card. We've always done it this way, but we sometimes forget to state it in explicit terms; I will add that note to the review and to future tests.
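
In other words, since the rest of the test rig stays the same, the gap between two whole-system readings roughly tracks the gap between the cards themselves; a minimal sketch of that reasoning, using the idle figures quoted above:

```python
# Why whole-system readings still allow card-to-card comparisons: the rest of the
# rig is identical, so the difference between two readings approximates the
# difference between the cards. Figures below are the idle numbers quoted above.
system_idle_hd5770 = 171   # W, whole system with the HD 5770 installed
system_idle_hd5850 = 164   # W, whole system with the HD 5850 installed

card_idle_delta = system_idle_hd5770 - system_idle_hd5850
print(f"The HD 5770 rig idles about {card_idle_delta}W higher than the HD 5850 rig")
```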

xykz said:

@xykz Our power consumption figures in the graph represent whole-system idle and load levels, not just the graphics card. We've always done it this way, but we sometimes forget to state it in explicit terms; I will add that note to the review and to future tests.

I assumed that in the first place :o Besides, I assume you did all that testing on the same rig, only changing graphics cards during the test as you stated, and that's the only way the power consumption graphs would make sense. From that I conclude that, according to the graph, the HD57x0 series consumes a lot more at idle (ATI states 18W) than its HD5850 counterpart, while for the HD5870 ATI stated 27W at idle, and the HD5850 should consume even less than that considering its lower shader power and far lower clocks. So I made a reply based on what I saw, even though you didn't mention it in your review conclusion.

On the power bar graphs in idle mode, you'd have to be pretty much of an **** to conclude that it's ONLY GPU consumption (170W?!). So have we come to name calling and obfuscating my reply when it tackles an error in the power rating of the HD57x0 series? (Either that power measurement is somewhat wrong, or it wasn't taken on the same testing rig as the HD5850 card??)

Rage_3K_Moiz, Sith Lord, said:

This is pointless; the power values provided by AMD are TDP-based, so they cannot be considered completely accurate. And besides, a difference of less than 10W doesn't mean anything as far as a video card is concerned.

red1776, Omnipotent Ruler of the Universe, said:

On the power bar graphs in idle mode, you'd have to be pretty much of an **** to conclude that it's ONLY GPU consumption (170W?!).

Well, you have part of that statement correct; why don't you check this out: [link]. BTW, your writing is almost incomprehensible ....and who called you names?

bushwhacker, TechSpot Chancellor, said:

A budget card running hot on 128bit memory? I'll just keep my beasty ol' 9800GT.
