AMD Radeon R9 290X Review: Challenging the Titan at half the price

You're on! When you get your R9 290X, game on! xD
How are we measuring? 3DMark Fire Strike? ;)
Include Tessmark (Tessellation:Insane ofc) and put the bubbly on ice
dividebyzero
Why is the R9 290X so expensive everywhere else except America? Maybe they knew most of the reviews would go through American sites, so keeping the price lower there makes the card seem like better value?
All the hallmarks of a supply-constrained part. Limited supply means that the larger geographic distribution areas get serviced first - basically the U.S. and Asia/Australia. Except for Club3D, I don't think AMD has any native European AIBs. If your distribution area is supply constrained you can charge a premium, since you only have limited stock and a large surplus of customers, many of whom are willing to pay over the odds.
Newegg seems to have run out, so if the parts stay at auto-notify status for any length of time it's down to demand or supply. That will become clearer with the 290 (non-X) launch next week: a good supply of those would tend to indicate a yield/supply issue with the top part. The week between launches is a bit of a head-scratcher too. The normal course of events would be to launch both together if price-to-performance is matched, or to leave a 3-4 week gap if the top part commands a price premium (to hook the early adopters on the expensive model). One week doesn't really fit the usual scenarios.
 
I'm struggling to find a way past having a graphics card running at 94 degrees, but then I realized it would be perfect for a custom liquid cooling loop, because it's not like you're throwing out a perfectly good cooler. For example, I could never bring myself to throw away the EVGA GTX 780 ACX cooler.
 
I'll pass on any fire-breathing card with specs that say "up to 1GHz" for the core clock. That is a spec only a fanboy could ignore. I have more sense than that.

The comments here are sad.
"Well you can lower the gpu target". WHAT?!
"I bet I can get another 100MHz if I water cool it and raise the voltage". WHAT???!!!

So so sad.
 
First off "Guest", you too sad to actually make an account and hide your name mister fanboy. Second, its called boost clock, Nvidia does the same exact thing, so your saying that showing the boost clock is bad, then almost every graphics card on the market is apparently stupid as well.

You apparently don't have more sense than that. The card is designed to handle those extra-high temps; while it's not great to have a card that runs at 95 degrees, the card itself is fine, and the GPU target can be adjusted based on what you want it to run at. Many great video cards in history have run very hot and did just fine; we have simply gotten comfortable with a lower threshold as of late. Anyone saying this is the perfect card is lying to themselves, but anyone saying the card is horrible is also only fooling themselves.

Back on discussion:
DKRON, I hear that. I can see a couple of Swiftech blocks on these; it would be fun to do this. I may actually go ahead this round and build a 3-way CFX setup (my board does not have four PCIe x16 slots, so I either have to get two dual-GPU cards like I have now, or run only three).
 
Would look mad as; just have to decide on a colour code for the cables and tubes. Not sure adding a third card would make much difference though. I'm waiting for Haswell-E before I make a move on a new build; I'm still happily using a first-gen i7.
 
Also, there's a huge difference between Furmark and a demanding game. Considering the 290X was limited to 50% fan speed while the other cards were not, you need to cast a skeptical eye over those results.
 
Generally only an issue for comparative purposes where one card detects Furmark and throttles accordingly. With the 290X that isn't the issue:
Tech Report (temps measured while gaming - Skyrim):
[image: gpu-temps.gif]

Crysis 3
[image: 59323.png]

Unigine Valley (Hardware Canucks)
[image: R9-290X-R-55.jpg]


Note the similarities.
95C isn't really the issue either - if AMD are guaranteeing operation at that temperature, it's all good. The real issue is the throttling from accumulated heat buildup. Note how the clock speed falls away under continuous GPU load:
[image: analysis_uber.gif]
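
To make the heat-soak behaviour concrete, here's a toy model of a PowerTune-style thermal governor: the clock sits at its "up to" value until the die hits the temperature target, then sheds speed until heat output matches what the cooler can remove. Every constant below is an illustrative assumption, not AMD firmware behaviour.

```python
# Toy model of a PowerTune-style thermal governor.
# All constants are illustrative assumptions, not AMD firmware values.

TEMP_TARGET = 95.0   # throttle point, deg C (the 290X's default target)
CLOCK_MAX = 1000     # advertised "up to" clock, MHz
CLOCK_STEP = 10      # MHz shed/recovered per one-second control tick

def simulate(minutes, cooling_watts):
    temp, clock = 40.0, CLOCK_MAX
    for _ in range(minutes * 60):
        heat = 180 + 0.12 * clock          # made-up watts-vs-clock curve
        temp += (heat - cooling_watts) * 0.01
        if temp >= TEMP_TARGET and clock > 700:
            clock -= CLOCK_STEP            # too hot: shed clock speed
        elif temp < TEMP_TARGET - 2 and clock < CLOCK_MAX:
            clock += CLOCK_STEP            # headroom: claw speed back
    return round(temp, 1), clock

# With a cooler that can't dissipate full-clock heat output, the clock
# settles well below the advertised 1000MHz after a few minutes of load:
print(simulate(minutes=10, cooling_watts=280))
```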
 
Yep, I've seen the above graph (the last one). So the issue is the decibel level of the fan, then. At least, that's the conclusion I'm pulling from all the data: AMD is capping the fan speed because of huge dBA readings even at the capped speed, which in turn is also hurting maximum GPU performance.
 
Then we've come full circle. I posted a link in post #8 where HardOCP forced 100% fan speed. It has also been reported elsewhere that maximum cooling is going to net increases that are more GPU-bound than VRAM-bound. GPU-bound = likely gain; VRAM-bound = little/no gain. That could all change with overvolting, but if the design is anything like the current crop of AMD and Nvidia cards, that becomes a case of diminishing returns unless you have some serious cooling - a 240mm/280mm radiator equivalent minimum just for the card (290X/GK110) if the 650W system power requirement (see below) is correct.
From W1zzard's review (the commentary just under the above graph):
AMD's stock cooler is completely overwhelmed with the heat output of the card during voltage tweaking, though. Even at 100%, it could barely keep the card from overheating and was noisier than any cooler I've ever experienced. My neighbors actually complained, asking why I used power tools that late at night.
Power draw also increases immensely, going from just above 400 W for the whole system to around 650 W!
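
As a back-of-envelope check on those figures (my arithmetic, with an assumed PSU efficiency, not numbers from the review itself):

```python
# Rough estimate of the extra heat dumped into the loop when overvolting.
# The 0.90 PSU efficiency is an assumption for illustration.
stock_wall, tweaked_wall = 400, 650       # whole-system draw at the wall, W
psu_efficiency = 0.90
extra_dc_load = (tweaked_wall - stock_wall) * psu_efficiency
print(f"~{extra_dc_load:.0f} W of extra DC load, nearly all of it at the GPU")
# ~225 W on top of the card's stock board power - which is why a 240/280mm
# radiator equivalent is the sane minimum for serious voltage tweaking.
```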
 
In that case, I think the contest should have a segment where we see which card can cook a roast more evenly, whether we can play Crysis, things like that :D



I dunno why we're so much cheaper on these cards; it's probably the stores/sites upcharging because they're new and there's so much hype. I can't get one no matter how hard I try on Newegg, though I'm shooting for two.


Well, hang on a second. Having two is a little unfair xD I only have a single 780. I would get a second but I won't be able to until early next year :(

However, when it comes to cooking a roast more evenly, I think you've already won there :)
I have the original Crysis and Crysis 2 but not 3. Unigine Heaven maybe? I do have 3DMark xD

Include Tessmark (Tessellation:Insane ofc) and put the bubbly on ice
I think this should be included also :)
 

Well, I was going to just bench one-on-one with CFX disabled if you like, but we could do the Fire Strike one as well, and BF4 :D

Definitely, although if there is only a single slot between cards it means you're limited to burger patties or panini. Two-slot separation gives you the full George Foreman Grill experience.

That is true. I guess we could have a grill-off and see which card makes the best burger. I think what we have here is the ultimate video card. Think about it: they have invented a video card that means you never have to leave your computer, because you can cook your food while you game. It's perfect :p

I think I'm going to start with two of these when they actually get back in stock on Newegg; guess I'll have the auto-alert on standby. They have roughly the same number of stream processors as my 6990s, so it would only be an improvement grabbing both. I'll probably add a third and fourth once I get a Haswell-E board later in 2014. I wonder when the Swiftech Komodos will be out for this card, or maybe I should try EK again this round and mix it up.
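
For anyone checking that stream processor comparison, the raw counts do line up, though the architectures differ (VLIW4 versus GCN), so this is count-for-count rather than performance-for-performance:

```python
# Stream processor counts from the public spec sheets.
hd6990_sps = 2 * 1536    # HD 6990: two Cayman GPUs on one board (VLIW4)
r9_290x_sps = 2816       # R9 290X: one Hawaii GPU (GCN)
print(hd6990_sps, r9_290x_sps)   # 3072 vs 2816 per card
# Two 290Xs would give 5632 GCN stream processors - close to double the
# count of a single 6990, hence "only an improvement grabbing both".
```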
 
And the real winner here is: Nvidia GeForce GTX 660 Ti SLI

I don't see how any normal person can justify $500+ on a video card EVER for gaming purposes alone.


Not really; the frame time results were much worse, and if SLI doesn't work you're left with a single card that would be too slow for the settings people expecting that kind of performance would be playing at.

You just said everyone should go 660 Ti SLI, and in the next sentence you say it's crazy for anyone to spend $500+ on a graphics card. How much does a pair of GTX 660 Ti cards cost? I think you will find it's at least $500.

The reason AMD and Nvidia don't make dual-GPU cards out of mid-range GPUs isn't because they don't want your money; it's because it doesn't make any sense. If you had the choice of GTX 660 Tis in SLI or the R9 290X, you would have to be crazy to pick the SLI setup. However, if you already have a GTX 660 Ti then it makes sense to get another one; only in this situation is GTX 660 Ti SLI the smarter option.

Frame times were better in quite a few of the tests; in fact, the only time the frame time was bad with SLI was in Metro (if I remember right). As far as the frame time argument goes, the data in the review does not back up your claim.

SLI not working? I'm sorry, but when was the last time you played something where SLI was not supported or working? I've had two 560 Tis for more than a year and never once had a single hiccup in any game with that setup. And to prove my point even further, I got a 670 as a present and its performance is identical to what my two 560s were giving me.

Also, my point with the $500 thing wasn't very clear. What I meant was: the article goes on about how the Titan is $1,000, this new R9 290X is supposed to be around $550, and the 780 is damn expensive also. So if you are going to spend $500-550, the best value for money according to the tests done was to get a pair of 660 Tis. I don't see how you could possibly justify getting a 780 or even this 290X when SLI has proven to be very reliable and cost effective.

Lastly, the part about mid-range dual-GPU cards not making sense: do you seriously believe that?
1. Cooling: it's far easier to handle two mild heat sources than one insanely hot (and very small) one.
2. Performance: refer to the benchmarks in the article.
3. Price: cheaper than, or the same cost as, a single card of similar performance.
4. It has been done already by EVGA in the past with a dual-GPU 560, I think, but by the time they released that card it was many months after the initial release and it was substantially more expensive than two individual cards. I firmly believe that if they had released it faster and at a better price it would have been quite popular.
 

If you're going straight off value for money, then yeah, in some ways a couple of 660s are a great deal. However, it's all dependent on software and the games that support it. Yeah, I have found it hard in recent times to find a game that does not support SLI/CFX (the only one off the top of my head at the moment is The Sims 3), but it's still heavily reliant on drivers.

Plus, then you have to look at longevity. Grabbing two 660 Tis may be great at the moment, but where do you go from there? You're pretty much already maxed out. If you grab a 780 or 290X, you can later, at a price drop, grab a second and almost double your power, giving you further longevity, along with even a third or fourth.

If you're a person who buys on a yearly basis with each generation of cards, you might benefit more from grabbing two 660 Tis, because they will be great for the year or so you keep them, until the next 760 Tis or whatever. But for the long term, buying one super-high-end single GPU gives you a lot of room for improvement down the road and the least amount of problems in general.

As for dual-GPU 660s or whatnot: if they sold them cheaper than buying two individual 660 Tis, that would butcher their hardware sales on the single-GPU variants, because the point of owning those would be gone, except for those who only wanted one card at that cheap a price. In the past they have not been very successful, because of pricing and the lack of Quad/Tri SLI support on the 460s and 560s. The companies have their lineups with GPUs and performance in the order they see fit; they don't want to harm their own products by making something that has significantly better value than two of their other cards. The dual-GPU top-of-the-line cards have a place for people with limited space for computer components, or who want an easy quad setup and do not have four slots on the board (most boards do not have four x16 slots available).
 
In a sense, the R9 290X (codenamed "Hawaii XT") could be considered AMD's Titan, as it takes the Tahiti architecture and stuffs it with nearly 2000 million more transistors.

This gave me a pretty good laugh. 2000 million more, huh?

Also, on one of the 99th percentile graphs it said "12 seconds between frames", which I am sure was meant to be 12 ms between frames.
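
For anyone unsure what those 99th percentile graphs actually plot, here's a quick sketch of how the metric comes out of a raw frame time log (the sample numbers are made up for illustration):

```python
# Deriving a 99th-percentile frame time from a Fraps-style frame time dump.
# The sample data is invented purely for illustration.
frame_times_ms = [14, 15, 16, 15, 14, 42, 15, 16, 14, 15] * 100

def percentile(values, pct):
    ordered = sorted(values)
    index = round((pct / 100) * (len(ordered) - 1))
    return ordered[index]

print(f"99th percentile: {percentile(frame_times_ms, 99)} ms")
# 99% of frames complete in this time or less; the occasional 42 ms spikes
# dominate the figure. "12 ms between frames" is plausible - 12 seconds
# would be a slideshow.
```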
 
This card needs a better cooler for anything but short benchmarking runs, it seems to me.
It throttles heavily if left at the stock fan setting, and even with the "Uber" jet-engine setting it still throttles.
http://www.techpowerup.com/reviews/AMD/R9_290X/30.html

I'm sure it would be lovely on water or with an improved cooler.

For those guys worried about throttling, I just did some extra testing over the past day. I ran the Crysis 3, Max Payne 3, Metro: Last Light, BioShock and Tomb Raider tests over and over again for an hour and then recorded the results. The frames per second in all those games were unchanged from our original results.

The core frequency was monitored and never fluctuated from 1000MHz, though temperatures in all games were between 92 and 95 degrees and the card was quite loud. Testing was conducted in Uber mode.

I am not sure what game TechPowerUp tested, but I cannot reproduce those results.
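
For reference, this kind of soak test is simple to reproduce. Here's a rough sketch of the logging loop involved; `read_core_clock` and `read_core_temp` are hypothetical stubs standing in for whatever monitoring source you actually use (GPU-Z log parsing, a vendor SDK, etc.):

```python
# Sketch of a soak-test logger for catching heat-soak throttling.
# read_core_clock/read_core_temp are hypothetical stubs - replace them
# with your real monitoring source (GPU-Z logs, vendor SDK, ...).
import time

def read_core_clock():   # stub: pretend the card holds 1000MHz
    return 1000

def read_core_temp():    # stub: pretend the card sits at 94C
    return 94

def soak_log(duration_s, interval_s=1.0):
    clocks, temps = [], []
    end = time.time() + duration_s
    while time.time() < end:
        clocks.append(read_core_clock())
        temps.append(read_core_temp())
        time.sleep(interval_s)
    print(f"clock min/max: {min(clocks)}/{max(clocks)} MHz, "
          f"temp max: {max(temps)}C")

# A steady min == max clock over a full hour matches the result described
# above; a sagging minimum would indicate heat-soak throttling.
soak_log(duration_s=10)
```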

This gave me a pretty good laugh. 2000 million more, huh?

Also, on one of the 99th percentile graphs it said "12 seconds between frames", which I am sure was meant to be 12 ms between frames.

I am only going to say this once more: a billion is not measured the same way globally. A billion can be either 1000 million or a million million, which is kind of a big difference.
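
On the short-scale reading, the article's line checks out. Using the widely reported transistor counts (approximate figures, not taken from this review):

```python
# Short scale vs long scale, using the chips under discussion.
tahiti = 4.31e9          # HD 7970 "Tahiti", approx transistor count
hawaii = 6.2e9           # R9 290X "Hawaii", approx transistor count
diff = hawaii - tahiti
print(f"{diff / 1e6:.0f} million more")         # ~1890 -> "nearly 2000 million"
print(f"{diff / 1e9:.2f} short-scale billion")  # ~1.89 billion (US usage)
print(f"{diff / 1e12:.6f} long-scale billion")  # ~0.00189 of a million million
```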

Frame times were better in quite a few of the tests; in fact, the only time the frame time was bad with SLI was in Metro (if I remember right). As far as the frame time argument goes, the data in the review does not back up your claim.

Really? You might want to take a proper look. Overall, the R9 290X was 22% faster when measuring frame time performance. The R9 290X was 12% faster in Crysis 3, 53% faster in Medal of Honor, 17% faster in Resident Evil, 45% faster in BioShock, 54% faster in Metro: Last Light and 127% faster in Sleeping Dogs. Are we reading the same review?

SLI not working? I'm sorry, but when was the last time you played something where SLI was not supported or working? I've had two 560 Tis for more than a year and never once had a single hiccup in any game with that setup. And to prove my point even further, I got a 670 as a present and its performance is identical to what my two 560s were giving me.

You must only play AAA titles then. Still, the frame time results prove that even when SLI is working, it's not always working that well. In many of the games above the R9 290X will provide a much smoother gaming experience, even when its frames per second are slightly lower.

Also, my point with the $500 thing wasn't very clear. What I meant was: the article goes on about how the Titan is $1,000, this new R9 290X is supposed to be around $550, and the 780 is damn expensive also. So if you are going to spend $500-550, the best value for money according to the tests done was to get a pair of 660 Tis. I don't see how you could possibly justify getting a 780 or even this 290X when SLI has proven to be very reliable and cost effective.

Your point still must not be very clear, because you are wrong. If you are spending $500 - $550, the best option is the Radeon R9 290X, not GTX 660 Ti SLI. If you buy the SLI setup today rather than the R9 290X, you will have made a mistake.

Lastly, the part about mid-range dual-GPU cards not making sense: do you seriously believe that?
1. Cooling: it's far easier to handle two mild heat sources than one insanely hot (and very small) one.
2. Performance: refer to the benchmarks in the article.
3. Price: cheaper than, or the same cost as, a single card of similar performance.
4. It has been done already by EVGA in the past with a dual-GPU 560, I think, but by the time they released that card it was many months after the initial release and it was substantially more expensive than two individual cards. I firmly believe that if they had released it faster and at a better price it would have been quite popular.

Yes, I seriously believe that.

1. No, plain wrong. How can you possibly come to that conclusion when a pair of GTX 660 Ti SLI cards consumes roughly 30% more power?
2. Yes, the frames per second performance is a fraction better; the R9 290X was 2% slower overall. If you believe that is enough to warrant a dual-GPU setup over a single GPU, then good luck to you.
3. You are making my point with your third point; I agree.
4. Yes, that is why the EVGA card failed. It's funny how you just don't see dual-GPU GTX 660 Ti cards. What gives? It's such a brilliant idea.
 
Well, we can already say there won't be a dual-GPU version of this. Running two of them in CrossFire drops your core clocks to ~720MHz! And because of that, this Hawaii GPU lineup, with rebrands from the previous series, is clearly a desperate attempt to take the crown from Nvidia. Bad form, AMD.
 
I believe that most if not all dual-GPU cards were manufactured after a die shrink and optimizations were added - the 7950 GX2 and 9800 GX2, both of which I have pairs of, and the 295, which my friend still has. Though I think the second version of the 295, with the single PCB, was the shrunken version, while the original dual-PCB card was 2 x 270s or something as such, which he has. And yeah, I think they will be able to pair these after a shrink, if a shrink was planned. Slightly lower clocks and voltage, and a waterblock, may let two of these play nice.
 
I'm really surprised that people are talking about the heat and noise issue. ATI reference cards have ALWAYS been hot and loud. It's a trade-off they make to ensure that the heat is exhausted out the back of the case instead of into it, which allows the card to be a working solution in any computer case. Having said that, I have five terms to describe the easy solution to the problem: Windforce, Vapor-X, DirectCU, IceQ and TwinFrozr.
 
This card might be challenging the Titan, but it is also challenging our wallets with the monthly electric bill, and our PC cases with its venting requirements.

Maybe if one lives in a cold region, those can be considered upsides.

For now I'm still happy to have settled for a GTX 780 back in July, and I wouldn't swap it for this new AMD hybrid of a power hog and a vacuum cleaner. As far as ergonomics go, this one is an abomination.
 
I live in eastern Canada - eastern Newfoundland. Minus 1 and frosty here last night, so fire up a good game on both rigs and the ol' lady doesn't have to turn on the heat until I stop gaming, lol.
[image: WoT co-op!.jpg]

WoT co-op anyone? lol, two big rooms jammed into one smaller room :(
 
I'm really surprised that people are talking about the heat and noise issue. ATI reference cards have ALWAYS been hot and loud. It's a trade-off they make to ensure that the heat is exhausted out the back of the case instead of into it.

And this alleged trade-off is different from Nvidia how? By your own submission you're practically admitting that the Nvidia card runs cooler and quieter, while both companies strive for the same goal of venting heated air out the back of the case. Unless you can spell out the difference, your comment is null and void, because adding band-aids to reference cards is no solution when comparing reference cards; it only points out how much a band-aid is needed for comfortable usability, and that's if you have a well-ventilated case. By the way, who would actually buy one of these reference cards without first having a well-ventilated case? AMD and Nvidia should forget about releasing reference designs for top-end cards where high ventilation is a requirement.

But then, who am I to be commenting? I won't spend over $300 on a card. I'm just trying to understand what I'm reading and why it was posted.
 
If I spend less than 300 bucks on a card, it's either too WEAK! or too freakin' old and in a clearance bin - where I got one of my 670s, and probably where I'll find one of these, lol. RMA recerts are usually a good score.
 
This card might be challenging the Titan, but it is also challenging our wallets with the monthly electric bill, and our PC cases with its venting requirements.

Maybe if one lives in a cold region, those can be considered upsides.

For now I'm still happy to have settled for a GTX 780 back in July, and I wouldn't swap it for this new AMD hybrid of a power hog and a vacuum cleaner. As far as ergonomics go, this one is an abomination.
It still outplays the Titan and the 780, so yeah. Even so, the difference in electricity is not going to be much, and most people paying for this high-end a card know what's coming anyway. It's like going to buy an SUV: one gets two more miles to the gallon, but the other has 20 more horsepower. Sure, the first is going to be slightly more fuel efficient, but you're still buying an SUV, and the extra horsepower might benefit you more in the long run than the slightly improved fuel economy.
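
To put a rough number on the electricity worry - all three inputs below are assumptions (approximate extra load draw versus a GTX 780, a heavy gaming schedule, a typical US rate), not figures from the review:

```python
# Back-of-envelope yearly cost of the 290X's extra power draw vs a GTX 780.
extra_watts = 60        # assumed extra draw under gaming load
hours_per_week = 20     # assumed gaming time
usd_per_kwh = 0.12      # assumed electricity rate

kwh_per_year = extra_watts / 1000 * hours_per_week * 52
print(f"~${kwh_per_year * usd_per_kwh:.2f} per year")   # ~$7.49
# Pocket change next to a $550 card - the SUV analogy holds up.
```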

Well, we can already say there won't be a dual-GPU version of this. Running two of them in CrossFire drops your core clocks to ~720MHz! And because of that, this Hawaii GPU lineup, with rebrands from the previous series, is clearly a desperate attempt to take the crown from Nvidia. Bad form, AMD.
I'd like to see where you get your facts from. Also, you're saying the rebranding of the AMD cards is a desperate attempt by AMD to take the crown? So what does that make the 770, 760 and 760 Ti then? The 770 = 680, for instance... Both companies only released two genuinely new high-end cards and left the older generations to be rebranded, with adjusted clock speeds, to make the "new" cards.

I like the top-tier cards at this point; I can't stand going too far down the totem pole anymore. Trying two 460s after being on top-end cards for a while just felt like a bad decision, and I was unsatisfied with the performance.

And this alleged trade-off is different from Nvidia how? By your own submission you're practically admitting that the Nvidia card runs cooler and quieter, while both companies strive for the same goal of venting heated air out the back of the case. Unless you can spell out the difference, your comment is null and void, because adding band-aids to reference cards is no solution when comparing reference cards; it only points out how much a band-aid is needed for comfortable usability, and that's if you have a well-ventilated case. By the way, who would actually buy one of these reference cards without first having a well-ventilated case? AMD and Nvidia should forget about releasing reference designs for top-end cards where high ventilation is a requirement.

But then, who am I to be commenting? I won't spend over $300 on a card. I'm just trying to understand what I'm reading and why it was posted.

Blowers are cheap and easy for anyone to run; that's the key thing. When you release cards with changed-up coolers, even if they are better, you get people who find some way to break them or who can't run them in CFX/SLI properly. A blower is loud and not the most efficient for cooling, but it's also the safest bet. I always buy reference top-of-the-line cards (minus the PNY LC 580s I had - I wish more of those would get released) and slap waterblocks on them. It saves me money versus buying one with an aftermarket cooler and then wasting such a nice cooler.
 