AMD reveals the Radeon R9 290X, their next-generation GPU

The "OUR most powerful GPU ever" statement says a lot to me... Then, there's the FireStrike benchmark graph itself. The R9 290X is leveled up to a 7k score. Well, a GTX 780 already 8k.
Finally, Plus, if the R9 290 Series would be "killing" the TITAN - or the GTX 780 for that matter -, I don't doubt AMD would have said it quite clearly.
Pre-liminary leaked benchmark scores how it to be besting the titan in many games, but that's all up to how its released and that's a full unlocked Hawaii GPU. Honestly only time will tell.

I did indeed. The thing was so loud and hot that you could cook a full meal in the case. I remember 120 Celsius being normal for that card during gaming. Too bad that thing didn't last. But that's not the biggest problem. The biggest problem was that SLI wasn't optimized for the games you play until a year after the games were already out, so you would be playing most games at single 9800 GT speed until they came out with proper optimization a year later anyway.

That was a waste of money.

Hah, yeah, that was one of the first (if not the first - I can't remember off the top of my head whether there was an 8000-series dual-GPU card) dual-GPU cards. But yeah, his kept working, though he had to do some modifications to it to keep it cool.
 
The GTX 780 is better than the Titan in performance. Go to Nvidia's website and they have a relative graph of all their cards from the 200 series up to the 700 series. The 780 is the highest, a good chunk over the Titan. So if AMD's new card can hold with the 780, then it will beat the Titan. And let's not forget that when the 7970 came out it was better than the GTX 580 and held up well against the 680... so the new card could very well smack down the 780.
 
DivideByZero, what do you think about these benchmark scores?
Starting from the top down:
Using AMD's own slide, the 290X is pushing 8000 in 3DMark's Firestrike performance preset
20130925amd5.jpg

While the GTX 780 (depending upon clock, once the CPU and RAM are normalized) scores 8400-8700
firestrike.jpg


My guess is that under the Extreme preset, along with any GCN-optimized application, that gap will close and there will certainly be instances where the 290X disposes of not just the 780 but the Titan as well. How many apps, and how many extreme corner cases (such as 8xSSAA + post process) I couldn't say, but the 512-bit bus of the AMD card is certainly going to help push pixels in antialiasing or downsampling mode.
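To put a rough number on that bus-width point, here is a back-of-envelope sketch (Python). The memory data rates below are assumed purely for illustration, since neither card's memory clock is quoted here:

# Peak theoretical GDDR5 bandwidth: (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed data rates, for illustration only:
print(bandwidth_gbs(384, 6.0))  # 384-bit bus at an assumed 6 Gbps -> 288.0 GB/s
print(bandwidth_gbs(512, 5.0))  # 512-bit bus at an assumed 5 Gbps -> 320.0 GB/s

Even at a lower assumed memory clock, the wider bus comes out ahead on raw bandwidth, which is exactly where heavy antialiasing and downsampling lean hardest.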
The rest of the lineup is pretty clear cut since they seem to be largely rebrands:
R7-250X's Firestrike score of ~2000 is pretty much the same as the HD 7750/7770
R7-260X's Firestrike score of ~3700 is exactly that of the HD 7790
R9-270X's Firestrike score of ~5500 is slightly above the HD 7870XT
R9-280X's Firestrike score of ~6800 is 7970GE (and GTX 770) territory - Tahiti by another name.
Looks as though Curacao (if that's what the tweaked Tahiti LE/Pitcairn facelift is called) and Bonaire will join the Tahiti rebranded parts, along with the unannounced R9-290 (Hawaii Pro), which looks to be an extremely interesting part if priced at $399 and should offer performance between the HD 7970GE and the 290X.

/My $0.02
 
There's more to naming than just a three- or four-digit scheme. How many cars do we have out there, and yet we identify them all. I'm not saying we should start naming them Nvidia Panda, or anything of that nature, but there are way more options than sticking to those two schemes and recycling them. You can either add a generation number after the model number, or use non-numerical names.

AMD Koala would be good. :) It's really hard to remember all the numbers on these cards.
 
Too bad it is a blower :p. The ones on the 7000 series were really loud.
Talk about a blower - I bet three of those cards would still be quieter than one of my HD 6990s. Talk about a jet taking off, lol; two of those were just horrible in terms of noise.

Odd. I ran a 6990 for 2 1/2 years and the only time it was kinda noisy was when I ran benchmark programs. It was just barely audible in gaming at 27-32C room temps. That was with the overclock switch enabled as well.

I agree about the blower though. Would it really hurt AMD to put a better cooler on their cards? Especially considering you are paying an arm and a leg for their top-tier products.

I have a Gigabyte Radeon 7970 GHz now, and thanks to the 3-fan cooler it runs below 70C in games and I never hear the fans.
 
Well, if you're buying an expensive graphics card and you don't even bother to do the necessary research, I'd say it's your own fault...
 
That's the thing... to many people, a $150 video card isn't expensive. It's just another piece of hardware to get their POS HP or Dell running again. Think of how many business professionals own those computers and would easily drop $150 on a video card so they can play The Sims 3 on the weekend while the kids are out playing in the yard.
 
Odd. I ran a 6990 for 2 1/2 years and the only time it was kinda noisy was when I ran benchmark programs. It was just barely audible in gaming at 27-32C room temps. That was with the overclock switch enabled as well.

I agree about the blower though. Would it really hurt AMD to put a better cooler on their cards? Especially considering you are paying an arm and a leg for their top-tier products.

I have a Gigabyte Radeon 7970 GHz now, and thanks to the 3-fan cooler it runs below 70C in games and I never hear the fans.
Did you leave it on auto? Also, what cooling did you have running through it?

I had mine in a Corsair Obsidian 800D. Before I got the liquid cooling components, I had a stock Corsair fan blowing on it and it was still loud even with light gaming. I tried putting on a nicer Aerocool Shark fan that I was saving for when I would order the rest of my stuff, which helped out a lot, but while playing Battlefield 3 the fan would always spin up pretty far. When I put two in CFX, they would hit 100% fan speed under almost all gaming loads even with lots of airflow going through, and it was just unbelievably loud. I still have yet to hear a computer that comes close to being as loud as just one of those cards at 100% fan speed.
 
PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?

GTX 680 > 7970 GHz in 3DMark, but in the real world 7970 GHz > 680
GTX 680 is 67% faster than GTX 580 in 3DMark, but in the real world it's more like 35-40%
https://static.techspot.com/articles-info/546/bench/3Dmark_02.png

All these synthetic benches like 3DCrap and Unigine Heaven are better utilized for testing GPU stability when overclocking. For extrapolating real-world gaming performance between AMD/NV, they're not very accurate.
 
So should I ditch my son's 1 GB 7850 for $100 flat if I can get someone to take it, and get one of these cards coming out? Also, if I get one of these new cards, what do you think its lifespan will be? BTW, we only game at 1920x1080 and that will continue for quite a while (3 years most likely).
 
Eh, no ability to edit a comment for a few minutes?

Anyway, after seeing that there is no CrossFire connector, I say no to their new cards, as I would want to eventually CrossFire on his board and he won't be able to do so with these new cards.
 
PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance? ...etc., etc.
You're doing it wrong :smh:
AMD R9-290X Firestrike score ~8000 (as per AMD's own slide)
AMD HD 7970GE Firestrike score ~6800 - 7000
Percentage increase between the two generations of AMD top-tier GPU: 14.3% - 17.6%. You now have a basis for comparison - and that is the comparison that most people studying performance are looking at, since the 7970GE's capabilities are well known.
Both are AMD designs. Check
Both are GCN µarch. Check
Valid comparison. Check

For Firestrike this actually works as a very good cross-reference. In the chart I posted on the previous page the GTX 780 scored 8684, the HD 7970 (non-GHz) scored 6624 - a 31.1% advantage to the 780. Latest comparison between the GTX 780 and HD 7970 (non-GHz) for averaged performance: 31.7% at 1920 res, and 31.4% at 2560 res.
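For anyone who wants to reproduce those percentages, the arithmetic is just a ratio of the rounded Firestrike scores quoted above - a minimal sketch in Python:

# Percentage by which score `new` exceeds score `old`.
def pct_gain(new: float, old: float) -> float:
    return (new / old - 1) * 100

print(round(pct_gain(8000, 7000), 1))  # 290X vs HD 7970GE (high estimate) -> 14.3
print(round(pct_gain(8000, 6800), 1))  # 290X vs HD 7970GE (low estimate)  -> 17.6
print(round(pct_gain(8684, 6624), 1))  # GTX 780 vs HD 7970 (non-GHz)      -> 31.1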

Anyway, after seeing that there is no CrossFire connector, I say no to their new cards, as I would want to eventually CrossFire on his board and he won't be able to do so with these new cards.
The rebranded (old architecture) cards still carry CrossFire fingers. It is only the new GPUs that do not. For these, the inter-card communication is accomplished over the PCI Express bus, in the same way that older low-spec cards have done for a while. For example, the R9-280X (HD 7970 based) features the CFX fingers, while the 290X (new Hawaii GPU) does not.
 
What are CFX fingers? :)
It is the spot on an AMD video card where you plug in a CFX cable to link two or more cards together:
Crossfire_Bridge.jpg


PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?

GTX 680 > 7970 GHz in 3DMark, but in the real world 7970 GHz > 680
GTX 680 is 67% faster than GTX 580 in 3DMark, but in the real world it's more like 35-40%
https://static.techspot.com/articles-info/546/bench/3Dmark_02.png

All these synthetic benches like 3DCrap and Unigine Heaven are better utilized for testing GPU stability when overclocking. For extrapolating real-world gaming performance between AMD/NV, they're not very accurate.

You can use them to compare past and present generations, but I do agree about the cross-platform comparisons, as they do not show true gaming performance. AMD is using it to show the new Hawaii GPU's performance compared to its other cards this generation.
 
Odd. I ran a 6990 for 2 1/2 years and the only time it was kinda noisy was when I ran benchmark programs. It was just barely audible in gaming at 27-32C room temps. That was with the overclock switch enabled as well.

I agree about the blower though. Would it really hurt AMD to put a better cooler on their cards? Especially considering you are paying an arm and a leg for their top-tier products.

I have a Gigabyte Radeon 7970 GHz now, and thanks to the 3-fan cooler it runs below 70C in games and I never hear the fans.
Did you leave it on auto? Also, what cooling did you have running through it?

I had mine in a Corsair Obsidian 800D. Before I got the liquid cooling components, I had a stock Corsair fan blowing on it and it was still loud even with light gaming. I tried putting on a nicer Aerocool Shark fan that I was saving for when I would order the rest of my stuff, which helped out a lot, but while playing Battlefield 3 the fan would always spin up pretty far. When I put two in CFX, they would hit 100% fan speed under almost all gaming loads even with lots of airflow going through, and it was just unbelievably loud. I still have yet to hear a computer that comes close to being as loud as just one of those cards at 100% fan speed.

I left the fans on auto, and it was the stock single-fan cooler. I just realized, though, that my Phenom 965 was holding the card back, so maybe that's why it wasn't as noisy.

Yeah, I can definitely see two 6990s being loud. They easily ran 85-95 degrees as a single unit.
 
Hmm, an interesting one is the 290X or whatever they're calling it. (Shoulda just named it the 9700 Pro - give us some retro.)
 
PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?
It may not translate exactly/directly, but it's usually pretty close. 3DMark gives you an accurate idea of how well your GPU will perform compared to others; some specific tests are better than others.
The new 3DMark especially, although 3DMark11 is still good for results as well.

GTX 680 > 7970 GHz in 3DMark, but in the real world 7970 GHz > 680
There is no real difference between a 680 and a 7970 in real-world performance; they are always within 5-10 frames of one another, with each GPU winning at certain games and a small overall advantage to the 7970.
 
Hmm, an interesting one is the 290X or whatever they're calling it. (Shoulda just named it the 9700 Pro - give us some retro.)

That was the best video card I've had. A solid performer that lasted me a while, very overclockable and moddable. Miss the good old days.
 
I left the fans on auto, and it was the stock single-fan cooler. I just realized, though, that my Phenom 965 was holding the card back, so maybe that's why it wasn't as noisy.

Yeah, I can definitely see two 6990s being loud. They easily ran 85-95 degrees as a single unit.

Yeah, no kidding, but one under load was pretty horrid, and both were just atrocious in terms of noise. The heat always stayed around 80C under load with the fans at 100% for both cards. I had to liquid cool those puppies - man, I love them to death though - and now everything runs cool and quiet. Heck, I have yet to see anything above 50C under full load unless I overclock beyond the BIOS 880MHz setting (I've been able to get a stable 990MHz on the core).

Hmm, an interesting one is the 290X or whatever they're calling it. (Shoulda just named it the 9700 Pro - give us some retro.)
It's an odd name; however, those names could still change before they launch. They may go with 9970 and below, or something like that. If they stick with 290X or something like that, I wouldn't complain too much; it's just an interesting naming scheme. Though I'm going to be sad if I can't buy a 9990 :p.
 
If it makes you feel better, Intel's naming scheme is confusing too.

I feel fine. I wasn't the one that was confused. Look at who I was replying to.

And you don't need to be an expert to drop $150 on a video card. Anyone looking to replace their old Dell or busted HP computer whose graphics card went up in smoke because of a poor cooling design will be looking at getting a card that's way beyond $150.


Again, you're replying to the wrong person.
 
It may not translate exactly/directly, but it's usually pretty close. 3DMark gives you an accurate idea of how well your GPU will perform compared to others; some specific tests are better than others.
The new 3DMark especially, although 3DMark11 is still good for results as well.
Pretty much so.
I think the "this isn't a fair test" comments are purely a reaction to what is seen as a lower level of performance than some people expected.
An easy gauge would be to use AMD's own comparison from the slide deck if people are unwilling to believe a level playing field exists:
20130925amd5.jpg


The R9-290X scores ~8000 and the R9-280X (HD 7970GE rebrand) ~6800 from this slide and the same chart above. 8000 / 6800 = ~18% more performance for the 290X, which is in the same ballpark as the GTX 780's 20% performance lead over the same HD 7970GE for aggregated benchmarks at 1920 and 2560 resolutions.
 
Too bad there are no AMD chipsets with native PCIe 3.0. So if they do choose to do away with the CrossFire bridges, then it'll be slower on their own hardware. :p
 
OK, for all of you who disagree with me, watch for real-world gaming performance. You state that "Percentage increase between the two generations of AMD top-tier GPU: 14.3% - 17.6%."

The R9 290X will be faster than that 14.3-17.6% on average at 2560x1600. The entire 3DMark comparison has always been a poor measurement of real-world gaming performance, since no game in the world is based on a 3DMark game engine.

Cherry-picking Firestrike and conveniently ignoring the inaccurate 3DMark11 scores for the 680 vs. 580 shows you guys don't want to address my points on an overall basis.

@ amstech

"There is no difference between a 680 and 7970 in real world performance, they are always within 5-10 frames of one another with each GPU winning at certain games and a small overall advantage to 7970."

Wrong. The 7970GE trails the 680 in 3DMark11, but it beats it in the real world. Therefore, that's evidence in itself that 3DMark is inaccurate.

http://techreport.com/review/24996/nvidia-geforce-gtx-760-graphics-card-reviewed/10

If I play Total War games, 3DMark 2013 tells me squat about what GPU I should purchase. It's pretty clear I should get an NV card, but 3DMark 2013 doesn't portray such an advantage for NV's cards. Similarly, there are many games where the reverse is true.
http://www.xbitlabs.com/images/graphics/nvidia-geforce-gtx-770/04_shog.png

The only thing that matters is real-world gaming performance, unless you love beating the final boss in 3DMark...
 