AMD reveals the Radeon R9 290X, their next-generation GPU

By Scorpus · 73 replies
Sep 25, 2013
  1. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Preliminary leaked benchmark scores show it besting the Titan in many games, but that all depends on how it's released and whether that's a fully unlocked Hawaii GPU. Honestly, only time will tell.

    Hah, yeah, that was one of the first (if not the first; I can't remember off the top of my head if there was an 8000-series dual-GPU card) dual-GPU cards. But yeah, his kept working, though he had to do some modifications to keep it cool.
  2. The GTX 780 is better than the Titan in performance. Go to Nvidia's website; they have a relative graph of all their cards from the 200 series up to the 700 series. The 780 is the highest, a good chunk over the Titan. So if AMD's new card can hold with the 780, then it will beat the Titan. And let's not forget that when the 7970 came out it was better than the GTX 580 and held up well against Nvidia's new cards; this new card could very well smack down the 780.
  3. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Starting from the top down:
    Using AMD's own slide, the 290X is pushing 8000 in 3DMark's Firestrike performance preset,
    while the GTX 780 (depending upon clocks, once the CPU and RAM are normalized) scores 8400-8700.

    My guess is that under the Extreme preset, along with any GCN-optimized application, that gap will close, and there will certainly be instances where the 290X disposes of not just the 780 but the Titan as well. How many apps, and how many extreme corner cases (such as 8xSSAA + post-process), I couldn't say, but the 512-bit bus of the AMD card is certainly going to help push pixels in antialiasing or downsampling modes.
    The rest of the lineup is pretty clear cut since they seem to be largely rebrands:
    R7-250X's Firestrike score of ~2000 is pretty much the same as the HD 7750/7770
    R7-260X's Firestrike score of ~3700 is exactly that of the HD 7790
    R9-270X's Firestrike score of ~5500 is slightly above the HD 7870XT
    R9-280X's Firestrike score of ~6800 is 7970GE (and GTX 770) territory - Tahiti by another name.
    Looks as though Curacao (if that's what the tweaked Tahiti LE/Pitcairn facelift is called) and Bonaire will join the Tahiti rebranded parts, along with the unannounced R9-290 (Hawaii Pro), which looks to be the extremely interesting part if priced at $399 and should offer performance between the HD 7970GE and the 290X.

    /My $0.02
  4. dennis777

    dennis777 TS Enthusiast Posts: 285   +33

    AMD Koala would be good. :) It's really hard to remember all the numbers of these cards.
  5. Eddo22

    Eddo22 TS Booster Posts: 165   +8

    Odd. I ran a 6990 for 2 1/2 years and the only time it was kinda noisy was when I ran benchmark programs. It was just barely audible in gaming at 27-32°C room temps. That was with the overclock switch enabled as well.

    I agree about the blower, though. Would it really hurt AMD to put a better cooler on their cards? Especially considering you're paying an arm and a leg for their top-tier products.

    I have a Gigabyte Radeon 7970GHz now, and thanks to the triple-fan cooler it runs below 70°C in games and I never hear the fans.
  6. Well, if you're buying an expensive graphics card and you don't even bother to do the necessary research, I'd say it's your own fault...
  7. howzz1854

    howzz1854 TS Evangelist Posts: 611   +94

    That's the thing... to many, a $150 video card isn't expensive. It's just another piece of hardware to get their POS HP or Dell running again. Think of how many business professionals own those computers and will easily drop $150 on a video card so they can play Sims 3 on the weekend while the kids are out playing in the yard.
  8. lawfer

    lawfer TechSpot Paladin Posts: 1,270   +91

    I need pricing info! Come on AMD!
  9. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Did you leave it on auto? Also, what cooling did you have running through it?

    I had mine in a Corsair Obsidian 800D. Before I got the liquid cooling components, I had a stock Corsair fan blowing on it and it was still loud even with light gaming. I tried putting on a nicer Aerocool Shark fan that I was saving for when I ordered the rest of my stuff, which helped out a lot, but while playing Battlefield 3 the fan would always spin up pretty far. When I put two in CFX, they would hit 100% fan speed under almost all gaming loads even with lots of airflow going through, and it was just unbelievably loud. I have yet to hear a computer that comes close to being as loud as just one of those cards at 100% fan speed.
  10. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Popular industry opinion pegs the 290X at $600 ($599 in etail-speak).

    HD 6990? Yep, we have a thread for that. Has pretty much nothing to do with the R9, it's THREE generations old.
  11. Blue Falcon

    Blue Falcon TS Addict Posts: 161   +51

    PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?

    GTX680 > 7970Ghz in 3dMark but in real world 7970Ghz > 680
    GTX680 is 67% faster than GTX580 in 3dMark but in the real world it's more like 35-40%

    All these synthetic benches like 3DCrap and Unigine Heaven are better utilized for testing GPU stability when overclocking. For extrapolating real-world gaming performance between AMD/NV, they're not very accurate.
  12. DeViLzzz

    DeViLzzz TS Rookie

    So should I ditch my son's 1GB 7850 for $100 flat if I can get someone to take it, and get one of these new cards coming out? Also, if I get one of these new cards, what do you think the lifespan will be? Btw, we only game at 1920x1080 and that will continue for quite a while (3 years most likely).
  13. DeViLzzz

    DeViLzzz TS Rookie

    Eh, no ability to edit a comment for a few minutes?

    Anyway, after seeing that there is no Crossfire connection, I say no to their new cards, as I would want to eventually Crossfire on his board and he won't be able to do so with these new cards.
  14. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    You're doing it wrong :smh:
    AMD R9-290X Firestrike score ~8000 (as per AMD's own slide)
    AMD HD 7970GE Firestrike score ~6800 - 7000
    Percentage increase between the two generations of AMD top-tier GPU: 14.3% - 17.6%. You now have a basis for comparison - and that is the comparison most people studying performance are looking at, since the 7970GE's capabilities are well known.
    Both are AMD designs. Check
    Both are GCN µarch. Check
    Valid comparison. Check

    For Firestrike this actually works as a very good cross-reference. In the chart I posted on the previous page the GTX 780 scored 8684, the HD 7970 (non-GHz) scored 6624 - a 31.1% advantage to the 780. Latest comparison between the GTX 780 and HD 7970 (non-GHz) for averaged performance: 31.7% at 1920 res, and 31.4% at 2560 res.
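    The percentage math above can be sketched in a few lines (a minimal illustration; the scores are the approximate figures quoted in this thread, not independent runs):

    ```python
    # Rough cross-check of the Firestrike percentage math quoted above.
    # Scores are the approximate figures from the posts in this thread.
    scores = {
        "R9 290X (AMD slide)": 8000,
        "HD 7970GE": 6800,
        "GTX 780": 8684,
        "HD 7970 (non-GHz)": 6624,
    }

    def pct_lead(a: float, b: float) -> float:
        """Percentage by which score a exceeds score b."""
        return (a / b - 1) * 100

    gen_gap = pct_lead(scores["R9 290X (AMD slide)"], scores["HD 7970GE"])
    nv_gap = pct_lead(scores["GTX 780"], scores["HD 7970 (non-GHz)"])
    print(f"290X over 7970GE: {gen_gap:.1f}%")  # ~17.6%
    print(f"780 over 7970:    {nv_gap:.1f}%")   # ~31.1%
    ```

    Same-vendor, same-architecture pairs make the ratio meaningful; cross-vendor ratios only hold up where (as here) they track independently measured game benchmarks.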

    The rebranded (old-architecture) cards still carry Crossfire fingers; it is only the new GPUs that do not. For these, inter-card communication is accomplished over the PCI Express bus, in the same way that older low-spec cards have done for a while. For example, the R9-280X (HD 7970 based) features the CFX fingers, while the 290X (the new Hawaii GPU) does not.
  15. Obzoleet

    Obzoleet TS Booster Posts: 171   +9

    What are CFX fingers? :)
  16. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    It is the spot on an AMD video card where you plug in a CFX cable to link two or more cards together.

    You can use it to compare past and present generations, but I do agree about cross-vendor comparisons, as it does not show true gaming performance. They are using it to show the new Hawaii GPU's performance compared to their other cards this generation.
  17. Eddo22

    Eddo22 TS Booster Posts: 165   +8

    I left the fans on auto and it had the stock single-fan cooler. I just realized, though, that my Phenom 965 was holding the card back, so maybe that's why it wasn't as noisy.

    Yeah, I can definitely see two 6990s being loud. They easily ran 85-95 degrees as a single unit.
  18. Geforcepat

    Geforcepat TS Booster Posts: 140   +16

    Hmm, an interesting one is the 290X, or whatever they're calling it. (Shoulda just named it 9700 Pro.) Give us some retro.
  19. amstech

    amstech IT Overlord Posts: 1,936   +1,101

    It may not translate exactly/directly, but it's usually pretty close; 3DMark gives you an accurate idea of how well your GPU will perform compared to others, and some specific tests are better than others.
    The new 3DMark especially, although 3DMark 11 still gives good results as well.

    There is no difference between a 680 and 7970 in real-world performance; they are always within 5-10 frames of one another, with each GPU winning at certain games and a small overall advantage to the 7970.
  20. howzz1854

    howzz1854 TS Evangelist Posts: 611   +94

    That was the best video card I've had: a solid performer that lasted me a while, very overclockable and moddable. Miss the good old days.
  21. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Yeah, no kidding. One under load was pretty horrid; both together were just atrocious in terms of noise. The heat always stayed around 80°C under load with fans at 100% for both cards. I had to liquid cool those puppies. Man, I love them to death though; now everything runs cool and quiet. Heck, I have yet to see anything above 50°C under full load unless I overclock beyond the BIOS 880MHz setting (I've been able to get a stable 990MHz on the core).

    It's an odd name; however, those names could still change before launch. They may go with 9970 and below or something like that, though if they stick with the 290X or something similar, I wouldn't complain too much. It's just an interesting naming scheme. Though I'm going to be sad if I can't buy a 9990 :p
  22. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,041   +678

    I feel fine. I wasn't the one that was confused. Look at who I was replying to.

    Again, you're replying to the wrong person.
  23. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,041   +678

    Bingo! Give this man/woman a prize.
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Pretty much so.
    I think the "this isn't a fair test" comments are purely a reaction to what some people see as a lower level of performance than they expected.
    An easy gauge would be to use AMD's own comparison from the slide deck, if people are unwilling to believe a level playing field exists:

    R9-290X scores ~8000; R9-280X (HD 7970GE rebrand) scores ~6800 from this slide and the same chart above. 8000 / 6800 = ~18% more performance for the 290X, which is in the ballpark of the GTX 780's 20% performance lead over the same HD 7970GE for aggregated benchmarks at 1920 and 2560 resolutions.
  25. Too bad there are no AMD chipsets with native PCIe 3.0. So if they do choose to do away with the Crossfire bridges, it'll make the cards slower on AMD's own hardware. :p
  26. Blue Falcon

    Blue Falcon TS Addict Posts: 161   +51

    OK, for all of you who disagree with me: watch for real-world gaming performance. You state that "Percentage increase between the two generations of AMD top-tier GPU : 14.3% - 17.6%"

    The R9 290X will be faster than 14.3-17.6% on average at 2560x1600. The comparison to 3DMark has always been a poor measurement of real-world gaming performance, since no game in the world is based on a 3DMark engine.

    Cherry-picking Firestrike and conveniently ignoring the inaccurate 3DMark 11 scores of the 680 vs. 580 shows you guys don't want to address my points on an overall basis.

    @ amstech

    "There is no difference between a 680 and 7970 in real world performance, they are always within 5-10 frames of one another with each GPU winning at certain games and a small overall advantage to 7970."

    Wrong: the 7970GE trails the 680 in 3DMark 11 but beats it in the real world. That in itself is evidence that 3DMark is inaccurate.

    If I play Total War games, 3DMark 2013 tells me squat about which GPU I should purchase. It's pretty clear I should get an NV card, but 3DMark 2013 doesn't portray such an advantage for NV's cards. Similarly, there are many games where the reverse is true.

    The only thing that matters is real world gaming performance unless you love beating the final boss in 3dMark.....
    GhostRyder and Steve like this.
