Pre-orders begin for AMD Radeon R9 290X, likely priced at $699

By Scorpus
Oct 4, 2013
  1. At last week's GPU14 Tech Day event in Hawaii, AMD revealed a new flagship graphics card: the Radeon R9 290X. With over six billion transistors on the GPU die, 4 GB of memory with 300 GB/s of bandwidth, and 5...

  2. slamscaper

    slamscaper TechSpot Enthusiast Posts: 105   +10

    That is one sick looking GPU, both in terms of aesthetics and specs. AMD seems poised to take over the graphics market. I've been with team green for a really long time now (ever since the G80) because I've felt they always offered the best performers in the high-end segment. However, AMD really started to impress me when their 7xxx series launched (way ahead of Nvidia's Kepler) and laid waste to the green team's best Fermi chips, both in terms of performance and efficiency.

    Now we have AMD silicon in both next-gen consoles, and this new R9 290X looks absolutely fantastic. It's not like Nvidia hasn't been on the ball either, as they have impressively squeezed a heck of a lot of performance out of their Kepler architecture. That said, AMD is making Nvidia look bad with their recent innovations. I wouldn't be surprised if Radeon GPUs are powering the majority of gaming PCs in 2014-2015.

    I just wish I wasn't unemployed right now... If I wasn't, I would be seriously considering the 290X for my rig. As it stands, I have to stick with my GTX 670 for the time being.
    Avro Arrow likes this.
  3. Obzoleet

    Obzoleet TechSpot Enthusiast Posts: 164   +7


    In a way I wish I was unemployed, so that I would actually have time to make proper use of a GPU like this :)
    Avro Arrow likes this.
  4. Eddo22

    Eddo22 TechSpot Enthusiast Posts: 146

    I'm happy with my Gigabyte Radeon 7970 at 1.1GHz. What I really need is a CPU upgrade for my AMD Phenom 965 so I can make proper use of my video card. I'm on the job hunt too. :(
    Avro Arrow likes this.
  5. GhostRyder

    GhostRyder TechSpot Evangelist Posts: 2,133   +493

    Yeah, I was the same way. Before I purchased my HD 6990s, the only two AMD/ATI cards I had tried were the X1300 and the 9250. The X1300 was such a pain to get running that I returned it fast and bought another Nvidia card; the 9250 was fine, but I felt it was a little meh in terms of performance.

    I have had (that I can remember off the top of my head) an Nvidia GeForce 5200, 6200, 6700, 8600M, 9800, 460 SE, GTX 580s in SLI, and now two HD 6990s. All the cards I owned in the past, minus the ATI cards, were excellent and I loved them. But after having so much trouble getting GTX 590s because of Newegg messing up my order and running out of stock, I ended up buying the 6990s on eBay and never looked back. Honestly, I was sure I had made a mistake right up until I got Catalyst installed and started playing BF3. The 290X at this price point seems like an excellent deal compared to the 780 and Titan. I really want either a dual-GPU variant or a trio of these cards, but since the 6990s seem to be happily chugging along in the BF4 beta, I don't feel as much of an urge to upgrade anymore.
  6. GTX 670 is still a great card.

    I've got a pair of 670s in SLI, and I'll be waiting for 20nm parts to drop in 2014. Whilst the 290X does strike me as a pretty awesome card, it's still only 28nm and nothing really special, imo.

    20nm is where the real deal is for me.
  7. Sniped_Ash

    Sniped_Ash Newcomer, in training Posts: 120   +34

    I would be surprised because that would require AMD to not only stop losing ground to Nvidia, but make up a pretty hefty deficit. Steam shows a roughly 52/15/33% breakdown for Nvidia/Intel/AMD, so that would have to be a colossal turnaround.
  8. VitalyT

    VitalyT TechSpot Guru Posts: 1,569   +480

    Are you on the lookout for a good video card or a smaller die size?
  9. BF4 runs fine on max with a single 680 at 1920x1200. In fact, the map they picked for the beta is 90% gray; it looks dull and washed out. I'm sure they will have better, prettier maps. It feels like BF3, but starting all over again. I love my guns in BF3; I feel like a noob in BF4.

    I ran it at 2560x1600 in SLI with the 680s and got nothing but flickering and lag...

    The beta is not ready for my setup yet, and 1200p looks ugly on my 30" Dell.
  10. Boilerhog146

    Boilerhog146 TechSpot Member Posts: 67


    Huh? I'm running a pair of 4GB 670s in SLI and play at ultra settings in the beta, and that's at 2560x1600 on my Dell 3007WFP-HC 30" display. Looks sweet to me too, though, yup, it looks a lot like BF3, just with better detail. Could use some color.
  11. amstech

    amstech TechSpot Enthusiast Posts: 803   +201

    This comment confuses me a little.

    When the 7970 first released, it was slower than a 670 at 1080p and no faster at 1600p. It wasn't until months later, with new drivers, that it finally edged out a 680.
    The 680's 256-bit bus makes it more efficient/power friendly, and as far as performance goes, the 680 and 7970 are neck and neck to this day. (The 780 destroys the 7970, so let's leave that comparison out.)

    Other than that, I do agree with your statement. I love the 512-bit bus and the recent praise I've heard for AMD's new driver sets, but I don't see AMD taking over anything. They need to win several battles to win the war, and in the past 5 years they have lost too many times. Also, being the best GPU is about more than just being the fastest. AMD GPUs stuttered and skipped all over the place; even when they benchmarked faster, they felt slower.
     
  12. slh28

    slh28 TechSpot Paladin Posts: 1,925   +170

    Your comment is even more confusing. The 7970 was released a few months before the 670/680 and the comparison at the time of the 7970 release would have been the 580 (which the OP was referring to when he said Fermi chips...)
  13. LNCPapa

    LNCPapa TS Special Forces Posts: 4,271   +257

    Priced at $699 it's still outside of my range. If priced at $500 or at the very most $550 I'd grab three of them but I'm just not willing to spend $700 on each. Guess I'll drag these 680s around for a while longer.
  14. Geforcepat

    Geforcepat Newcomer, in training Posts: 60

    $599 or bust.
  15. amstech

    amstech TechSpot Enthusiast Posts: 803   +201

    You're right, I was referring to when the 670 released. My mistake.
  16. $699 isn't that bad if it comes with BF4 and pwns that rip-off GTX 780 as expected. I was bracing myself for $799+ prices, so I'm quite happy about this. Well done, AMD.
  17. AMD sucks! What a stupid design for the cooler and the card itself! I have been using ATI since the ATI 9700 Pro, but AMD has been going backwards ever since! The Nvidia Titans look way better than the AMD 290X... AMD sucks because they dropped Windows Vista support from their driver! Sapphire is great...
  18. cliffordcooley

    cliffordcooley TechSpot Paladin Posts: 5,749   +1,421

    An AMD rant that finishes by saying AMD (which is the only flavor Sapphire comes in) is great. Confusion, anyone?
     
  19. JC713

    JC713 TechSpot Evangelist Posts: 6,685   +870

    I really wonder what will happen when nVidia cuts prices.
  20. slamscaper

    slamscaper TechSpot Enthusiast Posts: 105   +10

    Well, it remains to be seen just how much Nvidia will cut their prices. If they cut them just enough to remain competitive I seriously think AMD is going to gain some ground in the enthusiast market.
  21. slamscaper

    slamscaper TechSpot Enthusiast Posts: 105   +10

    Also, seeing as AMD silicon is powering both next-gen consoles, we may see upcoming games running better on Radeon GPUs. As I said earlier, AMD really does seem poised to take the lead in the graphics industry. It's exciting to think about, because they are definitely going to keep Nvidia on their toes, which in the end is great for us, as it will drive down prices and force them to innovate as well.
    Avro Arrow likes this.
  22. Avro Arrow

    Avro Arrow Newcomer, in training

    I recently replaced my Phenom II X4 965 with an FX-8350. It was an easy drop-in because I already had a 990FX motherboard. The difference is just amazing although for now, you'd be fine just to OC the 965 to 4GHz. That would give you a very nice speed boost.
    Eddo22 likes this.
  23. Avro Arrow

    Avro Arrow Newcomer, in training

    I just picked up the deal of the century. NCIX had the Powercolor PCS Radeon HD 7870 2GB on their site for $100 off, PLUS a $30 mail-in rebate, PLUS the ATi "Never Settle Silver" game bundle. I bought two of them because it seemed like an amazing deal. Well, as it turns out, they're not EXACTLY Radeon HD 7870s. They're the 7870 XT model with the Tahiti LE chips on them! They're even faster than HD 7870s! I really don't understand the nomenclature of these cards, because they really should be called "Radeon HD 7930" since they have the HD 79xx GPU. At the end of the day, I'll have paid just over $300 for both of them with 4 new games. I had a bit of trouble with the first two, as at least one of them was unstable in crossfire, but I managed to get them seemingly working for about 20 minutes and scored P30045 on 3DMark Vantage. I get two replacements on Tuesday and I'm all giddy to see what 2 stable cards will do! :D
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,783   +638

    Only the console ports, and only the console ports that take AMD's money. Game developers aren't noted for their choosiness about who throws money at them or who supplies the SDK (and code, where applicable).
    The only likely real game changer is Mantle, and since it is a low-level (close-to-metal) API, it won't be difficult for Intel (if they can be bothered) or Nvidia (who already have a low-level API in NVAPI) to implement a similar option. There's nothing to say future games wouldn't have multiple options to run Mantle, NVAPI (or similar), and DirectX/OpenGL.
    AMD (and ATI before it) have been poised to take the lead in the (consumer) graphics industry for over a decade... OpenCL gaming and Get in the Game being two heavily touted strategies that fell flat on their collective faces. Mantle represents an attempt to be proactive in the industry, rather than AMD's historically more passive stance on implementing software for their hardware, which is a good strategic move. Just a couple of points to note, though:
    1. AMD does not command the greater share of the PC gaming market, so don't expect game developers to shun the majority of gamers who use Intel iGP and Nvidia hardware. The added coding workload of including at least one more API (since DirectX and OpenGL still need inclusion) will translate into higher development and QA costs (guess who ultimately foots that bill) and a longer lead-in time for games.
    2. Microsoft won't sit idly by if they think AMD is either undermining D3D or causing OS issues when implementing Mantle - quite possibly memory/resource allocation conflicts, if other API experiences translate to Mantle.
    Very definitely.
    That's not how it works. Increased R&D and ever more expensive, smaller process nodes mean the end user pays more. The next round of GPUs will be on 20nm (with higher wafer costs), use new (read: expensive) GDDR6, and likely incorporate a discrete 64-bit RISC CPU - at least at the sharp end of the product stack - to facilitate better parallelization of workloads... and I wouldn't expect game devs (esp. EA) or AMD to absorb the increased costs of incorporating Mantle into game engines.
    cliffordcooley likes this.
  25. slamscaper

    slamscaper TechSpot Enthusiast Posts: 105   +10

    Well, it's already been stated that Nvidia is lowering their prices when the R9 launches, so that's what I was referring to. Of course, the newest hardware is always going to be expensive; that's a given. But prices won't remain stagnant for long if each camp is forced to stay competitive with the other. This is good for us.

    Mantle is another reason I think AMD is poised to make a big difference in the near future. DX has a lot of overhead, and it's probably the main reason we suffer performance issues in the latest console ports. I hope Mantle lives up to its capabilities. It all depends on how easily the API can be implemented and whether it can truly give developers the low-level access they need for "direct to metal"-like performance. From what I remember, Mantle is an open API, so Nvidia and Intel can actually make use of it. Of course, I'm sure AMD will do everything they can to ensure that games using Mantle run best on their hardware.

