Pre-orders begin for AMD Radeon R9 290X, likely priced at $699

Scorpus


At last week's GPU14 Tech Day event in Hawaii, AMD revealed a new flagship graphics card: the Radeon R9 290X. With over six billion transistors on the GPU die, 4 GB of memory with 300 GB/s of bandwidth, and 5 TFLOPS of computing performance, the card is certainly going to be AMD's best performer and a true single-GPU competitor to the Nvidia GeForce GTX Titan.

AMD didn't reveal a release date or a price for the graphics card at the event, but pre-orders have recently started at a range of online retailers for the R9 290X. Newegg lists the MSI, Sapphire and XFX branded reference cards with their prices listed as 'Coming Soon', but some quick digging into the HTML source code reveals the card's price as $729.99 excluding tax. With Newegg taking a small slice, it's likely AMD has set the MSRP at $699.
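For the curious, that trick is easy to reproduce: fetch the product page and search the raw HTML for a price-like field. Here's a minimal sketch in Python; the URL and the regex pattern are hypothetical stand-ins, since the actual markup Newegg uses may differ:

```python
# Sketch: dig a hidden price out of a product page's HTML source.
# The URL and price pattern below are hypothetical placeholders;
# Newegg's actual markup may differ.
import re
import urllib.request

url = "https://www.newegg.com/Product/example-r9-290x"  # placeholder URL

html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

# Look for a dollar amount attached to a price-like attribute in the source.
match = re.search(r'price["\']?\s*[:=]\s*["\']?\$?(\d+\.\d{2})', html, re.IGNORECASE)
if match:
    print(f"Hidden price found: ${match.group(1)}")
else:
    print("No price found in the HTML source.")
```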

OverclockersUK are offering a pre-order deposit of £99 to secure a Radeon R9 290X Battlefield 4 Edition, an SKU that AMD announced at the GPU14 event but again failed to price or date. Although the site doesn't list a final price for the card, they estimate they'll deliver their "several hundred units" on October 31st. Centrecom Australia is offering a similar pre-order system, where prospective buyers can put down $200 to be the first in line.

Leaked specifications for the card indicate it features 2816 stream processors, 176 TMUs, 44 ROPs, a base clock of 800 MHz with a turbo clock of 1000 MHz, 4 GB of GDDR5 memory clocked at 1250 MHz on a 512-bit bus, and 8-pin plus 6-pin power connectors, which we already revealed through pictures of the card. Like a number of other R9 and R7 series cards, the R9 290X will support DirectX 11.2, OpenGL 4.3, 'Mantle', and TrueAudio.
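The leaked numbers line up with AMD's quoted figures: GDDR5 moves four bits per pin per clock, so a 1250 MHz memory clock on a 512-bit bus works out to 320 GB/s, and 2816 stream processors each doing a fused multiply-add (two FLOPs) per cycle at the 1000 MHz turbo clock give about 5.6 TFLOPS. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the leaked R9 290X specs against
# AMD's quoted bandwidth and compute figures.

memory_clock_hz = 1250e6     # GDDR5 clock from the leak
bus_width_bits = 512         # memory bus width from the leak
stream_processors = 2816     # shader count from the leak
turbo_clock_hz = 1000e6      # turbo clock from the leak

# GDDR5 is quad-pumped: four transfers per clock per pin.
bandwidth = memory_clock_hz * 4 * (bus_width_bits / 8) / 1e9
print(f"Memory bandwidth: {bandwidth:.0f} GB/s")   # -> 320 GB/s

# One fused multiply-add (2 FLOPs) per stream processor per clock.
tflops = stream_processors * 2 * turbo_clock_hz / 1e12
print(f"Peak FP32 compute: {tflops:.2f} TFLOPS")   # -> 5.63 TFLOPS
```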

Expect AMD to officially reveal the details surrounding this card in the coming weeks, and of course, look out for our reviews of all the new R9 and R7 series GPUs.


 
That is one sick looking GPU, both in terms of aesthetics and specs. AMD seems to be poised to take over the graphics market. I've been with team green for a really long time now (ever since the G80) because I've felt they always offered the best performers in the high-end segment. However, AMD really started to impress me when their 7xxx series launched (way ahead of Nvidia's Kepler) and laid waste to the green team's best Fermi chips, both in terms of performance and efficiency.

Now we have AMD silicon in both next-gen consoles, and this new R9 290X looks absolutely fantastic. It's not like Nvidia hasn't been on the ball either, as they have impressively squeezed a heck of a lot of performance out of their Kepler architecture. That said, AMD is making Nvidia look bad with their recent innovations. I wouldn't be surprised if Radeon GPUs are powering the majority of gaming PCs in 2014-2015.

I just wish I wasn't unemployed right now... If I wasn't, I would be seriously considering the 290X for my rig. As it stands, I have to stick with my GTX 670 for the time being.
 
I just wish I wasn't unemployed right now... If I wasn't, I would be seriously considering the 290X for my rig. As it stands, I have to stick with my GTX 670 for the time being.


In a way I wish I was unemployed, so that I would actually have time to make proper use of a GPU like this :)
 
I'm happy with my Gigabyte Radeon 7970 at 1.1GHz. What I really need is a CPU upgrade for my AMD Phenom 965 so I can make proper use of my video card. I'm on the job hunt too. :(
 
That is one sick looking GPU, both in terms of aesthetics and specs. AMD seems to be poised to take over the graphics market. I've been with team green for a really long time now (ever since the G80) because I've felt they always offered the best performers in the high-end segment. However, AMD really started to impress me when their 7xxx series launched (way ahead of Nvidia's Kepler) and laid waste to the green team's best Fermi chips, both in terms of performance and efficiency.

Now we have AMD silicon in both next-gen consoles, and this new R9 290X looks absolutely fantastic. It's not like Nvidia hasn't been on the ball either, as they have impressively squeezed a heck of a lot of performance out of their Kepler architecture. That said, AMD is making Nvidia look bad with their recent innovations. I wouldn't be surprised if Radeon GPUs are powering the majority of gaming PCs in 2014-2015.

I just wish I wasn't unemployed right now... If I wasn't, I would be seriously considering the 290X for my rig. As it stands, I have to stick with my GTX 670 for the time being.
Yeah, I was the same way. Before I purchased my HD 6990s, the only two AMD/ATI cards I had tried were the X1300 and the 9250. The X1300 was just a pain to get running, and I returned it fast and bought another Nvidia card; the 9250 was fine, but I felt it was a little meh in terms of performance.

I have had (that I can remember off the top of my head) an Nvidia GeForce 5200, 6200, 6700, 8600M, 9800, 460 SE, GTX 580s in SLI, and now two HD 6990s. All the cards I have owned in the past, minus the ATI cards, were excellent and I loved them. But when I had so much trouble getting the GTX 590s because of Newegg messing up my order and stock issues, I ended up buying the 6990s on eBay and never looked back. Honestly, I was sure I had made a mistake right up until I got Catalyst installed and started playing BF3. The 290X at this price point seems to be an excellent deal compared to the 780 and Titan. I really want either some dual-GPU variants or a trio of these cards, but since the 6990s seem to be happily chugging along in the BF4 beta, I don't feel as much of an urge to upgrade anymore.
 
GTX 670 is still a great card.

I've got a pair of 670s in SLI, and I'll be waiting for 20nm parts to drop in 2014. Whilst the 290X does strike me as a pretty awesome card, it's still only 28nm and not really anything special, IMO.

20nm is where the real deal is for me.
 
I wouldn't be surprised if Radeon GPUs are powering the majority of gaming PCs in 2014-2015.
I would be surprised, because that would require AMD not only to stop losing ground to Nvidia, but to make up a pretty hefty deficit. Steam's hardware survey shows roughly a 52/15/33 percent breakdown for Nvidia/Intel/AMD, so that would have to be a colossal turnaround.
 
GTX 670 is still a great card.

I've got a pair of 670s in SLI, and I'll be waiting for 20nm parts to drop in 2014. Whilst the 290X does strike me as a pretty awesome card, it's still only 28nm and not really anything special, IMO.

20nm is where the real deal is for me.

Are you on the lookout for a good video card or a smaller die size?
 
BF4 runs fine on max with a 680 at 1920x1200 with a single card. In fact, the map they picked for the beta is 90% gray, looks dull and washed out. I'm sure they will have better and prettier maps. It feels like BF3, but starting all over again. I love my guns in BF3; I feel like a noob in BF4.

I ran it at 2560 x 1600 with 680s in SLI and got nothing but flickering and lag.

The beta is not ready for my setup yet. 1200p looks ugly on my 30" Dell.
 
BF4 runs fine on max with a 680 at 1920x1200 with a single card. In fact, the map they picked for the beta is 90% gray, looks dull and washed out. I'm sure they will have better and prettier maps. It feels like BF3, but starting all over again. I love my guns in BF3; I feel like a noob in BF4.

I ran it at 2560 x 1600 with 680s in SLI and got nothing but flickering and lag.

The beta is not ready for my setup yet. 1200p looks ugly on my 30" Dell.


Huh? I'm running a pair of 4GB 670s in SLI and play at ultra settings in the beta, and that's at 2560 x 1600 on my Dell 3007WFP-HC 30" display. Looks sweet to me, though, yup, it looks a lot like BF3, with better detail. It could use some color.
 
However, AMD really started to impress me when their 7xxx series launched (way ahead of Nvidia's Kepler) and laid waste to the green team's best Fermi chips, both in terms of performance and efficiency.
This comment confuses me a little.

When the 7970 first released, it was slower than a 670 at 1080p and no faster at 1600p. It wasn't until months later, with new drivers, that it finally edged out a 680.
The 680's 256-bit bus makes it more efficient and power friendly, and as far as performance goes, the 680 and 7970 are neck and neck to this day. (The 780 destroys the 7970, so let's leave that comparison out.)

Other than that, I do agree with your statement. I love the 512-bit bus and the recent praise I have heard about AMD's new driver sets, but I don't see AMD taking over anything. They need to win several battles to win the war, and in the past five years they have lost too many times. Also, being the best GPU is about more than just being the fastest: AMD GPUs stuttered and skipped all over the place; even when they benchmarked faster, they felt slower.
 
However, AMD really started to impress me when their 7xxx series launched (way ahead of Nvidia's Kepler) and laid waste to the green team's best Fermi chips, both in terms of performance and efficiency.
This comment confuses me a little.

When the 7970 first released, it was slower than a 670 at 1080p and no faster at 1600p. It wasn't until months later, with new drivers, that it finally edged out a 680.
Your comment is even more confusing. The 7970 was released a few months before the 670/680, so the comparison at the time of the 7970's release would have been with the 580 (which the OP was referring to when he said Fermi chips...)
 
Priced at $699 it's still outside of my range. If it were priced at $500, or at the very most $550, I'd grab three of them, but I'm just not willing to spend $700 on each. Guess I'll drag these 680s around for a while longer.
 
Your comment is even more confusing. The 7970 was released a few months before the 670/680, so the comparison at the time of the 7970's release would have been with the 580 (which the OP was referring to when he said Fermi chips...)
You're right, I was referring to when the 670 released. My mistake.
 
$699 isn't that bad if it comes with BF4 and pwns that ripoff GTX 780 as expected. I was bracing myself for $799+ prices, so I'm quite happy about this. Well done, AMD.
 
AMD sucks!!! What a stupid design of the cooler and the card itself!!! I have been using ATI since the ATI 9700 Pro, but AMD has been going backwards ever since!!! The Nvidia Titan looks way better than the AMD 290X... AMD sucks because they dropped Windows Vista support from their driver!!!! Sapphire is great...
 
Well, it remains to be seen just how much Nvidia will cut their prices. If they cut them just enough to remain competitive I seriously think AMD is going to gain some ground in the enthusiast market.
 
Also, seeing as AMD silicon is powering both next-gen consoles, we may see upcoming games running better on Radeon GPUs. As I said earlier, AMD really does seem poised to take the lead in the graphics industry. It's exciting to think about, because they are definitely going to keep Nvidia on their toes, which in the end is great for us as it will drive down prices and force them to innovate as well.
 
I'm happy with my Gigabyte Radeon 7970 at 1.1GHz. What I really need is a CPU upgrade for my AMD Phenom 965 so I can make proper use of my video card. I'm on the job hunt too. :(
I recently replaced my Phenom II X4 965 with an FX-8350. It was an easy drop-in because I already had a 990FX motherboard. The difference is just amazing, although for now you'd be fine just overclocking the 965 to 4GHz. That would give you a very nice speed boost.
 
Also, seeing as AMD silicon is powering both next-gen consoles, we may see upcoming games running better on Radeon GPUs. As I said earlier, AMD really does seem poised to take the lead in the graphics industry. It's exciting to think about, because they are definitely going to keep Nvidia on their toes, which in the end is great for us as it will drive down prices and force them to innovate as well.
I just picked up the deal of the century. NCIX had the PowerColor PCS Radeon HD 7870 2GB on their site for $100 off, PLUS a $30 mail-in rebate, PLUS the ATI "Never Settle Silver" game bundle. I bought two of them because it seemed like an amazing deal. Well, as it turns out, they're not EXACTLY Radeon HD 7870s; they're the 7870 XT model with the Tahiti LE chips on them! They're even faster than HD 7870s! I really don't understand the nomenclature of these cards, because they really should be called "Radeon HD 7930" since they have the HD 79xx GPU.

At the end of the day, I'll have paid just over $300 for both of them, with four new games. I had a bit of trouble with the first two, as at least one of them was unstable in CrossFire, but I managed to get them seemingly working for about 20 minutes and scored P30045 in 3DMark Vantage. I get two new replacements on Tuesday and I'm all giddy to see what two stable cards will do! :D
 
Also, seeing as AMD silicon is powering both next-gen consoles, we may see upcoming games running better on Radeon GPUs.
Only the console ports, and only the console ports that take AMD's money. Game developers aren't noted for their choosiness about who throws money at them or who supplies the SDK (and code, where applicable).
The only likely real game changer is Mantle, and since it is a low-level (close to the metal) API, it won't be difficult for Intel (if they can be bothered) or Nvidia (who already have a low-level API in NVAPI) to implement a similar option. There's nothing to say that future games wouldn't offer multiple options to run Mantle, NVAPI (or similar), and DirectX/OGL.
As I said earlier, AMD really does seem poised to take the lead in the graphics industry.
AMD (and ATI before it) have been poised to take the lead in the (consumer) graphics industry for over a decade... OpenCL gaming and Get in the Game being two heavily touted strategies that fell flat on their collective faces. Mantle represents an attempt to be proactive in the industry, rather than AMD's historically more passive stance when it comes to implementing software for their hardware, which is a good strategic move. Just a couple of points to note, though:
1. AMD do not command the greater share of the PC gaming market, so don't expect game developers to shun the majority of gamers who use Intel iGP and Nvidia hardware. The added coding workload of including at least one more API (since DirectX and OGL still need inclusion) will translate into higher development and QA costs (guess who's ultimately footing that bill) and a longer lead-in time for games.
2. Microsoft won't sit idly by if they think that AMD are either undermining D3D, or causing OS issues when implementing Mantle - quite possibly memory/resource allocation conflicts, if other API experiences are translatable to Mantle.
It's exciting to think about, because they are definitely going to keep Nvidia on their toes,
Very definitely
which in the end is great for us as it will drive down prices and force them to innovate as well.
That's not how it works. Increased R&D, and ever more expensive and smaller process nodes, mean the end user pays more. The next round of GPUs will be on 20nm (with higher wafer costs), use new (read: expensive) GDDR6, and likely incorporate a discrete 64-bit RISC CPU - at least at the sharp end of the product stack - to facilitate better parallelization of workloads... and I wouldn't expect game devs (especially EA) or AMD to absorb the increased costs of incorporating Mantle into game engines.
 
Well, it's already been stated that Nvidia is lowering their prices when the R9 series launches, so this is what I was referring to. Of course the newest hardware is always going to be expensive; this is a given. But prices won't remain stagnant for long if each camp is forced to remain competitive with the other. This is good for us.

Mantle is another reason I think AMD is poised to make a big difference in the near future. DX has a lot of overhead, and it's probably the main reason we suffer from performance issues in the latest console ports. I hope Mantle lives up to its promise. It all depends on how easily the API can be implemented and whether it can truly give developers the low-level access they need for "direct to metal"-like performance. From what I remember, Mantle is an open API, so Nvidia and Intel can actually make use of it. Of course, I'm sure AMD will do everything they can to ensure that games using Mantle run best on their hardware.
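To make the multi-API point concrete, here's a rough sketch of how an engine might hide the choice of backend (Mantle, a vendor API like NVAPI, or DirectX/OGL) behind one interface and pick a path at startup. The class names and detection logic are invented for illustration; real engines do this in C++ against the actual driver APIs:

```python
# Illustration only: selecting among multiple rendering backends at
# startup. The class names and detection logic are invented; real
# engines implement this in C++ against the actual APIs.

class RenderBackend:
    name = "base"

    def draw_frame(self, scene: str) -> None:
        raise NotImplementedError

class MantleBackend(RenderBackend):
    name = "Mantle"

    def draw_frame(self, scene: str) -> None:
        print(f"[{self.name}] low-overhead draw of {scene}")

class DirectXBackend(RenderBackend):
    name = "DirectX"

    def draw_frame(self, scene: str) -> None:
        print(f"[{self.name}] portable fallback draw of {scene}")

def pick_backend(gpu_vendor: str) -> RenderBackend:
    # Prefer the vendor's low-level path when available; otherwise
    # fall back to the API every GPU supports.
    if gpu_vendor == "AMD":
        return MantleBackend()
    return DirectXBackend()

backend = pick_backend("AMD")
backend.draw_frame("test_scene")  # -> [Mantle] low-overhead draw of test_scene
```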
 