AMD slashes pricing on Radeon R9 290, 290X cards amid limited GeForce supply

Shawn Knight


Nvidia introduced its GeForce GTX 970 and 980 graphics cards last month, and as we've seen time and time again, the competition has answered the call with a fresh batch of price cuts. Such is the case with AMD, as the chip maker has slashed the prices of its Radeon R9 290 and 290X to new lows.

The Radeon R9 290 has been cut from $399 down to $299, while its bigger brother, the R9 290X, can be had for just $399, down from its previous $549 list price. Elsewhere, the R9 285 and R9 280X are now priced around $229 and $269, respectively.


The price cuts haven't gone into effect at all vendors just yet, although major outlets like Newegg have already adjusted the prices on some of their offerings accordingly. Most of the base models at Newegg have been cut, while some of the aftermarket offerings that are overclocked or feature special cooling remain a bit more expensive.

As The Tech Report highlights, the price cuts are particularly timely when you consider that Nvidia's new cards appear to be in short supply. For example, Newegg carries multiple GTX 970 cards priced at $329, but all of them are currently out of stock. The same can be said of the GTX 980, which retails for $549, while the cards at Amazon are either out of stock or heavily marked up. Tiger Direct, meanwhile, doesn't even have the GTX 980 listed.


 
I love competition, even though I wouldn't buy an R9 290 or 290X instead of a GTX 970, simply because of the operating temperatures and noise. Power consumption also factors in; we're trying to cut down power consumption at home because we're paying $1,500 USD a year for using merely 10 kWh a day on average [the power bill is very high in the area we live in; most of my relatives in the same city pay as little as half what we pay for the same average consumption].
 
I love competition, even though I wouldn't buy an R9 290 or 290X instead of a GTX 970, simply because of the operating temperatures and noise. Power consumption also factors in; we're trying to cut down power consumption at home because we're paying $1,500 USD a year for using merely 10 kWh a day on average [the power bill is very high in the area we live in; most of my relatives in the same city pay as little as half what we pay for the same average consumption].
Ohhh, here come the AMD stormtroopers. Be careful when you speak the truth about AMD; you may face pages of useless banter and blissfully insane brand loyalty.
 
Rumour has it AMD is planning to be first, in Q1 2015, to introduce cards that feature HDMI 2.0 and DisplayPort 1.3. It is a shame that nVidia didn't play ball with the 9xx series when it could, and yet couldn't be bothered, playing right into the competitor's hand.

Stagnant competition is very unengaging to watch. Monitor manufacturers are holding off on 4K and 5K offerings because they badly need DisplayPort 1.3; nobody wants the awkwardness of Dell's 5K display requiring 2 x DP 1.2a.
 
^^ Haha, true. Why would people buy AMD's 2xx series, based on an older architecture, when the newer Nvidia card is more efficient, faster and COOLER? Truth be told, many people are selling AMD cards on buy-and-sell forums/sites just to get Nvidia's Maxwell. Also, Nvidia cards hold their resale value better than AMD's. I just hope AMD can catch up with Intel and Nvidia in performance per watt in CPUs and GPUs to bring prices down further. We love competition.

PS. I am not an Nvidia fanboi. I just prefer them because I love their software/drivers and efficiency. I am using a GTX 770 with the Titan cooler and recently bought an R7 250 for online games on an old PC.
 
OK, so with Nvidia you will save about $500 on your electricity bill? Are you able to calculate the difference, mate?

Here's how: the wattage difference is 155 W. If the card is used 365 days a year (4 hours daily at full power) at 15 cents per kWh, the result is about $33. It should be less than a $10 difference per year for a less intensive user (180 days per year, 2 hours per day). Remember, the rated wattage is at 100% GPU usage, so my figures are, if anything, overestimates.
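
For anyone who wants to verify that arithmetic, here is a minimal sketch in Python; the 155 W delta and the $0.15/kWh rate are the poster's assumptions, not measured figures:

def annual_cost_difference(watt_delta, days_per_year, hours_per_day, usd_per_kwh):
    # Extra electricity cost per year from a higher-wattage card.
    kwh = watt_delta / 1000 * days_per_year * hours_per_day
    return kwh * usd_per_kwh

# Heavy user: full power 4 hours a day, every day of the year.
print(annual_cost_difference(155, 365, 4, 0.15))  # ~$33.95
# Lighter user: 2 hours a day, 180 days a year.
print(annual_cost_difference(155, 180, 2, 0.15))  # ~$8.37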
 
Rumour has it AMD is planning to be first, in Q1 2015, to introduce cards that feature HDMI 2.0 and DisplayPort 1.3. It is a shame that nVidia didn't use its chance with the 9xx series, which it could have, and yet couldn't be bothered, playing right into the competitor's hand.
The people who need those features are so few and far between it is ridiculous. The only people who will need them are professionals, and I believe those features are available on both teams' workstation cards.

I can see why some people would want the new DisplayPort and HDMI, but these are gaming cards that cannot perform at the resolutions the new standards allow. By the time 4K is a common household standard, these cards will already be obsolete.
 
@guest calculating the electric bill

Your calculation is flawed. Not everyone lives in the US; most of the world pays more for electricity. You need to get out of the US to see the difference.

With AMD, in some parts of the globe you need air conditioning to combat the heat produced by power-hungry, inefficient graphics cards, so add that to the cost.

Efficiency is the future.
 
^^ Haha, true. Why would people buy AMD's 2xx series, based on an older architecture, when the newer Nvidia card is more efficient, faster and COOLER? Truth be told, many people are selling AMD cards on buy-and-sell forums/sites just to get Nvidia's Maxwell. Also, Nvidia cards hold their resale value better than AMD's. I just hope AMD can catch up with Intel and Nvidia in performance per watt in CPUs and GPUs to bring prices down further. We love competition.

PS. I am not an Nvidia fanboi. I just prefer them because I love their software/drivers and efficiency. I am using a GTX 770 with the Titan cooler and recently bought an R7 250 for online games on an old PC.

AMD's cards are unmatched when it comes to OpenCL implementation and general computing power. That's why their sales soared on account of virtual currency mining. nVidia is only better for gamers.
 
They really need to knock down the 290X price more than that. Why would anyone pay more for a card that is, at best, as good as a 970 while being louder and more power hungry?
 
Rumour has it AMD is planning to be first, in Q1 2015, to introduce cards that feature HDMI 2.0 and DisplayPort 1.3. It is a shame that nVidia didn't use its chance with the 9xx series, which it could have, and yet couldn't be bothered, playing right into the competitor's hand.
The DisplayPort 1.3 specification was ratified on September 15th, 2014, three days before the GTX 970/980 launch. Bearing in mind the gap between a board's launch and its initial specification for manufacture, Nvidia would have had to seriously jump the gun on the specification. The last time Nvidia did that was when it announced Penryn (45nm) support for its 650i/680i SLI chipsets just before the CPU spec was finalized. Those chipsets never supported Yorkfield quads, and the resulting furore caused a considerable amount of bad publicity and hasty specification rewrites for mobo vendors.
Another point may be that VESA was developing the 1.2a spec at the same time. If 1.3 also supports Adaptive-Sync, then Nvidia would likely hold back in order for its AIBs to shift inventory of G-Sync monitors (Asus's ROG Swift, for example).
AMD's cards are unmatched when it comes to OpenCL implementation and general computing power. That's why their sales soared on account of virtual currency mining. nVidia is only better for gamers.
Untrue. Nvidia boards are massively favoured for CG rendering thanks to CUDA being much better supported than OpenCL by rendering engines.
The common misconception, and one you seem to be a party to, is that Nvidia cards lack computational power because of diminished OpenCL ability. That ability is an artificial limit imposed by Nvidia, mainly to elevate and maintain CUDA adoption. Tech sites prefer an apples-to-apples comparison, so they use OpenCL benchmarks to compare AMD and Nvidia cards; what they generally don't do is measure AMD's OpenCL against Nvidia's CUDA where both are available.
Hashing is only a subset of compute functionality, so I wouldn't make a blanket statement based on a single facet of performance.
Not to be "that guy", but I'm pretty sure the GTX 970 and 980 are HDMI 2.0.
It's only the newer DisplayPort that is missing.
They do indeed support HDMI 2.0.
 
@VitalyT,

Man, I don't know where you get your facts. You're just a fanboy, judging by your immature comments here on TechSpot. I've been here since long before I saw your first comment posted. Learn to back up your unnecessary comments with facts sometimes.

Nvidia is the king of computing as of today. You should know that gaming is just a small fraction of Nvidia's business. And as for your comment about why AMD was favored in the mining frenzy: it's because their cards were cheap compared with buying a Titan/780 Ti or any other Nvidia card at the time. Price/performance was the critical reason AMD video cards were preferred by miners like me. I bet you don't have any mining experience, so stop what you're doing.

Now that the Maxwell cards are released, just sit back and fancy your AMD card while everyone else is getting an Nvidia card. Reviews by technical sites, not the blogs you read every day, favor the Nvidia 970/980 as the best cards to get today.
 
@VitalyT,

Get your facts straight before posting. Gaming is just a portion of Nvidia's portfolio; Nvidia makes its money in computing.
 
AMD's cards are unmatched when it comes to OpenCL implementation and general computing power. That's why their sales soared on account of virtual currency mining. nVidia is only better for gamers.

That was very short-lived. ASICs do more than any OpenCL AMD Radeon card ever could.
 
That was very short-lived. ASICs do more than any OpenCL AMD Radeon card ever could.
And the flood of cheap ex-mining cards entering the resale market is probably the largest factor in AMD needing to drop prices. The GTX 970 (and the soon-to-be-announced GTX 960 Ti) put AMD between a rock and a hard place coming into the spending free-for-all that is the Christmas holiday season. Anyone looking at a new card would likely be swayed by performance and a few new features, which Maxwell brings. Anyone looking at AMD cards specifically has a whole second-hand market to choose from... a resale market whose prices should drop accordingly as new prices go into freefall. Good news for the consumer, but a regular horror show for AMD's bean counters.
 
I love competition, even though I wouldn't buy an R9 290 or 290X instead of a GTX 970, simply because of the operating temperatures and noise. Power consumption also factors in; we're trying to cut down power consumption at home because we're paying $1,500 USD a year for using merely 10 kWh a day on average [the power bill is very high in the area we live in; most of my relatives in the same city pay as little as half what we pay for the same average consumption].
Wow, that really sucks! It sounds like a residual effect from the Enron scandal.
 
Not to be "that guy", but I'm pretty sure the GTX 970 and 980 are HDMI 2.0.
It's only the newer DisplayPort that is missing.
I had thought that DisplayPort was a royalty-free (or nearly so) interface, while HDMI raked in the big bucks for somebody. No?

That being the reason DisplayPort was being pushed, as it were.
 
@VitalyT,

Get your facts straight before posting. Gaming is just a portion of Nvidia's portfolio; Nvidia makes its money in computing.

I take it it's the same poster, trying hard to sound convincing. It is not as convincing as the mining performance index. Read it and shiver: the AMD R9 is miles ahead of anything nVidia can offer.

Read this if you can't understand the performance index: http://www.extremetech.com/computing/153467-amd-destroys-nvidia-bitcoin-mining/2

Read this if you can understand the performance index: https://litecoin.info/Mining_Hardware_Comparison
 
They really need to knock down the 290X price more than that. Why would anyone pay more for a card that is, at best, as good as a 970 while being louder and more power hungry?

I agree; it could be another $50 cheaper.

Do you guys really still use stock coolers?

I haven't had an issue with heat and noise in years, which is why all my GPUs get aftermarket coolers.

The last card I owned that didn't have one would have been way back: a Radeon 8500!!!

My current Radeon 7970 GHz has a G10 and a Corsair H55 on it.

55°C at full load and super quiet.
 
I take it it's the same poster, trying hard to sound convincing. It is not as convincing as the mining performance index. Read it and shiver: the AMD R9 is miles ahead of anything nVidia can offer.

Read this if you can't understand the performance index: http://www.extremetech.com/computing/153467-amd-destroys-nvidia-bitcoin-mining/2
Well, since you're using ExtremeTech's numbers, why wouldn't you use a more recent example?
[Chart: LiteCoinEfficiency.png (Litecoin mining efficiency comparison)]

Mining is about performance per watt... and as Guest and I alluded to, compute covers a fairly wide range of applications. From JPR:
With 74% of units shipped bearing its Quadro brand, Nvidia still commands the lion's share of the market, but AMD took another few points of market share in the quarter to rise to 25.0%.
Nvidia commands an even greater share of the HPC sector relative to AMD:
Nvidia is the dominant player in the High Performance Computing (HPC) GPU Accelerator market where they currently command 85% market share
Supercomputers and render farms aren't generally associated with gaming systems.
And what's the price of an ASIC again?
I didn't think they were too bad considering up-front cost versus power cost. They'll certainly leave any GPU in the dust. Even the earlier (and much cheaper) Blizzard and Gridseed G-Blade hold up well against GPGPU solutions.
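
To make that up-front-cost-versus-power-cost trade-off concrete, here is a rough Python sketch; every number in it (prices, wattages, hash rates) is a hypothetical placeholder, not a measured benchmark:

def cost_per_mhs(upfront_usd, watts, hashrate_mhs, usd_per_kwh=0.15, days=365):
    # Up-front price plus electricity for 24/7 operation, divided by throughput.
    energy_cost = watts / 1000 * 24 * days * usd_per_kwh
    return (upfront_usd + energy_cost) / hashrate_mhs

# Hypothetical GPU: $300, 250 W, 0.9 MH/s scrypt throughput.
print(cost_per_mhs(300, 250, 0.9))    # ~$698 per MH/s over a year
# Hypothetical ASIC: $1,500, 1,000 W, 50 MH/s.
print(cost_per_mhs(1500, 1000, 50))   # ~$56 per MH/s (far cheaper per hash)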
I agree; it could be another $50 cheaper.
Do you guys really still use stock coolers?
I haven't had an issue with heat and noise in years, which is why all my GPUs get aftermarket coolers.
Well, a few thoughts on that...
1. An aftermarket cooler adds to the cost of the graphics solution you choose.
2. Depending upon the AIB, removal of the stock cooler could void the warranty.
3. Aftermarket air cooling becomes somewhat more problematic with multi-GPU setups.
 
Six months ago I'd have recommended anyone on a budget get an AMD card over an Nvidia one. I have an R9 270X myself, bought six months ago, and I'm pretty happy with it. It's not that noisy under load, and when it is under load I'm doing something intensive like playing a game, where I probably won't notice anyway. I've overclocked mine a little and temps never go above 70°C running demanding games. I get bills covered in my rent, so I don't care about power consumption. But today I'd tell anyone building on a budget to get the Nvidia 970: good value, and it should last a few years. I want one but can't justify buying a new GFX card seeing as I just got the R9 270X! (Plus I want to upgrade my lackluster AMD FX-6300 processor first.)
 