TechSpot

AMD slashes pricing on Radeon R9 290, 290X cards amid limited GeForce supply

By Shawn Knight
Oct 6, 2014
  1. Nvidia introduced its GeForce GTX 970 and 980 graphics cards last month and as we've seen time and time again, the competition answers the call with a fresh batch of price cuts. Such is the case with AMD as the...

  2. EEatGDL

    EEatGDL TS Maniac Posts: 481   +159

    I love competition, even though I wouldn't buy an R9 290 or 290X instead of a GTX 970, simply because of the operating temperatures and noise. Power consumption also factors in: we're trying to cut down power use at home because we're paying $1,500 USD a year for merely 10 kWh a day on average (rough arithmetic sketched below). Electricity is very expensive in our area; most of my relatives in the same city pay about half of what we pay for the same average consumption.
     
    Burty117 likes this.
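    A minimal sketch of the arithmetic behind the bill quoted above, using only the poster's own figures ($1,500 a year, roughly 10 kWh a day):

    ```python
    # Back-of-the-envelope check of the quoted power bill.
    # $1,500/year for ~10 kWh/day implies an effective rate of about $0.41/kWh,
    # which is several times a typical US residential rate.
    ANNUAL_BILL = 1500   # USD per year, as quoted
    DAILY_USE = 10       # kWh per day, as quoted

    annual_kwh = DAILY_USE * 365
    rate = ANNUAL_BILL / annual_kwh
    print(f"{annual_kwh} kWh/year -> effective rate of about ${rate:.2f}/kWh")
    ```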
  3. amstech

    amstech TechSpot Enthusiast Posts: 1,457   +606

    Ohhh, here come the AMD stormtroopers. Be careful when you speak the truth about AMD; you may face pages of useless banter and blissfully insane brand loyalty.
     
    Burty117 and Blakey like this.
  4. It's time to build a time machine at last.
     
  5. VitalyT

    VitalyT Russ-Puss Posts: 3,156   +1,431

    Rumour has it that AMD is planning to be first, in Q1 2015, to introduce cards that feature HDMI 2.0 and DisplayPort 1.3. It is a shame that nVidia didn't play ball with the 9xx series when it could have; it couldn't be bothered, playing right into its competitor's hand.

    Stagnant competition is very unengaging to watch. Monitor manufacturers are holding off on 4K and 5K offerings because they badly need DisplayPort 1.3; nobody wants the awkwardness of Dell's 5K panel driven by 2 x DP 1.2a.
     
    Last edited: Oct 6, 2014
  6. ^^ Haha, true. Why would people buy an old-architecture AMD 2xx series card when the newer, more efficient Nvidia card is better, faster and COOLER? Truth be told, many people are selling AMD cards on the buy-and-sell forums/sites just to get Nvidia's Maxwell. Nvidia's resale value is also better than AMD's. I just hope AMD can catch up with Intel and Nvidia in performance per watt on both the CPU and GPU sides to bring prices down further. We love competition.

    PS. I am not an Nvidia fanboi. I just prefer them because I love their software/drivers and efficiency. I am using a GTX 770 with the Titan cooler and recently bought an R7 250 for online games on an old PC.
     
  7. OK, so with Nvidia you will save about $500 on the electricity bill?
    Are you able to calculate the difference, mate?

    Here's how:
    The wattage difference is about 155 W. If the card is used 365 days a year (4 hours daily at full power) at 15 cents per kWh, the result is about $33 a year.
    For a more moderate user (180 days per year, 2 hours per day), the difference should be less than $10 per year.
    Remember, the rated wattage assumes 100% GPU usage, so these estimates are if anything on the high side. The same arithmetic is sketched below.
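    A quick sketch of the calculation above, taking the poster's assumptions (a 155 W full-load gap and 15 cents per kWh) at face value:

    ```python
    # Extra yearly electricity cost implied by a wattage gap between two cards.
    # Both the 155 W gap and the 15 c/kWh rate are the poster's assumptions.
    WATT_DIFFERENCE = 155   # W at full GPU load (assumed)
    RATE = 0.15             # USD per kWh (assumed)

    def yearly_cost(watts, hours_per_day, days_per_year, rate=RATE):
        """Extra electricity cost per year for a given usage pattern."""
        return watts / 1000 * hours_per_day * days_per_year * rate

    print(f"Heavy use (4 h/day, 365 days):    ${yearly_cost(WATT_DIFFERENCE, 4, 365):.2f}")
    print(f"Moderate use (2 h/day, 180 days): ${yearly_cost(WATT_DIFFERENCE, 2, 180):.2f}")
    ```

    That works out to roughly $34 a year for the heavy case and about $8 for the moderate one, in line with the figures above.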
     
  8. yRaz

    yRaz TS Evangelist Posts: 1,906   +954

    The people who need those features are so few and far between it is ridiculous. The only people who will need them are professionals, and I believe they are available on both teams' workstation cards.

    I can see why some people would want the new display port and hdmi, but these are gaming cards that cannot perform at the resolutions that the new standards allow. By the time 4k is a common household standard these cards will already be obsolete.
     
    TheLastPanda likes this.
  9. @guest calculating electric bill

    Your calculation is flawed. Not everyone lives in the US; most of the world pays more for electricity. You need to get out of the US to see the difference.

    With AMD, in some parts of the globe you need air conditioning to combat the heat produced by power-hungry, inefficient graphics cards, so add that to the cost.

    Efficiency is the future.
     
  10. VitalyT

    VitalyT Russ-Puss Posts: 3,156   +1,431

    AMD's cards are unmatched when it comes to OpenCL implementation and general computing power. That's why their sales soared on account of virtual currency mining. nVidia is only better for gamers.
     
  11. Sniped_Ash

    Sniped_Ash TS Maniac Posts: 253   +108

    They really need to knock down the 290x price more than that. Why would anyone pay more for a card that is at best as good as a 970 while being louder and more power hungry?
     
  12. cmbjive

    cmbjive TS Booster Posts: 777   +137

    We've come a long way from the BitCoin pricing fiasco for these ATI cards.
     
  13. Burty117

    Burty117 TechSpot Chancellor Posts: 2,920   +687

    Not to be "that guy" but I'm pretty sure the GTX 970 and 980 are HDMI 2.0.
    It's only the newer display port that is missing.
     
    TheLastPanda and dividebyzero like this.
  14. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    The DisplayPort 1.3 specification was ratified on September 15th, 2014, three days before the GTX 970/980 launch. Bearing in mind the gap between a board's launch and when its specification is finalized for manufacture, Nvidia would have had to seriously jump the gun on the spec. The last time Nvidia did that was when it announced Penryn (45nm) support for its 650/680i/680SLI chipsets just before the CPU spec was finalized; those chipsets never supported Yorkfield quads, and the resulting furore caused a considerable amount of bad publicity and hasty specification rewrites for motherboard vendors.
    Another point may be that VESA was developing the 1.2a spec at the same time. If 1.3 also supports Adaptive-Sync, then Nvidia would likely hold back so that its AIBs could shift inventory of G-Sync monitors (Asus's ROG Swift, for example).
    Untrue. Nvidia boards are massively favoured for CG rendering, thanks to CUDA being much better supported than OpenCL by rendering engines.
    The common misconception, and one you seem to be a party to, is that Nvidia cards lack computational power because of their diminished OpenCL ability. That ability is largely an artificial limit imposed by Nvidia to elevate and maintain CUDA adoption. Tech sites prefer an apples-to-apples comparison, so they use OpenCL benchmarks to compare AMD and Nvidia cards; what they don't generally do is measure AMD's OpenCL against Nvidia's CUDA where both are available.
    Hashing is only a subset of compute functionality, so I wouldn't make a blanket statement based on a single facet of performance.
    They do indeed support HDMI 2.0.
     
    Last edited: Oct 6, 2014
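    A minimal sketch of the apples-to-apples point above, assuming the third-party pyopencl package and at least one vendor's OpenCL driver are installed. It only lists the OpenCL devices each vendor exposes; both AMD and Nvidia GPUs show up under OpenCL, while CUDA remains Nvidia-only, so an OpenCL score alone says little about a card's total compute capability:

    ```python
    # Enumerate every OpenCL platform and device visible on this machine.
    # Requires the third-party "pyopencl" package and a working OpenCL driver.
    import pyopencl as cl

    for platform in cl.get_platforms():
        print(f"Platform: {platform.name} (vendor: {platform.vendor})")
        for device in platform.get_devices():
            print(f"  Device: {device.name}")
            print(f"    Compute units : {device.max_compute_units}")
            print(f"    Max clock     : {device.max_clock_frequency} MHz")
            print(f"    Global memory : {device.global_mem_size // (1024 ** 2)} MiB")
    ```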
  15. @ VitalyT,

    Man, I don't know where you get your facts. Your immature comments here on TechSpot show you're just a fanboy. I've been here since long before I saw your first comment posted. Learn to back up your unnecessary comments with facts.

    Nvidia is the king of computing as of today. You should know that gaming is just a small fraction of Nvidia's business. As for why AMD was favored during the mining frenzy: their cards were cheap compared with buying a Titan/780 Ti or any Nvidia card at the time. Price/performance was the critical reason AMD video cards were preferred by miners like me. I bet you don't have any mining experience, so stop what you're doing.

    Now that the Maxwell cards are released, just sit back and admire your AMD card while everyone else is getting an Nvidia card. Reviews by technical sites (not the blogs you read every day) favor the Nvidia 970/980 as the best cards to get today.
     
  16. @ VitalyT,

    Get your facts straight before posting. Gaming is just a portion of Nvidia's portfolio; Nvidia makes its money in computing.
     
  17. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,631   +432

    That was very short-lived. ASICs do more than any OpenCL AMD Radeon card could do.
     
  18. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    And the flood of cheap ex-mining cards entering the resell market is probably the largest factor in AMD needing to drop prices. The GTX 970 (and the soon to be announced GTX 960 Ti) just put AMD between a rock and a hard place coming into the spending free-for-all that is the Christmas holiday season. Anyone looking at a new card would likely be swayed by performance and a few new features- which Maxwell brings. Anyone looking at AMD cards specifically has a whole second-hand market to choose from...a resell market whose prices should drop accordingly as new prices go into freefall. Good news for the consumer, but a regular horror show for AMD's bean counters.
     
    cliffordcooley likes this.
  19. captaincranky

    captaincranky TechSpot Addict Posts: 11,706   +1,887

    Wow, that really sucks! It sounds like a residual effect from the Imron scandal.
     
  20. captaincranky

    captaincranky TechSpot Addict Posts: 11,706   +1,887

    I had thought that DisplayPort was a royalty-free (or nearly so) interface, while HDMI use raked in the big bucks for somebody. No?

    That being the reason DisplayPort was being pushed, as it were.
     
  21. VitalyT

    VitalyT Russ-Puss Posts: 3,156   +1,431

    I take it it's the same poster, trying hard to sound convincing. It is not as convincing as the Data Mining Performance Index. Read it and shiver. AMD R9 is miles ahead of anything that nVidia can offer.

    Read this if you can't understand the performance index: http://www.extremetech.com/computing/153467-amd-destroys-nvidia-bitcoin-mining/2

    Read this, if you can understand the performance index: https://litecoin.info/Mining_Hardware_Comparison
     
    Last edited: Oct 6, 2014
  22. Lionvibez

    Lionvibez TS Evangelist Posts: 1,104   +346

    And what's the price of an ASIC again?
     
    VitalyT likes this.
  23. Lionvibez

    Lionvibez TS Evangelist Posts: 1,104   +346

    I agree it could be another $50 cheaper.

    Do you guys really use stock coolers still?

    I haven't had an issue with heat or noise in years, which is why all my GPUs get aftermarket coolers.

    The last card I owned that didn't have one would have been way back with a Radeon 8500!!!

    My current Radeon 7970 GHz has a G10 and a Corsair H55 on it.

    55°C at full load and super quiet.
     
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Well, since you're using ExtremeTech's numbers, why wouldn't you use a more recent example?
    [chart: more recent mining benchmark results]

    Mining is about performance per watt (a rough sketch follows below)... and as Guest and I alluded to, compute covers a fairly wide range of applications. From JPR:
    Nvidia commands an even greater share of the HPC sector in relation to AMD.
    Supercomputers and render farms aren't generally associated with gaming systems.
    Didn't think they were too bad considering up front cost versus power cost. They'll certainly leave any GPU in the dust. Even the earlier (and much cheaper) Blizzard and Gridseed G-Blade hold up well against GPGPU solutions.
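    A rough sketch of the performance-per-watt point, comparing a GPU and a scrypt ASIC on hash rate per watt and daily electricity cost. The hash rates, power draws and electricity rate below are illustrative placeholders, not measured numbers for any specific product:

    ```python
    # Compare mining efficiency (kH/s per watt) and daily power cost.
    # All figures are placeholder assumptions for illustration only.
    ELECTRICITY_RATE = 0.15  # USD per kWh (assumed)

    def daily_power_cost(watts, hours=24, rate=ELECTRICITY_RATE):
        """Electricity cost of running a device around the clock."""
        return watts / 1000 * hours * rate

    miners = {
        # name: (hash rate in kH/s, power draw in W) -- placeholder values
        "GPU (illustrative)":         (900, 300),
        "Scrypt ASIC (illustrative)": (5000, 100),
    }

    for name, (khs, watts) in miners.items():
        efficiency = khs / watts          # kH/s per watt
        cost = daily_power_cost(watts)    # USD per day
        print(f"{name}: {efficiency:.1f} kH/s per W, ${cost:.2f} electricity/day")
    ```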
    Well, a few thoughts on that...
    1. An aftermarket cooler adds to the cost of whatever graphics solution you choose.
    2. Depending upon the AIB, removal of the stock cooler could void the warranty.
    3. With aftermarket air coolers, things become somewhat more problematic in multi-GPU setups.
     
    Last edited: Oct 6, 2014
    cliffordcooley likes this.
  25. gobbybobby

    gobbybobby TS Guru Posts: 548   +8

    Six months ago I'd have recommended anyone on a budget get an AMD card over an Nvidia one. I have an R9 270X myself, got it six months ago, and I'm pretty happy with it. It's not that noisy under load, and if it is under load I'm doing something intensive like playing a game, where I probably won't notice anyway. I've overclocked mine a little and temps never go above 70°C running demanding games. I get bills covered in my rent, so I don't care about power consumption. But today I'd say to anyone building on a budget: get the Nvidia 970, good value and it should last a few years. I want one, but I can't justify buying a new GFX card seeing as I just got the R9 270X! (Plus I want to upgrade my lacklustre AMD FX-6300 processor first.)
     
