AMD slashes pricing on Radeon R9 290, 290X cards amid limited GeForce supply

And the flood of cheap ex-mining cards entering the resell market is probably the largest factor in AMD needing to drop prices.

Considering the timing, it's the GTX 900 series that has them readjusting, just like any other year when one side is down.

Also, the reviews shout louder than some used cards from months ago.
 
Maybe, but the arrival of the GTX 970/980 doesn't account for AMD's two price cuts in quick succession before the Maxwell cards launched, when the GTX 970 was rumoured to land at $400+.

AMD's discrete graphics board shipments dropped by 10.7% during Q2, and I'm betting Q3 looks even worse. Q4? Well, that's a two-front war.
 
New world record for a 3DMark Extreme Fire Strike on a single GPU.

Tom's Hardware said:
Elmor's test system consists of an Asus GTX 980 DirectCU II (which Asus now markets as a Strix card), an Asus Rampage V Extreme motherboard, the Intel Core i7-5960X processor and 16 GB of DDR4 memory made by G.Skill, all of which are powered by a V1200 PSU from Cooler Master.

Elmor overclocked the graphics card to a staggering 2208 MHz, with an effective memory speed of 8.4 GHz. Elmor also managed to push the Core i7-5960X CPU to 5.586 GHz, which together delivered a 3DMark Fire Strike Extreme score of 9,568 marks, which is a single-GPU world record.
http://www.tomshardware.com/news/elmor-overclock-gtx980-2.2ghz,27827.html

Edit:
Already beaten? Lol.
http://www.overclock.net/t/1517387/...cord-3dmark-extreme-fire-strike#post_22963382
 
Wow, that really sucks! It sounds like a residual effect of the Enron scandal.

For captain and others replying to my post: I live in Mexico, and by any standard (even here) that's a lot of money in electricity bills. We disconnect devices, turn off unused lights, and save as much energy as we can, and yet we pay an awful lot. The PC is not to blame, since it is mostly used on vacation; my home is where I go to sleep after a full day of working and studying, and I don't even play on weekends, so I can keep up with my tasks.
So yeah, bad luck living in the zone we live in.
 
New world record for a 3DMark Extreme Fire Strike on a single GPU.
Not bad at all. Shades my stock CPU and GTX 780s quite well. I couldn't see the result causing a mad rush to grab GTX 980s, though. Most people in the market for a benchmarking card know that the 980 can easily push past 1500MHz on air. The only real drawback seems to be waiting for custom BIOSes with higher voltage limits to become more widely available.
 
Well, a few thoughts on that...
1. An aftermarket cooler adds to the cost of the graphics solution you choose.
2. Depending upon the AIB, removal of the stock cooler could void the warranty.
3. Aftermarket air cooling becomes somewhat more problematic with multi-GPU setups.

All good points.

1. The cost is worth it for the performance and acoustics over stock solutions.
2. Out of all the GPUs I've owned, I've only had to RMA one, and that was before I started replacing the coolers; I haven't had an issue since, so voiding the warranty isn't a problem for me. Heck, even putting better TIM on the GPU while keeping the stock cooler will void it.
3. I've always been a single-GPU guy, so no issue there for me.

Of course these are my reasons and for other people it will vary.
 
"Man, I dont know where do you get your facts. Your just a fanboy by your immature comments here in techspot. Ive been here long before I saw your first comment posted. Sometimes, learn to back up your unncessary comments with facts..."

LoL, look who's talking. How about you stop your own immature comments that sound like a fanboy's? Just sit back and enjoy whichever GPU you fancy. As you said, I can say the same thing to you: learn to back up your unnecessary comments with facts, or just stop talking rubbish and sit back calmly.

-another guest-
 
I don't need new cards but I'd love to get an 8GB version of the GTX 980 (because we know it's coming) just to say "My graphics card has 8GB of RAM!"
 
@VitalyT, I bet Nvidia has an answer up their sleeve. So as soon as AMD has released their R9 380/390 and they are validated by various tech websites, Nvidia will come up with an answer. ;)
 
^^ haha. True. Why would people buy an AMD 2xx-series card based on an old architecture when the newer Nvidia card is more efficient, faster and COOLER? Truth be told, many people are selling AMD cards on buy-and-sell forums/sites just to get the Nvidia Maxwell cards, and the resale value of Nvidia cards is better than AMD's. I just hope AMD can catch up with Intel and Nvidia in performance per watt in both CPUs and GPUs to bring prices down further. We love competition.

PS. I am not an Nvidia fanboi. I just prefer them because I love their software/drivers and efficiency. I am using a GTX 770 with the Titan cooler and recently bought an R7 250 for online games on an old PC.

AMD's cards are unmatched when it comes to OpenCL implementation and general-purpose compute power. That's why their sales soared on the back of virtual currency mining. Nvidia is only better for gamers.
Yeah, but methinks there are a helluva lot more gamers than currency miners, and now that that little fad is dying down... AMD is probably sitting on an overflowing inventory of R9 cards.
 
OK, so with Nvidia you will save about $500 on your electricity bill?
Are you able to calculate the difference, mate?

Here's how:
The wattage difference is 155W. If the card is used 365 days a year (4 hours daily at full power) at 15 cents per kWh, the result is about $33.
For a less intense user (180 days per year, 2 hours per day), the difference should be under $10 per year.
Remember, the rated wattage is at 100% GPU usage, so if anything my figures are overestimates.
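The arithmetic above can be sketched in a few lines of Python. Note the inputs (155W power delta, $0.15/kWh, the two usage patterns) are just the figures assumed in this post, not measured values:

```python
def annual_cost_delta(watt_delta, hours_per_day, days_per_year, price_per_kwh=0.15):
    """Extra electricity cost per year for a card drawing `watt_delta` more watts."""
    kwh = watt_delta / 1000 * hours_per_day * days_per_year  # watt-hours -> kWh
    return kwh * price_per_kwh

# Heavy user: 4 h/day at full load, every day of the year.
heavy = annual_cost_delta(155, 4, 365)  # ~$33.95

# Lighter user: 2 h/day, 180 days a year.
light = annual_cost_delta(155, 2, 180)  # ~$8.37

print(f"heavy user: ${heavy:.2f}/yr, light user: ${light:.2f}/yr")
```

Either way, it's a long way from $500 a year.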

That's roughly what you might save a year on electricity. But if you live in a hotter climate, as I do, then you also have to factor in air conditioning, so I'd imagine the operating cost of a high-end AMD card to be much higher.

I've used both companies over the course of 20 years, and currently I prefer Nvidia over AMD, mainly for the power efficiency and better drivers.

With that said, competition is a great thing; it's extremely helpful for consumers, keeping prices lower and forcing the two companies to innovate. Sadly, we no longer see this to any large degree in the x86 CPU market.
 