AMD reveals the Radeon R9 290X alongside new R7 and R9 GPU lines

September 25, 2013, 2:00 PM

For the last few days, AMD has been hosting their GPU14 Tech Day event in Hawaii, gearing tech media up for the reveal of their next-generation graphics card. We've seen a few leaks surrounding the card, known as the Radeon R9 290X, and AMD has been pushing the announcement as the next big thing for PC gamers. But not until today have we officially heard the specifics of the card and what it will mean for the gaming market.

First up we have photos of the actual card, as seen at Tuesday's reception aboard the USS Missouri, where EA DICE was on hand with Battlefield 4. The design is fairly standard as far as recent GPUs go, with AMD once again opting for a blower-style air cooler at the end of the card that pushes air across the GPU core itself. The front panel carries dual DVI ports, plus an HDMI port and a DisplayPort, and power comes from one 8-pin and one 6-pin PCIe connector.

Perhaps most interesting, though, is the lack of a CrossFire connector along the top of the card. Speculation at the event is that AMD will use PCI Express 3.0 for connectivity between two GPUs, ditching the proprietary bridge that previously carried the extra bandwidth. PCIe 3.0 is already found in many modern motherboards, and an x16 slot supports 128 GT/s (roughly 15.75 GB/s) of transfers in each direction, compared to 80 GT/s (8 GB/s) for PCIe 2.0. That extra bandwidth should be enough to push the necessary frame data between cards over the bus alone.
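
Those figures fall straight out of each generation's transfer rate and line encoding. As a quick sanity check, here's the arithmetic in a short script (a minimal sketch; the per-lane rates and encoding efficiencies are standard PCIe spec values, not anything AMD presented):

```python
# PCIe usable bandwidth per direction = transfer rate x encoding efficiency x lanes.
# PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding; PCIe 3.0 runs 8 GT/s
# per lane with the much leaner 128b/130b encoding.

def pcie_gb_per_s(gt_per_lane: float, efficiency: float, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s, per direction, for a PCIe link."""
    return gt_per_lane * efficiency * lanes / 8  # Gbit/s -> GB/s

print(f"PCIe 2.0 x16: {pcie_gb_per_s(5.0, 8 / 10):.2f} GB/s")     # 8.00 GB/s
print(f"PCIe 3.0 x16: {pcie_gb_per_s(8.0, 128 / 130):.2f} GB/s")  # 15.75 GB/s
```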

There's 4GB of memory aboard this card, 5 TFLOPS of compute power, 300 GB/s of memory bandwidth, and the ability to process 4 billion triangles per second. The GPU itself packs a whopping 6 billion transistors and is built to support 4K Ultra HD resolutions.
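
Those headline numbers hang together under the usual back-of-envelope math. The sketch below is illustrative only: AMD had not confirmed shader counts, clocks, or bus width at this point, so the 512-bit bus, 5 Gbps GDDR5, and 2560-shader/1GHz configuration are assumptions chosen simply because they land near the quoted figures:

```python
# Hypothetical inputs -- none of these were confirmed at announcement time.
BUS_WIDTH_BITS = 512        # assumed memory bus width
GDDR5_GBPS_PER_PIN = 5.0    # assumed effective per-pin data rate
SHADERS = 2560              # speculative shader count
CORE_CLOCK_GHZ = 1.0        # placeholder core clock

# Peak memory bandwidth: bus width in bytes times the per-pin data rate.
bandwidth_gbs = BUS_WIDTH_BITS / 8 * GDDR5_GBPS_PER_PIN  # 320 GB/s

# Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock.
tflops = SHADERS * CORE_CLOCK_GHZ * 2 / 1000             # 5.12 TFLOPS

print(f"{bandwidth_gbs:.0f} GB/s")  # comfortably covers the quoted 300 GB/s
print(f"{tflops:.2f} TFLOPS")       # in line with the quoted 5 TFLOPS
```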

The R9 290X will be available alongside Battlefield 4 in an exclusive bundle, although prices and a launch window have yet to be announced.

AMD also launched new R7 and R9 card lines, ranging from budget cards to enthusiast GPUs at many price points. The R7 250 is a sub-$89 budget card with 1GB of GDDR5 memory, the R7 260X is a 2GB card for $139, the R9 270X is a 2GB card for $199, and there will also be an R9 280X with 3GB of memory for $299. We also heard mention of an R9 290 later in the presentation, but it wasn't detailed.

To support 4K displays, AMD has proposed a new VESA standard that will be embedded in 4K displays, allowing them to work out of the box with various devices. It will be supported in the Catalyst Control Center, and should resolve issues with screen tearing at high resolutions.

AMD TrueAudio will be a major part of Graphics Core Next 2.0-based graphics cards, available in the R9 290X, R9 290 and R7 260X. AMD is calling the technology a revolution for audio designers and game developers, likening it to the introduction of programmable shaders. We listened to an "AstoundSound" demo from GenAudio at the GPU14 Tech Day, and it sounded absolutely incredible in 7.1 surround sound.

TrueAudio appears to be dedicated DSP technology on AMD's latest chips, which audio plugins can access to reduce GPU and CPU load. It was mentioned that certain plugins, such as AstoundSound, will be available to PC, Xbox One and PlayStation developers, enhancing how we hear in-game sounds. TrueAudio should help greatly with reverberation, normally a CPU- and memory-intensive task, as that work can be offloaded to the DSP.
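
To see why reverb is the poster child here, consider what convolution reverb actually costs. The snippet below is a generic illustration of the workload, not TrueAudio's API (which hasn't been published); it only shows the scale of the math a dedicated DSP would absorb:

```python
# Convolution reverb: convolve the dry signal with a room impulse response.
# Done naively in the time domain this costs ~N*M multiply-accumulates per
# channel; FFT-based convolution reduces that to O((N+M) log(N+M)), and a
# dedicated DSP removes the load from the CPU entirely.
import numpy as np
from scipy.signal import fftconvolve

RATE = 48_000
dry = np.random.randn(RATE)          # 1 second of mono audio
impulse = np.random.randn(2 * RATE)  # 2-second room impulse response

wet = fftconvolve(dry, impulse)      # the offloadable workload

naive_macs = dry.size * impulse.size
print(f"naive convolution: {naive_macs:,} MACs per second of audio per channel")
# -> 4,608,000,000 -- and 7.1 surround multiplies that by eight channels
```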

Also announced today at AMD's GPU14 Tech Day:
Revolutionary 'Mantle' API unveiled to optimize GPU performance




User Comments: 73

Razer said:

..let the (price) war begin..

JC713 said:

Can't wait for benchmarks. Leaked benchmarks showed that this GPU was faster than the Titan. Now we have to see the price.

1 person liked this | howzz1854 said:

Now they're just recycling all their old naming schemes. It seems like these two companies only know how to use two naming schemes: either three digits like 580/680/780, or four digits like 8800/7900/8900. One of these days somebody who knows nothing about computers is going to pick up an old piece of hardware for some ridiculous price off eBay because the naming is so similar. I can just see some poor guy buying a Radeon X780 thinking it's a GTX 780, or buying an 8800GT (almost 10 years old) thinking it's an HD 8850.

Scorpus (Staff) said:

Can't wait for benchmarks. Leaked benchmarks showed that this GPU was faster than the Titan. Now we have to see the price.

Leaked benchmarks are likely faked. Check back in October for the final benchmarks from the GPU we'll be receiving at the Tech Day here in Hawaii

JC713 said:

Can't wait for benchmarks. Leaked benchmarks showed that this GPU was faster than the Titan. Now we have to see the price.

Leaked benchmarks are likely faked. Check back in October for the final benchmarks from the GPU we'll be receiving at the Tech Day here in Hawaii

True. The benchmarks were in percentages, so that was fishy.

Now they're just recycling all their old naming schemes. It seems like these two companies only know how to use two naming schemes: either three digits like 580/680/780, or four digits like 8800/7900/8900. One of these days somebody who knows nothing about computers is going to pick up an old piece of hardware for some ridiculous price off eBay because the naming is so similar. I can just see some poor guy buying a Radeon X780 thinking it's a GTX 780, or buying an 8800GT (almost 10 years old) thinking it's an HD 8850.

Yeah I was thinking this also. Good point.

ikesmasher said:

Well, there's only so many number sequences one can use. I'd rather it be called a 290X than an HD 11750.

dividebyzero, trainee n00b, said:

Live stream of the event here:

http://www.pcper.com/live/

So far you haven't missed anything. One hour start delay, a rehash of TressFX, and a repeat of the unified gaming talk given a while ago

Guest said:

The name of the card should be based on its Crysis framerate at max settings at 4K resolution.

1 person liked this | howzz1854 said:

There's more to naming than just three- and four-digit schemes. How many cars do we have out there, and yet we identify them all. I'm not saying we start naming them Nvidia Panda or anything of that nature, but there are way more options than sticking to the same two and recycling them. You can either add a generation number after the model number, or use non-numerical names.

JC713 said:

I really like the look of the shroud. Is the R9 290 just a higher clocked 7970?

GhostRyder said:

Been waiting for this for a while; now we will see the competition start to stiffen up!

Scorpus (Staff) said:

I really like the look of the shroud. Is the R9 290 just a higher clocked 7970?

No details on this card yet at the event

1 person liked this | dividebyzero, trainee n00b, said:

No details on this card yet at the event

The R9 290 is the salvage part of Hawaii (Hawaii Pro; 2560 shaders seems a popular number based on the internal architecture).

The missing R8 series seems to be a straight-up rebrand of the 7970GE, with possibly a 20MHz clock bump (970 core, 1070 boost).

1 person liked this | hahahanoobs said:

Now they're just recycling all their old naming schemes. It seems like these two companies only know how to use two naming schemes: either three digits like 580/680/780, or four digits like 8800/7900/8900. One of these days somebody who knows nothing about computers is going to pick up an old piece of hardware for some ridiculous price off eBay because the naming is so similar. I can just see some poor guy buying a Radeon X780 thinking it's a GTX 780, or buying an 8800GT (almost 10 years old) thinking it's an HD 8850.

Intel CPUs with IGPs are more of a concern to the masses, and no one is writing articles about people being confused about what an i3, i5 and i7 are, so why would GPUs be the straw that breaks the camel's back? Anyone buying a discrete card for upwards of $150 knows what they are buying before they buy it. Period.

amstech, TechSpot Enthusiast, said:

I like the new naming direction and proposed specifications, and the cards look nice; can't wait to see them perform.

The 4K hookup is sweet.

1 person liked this | GhostRyder said:

I think I'm in love with that cooler design; the Titan/780 and R9 290X GPU coolers are the best we have seen in years. Though personally, if I grab some of those, I'm tearing it off for a waterblock :P.

howzz1854 said:

Now they're just recycling all their old naming schemes. It seems like these two companies only know how to use two naming schemes: either three digits like 580/680/780, or four digits like 8800/7900/8900. One of these days somebody who knows nothing about computers is going to pick up an old piece of hardware for some ridiculous price off eBay because the naming is so similar. I can just see some poor guy buying a Radeon X780 thinking it's a GTX 780, or buying an 8800GT (almost 10 years old) thinking it's an HD 8850.

Intel CPUs with IGPs are more of a concern to the masses, and no one is writing articles about people being confused about what an i3, i5 and i7 are, so why would GPUs be the straw that breaks the camel's back? Anyone buying a discrete card for upwards of $150 knows what they are buying before they buy it. Period.

If it makes you feel better, Intel's naming scheme is confusing too. Just think: Core 2 Duo (dual core), vs. Core Duo (dual core), vs. Core 2 Solo, which is not dual core but single core, vs. Core 2 Quad, which is a second-gen quad core... whew... hope I got that right. These guys really gotta learn how to name better.

And you don't need to be an expert to drop $150 on a video card. Anyone looking to replace an old Dell or a busted HP whose graphics card went up in smoke because of poor cooling will be looking at getting a card that's well beyond $150. Shit, I have my own customized water-cooling rig with a custom paint job, overclocked from 2.4GHz to 4GHz, and an HD 7970 clocked at 1.2GHz, and even I am confused sometimes.

1 person liked this | JC713 said:

I think I'm in love with that cooler design; the Titan/780 and R9 290X GPU coolers are the best we have seen in years. Though personally, if I grab some of those, I'm tearing it off for a waterblock :P.

Too bad it is a blower :P. The ones on the 7000 series were really loud.

1 person liked this | GhostRyder said:

Too bad it is a blower :p. The ones on the 7000 series were really loud.

Talk about a blower; I bet 3 of those cards would still be quieter than one of my HD 6990s. Talk about a jet taking off, lol; 2 of those were just horrible in noise.

howzz1854 said:

The dual-GPU Nvidia 9800GX2 I had was as loud as a window AC.

GhostRyder said:

The dual-GPU Nvidia 9800GX2 I had was as loud as a window AC.

You had one of those? No way. I've only ever met one person in real life who had one of those cards, and he has it sitting on his desk as we speak, lol.

howzz1854 said:

You had one of those? No way. I've only ever met one person in real life who had one of those cards, and he has it sitting on his desk as we speak, lol.

I did indeed. The thing was so loud and hot that you could cook a full meal in the case. I remember 120 Celsius was normal for that card during gaming. Too bad that thing didn't last. But that's not the biggest problem. The biggest problem was that SLI wasn't optimized for the games you played until a year after they were already out, so you would be playing most games at single-9800GT speed until proper optimization came out a year later anyway.

That was a waste of money.

Guest said:

The "OUR most powerful GPU ever" statement says a lot to me... Then, there's the FireStrike benchmark graph itself. The R9 290X is leveled up to a 7k score. Well, a GTX 780 already 8k.

Finally, Plus, if the R9 290 Series would be "killing" the TITAN - or the GTX 780 for that matter -, I don't doubt AMD would have said it quite clearly.

Burty117, TechSpot Chancellor, said:

The "OUR most powerful GPU ever" statement says a lot to me... Then, there's the FireStrike benchmark graph itself. The R9 290X is leveled up to a 7k score. Well, a GTX 780 already 8k.

Finally, Plus, if the R9 290 Series would be "killing" the TITAN - or the GTX 780 for that matter -, I don't doubt AMD would have said it quite clearly.

I was thinking the same thing. However, they don't state what settings they used. If it's 4K on extreme settings, those are awesome scores; if it's default 1080p, though, they're not very impressive.

DivideByZero, what do you think about these benchmark scores?

GhostRyder said:

The "OUR most powerful GPU ever" statement says a lot to me... Then, there's the FireStrike benchmark graph itself. The R9 290X is leveled up to a 7k score. Well, a GTX 780 already 8k.

Finally, Plus, if the R9 290 Series would be "killing" the TITAN - or the GTX 780 for that matter -, I don't doubt AMD would have said it quite clearly.

Preliminary leaked benchmarks show it besting the Titan in many games, but that all depends on how it's released, and that's a fully unlocked Hawaii GPU. Honestly, only time will tell.

I did indeed. The thing was so loud and hot that you could cook a full meal in the case. I remember 120 Celsius was normal for that card during gaming. Too bad that thing didn't last. But that's not the biggest problem. The biggest problem was that SLI wasn't optimized for the games you played until a year after they were already out, so you would be playing most games at single-9800GT speed until proper optimization came out a year later anyway.

That was a waste of money.

Hah, yeah, that was one of the first (if not the first; I can't remember off the top of my head if there was an 8000-series dual-GPU card) dual-GPU cards. His kept working, though he had to make some modifications to it to keep it cool.

Guest said:

The GTX 780 is better than the Titan in performance. Go to Nvidia's website and they have a relative performance graph of all their cards from the 200 series up to the 700 series. The 780 is the highest, a good chunk over the Titan. So if AMD's new card can hold with the 780, it will beat the Titan. And let's not forget that when the 7970 came out it was better than the GTX 580 and held up well against the 680... so the new card could very well smack down the 780.

3 people like this | dividebyzero, trainee n00b, said:

DivideByZero, what do you think about these benchmark scores?

Starting from the top down:

Using AMD's own slide, the 290X is pushing 8000 in 3DMark's Firestrike performance preset

While the GTX 780 (depending upon clock, once CPU and RAM are normalized) scores 8400-8700.

My guess is that under the Extreme preset, along with any GCN-optimized application, that gap will close, and there will certainly be instances where the 290X disposes of not just the 780 but the Titan as well. How many apps, and how many extreme corner cases (such as 8xSSAA + post-process) I couldn't say, but the 512-bit bus of the AMD card is certainly going to help push pixels in antialiasing or downsampling mode.

The rest of the lineup is pretty clear cut since they seem to be largely rebrands:

R7-250X's Firestrike score of ~2000 is pretty much the same as the HD 7750/7770

R7-260X's Firestrike score of ~3700 is exactly that of the HD 7790

R9-270X's Firestrike score of ~5500 is slightly above the HD 7870XT

R9-280X's Firestrike score of ~6800 is 7970GE (and GTX 770) territory; Tahiti by another name.

Looks as though Curacao (if that's what the tweaked Tahiti LE/Pitcairn facelift is called) and Bonaire will join the rebranded Tahiti parts, along with the unannounced R9-290 (Hawaii Pro), which looks to be the extremely interesting part if priced at $399, offering performance between the HD 7970GE and the 290X.

/My $0.02

dennis777 said:

There's more to naming than just three- and four-digit schemes. How many cars do we have out there, and yet we identify them all. I'm not saying we start naming them Nvidia Panda or anything of that nature, but there are way more options than sticking to the same two and recycling them. You can either add a generation number after the model number, or use non-numerical names.

AMD Koala would be good. It's really hard to remember all the numbers of these cards.

Eddo22 said:

Too bad it is a blower :p. The ones on the 7000 series were really loud.

Talk about a blower; I bet 3 of those cards would still be quieter than one of my HD 6990s. Talk about a jet taking off, lol; 2 of those were just horrible in noise.

Odd. I ran a 6990 for 2 1/2 years and the only time it was kinda noisy was when I ran benchmark programs. It was just barely audible in gaming at 27-32C room temps. That was with the overclock switch enabled as well.

I agree about the blower though. Would it really hurt AMD to put a better cooler on their cards? Especially considering you are paying an arm and a leg for their top-tier products.

I have a Gigabyte Radeon 7970GHz now, and thanks to the 3-fan cooler it runs below 70C in games and I never hear the fans.

Guest said:

Well, if you're buying an expensive graphics card and you don't even bother to do the necessary research, I'd say it's your own fault...

howzz1854 said:

That's the thing... to many, a $150 video card isn't expensive. It's just another piece of hardware to get their POS HP or Dell running again. Think of how many business professionals own those computers and easily drop $150 on a video card so they can play The Sims 3 on the weekend while the kids are out playing in the yard.

lawfer, TechSpot Paladin, said:

I need pricing info! Come on AMD!

GhostRyder said:

Odd. I ran a 6990 for 2 1/2 years and the only time it was kinda noisy was when I ran benchmark programs. It was just barely audible in gaming at 27-32C room temps. That was with the overclock switch enabled as well.

I agree about the blower though. Would it really hurt AMD to put a better cooler on their cards? Especially considering you are paying an arm and a leg for their top-tier products.

I have a Gigabyte Radeon 7970GHz now, and thanks to the 3-fan cooler it runs below 70C in games and I never hear the fans.

Did you leave it on auto? Also, what cooling did you have running through it?

I had mine in a Corsair Obsidian 800D. Before I got the liquid cooling components, I had a stock Corsair fan blowing on it, and it was still loud even with light gaming. I tried putting on a nicer Aerocool Shark fan that I was saving for when I ordered the rest of my stuff, which helped out a lot, but while playing Battlefield 3 the fan would always spin up pretty far. When I put 2 in CFX, they would hit 100% fan speed under almost all gaming loads, even with lots of airflow going through, and it was just unbelievably loud. I still have yet to hear a computer that comes close to as loud as just one of those cards at 100% fan speed.

dividebyzero, trainee n00b, said:

I need pricing info! Come on AMD!

Popular industry opinion pegs the 290X at $600 ($599 in etail-speak).

HD 6990? Yep, we have a thread for that. It has pretty much nothing to do with the R9; it's THREE generations old.

Blue Falcon said:

PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?

GTX680 > 7970Ghz in 3dMark but in real world 7970Ghz > 680

GTX680 is 67% faster than GTX580 in 3dMark but in the real world it's more like 35-40%

[link]

All these synthetic benches like 3dCrap and Unigine Heaven are better utilized for testing GPU stability when overclocking. For extrapolating real-world gaming performance between AMD/NV, they're not very accurate.

DeViLzzz said:

So should I ditch my son's 1GB 7850 for $100 flat if I can get someone to take it, and get one of these cards coming out? Also, if I get one of these new cards, what do you think the lifespan will be? BTW, we only game at 1920x1080 and that will continue for quite a while (3 years most likely).

DeViLzzz said:

Eh, no ability to edit a comment for a few minutes?

Anyway, after seeing that there is no CrossFire connection, I say no to their new cards, as I would want to eventually CrossFire on his board and he won't be able to do so with these new cards.

dividebyzero, trainee n00b, said:

PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?..........etc....etc

You're doing it wrong :smh:

AMD R9-290X Firestrike score ~8000 (as per AMD's own slide)

AMD HD 7970GE Firestrike score ~6800 - 7000

Percentage increase between the two generations of AMD top-tier GPU: 14.3%-17.6%. You now have a basis for comparison, and that is the comparison most people studying performance are looking at, since the 7970GE's capabilities are well known.

Both are AMD designs. Check

Both are GCN µarch. Check

Valid comparison. Check

For Firestrike this actually works as a very good cross-reference. In the chart I posted on the previous page, the GTX 780 scored 8684 and the HD 7970 (non-GHz) scored 6624, a 31.1% advantage to the 780. The latest comparison between the GTX 780 and HD 7970 (non-GHz) for averaged performance: 31.7% at 1920 res, and 31.4% at 2560 res.
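
The percentages above fall straight out of the score ratios; here's a quick check using only the numbers quoted in this comment:

```python
# Percentage uplift between two Firestrike scores: (new / old - 1) * 100.
def uplift(new: float, old: float) -> float:
    return (new / old - 1) * 100

print(f"290X vs 7970GE:     {uplift(8000, 7000):.1f}% to {uplift(8000, 6800):.1f}%")  # 14.3% to 17.6%
print(f"GTX 780 vs HD 7970: {uplift(8684, 6624):.1f}%")                               # 31.1%
```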

Anyway, after seeing that there is no CrossFire connection, I say no to their new cards, as I would want to eventually CrossFire on his board and he won't be able to do so with these new cards.

The rebranded (old architecture) cards still carry CrossFire fingers. It is only the new GPUs that do not. For these, inter-card communication is accomplished over the PCI Express bus, in the same way that lower-spec cards have done for a while. For example, the R9-280X (HD 7970 based) features the CFX fingers, while the 290X (new Hawaii GPU) does not.

Obzoleet said:

What are CFX fingers?

GhostRyder said:

What are CFX fingers?

It is the spot on an AMD video card where you plug in a CFX bridge to link two or more cards together.

PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?

GTX680 > 7970Ghz in 3dMark but in real world 7970Ghz > 680

GTX680 is 67% faster than GTX580 in 3dMark but in the real world it's more like 35-40%

[link]

All these synthetic benches like 3dCrap and Unigine Heaven are better utilized for testing GPU stability when overclocking. For extrapolating real-world gaming performance between AMD/NV, they're not very accurate.

You can use them to compare past and present generations, but I do agree that cross-platform comparisons do not show true gaming performance. They are using it to show the new Hawaii GPU's performance compared to their other cards this generation.

Eddo22 said:

Odd. I ran a 6990 for 2 1/2 years and the only time it was kinda noisy was when I ran benchmark programs. It was just barely audible in gaming at 27-32C room temps. That was with the overclock switch enabled as well.

I agree about the blower though. Would it really hurt AMD to put a better cooler on their cards? Especially considering you are paying an arm and a leg for their top-tier products.

I have a Gigabyte Radeon 7970GHz now, and thanks to the 3-fan cooler it runs below 70C in games and I never hear the fans.

Did you leave it on auto? Also, what cooling did you have running through it?

I had mine in a Corsair Obsidian 800D. Before I got the liquid cooling components, I had a stock Corsair fan blowing on it, and it was still loud even with light gaming. I tried putting on a nicer Aerocool Shark fan that I was saving for when I ordered the rest of my stuff, which helped out a lot, but while playing Battlefield 3 the fan would always spin up pretty far. When I put 2 in CFX, they would hit 100% fan speed under almost all gaming loads, even with lots of airflow going through, and it was just unbelievably loud. I still have yet to hear a computer that comes close to as loud as just one of those cards at 100% fan speed.

I left the fans on auto, and it was the stock single-fan cooler. I just realized, though, that my Phenom 965 was holding the card back, so maybe that's why it wasn't as noisy.

Yeah, I can definitely see two 6990s being loud. They easily ran 85-95 degrees as a single unit.

Geforcepat said:

Hmm, an interesting one is the 290X or whatever they're calling it. (Shoulda just named it 9700 Pro, give us some retro.)

amstech, TechSpot Enthusiast, said:

PC enthusiasts are still hopelessly trying to extrapolate real world gaming performance from 3DMark scores? How many times must it be shown that 3DMark scores between AMD and NV do not directly translate into gaming performance?

It may not translate exactly/directly, but it's usually pretty close. 3DMark gives you an accurate idea of how well your GPU will perform compared to others; some specific tests are better than others.

The new 3DMark especially, although 3DMark11 is still good for results as well.

GTX680 > 7970Ghz in 3dMark but in real world 7970Ghz > 680

There is no difference between a 680 and 7970 in real-world performance; they are always within 5-10 frames of one another, with each GPU winning at certain games and a small overall advantage to the 7970.

howzz1854 said:

Hmm, an interesting one is the 290X or whatever they're calling it. (Shoulda just named it 9700 Pro, give us some retro.)

That was the best video card I've had. Solid performer that lasted me a while, very overclockable and moddable. Miss the good old days.

GhostRyder said:

I left the fans on auto, and it was the stock single-fan cooler. I just realized, though, that my Phenom 965 was holding the card back, so maybe that's why it wasn't as noisy.

Yeah, I can definitely see two 6990s being loud. They easily ran 85-95 degrees as a single unit.

Yeah, no kidding, but one under load was pretty horrid and both were just atrocious in terms of noise. The heat always stayed around 80C under load with fans at 100% for both cards. I had to liquid cool those puppies; man, I love them to death though, now everything runs cool and quiet. Heck, I have yet to see anything above 50C under full load unless I overclock beyond the BIOS 880MHz setting (I've been able to get a stable 990MHz on the core).

Hmm, an interesting one is the 290X or whatever they're calling it. (Shoulda just named it 9700 Pro, give us some retro.)

It's an odd name; however, those names still have a chance to change before launch. They may go with 9970 and below or something like that, though if they stick with the 290X I wouldn't complain too much; it's just an interesting naming scheme. Though I'm going to be sad if I can't buy a 9990 :p.

hahahanoobs said:

If it makes you feel better, Intel's naming scheme is confusing too.

I feel fine. I wasn't the one that was confused. Look at who I was replying to.

And you don't need to be an expert to drop $150 on a video card. Anyone looking to replace an old Dell or a busted HP whose graphics card went up in smoke because of poor cooling will be looking at getting a card that's well beyond $150.

Again, you're replying to the wrong person.

hahahanoobs said:

Well, if you're buying an expensive graphics card and you don't even bother to do the necessary research, I'd say it's your own fault...

Bingo! Give this man/woman a prize.

dividebyzero, trainee n00b, said:

It may not translate exactly/directly, but it's usually pretty close. 3DMark gives you an accurate idea of how well your GPU will perform compared to others; some specific tests are better than others.

The new 3DMark especially, although 3DMark11 is still good for results as well.

Pretty much so.

I think the "this isn't a fair test" comments are purely a reaction to what is seen as a lower level of performance than was expected from some people.

As easy gauge would be to use AMD's own comparison from the slide deck if people are unwilling to believe a level playing field exists:

The R9-290X scores ~8000 and the R9-280X (HD 7970GE rebrand) ~6800 from this slide and the same chart above. 8000 / 6800 = ~18% more performance for the 290X, which is in the same ballpark as the GTX 780's 20% performance lead over the same HD 7970GE for aggregated benchmarks at 1920 and 2560 resolutions.

Guest said:

Too bad there are no AMD chipsets with native PCIe 3.0. So if they do choose to do away with the CrossFire bridges, it'll be slower on their own hardware. :P

2 people like this | Blue Falcon said:

Ok, for all of you who disagree with me, watch for real-world gaming performance. You state that "Percentage increase between the two generations of AMD top-tier GPU: 14.3%-17.6%."

The R9 290X will be faster than 14.3-17.6% on average at 2560x1600. Comparison to 3DMark has always been a poor measure of real-world gaming performance, since no game in the world is based on a 3DMark game engine.

Cherry-picking Firestrike and conveniently ignoring the inaccurate 3dMark11 scores of the 680 vs. 580 shows you guys don't want to address my points on an overall basis.

@ amstech

"There is no difference between a 680 and 7970 in real world performance, they are always within 5-10 frames of one another with each GPU winning at certain games and a small overall advantage to 7970."

Wrong, 7970GE trails 680 in 3dMark11 but it beats it in the real world. Therefore, that's evidence in itself that 3dMark is inaccurate.

[link]

If I play Total War games, 3DMark 2013 tells me squat about which GPU I should purchase. It's pretty clear I should get an NV card, but 3DMark 2013 doesn't portray such an advantage for NV's cards. Similarly, there are many games where the reverse is true.

[link]

The only thing that matters is real world gaming performance unless you love beating the final boss in 3dMark.....

1 person liked this | dividebyzero, trainee n00b, said:

Cherry-picking Firestrike

How can it be cherry-picked when it is the only performance slide that AMD has released?

If anyone's cherry-picking Firestrike, it is AMD.

Get a grip.

:SMH:
