AMD Radeon 300 series graphics card pricing leaks online

Shawn Knight



AMD’s upcoming Radeon 300 series graphics cards may end up being rebadged 200 series cards with minor tweaks, but it’s not all bad news for gamers. WCCFtech has it on good authority (they’re confident enough to drop the “rumor” tag completely) that the red team’s 300 series cards will arrive with extremely attractive MSRPs.

Here’s what the publication is claiming at this hour:

  • Radeon R9 390X 8GB (enhanced Hawaii XT): $389
  • Radeon R9 390 8GB (enhanced Hawaii Pro): $329
  • Radeon R9 380X 3GB/6GB (Tonga XT): unconfirmed
  • Radeon R9 380 4GB (Tonga Pro): $235
  • Radeon R9 380 2GB (Tonga Pro): $195
  • Radeon R7 370 4GB (Pitcairn): $175
  • Radeon R7 370 2GB (Pitcairn): $135
  • Radeon R7 360 2GB (Bonaire): $107

The new cards will be split into two separate categories – performance and enthusiast – with only the Radeon R9 390X 8GB and Radeon R9 390 8GB falling into the speedier enthusiast category.

While we’re not prepared to consider these prices set in stone just yet, the publication does have a solid track record. What does seem certain, however, is that the 300 series will indeed be rebrands of the 200 series.


Graphics card maker XFX accidentally confirmed that the Radeon R9 390X will indeed be a rebranded R9 290X, a card that debuted near the end of 2013. Demand (and subsequently, pricing) for AMD’s high-end 200 series cards skyrocketed shortly after their debut, as they were a popular choice for cryptocurrency miners.

That trend lasted less than six months, however, as dedicated mining hardware quickly made GPU mining inefficient. Pricing returned to normal in May 2014.


It’s worth pointing out that the Radeon 300 series isn’t the only thing AMD is working on. The company’s true next-generation flagship, based on its Fiji GPU with vertically-stacked high-bandwidth memory (HBM) and rumored to be called the "Radeon Fury," will officially be announced at E3 next week.


 
Look, it's simple: these incremental rebranding steps from AMD are forced by a goofed-up marketplace and a struggling economy. AMD's incrementalism also plagues their driver support process. These baby steps are inefficient BS and everybody knows it.

Gone are the days of bloated organizations that churn out big tech advancements to a huge consumer base. Pound sand on "enhanced," rebranded nonsense and be honest with the readership about how these prices are margin protectionism and nothing more. Spare us the "extremely attractive MSRPs" line; you know this is sponsor-speak, or if not, it's highly insulting to our collective intelligence ... and perhaps to your own.

I say this as a recent Radeon convert, too; I've purged Nvidia from all my rigs over price vs. performance vs. power consumption ... and I'm very content with a 'more than capable' 290X in my main gaming rig.
 
The 290X is better than the 780 Ti in some benchmarks and waaay cheaper.

The new 980 Ti sits at $650 and with this new competitor at a mere $390, AMD is the clear choice.

Now if only I didn't buy an overclocked 780 Ti 6 months ago :(
 
If these rebrands are based on GCN 1.2, then this is fine.

Tonga has some improvements that help it catch or beat GCN 1.0 and 1.1 cards that are much stronger.

If the rebrands are updated, then they will be worth buying; if not, you are better off just buying a used 290X.
 
The 290X is better than the 780 Ti in some benchmarks and waaay cheaper.

The new 980 Ti sits at $650 and with this new competitor at a mere $390, AMD is the clear choice.

Now if only I didn't buy an overclocked 780 Ti 6 months ago :(

Dude, the 980 Ti has a 60-90 FPS advantage over the $390 390X. Come on, dude, the benchmarks show it, and they only came out a few days ago, like 13/7/2015; this isn't 9/6/2015, when benchmarks hadn't even been done yet.
 
The 290X is better than the 780 Ti in some benchmarks and waaay cheaper.

The new 980 Ti sits at $650 and with this new competitor at a mere $390, AMD is the clear choice.

Now if only I didn't buy an overclocked 780 Ti 6 months ago :(

Dude, the 980 Ti has a 60-90 FPS advantage over the $390 390X. Come on, dude, the benchmarks show it, and they only came out a few days ago, like 13/7/2015; this isn't 9/6/2015, when benchmarks hadn't even been done yet.
And let's not forget about DX12 support. The main issue with AMD rebranding the entire 300 series is that none of the cards will support DX12. It's not crucial right now, but when DX12 games start popping up later this year, I know I'd regret buying a 300 series card.
 
I'm not so sure about the "Radeon R9 380X 3GB/6GB (Tonga XT): unconfirmed" entry in that WCCFtech rumor.
As Anandtech's review has shown (they usually do a deeper analysis of the chips' layouts and architectures), Tonga does not have a 384-bit bus. The fully enabled Tonga chip, as seen in the R9 M295X used in the 5K iMac, still has only a 256-bit bus, cut down from Tahiti's 384-bit bus in favor of the color compression feature (which saves power, in theory). That way, you can't have 3 GB or 6 GB of RAM on a Tonga GPU, unless you do some asymmetric memory setup like Nvidia did with the GTX 550 Ti (2 GB on a 192-bit bus), and that means a segment of your VRAM will be slower than the rest. Given what happened with the GTX 970, I doubt anybody would do something like that again.
So, either that chip is actually Hawaii cut down to 384-bit, or a rebranded Tahiti instead of Tonga, or the entry is simply incorrect.
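The bus-width argument above can be checked with quick arithmetic: on a GDDR5 card, each 32-bit memory channel typically carries one chip, so a symmetric configuration's capacity is (bus width / 32) × chip density, at most doubled in clamshell mode. A small sketch of that reasoning (the function name and density list are illustrative, not from any spec):

```python
def symmetric_vram_options(bus_width_bits, chip_densities_mb=(256, 512, 1024)):
    """Possible symmetric VRAM sizes (in MB) for a GDDR5 bus.

    Assumes one 32-bit channel per chip; clamshell mode (two chips
    per channel) doubles capacity while staying symmetric.
    """
    channels = bus_width_bits // 32
    single = {channels * d for d in chip_densities_mb}
    clamshell = {2 * channels * d for d in chip_densities_mb}
    return sorted(single | clamshell)

# A 256-bit bus yields 2/4/8/16 GB options; 3 GB or 6 GB would need
# an asymmetric layout. A 384-bit bus yields 3/6/12/24 GB naturally.
print(symmetric_vram_options(256))  # [2048, 4096, 8192, 16384]
print(symmetric_vram_options(384))  # [3072, 6144, 12288, 24576]
```

Which is why a 3GB/6GB card points at a 384-bit part (Hawaii or Tahiti) rather than Tonga, exactly as the comment argues.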
 
Don't support DX12? Wow, the comments section is all FUD here ... AMD's high-end graphics card gets uncovered next week and you guys are all talking smack. We haven't even seen any numbers yet for Fury and you guys are talking nothing but trash. And as far as rejigging and rebranding prior GPUs as new cards, Nvidia does exactly the same thing and has for some time now. The GTX 580 I bought a few years back was nothing more than a reworked GTX 480, yet you guys act like AMD is committing some sort of crime when they do the same thing Nvidia does. The fanboi-ism is really immature and pointless...
 
The main issue with AMD rebranding the entire 300 series is that none of the cards will support DX12. It's not crucial right now, but when DX12 games start popping up later this year, I know I'd regret buying a 300 series card.

Completely and utterly incorrect. AMD cards all the way back to 7000 series cards will fully support DX12. Please do not make up baloney and try passing it off as fact...
 
The 290X is better than the 780 Ti in some benchmarks and waaay cheaper.

The new 980 Ti sits at $650 and with this new competitor at a mere $390, AMD is the clear choice.

Now if only I didn't buy an overclocked 780 Ti 6 months ago :(

Dude, the 980 Ti has a 60-90 FPS advantage over the $390 390X. Come on, dude, the benchmarks show it, and they only came out a few days ago, like 13/7/2015; this isn't 9/6/2015, when benchmarks hadn't even been done yet.
And let's not forget about DX12 support. The main issue with AMD rebranding the entire 300 series is that none of the cards will support DX12. It's not crucial right now, but when DX12 games start popping up later this year, I know I'd regret buying a 300 series card.

According to another source: AMD has recently confirmed that their current cards support Feature Level 12_0 at best, while NVIDIA’s Maxwell 2.0 architecture has support for Feature Level 12_1. The cards with Feature Level 12_0 support include the Radeon HD 7790, Radeon R7 260 (X), Radeon R9 285, Radeon R9 290 (X) and R9 295X2. The older cards, such as the ones based on Tahiti and Pitcairn, which include all the 7000 series cards (excluding the HD 7790), Radeon R9 270 (X) and Radeon R9 280 (X), support up to Feature Level 11_1. Robert Hallock believes that there’s no problem with not featuring DirectX Feature Level 12_1 support, since the performance-enhancing features and tools are already available in 11_1 and 12_0, and most games won’t rely on utilization of 12_1.

Read more: http://wccftech.com/amd-confirms-gc...-10-feature-level-120-gcn-1112/#ixzz3cayDvRRu

My biggest decision is going to be how much of an impact 12_1 will have vs 12_0. We won't know until we see developers utilizing the features in 12_1 to draw that comparison in existing and future games. I would hate to spend a lot of money on a card where the 12_1 features turn out to have a significant impact on performance and visuals.
 
Dude, the 980 Ti has a 60-90 FPS advantage over the $390 390X. Come on, dude, the benchmarks show it, and they only came out a few days ago, like 13/7/2015; this isn't 9/6/2015, when benchmarks hadn't even been done yet.

LOL ... a 60 to 90 fps advantage? lol ... the 980 Ti can't do 90 fps in Batman or BF4, let alone beat an R9 290X by that much, unless you're still gaming at 800x600. You fanbois are hilarious ... Also, let's not forget you're claiming to have info on the 390X, which has not been released yet ... lol
 
The 290X is better than the 780 Ti in some benchmarks and waaay cheaper.

The new 980 Ti sits at $650 and with this new competitor at a mere $390, AMD is the clear choice.

Now if only I didn't buy an overclocked 780 Ti 6 months ago :(

Dude, the 980 Ti has a 60-90 FPS advantage over the $390 390X. Come on, dude, the benchmarks show it, and they only came out a few days ago, like 13/7/2015; this isn't 9/6/2015, when benchmarks hadn't even been done yet.
And let's not forget about DX12 support. The main issue with AMD rebranding the entire 300 series is that none of the cards will support DX12. It's not crucial right now, but when DX12 games start popping up later this year, I know I'd regret buying a 300 series card.

According to another source: AMD has recently confirmed that their current cards support Feature Level 12_0 at best, while NVIDIA’s Maxwell 2.0 architecture has support for Feature Level 12_1. The cards with Feature Level 12_0 support include the Radeon HD 7790, Radeon R7 260 (X), Radeon R9 285, Radeon R9 290 (X) and R9 295X2. The older cards, such as the ones based on Tahiti and Pitcairn, which include all the 7000 series cards (excluding the HD 7790), Radeon R9 270 (X) and Radeon R9 280 (X), support up to Feature Level 11_1. Robert Hallock believes that there’s no problem with not featuring DirectX Feature Level 12_1 support, since the performance-enhancing features and tools are already available in 11_1 and 12_0, and most games won’t rely on utilization of 12_1.

Read more: http://wccftech.com/amd-confirms-gc...-10-feature-level-120-gcn-1112/#ixzz3cayDvRRu

My biggest decision is going to be how much of an impact 12_1 will have vs 12_0. We won't know until we see developers utilizing the features in 12_1 to draw that comparison in existing and future games. I would hate to spend a lot of money on a card where the 12_1 features turn out to have a significant impact on performance and visuals.
Good info. Thanks man.
 
And let's not forget about DX12 support. The main issue with AMD rebranding the entire 300 series is that none of the cards will support DX12. It's not crucial right now, but when DX12 games start popping up later this year, I know I'd regret buying a 300 series card.
Incorrect, like other posters have stated. All GCN GPUs support DX12_0 up to Tier 3 features (the highest tier, that is, regarding DX11_2 and DX12_0 features). They only lack DX12_1 support.
Completely and utterly incorrect. AMD cards all the way back to 7000 series cards will fully support DX12. Please do not make up baloney and try passing it off as fact...
Also not 100% correct. They don't fully support DirectX 12, due to the lack of DX12_1 support. But they do support DX12_0 fully, which Nvidia cards don't.
According to another source: AMD has recently confirmed that their current cards can support Feature Level 12_0 at best while NVIDIA’s Maxwell 2.0 architecture has support for Feature Level 12_1.
Your source is unfairly favoring Nvidia with that claim, most likely on purpose. Yes, Maxwell supports DX12_1, but it also lacks Tier 3 support for DX12_0/DX11_2 features, which AMD has. So in reality NEITHER fully supports DirectX 12; each of them is missing one set of features. GCN has DX12_0 Tier 3 support but misses out on DX12_1, while Maxwell has DX12_0 Tier 2 support and DX12_1, but misses out on DX12_0 Tier 3. Either of them could suffer from this, depending on which specific features game devs implement. It does not necessarily mean Maxwell has better DirectX 12 support.
 
And let's not forget about DX12 support. The main issue with AMD rebranding the entire 300 series is that none of the cards will support DX12. It's not crucial right now, but when DX12 games start popping up later this year, I know I'd regret buying a 300 series card.
Incorrect, like other posters have stated. All GCN GPUs support DX12_0 up to Tier 3 features (the highest tier, that is, regarding DX11_2 and DX12_0 features). They only lack DX12_1 support.
Completely and utterly incorrect. AMD cards all the way back to 7000 series cards will fully support DX12. Please do not make up baloney and try passing it off as fact...
Also not 100% correct. They don't fully support DirectX 12, due to the lack of DX12_1 support. But they do support DX12_0 fully, which Nvidia cards don't.
According to another source: AMD has recently confirmed that their current cards can support Feature Level 12_0 at best while NVIDIA’s Maxwell 2.0 architecture has support for Feature Level 12_1.
Your source is unfairly favoring Nvidia with that claim, most likely on purpose. Yes, Maxwell supports DX12_1, but it also lacks Tier 3 support for DX12_0/DX11_2 features, which AMD has. So in reality NEITHER fully supports DirectX 12; each of them is missing one set of features. GCN has DX12_0 Tier 3 support but misses out on DX12_1, while Maxwell has DX12_0 Tier 2 support and DX12_1, but misses out on DX12_0 Tier 3. Either of them could suffer from this, depending on which specific features game devs implement. It does not necessarily mean Maxwell has better DirectX 12 support.

It's a pain to think about, because at the end of the day it's at the discretion of the developers. I just want to future-proof my decision for at least the next two to three years. Unfortunately, I have a 620W power supply, which Maxwell's low power consumption could suit, like the 970. If the 300 series cards are as power hungry as the higher-end AMD cards (which is likely), then I have to buy a new PSU to meet that demand, which ends up meaning I really didn't save money: (Nvidia 970 + current PSU) vs (AMD 390 + new PSU). The advantage I see from an AMD purchase is the 390 will have more VRAM, but I don't game higher than 1080p. Someday I will welcome 4K gaming into my life; for now I'm more interested in playing at high/ultra settings with steady fps. The struggle is real because I missed out on the Nvidia Witcher 3 promo to wait for the announcement from AMD on June 16th.
 
It's a pain to think about, because at the end of the day it's at the discretion of the developers. I just want to future-proof my decision for at least the next two to three years. Unfortunately, I have a 620W power supply, which Maxwell's low power consumption could suit, like the 970. If the 300 series cards are as power hungry as the higher-end AMD cards (which is likely), then I have to buy a new PSU to meet that demand, which ends up meaning I really didn't save money: (Nvidia 970 + current PSU) vs (AMD 390 + new PSU). The advantage I see from an AMD purchase is the 390 will have more VRAM, but I don't game higher than 1080p. Someday I will welcome 4K gaming into my life; for now I'm more interested in playing at high/ultra settings with steady fps. The struggle is real because I missed out on the Nvidia Witcher 3 promo to wait for the announcement from AMD on June 16th.
If you really want to make sure you don't miss out on any features, your only option is to wait for Arctic Islands and Pascal next year. Current GPUs will all miss on something, and it's a gamble which will be more important (or if they will matter to begin with).
That being said, thinking a 620W PSU is not enough for AMD GPUs is just absurd. Just look at the power consumption section of TechSpot reviews, like https://www.techspot.com/review/977-nvidia-geforce-gtx-titan-x/page9.html. In the worst-case scenario, the entire system with an i7-4770 and an R9 290X consumes 332W under load.
A 620W PSU would be enough even for a system with the dual-GPU R9 295X2, which consumes 567W, although then you'd probably have significant noise from the PSU fan.
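The headroom reasoning in this exchange boils down to simple arithmetic: sum the component draws and check them against the PSU rating minus a safety margin. A minimal sketch, where the wattage figures and the `psu_ok` helper are illustrative placeholders rather than measured values:

```python
def psu_ok(psu_watts, component_watts, headroom=0.20):
    """Return True if total system draw fits within the PSU rating
    after reserving a safety headroom (20% by default)."""
    total = sum(component_watts.values())
    return total <= psu_watts * (1 - headroom)

# Illustrative draw figures for a single-GPU build (not measurements):
build = {"cpu": 90, "gpu": 250, "rest_of_system": 60}  # 400W total
print(psu_ok(620, build))  # 400W vs a 496W budget -> True
```

Even with a generous 20% margin, a 620W unit comfortably covers a single high-end card of that era, which is the point the reply makes.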
 
It's a pain to think about, because at the end of the day it's at the discretion of the developers. I just want to future-proof my decision for at least the next two to three years. Unfortunately, I have a 620W power supply, which Maxwell's low power consumption could suit, like the 970. If the 300 series cards are as power hungry as the higher-end AMD cards (which is likely), then I have to buy a new PSU to meet that demand, which ends up meaning I really didn't save money: (Nvidia 970 + current PSU) vs (AMD 390 + new PSU). The advantage I see from an AMD purchase is the 390 will have more VRAM, but I don't game higher than 1080p. Someday I will welcome 4K gaming into my life; for now I'm more interested in playing at high/ultra settings with steady fps. The struggle is real because I missed out on the Nvidia Witcher 3 promo to wait for the announcement from AMD on June 16th.
If you really want to make sure you don't miss out on any features, your only option is to wait for Arctic Islands and Pascal next year. Current GPUs will all miss on something, and it's a gamble which will be more important (or if they will matter to begin with).
That being said, thinking a 620W PSU is not enough for AMD GPUs is just absurd. Just look at the power consumption section of TechSpot reviews, like https://www.techspot.com/review/977-nvidia-geforce-gtx-titan-x/page9.html. In the worst-case scenario, the entire system with an i7-4770 and an R9 290X consumes 332W under load.
A 620W PSU would be enough even for a system with the dual-GPU R9 295X2, which consumes 567W, although then you'd probably have significant noise from the PSU fan.

Thanks for the info. I read somewhere that you want some additional watt capacity as headroom, and browsing the tech specs on Newegg I see they recommend a 750W minimum for most higher-end AMD cards (I know I shouldn't take it literally). I have to do more research; I did run a PSU calculator and it suggested 553W with my current PC build + a 290, so it must be a mindset where I'd rather be safe than sorry. Unfortunately, I cannot wait until the new Arctic Islands and Pascal GPUs release, because my current CPU and GPU are laid to rest and I have to buy something now that can play my games on high/ultra settings.
 
DX12 has different feature levels: 12_1, 12_0, 11_1, 11_0, etc. To be absolutely clear, understand that 11_1 is not a DX11 feature; it is a base level of DX12 support, and any card, including the AMD GCN 1.0 cards, that fully supports that base level of DX12 is "fully" DX12 compliant. The additional spec levels such as 12_1 only add some features not included in the base spec. The important part is to remember that ANY DX12 card that supports the base features will get 95% of all the DX12 benefits, including the speed gains, so this is really arguing over who has the best-looking graphics card. Even a Tahiti-based 280X will get all of the base-level benefits of DX12 and will gain significant performance once DX12 and Win10 go live. Anyone buying a 980 Ti and thinking they are the only card owner with "full" DX12 support is completely wrong; you'll be the only card owner with the "additional" features. How much those "additional" features get implemented by game developers (considering most current-generation consoles only support DX12 base features) remains to be seen.
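The support matrix this thread keeps arguing over can be laid out as plain data. The sketch below is a toy model of the claims made in these comments (it is not a Direct3D query, and the dictionary and `missing_features` helper are hypothetical names):

```python
# DX12 support as claimed in this thread: GCN tops out at Feature
# Level 12_0 but has resource binding Tier 3; Maxwell 2.0 has FL 12_1
# but only binding Tier 2.
SUPPORT = {
    "GCN":        {"feature_level": "12_1" == "", "fl": "12_0", "binding_tier": 3},
    "Maxwell2.0": {"feature_level": "12_1" == "", "fl": "12_1", "binding_tier": 2},
}

def missing_features(gpu):
    """List what each architecture lacks for 'full' DX12, per the thread."""
    info = SUPPORT[gpu]
    gaps = []
    if info["fl"] < "12_1":          # string compare works for these labels
        gaps.append("FL 12_1 features")
    if info["binding_tier"] < 3:
        gaps.append("resource binding Tier 3")
    return gaps

print(missing_features("GCN"))        # ['FL 12_1 features']
print(missing_features("Maxwell2.0")) # ['resource binding Tier 3']
```

Either way you slice it, each vendor comes up short on one axis, which is exactly the "neither fully supports DX12" point made earlier in the thread.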
 
Dude, the 980 Ti has a 60-90 FPS advantage over the $390 390X. Come on, dude, the benchmarks show it, and they only came out a few days ago, like 13/7/2015; this isn't 9/6/2015, when benchmarks hadn't even been done yet.

LOL ... a 60 to 90 fps advantage? lol ... the 980 Ti can't do 90 fps in Batman or BF4, let alone beat an R9 290X by that much, unless you're still gaming at 800x600. You fanbois are hilarious ... Also, let's not forget you're claiming to have info on the 390X, which has not been released yet ... lol
Nevertheless, the 980 Ti is ~50% faster in BF4 ... that's a pretty incredible difference: 1440p at a playable framerate vs. unplayable.
 
Good info. Thanks man.
Actually, it was incomplete info. See above.
Yeah, I've been reading up on this on various tech sites. I just haven't heard a squeak about AMD's 200 series having any kind of DX12 support before he mentioned it. This one seems to have all the info: http://www.extremetech.com/extreme/...what-amd-intel-and-nvidia-do-and-dont-deliver

From reading that, it appears GCN 1.1 does have feature level 12.0 support. Maxwell 2xx has 12.1 support, but Maxwell 1xx only supports 12.0. Only AMD's GCN 1.0/1.1 supports Resource Binding Level 3, but I'm not even sure that's part of the 12.0 or 12.1 feature level requirements. TechSpot should do an article on this. :p

That said, I still wouldn't buy yesterday's model 390X rebrand. I'm looking forward to seeing what the Radeon Fury has to offer, though.
 
The 290X is better than the 780 Ti in some benchmarks and waaay cheaper.

The new 980 Ti sits at $650 and with this new competitor at a mere $390, AMD is the clear choice.

Now if only I didn't buy an overclocked 780 Ti 6 months ago :(

Dude, the 980 Ti has a 60-90 FPS advantage over the $390 390X. Come on, dude, the benchmarks show it, and they only came out a few days ago, like 13/7/2015; this isn't 9/6/2015, when benchmarks hadn't even been done yet.
And let's not forget about DX12 support. The main issue with AMD rebranding the entire 300 series is that none of the cards will support DX12. It's not crucial right now, but when DX12 games start popping up later this year, I know I'd regret buying a 300 series card.

Hardware does not dictate DirectX support. There is no reason that a firmware update couldn't fix this.
 
There's nothing incredible about a brand new card being faster than a two-year-old design; it would be incredible if it wasn't. What's incredible is the cost of a 980 Ti...
 
Those who want full DX12 support will have to wait for 2016 and the cards that will be released then by both Nvidia and AMD.
Although, we don't know yet if AMD's Fury cards will fully support DX12 or not.

Right now the only Tier 3 resource binding support comes from AMD, and 12_1 is only supported by Nvidia. Both have only partial support for DX12.
 
From reading that, it appears GCN 1.1 does have feature level 12.0 support. Maxwell 2xx has 12.1 support, but Maxwell 1xx only supports 12.0. Only AMD's GCN 1.0/1.1 supports Resource Binding Level 3, but I'm not even sure that's part of the 12.0 or 12.1 feature level requirements. TechSpot should do an article on this.
While TechSpot's article on this doesn't come, what I know is this: The part of DirectX 12 that is divided in 3 tiers is DX12_0 (which also includes DX11_1 and DX11_2 level features). That's the part that AMD supports fully, up to Tier 3, while Nvidia doesn't. DX12_1, on the other hand, is a separate level of features, which is not tiered, and you don't need Tier 3 support on DX12_0 to support DX12_1 as well. That's the part Nvidia supports, but AMD doesn't.
Hardware does not dictate DirectX Support. There is no reason that a firmware update couldnt fix this.
Yeah, totally! DirectX support has nothing to do with hardware. That's why Radeon X1K and GeForce 7 cards still support all the latest standards. Also, look at all these DirectX 11 titles that run on my old HD 4850! Who needs programmable stream processors, geometry shaders or tessellation engines to support new APIs, am I right?
Although, we don't know yet if AMD's Fury cards will fully support dx12 or not.
We do know that. Fiji is based on GCN 1.2, just like Tonga. That means that, just like Tonga, it will have DX12_0 Tier 3 support but no DX12_1.
 