Rumor: AMD's new 'Tonga' GPU to debut in August

Scorpus

Staff member

AMD is set to end the trend of releasing re-branded graphics cards when it launches an entirely new GPU in August, assuming the latest rumor from VR-Zone holds true.

Codenamed 'Tonga', the new GPU will replace the aging 'Tahiti Pro' GPU, which is currently found in the Radeon R9 280 but dates way back to the HD 7950 that launched in January 2012. Tonga is expected to be manufactured using a 28nm process (or possibly 20nm, as a few reports claim), and rumors claim it will pack 2,048 stream processors alongside 128 TMUs, 32 ROPs and a 256-bit GDDR5 memory bus.

The first graphics card utilizing Tonga is reported to have 2 GB of memory, and could either fall into AMD's high-end or mid-range line-up, depending on how the chip performs. At this stage it's not known whether a Tonga-based graphics card will form a new series or be slotted into the existing Rx 200 line, or what the card will be named.

With very little to go on at this stage, we'll have to wait until next month for more information, when AMD is expected to launch the new GPU. As it's been under a year since the Radeon Rx 200 series launched, we'd expect the card to slot into the current line-up, likely in the $200-300 range currently occupied by the R9 280 and R9 280X.


 
This name is funny to me. I can't wait to see how well this does, because it sounds like it's going to be around an R9 280-280X.
 
Tonga-donga, that's original.

AMD is set to end the trend of releasing re-branded graphics cards
That's a ballsy statement coming from AMD. So, if we still catch them doing it, will it qualify for a refund?

The first graphics card utilizing Tonga is reported to have 2 GB of memory, and could either fall into AMD's high-end
What is high-end then, if the card can't handle 4K? Tests are showing that if you do not want to run out of memory while gaming at 4K, you need 4GB of video memory. A 3GB card sometimes runs out under very heavy load, while a 2GB card is pretty much unusable for 4K gaming due to memory shortage.
 
What is high-end then, if the card can't handle 4K? Tests are showing that if you do not want to run out of memory while using 4K mode, you need 4GB of video memory. A 3GB card sometimes runs out under very heavy load, while a 2GB card is pretty much unusable in 4K due to memory shortage.
I agree, I am quite shocked that this thing will only have 2GB (albeit some places have said 4GB, so we will just have to see, since this is mostly still just rumor). 3GB was not enough to handle 4K properly, so the next step is 4GB-6GB on the higher-end models. They are probably targeting the middle ground with this 2GB card rather than the higher-end area.
 
Who the hell thinks these GPUs are made for 4K? Get more sleep at night.
Today's top GPUs suck at running 4K.

And for what it's worth, going with a 4K setup right now might be interesting, but it's anything but impressive. Several OCN members have abandoned their 4K setups due to various problems/issues, even with unlimited budgets.
There are many factors for this, ranging from head to toe, but in short, it's just too early.
In a year I'll check out 4K setups again, or when the new GTXs release.
 
In a year I'll check out 4K setups again, or when the new GTXs release.
Maybe in 5 years for me. That would be when a single $300 GPU handles 4K just fine. And even then I wouldn't really need 4K, because I will likely not have the TV/monitor for it. But then by that time maybe pricing will be down for those as well.
 
4K is easily managed right now with a dual-card setup, as there are plenty of cards out there with enough VRAM and power in a multi-GPU setup to drive a 4K monitor and deliver playable FPS (I even have one ordered right now).

The point was that the rumor stated this as high end, and as vitality pointed out, since most high-end marketing is pointing at 4K right now (both companies seem adamant about the 4K gaming initiative), why is there only 2GB of VRAM? Some people take things a bit too seriously sometimes...

Either way, the listed specs sound about as real as the so-called GTX 880 ones did not that long ago. Most of this is still mere speculation...
 
Maybe in 5 years for me. That would be when a single $300 GPU handles 4K just fine. And even then I wouldn't really need 4K, because I will likely not have the TV/monitor for it. But then by that time maybe pricing will be down for those as well.
And to be clear, I am actually a 4K fan. I love the 'old-school' one-big-monitor-and-a-bag-of-chips motto, but today's GPUs just suck at running 4K, even in double/triple SLI (sorry Ghost, but they do). They might run non-demanding games OK, but why blow that much money then? The results overall are awful, and that's just talking performance numbers, leaving out all the connectivity, support and troubleshooting issues.
When I can get a single GPU for $500 that runs 4K, I will make the leap. Will I be waiting a while? Maybe, but I am happy at 1600p.
 
There's an old joke I can't remember; was it "death by tonga-tonga" or whatever?

At this stage it's not known whether a Tonga-based graphics card will form a new series or be slotted into the existing Rx 200 line, or what the card will be named.
I predict this new AMD GPU will be named the Ry 200 series.

The first graphics card utilizing Tonga is reported to have 2 GB of memory, and could either fall into AMD's high-end or mid-range line-up, depending on how the chip performs.
If the built-in GPU of the Intel Pentium 20th Anniversary Edition processor (Pentium G3258) performs really well against current entry-level discrete GPUs from both AMD and Nvidia, I hope the next generation of entry GPUs from either company will feature a minimum of 2GB VRAM: a real leap forward, such that next-gen budget cards can play modern games at medium settings at resolutions of 1920x1080 or less. (I can dream, can't I?)
 
There's an old joke I can't remember; was it "death by tonga-tonga" or whatever?

You're either referring to the "Death by Bunga Bunga" joke (guess I spelled that right) or the Futurama "Death by Snu Snu" one.

If the built-in GPU of the Intel Pentium 20th Anniversary Edition processor (Pentium G3258) performs really well against current entry-level discrete GPUs from both AMD and Nvidia, I hope the next generation of entry GPUs from either company will feature a minimum of 2GB VRAM: a real leap forward, such that next-gen budget cards can play modern games at medium settings at resolutions of 1920x1080 or less. (I can dream, can't I?)
I am a bit confused by what you're saying at first about the Pentium. Maybe I am misreading it.

I believe 2GB will now become a baseline except on the very bottom end.
 
4K is easily managed right now with a dual-card setup, as there are plenty of cards out there with enough VRAM and power in a multi-GPU setup to drive a 4K monitor and deliver playable FPS (I even have one ordered right now).

The point was that the rumor stated this as high end, and as vitality pointed out, since most high-end marketing is pointing at 4K right now (both companies seem adamant about the 4K gaming initiative), why is there only 2GB of VRAM? Some people take things a bit too seriously sometimes...

Either way, the listed specs sound about as real as the so-called GTX 880 ones did not that long ago. Most of this is still mere speculation...

4K is not easily managed with 2 GPUs if you want to play Watch Dogs at 4K on ultra settings with AA.
It isn't ready for prime time until 8GB video cards come out. Even then, the investment will cost you well over $2,000 to get in. Say $700 for a good 4K monitor and 3 GPUs at $600 each, and you're talking $2,500 alone just for the monitor and cards. Most people build entire systems in 2-way SLI for that. So no, 4K isn't ready yet, at least not for the average gamer. Maybe for those willing to spend 4 grand on a power rig.

On the other hand, 2 ASUS STRIX cards with 6GB each would be a perfect balance on the 2K ASUS ROG Swift with Watch Dogs at ultra with max AA. In other words, 2K is probably a more economical and viable solution for the masses at the moment.
 
4K is not easily managed with 2 GPUs if you want to play Watch Dogs at 4K on ultra settings with AA.
It isn't ready for prime time until 8GB video cards come out. Even then, the investment will cost you well over $2,000 to get in. Say $700 for a good 4K monitor and 3 GPUs at $600 each, and you're talking $2,500 alone just for the monitor and cards. Most people build entire systems in 2-way SLI for that. So no, 4K isn't ready yet, at least not for the average gamer. Maybe for those willing to spend 4 grand on a power rig.

On the other hand, 2 ASUS STRIX cards with 6GB each would be a perfect balance on the 2K ASUS ROG Swift with Watch Dogs at ultra with max AA. In other words, 2K is probably a more economical and viable solution for the masses at the moment.
Watch Dogs on ultra at 4K is possible with SLI and CFX now. I have read the reviews, and a pair of 290Xs or 780 Tis both average around 60FPS. That's also not including a max overclock from what I've seen, so it's more than possible. Heck, even Crysis 3 at 4K is possible, and reading most of Digital Storm's reviews at 4K shows that unless the game has some issues with multi-GPU configs, 4K is easily achievable. I personally have 3 R9 290X cards, so I'm not worried and cannot wait for my Asus monitor to arrive. $1K on video cards is about what you would have to pay to get a decent framerate (30+) at 4K right now at high-ultra.
 

Watch Dogs on ultra at 4K is possible with SLI and CFX now. I have read the reviews, and a pair of 290Xs or 780 Tis both average around 60FPS. That's also not including a max overclock from what I've seen, so it's more than possible. Heck, even Crysis 3 at 4K is possible, and reading most of Digital Storm's reviews at 4K shows that unless the game has some issues with multi-GPU configs, 4K is easily achievable. I personally have 3 R9 290X cards, so I'm not worried and cannot wait for my Asus monitor to arrive. $1K on video cards is about what you would have to pay to get a decent framerate (30+) at 4K right now at high-ultra.

You blur the terms 'possible' and 'easily achievable' like they're the same thing. Two R9s suck at running 4K, and even that's an expensive/formidable setup.
Your performance for what you're paying is about as bad as it gets. That's a big hole in your pocket to be able to say "I play at 4K".
Intelligent spending and setups are impressive traits, not foolish spending on high-powered setups to run something that's still not ready. 4K makes today's GPUs look like mid-range cards.
Tom's Hardware shows SLI Titans/780s running Crysis 3 at 3840x2160 @ 37FPS (max), SMAA @ 2X, high textures. That's f***ing pitiful performance for the money. PITIFUL.
 
In 5 years' time, I sure hope they are working on other things than 4K displays and GPUs for such... Fingers crossed for holograms and fully immersive VR headsets... maybe even cerebral hookups? :0

I agree with amstech; very few people have that sort of money to waste for such pathetic results...

I've been disappointed with SLI/Crossfire from the beginning... they have come a ways in optimizing their utilization, but c'mon, they still aren't properly implementing multi-core for CPUs, and the stability for intense gaming with SLI/Crossfire is just not consistent enough yet. They keep pushing out more and more cores or 'different' graphics cards, which are all just repackages of the same old junk to keep sales going. Yay for SLI/Crossfire benchmarks that give you an idea of their average FPS... but where is the quality?

Conspiracy theory here, but maybe they are already leaps and bounds ahead of what's available and are taking their sweet time releasing it to keep revenue coming in... I wish these big companies would spend less time marketing and rebranding and actually strive towards innovation... It's almost like big oil killing eco-friendly alternatives to keep their business going.

I read several articles on bio computers, then some more on graphite conductors, but these things aren't getting nearly as much face-time as AMD's new R7-R9 junk lineup or Nvidia's mystery 880. They really can't be bothered to go 20nm unless forced to? I realize the factories had some delays on releasing the 20nm dies, but c'mon! Anyways... my facts are obviously outdated and not fully informed... Anyone, please feel free to discuss or elaborate your fun thoughts on the matter.

GG, Nvidia and AMD.... Silicon Valley... you suck.
 
GG, Nvidia and AMD.... Silicon Valley... you suck.
Please explain!

How can you even compare Intel to nVidia? That is like comparing CPU performance to GPU performance. Intel seems comfortable leaving AMD and nVidia the high-end GPU arena. Yet you appear to be holding it against them, as if they have tragically failed. If you are going to include Intel, you cannot disregard how AMD has been with their CPU lineup, thus removing Intel completely from the "you suck" category.

Next thing you know, you will be comparing an LG optical drive to a WD HDD. Sure, they are both storage media, but they play different roles.
 
You blur the terms 'possible' and 'easily achievable' like they're the same thing. Two R9s suck at running 4K, and even that's an expensive/formidable setup.
Your performance for what you're paying is about as bad as it gets. That's a big hole in your pocket to be able to say "I play at 4K".
Intelligent spending and setups are impressive traits, not foolish spending on high-powered setups to run something that's still not ready. 4K makes today's GPUs look like mid-range cards.
Tom's Hardware shows SLI Titans/780s running Crysis 3 at 3840x2160 @ 37FPS (max), SMAA @ 2X, high textures. That's f***ing pitiful performance for the money. PITIFUL.
Playable performance, if you had read it in the first place.

Second, no: AMD cards do a significant job in 4K and in fact are rated as better on many sites at the 4K threshold in many games. That review from Tom's is dated September 19, 2013 and does not even include all the upper cards in the list. So do not pull your "AMD sucks at 4K" card when it in fact shows that 2 290Xs are able to play almost every major game at 4K very well while costing $150+ less and giving 1GB more VRAM.
http://www.digitalstormonline.com/u...-resolution-overclocking-benchmarks-idnum118/

I also have 3 cards, so it will play whatever I want, and I do not care what you say or count as frivolous. If I have to drop a notch on MSAA, so be it, but the results are more than promising, and I have already tested my rig with a 4K monitor beforehand, so I have seen with my own eyes what to expect.

Do not care what you have to say beyond this point.
 
So do not pull your "AMD sucks at 4K" card when it in fact shows that 2 290Xs are able to play almost every major game at 4K very well while costing $150+ less and giving 1GB more VRAM.
The performance numbers are still pitiful in demanding games, even when you crank up the goodies on some dated titles; any setup can run older games, and that never means anything.
Needing 3 GPUs to play on a single monitor means your GPUs suck at running that resolution, and the price-for-performance you get is about as bad as it gets. It takes 2-3 GPUs, and it's all they can do to muster 25-60 FPS in Crysis 3, Metro, BF4, etc.

These charts are from April '14.



So yeah, it's playable.
It still doesn't change the fact that today's GPUs suck at running 4K. SUCK.
Your tone is very negative, but I guess it can be difficult to accept that your beloved R9s get brought to their knees by games made 2 years ago pushing a single monitor. You will need at least 3 R9s to clear 60FPS in several games. Pretty f***ing pitiful.

You can always turn some goodies down or off to make things run smoother; I am sure everyone who dropped that much $$$ on 2-3 GPUs & a 4K monitor will want to do that.
 