AMD confirms Radeon 6000 reveal and Zen 3 launch for October

midian182

Posts: 6,095   +50
Staff member
Highly anticipated: With most of the PC world having palpitations over the arrival of Nvidia's Ampere cards, some have forgotten about rival AMD—but that could soon change. The company has taken to Twitter, with CEO Lisa Su confirming unveil dates for the RDNA 2-based Radeon 6000 series and Zen 3 CPUs.

AMD has confirmed it is ready to roll out its next-generation products starting next month. First up is the next wave of Ryzen desktop processors using the "Zen 3" architecture, at an event taking place on October 8 at 12pm ET.

Weeks later, the next generation of Radeon will be revealed on October 28 at 12pm ET. AMD has confirmed we're talking about Big Navi, the RDNA 2-based Radeon RX 6000 series graphics cards.

We knew that AMD was getting ready to tease more of its upcoming graphics cards after an Easter egg was discovered in Fortnite. Streamer GinaDarling found a secret Radeon room in AMD's Battle Arena that contained a special console. After entering the code "6000," the words "something big is coming to the AMD battle arena" appeared.

A lot of details on Big Navi remain unknown. We've heard there may be a 16GB version of AMD's card that will undercut the $699 RTX 3080 with a price of $549. Elsewhere, another rumor claimed it would offer around 15 percent better performance than the $499 RTX 3070.

Earlier this morning, Frank Azor, Chief Architect of Gaming Solutions & Marketing at AMD, tweeted the lyrics to 'Tomorrow' from the musical Annie. This was later retweeted by AMD's director of marketing, Sasa Marinkovic, making it clear something was imminent.

Big Navi will be competing with some exciting-looking Nvidia cards. The RTX 3080 promises over 100 fps in many top titles at 4K with max settings and RTX on. We've also just seen prices for the 3080 and 3090 aftermarket cards.

Moving from graphics cards to CPUs, AMD's Zen 3-based Ryzen 4000 desktop processors are also arriving this year. They're rumored to come with a slew of new features, and a 10-core model could be part of the Ryzen lineup for the first time.


 

Adhmuz

Posts: 2,062   +854
300 watt single fan blower cards 🙄🙄

Please just have decent reference models FFS
Oh it'll be a blower, you can count on that, and it doesn't matter how good the reference design is, the AIBs will cheap out and make subpar products just to get the price as low as possible.

Some say AMD should have stricter QA for its AIBs, others say it's the consumers' fault for buying them so cheap. The best thing to do is wait for reviews and find out which ones are good and which are not.
 

zamroni111

Posts: 122   +57
It's logical that AMD must announce Big Navi before RTX 3000 pre-orders start, i.e. September 17.
Reviewers aside, people who have already bought a 3080/3090 won't spend their money on another GPU for at least a year, maybe more.
 

Squid Surprise

Posts: 3,481   +2,366
I want it to be at least on par with the 3080 and have 16GB of memory for, say, $649. I could give up RT and DLSS for extra memory and a slightly smaller price tag
That would be nice... but I wouldn't hold my breath... Nvidia seems to think so too, as rumours of a 3070 Ti being held in reserve suggest...
 

Puiu

Posts: 4,098   +2,661
I'm curious. What game(s) do you need 16GB of VRAM for?
10GB should still be fine for a few more years, but I'm more worried about the 8GB 3070. Turing already shows signs of bottlenecks at 4K with only 8GB in several titles. Rendering or other types of workstation workloads definitely hit the 8GB limit.

I'm just guessing here, but Nvidia may have had a limit based on the size of the PCB on how much VRAM they could use. Otherwise I don't see why they would stick to 8 and 10GB for the high-end cards in 2020.
 

grumblguts

Posts: 355   +305
The 3070 Ti will have 14 gig, I think that's the one worth buying.
This 3080 with its 10 gig may be enough for today's games, but what about tomorrow's?
With the games coming out this year I don't think 10 gig will be enough.
I see memory swapping with 10 gig of VRAM. We will have to wait and see.

AMD is constantly playing catch up, and it gets kind of boring, to be honest, not being the winner.
 

jonny888

Posts: 112   +157
It's for the long haul of the card.

We know VRAM usage has been going up and 2-3 years from now you will want your GPU to have 16GB vs 10GB. Not everyone upgrades gpus every year.

You are thinking about now; he is thinking long term.
I'm not thinking of anything in particular. I was asking a question about his needs.
 

Adi6293

Posts: 584   +692
10GB should still be fine for a few more years, but I'm more worried about the 8GB 3070. Turing already shows signs of bottlenecks at 4K with only 8GB in several titles. Rendering or other types of workstation workloads definitely hit the 8GB limit.

I'm just guessing here, but Nvidia may have had a limit based on the size of the PCB on how much VRAM they could use. Otherwise I don't see why they would stick to 8 and 10GB for the high-end cards in 2020.
It's quite simple: they first sell you a card with just enough RAM, and then a year later they sell you a card with too much RAM :-P They are creating a problem and then a solution to that problem
 

jonny888

Posts: 112   +157
I game at 4K. I don't need 16GB right now, but it's definitely better to have it than not to have it
This seems to be the common answer so I'll reply to this. All I can say is that you may be right, or you may be wrong. Logic and history suggest that VRAM usage will rise over time, but if nothing requires 10GB right now, then it remains pure speculation whether you'll need more than that during the lifetime of the card (which we also don't know).

On the plus side, assuming you aren't jumping to a 3090, you can bet there'll be a 3080Ti with more VRAM in the future.
 

jonny888

Posts: 112   +157
It's quite simple: they first sell you a card with just enough RAM, and then a year later they sell you a card with too much RAM :-P They are creating a problem and then a solution to that problem
Also plausible. But that only works if game makers make games that actually use that much VRAM (which they may not want to do if they look at the market and see that most people only have X amount of VRAM on average)
 

neeyik

Posts: 1,367   +1,486
Staff member
I'm just guessing here, but Nvidia may have had a limit based on the size of the PCB on how much VRAM they could use. Otherwise I don't see why they would stick to 8 and 10GB for the high-end cards in 2020.
The limit is to do with the number of memory controllers in the GPU. The GA102 has 12 in total, and 2 are disabled/unusable for the dies that go into 3080s. Each controller has a data bus of 32 bits in width, so the memory modules that get wired directly to it are also 32 bits wide.

Currently, only Micron offers GDDR6X, and its modules are all 32 bits wide. There's also only one density available: 8 Gbit (1 GB). So for the 3080, with its 10 controllers, the only possible memory configurations are 10 x 1 GB in 32-bit mode or 20 x 1 GB in 16-bit mode. Both give the same bandwidth, but the latter provides a larger memory footprint.

The 3090, with all 12 controllers active, uses a 24 x 1 GB in 16 bit mode configuration, giving a total of 24 GB.
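The arithmetic above can be sketched in a few lines of Python, assuming 1 GB (8 Gbit), 32-bit-wide GDDR6X modules as described; the function and names are just illustrative, not anything from an actual driver or spec sheet:

```python
# Sketch of the VRAM-capacity arithmetic above, assuming 8 Gbit (1 GB),
# 32-bit-wide GDDR6X modules (the only parts Micron offered at the time).

MODULE_GB = 1          # capacity per memory module
MODULE_BUS_BITS = 32   # native data-bus width of each module

def vram_configs(active_controllers: int) -> dict:
    """Possible capacities for a GPU with 32-bit memory controllers.

    In 32-bit mode one module hangs off each controller; in 16-bit mode
    two modules share a controller, each using half the bus, doubling
    capacity at the same total bandwidth.
    """
    return {
        "bus_width_bits": active_controllers * MODULE_BUS_BITS,
        "x32_capacity_gb": active_controllers * MODULE_GB,
        "x16_capacity_gb": active_controllers * 2 * MODULE_GB,
    }

# RTX 3080: GA102 with 10 of its 12 controllers enabled
print(vram_configs(10))  # 320-bit bus, 10 GB (x32) or 20 GB (x16)

# RTX 3090: all 12 controllers active, shipped in 16-bit mode -> 24 GB
print(vram_configs(12))
```

Plugging in 10 controllers reproduces the 3080's 10 GB / 20 GB options, and 12 controllers gives the 3090's 24 GB.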
 

Adi6293

Posts: 584   +692
This seems to be the common answer so I'll reply to this. All I can say is that you may be right, or you may be wrong. Logic and history suggest that VRAM usage will rise over time, but if nothing requires 10GB right now, then it remains pure speculation whether you'll need more than that during the lifetime of the card (which we also don't know).

On the plus side, assuming you aren't jumping to a 3090, you can bet there'll be a 3080Ti with more VRAM in the future.
I doubt I will be jumping to a 3090, but a 20GB RTX 3080 Ti could work too. A card with more RAM will also be easier to sell once I don't want it and games do use more than 10GB. Plus, if you look at history, every time Nvidia brought out a GPU with two different memory capacities, the one with more RAM stood the test of time better: GTX 580, GTX 770, GTX 960.
 

Adi6293

Posts: 584   +692
Also plausible. But that only works if game makers make games that actually use that much VRAM (which they may not want to do if they look at the market and see that most people only have X amount of VRAM on average)
There will always be that developer that pushes the boundaries, and all it takes is one hot game. This year it will be CP2077; many people will buy RTX cards just to play this game.
 