Radeon 6000 Easter egg discovered in Fortnite, 16GB card rumored to undercut RTX 3080

midian182

Rumor mill: Nvidia is enjoying its time in the spotlight following the reveal of its Ampere cards, but it seems AMD wants to spoil the party. Team Red has dropped an Easter egg into Fortnite teasing the upcoming arrival of the Radeon RX 6000 series, and we’ve heard rumors that a 16GB version of Big Navi will be priced to undercut the $699 RTX 3080.

Streamer GinaDarling discovered the Fortnite Easter egg in AMD’s Battle Arena. Upon entering a phone booth, she was teleported to a secret “AMD Radeon” room that contains a special console. After entering the passcode “6000” into a terminal, the words “something big is coming to the AMD battle arena” appeared.

One of the people to congratulate GinaDarling was AMD’s Scott Herkelman, CVP & GM at Radeon. While the Easter egg doesn’t reveal too much, this appears to be AMD’s first step in embedding Big Navi in the public’s consciousness.

Elsewhere, tech analysis YouTube channel Coreteks tweeted that AMD wanted to release a 16GB version of the RDNA 2-based card at $599, but following the Ampere announcement, that price will likely drop to $549.

This isn’t the first time we’ve heard such a rumor; back in July, it was reported that Big Navi wouldn’t be the Ampere killer first thought, offering only around 15 percent better performance than the RTX 2080 Ti, which would put it slightly above the $499 RTX 3070.

If the 16GB rumor is true, it would mean the Big Navi card features double the memory of the RTX 3070 and 6GB more than the RTX 3080.

We’ve heard previous claims that there will be two Navi 21-based cards with 12GB and 16GB of GDDR6 RAM, rather than the GDDR6X used in the RTX 3080/3090. Coreteks adds that there will not be a consumer version with HBM.

The rumor should be taken with a pinch of salt, of course; Coreteks’ source is an unnamed “AMD partner,” though parts of it do line up with some previous claims.

AMD boss Lisa Su has repeatedly said that both Big Navi and Zen 3 are arriving this year. The company also confirmed that the former would be here before the November launch of the next-gen consoles, which use RDNA 2 GPUs, meaning an October release—possibly the 7th—seems likely. How they stack up against Ampere remains to be seen, especially as an RTX 3070 Ti is already rumored to be in the works, and the RTX 3080 can allegedly reach 100+ fps in many AAA games in 4K at max settings with RTX on.


 
I'm still waiting to see an AMD card that beats the 2080Ti.

The 3070 is going to be a market killer at $499. It's everything the 2060 should have been. Something tells me that the 3070 and 3080 are going to be the cards to beat, but will be unbeatable for some time.
 
Hopefully the unnamed AMD partner is not the same AIB who leaked that Ampere would be on 7nm.
Seems like AMD learned from nVidia to control leaks and keep quiet before the launch.

Either way, all I personally care about is a good 3070 competitor as that is the segment I am looking at. Will decide which of the two to get after the first reviews are out.

From a general market perspective, ideally AMD should compete up to the 3080 level, but as I said, it does not matter to me personally. An uber 3090 or 6900XT won't make my 3070 / 6700 any better.
 
Hopefully the unnamed AMD partner is not the same AIB who leaked that Ampere would be on 7nm.
Seems like AMD learned from nVidia to control leaks and keep quiet before the launch.

Either way, all I personally care about is a good 3070 competitor as that is the segment I am looking at. Will decide which of the two to get after the first reviews are out.

From a general market perspective, ideally AMD should compete up to the 3080 level, but as I said, it does not matter to me personally. An uber 3090 or 6900XT won't make my 3070 / 6700 any better.
AMD has total control over leaks the same way Nvidia does. 99% of the leaks we get to see are intentional, Nvidia has no intent to chase those leakers because the leaks are on purpose. AMD and Nvidia both use the same tactic, "leaking" like a sieve when they are confident, and suddenly becoming Ft. Knox when they are not. AMD is not confident this round at all. All the hype to the contrary is just hype. If AMD was really confident, we'd see a LOT more leaks already with benchmark numbers and the whole lot, just like Ampere.

When AMD suddenly gets good at security, it's not because they got good at security; it's because they don't think what they have will take top honors and expect it to fall short. Significantly short: the 3090 is out of reach and the 3080 is a monster to chase. AMD is in more trouble than fanboys want to admit, and price cuts before even launching are NEVER a good sign.
 
AMD has total control over leaks the same way Nvidia does. 99% of the leaks we get to see are intentional, Nvidia has no intent to chase those leakers because the leaks are on purpose. AMD and Nvidia both use the same tactic, "leaking" like a sieve when they are confident, and suddenly becoming Ft. Knox when they are not. AMD is not confident this round at all. All the hype to the contrary is just hype. If AMD was really confident, we'd see a LOT more leaks already with benchmark numbers and the whole lot, just like Ampere.

When AMD suddenly gets good at security, it's not because they got good at security; it's because they don't think what they have will take top honors and expect it to fall short. Significantly short: the 3090 is out of reach and the 3080 is a monster to chase. AMD is in more trouble than fanboys want to admit, and price cuts before even launching are NEVER a good sign.
I'd love to comment but I don't even know where and how to start... your post is just all over the place.
 
I'm still waiting to see an AMD card that beats the 2080Ti.

The 3070 is going to be a market killer at $499. It's everything the 2060 should have been. Something tells me that the 3070 and 3080 are going to be the cards to beat, but will be unbeatable for some time.

Until a week ago, you'd only heard of one from NV.
 
I'm pretty sure that we'll see a 16GB card that's competitive with the 3080 (at the same or lower price), and a 12GB card that is faster than the 3070 at a similar price. But yes, I don't think they're shooting for 3090.
 
AMD has already beaten Nvidia price-wise. Regarding AMD's lagging benchmark figures, it doesn't matter. It only matters in the benchmarks.

This generation AMD will be the same: it won't beat the 3080, but on price it might be the clear winner. I'm not going to pay a 100-200% price increase for a 10-30 fps lead. All I need is a constant, smooth minimum of 60 fps.

Those who want a 144 fps minimum on their 144Hz screens asked for it; they need to buy the card that keeps up with their screen.

Anyway, to each, his own.
 
Ray tracing performance and a DLSS equivalent: what can AMD offer?

True AI reconstruction, not just some image sharpening, which DLSS 2.0 clearly beats in countless close-analysis articles.

Nvidia offers a feature-rich package here, which they know they can mark up. If AMD cannot match it or get close to that, then it needs to be a fair bit cheaper, otherwise it won't sell all that well again.
 
If this means I can get a 5700 level of performance with Ray tracing for around £250 I'm game. Tbf if I was into 4K gaming I'd be worried about the life expectancy of the 3070 due to that VRAM size.
 
I'm still waiting to see an AMD card that beats the 2080Ti.

The 3070 is going to be a market killer at $499. It's everything the 2060 should have been. Something tells me that the 3070 and 3080 are going to be the cards to beat, but will be unbeatable for some time.
The AMD GPU in the upcoming ~$600 Xbox is 12 TFLOPS, just like the 2080 Ti, hence the 3070. Minus the cost of the 1TB SSD, case, power supply and the transistors for 8 CPU cores, Big Navi can do 3070 spec at lower than $499.
 
Ray tracing performance and a DLSS equivalent: what can AMD offer?

True AI reconstruction, not just some image sharpening, which DLSS 2.0 clearly beats in countless close-analysis articles.

Nvidia offers a feature-rich package here, which they know they can mark up. If AMD cannot match it or get close to that, then it needs to be a fair bit cheaper, otherwise it won't sell all that well again.
The question is, which would be preferable: AI upscaling that only works in select games and requires a supercomputer for training (I believe nVidia mentioned that) but offers great quality, or universal upscaling with good enough but noticeably worse quality than DLSS, which works in any game and does not require any additional work from the developers?

I am betting that AMD will offer the latter, as at least the Xbox will probably use something like this to get better visuals out of old-gen games, plus this would be very useful for an APU like Van Gogh.
 
The AMD GPU in the upcoming ~$600 Xbox is 12 TFLOPS, just like the 2080 Ti, hence the 3070. Minus the cost of the 1TB SSD, case, power supply and the transistors for 8 CPU cores, Big Navi can do 3070 spec at lower than $499.
The FP32, texturing, and pixel rates are all roughly 10% down on a 2080 Ti, so it's very close. Even the memory bandwidth is in the same ballpark. So there's no doubt that AMD can produce a competitor to the 2080 Ti/3070. However, they really shouldn't sell it for less than $450 - they need to drop their image as being the 'cheap alternative.'
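For anyone wondering where those raw throughput figures come from, here's a minimal sketch of the usual FP32 TFLOPS arithmetic. The shader counts and clocks are the published specs for the Series X GPU and the 2080 Ti, but raw TFLOPS is only a loose proxy for real game performance:

```python
# Back-of-the-envelope FP32 throughput - the number behind "12 TFLOPS vs 2080 Ti".
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # Each shader does 2 FP32 ops per clock (fused multiply-add).
    return shaders * 2 * clock_ghz / 1000

xbox_series_x = fp32_tflops(3328, 1.825)  # 52 CUs x 64 shaders at a fixed 1.825 GHz
rtx_2080_ti = fp32_tflops(4352, 1.545)    # 4352 CUDA cores at the reference boost clock

print(f"Series X GPU: {xbox_series_x:.1f} TFLOPS")  # ~12.1
print(f"RTX 2080 Ti:  {rtx_2080_ti:.1f} TFLOPS")    # ~13.4
```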
 
I’m not holding my breath for anything amazing. A 16GB card is interesting though; we have been on 8GB flagships for years now, so it feels like it’s about time. Although I think that really only 4K will benefit from it.

What this new Radeon card really needs is a form of DLSS. With the latest DLSS implementations delivering both better visual quality and frame rate, there is no reason to turn it off, giving Nvidia cards a huge boost over anything else that can’t do it.

My prediction is that big Navi will edge a 3070 in normal settings but will lack DLSS which could swing the advantage back to Nvidia, potentially quite heavily so.

I don’t think it will matter much if big Navi can or can’t do ray tracing.
 
AMD has total control over leaks the same way Nvidia does. 99% of the leaks we get to see are intentional, Nvidia has no intent to chase those leakers because the leaks are on purpose. AMD and Nvidia both use the same tactic, "leaking" like a sieve when they are confident, and suddenly becoming Ft. Knox when they are not. AMD is not confident this round at all. All the hype to the contrary is just hype. If AMD was really confident, we'd see a LOT more leaks already with benchmark numbers and the whole lot, just like Ampere.

When AMD suddenly gets good at security, it's not because they got good at security; it's because they don't think what they have will take top honors and expect it to fall short. Significantly short: the 3090 is out of reach and the 3080 is a monster to chase. AMD is in more trouble than fanboys want to admit, and price cuts before even launching are NEVER a good sign.

That's probably true - but AMD is not leaking Zen 3 either, though that could be to keep selling Zen 2. Also, I'm not sure how much capacity they have with the PS5 and Xbox consoles coming out. As I stated before, if I were AMD I would target the 3060/3070 and use what I've gained from Microsoft's and Sony's input to get the most out of that hardware on day 1, not day 730. Plus, I have an Nvidia card but I trust nothing they say - see their latest Nvidia Shield update; I think they're still advertising it with a big speed boost when it's been shown to be basically the same chip as the older model.
 
The FP32, texturing, and pixel rates are all roughly 10% down on a 2080 Ti, so it's very close. Even the memory bandwidth is in the same ballpark. So there's no doubt that AMD can produce a competitor to the 2080 Ti/3070. However, they really shouldn't sell it for less than $450 - they need to drop their image as being the 'cheap alternative.'
I think it will still be like that for RDNA 2. Versus Ampere on 8nm, it doesn't have a lot of room for a price premium like in the "Zen 2 vs Intel 14nm" case.
 
AMD will be able to match the 3070 and 3080 in performance and undercut them a bit on price with up to 16GB variants. They won’t try to beat the 3090, nor should they waste time or effort on a card that will be a niche product; spend your R&D dollars on products that will sell in far higher numbers. A 112-CU 6900 XT will be at least double the performance of a 5700 XT, so they have the 3080 covered, and they will have no trouble matching the 3070 with, say, 60 CUs, given the likely >2GHz clocks, 10-15% IPC uplift and extra CUs.
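Here's a rough back-of-the-envelope version of that scaling argument. The 112 CU count, >2GHz clock and 10-15% IPC uplift are rumoured/assumed figures (the hypothetical 60-CU part included), and real games never scale this linearly:

```python
# Naive throughput scaling vs. a 5700 XT (40 CUs, ~1.9 GHz game clock).
def relative_throughput(cus: int, clock_ghz: float, ipc_uplift: float,
                        base_cus: int = 40, base_clock_ghz: float = 1.9) -> float:
    return (cus / base_cus) * (clock_ghz / base_clock_ghz) * (1 + ipc_uplift)

# Rumored/assumed figures, not confirmed specs:
print(f"112 CU part: {relative_throughput(112, 2.1, 0.10):.1f}x a 5700 XT")  # ~3.4x
print(f"60 CU part:  {relative_throughput(60, 2.1, 0.10):.1f}x a 5700 XT")   # ~1.8x
```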
 
AMD has total control over leaks the same way Nvidia does. 99% of the leaks we get to see are intentional, Nvidia has no intent to chase those leakers because the leaks are on purpose. AMD and Nvidia both use the same tactic, "leaking" like a sieve when they are confident, and suddenly becoming Ft. Knox when they are not. AMD is not confident this round at all. All the hype to the contrary is just hype. If AMD was really confident, we'd see a LOT more leaks already with benchmark numbers and the whole lot, just like Ampere.

When AMD suddenly gets good at security, it's not because they got good at security; it's because they don't think what they have will take top honors and expect it to fall short. Significantly short: the 3090 is out of reach and the 3080 is a monster to chase. AMD is in more trouble than fanboys want to admit, and price cuts before even launching are NEVER a good sign.

Just to be clear, both prices are rumored and not official so it's impossible for this to be called a price cut. You can't cut a price when a price hasn't been set.

As long as it doesn't come at 150% of the TDP....

That's impossible given the extremely high TDP of Nvidia's new cards. Nvidia chose samsung 8nm and all disadvantages that come with it. Just for comparison the R9 290X had a TDP of 250w, the 3090 has a TDP of 350w. You can't add 50% on top of the already insane TDP of Nvidia's new cards, it wouldn't be feasible.

Ray tracing performance and a DLSS equivalent: what can AMD offer?

True AI reconstruction, not just some image sharpening, which DLSS 2.0 clearly beats in countless close-analysis articles.

Nvidia offers a feature-rich package here, which they know they can mark up. If AMD cannot match it or get close to that, then it needs to be a fair bit cheaper, otherwise it won't sell all that well again.

New consoles have Ray Tracing and AI and so does every RDNA2 card so it's really not a matter of if anymore but when.

FYI there are less than a handful of games that support DLSS 2.0 so "countless" is a gross exaggeration. I have yet to even play a game that supports DLSS, which makes sense given that it represents less than 0.0001% of the market.




I’m not holding my breath for anything amazing. A 16GB card is interesting though; we have been on 8GB flagships for years now, so it feels like it’s about time. Although I think that really only 4K will benefit from it.

What this new Radeon card really needs is a form of DLSS. With the latest DLSS implementations delivering both better visual quality and frame rate, there is no reason to turn it off, giving Nvidia cards a huge boost over anything else that can’t do it.

My prediction is that big Navi will edge a 3070 in normal settings but will lack DLSS which could swing the advantage back to Nvidia, potentially quite heavily so.

I don’t think it will matter much if big Navi can or can’t do ray tracing.

RDNA2 is confirmed to support hardware accelerated ray tracing and AI functions. What's not confirmed is if AMD is using those to create a specific tech to counter Nvidia.

DLSS will start mattering to me when it's in more than just 0.0001% of all games. It's extremely nice to have in games that support it but I have yet to play a single game that does.

I'd like to see AMD release cards with a lower TDP than Nvidia at similar performance. 320w and 350w is just crazy. My 1080 Ti has a TDP of 250w (actual power consumption at 328w) and even the notably hot R9 290X had a TDP of 250w. Actual power consumption of a 350w TDP card is going to easily exceed 400w. Add in 280w for the CPU, 22w for the mobo, 4w for the RAM, and 12w for fans and SSD and you are looking at 718w. Mind you, that's assuming you don't OC your CPU. This is why ASUS stated that many people will likely have to upgrade their PSUs to use these new GPUs, and they even went so far as to include indicators on the PCIe power connectors that tell you if your power supply is failing to deliver enough power.
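Here's that power budget as a quick worked sum, using the same rough estimates as above; the PSU headroom rule of thumb at the end is just my own assumption, not anything ASUS or Nvidia recommend:

```python
# Estimated peak system draw, using the comment's worst-case component figures (watts).
components_w = {
    "GPU (350w TDP card at peak)": 400,
    "CPU": 280,
    "Motherboard": 22,
    "RAM": 4,
    "Fans + SSD": 12,
}

total = sum(components_w.values())
print(f"Estimated peak draw: {total} W")  # 718 W, before any CPU overclock

# Rule of thumb: keep sustained load well under the PSU rating, e.g. ~80% of it.
print(f"Suggested PSU rating: ~{round(total / 0.8 / 50) * 50} W")  # ~900 W
```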
 
Just to be clear, both prices are rumored and not official so it's impossible for this to be called a price cut. You can't cut a price when a price hasn't been set.



That's impossible given the extremely high TDP of Nvidia's new cards. Nvidia chose samsung 8nm and all disadvantages that come with it. Just for comparison the R9 290X had a TDP of 250w, the 3090 has a TDP of 350w. You can't add 50% on top of the already insane TDP of Nvidia's new cards, it wouldn't be feasible.

New consoles have Ray Tracing and AI and so does every RDNA2 card so it's really not a matter of if anymore but when.

FYI there are less than a handful of games that support DLSS 2.0 so "countless" is a gross exaggeration. I have yet to even play a game that supports DLSS, which makes sense given that it represents less than 0.0001% of the market.

RDNA2 is confirmed to support hardware accelerated ray tracing and AI functions. What's not confirmed is if AMD is using those to create a specific tech to counter Nvidia.

DLSS will start mattering to me when it's in more than just 0.0001% of all games. It's extremely nice to have in games that support it but I have yet to play a single game that does.

I'd like to see AMD release cards with a lower TDP than Nvidia at similar performance. 320w and 350w is just crazy. My 1080 Ti has a TDP of 250w (actual power consumption at 328w) and even the notably hot R9 290X had a TDP of 250w. Actual power consumption of a 350w TDP card is going to easily exceed 400w. Add in 280w for the CPU, 22w for the mobo, 4w for the RAM, and 12w for fans and SSD and you are looking at 718w. Mind you, that's assuming you don't OC your CPU. This is why ASUS stated that many people will likely have to upgrade their PSUs to use these new GPUs, and they even went so far as to include indicators on the PCIe power connectors that tell you if your power supply is failing to deliver enough power.

nVidia is announcing TGP (Total Graphics Power) for the RTX 3000 series, not TDP (Thermal Design Power, which is usually close to the consumption of just the GPU chip).

Also, nVidia was already more efficient at 12nm (16nm really) than AMD at 7nm. Do you really think that AMD staying at 7nm, with just some architecture changes, will be more efficient than nVidia moving to 8nm? XD

3rd: from console data, the RT performance of those chips is weaker than that of the RTX 2060. Not really that big a competition.

DLSS 1.0 had weak adoption because it lacked quality and had to be trained per game.
DLSS 2.0 was launched in April and is gaining a lot of traction. It's going to be present in a lot of the major games this year (Crysis Remastered, Cyberpunk, Watch Dogs Legion, Call of Duty, Fortnite).
DLSS doesn't need to be in every single game, it just needs to be in the most played and most demanding games to make a HUGE difference.
 
nVidia is announcing TGP (Total Graphics Power) for the RTX 3000 series, not TDP (Thermal Design Power, which is usually close to the consumption of just the GPU chip).

From Nvidia's website

" TGP, or Total Graphics Power, is a more specific term for the power that a power supply should provide to the graphics subsystem"


Still not a measurement of actual power consumption, and just like TDP, it does not delineate under what clocks or utilization it was derived. Worthless.

And no, TDP is not close to the consumption of just the GPU. GamersNexus did an analysis on TDP; it's mostly arbitrary.

TGP isn't replacing TDP, either: Nvidia still provided TDP numbers for its cards.


Also, nVidia was already more efficient at 12nm (16nm really) than AMD at 7nm. Do you really think that AMD staying at 7nm, with just some architecture changes, will be more efficient than nVidia moving to 8nm? XD

1) No, Turing's 12nm is 12nm, not 16nm. TSMC doesn't lie about its node sizes.

2) Nvidia had about a 12% efficiency advantage https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/29.html

If you assume that was due to architecture, then it's also possible to assume that AMD can overcome that with their own architecture as well. After all, AMD has made massive strides in the mobile sector recently with their latest mobile chip's iGPU. Not to mention they will be on a superior process.

The Nvidia TDP numbers and recommended 750w minimum power supply speak for themselves.
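For what it's worth, an "efficiency advantage" figure like that ~12% is just performance per watt normalised to one card. The numbers in this sketch are placeholders to show the arithmetic, not values pulled from the linked review:

```python
# Performance-per-watt comparison with made-up placeholder numbers.
def perf_per_watt(relative_perf: float, avg_power_w: float) -> float:
    return relative_perf / avg_power_w

card_a = perf_per_watt(1.00, 220)  # baseline card
card_b = perf_per_watt(1.05, 205)  # slightly faster card drawing slightly less power

print(f"Efficiency advantage of card B: {card_b / card_a - 1:.0%}")  # ~13%
```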

3rd: from console data, the RT performance of those chips is weaker than that of the RTX 2060. Not really that big a competition.

Where are you getting your performance numbers from? They've already shown off Minecraft performing better on the consoles than a 2080 Ti.

DLSS doesn't need to be in every single game, it just needs to be in the most played and most demanding games to make a HUGE difference.

The thing is no one knows what the most played games will be other than the obvious ones like Cyberpunk. You can implement it in AAA titles that are obviously going to be played, but that doesn't mean it's going to find its way into the most played games. I'd have to disagree on only implementing it in popular titles. Not everyone likes to play the same things as everyone else. I'm also surprised Nvidia doesn't have a single VR game with the feature given that VR stands to benefit the most. Maybe the tech doesn't work with VR, who knows.
 
Just to be clear, both prices are rumored and not official so it's impossible for this to be called a price cut. You can't cut a price when a price hasn't been set.



That's impossible given the extremely high TDP of Nvidia's new cards. Nvidia chose samsung 8nm and all disadvantages that come with it. Just for comparison the R9 290X had a TDP of 250w, the 3090 has a TDP of 350w. You can't add 50% on top of the already insane TDP of Nvidia's new cards, it wouldn't be feasible.



New consoles have Ray Tracing and AI and so does every RDNA2 card so it's really not a matter of if anymore but when.

FYI there are less than a handful of games that support DLSS 2.0 so "countless" is a gross exaggeration. I have yet to even play a game that supports DLSS, which makes sense given that it represents less than 0.0001% of the market.






RDNA2 is confirmed to support hardware accelerated ray tracing and AI functions. What's not confirmed is if AMD is using those to create a specific tech to counter Nvidia.

DLSS will start mattering to me when it's in more than just 0.0001% of all games. It's extremely nice to have in games that support it but I have yet to play a single game that does.

I'd like to see AMD release cards with a lower TDP than Nvidia at similar performance. 320w and 350w is just crazy. My 1080 Ti has a TDP of 250w (actual power consumption at 328w) and even the notably hot R9 290X had a TDP of 250w. Actual power consumption of a 350w TDP card is going to easily exceed 400w. Add in 280w for the CPU, 22w for the mobo, 4w for the RAM, and 12w for fans and SSD and you are looking at 718w. Mind you, that's assuming you don't OC your CPU. This is why ASUS stated that many people will likely have to upgrade their PSUs to use these new GPUs, and they even went so far as to include indicators on the PCIe power connectors that tell you if your power supply is failing to deliver enough power.
Actually, quite a lot of games support DLSS, this article lists 14 current with 26 games on the way. That’s more than the “small handful” you falsely claimed.


This includes many big name games like Fortnite and PubG. Along with a whole wealth of other AAA games.
 
Actually, quite a lot of games support DLSS, this article lists 14 current with 26 games on the way. That’s more than the “small handful” you falsely claimed.


This includes many big name games like Fortnite and PubG. Along with a whole wealth of other AAA games.

No, it's Jensen Huang who falsely claimed...!

I have had my rtx2080 for 20+ months and rtx/dlss has been essentially non-existent... and we know why. Game Developers are NOT supporting this technology... they have gravitated toward Industry Standards... that nVidia themselves will also have to support.

So no matter what is promised by nVidia, it's their proprietary stuff and NO game dev is going to worry about it unless nVidia pays them. Game Developers will be working with DirectML & DXR... because both nVidia & AMD support it.

There will be 20x more DX12U Series X games than paid nVidia games... (meaning nVidia's proprietary DLSS is marketing only).

 
Actually, quite a lot of games support DLSS, this article lists 14 current with 26 games on the way. That’s more than the “small handful” you falsely claimed.


This includes many big name games like Fortnite and PubG. Along with a whole wealth of other AAA games.
Wow, 14 games two years after Turing's launch. Impressive.
Wonder how that compares to the number of games nVidia promised would have DLSS support.

According to PC World, it's about a third.


So for someone who got a 2060 or 2070 near launch and now plans on upgrading to Ampere, how much of a benefit was RTX and DLSS support in the end?
 