Radeon 6000 Easter egg discovered in Fortnite, 16GB card rumored to undercut RTX 3080

Wow, 14 games two years after Turing's launch. Impressive.
Wonder how that compares to the number of games nVidia promised would have DLSS support.

According to PC World, it's about a third.


So for someone who got a 2060 or 2070 near launch and now plans on upgrading to Ampere, how much of a benefit were RTX and DLSS support in the end?
Well, if it takes 2 years to get to this point, then that's very bad news for AMD. And users who bought RTX 20xx series cards actually enjoyed something when it came to ray tracing and DLSS. I get it, you want to pick two cards released two years ago to showcase the worst-case scenario for a user. Of course, if the user went with Nvidia's competitor, they would have had a Vega 64, which has no DLSS, higher temps, awful driver support, and lower performance.

I understand that you are an AMD fan and want to play down the significance of DLSS. But it’s here and it’s going to ruin your year unfortunately mate. It’s getting adopted quite well considering the tech was only announced 2 years ago.
 
I am not an AMD fan. While I strongly dislike Intel, I am pretty much neutral when it comes to nVidia. That said, if AMD and nVidia offer me roughly the same for my money (my requirements may not match yours), I would go for AMD in the interest of a competitive market, i.e. support the smaller player.

Now on to RT and DLSS. Both are definitely interesting technologies and I applaud nVidia for bringing them to the market.

What irked me was that it was presented as a "must-have" for last gen. IMHO, RTX buyers were beta testers and paid a good bit for that privilege.

Of course, if your favorite games happen to be the ones supporting RT and DLSS well and your GPU can run RT at good frame rates (I doubt that for the standard 2060 and 2070), all is well. Looking at the games my kid is playing, there would have been zero advantage over non-RT cards.

RT being here won't ruin anything for me, as my next GPU will have it either way and I avoided paying the early adopter tax.

And why would RT have ruined anything for me in the first place? Don't quite understand your point there.
 
I said DLSS will ruin your year because I'm pretty certain it's going to make for some brutal comparisons for AMD. We could see a few scenarios where a midrange Nvidia card can run Fortnite and PUBG (and a wealth of other games) faster than Big Navi when using DLSS, and looking better to boot. Currently, DLSS is allowing the older and cheaper 2060 to play some games at 20-30 fps higher than the more recent, more expensive 5700 XT at 4K.

DLSS has evolved very quickly and is now far more than something comparable to a sharpening filter. It isn't going away either, and will be used by more and more games as time goes on.
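For context on the "sharpening filter" comparison: a sharpening filter such as an unsharp mask only amplifies contrast that is already present, whereas a learned upscaler reconstructs detail from training data. A minimal illustrative sketch in plain Python (a 1D signal for simplicity; this has nothing to do with any shipping implementation):

```python
# Unsharp mask: out = original + amount * (original - blurred).
# It boosts existing edges (with characteristic overshoot) but
# cannot invent detail that was never in the signal.
def unsharp_mask(signal, amount=1.0):
    n = len(signal)
    # 3-tap box blur with edge clamping
    blurred = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0, 0, 0, 1, 1, 1]   # a soft edge
print(unsharp_mask(edge))    # values overshoot below 0 and above 1 around the edge
```

The overshoot around the edge is exactly what makes sharpened images look "crisper"; a DLSS-style network does something fundamentally different by predicting plausible high-resolution content.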

I don't think ray tracing matters much yet for sales and value. Although I did recently create a new Minecraft survival world with one of the RTX resource packs, and it genuinely is the most impressive PC graphics I have ever seen, and in a way that I haven't seen before (it ran at 70-100 fps at 1440p on a 2080, so FU to the people who claim RT tanks performance). Ray tracing is phenomenal, but we are years away from fully ray-traced modern 3D games. RT is an enthusiast's dream though, so whilst it won't matter much to sales, it should matter to enthusiasts. I love it, I want more of it, it's exactly the sort of thing I want to see. It seems that Nvidia sells the enthusiast cards and AMD sells the budget parts at the moment.
 
I have a G-Sync monitor, so I'm going Nvidia either way. But I do have a question for AMD and those who believe AMD will somehow top this Ampere release.

If your card is going to be faster and cheaper, why keep quiet? Why let your competitor lock in those early sales? If RDNA2 can beat Ampere, AMD should know by now from the info Nvidia has released. Why not spill the beans and throw a wrench in the preorders? It's not like AMD benefits from a bunch of people feeling like they got cheated and being stuck with a slower Nvidia card because they didn't wait 2 weeks. Sure, it might be fun for AMD fanboys in the forums, but it doesn't make AMD any money or add anyone to their ecosystem (and as any businessman knows, it's easier to retain a customer than earn a new one).

AMD has had plenty of surprises the past few years, but I just don't think this will be one of them. I think they already know where their cards will slot against the new Ampere lineup (and so does Nvidia). I suspect both Nvidia and AMD knew exactly what their competitor's products were capable of months ago, and it's exactly why the 3xxx series cards are being released at the performance and price they're at.

So even if you plan on buying an Nvidia card, maybe buy a Ryzen CPU. Because AMD has lit enough of a fire under Intel and Nvidia, we will hopefully see more releases like these in the future.
 
I'm not sure people think AMD is going to beat Nvidia performance-wise. Too much is unknown at this point to know for sure.

I will say that the longer AMD waits, the less time Nvidia has to react. Rumors are that Nvidia only has 10,000 cards worldwide for launch anyway, so it may be wise for AMD to just wait until after launch. It's a lot harder for Nvidia to react after video cards are already in customers' hands.
 
So, realistically, what percentage of those who order a 3080 or 3090 ASAP would consider a Radeon GPU?
 
From a general market perspective, AMD should ideally compete up to the 3080 level, but as I said, it does not matter to me personally. An uber 3090 or 6900 XT won't make my 3070 / 6700 any better.
It would make a "6700" sell better. Because people think NVIDIA has the fastest GPU, they assume NVIDIA must be good and buy a cheap one from NVIDIA.
 
I know that's how it works - car manufacturers have done this for ages.

But the halo effect still does not make the lower-end product any better, which is why I ignore it. I do find halo products exciting, no question, but I do not include them in my value-for-money calculation.

This is different for CPUs, where I know I can upgrade to the current halo product for half price on my platform down the road.
 
Nvidia is announcing TGP (total graphics power) for the RTX 30 series, not TDP (thermal design power, which is usually close to the GPU chip's consumption alone).

Also, Nvidia was already more efficient at 12 nm (really 16 nm) than AMD at 7 nm. Do you really think that AMD staying at 7 nm, with just some architecture changes, will be more efficient than Nvidia moving to 8 nm?

Third: from the console data, the RT performance of those chips is weaker than that of the RTX 2060. Not really big competition.

DLSS 1.0 had weak adoption because it lacked quality and had to be trained per game.
DLSS 2.0 was launched in April and is gaining a lot of traction. It's going to be present in a lot of the major games this year (Crysis Remastered, Cyberpunk 2077, Watch Dogs: Legion, Call of Duty, Fortnite).
DLSS doesn't need to be in every single game; it just needs to be in the most played and most demanding games to make a huge difference.
No one's even seen RDNA2 cards yet, so how can you say it's only as good as an RTX 2060?
 
Anything, absolutely anything that will force Nvidia to share the market is so f*^%ng needed these days. Let them both be, and be good. And let us, customers, enjoy fairly priced video cards which will last till 16K arrives.
 
Hey joker... how many games on the new consoles are going to feature DLSS?

So why would any of those game developers be worried about the 5% of the market where someone bought a $1,600 dGPU that uses proprietary tech, when that same DLSS company will also have to support DirectML too?

There is zero incentive for devs to use DLSS... that is why over the past 24 months there are 6 DLSS games. Nvidia has already proven to CURRENT RTX owners that Nvidia's ecosystem has fallen apart.
 
They announced the ray tracing power for the Xbox Series X, and it's worse than the 2060's ray tracing power. That's what I'm talking about.
 
DLSS 1 was weak; DLSS 2 is not.
You had 6 or 7 games with DLSS 1, but since version 2 (6 months), more than 10 have been launched/announced, which I think alone shows how it's gaining traction.

Third, the RTX 2060 is not $1,600, and the RTX 3050 won't be a $1,600 GPU either.

RTX graphics cards right now account for 10% of Steam gamers. When it comes to $200 GPUs in 3 months, that number is going to rise, a lot...

You say that BS, but I'm the joker? LOL
 
I've owned my RTX 2080 for almost 2 years, and RTX and DLSS are essentially non-existent.

And at this point, it doesn't matter if it's starting to gain traction... the consoles are here (next month), and they won't be using RTX or DLSS but something else, like DirectML... or another industry standard (e.g., Vulkan), not proprietary tech from only one company!

Nvidia will never sell more video cards than the Xbox. Which means AMD's RDNA architecture will be the dominant gaming platform. Game developers who make games for the PC will be using the Xbox/PlayStation as their backbone...

Lastly, there is no RTX 3060 or RTX 3050, is there?
 
Yes, they were... they are only now gaining real traction.
DLSS only became a good feature 6 months ago, not when you bought it. It took time.
How many years did DX12 take to be implemented in all games?
It was launched in 2015, and in 2018 there were many games still only supporting DX11...

Also, what?
DirectML is just the API for "software" to communicate with the "hardware". Nothing more; it's not some DLSS-like software.
DLSS uses DirectML (or some version of it).
But you still have to create the entire AI software behind it (DLSS-like) to do the upscaling.
Developers can create a game that on consoles runs some kind of future AI upscaling, and on PC uses DLSS (or some other).

DLSS won't stop growing with the new consoles; they aren't "replacing" it.
Not to mention consoles won't have dedicated hardware like Nvidia GPUs, so a future upscaler will have much smaller gains on consoles than on Nvidia GPUs.

And the same goes for RTX. A game that supports ray tracing hardware on consoles will also support ray tracing on Nvidia GPUs (like Cyberpunk 2077 or Watch Dogs: Legion).

In the end, the new consoles will just help boost the number of DLSS and RTX games, not the other way around...
 
Sorry Bruno... but game developers are the ones who choose... not you, me, or Nvidia.

Which is it... either DLSS is uber-super-special, or it just uses DirectML, and if so, why would Nvidia's proprietary version be better? Again, we know DirectML can do the same thing, because Microsoft showed us.

Also, Microsoft (XSX) will have more games than Nvidia. RDNA will be the predominant gaming chip... it doesn't need DLSS, because it uses DirectML/DX12U... and so will most of the games. Plus, Nvidia has to support it too.

So good luck to Nvidia training an AI model for every game... while 90% of the gaming industry uses the standard.
 
Not true.
DirectML is just an API; it does the communication between the software and the hardware, nothing more.
DirectML alone doesn't do anything: no upscaling, no calculations of any kind, NOTHING.
DirectML is the "DirectX" for machine learning applications.

Game > DirectX 12 > Drivers > GPU
Machine learning application (e.g., an AI upscaler) > DirectML > Drivers > GPU

It's just a standardization of communication for machine learning applications to interact with the hardware. Not only upscalers but any application, like character AI or applications that have nothing to do with gaming.

Microsoft might come up with some kind of upscaler implementation, but that doesn't mean developers have to choose one or the other.
That would be the same as saying that developers have to choose between the game running only on Nvidia GPUs or only on AMD GPUs.
That doesn't happen because you have DirectX, and both the game and the GPU drivers are coded to interface through it.
As long as the upscaler is coded to communicate through DirectML, you can use one on Xbox and another on PC. The developers don't have to care about that...
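The layering described above can be sketched in a few lines. This is a purely hypothetical illustration (toy names, not real DirectML calls): the "API layer" only executes generic operators, while the upscaling "model" is a graph that someone else ships on top of it and that can run anywhere the operators exist.

```python
# The "API layer": generic operators with no upscaling knowledge,
# standing in for what an ML API like DirectML provides.
OPS = {
    "upsample2x": lambda x: [v for v in x for _ in range(2)],  # nearest-neighbor
    "scale":      lambda x, k: [v * k for v in x],
}

def run_graph(api_ops, graph, tensor):
    """Execute a model: a list of (op_name, kwargs) steps, as an API runtime would."""
    for op_name, kwargs in graph:
        tensor = api_ops[op_name](tensor, **kwargs)
    return tensor

# The "model": defined by whoever ships the upscaler (a vendor, a platform
# holder, ...). It is just data describing which operators to run, so it is
# portable to any hardware stack implementing the same operator set.
toy_upscaler = [("upsample2x", {}), ("scale", {"k": 2})]

print(run_graph(OPS, toy_upscaler, [1, 2, 3]))  # [2, 2, 4, 4, 6, 6]
```

The point of the sketch: swapping the model (the graph) does not require changing the API, and swapping the hardware only requires reimplementing the operators, which is the separation of concerns the post argues for.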
 
Again, you are assuming Microsoft can't use DirectML to do what DLSS is doing. That is your mistake.

Microsoft already claimed back in 2018 that they can upscale or sharpen images using DirectX 12 Ultimate (i.e., using the APIs within DX12, hence also DirectML). So it seems you are purposely being obtuse for no reason and think Nvidia invented image sharpening... lol

You have to temper your rationale with the FACT that Microsoft's Series X will have more games developed for it than Nvidia will have games developed for its $500+ dGPU with Nvidia's proprietary hardware!

You simply can't accept this because you do not have a grasp on reality. You are thinking of the past... and not what has taken place.

Lastly, you are being stubborn and NOT listening, so I'll repeat it:

No, it doesn't mean (your quote): "Microsoft might come up with some kind of upscaler implementation, but that doesn't mean developers have to choose one or the other. That would be the same as saying that developers have to choose between the game running only on Nvidia GPUs or only on AMD GPUs."

It means that developers can just make games using industry standards (i.e., DX12U/Vulkan/etc.) that both Nvidia and AMD can use, instead of any developer spending their time worrying about Nvidia's badly implemented DLSS... which is going nowhere quickly.
 