AMD CEO hints that high-end Navi GPUs and ray tracing may be coming to Radeon cards

Cal Jeffrey

Rumor mill: As rumors go, nothing beats one straight from the horse's mouth. In this case, it comes directly from AMD CEO Lisa Su. The rumor: Radeon cards may soon have ray-tracing support, and high-end Radeon GPUs are on the way. Su did not go into detail, but it seems certain that the company has plans.

AMD had a few announcements at CES 2020 on Monday, including the Radeon RX 5600 XT with the Navi RDNA architecture and its Ryzen 4000 series for laptops. After the keynote, CEO Lisa Su sat down with the press for a roundtable discussion, which AnandTech transcribed. During the Q&A, Su dropped hints about the future of the company's Radeon line of GPUs.

The CEO mentioned that the company has some high-end "Big Navi" cards in the pipe. Asked whether AMD needs a high-end product in the discrete graphics department, the exec could not help but let slip that such GPUs are in the works, without revealing any details.

“I know those on Reddit want a high-end Navi! You should expect that we will have a high-end Navi and that it is important to have it,” Su said. “The discrete graphics market, especially at the high end, is very important to us. So you should expect that we will have a high-end Navi, although I don’t usually comment on unannounced products.”

She also said the company was looking to be more competitive with Nvidia in terms of ray tracing. Current Radeon offerings do not support real-time ray tracing, as the company views the technology as being in the “very early” stages, but that could be changing soon.

“I’ve said in the past that ray tracing is important, and I still believe that, but if you look at where we are today, it is still very early,” she explained. “We are investing heavily in ray tracing and investing heavily in the ecosystem around it – both of our console partners have also said that they are using ray tracing. You should expect that our discrete graphics as we go through 2020 will also have ray tracing.”

Regarding ray tracing: "I do believe though it is still very early, and the ecosystem needs to develop. We need more games and more software and more applications to take advantage of it. At AMD, we feel very good about our position on ray tracing."

If her hint is on the mark, we could see a Radeon GPU with ray-tracing support by the end of the year, and we will definitely be getting it with the PlayStation 5 and Xbox Series X consoles.

A discrete Big Navi card with ray tracing would give Nvidia something to think about, considering it currently has the RT market cornered. Having AMD jump into the game would most certainly bring more affordable RT cards to consumers from both manufacturers, even at the high end.

However, keep in mind that Su's remarks were off-the-cuff and unplanned. No official announcements have been made, so anything could change at this point. Still, coming from as high up on the ladder as you can get makes this news hard to call "rumor." It sounds more like a roadmap, but as they say, "Never count your chickens before they hatch."

That said, if AMD follows through, upgrading your gaming rig may become your priority this holiday season.


 
It's actually an annoyance that ray tracing is "a thing" now. All Nvidia fans wanted was less expensive cards that could run at a solid 4K 60-120 fps. What we got instead was "ray tracing," an answer to a question that no one asked.

Current shadow, reflection and transparency technology is "good enough".

Now they get to go on to the 2nd generation of more efficient, more powerful RTX cards while AMD has to catch up to the 2080 Ti... since that will inevitably be the standard against which all ray tracing cards from any and all manufacturers will be measured.
 
AMD don't have to catch up to Nvidia at all, because that is a race Nvidia users want so the prices of their cards drop. They are the only people to benefit from such a graphics card release, so I say NO, don't release one; let them pay through the nose for their cards, because all they ever do is complain and say it's LATE, what a fail.
Hope AMD stays with the middle market and avoids the top end, and lets those *****s suffer the price hogs.
 
I feel sorry for AMD since they have to catch up with ray tracing just because it's "MUH RAYY TRAACINGG" instead of focusing on performance. Ray tracing is still a meme and even Nvidia doesn't do **** with it.
 
Since next-gen consoles have AMD hardware that is said to be ray tracing capable, why is it only a maybe that PC users will get cards that can ray trace this year?
 
Since next-gen consoles have AMD hardware that is said to be ray tracing capable, why is it only a maybe that PC users will get cards that can ray trace this year?
Because the consoles won't be out until the holidays, and those cards won't be PC cards?
 
I feel sorry for AMD since they have to catch up with ray tracing just because it's "MUH RAYY TRAACINGG" instead of focusing on performance. Ray tracing is still a meme and even Nvidia doesn't do **** with it.


Not true...they make commercials!!!
 
AMD don't have to catch up to Nvidia at all, because that is a race Nvidia users want so the prices of their cards drop. They are the only people to benefit from such a graphics card release, so I say NO, don't release one; let them pay through the nose for their cards, because all they ever do is complain and say it's LATE, what a fail.
Hope AMD stays with the middle market and avoids the top end, and lets those *****s suffer the price hogs.


The 2080Ti and RTX Titan dominate the top of the GPU performance charts.

Because the 2080Ti is "affordable" and the Titan isn't...the 2080Ti will become the performance icon of the last gen to beat just like the 1080Ti did.

Only now, the "professional reviewers" and social media influencers will have to concentrate their reviews on how well a card performs with DirectX Ray Tracing turned ON.

Just being able to run a game with all settings at maximum - like Crysis - now inherits "RTX ON" as an advertising point.

If you can't hit the 4K 60fps consistently with "RTX ON" then you will automatically pale in comparison to the 2080Ti.
 

I genuinely do not understand why anyone cares about ray tracing. What a waste of silicon real estate! Gamers want more framerate in a big way right now. I'd love a GPU with HDMI 2.1 that can drive 4K at 120 Hz / 120 fps. Nothing can do that and it makes me sad. That 48-inch LG OLED with 120 Hz G-Sync and BFI makes every computer monitor look like ****.
 
I feel sorry for AMD since they have to catch up with ray tracing just because it's "MUH RAYY TRAACINGG" instead of focusing on performance. Ray tracing is still a meme and even Nvidia doesn't do **** with it.

It does make Quake II lighting a lot better. So there's that, at least.
 
The 2080Ti and RTX Titan dominate the top of the GPU performance charts.

Because the 2080Ti is "affordable" and the Titan isn't...the 2080Ti will become the performance icon of the last gen to beat just like the 1080Ti did.

Only now, the "professional reviewers" and social media influencers will have to concentrate their reviews on how well a card performs with DirectX Ray Tracing turned ON.

Just being able to run a game with all settings at maximum - like Crysis - now inherits "RTX ON" as an advertising point.

If you can't hit the 4K 60fps consistently with "RTX ON" then you will automatically pale in comparison to the 2080Ti.

The 2080 Ti isn't affordable in any sense of the word and it will never be on the same level as the 1080 Ti, which provided a far larger boost in performance for much less money.
 
I genuinely do not understand why anyone cares about ray tracing. What a waste of silicon real estate! Gamers want more framerate in a big way right now. I'd love a GPU with HDMI 2.1 that can drive 4K at 120 Hz / 120 fps. Nothing can do that and it makes me sad. That 48-inch LG OLED with 120 Hz G-Sync and BFI makes every computer monitor look like ****.

I care. Ray tracing enables photo-realistic gaming, and some of those demos are siiiiiiiiiiick!!!! I'd prefer that at 60 fps @ 1080p or even 1440p over 120+ fps at 4K with non-photo-realistic graphics.

Anyways, Ampere is supposed to bring ray tracing to the masses for much cheaper, so I'm looking forward to Nvidia's next-gen graphics this year!
 
I care. Ray tracing enables photo-realistic gaming, and some of those demos are siiiiiiiiiiick!!!! I'd prefer that at 60 fps @ 1080p or even 1440p over 120+ fps at 4K with non-photo-realistic graphics.

Anyways, Ampere is supposed to bring ray tracing to the masses for much cheaper, so I'm looking forward to Nvidia's next-gen graphics this year!
Cool, then you can play Doom from the '90s lol.
 
Give me a good story and I wouldn't care about framerate or, to an extent, graphics quality.
The Outer Worlds is New Vegas' spiritual successor. It doesn't have the greatest graphics or the highest budget, but it has a decent story and gameplay. It is considered one of the best games released that year, without crazy graphics, ray tracing, and other bullshit.
 
If they (AMD and Nvidia) can manage to include ray tracing with only a minimal performance impact, this is fine with me - I won't say no.

Of course, this will take up extra die space, but there is another interesting use besides eye candy. I think it was mentioned in an Xbox / PS5 interview where they said that you could also use ray tracing to improve games' AI, i.e. use it to trace when an opponent can hear (audio) or see (visual) you.

I find that use much more interesting than eye candy imho, but that is not to say no to nicer graphics if the cost (fps + price) is not higher than the benefit.
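To picture that AI use concretely: it boils down to one occlusion query per check, i.e. cast a single ray (really a segment) from the NPC's eyes to the player and see whether any level geometry blocks it. Here is a minimal C++ sketch of the idea - the box-list representation and every name are invented for illustration, not any engine's actual API:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };
struct AABB { Vec3 min, max; };   // axis-aligned bounds of a wall/prop

// Slab test: does the segment from `a` to `b` pass through the box?
static bool segmentHitsBox(const Vec3& a, const Vec3& b, const AABB& box) {
    float t0 = 0.0f, t1 = 1.0f;   // segment parameter range
    const float o[3]  = { a.x, a.y, a.z };
    const float d[3]  = { b.x - a.x, b.y - a.y, b.z - a.z };
    const float lo[3] = { box.min.x, box.min.y, box.min.z };
    const float hi[3] = { box.max.x, box.max.y, box.max.z };
    for (int i = 0; i < 3; ++i) {
        if (d[i] == 0.0f) {       // parallel to this slab: in or out for good
            if (o[i] < lo[i] || o[i] > hi[i]) return false;
            continue;
        }
        float tA = (lo[i] - o[i]) / d[i];
        float tB = (hi[i] - o[i]) / d[i];
        if (tA > tB) { float t = tA; tA = tB; tB = t; }
        if (tA > t0) t0 = tA;     // latest entry so far
        if (tB < t1) t1 = tB;     // earliest exit so far
        if (t0 > t1) return false;
    }
    return true;                  // entry before exit: the segment hits
}

// One ray per NPC per think-tick is plenty for a "can I see you?" check.
bool npcCanSee(const Vec3& npcEye, const Vec3& player,
               const std::vector<AABB>& occluders) {
    for (const AABB& box : occluders)
        if (segmentHitsBox(npcEye, player, box))
            return false;         // something solid in the way
    return true;                  // clear line of sight
}
```

A "can hear" check would look similar, except sound leaks around occluders, so you would attenuate per blocking box rather than return early.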
 
I care. Ray tracing enables photo-realistic gaming, some of those demos are siiiiiiiiiiick!!!! I'd prefer that at 60fps @ 1080 or even 1440 resolution than 120+fps at 4k with none photo-realistic graphics.

Anwyays, Ampere is supposed to bring Ray tracing to the masses for much cheaper, so looking forward to Nvidia's next-gen graphics this year!
If you prefer low FPS at 1080p/1440p instead of 4K at higher FPS, then you have your priorities wrong if graphics are what you care about. The difference in quality is much more noticeable than some fancy reflections, which look great without RT anyway.

If you want true photorealism, then your only option is to use something like Blender or Maya and render some great scenes. The demos you mentioned may look good, but I assure you that if they used traditional rendering to mimic ray tracing, you would not notice if the two were swapped.

What you want is RT at good prices and good performance; don't accept compromises on PC (especially on the performance side when you buy top-end hardware).
 
The Outer Worlds is New Vegas' spiritual successor. It doesn't have the greatest graphics or the highest budget, but it has a decent story and gameplay. It is considered one of the best games released that year, without crazy graphics, ray tracing, and other bullshit.

Yeah, I couldn't bring myself to finish The Outer Worlds (played about 30 hours); it feels the same as Fallout 3: quests are almost all fetch quests, NPCs stay in one spot, there is no day/night cycle, and there is a pointer to show exactly where you need to go. Then again, The Witcher 3 kinda ruined the RPG genre for me, as I always compare any new RPG to TW3.
On the other hand, I finished Metro Exodus and Control just fine; the graphics are gorgeous and immersive, and the stories are okay and not too dragged out (both are 20-30 hours long).
 
Development of the technology was most likely paid for by MS and Sony, and AMD timed the release of RDNA 2 to coincide with the next-gen console release schedule.
Can't say anything about Sony, but Microsoft worked with Nvidia, using the Volta architecture, to develop DXR. It's possible that the next Xbox will use a system and protocol interface proprietary to AMD for ray tracing, but given the amount of work already done in DXR, I would expect Microsoft to use that.

Of course, this will take up extra die space, but there is another interesting use besides eye candy.
I've not seen any specific figures directly quoted but I suspect it's not as much as one may expect. The TU106 chip is 445 mm² whereas the TU116 is 284 mm² - the former has 50% more SMs and MCs, and 100% more ROPs compared to the latter. This could partly explain the difference in the die sizes (e.g. 1.5 x 284 = 426) but obviously, the TU116's SMs don't contain Tensor and RT units. These were replaced with a stack of FP16 ALUs, so it would seem that, very roughly, the Tensor/RT units are about the same as those in die size.

I think it was mentioned in an Xbox / PS5 interview where they said that you could also use ray tracing to improve games' AI, i.e. use it to trace when an opponent can hear (audio) or see (visual) you
This is already done in plenty of games, but since the number of rays required is very low and the BVH is very simple, the performance needed doesn't require specialist hardware. So while it's true that RT-specific units can help improve the performance of such calculations, its actual application will probably be quite limited.
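For a sense of scale, the software path for a handful of AI rays really is tiny. Below is a hedged C++ sketch of a segment test walked through a prebuilt BVH of occluder boxes; the node layout and all names are invented for illustration, and real engines differ:

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };
struct AABB { Vec3 min, max; };

// Same slab test as in the earlier line-of-sight sketch, condensed.
static bool segHit(const Vec3& a, const Vec3& b, const AABB& box) {
    float t0 = 0.f, t1 = 1.f;
    const float o[3]  = { a.x, a.y, a.z };
    const float d[3]  = { b.x - a.x, b.y - a.y, b.z - a.z };
    const float lo[3] = { box.min.x, box.min.y, box.min.z };
    const float hi[3] = { box.max.x, box.max.y, box.max.z };
    for (int i = 0; i < 3; ++i) {
        if (d[i] == 0.f) {
            if (o[i] < lo[i] || o[i] > hi[i]) return false;
            continue;
        }
        float tA = (lo[i] - o[i]) / d[i], tB = (hi[i] - o[i]) / d[i];
        if (tA > tB) std::swap(tA, tB);
        t0 = std::max(t0, tA);
        t1 = std::min(t1, tB);
        if (t0 > t1) return false;
    }
    return true;
}

// A prebuilt, array-packed BVH over the level's occluder boxes.
struct BVHNode {
    AABB bounds;       // encloses everything in this subtree
    int  left, right;  // child indices, or -1 when this is a leaf
    int  boxIndex;     // leaf only: which occluder box to test
};

// Walk the tree, pruning any subtree whose bounds the segment misses.
bool segmentBlocked(const std::vector<BVHNode>& nodes,
                    const std::vector<AABB>& occluders,
                    int node, const Vec3& from, const Vec3& to) {
    const BVHNode& n = nodes[node];
    if (!segHit(from, to, n.bounds)) return false;              // prune
    if (n.left < 0) return segHit(from, to, occluders[n.boxIndex]);
    return segmentBlocked(nodes, occluders, n.left,  from, to) ||
           segmentBlocked(nodes, occluders, n.right, from, to);
}
```

With a handful of NPCs doing one query per think-tick, this traversal costs next to nothing on a CPU, which is why dedicated RT hardware adds little for this particular use.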
 
Can't say anything about Sony, but Microsoft worked with Nvidia, using the Volta architecture, to develop DXR. It's possible that the next Xbox will use a system and protocol interface proprietary to AMD for ray tracing, but given the amount of work already done in DXR, I would expect Microsoft to use that.


I've not seen any specific figures directly quoted but I suspect it's not as much as one may expect. The TU106 chip is 445 mm² whereas the TU116 is 284 mm² - the former has 50% more SMs and MCs, and 100% more ROPs compared to the latter. This could partly explain the difference in the die sizes (e.g. 1.5 x 284 = 426) but obviously, the TU116's SMs don't contain Tensor and RT units. These were replaced with a stack of FP16 ALUs, so it would seem that, very roughly, the Tensor/RT units are about the same as those in die size.


This is already done in plenty of games, but since the number of rays required is very low and the BVH is very simple, the performance needed doesn't require specialist hardware. So while it's true that RT-specific units can help improve the performance of such calculations, its actual application will probably be quite limited.
DXR is separate from the hardware implementation; it's just an API. They used Nvidia Volta because that was the hardware best suited for development of such a feature, but DXR can be used even today with AMD hardware (even with no full driver support from AMD yet).
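To make the "just an API" point concrete: an application asks Direct3D 12 for a capability tier and never sees whether the driver answers with dedicated hardware, a compute-based fallback, or nothing at all. A minimal sketch using the actual D3D12 feature-check call (Windows-only, error handling trimmed):

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a device on the default adapter (any vendor).
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // OPTIONS5 carries the DXR capability tier; the query is
    // vendor-neutral and says nothing about how it's implemented.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::printf("DXR supported (tier enum value %d).\n",
                    static_cast<int>(opts5.RaytracingTier));
    } else {
        std::printf("Driver reports no DXR support.\n");
    }

    device->Release();
    return 0;
}
```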
 
"That said, if AMD follows through, upgrading your gaming rig may become your priority this holiday season."

It's not even halfway through JANUARY.
 
Interpretation: We've already done it, now we want to know how much we can stick you for in order to get it ......
 
Can't say anything about Sony, but Microsoft worked with Nvidia, using the Volta architecture, to develop DXR. It's possible that the next Xbox will use a system and protocol interface proprietary to AMD for ray tracing, but given the amount of work already done in DXR, I would expect Microsoft to use that.

I've not seen any specific figures directly quoted but I suspect it's not as much as one may expect. The TU106 chip is 445 mm² whereas the TU116 is 284 mm² - the former has 50% more SMs and MCs, and 100% more ROPs compared to the latter. This could partly explain the difference in the die sizes (e.g. 1.5 x 284 = 426) but obviously, the TU116's SMs don't contain Tensor and RT units. These were replaced with a stack of FP16 ALUs, so it would seem that, very roughly, the Tensor/RT units are about the same as those in die size.

This is already done in plenty of games, but since the number of rays required is very low and the BVH is very simple, the performance needed doesn't require specialist hardware. So while it's true that RT-specific units can help improve the performance of such calculations, its actual application will probably be quite limited.

Neeyik, you are stretching the truth....
Microsoft didn't work with Nvidia; they used Nvidia's hardware during the hardware testing phase, to see if the DirectX 12 API they were developing was viable.

Subsequently, Nvidia's RTX is not DirectX Ray Tracing (DXR)…
RTX is Nvidia's proprietary ray tracing solution that only works in 6 games, whereas DXR will work in all games, on all cards... and does not require a special Nvidia team to come into the developer's studio to work with the developer to implement Nvidia's proprietary RTX.

Again, that is why only 6 developers have released "RTX On" games: there is no benefit in doing so when DX12 is much better.


Matter of fact, Nvidia's next GPU will have hardware dedicated to DX12 ray tracing in games, not enterprise-specific hardware re-worked to slow down games while attempting to ray trace them in real time.
 