Nvidia is adding ray tracing support to some GTX cards

midian182

Why it matters: When Nvidia first unveiled its RTX 20-series line last year, one of the main features it pushed was real-time ray tracing. Now, there’s some good news for owners of GTX cards: ray tracing support is coming to these older products, albeit in a basic form and with some caveats.

Update: The latest WHQL GeForce driver, version 425.31, enables DirectX Raytracing support for GeForce GTX 1660/Ti, GTX 1080/Ti, GTX 1070/Ti, and GTX 1060 6GB graphics cards, as well as Titan GPUs based on the previous-gen Pascal and Volta architectures.

Update #2: Yes, we will be benchmarking ray tracing on GTX cards soon.

Nvidia used the Game Developers Conference to reveal that April’s GeForce driver update will add basic ray tracing support to cards from the 6GB GTX 1060 upwards. This includes the recent GTX 1660 and 1660 Ti, as well as the Titan X, Titan Xp, and Titan V. Nvidia writes that games will work with the GTX cards without updates because ray-traced titles are built on DirectX 12’s DirectX Raytracing (DXR) API.
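For readers curious how a game can light up on a GTX card without a patch: DXR capability is exposed through a standard Direct3D 12 feature query, so the same check succeeds on any GPU whose driver implements the API, RT cores or not. Below is a minimal C++ sketch of that query; it assumes a Windows SDK recent enough to define D3D12_FEATURE_D3D12_OPTIONS5 (version 1809 or later), and it is an illustration, not code from the article or the driver.

```cpp
// Query DirectX Raytracing (DXR) support on the default adapter.
// Link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool DeviceSupportsRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // Tier 1.0 or above means the runtime/driver pair can execute DXR work,
    // whether on dedicated RT cores (RTX) or on shader cores (GTX fallback).
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1; // no D3D12-capable adapter present
    std::printf("DXR supported: %s\n",
                DeviceSupportsRaytracing(device.Get()) ? "yes" : "no");
    return 0;
}
```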

Even though the ray tracing effects on non-RTX cards are “basic” and cast far fewer rays, enabling the feature on these GPUs still comes with a hefty performance hit, which varies depending on the game being played.

Using a Core i9-7900K and a GTX 1080 Ti at a 1440p resolution, Metro Exodus, which uses real-time ray-traced global illumination, can’t even reach 20 fps. Things improve slightly with Shadow of the Tomb Raider, which comes in at around 30 fps, and Battlefield V's less demanding ray tracing effects mean 50-60 fps is possible.

Those who want the full-fat ray tracing experience and Nvidia’s Deep Learning Super Sampling (DLSS) will still need an RTX card and its dedicated RT and Tensor cores. But GTX owners willing to sacrifice performance might appreciate the chance to see how much ray tracing adds to a game.

We’ll just have to wait and see if this encourages more game developers to support ray tracing, which is what Nvidia is doubtlessly hoping for.


 
Ray tracing will never take off if there are only 2-3 cards capable of using it, because devs won't bother making something for a handful of people. That's why Nvidia needs to include as many cards as they can.

Using a Core i9-7900K and a GTX 1080 Ti at a 1440p resolution, Metro Exodus, which uses real-time ray-traced global illumination, can’t even reach 20fps.
It's still absolute garbage.
 
I guess it is, at least in part, a chicken-and-egg problem for Nvidia: game developers don't bother spending resources on RT because very few people have the required RTX hardware; on the other hand, there is little point in buying overpriced RTX hardware because there are few games for it.
 
What's the point of this if you can't run the game at the frame rate it was meant to run at? Which is, and should always be, at least 60 frames.
Why would anyone even want to try the tech when they won't be able to properly try it? 20-30 frames isn't good enough.
 
What's the point of this if you can't run the game at the frame rate it was meant to run at? Which is, and should always be, at least 60 frames.
Why would anyone even want to try the tech when they won't be able to properly try it? 20-30 frames isn't good enough.

Maybe it could be used in games with less demanding graphics that get ray tracing added, as was demonstrated with Quake 2. It wouldn't necessarily have to be for games that ancient; 10-year-old titles getting an HD remaster, or indie titles, could perhaps benefit from it.
 
Ray tracing will never take off if there are only 2-3 cards capable of using it, because devs won't bother making something for a handful of people
I mean, who would ever create a card that accelerates 3D graphics at all? There'll only be a handful of early adopters, so no point in devs supporting it.
 
The Halo collection is coming to PC. Let's say the Halo 2 remaster was going to get ray tracing added; a GTX 1080 should be able to handle that quite nicely, and the game would look even better.
 
The latest demo from CryTek demonstrates ray-tracing on any AMD or Nvidia video card, without specialized hardware. Obviously CryTek engineers are much better than Nvidia's.

They had no problems adding RT to all current AMD and Nvidia cards, even those which Nvidia itself didn't support. Maybe CryTek should organize a course of programming for Nvidia developers.
 
One would say it's to show people what they are missing and get them to want to upgrade to RTX.
I say it will only show people how lazy game developers are for not using traditional rendering techniques that are adequate to approximate the effect; that is why people ask whether it is on or off.
 
The latest demo from CryTek demonstrates ray-tracing on any AMD or Nvidia video card, without specialized hardware. Obviously CryTek engineers are much better than Nvidia's.

They had no problems adding RT to all current AMD and Nvidia cards, even those which Nvidia itself didn't support. Maybe CryTek should organize a course of programming for Nvidia developers.
My bet is that CryTek is using conformal geometric algebra (CGA). It is not all that well known even among mathematicians, and it sounds like it has completely escaped nVidia. With a good implementation of CGA, performance is better than with plain linear algebra (LA); part of the reason is that LA generates useless artifacts that CGA completely avoids.

It is also my bet that CryTek's demo inadvertently pushed nVidia into this approach, and that it is nVidia's attempt to save face. As I see it, nVidia had better be careful, as they may be digging themselves even deeper into the hole they're already in.
 
Even the RT performance of the 2080 Ti isn't enough for the best gaming experience, as it struggles to maintain, or even reach, 60 fps at 4K. I'll wait for the next gen or an open-source implementation. Looking at what the CryTek devs did on the software side gives me more hope than Nvidia's proprietary hardware.
 
Ray Tracing is an answer to a question no one asked.

It's like setting a goal of having details in gaming that for the most part don't significantly change the gameplay.

I bought the 2080 Ti hoping to see more by now, and thus far I haven't.

It makes NO sense to try and force GTX cards to do ray tracing.

Even the Titan V is terrible at it.
 
I guess it is, at least in part, a chicken-and-egg problem for Nvidia: game developers don't bother spending resources on RT because very few people have the required RTX hardware; on the other hand, there is little point in buying overpriced RTX hardware because there are few games for it.

I agree. Though it's always good to see the tech pushed, no matter how small the step. Once AMD implements their version of this, it will only make these cards cheaper; that's good for us. I have a 1080 Ti, and I don't see anything about the newest-gen cards that is going to force me to spend another $700-plus on a video card. My 1080 Ti keeps me at max settings in every game I've tested it on, so there's little reason for me to care about RTX cards at the moment.
 
It's new technology. Give it time to mature. Sheesh.

So much inexperience and youth in these comments.
I feel like saying 'Hello! Welcome to Earth.'
New tech takes forever to be used anywhere near its potential, going on about 15 years since the Voodoo cards.
 
Ah, the new-tech excuse. It's not new to them if they've been working on it for years. They also knew that performance would be an issue; they just use marketing to overcome the issues.
A new card will come out before we see any real gains with RT and DLSS. Heck, several cards may come out before we see any real benefits.
Sure, the idea is nice, but the idea is all people are spending money on. It'll be a while until it doesn't hurt performance.

If you get one of the new 20-series cards, they are great when used without the "new tech". You just have to pay for it whether you use it or even want it.

This is why it may be interesting to wait and see if a 1670 or a 1680 Ti comes out over the summer.
 
The latest demo from CryTek demonstrates ray-tracing on any AMD or Nvidia video card, without specialized hardware. Obviously CryTek engineers are much better than Nvidia's.

They had no problems adding RT to all current AMD and Nvidia cards, even those which Nvidia itself didn't support. Maybe CryTek should organize a course of programming for Nvidia developers.
My bet is that CryTek is using conformal geometric algebra (CGA). It is not all that well known even among mathematicians, and it sounds like it has completely escaped nVidia. With a good implementation of CGA, performance is better than with plain linear algebra (LA); part of the reason is that LA generates useless artifacts that CGA completely avoids.

It is also my bet that CryTek's demo inadvertently pushed nVidia into this approach, and that it is nVidia's attempt to save face. As I see it, nVidia had better be careful, as they may be digging themselves even deeper into the hole they're already in.

I agree with you that Nvidia had to release this update in order to save face.
 
Ray tracing will never take off if there are only 2-3 cards capable of using it, because devs won't bother making something for a handful of people. That's why Nvidia needs to include as many cards as they can.

Using a Core i9-7900K and a GTX 1080 Ti at a 1440p resolution, Metro Exodus, which uses real-time ray-traced global illumination, can’t even reach 20fps.
It's still absolute garbage.
Disagree. I've got a GTX 1080 and only tend to game single player at 1080p anyway (call me a Neanderthal, but when I grew up, SVGA 640x480 was "ultra high res").

Anyhow, it looks like I'll be able to get ballpark 30 FPS for BFV at RTX Ultra and Metro Exodus at GI High.

That actually sounds quite enticing to me, given the incremental cost for me is $0.00.

Pretty sure I'm not the only one doing this math.
 
The latest demo from CryTek demonstrates ray-tracing on any AMD or Nvidia video card, without specialized hardware. Obviously CryTek engineers are much better than Nvidia's.

They had no problems adding RT to all current AMD and Nvidia cards, even those which Nvidia itself didn't support. Maybe CryTek should organize a course of programming for Nvidia developers.
That's not an apples to apples comparison and you know it.

A single demo slice with zero game logic running behind it is not comparable to building a complete platform with multiple functions and native engine support in Unreal, Unity, and Frostbite.

It's like comparing an E3 demo with its contemporaneous game code. Read Kotaku's recent Anthem story if you want a pointer.

Yeah those NVIDIA engineers are all SO STUPID!
 