When ray tracing was first a thing, I posted a comment on a Hardware Unboxed YouTube thread predicting that ray tracing would take off and replace rasterization, and that DLSS was a dead technology. Wow, that didn't age well, did it?

It's okay, I don't mind being wrong.

What I do mind, however, is just how wrong I was, but that's all on NVIDIA, not me.
The implementation of ray-tracing technology has been absolutely horrendous. NVIDIA refuses to give consumers enough RT hardware at an affordable price, whether that's an add-on card or a more generous number of RT cores in its GPUs. So even the highest-end cards can't run the technology at a level consumers find satisfactory. So it's not worth enabling. So adoption sinks. And now RT, which could and should be the next revolution in PC graphics technology, is nothing more than an also-ran six years later.
DLSS, on the other hand, has improved massively and actually took off the way ray tracing was expected to. But I'd argue that even this is not a good thing, because NVIDIA is just using it as a substitute for what it should actually be doing, namely giving us more RT cores at an affordable price. And even with DLSS, we still can't get acceptable results from the games that actually implement proper ray-traced visuals.
This isn't rocket science. Look at how fast rasterized 3D graphics became the standard in our lifetime, and look at how long it's taking ray tracing to do the same. If it ever does. If you don't make a technology available at an affordable price en masse, it will not be adopted en masse. Until NVIDIA pulls its head out of Jensen's ***, nothing is going to change.