Someone made a rudimentary ray tracing demo on Notepad at 30fps

nanoguy

In brief: A graphics and game engine developer made a simple ray tracer work in Notepad on Windows by manipulating the app's memory space to change the displayed text and render a simple 3D scene at 30 frames per second.

Ray tracing is a hot topic in computer graphics right now, especially with all the hype around Nvidia's Ampere architecture, which debuted with a powerful enterprise GPU and should soon make its way into GeForce RTX 3000 series consumer GPUs.

The reason everyone is waiting for the newest graphics hardware is that ray tracing is so taxing that very few of us can experience it while keeping a smooth 60fps+ frame rate. And that's before getting into the fact that not many games support it even if you have the hardware to handle it.

However, there's one place where ray tracing works that might surprise some, and that is the Notepad app on Windows. Kyle Halladay, who is a graphics and game engine programmer by trade and the author of a book on practical shader development, managed to make a rudimentary ray tracing demo work on Notepad at 30 frames per second.

Halladay's project is not meant to be a serious ray tracer by any means, but rather a great showcase of what can be achieved by combining little hacks, in this case rendering a scene through techniques like DLL injection and memory scanning. He also made a Snake game that can be played inside Notepad, which for some might bring back memories of the text-mode Quake II and Doom.
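To get a feel for the first half of that trick: DLL injection is a well-documented Windows technique for getting your own code running inside another process, such as Notepad. A minimal sketch of the classic LoadLibrary approach might look like the following (illustrative only; InjectDll is our own helper, not Halladay's code, and error handling is kept to a minimum):

```cpp
#include <windows.h>
#include <cstring>

// Load our DLL inside a target process (e.g. notepad.exe), identified by
// process ID, so our rendering code runs in its address space.
bool InjectDll(DWORD pid, const char* dllPath) {
    HANDLE proc = OpenProcess(PROCESS_ALL_ACCESS, FALSE, pid);
    if (!proc) return false;

    // Reserve memory in the target and copy the DLL's path into it.
    SIZE_T len = strlen(dllPath) + 1;
    LPVOID remotePath = VirtualAllocEx(proc, nullptr, len,
                                       MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    if (!remotePath || !WriteProcessMemory(proc, remotePath, dllPath, len, nullptr)) {
        CloseHandle(proc);
        return false;
    }

    // kernel32.dll maps at the same base address in every process, so the
    // address of LoadLibraryA in our process is valid in the target too.
    auto loadLib = reinterpret_cast<LPTHREAD_START_ROUTINE>(
        GetProcAddress(GetModuleHandleA("kernel32.dll"), "LoadLibraryA"));

    // Start a thread in the target whose entry point is LoadLibraryA and
    // whose argument is the path we just wrote: the target loads our DLL.
    HANDLE thread = CreateRemoteThread(proc, nullptr, 0, loadLib, remotePath, 0, nullptr);
    if (thread) {
        WaitForSingleObject(thread, INFINITE);
        CloseHandle(thread);
    }
    CloseHandle(proc);
    return thread != nullptr;
}
```

Once the DLL is loaded, its DllMain runs inside Notepad and can spin up a render thread with full access to the app's memory.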

For those of you who feel inclined to dive into the nitty-gritty technical details of how he achieved this, you can read them here.
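His write-up covers the specifics, but the memory-scanning half boils down to walking the target's address space for the buffer that holds Notepad's on-screen text. A hedged sketch using standard Win32 calls (again, a hypothetical FindPattern of our own, not his actual code):

```cpp
#include <windows.h>
#include <cstdint>
#include <cstring>
#include <vector>

// Walk a process's memory regions looking for a byte pattern, e.g. the
// UTF-16 encoding of marker text we typed into Notepad. Returns the
// address of the first match, or 0 if not found.
uintptr_t FindPattern(HANDLE proc, const uint8_t* pattern, size_t patLen) {
    MEMORY_BASIC_INFORMATION mbi{};
    uint8_t* addr = nullptr;
    std::vector<uint8_t> buf;

    while (VirtualQueryEx(proc, addr, &mbi, sizeof(mbi)) == sizeof(mbi)) {
        // Only committed, writable regions can hold the live text buffer.
        if (mbi.State == MEM_COMMIT &&
            (mbi.Protect & (PAGE_READWRITE | PAGE_WRITECOPY))) {
            buf.resize(mbi.RegionSize);
            SIZE_T got = 0;
            if (ReadProcessMemory(proc, mbi.BaseAddress, buf.data(),
                                  mbi.RegionSize, &got)) {
                for (SIZE_T i = 0; i + patLen <= got; ++i) {
                    if (memcmp(buf.data() + i, pattern, patLen) == 0)
                        return reinterpret_cast<uintptr_t>(mbi.BaseAddress) + i;
                }
            }
        }
        addr = static_cast<uint8_t*>(mbi.BaseAddress) + mbi.RegionSize;
    }
    return 0;
}
```

Once the buffer is found, each "frame" is just a write to that address (WriteProcessMemory from outside, or a plain memcpy once injected) that overwrites the characters, followed by forcing the window to repaint. Do that 30 times a second and the text becomes a display.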

 
Ray Tracing was the goal post Nvidia created that no one asked for.

I'm fairly certain that if Nvidia had taken a poll and asked "which do you want":

A GPU series that can run 4K@144 Hz, or "Ray Tracing"? The vote would have been an overwhelming 95% to 5%, respectively.

As a 2080 Ti owner, I really don't see the benefit. Shadow technology was already so good that ray tracing was unnecessary, and most transparency tech was good enough that you didn't need RT either.

All that said: the bar is there now, and every GPU maker will have to meet it.
 
No one asked for T&L, AA gamma correction, AO, etc., yet here we are. Just saying.
 
I believe his main point was that ray tracing tanks performance. None of the technologies you listed above reduced performance by 80%; ray tracing currently does. Mind you, that's including the fact that current real-time ray tracing is pretty low quality (it requires a de-noise filter) and is only a single effect at low resolution.
 
Tessellation did, and AO almost did (it was a good 40-ish%)... I get his point fine. It's just not how things work.
Tessellation TANKED performance when it was new. Nowadays it's expected and welcome, and doesn't affect performance that much.

Does it work on AMD graphics cards? :))

No, and neither did most innovations in the beginning (on ATi, later AMD). Even today, AMD limits tessellation to some level by default, unlike Nvidia; maybe Navi is different, I don't know. Still, people don't disable this optimization when doing benchmarks, nor any other (shader optimizations, etc.).

Anyway, it doesn't work on Nvidia cards either. It works on just a select few.
 
According to places like GamersNexus (and my own experience with the tech once it launched), tessellation was a very minor performance loss.

I don't see how you got the idea that tessellation reduced performance anywhere near the level of RTX. Maybe if you were comparing apples to oranges, i.e. unaccelerated tessellation from before it made its way into games and fixed-function hardware was added to GPUs to accelerate it, but that's just misleading, plain and simple.

AO did have around a 40% hit on the first generation of cards it was used on, but it's important to note that AO, unlike RTX or tessellation, never prompted GPU manufacturers to add dedicated hardware. Even without dedicated hardware, its performance hit is still half that of ray tracing, nowhere near what I'd call close.

No, and neither did most innovations in the beginning (on ATi, later AMD). Even today, AMD limits tessellation to some level by default, unlike Nvidia; maybe Navi is different, I don't know. Still, people don't disable this optimization when doing benchmarks, nor any other (shader optimizations, etc.).

Tessellation performance was limited on earlier AMD cards in the sense that they couldn't tessellate far past the point of any visible returns, an ability which is worthless anyway. Meanwhile, Nvidia's cards could, and Nvidia used just that "advantage" to over-tessellate grass and objects out of the player's view to hobble AMD performance (google "Crysis 2 tessellation grass"). The visual difference was nothing, but it certainly made Nvidia cards look good. I'd recommend reading the articles written on the topic; there is plenty of evidence to support this. I'd also appreciate people not spreading that very old and false misconception. AMD's solution to this "issue" was to limit the amount of tessellation at the driver level: zero visual difference for a large performance increase. Future iterations of the GCN architecture did improve tessellation performance.
 
1. New stuff no one asked for is the topic of my earlier posts in this thread.
2. Everything past your second sentence amounts to a "yes, we know that" post.
3. Making excuses for companies that want your money is below you.

edit: Tessellation and AO had on/off switches in Nvidia's drivers. No game supported either when they were introduced. OK, maybe a game or two, I can't remember.
 
1. I've made mention of this before, but you are most likely misunderstanding quantum's post. This line in particular leads me to believe this:

"A GPU series that can run 4K@144 Hz, or 'Ray Tracing'? The vote would have been an overwhelming 95% to 5%, respectively."

In that sentence you have a trade-off of either high-resolution, high-FPS gaming or ray tracing. Ultimately that's a trade-off the RTX 20xx series made. Tensor and RT cores take up die space that could have gone toward traditional rasterization performance. On top of the diminished die space, you are also looking at an 80% cut to performance for a single RTX effect. It's totally understandable why he said "Ray Tracing was the goal post Nvidia created that no one asked for"; it comes with the largest trade-off for the least benefit we've seen from a graphics technology. His post isn't against new technologies, it's against large trade-offs for little benefit.

2., 3. I have no idea what you are addressing with these two.
 
Under 2, I was saying that yes, I knew about all that; we just draw different conclusions from the same data.
Like everything else, RT is optional. Like everything else, RT is going to get "cheaper" in the future. Or go away.
 
Ray Tracing was the goal post Nvidia created that no one asked for.
As a 2080 Ti owner, I really don't see the benefit. Shadow technology was already so good that ray tracing was unnecessary, and most transparency tech was good enough that you didn't need RT either.
Because all that "shadow technology" is just smoke and mirrors that can never really do the right thing, especially when something unusual happens. Also, when there is a large number of light sources, generating shadow maps for all of them becomes incredibly costly (it's basically rendering the scene from every light source). To get around that, closely located light sources are combined into single maps (giving poor shadow locations) and shadow maps are generated at very low resolution (giving fuzzy and blocky shadows). Then there is reflected light...

Ray tracing will eventually remove the need for any of this nonsense. Graphics card manufacturers had to pick a point in time when they felt the hardware was sufficiently powerful to start making that switch, and they picked now, which I think was about right.
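For the curious, this is the core difference: a ray tracer doesn't bake shadows into a map at all; it just asks, per pixel, whether anything sits between the shaded point and the light. A minimal sketch with spheres (illustrative only, not from any particular engine):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Sphere { Vec3 center; float radius; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Does the ray origin + t*dir hit the sphere for some t in (tMin, tMax)?
static bool Hit(const Sphere& s, Vec3 origin, Vec3 dir, float tMin, float tMax) {
    Vec3 oc = sub(origin, s.center);
    float a = dot(dir, dir);
    float halfB = dot(oc, dir);
    float c = dot(oc, oc) - s.radius * s.radius;
    float disc = halfB * halfB - a * c;
    if (disc < 0.0f) return false;
    float sq = std::sqrt(disc);
    float t = (-halfB - sq) / a;                // try the near root first
    if (t <= tMin || t >= tMax) t = (-halfB + sq) / a;
    return t > tMin && t < tMax;
}

// Exact per-pixel shadow test: the point is lit only if the segment from
// it to the light is unobstructed. No maps, no map resolution, no merging
// of nearby lights; every light just costs one ray per shaded point.
static bool InShadow(Vec3 point, Vec3 lightPos, const Sphere* scene, int n) {
    Vec3 toLight = sub(lightPos, point);        // unnormalized: t = 1 at the light
    for (int i = 0; i < n; ++i)
        if (Hit(scene[i], point, toLight, 0.001f, 1.0f))  // small tMin avoids self-shadowing
            return true;
    return false;
}
```

That exactness is what the hardware cost buys: the expensive part is running this test for millions of pixels and many lights every frame, which is why dedicated ray-tracing hardware exists at all.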
 
Ray Tracing was the goal post Nvidia created that no one asked for.

I'm fairly certain that if Nvidia had taken a poll and asked "which do you want":

A GPU series that can run 4K@144 Hz, or "Ray Tracing"? The vote would have been an overwhelming 95% to 5%, respectively.

As a 2080 Ti owner, I really don't see the benefit. Shadow technology was already so good that ray tracing was unnecessary, and most transparency tech was good enough that you didn't need RT either.

All that said: the bar is there now, and every GPU maker will have to meet it.
I tend to agree. Advancing technology is important, though. That said, I own a 2080 Ti as well, and aside from the Star Wars ray tracing demo, I've not enabled it in any games. I purchased it for the fact that it's a fast card, not for its ability to ray trace.
 
I enable DirectX Raytracing (DXR) in CoD: MW.

I have been playing DCS World with all settings at max. I don't think that game supports Ray Tracing yet, but it will eventually. It looks magnificent as is.

Microsoft Flight Simulator 2020 might support RTX.
 
I tend to agree. Advancing technology is important, though. That said, I own a 2080 Ti as well, and aside from the Star Wars ray tracing demo, I've not enabled it in any games. I purchased it for the fact that it's a fast card, not for its ability to ray trace.

But your nick is "Raytrace3D"... you HAVE to enable ray tracing :)
 