Opinion: Ray tracing momentum builds with Nvidia launch

Bob O'Donnell


As a long-time PC industry observer, it’s been fascinating to watch the evolution in quality that computer graphics have gone through over the last several decades. From the early days of character-based graphics, through simple 8-bit color VGA resolution displays, to today’s 4K rendered images, the experience of using a PC has dramatically changed for the better thanks to these advances.

The improvements in computer graphics aren't just limited to PCs, however, as they've directly contributed to enhancements in game consoles, smartphones, TVs, and virtually every display-based device we interact with. The phenomenal success of gaming across all these platforms, for example, wouldn't be anywhere near as impactful and wide-ranging if it weren't for the stunning image quality that today’s game designers can now create.

These striking graphics are primarily due to graphics processing units (GPUs)—chips whose creation and advancement have enabled this revolution in display quality. Over the years, we've seen GPUs used to accelerate the creation of computerized images via a number of different methods, including manipulating bitmaps, generating polygons, running programmable shaders, and, most recently, calculating how rays of light bounce off of objects in a scene to create realistic shadows and reflections—a technique referred to as ray tracing.
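In essence, the ray tracing described above reduces to geometric intersection tests: fire a ray from the camera through each pixel, find the nearest surface it hits, then fire secondary rays toward lights or along reflections to decide shading. A minimal sketch of the core ray-sphere test, in Python purely for illustration (all names here are hypothetical, not from any real renderer):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Assumes `direction` is a unit vector, so the quadratic's 'a' term is 1.
    """
    # Vector from the sphere center to the ray origin
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearest of the two intersection points
    return t if t > 0 else None

# A ray fired straight down the z-axis at a unit sphere centered 5 units away
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(hit)  # 4.0 -- the ray enters the sphere's surface at z = 4
```

A real-time ray tracer performs billions of tests like this per frame-second, which is precisely the workload that dedicated RT hardware accelerates.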

Ray tracing isn't a new phenomenon—indeed, some of the earliest personal computers, such as the Amiga, were famous for being able to generate what—at the time—felt like very realistic-looking images created entirely on a home computer via ray tracing. Back then, however, it could often take hours to complete a single image because of the enormous amount of computing power necessary to create the scene. Today, we're starting to see the first implementations of real-time ray tracing, where GPUs are able to generate extremely complex images at the fast frame rates necessary for compelling gameplay.

"[Nvidia] is working to push the momentum [for real-time ray tracing] forward with their second-generation desktop graphics cards, the RTX Super line."

Nvidia kicked off the real-time, PC-based ray tracing movement last year with the debut of their Turing GPU architecture and the RTX 2000 series graphics cards based on those GPUs. Now the company is working to push the momentum forward with their second-generation desktop graphics cards, the RTX Super line, including the RTX 2060 Super, RTX 2070 Super, and RTX 2080 Super.

All three cards offer performance improvements in both ray tracing and traditional graphics acceleration. At the high end ($999), the RTX 2080 Ti remains the highest-performing card in the Nvidia line, while at the low end ($349), the original RTX 2060 remains the lowest-priced option. In between, the original 2070 and 2080 are being replaced by their Super versions at the same $499 and $699 prices, while the 2060 Super, at $399, ups the onboard graphics memory to 8 GB and nearly matches the performance of the original RTX 2070. As a bonus, all three RTX Super cards come bundled with two games that support real-time ray tracing: Control and Wolfenstein: Youngblood.

Nvidia faced some criticism (and, reportedly, saw somewhat muted sales) after the launch of the first-generation RTX cards because of the limited support for real-time ray tracing in many popular PC gaming titles. Since then, the major game engines, including Unreal and Unity, have announced support for ray tracing, as has Microsoft's DirectX Raytracing (DXR) API, along with several AAA titles, including Cyberpunk 2077 and Call of Duty: Modern Warfare. In addition, other games, such as Quake II RTX and Bloodhound, have also announced support for ray tracing acceleration hardware.

On top of this, recent announcements from both Microsoft (Project Scarlett) and Sony (PlayStation 5) made it clear that the next generation of game consoles (expected in 2020) will incorporate hardware-based support for real-time ray tracing as well.

Interestingly, both of those devices will be powered by AMD-designed GPUs, strongly suggesting that AMD will be bringing real-time ray tracing hardware technology to future generations of their Radeon line of desktop and laptop GPUs.

"As the market has demonstrated, not everybody currently feels the need to purchase GPUs with dedicated ray tracing accelerated hardware."

As the market has demonstrated, not everybody currently feels the need to purchase GPUs with dedicated ray tracing accelerated hardware. Many gamers focus on purchasing desktop graphics cards (or gaming laptops) that can play the current titles they’re interested in at the fastest possible frame rates and the highest possible screen resolutions at price points they can afford.

For those gamers who are thinking ahead, however, it’s clear that there’s a great deal of momentum starting to build around real-time ray tracing. In addition to the previous examples, both Nvidia and AMD have announced software-based support of ray tracing in the latest drivers for their existing GPUs, which will likely encourage more game developers to add support for the technology in their next generation games. While the software-based solutions won’t run as fast, nor provide the same level of image quality for ray traced effects as hardware accelerated solutions, they will at least make people more aware of the kind of graphics enhancements that ray tracing can provide.

The evolution of computer graphics is still clearly moving ahead, and as a long-time industry watcher, I find it great to see the once far-off concept of real-time ray tracing finally come to life.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.


 
My RTX 2080Ti can run 100% of the games on the market in full detail.

However: I'm totally not impressed by Ray Tracing thus far.

You mean to tell me, I spent over $1000 on a GPU designed to take advantage of new tech and the best you can give me is Quake 2?

Which BTW doesn't run as well with the 2060 and 2070 despite being 22 years old?
 
I bought an RTX card mostly because I wanted a glimpse of the future. Granted we currently only get to see one piece of the puzzle per game (ray traced global lighting in Metro Exodus and Quake II RTX, ray traced shadows in Shadow of the Tomb Raider, and ray traced reflections in Battlefield V), but I'm excited about the day when all of these pieces are put together!
 
My RTX 2080Ti can run 100% of the games on the market in full detail.

At 2K @ 165HZ/165FPS or 4K @ 60Hz/60FPS, the 2080Ti will run into trouble, and can struggle to hit over 50FPS in many titles at said settings.
https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/
Also, its min FPS dips well below 60 in many games.
The performance is still very good, not saying the GPU doesn't perform very well, but its still very overpriced for its performance IMO.
As far as Ray Tracing, your card will be old news and obsolete before that's a feature worth having.
 
At 2K @ 165HZ/165FPS or 4K @ 60Hz/60FPS, the 2080Ti will run into trouble, and can struggle to hit over 50FPS in many titles at said settings.
https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/
Also, its min FPS dips well below 60 in many games.
The performance is still very good, not saying the GPU doesn't perform very well, but its still very overpriced for its performance IMO.
As far as Ray Tracing, your card will be old news and obsolete before that's a feature worth having.
You can get EVGA 2080Ti from Ebay for $1000 now. NEW.
 
I bought an RTX card mostly because I wanted a glimpse of the future. Granted we currently only get to see one piece of the puzzle per game (ray traced global lighting in Metro Exodus and Quake II RTX, ray traced shadows in Shadow of the Tomb Raider, and ray traced reflections in Battlefield V), but I'm excited about the day when all of these pieces are put together!


I'd like to see a SPLINTER CELL where shadows play a huge role in stealth.

Thus far, they promise better transparency/gas effects/shadows and lighting from explosive/flammable sources. Surely a stealth game is the best showcase.
 
Again @QuantumPhyics, I am not trying to downplay your GPU; it's very nice and I am sure your system is very nice as well. But for the money that GPU costs, I feel it should crush everything no matter what, with RT on.
 
I really like the design of the FE Super. I'm currently running a Radeon VII, and I picked it over the RTX 2080 mostly because of its 16GB buffer that comes in handy at 4K, but if AMD doesn't come up with a card at least as fast as the 2080 Ti next year I will be switching to nVidia, because my Radeon overclocked to 1975MHz struggles to hit 60 fps in every game at 4K Ultra settings.
 
RTX is not a thing yet, it's a novelty.
These Super cards are touting RTX at 60fps at 1080p. I have a 1440p monitor, which I'd rather utilise.

It's like buying a 4k TV to watch DVDs.

2017 cards are still a better buy, unless you game at 1080P?

I dunno, sounds backwards to me.
 
At 2K @ 165HZ/165FPS or 4K @ 60Hz/60FPS, the 2080Ti will run into trouble, and can struggle to hit over 50FPS in many titles at said settings.
https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/
Also, its min FPS dips well below 60 in many games.
The performance is still very good, not saying the GPU doesn't perform very well, but its still very overpriced for its performance IMO.
As far as Ray Tracing, your card will be old news and obsolete before that's a feature worth having.
You can get EVGA 2080Ti from Ebay for $1000 now. NEW.
2k = 1920x1080 not 2.5k 2560x1440
 
I love real time ray tracing and love that it’s trickling down towards mainstream. Can’t wait to see more and more games support it. To me it’s the single biggest advance I’ve seen in image quality for years.
 
In the article it says that sales of RTX cards dropped due to lack of adoption, and I feel that was the lesser of the two reasons holding back sales, as the other was the value proposition being soooooo poor for this generation, especially with the initial shock of the extreme hit to performance with RT on. Early adopters who got the card specifically for that reason, or anyone who was on the fence and was swayed by that specific tech, knew what they were getting into for the most part and understood that adoption would be slow, IMO. And while I do praise Nvidia for pushing this issue technologically, it feels like those high prices were inflated for the sake of shareholders, because AMD ain't been much competition on the xx70 range and up front. Either that or they really got screwed by TSMC over 12nm, obviously in yields given this refresh, but in cost as well. Either way I'm calling shenanigans somewhere in all that.
 
the other was the value proposition being soooooo poor for this generation

With no "tech demonstrators" like Crysis to bring rigs to their knees, there's just no reason to spend this much money beyond the fact: #1 You have it to spend and #2 You want some level of future proofing.

Supposedly Crysis is getting updated with RTX. That's gonna be a horribly optimized buggy mess - as Crysis doesn't run perfectly on my 1060, my 1080, my 2080Ti or my previous Titan Xp.
 
I have to admit, the 2070 super is compelling now. It is the 1080ti at $500. That's a beast. It's no better at RTX than the 2070 it is replacing, but then no one used the 2070 for RTX in the first place.
 
I have to admit, the 2070 super is compelling now. It is the 1080ti at $500. That's a beast. It's no better at RTX than the 2070 it is replacing, but then no one used the 2070 for RTX in the first place.

I'd have been happier if the lowest RTX card was more powerful than the 1080Ti.
 
I wonder what it costs per card? The reason only a few games support Nvidia RT is because it causes a 50%+ drop in FPS, with a difference you have to be told to look for in most cases. You could probably go from 1080p to 2K, or 2K to 4K, with the same impact on FPS. It's irritating to see Nvidia ignore reality.

I guess showing a graph of the performance impact of Nvidia RT would make too much sense. Maybe even show us how much it has improved in performance since release? None, idk?

How about a demonstration on how much better gaming looks when increasing resolution and other settings vs NVIDIA RT at the same FPS loss.
 
I hear if you go into the bathroom, turn off the lights and say Nvidia Ray Tracing 3 times into the mirror Lisa Su will appear and smack the **** out of you.
 
2k = 1920x1080 not 2.5k 2560x1440
I get the average street-talk portion of it, but this message is of no use; anyways, I used to call 1080p 1K.
It's [1440p] about 3.7 million pixels; 4K is over 8.2 million.
2K/4K/8K isn't about pixel count giving them the name: each one is 4x bigger than the one before.
There are sixteen 1920x1080 screens in an 8K display, but only four 4K screens, and each of those 4K screens holds four 1080p screens. Marketing bull. I wish screens were sold by pixel count; it would make things much simpler considering all these different aspect ratios.
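The 4x-per-step scaling described above checks out with quick arithmetic; a throwaway Python snippet to verify:

```python
# Common display resolutions (width, height)
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

# Total pixel count for each resolution
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1080p"])   # 4.0  -- a 4K panel holds four 1080p screens
print(pixels["8K"] / pixels["1080p"])   # 16.0 -- an 8K panel holds sixteen
print(pixels["1440p"])                  # 3686400 -- roughly 3.7 million pixels
```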
 
IMO, I would not be surprised if nVidia knew about the small increase in performance of the RTX series cards well before they launched. As such, they had no choice but to spin RTX as a game changer. The hype just has not panned out unfortunately for them.

Perhaps it will teach some out there to hone their BS detector. A long time ago in a galaxy far, far away, I spent $2K on a graphics card. For me, it turned out to be a lesson in hype. I'll never spend that much again. Most of the crop of high-cost nVidia cards from any generation simply do not perform well enough, IMO, to justify their cost.
 
IMO, I would not be surprised if nVidia knew about the small increase in performance of the RTX series cards well before they launched. As such, they had no choice but to spin RTX as a game changer. The hype just has not panned out unfortunately for them.

Perhaps it will teach some out there to hone their BS detector. A long time ago in a galaxy far, far away, I spent $2K on a graphics card. For me, it turned out to be a lesson in hype. I'll never spend that much again. Most of the crop of high-cost nVidia cards from any generation simply do not perform well enough, IMO, to justify their cost.
Almost every year we get "refreshes" of existing hardware. This year they just named them Super.
 
If you're thinking ahead, then you are waiting for next-gen hardware. With so few games with ray tracing enabled, and with first-gen hardware taking a huge hit, who cares for current-gen stuff. By the time Ampere and Navi+ ship, the hardware will be far more capable and we might have a critical mass of games with ray tracing support. Why waste money now for almost no benefit. And apart from ray tracing, DLSS is a waste of time too. Nvidia should have released GTX cards, not RTX Super. A $399 GTX 1670 and a $499 GTX 1680 would have been much more useful to me.
 