Nvidia launches RTX Video Super Resolution to boost streaming video quality

Shawn Knight

What just happened? Nvidia has launched a new AI upscaling tool designed to boost the effective resolution and overall quality of online videos. Dubbed RTX Video Super Resolution (VSR), the new tool taps into the power of GeForce RTX 40 and 30 Series GPUs to upscale lower-resolution video content up to 4K. Nvidia said the AI is able to remove compression artifacts including blockiness, banding on flat areas, ringing artifacts around edges and washing out of high-frequency details while improving overall sharpness and clarity, all in a single pass.

According to the GPU maker, nearly 80 percent of Internet bandwidth today consists of streaming video, and 90 percent of it is streamed at 1080p quality or lower. This is fine for lower-resolution devices or older TVs that don't support 4K, but when viewed on a computer with a high-res display, the resulting image is often soft or blurry.

The company likened it to putting on a pair of prescription glasses to "snap the world into focus."

Nvidia's neural network model was trained on "countless" images at different resolutions and compression levels so it would know how to properly handle all types of content. An early version of this same tech debuted on the Shield TV in 2019, but that implementation mostly targeted living room use. PC users sit much closer to their displays, so the tech had to be enhanced to deliver a higher level of processing and refinement.
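Nvidia hasn't published the architecture behind RTX VSR, but conceptually, single-pass super resolution tends to pair a cheap deterministic upscale with a learned correction. The toy PyTorch sketch below is purely illustrative (all names and sizes are our own, not Nvidia's actual model) and shows the general shape of such a pipeline:

```python
# Toy single-pass "upscale + cleanup" model (illustrative only; Nvidia has
# not published VSR's architecture). The idea: a cheap deterministic upscale
# plus a small learned residual that handles deblocking, deringing, and
# sharpening in the same forward pass.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToySuperRes(nn.Module):
    def __init__(self, scale: int = 2, channels: int = 32):
        super().__init__()
        self.scale = scale
        # Tiny conv stack that predicts a correction for the naive upscale.
        self.refine = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        # frame: (N, 3, H, W), a low-resolution, possibly compressed video frame
        up = F.interpolate(frame, scale_factor=self.scale,
                           mode="bicubic", align_corners=False)
        return (up + self.refine(up)).clamp(0.0, 1.0)

# Demo on a small crop; a real pipeline would feed full 1080p frames and use
# a larger scale factor (or chained passes) to reach 4K.
model = ToySuperRes(scale=2)
crop = torch.rand(1, 3, 270, 480)   # stand-in for a video frame
print(model(crop).shape)            # torch.Size([1, 3, 540, 960])
```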

RTX VSR is available now as part of the latest GeForce Game Ready Driver and requires an RTX 40 or 30 Series GPU. We're told it works with most content streamed in Microsoft Edge and Google Chrome. Make sure you're running the latest version of your browser, as support for the tech was only recently added.
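For the driver side of that checklist, the nvidia-smi utility that ships with the driver offers a quick way to confirm what's installed. A small sketch (the version string in the comment is only an example):

```python
import subprocess

# nvidia-smi ships with the GeForce driver on Windows and Linux; querying
# the driver version is a quick sanity check before hunting for the toggle.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version,name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())   # e.g. "531.18, NVIDIA GeForce RTX 3080"
```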

With everything in place, simply open the Nvidia Control Panel, head to "adjust video image settings," look for the "RTX video enhancement" section and tick the super resolution box. The feature currently offers four quality levels to choose from, ranging from the lowest performance impact to the highest level of improvement.


 
I guess it would be a nice feature to have. YouTube compression is BAD, and it'd be nice to be able to lower the resolution for people with data caps. Not so much for wired connections, but for people who use their cell phones as a home internet service. When I travel for work I've used my cell phone's hotspot as internet, and my 50 gigs runs out fairly quickly if I'm watching anything higher than 480p.
 
Very important detail to note: in testing performed today by a few other sites it HALVES performance in games if you have a video running while gaming. So, this is a great tool if you don't do both at the same time, but will destroy performance otherwise. Plus, it's not too easy to disable it--they decided not to build in a hotkey.
 
Re: the before/after photo, I'm looking at the building with the green line down the center of it, and I see no real difference between the left and right sides. Am I missing something or is the issue just that Techspot shrank the provided image down so much that you can no longer see the benefits?
 
Very important detail to note: in testing performed today by a few other sites it HALVES performance in games if you have a video running while gaming. So, this is a great tool if you don't do both at the same time, but will destroy performance otherwise. Plus, it's not too easy to disable it--they decided not to build in a hotkey.
Wow, that's really too bad, and it makes the feature completely useless for me. Does the product manager not realize that plenty of PC gamers frequently do both things simultaneously? It should have been so easy to just auto-disable the feature when GPU load is over X%, or when the GPU knows it is playing a game (which, given the driver has profiles for hundreds or thousands of games, shouldn't be too hard).
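For what it's worth, the load-detection half of that idea is trivial with Nvidia's own NVML bindings. A minimal Python sketch, where the 80 percent threshold stands in for the hypothetical "X%" (actually flipping the VSR toggle would need a driver hook Nvidia doesn't publicly expose, so this only detects the condition):

```python
import time
import pynvml  # NVIDIA's official NVML bindings (pip install nvidia-ml-py)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
LOAD_THRESHOLD = 80  # the hypothetical "X%" from above

while True:
    util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
    if util > LOAD_THRESHOLD:
        # A driver-side hook would pause VSR here; no public API exists for
        # the toggle itself, so this only demonstrates the detection step.
        print(f"GPU at {util}% - game likely running, VSR should back off")
    time.sleep(5)
```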
 
Very important detail to note: in testing performed today by a few other sites it HALVES performance in games if you have a video running while gaming. So, this is a great tool if you don't do both at the same time, but will destroy performance otherwise. Plus, it's not too easy to disable it--they decided not to build in a hotkey.
another niche failure by nvidia imo
 
Wow, that's really too bad, and it makes the feature completely useless for me. Does the product manager not realize that plenty of PC gamers frequently do both things simultaneously? It should have been so easy to just auto-disable the feature when GPU load is over X%, or when the GPU knows it is playing a game (which, given the driver has profiles for hundreds or thousands of games, shouldn't be too hard).
You can't expect it to be perfect out of the gate. At least you get better quality when you're not gaming... unless it isn't better, in which case it's truly useless.
 
Great.... now Jensen Huang thinks everyone needs to buy a $1k+ RTX card with every TV purchase...


The marketing is out of control.... every single modern tv upscales to 4k.
 
I have seen a few comparisons online, but the difference is minimal, almost non-existent.
Meanwhile, lots of people noticed a big increase in GPU usage, with power draw and temperatures going way up with this feature.
Better to just keep this thing off.
 
Great.... now Jensen Huang thinks everyone needs to buy a $1k+ RTX card with every TV purchase...


The marketing is out of control.... every single modern tv upscales to 4k.

Does he? And do you only watch videos on your TV?

Really, just like the previously added microphone denoising, it's nothing more than a nice bonus.
Membership has its benefits.
 
It looks very plastic... but some people might like it. I guess when YouTube drops to 1080p and everything looks terrible it might be handy.
 
Does he? And do you only watch videos on your TV?

Really, just like the previously added microphone denoising, it's nothing more than a nice bonus.
Membership has its benefits.
Where else are you going to watch a signal that is using your RTX card...?

Computer room or TV room... those are your two choices!
 
It's a free feature that works; I don't really see how anyone could complain about that. If Nvidia had come out and tried to justify the price of the RTX 4080 with this feature and locked it to the 40 series, then I would criticize it. But at the moment, it seems to me like something that makes your GPU better (or at least adds functionality) should you already own an RTX 30/40, not a selling point by any means.
 
It's a free feature that works; I don't really see how anyone could complain about that. If Nvidia had come out and tried to justify the price of the RTX 4080 with this feature and locked it to the 40 series, then I would criticize it. But at the moment, it seems to me like something that makes your GPU better (or at least adds functionality) should you already own an RTX 30/40, not a selling point by any means.

It's useless, because what 4K monitor or TV doesn't have 4K upscaling already?
 
It's useless, because what 4K monitor or TV doesn't have 4K upscaling already?
I mean, neither my 4K monitor nor my 4K television has AI upscaling. I know there are some that do, but I would think that most do not. Do they have something like 1:4 upscaling? Yes. But AI upscaling that calculates and adds new pixels rather than just replicating them? Few televisions do that. From what I have seen, RTX VSR does a pretty good job, especially with 720p-to-4K content, making it look nearly native 4K if the original video is decent quality. 1080p to 4K already looks good, but it definitely makes it look native 4K.
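To make the replication-versus-calculation distinction concrete, here's a toy NumPy illustration (not what any particular TV or GPU actually runs):

```python
import numpy as np

row = np.array([10.0, 20.0, 30.0])   # three neighboring pixels

# Plain replication (what basic 1:4-style scalers do): pixels are simply
# repeated, which preserves blockiness.
replicated = np.repeat(row, 2)        # [10. 10. 20. 20. 30. 30.]

# Interpolation *calculates* the new in-between values; AI scalers go a step
# further and predict plausible detail instead of just averaging neighbors.
xs = np.linspace(0, row.size - 1, 6)
interpolated = np.interp(xs, np.arange(row.size), row)
print(replicated)
print(interpolated)                   # [10. 14. 18. 22. 26. 30.]
```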
 
Another BS gimmick to peddle the expensive products of the greedy company.
I'll take that over the half-assed, rushed-to-market expensive products and g̶i̶m̶m̶i̶c̶k̶s̶ features of the other just-as-greedy, if not greedier, company.

Another nice-to-have feature, just like Broadcast. It won't sell GPUs by itself, but it's another decent little addition some will love and most won't use.
 
I mean, neither my 4K monitor nor my 4K television has AI upscaling. I know there are some that do, but I would think that most do not. Do they have something like 1:4 upscaling? Yes. But AI upscaling that calculates and adds new pixels rather than just replicating them? Few televisions do that. From what I have seen, RTX VSR does a pretty good job, especially with 720p-to-4K content, making it look nearly native 4K if the original video is decent quality. 1080p to 4K already looks good, but it definitely makes it look native 4K.

Not sure if you know... but ALL 4K TVs do this...!
Samsung is on its 3rd generation of AI 4K & 8K upscaling. LG, Sony, etc...
 
Not sure if you know... but ALL 4K TVs do this...!
Samsung is on its 3rd generation of AI 4K & 8K upscaling. LG, Sony, etc...
All 4K televisions do this, yet I have a 4K television that does not. I have an LG 85nano with a 120Hz refresh rate, etc. It's a great gaming television, but it does not use AI upscaling. I have a Gigabyte M32U gaming monitor; it's a great gaming monitor, but it does not have AI upscaling. So, I'm really not sure where you are coming from on this.
 
The point is... your older 4K TV still has upscaling, even if it's outdated and doesn't have AI upscaling. Have you done a side-by-side comparison?

Hooking an RTX card to an old TV to get (perhaps) a better picture is an extremely narrow use case. You could buy a whole new TV with AI upscaling for the cost of an RTX card... so its use case is small.
 
The point is... your older 4K TV still has upscaling, even if it's outdated and doesn't have AI upscaling. Have you done a side-by-side comparison?

Hooking an RTX card to an old TV to get (perhaps) a better picture is an extremely narrow use case. You could buy a whole new TV with AI upscaling for the cost of an RTX card... so its use case is small.
I have, and as I already explained, I have an RTX 3080.
 
Nearly a moot point now.... Windows itself will also be using AI hardware upscaling soon (on Nvidia & AMD GPUs).

Most likely why Nvidia made a point of announcing it first....
 
Nearly a moot point now.... Windows itself will also be using AI hardware upscaling soon (on Nvidia & AMD GPUs).

Most likely why Nvidia made a point of announcing it first....
AI-based video upscaling using certain Intel GPUs has been in Chrome since last April, but no noticeable effort has been made to finalize the feature and support it officially, due to power concerns and other matters. If AMD is planning to do likewise, they've made no indication yet.
 