Nvidia claims RTX 40 series AV1 encoder is better than AMD and Intel alternatives

nanoguy

In context: It's no secret Nvidia's NVENC media engine in GeForce GPUs has been the preferred option for streamers for years, but AMD and Intel have recently done a decent job of catching up. That said, Team Green claims its RTX 40 series GPUs are still better than AMD and Intel equivalents when it comes to AV1 hardware encoding.

Recently, AMD has been extolling the VRAM capacity of its Radeon RX 6000 and RX 7000 series graphics cards, while criticizing Nvidia offerings like the RTX 4070 and RTX 4070 Ti, which offer less VRAM than comparable Radeon models despite carrying higher price tags.

This happened just as Nvidia was getting ready to launch one of its Ada offerings, so it was only a matter of time before Team Green fired back at AMD. In a new blog post detailing the launch of OBS Studio 29.1, the company explains how GeForce RTX 40 series GPUs are more capable than both AMD and Intel alternatives when it comes to AV1 encoding.

The new version of OBS comes hot on the heels of YouTube adding support for AV1 live streaming over Enhanced RTMP, which is meant to improve video quality while also reducing bandwidth requirements. AV1 is much more computationally intensive than the more commonly used H.264 (AVC) codec, but several integrated and discrete GPUs now offer partial or full hardware support for encoding and decoding it, so it's no surprise companies are using every opportunity to promote it over the older format.

Nvidia says the encoder in its RTX 40 series GPUs can handle AV1 in real time, allowing you to stream in 4K at 60 frames per second. The company claims this is anywhere between 30 and 50 percent more efficient than live streaming with H.264, meaning you'd only need around 10 Mbps of upload bandwidth for a 4K60 stream.
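That ~10 Mbps figure follows directly from the efficiency claim. As a rough sanity check, here is a short Python sketch of the arithmetic; the 15-20 Mbps H.264 baseline for 4K60 is our assumption for illustration, not a number Nvidia provides.

```python
# Back-of-the-envelope check of the bandwidth claim above.
# Assumption (not from the article): a 4K60 H.264 stream typically needs
# roughly 15-20 Mbps to hold up.
h264_baselines_mbps = [15, 20]
savings = [0.30, 0.50]  # the 30-50 percent efficiency gain Nvidia cites

for baseline in h264_baselines_mbps:
    for s in savings:
        av1_mbps = baseline * (1 - s)
        print(f"H.264 at {baseline} Mbps, {s:.0%} saved -> ~{av1_mbps:.1f} Mbps with AV1")

# The results span roughly 7.5-14 Mbps, which is in line with the ~10 Mbps figure.
```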

Team Green also claims you can get much better video quality from the NVENC unit inside RTX 40 series cards than from competitors' hardware, but the only evidence provided is a comparison screenshot, in which the Intel and AMD encodes do look a little underwhelming. The company says it tested a 4K60 AV1 stream using a GeForce RTX 4080, a Radeon RX 7900 XT, and an Intel Arc A770 with the OBS Studio default settings at 12 Mbps.
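Claims like this can be checked independently without OBS. The sketch below shows one way a cross-vendor AV1 comparison could be run with FFmpeg and scored with VMAF; it assumes an FFmpeg build that includes the av1_nvenc, av1_amf, and av1_qsv hardware encoders plus the libvmaf filter, the source file name is a placeholder, and each encode would of course need to run on a machine with the corresponding GPU.

```python
import subprocess

SOURCE = "gameplay_4k60_lossless.mkv"  # placeholder: a lossless capture standing in for the live feed

# One hardware AV1 encoder per vendor, as exposed by recent FFmpeg builds.
ENCODERS = {"nvidia": "av1_nvenc", "amd": "av1_amf", "intel": "av1_qsv"}

for vendor, codec in ENCODERS.items():
    out = f"{vendor}_av1_12mbps.mkv"
    # Encode at an average 12 Mbps to mirror the bitrate Nvidia says it used.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec, "-b:v", "12M", "-an", out],
        check=True,
    )
    # Score the encode against the source with VMAF (distorted input first, reference second).
    subprocess.run(
        ["ffmpeg", "-i", out, "-i", SOURCE, "-lavfi", "libvmaf", "-f", "null", "-"],
        check=True,
    )
```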

Of course, independent reviewers don't always arrive at the same conclusions in their testing of these technologies. YouTuber EposVox recently looked at the new OBS Studio feature while it was in beta and found that media engines in the newest Nvidia, AMD, and Intel GPUs are quite similar when it comes to AV1 stream quality. The only issue is that RX 7000 series GPUs struggle with 4K60 streaming, but that will no doubt be corrected through software updates.

It's also worth noting Nvidia's GeForce software stack has areas where it is sorely lacking in comparison to AMD's. The Nvidia Control Panel is arguably a relic at this point, GeForce Experience cannot be used without an Nvidia account, and the Nvidia ShadowPlay feature has seen much less love than AMD's ReLive in recent years.

If anything, we'd love to see AMD work on something similar to Nvidia's RTX Remix for the game modding community. AMD could also bring Windows support for ROCm (its alternative to Nvidia's CUDA) to consumer Radeon GPUs.

The latter, ROCm on Windows, is actually expected sometime in the coming months, but the former will likely remain a pipe dream. Meanwhile, Intel's Arc Control software has generally been well-received by gamers and offers a pleasant user experience, save for some small quirks.


 
I've been a member of the "roast nVidia" bandwagon for a while, and it's getting pretty exhausting. My problem with this is that if you're streaming at 4K60, bandwidth probably isn't an issue. Look, I just don't see this as a big deal. We aren't watching streamers to look around the game and say "pretty graphics." We're watching because we're interested in what they're doing. I don't think this is going to sway gamers. Maybe they're doing it to get their cards into the hands of streamers because of all the bad PR they've been getting recently.

But the streaming rigs I see for people who are serious either have two graphics cards or a dedicated encoding machine. People who are serious about frames at 4K60 are still going to want to offload as much of the encoding workload as possible from their gaming hardware. Maybe someone just getting into streaming would upgrade to this card, but they could also just have their old graphics card do all the encoding for them.

It's a good feature to have and one that's long overdue, I just don't see a big market for it. nVidia could make more money by discounting their cards by $20 across the board than they would by adding AV1 encoding as a feature. If there is a single person reading this article who is concerned about AV1 encoding for live streaming over H.264, please comment and tell me. I'm not arguing, I seriously want to know if there is anyone who reads TechSpot (and the comments) who actually cares about this.
 
Frankly, we should not even discuss 'screenshots' shared by a company without our own screenshots and comparisons. It wouldn't take more than a couple of hours to do that and then discuss claim versus reality. Sorry to say, but this is just lazy and misleading.
 
The article said:
The only issue is that RX 7000 series GPUs struggle with 4K60 streaming, but that will no doubt be corrected through software updates.

Uh huh. Let's not harp on the fact that AMD's software stack fails once again.
It'll probably work at some point in the future. And then we'll crow about fine wine 🙄
 
Uh huh. Let's not harp on the fact that AMD's software stack fails once again.
It'll probably work at some point in the future. And then we'll crow about fine wine 🙄
This is also on the streaming software, not just AMD. Nvidia is usually the one to receive the first batch of support and optimisations.
 
To me, this sounds like Nvidia is getting desperate for some reason, falling back on the old "mine is bigger than yours" argument. It makes it all just more marketing blather, with Nvidia tooting its own horn trying to get attention. I typically ignore such claims. Maybe that's on me and maybe Nvidia is right, but frankly, I'm getting sick of Nvidia trying to convince everyone that the prices on their products are justified.
 
WTF! They are comparing AV1 on the Nvidia GPUs with H.264 on AMD and Intel! They don't even dare to do it with H.265! Also, all of them have H.264, H.265, and AV1.

Misleading propaganda.

On the other hand, I don't give a damn what Nvidia says, or AMD and Intel for that matter, with their hardware encoders. All my video and audio encoding is done on the CPU, which gives the extra quality that none of them can match.
 
WTF! They are comparing AV1 on the Nvidia GPUs with H.264 on AMD and Intel! They don't even dare to do it with H.265! Also, all of them have H.264, H.265, and AV1.

Misleading propaganda.

On the other hand, I don't give a damn what Nvidia says, or AMD and Intel for that matter, with their hardware encoders. All my video and audio encoding is done on the CPU, which gives the extra quality that none of them can match.
If you want quality - take your time and use the CPU :) If you want speed then by all means use NVENC or AV1 etc. but don't claim a quality crown. From what I've seen most streams are 1080p anyway....
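A minimal sketch of that tradeoff, assuming an FFmpeg build with SVT-AV1 (CPU) and NVENC AV1 (RTX 40 series) support; the input file, preset, and quality values here are illustrative only, not recommendations:

```python
import subprocess, time

SOURCE = "input.mkv"  # placeholder source file

def encode(video_args, out):
    start = time.time()
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *video_args, "-an", out], check=True)
    print(f"{out}: {time.time() - start:.0f} s")

# Slow, quality-first CPU encode (a lower SVT-AV1 preset means slower and better).
encode(["-c:v", "libsvtav1", "-preset", "4", "-crf", "30"], "av1_cpu.mkv")

# Fast hardware encode on the GPU (-cq is NVENC's constant-quality knob),
# trading some quality per bit for a large speed advantage.
encode(["-c:v", "av1_nvenc", "-cq", "30"], "av1_nvenc.mkv")
```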
 
If you want quality - take your time and use the CPU :) If you want speed then by all means use NVENC or AV1 etc. but don't claim a quality crown. From what I've seen most streams are 1080p anyway....

I use NVENC a lot - a lot of movies don't deserve the overnight treatment on my CPU.
I mean, if it's important to have a great backup you can just use MakeMKV and leave it as a remux, with all the unnecessary languages and audio streams stripped - or just play your actual disc. Quality DVD to Ultra HD players are harder to find, I suppose.

I eyeball the movie - if there's noisy grain, especially temporal noise, I'll use the hardware denoiser KNLMeansCL at light or, rarely, medium settings (sometimes custom). Light rarely degrades the image and in my opinion often improves it, as I like grain - just not low-light sparkles.
I use SSIM to check - I know it isn't completely accurate - and normally get high 98s or low 99s (modern movies are easier as they're cleaner). Yes, KNLMeansCL will boost that a bit, but as I said, I save temporal denoising for special effects in cartoons/anime/flashbacks. Occasionally you'll get a movie whose SSIM comes in around 95; when eyeballing it, most of that is really hard-to-encode material that isn't that important, and throwing bitrate at it would be a waste since NVENC must struggle with it.
I do throw extra bitrate in - my base is 7,000 kbps variable for 1080p. When I compare with, say, a top encoder like Tigole out of curiosity (not YIFY, the famous Kiwi encoder), they appear pretty equal, and they would use maybe 2,000 kbps less.

I remember when I used to get a DVD onto a CD in, say, DivX/MP4 - I was brutal, I stripped out all the credits to get to 700 MB - and was happy enough. But kids' movies like Ice Age should have the graphics updated anyway - a higher color palette, better lighting and details - a $5 million budget could get it a new release.
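For anyone curious, here is a minimal sketch of that kind of SSIM check using FFmpeg's ssim filter; the file names are placeholders, and a different SSIM tool than the one used above will produce slightly different numbers:

```python
import subprocess

REFERENCE = "remux.mkv"       # placeholder: the untouched source/remux
ENCODE = "nvenc_7000k.mkv"    # placeholder: the NVENC encode being checked

# FFmpeg's ssim filter compares the first input (the encode) against the
# second (the reference) and logs an average score such as "All:0.9870".
subprocess.run(
    ["ffmpeg", "-i", ENCODE, "-i", REFERENCE, "-lavfi", "ssim", "-f", "null", "-"],
    check=True,
)
```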
 
WTF! They are comparing AV1 on the Nvidia GPUs with H.264 on AMD and Intel! They don't even dare to do it with H.265! Also, all of them have H.264, H.265, and AV1.

The article said:
The company says it tested a 4K60 AV1 stream using a GeForce RTX 4080, a Radeon RX 7900 XT, and an Intel Arc A770 with the OBS Studio default settings at 12 Mbps.

What are you on about?
 