Intel Arc AV1 encoder beats AMD and Nvidia's H.264 encoders in early tests

Daniel Sims

In brief: Intel has had a rough time launching its freshman series of dedicated graphics cards, struggling to get them to consumers on time and turning in what may be unimpressive gaming benchmark results. However, the cards mark the debut of AV1 hardware encoding, and early tests show big efficiency gains over Nvidia and AMD's H.264 encoders.

The AV1 hardware encoder in Intel's new Arc A380 destroyed Nvidia and AMD's H.264 encoders in initial real-world tests this week. The results are sorely needed good news for Intel's entry into the GPU space and bode well for AV1's future in content creation.

Late last month, computers featuring Intel's entry-level Arc Alchemist graphics cards hit the international market after weeks of delays. YouTuber EposVox bought one of the first available A380s and measured its AV1 encoding against multiple H.264 encoders.

Also see: TechSpot's Intel Arc A380 Review

Support for the new AV1 video codec has expanded rapidly over the last several months. It promises more efficient compression than competitors like VP9 or H.264 and is royalty-free, unlike H.265.

However, decoding AV1 requires relatively recent hardware like Nvidia's RTX 30 series graphics cards, AMD's Radeon 6000 GPUs, or 11th generation or later Intel CPUs.

Apple devices don't support it yet, but a Qualcomm Snapdragon chip will next year. Streaming services like YouTube, Twitch, and Netflix at least partially support AV1 at the moment and will likely expand their support in the future. Firefox added AV1 hardware decoding in May after relying solely on software decoding.
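
For readers who want to check what their own setup can handle, FFmpeg can list the AV1 decoders its build exposes. Below is a minimal sketch, assuming FFmpeg is installed and on the PATH; hardware entries such as av1_qsv (Intel) or av1_cuvid (Nvidia) only appear when the build and drivers support them.

    # Minimal sketch: list the AV1-related decoders exposed by the local FFmpeg build.
    # Assumes FFmpeg is installed and on the PATH; hardware decoders only show up
    # when the build and drivers support them, and names vary between builds.
    import subprocess

    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-decoders"],
        capture_output=True, text=True, check=True,
    )
    av1_lines = [line.strip() for line in result.stdout.splitlines() if "av1" in line.lower()]
    print("\n".join(av1_lines) if av1_lines else "No AV1 decoders found in this FFmpeg build.")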

While hardware decoding to view AV1 content is gaining broad support, encoding to produce AV1 videos was only possible through software until Intel Arc. EposVox's results indicate content creators may want to switch to the codec as its use widens.

When streaming games like Halo Infinite and Battlefield 2042, Intel's AV1 encoder produced significantly cleaner video than H.264 encoders like Nvidia's NVENC, AMD's AMF, and Intel's QSV, even at lower bitrates. The Intel AV1 encoder at 3.5Mbps appears to beat AMD and Nvidia's at 6Mbps. AV1's lead shrinks at higher bitrates, but at 6Mbps it still outperforms the other GPU vendors' 8Mbps results.
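
For anyone curious to run a rough comparison of their own, the sketch below shows the general idea using FFmpeg's hardware encoders at the bitrates discussed above. This is not EposVox's methodology; it assumes an FFmpeg build with QSV, NVENC, and AMF support, working GPU drivers, and a hypothetical local capture file, and encoder names and flags can vary between builds.

    # Minimal sketch of a bitrate comparison in the spirit of the tests above
    # (not EposVox's actual methodology). Assumes an FFmpeg build with QSV,
    # NVENC, and AMF support; the input file name below is a placeholder.
    import subprocess

    SOURCE = "gameplay_1080p60.mkv"  # hypothetical high-quality capture used as the input

    JOBS = [
        ("av1_qsv", "3500k"),     # Intel Arc AV1 hardware encoder at 3.5 Mbps
        ("h264_nvenc", "6000k"),  # Nvidia H.264 hardware encoder at 6 Mbps
        ("h264_amf", "6000k"),    # AMD H.264 hardware encoder at 6 Mbps
    ]

    for encoder, bitrate in JOBS:
        output = f"out_{encoder}_{bitrate}.mp4"
        subprocess.run(
            [
                "ffmpeg", "-y", "-i", SOURCE,
                "-c:v", encoder,
                "-b:v", bitrate, "-maxrate", bitrate,  # cap the bitrate like a streaming target
                "-an",                                 # drop audio so only video quality differs
                output,
            ],
            check=True,
        )
        print(f"Encoded {output} with {encoder} at {bitrate}")

The resulting clips could then be compared by eye or with a quality metric such as VMAF.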

These tests suggest that as more devices and services add AV1 compatibility, they will deliver better-looking streams using less data. AV1 hardware encoding could be featured in next-gen GPUs from both Nvidia and AMD, set to arrive later this year.


 
If I'm off base here, please let me know, but AV1 decoding is for streaming, among other things. And one of the biggest gripes against the RX 6400 was that it lacked that feature.

So my question is, how many people stream their gameplay using a GPU with a 64-bit memory bus?
 
Wait, so it's a quality comparison of AV1 vs. H.264? It's hardly surprising that AV1 does better.

Arc has this and Nvidia and AMD don't, but it's still one codec beating another.

What would also have been interesting is a decoder comparison, i.e. how Arc compares to Nvidia and AMD using the same codecs.
 
Intel has always had pretty good hardware encoders, e.g. QSV, and AV1 is like two steps up from H.264.
H.264 is a known quantity, so it should be well optimised. Anyway, most streamers using NVENC will use H.265.
GPU encoders are not as good as CPU ones, so serious home movie encoders use the CPU; maybe a few use the GPU to apply some filters to speed things up.
H.265 is slow even on fast CPUs, and AV1 is brutal; obviously the big players like Google can spit them out quickly.
Funny that Apple hasn't embraced AV1. Next-gen TVs will have it, new mid-range-and-up Android SoCs will have it, and it is so much more bandwidth-efficient, but it really needs a hardware decoder.
Same evolution as H.265: most smart TVs play it off a USB stick now.

Google is going to embrace it, even though they have their own next-gen encoder.

Most home movie encoders will stick with H.265 for a few more years, as they understand how to tweak it and their devices play it.
But eventually they will move to AV1 as it becomes universal and well understood.
 
They are also known to completely botch their decoders: a few of their iGPUs launched with touted features for which working support never arrived. I don't remember NV or AMD doing that.
 
If I'm off base here, please let me know, but AV1 decoding is for streaming, among other things. And one of the biggest gripes against the RX 6400 was that it lacked that feature.

So my question is, how many people stream their gameplay using a GPU with a 64-bit memory bus?
The RX 6500 XT and RX 6400 were likely intended as GPUs for low-end gaming PCs, to compete against the GTX 1650 during the chip shortage.
Advanced video codec support was expected to be handled by the integrated GPU.
 
They are also known to completely botch their decoders: a few of their iGPUs launched with touted features for which working support never arrived. I don't remember NV or AMD doing that.
Arc dead. Shocked I tell ya.
 