'Fake HDMI 2.1' doesn't really bother HDMI Licensing Administrator

Daniel Sims

Facepalm: This week the administrator that handles HDMI licensing confirmed to TFT Central that the HDMI 2.1 label no longer strictly requires features like 4K120Hz, dynamic HDR, or variable refresh rate (VRR). Customers might want to look more carefully from now on.

Monitor-focused tech site TFT Central noticed a Xiaomi monitor advertising "HDMI 2.1*" in its specs despite being only 1080p (at 240Hz), with fine print explaining that the port's bandwidth only matches the HDMI 2.0 specification.

When the HDMI Forum announced the release of 2.1 in 2017, it highlighted improvements over 2.0, including 8K60Hz, 4K120Hz, VRR, Quick Media Switching, and Quick Frame Transport. Previously, the most significant change HDMI 2.0 saw over its lifespan was the addition of static HDR.

The HDMI Forum's stance on the Xiaomi monitor's labeling is confusing: it said HDMI 2.1 labels need not include any of those added features. According to the Forum, HDMI 2.0 is now just a subset of 2.1, so any device with functionality associated with either version can carry the HDMI 2.1 label.

This arbitrary labeling of the standard may become confusing for customers shopping for new displays, cables, and devices with features like 4K, 120Hz, dynamic HDR, eARC, or VRR. Just seeing “HDMI 2.1” on the box might not guarantee those features anymore. Consumers will now have to carefully read specification labels to ensure they purchase a display with the advanced HDMI 2.1 functionality they want.


 
Salespeople won't even use the words "2.1." Why? Because most are 20-year-olds who don't care about or even understand most tech things. They just know to sell you an HDMI cable. The rest is on the customer, because in most cases the salesperson is going to tell you they don't know.
 
This speaks fairly badly about the certification board: the freaking number should be *ALL I NEED* to know whether the cable works for what I want it to. So not only do I have to contend with a nonsensical version naming convention (2.0? 2.1? What's next, 2.1 version 2? 2.1 version 2 revision 2? USE MORE DAMNED NUMBERS 1, 2, 3, 4, 5 WHY IS IT SO COMPLICATED TO JUST DO THAT!?), but now I pretty much have to find independent reviews, or at the very least other consumer reviews, to see if it actually does what it's supposed to do.

At that point your certification just failed: Time to revoke EVERYONE and force all companies to start from scratch and either re-certify or remove the logos.
 
Someone point out the flaw in my logic, but as I understand it, specifying the HDMI version for a device tells you what cable you'd need to support every feature the device has, and specifying the version for a cable tells you that it supports every feature in that version. It doesn't mean that every feature in the specification of that HDMI version will be used, even if everything plugged into the cable is the same version. On a device, the HDMI version number is there only to provide maximum compatibility; it tells you nothing about the features supported (i.e., display resolution/refresh rate).

In this case, since this is a monitor, all it means is that technically buying an HDMI 2.0 cable won't negatively impact the experience at all. The label implies that at least one of the 7 features added in HDMI 2.1 is supported, but what the HDMI Forum is saying is that it doesn't have to mean even one of them is. HDMI version numbers on cables are a completely different case, though, because there the number should mean that every feature is supported. If an HDMI 2.1 cable did not support a feature in HDMI 2.1 that both devices supported with another cable, that would be news and false advertising.
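To put that in concrete terms, here's a toy sketch (all names and the feature list are hypothetical, not any real API) of why, under the new rules, a feature check has to look at a device's explicitly listed capabilities rather than the version label:

```python
# Toy illustration: the version label is ignored on purpose; only the
# explicitly listed capabilities tell you anything. Hypothetical data.

xiaomi_style_monitor = {
    "label": "HDMI 2.1*",                 # what the box says
    "features": {"1080p@240Hz"},          # what the spec sheet actually lists
}

def supports(device, feature):
    # Deliberately ignore device["label"] -- it no longer guarantees anything.
    return feature in device["features"]

print(supports(xiaomi_style_monitor, "VRR"))          # False, despite the 2.1 label
print(supports(xiaomi_style_monitor, "1080p@240Hz"))  # True
```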
 
I imagine that "Licensing Administrator" probably caved since people, myself included, are not happy that 2.0 was incapable of 4K120Hz UHD. Its like each time that they upgrade the spec, you have to get new hardware. Personally, I'm not buying a new HTR each time they change the spec. So the adoption of 2.1 is probably lagging and manufacturers wanted a ruse to get people to buy their crap.

I can see nothing but grief from customers because of faux 2.1 HDMI compatibility.
 
The article at wccftech explains this situation better; this one left me with too many questions, so I had to dig deeper. Having done that, it isn't too bad. As of now, 2.0 is dead and 2.1 has taken its place whether a device has 2.1 features or not. If consumers supposedly knew what came with 2.1, they can now educate themselves on this change, since 2.1 is nowhere near mainstream yet, and I doubt any device was expected to support every 2.1 feature anyway.
 
This speaks fairly badly about the certification board: the freaking number should be *ALL I NEED* to know whether the cable works for what I want it to. So not only do I have to contend with a nonsensical version naming convention (2.0? 2.1? What's next, 2.1 version 2? 2.1 version 2 revision 2? USE MORE DAMNED NUMBERS 1, 2, 3, 4, 5 WHY IS IT SO COMPLICATED TO JUST DO THAT!?), but now I pretty much have to find independent reviews, or at the very least other consumer reviews, to see if it actually does what it's supposed to do.

At that point your certification just failed: Time to revoke EVERYONE and force all companies to start from scratch and either re-certify or remove the logos.
I agree 1000000000000%
 
My 4K LG monitor was adamant that I use the cable that came with it. When I do buy that humongous OLED TV, I will make sure I get verified HDMI 2.1 cables - with most of the features - at least one with full features.
The thing is, most users won't know they are being shortchanged - they will probably just drop to lower settings.
Will wait 2 more years for receivers to iron out the kinks - for gaming, use eARC anyway.
Linus Tech Tips has a testing setup for cables - not cheap.
 
HDMI is an inferior, proprietary protocol, unlike DisplayPort. Just let it die, please.
Depends on your use case. HDMI is fine for consumer electronics like TVs, soundbars, and receivers, but if you are using computers/monitors, DisplayPort is better/preferred.
 
Let's make an analogy. Suppose that "HDR20" is introduced, with better minimum brightness, an even wider color space, etc., in a package of new features that HDR10 will inevitably be a subset of, because it came before and was improved upon. And now the glorious announcement: "all HDR10 devices are HDR20 from today." Or all DisplayPort 1.4 devices are DisplayPort 2.0 from today, because 1.4 is a part of 2.0? No! Nonsense! Bonkers!

The tech community needs to organize and find a law firm to prepare a case. There are plenty of firms that have successfully challenged Amazon, Apple, and other companies for various dubious practices. This is one of those cases. It stinks from a mile away. They cannot simply "choose" not to certify 2.0 devices from one day to the next. It does not make sense.

Most HDMI ports and chips for media content will remain 2.0 for years, apart from graphics cards, consoles, some monitors, some TVs, some projectors, AV receivers, and some high-end future PC motherboards. There was a clear dividing line between the two specs, and now they are desperate to remove it. Apart from mostly gaming-oriented features and eARC, which can be delivered without an overall increase in video bandwidth, the defining feature of HDMI 2.1 is the FRL data protocol, which allows speeds above 18 Gbps and repurposes the fourth "TMDS clock" lane on chips and in cables to transmit more data. Just because there is backward compatibility does not give them the right to abolish 2.0 and turn the entire consumer electronics market upside down without proper marketing safeguards in place. People are being taken for *****s.
 
I imagine that "Licensing Administrator" probably caved since people, myself included, are not happy that 2.0 was incapable of 4K120Hz UHD.
HDMI 2.0 is capable of 4K/120 "UHD". It's the bit depth and colour space that take a lot of bandwidth.
2.0 can do up to 18 Gbps - 4K/120 8-bit 4-2-0
2.1 can do up to 48 Gbps - from 4K/120 10-bit 4-2-0 (20 Gbps) up to 4K/120 12-bit RGB
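For anyone who wants to sanity-check those numbers, here's a rough back-of-envelope sketch in Python. It assumes standard CTA-861 4K timing (4400 x 2250 total pixels including blanking) and approximates line-coding overhead as 10/8 for TMDS and 18/16 for FRL; the function names are just for illustration, not any official tool.

```python
# Rough HDMI link-rate estimate for a video mode (ballpark only).
# Assumes CTA-861 4K timing (4400 x 2250 total incl. blanking) and approximates
# coding overhead as 8b/10b for TMDS and 16b/18b for FRL.

def bits_per_pixel(bit_depth, chroma):
    # RGB / 4:4:4 carry 3 full samples per pixel; 4:2:2 -> 2; 4:2:0 -> 1.5
    samples = {"RGB": 3.0, "4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return bit_depth * samples

def link_gbps(h_total, v_total, refresh_hz, bit_depth, chroma, signalling="TMDS"):
    pixel_clock = h_total * v_total * refresh_hz               # pixels per second
    payload = pixel_clock * bits_per_pixel(bit_depth, chroma)  # bits per second
    overhead = 10 / 8 if signalling == "TMDS" else 18 / 16     # line-coding overhead
    return payload * overhead / 1e9

# 4K/120 examples, 4400 x 2250 total timing assumed:
print(round(link_gbps(4400, 2250, 120, 8,  "4:2:0", "TMDS"), 1))  # ~17.8 -> fits HDMI 2.0's 18 Gbps
print(round(link_gbps(4400, 2250, 120, 10, "4:2:0", "FRL"),  1))  # ~20.0 -> needs HDMI 2.1 (FRL)
print(round(link_gbps(4400, 2250, 120, 12, "RGB",   "FRL"),  1))  # ~48.1 -> right at 2.1's 48 Gbps ceiling
```

Real timings and FEC overhead vary a little, so treat these as estimates rather than exact spec figures.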
 
I bet the HDMI certification board charges manufacturers more money to slap an HDMI 2.1 label on devices than HDMI 2.0. Who cares about consumer confusion when you can make more money off the certification process by selling the more valuable HDMI 2.1 stamp instead of the previous HDMI 2.0 stamp for the same hardware? Win-win for everyone but consumers: the HDMI consortium gets more money, manufacturers get to advertise compliance with the latest and greatest standard (even when they don't really comply with the main features of that standard), and consumers... just get confused buying hardware that might not do what they thought it could do.
 
I really don't care because my TV is 4K60Hz and it's what I use for gaming as well. All of the marketing BS about motion blur on 60Hz TVs was nothing more than that, utter BS. I've had my TV for about 5 years now and there's NEVER been any motion blur on anything that I've watched or any game that I've played.

It was just a scam to get people to pay more for the 120Hz panel even though it made no difference.
 
I really don't care because my TV is 4K60Hz and it's what I use for gaming as well. All of the marketing BS about motion blur on 60Hz TVs was nothing more than that, utter BS. I've had my TV for about 5 years now and there's NEVER been any motion blur on anything that I've watched or any game that I've played.

It was just a scam to get people to pay more for the 120Hz panel even though it made no difference.
I think Steve from Hardware Unboxed wants to talk to you - that's a claim made without doing any research. Motion blur is dependent on screen type, and refresh rate (now up to 540Hz on some screens) is just one of a myriad of factors - pixel response latency, input latency, the power of the GPU.
Then you have the human factor - most of us don't care, as our brains are very good at compensating. Like those who scream that MP3s are trash and that they can instantly tell an MP3 at VBR 0 from a FLAC file - no matter their system or the converters in the chain.
Generally my take is: if you move to a better system for, say, 2 months and then go back to the old one - and you are still happy with it - you know your limit.
Plus I find it interesting that it's mostly those with discretionary cash who are upgrading - even as their acuity and hearing are lessening, especially for music - the law of diminishing returns is amplified with age.
 
I really don't care because my TV is 4K60Hz and it's what I use for gaming as well. All of the marketing BS about motion blur on 60Hz TVs was nothing more than that, utter BS. I've had my TV for about 5 years now and there's NEVER been any motion blur on anything that I've watched or any game that I've played.

It was just a scam to get people to pay more for the 120Hz panel even though it made no difference.
Statements like this only show how consumer ignorance, even among those who think they know something, will help the HDMI Forum's latest move confuse people even more.

If you look into bandwidth requirements, even a 4K/60 image can use either the older or the newer spec, as follows:
HDMI 2.0 (TMDS signal) - 4K/60 8-bit RGB or 10-bit 4-2-2 needs 18 Gbps
HDMI 2.1 (FRL signal) - 4K/60 10-bit RGB needs 20 Gbps
Therefore, 4K/60 gamers who want RGB colours with 10-bit panels need to pay close attention to a monitor's or TV's HDMI port bandwidth. Until yesterday, the 2.1 label meant that a 4K/60 10-bit display could be driven with full RGB colours, without compromising on chroma subsampling. Vendors, sadly, are not required to advertise port speed, so it's going to be more difficult to figure out what a monitor can actually deliver.

As a rule, we have not had "HDMI 2.1 4K/60" monitors or TVs, for a simple reason. The slowest FRL signalling above 18 Gbps would be FRL3; such a port gives 24 Gbps of bandwidth. Some new monitors have it. Companies install such an HDMI 2.1 chip, again, for a reason: to drive a 4K/120 monitor. With this bandwidth, you can get 4K/120 10-bit 4-2-0, plus all the options from 4K/60. If you do not like a 10-bit chroma 4-2-0 image and prefer RGB, you switch from 4K/120 10-bit 4-2-0 to 4K/60 10-bit RGB mode. But you still need the HDMI 2.1 chip and bandwidth to be able to do this.

From now on, companies will be able to advertise 4K/60 monitors as HDMI 2.1 monitors, which is ridiculous, because the label does not tell you whether you are getting an 18 Gbps or a 24 Gbps port. Are you getting confused already? If you are, that's good, because you are starting to learn that something has gone really wrong with what they did.
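To make the port-speed ambiguity concrete, here is a small sketch using the nominal HDMI 2.1 FRL lane configurations (3x3, 3x6, 4x6, 4x8, 4x10, 4x12 Gbps) and the approximate bandwidth figures quoted above; the helper name is just illustrative.

```python
# Minimum FRL rate for a required link bandwidth, using the nominal HDMI 2.1
# lane configurations. A label that only says "HDMI 2.1" does not tell you
# which of these, if any, the port actually implements.

FRL_RATES = {            # lanes x per-lane Gbps -> total link rate (Gbps)
    "FRL1": 3 * 3,       # 9
    "FRL2": 3 * 6,       # 18
    "FRL3": 4 * 6,       # 24
    "FRL4": 4 * 8,       # 32
    "FRL5": 4 * 10,      # 40
    "FRL6": 4 * 12,      # 48
}

def minimum_frl(required_gbps):
    for name, rate in FRL_RATES.items():   # dicts keep insertion order (Python 3.7+)
        if rate >= required_gbps:
            return f"{name} ({rate} Gbps)"
    return "exceeds HDMI 2.1"

print(minimum_frl(20))  # 4K/60 10-bit RGB (~20 Gbps)   -> FRL3 (24 Gbps)
print(minimum_frl(18))  # 4K/120 8-bit 4-2-0 (~18 Gbps) -> FRL2 (18 Gbps); plain TMDS also manages this
```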

Also, note that gamers are not the only people on the planet who benefit from higher refresh rates. Scrolling through text is a better experience at 100/120 Hz. I do it every day and my eyes are thankful.

So, yes, it would be great if you started to care just a little bit, to begin with, because there are other people on the planet who also enjoy a 4K/60 setting, just like you. If you do start to care, the world becomes a better place for you and me, for all of us.
 
Plus I find it interesting that it's mostly those with discretionary cash who are upgrading - even as their acuity and hearing are lessening, especially for music - the law of diminishing returns is amplified with age.
And yet us old-timers, who have some experience and training, can still hear what actually needs to be heard. Can't hear up to 20K anymore? So what? There's so little musical information there, it doesn't matter.

The average mutt with a 1000-watt subwoofer in his car can't tell that the amp is clipping, and bass that low doesn't articulate until some 16+ feet from the car anyway. What you get is some a**hole riding down the street with every trim panel in the car vibrating, and no (supposed) "high fidelity sound", just a deviant audio spectrum and some nitwit who thinks of himself as "the gorilla king".

At least, once upon a time, when people walked into an audio store, they took the time to ask what they thought were intelligent questions. The biggie was, "What's the total harmonic distortion?" Oh, and they always seemed to have a copy of Consumer Reports under their arms.

I tried to tell them, "everything you like about the sound of the electric guitar is nothing but harmonic distortion." (Yes, and that's even with those "clean Fender amps" and Telecasters.) It's the intermodulation distortion that sounds like crap. But would they listen? Noooooo. So the next round of promotions concocted and introduced "slew rate distortion" as the true specification of an amplifier's performance.

So, to stay somewhat on topic: if today's marketers have the ballz to label some $9.95 computer speaker system as "high fidelity", and some 4-inch barely-low-frequency trash driver as a "sub-woofer", then I suppose they can get away with tagging whatever version of HDMI with whatever version number they choose.
 
The article at wccftech explains this situation better; this one left me with too many questions, so I had to dig deeper. Having done that, it isn't too bad. As of now, 2.0 is dead and 2.1 has taken its place whether a device has 2.1 features or not. If consumers supposedly knew what came with 2.1, they can now educate themselves on this change, since 2.1 is nowhere near mainstream yet, and I doubt any device was expected to support every 2.1 feature anyway.

I have a question for you.

A friend of mine wanted to buy a 1440p 120Hz HDMI 2.0 monitor for his PS5, since his 80" 1080p HDTV is pretty old and doesn't have a good response time for gaming. He asked my advice, and a bit of research showed that the PS5 only supports 1080p 60/120Hz or 4K 60Hz over HDMI 2.0, while the higher resolutions/refresh rates need 2.1. 1440p might be enabled at a later date, but there are no promises or info on what HDMI generation would be needed.

On further digging I also found that to use a 1440p monitor, you'd have to get one that upscales and/or downscales. More importantly, that wouldn't change the output from the PS5, so he could have 1080p 120Hz upscaled, which would look like crap, or 4K 60Hz downscaled, which would look good but wouldn't give him the refresh rate and fps he wants for better responsiveness.

I told him his best bet was to just get a 1080p 120Hz monitor. But am I to understand that 2.0 is being phased out and future 1440p monitors should come with 2.1 instead? More importantly, does this mean he could feed a PS5 4K 120Hz signal into one of these future 2.1 1440p monitors and get the 4K picture downscaled to 1440p with his desired 120Hz refresh?
 
Someone point out the flaw in my logic, but as I understand it, specifying the HDMI version for a device tells you what cable you'd need to support every feature the device has, and specifying the version for a cable tells you that it supports every feature in that version. It doesn't mean that every feature in the specification of that HDMI version will be used, even if everything plugged into the cable is the same version. On a device, the HDMI version number is there only to provide maximum compatibility; it tells you nothing about the features supported (i.e., display resolution/refresh rate).

In this case, since this is a monitor, all it means is that technically buying an HDMI 2.0 cable won't negatively impact the experience at all. The label implies that at least one of the 7 features added in HDMI 2.1 is supported, but what the HDMI Forum is saying is that it doesn't have to mean even one of them is. HDMI version numbers on cables are a completely different case, though, because there the number should mean that every feature is supported. If an HDMI 2.1 cable did not support a feature in HDMI 2.1 that both devices supported with another cable, that would be news and false advertising.
The "flaw in your logic" is that it does not work "as you understand it". See USB specifications for another example of this.
 
Wouldn't the easiest approach (you could still be scammed) be to only buy cables purporting to do 48Gbps? Apparently that covers 4K 120Hz at maximum colour depth, sound, etc.
One of the reasons I'm waiting until next year to buy a huge OLED (77 to 83") - I'll probably use it for 10 to 15 years - so I want it to handle the PS6 etc. There is a lot of pressure on the likes of Sony and Panasonic to get it right - all the big receiver manufacturers will want it as standard next year - plus you don't want a downgrade from a streaming service in the DRM chain. HDMI was pretty flaky when it first came out - you sometimes had to switch off and on to get the handshaking/DRM correct.
I think even LG TVs have limitations in their chips - not sure about Samsung TVs.
For PC, my LG 4K monitor came with a DisplayPort cable.
As I will eARC most inputs back to the receiver for sound - I may wait a few years depending on what's in the pipeline.
 