'Fake HDMI 2.1' doesn't really bother HDMI Licensing Administrator

I have a question for you.

A friend of mine wanted to buy a 1440p 120Hz HDMI 2.0 monitor for his PS5, since his 80" 1080p HDTV is pretty old and doesn't have a good response time for gaming. He asked my advice, and after a bit of research I found that over HDMI 2.0 the PS5 only supports 1080p at 60/120Hz or 4K at 60Hz; the higher resolutions/refresh rates need 2.1. 1440p might be enabled at a later date, but there are no promises or info on which HDMI generation would be needed.

On further digging I also found that to use a 1440p monitor you'd need one that upscales and/or downscales. More importantly, that wouldn't change the output from the PS5: he could have 1080p 120Hz upscaled, which would look like crap, or 4K 60Hz downscaled, which would look good but wouldn't give him the refresh rate and framerate he wants for better responsiveness.

I told him his best bet was to just get a 1080p 120Hz monitor. But am I to understand that 2.0 is being phased out and future 1440p monitors should come with 2.1 instead? More importantly, does this mean he could feed a PS5 4K 120Hz signal into one of these future 2.1 1440p monitors, giving him a downscaled 4K-to-1440p picture with his desired 120Hz refresh rate?
For now, all future HDMI 2.0 products will fall under the 2.1 label no matter what, even without any 2.1 features.

If downscaling from 4K to 1440p is too much of a performance loss, I would look towards a monitor or TV that supports VRR.

My current TV is a $150 RCA w/Roku, so I'm no expert on current-gen consoles or what exactly to look for in a TV to guarantee a good experience. I recently bought a monitor that was pure trash; my mistake was going off one good review and not much else, so it wasn't a great experience. All I can say is: read a lot of professional TV reviews from sources such as rtings dot com, plus as many customer reviews from as many sources as possible, and compare them to find consistencies.

Wish I had more for you.
 
HDMI 2.0 is capable of 4K/120 "UHD". It's the bit depth and colour space that take up the extra bandwidth.
2.0 can do up to 18 Gbps: 4K/120 8-bit 4:2:0
2.1 can do up to 48 Gbps: from 4K/120 10-bit 4:2:0 (20 Gbps) up to 4K/120 12-bit RGB
Thanks for the elaboration. So basically, what you are saying is not really any different from what I said. If you have a component in the system that is capable of something beyond what HDMI 2.0 supports, and you only have an HDMI 2.0 device acting as a relay for that signal, or a supposed HDMI 2.1 device that can't support everything the HDMI 2.1 spec allows (as in @captaincranky's analogy to the nut riding in his rattling rover with his subwoofer shaking his car apart), you're still screwed and your system cannot meet its full potential. :rolleyes:
 
Statements like this only show how consumer ignorance, even among those who think they know something, will help the HDMI Forum's latest move confuse people even more.

If you look into bandwidth requirements, even a 4K/60 image can fall under either the older or the newer spec, as follows:
HDMI 2.0 (TMDS signal): 4K/60 8-bit RGB or 10-bit 4:2:2 needs 18 Gbps
HDMI 2.1 (FRL signal): 4K/60 10-bit RGB needs 20 Gbps
Therefore, 4K/60 gamers who want RGB colours on 10-bit panels need to pay close attention to the monitor's or TV's HDMI port bandwidth. Until yesterday, the 2.1 label meant that a 4K/60 10-bit display could be driven with full RGB colours, without compromising via chroma subsampling. Sadly, vendors are not required to advertise port speed, so it's going to be more difficult to figure out what a monitor can actually deliver.
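
If you want to sanity-check these figures yourself, here is a rough back-of-the-envelope sketch of my own (not official HDMI spec math) that multiplies the standard CTA-861 4K frame timing of 4400 x 2250 total pixels by refresh rate, bits per pixel and the line-coding overhead (8b/10b for TMDS, 16b/18b for FRL). It lands on the same 18, 20 and 48 Gbps figures mentioned in this thread:

# Rough HDMI link bandwidth estimate (my own sketch, not official spec math).
# Assumes standard CTA-861 4K timing: 4400 x 2250 total pixels per frame (active + blanking).
TOTAL_PIXELS_4K = 4400 * 2250
SAMPLES_PER_PIXEL = {"RGB": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}   # effective samples after chroma subsampling
LINE_CODING = {"TMDS": 10 / 8, "FRL": 18 / 16}                 # 8b/10b vs 16b/18b encoding overhead

def link_gbps(refresh_hz, bits_per_component, chroma, signal):
    # Payload = pixels per second x bits per pixel, then add the line-coding overhead.
    payload = TOTAL_PIXELS_4K * refresh_hz * bits_per_component * SAMPLES_PER_PIXEL[chroma]
    return payload * LINE_CODING[signal] / 1e9

print(round(link_gbps(60, 8, "RGB", "TMDS"), 1))     # ~17.8 Gbps -> the "18 Gbps" HDMI 2.0 ceiling
print(round(link_gbps(60, 10, "RGB", "FRL"), 1))     # ~20.1 Gbps -> 4K/60 10-bit RGB needs FRL
print(round(link_gbps(120, 10, "4:2:0", "FRL"), 1))  # ~20.1 Gbps -> 4K/120 10-bit 4:2:0
print(round(link_gbps(120, 12, "RGB", "FRL"), 1))    # ~48.1 Gbps -> tops out a 48 Gbps link

One caveat: TMDS packs 4:2:2 into the same clock as 8-bit RGB, which is why 10-bit 4:2:2 also lands at 18 Gbps even though the naive multiplication above would give a smaller number. The RGB and 4:2:0 cases follow straight from the arithmetic.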

As a rule, we have not had "HDMI 2.1 4K/60" monitors or TVs, for a simple reason. The slowest FRL configuration above 18 Gbps is FRL3, and such a port gives 24 Gbps of bandwidth. Some new monitors have it. Companies install such an HDMI 2.1 chip, again, for a reason: to drive a 4K/120 monitor. With this bandwidth you can get 4K/120 10-bit 4:2:0, plus every 4K/60 option. If you don't like the 10-bit 4:2:0 chroma-subsampled image and prefer RGB, you switch from 4K/120 10-bit 4:2:0 to 4K/60 10-bit RGB mode. But you still need the HDMI 2.1 chip and its bandwidth to be able to do this.
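
To make the FRL point concrete, here is a small follow-on sketch (same caveats as above; the lane/rate pairs are the FRL configurations defined for HDMI 2.1 as I understand them) showing why FRL3 is the practical floor: both of the roughly 20 Gbps modes clear the 18 Gbps TMDS ceiling but fit comfortably in FRL3's 24 Gbps.

# FRL link rates: lanes x Gbps per lane, as defined for HDMI 2.1 (my summary; check your own hardware).
FRL_LEVELS = [
    ("FRL1", 3 * 3),    # 9 Gbps
    ("FRL2", 3 * 6),    # 18 Gbps
    ("FRL3", 4 * 6),    # 24 Gbps
    ("FRL4", 4 * 8),    # 32 Gbps
    ("FRL5", 4 * 10),   # 40 Gbps
    ("FRL6", 4 * 12),   # 48 Gbps
]

def min_frl(required_gbps):
    # Return the slowest FRL level whose raw link rate covers the requirement.
    for name, rate in FRL_LEVELS:
        if rate >= required_gbps:
            return name, rate
    return None, None  # nothing fits without DSC

# Both ~20 Gbps modes discussed above end up on FRL3 (24 Gbps).
for mode, gbps in [("4K/120 10-bit 4:2:0", 20.1), ("4K/60 10-bit RGB", 20.1)]:
    print(mode, "->", min_frl(gbps))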

From now on, companies will be able to advertise 4K/60 monitors as HDMI 2.1 monitors, which is ridiculous, because the label doesn't tell you whether you are getting an 18 Gbps or a 24 Gbps port. Are you getting confused already? If you are, that's good, because it means you're starting to see that something has gone really wrong with what they did.

Also, note that gamers are not the only people on the planet who benefit from higher refresh rates. Scrolling through text is a better experience at 100/120 Hz. I do it every day and my eyes are thankful.

So, yes, it would be great if you started to care just a little bit, to begin with, because there are other people on the planet who also enjoy a 4K/60 setup, just like you. If you do start to care, the world becomes a better place for you and me, for all of us.
You completely misunderstood my post. I never once said that it didn't matter (of course it does). I said that it wouldn't make a difference in my case because my TV is only 60Hz and it doesn't. For anyone who uses a TV for gaming, the difference would of course be readily apparent and for that reason, the HDMI administration shouldn't be so nonchalant about it.

I just pointed out that, for people who just use their TV as a TV, or people who game with a 60Hz TV (like me) it really wouldn't make a difference because NTSC is only around 30fps and PAL is only 25fps. Having a TV that is capable of 120fps is irrelevant in those cases because those standards (which are still used by the overwhelming majority of broadcasters) don't even come close to using the panel to its full potential.

I'm not saying that it's ok, because I NEVER consider false advertising to be ok. What I am saying is "Yes it sucks but if you're only using your TV as a TV, you should be ok." because it will be. Only people who use a 2160p computer monitor rated at >60Hz should be concerned and keep a sharp eye out for things like this.

I wasn't being ignorant, I was being reassuring. Anyone who is familiar with my activity here (I'm guessing that you're not) knows very well that I'm the antithesis of ignorant.
 
My current tv is a $150 RCA w/Roku, so I'm no expert in current gen consoles or what exactly to look for in a tv to guarantee a good experience. I've personally bought a monitor recently that was pure trash. My mistake was going off one good review and not much else, so it wasn't a great experience.
FWIW (and IMHO), anything 30" and under doesn't need 4K; 1440p will do just fine, and has plenty of resolution for anything you might want to do. With today's video card market being what it is, all 4K does is convince you that you need it, and make you spend money many of us simply don't have on a card to drive it.

As for bad buys in monitors, I've made a few myself. I use vertical orientation for the web, as it acts like a legal pad when participating in forums, so a rotating stand is a necessity. My biggest blunder came with a Samsung TN (!!) POS that I never could get the green tint out of. I never used it, and it's been boxed up in the back bedroom for pushing 10 years.

Anyway, for 60 Hz monitors, Acers seem to be outstanding, and not at all expensive. I have two up and running now, a 27" 1080p, and a 31" 1440p.

Keep in mind, any of my advice is coming from the viewpoint of someone who is heavily involved with imaging, and not at all with gaming. The 1080p is ideal for the web, as I can now read print without resorting to bifocals, as I once had to do with a monumental piece of garbage 24" Dell (CCFL), which I endured for years.

Also, the 10-bit color depth "1 billion color" monitors can't be utilized with less than a 750 Ti, and then only with a special 5xx "studio driver" from Nvidia. You still have to use the latest "gaming driver" for gaming (4xx series), and with it you only get 8-bit color depth (16.7 million colors) out of the panel.

Of course, your results, and needs, may vary.
 
This is a freaking outrage, so what's the point?... Bad enough I can't even get a d@mn RX 6800XT at MSRP a year after launch, smh.
 
Thanks for the elaboration. So basically, what you are saying is not really any different from what I said.
Whatever the configuration of devices, I just corrected the statement "HDMI 2.0 is not capable of 4K/120 UHD". It was not a correct statement, and I provided the numbers for the bandwidth requirements. Let's keep it simple and clear.
 
HDMI is an inferior, proprietary protocol, unlike DisplayPort. Just let it die, pls.
This is not helpful. HDMI is in more than a billion devices on the planet. It's not going to "die" any time soon. Any other proposals?
 
FWIW (and IMHO), anything 30" and under doesn't need 4K; 1440p will do just fine, and has plenty of resolution for anything you might want to do. With today's video card market being what it is, all 4K does is convince you that you need it, and make you spend money many of us simply don't have on a card to drive it.

I completely agree with you. The only reason we have these uber-high resolutions is that big panels can get pixelated at lower resolutions. A co-worker (who's retired and rich) who I always got along with really well actually GAVE me a Sharp Aquos 65" 1080p 3D TV, and it's a REAL Sharp (from before Hisense bought Sharp). I still haven't used it because I can't imagine it's going to look better than the TV I have.

My panel is a 55" 2160p60Hz and I only bought it because five years ago, it was available at Costco for less than $500CAD (delivered). It looked more or less the same as the other TVs on display, not that I cared because I knew that with no other TV next to it, it would look great (and it does). It was only a 60Hz but I ignored the "experts" warning of motion blur because I know that NTSC is 30fps and PAL is 25. These are not exactly what I would call "heavy lifting" for a panel that can do double that. The other nice thing is that it's not a "smart" TV so it doesn't spy on me and since I have my PC hooked up to it anyway, a "smart" TV would've been superfluous. I believe that the model # is 55E5500U. It's pretty bare-bones but I wouldn't use more than it has. Hell, it has 3 HDMI inputs, 1 HD component input and 1 composite input. I only use two HDMI ports, one for my PC and one for my PS4

It was over $300 less than a Samsung or LG and works flawlessly so I'm quite happy with the purchase I made. :laughing:
 