Consumer Technology Association sets official standard for 8K UHD TVs

Shawn Knight

Bottom line: Per the association, officially certified 8K sets must feature at least 33 million active pixels, with a minimum resolution of 7,680 pixels horizontally and 4,320 vertically in a 16:9 window. That's four times the detail of 4K UHD and 16 times that of standard HD.

The Consumer Technology Association recently announced the official industry display definition and logo for 8K Ultra HD (UHD) televisions.

Televisions officially designated with the 8K badge must also have one or more HDMI inputs supporting 7,680 x 4,320 pixels, a bit depth of 10 bits, and frame rates of 24, 30 and 60 frames per second.
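The numeric requirements above are easy to capture as a quick sanity check. A minimal sketch (the `CTA_8K_MIN` table and `meets_cta_8k` helper are my own names; only the thresholds come from the article):

```python
# Core CTA 8K UHD thresholds as reported in the article.
CTA_8K_MIN = {
    "h_pixels": 7680,
    "v_pixels": 4320,
    "active_pixels": 33_000_000,
    "bit_depth": 10,
    "frame_rates": {24, 30, 60},   # fps the set must support
}

def meets_cta_8k(h, v, bit_depth, frame_rates):
    """Return True if a display meets the core CTA 8K thresholds."""
    return (
        h >= CTA_8K_MIN["h_pixels"]
        and v >= CTA_8K_MIN["v_pixels"]
        and h * v >= CTA_8K_MIN["active_pixels"]
        and bit_depth >= CTA_8K_MIN["bit_depth"]
        and CTA_8K_MIN["frame_rates"] <= set(frame_rates)
    )

print(meets_cta_8k(7680, 4320, 10, [24, 30, 60]))  # True: 33,177,600 pixels
print(meets_cta_8k(3840, 2160, 10, [24, 30, 60]))  # False: a 4K set
```

Note that 7,680 x 4,320 works out to 33,177,600 active pixels, which is where the "at least 33 million" figure comes from.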

What’s more, the CTA mandates that 8K sets be able to upscale SD, HD and 4K video to 8K resolution. This should ensure that video won’t look like total crap in the absence of native 8K content.
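For a sense of what "upscale to 8K" means at its crudest, here is a toy nearest-neighbor upscaler in plain Python. Real sets use far more sophisticated interpolation and motion-aware processing; this sketch just duplicates pixels:

```python
def upscale_nearest(img, factor):
    """Nearest-neighbor upscale: each source pixel becomes a factor x factor block."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in img
        for _ in range(factor)   # repeat each expanded row `factor` times
    ]

# A 2x2 "image" blown up 2x becomes 4x4.
small = [[1, 2],
         [3, 4]]
big = upscale_nearest(small, 2)
# big == [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Going from 1080p (1,920 x 1,080) to 8K is a clean 4x duplication in each axis, which is why even naive upscaling of HD can remain watchable: no pixel information is invented, only repeated, and smarter scalers then filter or synthesize detail on top.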

Sets must also adhere to “HDR transfer functions and colorimetry as specified by ITU-R BT.2100; and HDCP v2.2 or equivalent content protection.”

It’s worth noting that inclusion in the 8K certification program is completely voluntary, although manufacturers will almost certainly want to participate since they’ll get to plaster the new 8K logo on their sets starting January 1, 2020.

The CTA in its latest Sales & Forecasts report estimates a total of 175,000 8K UHD TV sets will be sold in the US in 2019, raking in around $734 million in revenue. Naturally, growth is expected over the coming years as technology matures and prices come down.

Masthead credit: 8K by Ron Dale


 
I'm assuming Apple will soon have a 9K monitor available for $20,000 with a $4K stand. So you can say, "Oh yeah? You got a 4K monitor, well I got a $4K *stand*!"
 
Consumer Technology Association, good morning!

A useful spec would state that it is criminal for an 8K panel not to have HDMI 2.1 or HDR-1000 at the least. Otherwise, it's a big yawn!

New QLEDs are coming with support for the new HDR-1500 standard; those are the ones to have.

Patiently waiting for the new video cards with HDMI 2.1, as I always have my TV connected to the PC. Only then can I consider buying an 8K TV.
 
It feels to me like 4K TV prices plummeted faster than 1080p prices did.

4K televisions basically sat on shelves and didn't move when you could get 1080p TVs in 50", 60" and 65" for less than $800, while the really big 1080p TVs at 70" or larger ran $1,500 - $2,500.

NOW: a 4K 70" Smart TV will run less than $900 on average - less than $800 during holidays.

So my question is, how long will consumers have to hold off on 8K models before their prices fall as well?

1 year?

2 years?

This planned obsolescence is ridiculous.

And then they'll start making 8K smartphone cameras which will need no less than 512GB storage, but by that time 1TB and 2TB will be the norm (2 years from now).
 
@QuantumPhysics. I'm betting we are about to move away from resolution standards, and into a future where video content will all have dynamic resolution and refresh rates. The new upcoming panel-based TVs are the first indication of that. You can already assemble an 8K, 16K or 32K TV panel. I feel that 8K is gonna be the last resolution to be considered a standard. After that it'll be all dynamic, i.e. both resolution and refresh rate will be custom or even dynamically customizable.

In fact, with HDMI 2.1, the refresh rate is already there, it is customizable. The same awaits the resolution, IMO.
 
Umm, I have a problem: when are we going to get 4K terrestrial transmissions, when most terrestrial channels haven't even got 1080p yet? Even cable and satellite companies haven't got all their channels in 4K, and you have to pay extra on top of the subscription. And that's not even the main question, which is bandwidth: most Western countries struggle with downloading 1080p, let alone 4K. I think only two countries in the world have internet fast enough for 8K, and they're the ones making the TVs, so they benefit from selling them, even though they said 4K was the best you could view, almost lifelike, they said. So when does it STOP... sigh
 
Another wank fest from manufacturers to get people to upgrade yet again to technology that will not benefit them one bit. How about making 4K OTA content widespread? Hell, in Australia our TV stations only moved to FHD (and 1080i at that) a few years ago, and it's still garbage, super-compressed quality. God knows when we'll see non-streaming services move to 4K. 8K would be great if you want an 80"+ TV and have a crap load of money to waste.

While I can see a big improvement from FHD to 4K on a 55"+ TV at 3 m, the difference from 4K to 8K would be much smaller. 8K will also use a lot more power, and how bad will normal FHD content look even with upscaling?
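The 4K-vs-8K point can be put in numbers with a little angular-resolution arithmetic. A rough sketch (the `pixels_per_degree` helper is my own; ~60 px/degree is the commonly cited 20/20-vision limit of about one arcminute per pixel):

```python
import math

def pixels_per_degree(diag_inches, h_pixels, distance_m):
    """Approximate horizontal pixels per degree of visual angle for a 16:9 screen."""
    width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 geometry
    angle_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return h_pixels / angle_deg

# 55" screen viewed from 3 m: FHD, 4K, 8K
for h in (1920, 3840, 7680):
    print(h, round(pixels_per_degree(55, h, 3.0)))
# 1920 -> ~84, 3840 -> ~167, 7680 -> ~335 px/degree
```

At 3 m on a 55" panel, even 4K is already well past the acuity threshold, which is consistent with the commenter's point: the FHD-to-4K jump is visible, but the 4K-to-8K jump is much smaller at living-room distances.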
 
8K is excessive for normal consumers. They won't likely be able to tell the difference between 4K and 8K in ordinary use.

Most people who say this have no clue. I have a good 4K monitor here plus a 4K TV, both with HDR support. When I play native 8K content on them, which gets down-scaled to 4K, it looks hugely better than any native 4K content out there. So I know that the day I switch over to 8K screens, it will look a lot better.

If you haven't tried, then you haven't seen it, and you do not know. 8K is a huge improvement. Your eyes may often fail to detect difference in individual pixels, but the mind perceives the whole as another level in realism.
 

Here is another problem with 8K, and why streaming 8K is VERY far off: bandwidth and data speeds. Most people don't have over 50 Mbps lines, and those who do might have data caps. The prices of services like Netflix will increase too; bandwidth consumption is going to be a real ***** unless we revolutionise the Internet.
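The arithmetic backs this up. Uncompressed 8K at the CTA's 10-bit, 60 fps spec (assuming three color channels with no chroma subsampling, so treat this purely as an upper bound; real streams are heavily compressed):

```python
def raw_bitrate_gbps(h, v, bits_per_channel, fps, channels=3):
    """Uncompressed video bitrate in gigabits per second."""
    return h * v * channels * bits_per_channel * fps / 1e9

rate = raw_bitrate_gbps(7680, 4320, 10, 60)
print(f"{rate:.1f} Gbps")  # ~59.7 Gbps raw
```

Even with a modern codec squeezing that by a couple of hundred to one, an 8K stream would plausibly still sit in the high tens of megabits per second, which is exactly the territory where 50 Mbps lines and data caps start to hurt.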
 
How are you "dynamically" going to increase pixel density in a display? Can't create something out of nothing.
Pixel-to-vector mixed compression (semi-vector compression), combined with AI-based approximation. Plus there are already AI-based up-scaling solutions that are way better than regular pixel up-scaling with anti-aliasing.
 
Pixel-to-vector mixed compression (semi-vector compression), combined with AI-based approximation. Plus there are already AI-based up-scaling solutions that are way better than regular pixel up-scaling with anti-aliasing.
Or to put it in other terms, to expand the concept... yes, you can create something out of nothing, as long as it fits what the viewer expects to see.

Think for a moment if the task were given to a human and it was one frame, a still image. If one had painting talent, like many art school grads do, let's pretend they are given a 640x480 face printed on a 30-inch medium that accepts their paint type of preference.

Is it so hard to conceive that they could add lines, colors/tones and shading to greatly improve the detail in this image? Have you seen any museum art galleries? Photorealism from scratch is something a talented painter can pull off.

With a bit of artistic skill, and preferably a drawing tablet, one could take a 640x480 image of a face and paint in the lines and shadows thrown out by the lack of dots to represent them.

The face isn't important, the res isn't important. What's important to make it fly is that the additions are context-aware, aka whether they fit what people expect in a face, bridge, building, etc.
If they do, the majority of people will never see them.

30 of these images per second is a double-edged sword. On the one hand, you see each one for a lot less time. On the other, you add the challenge that anything added had better follow the spot it was added to: if the face gets closer to the camera, everything had better scale, and the same goes if the actor turns 90 degrees or if the camera is moving on any living or inanimate subject.

If you look into it, there are freely available coding libraries that can track and identify subjects or features in an image; see OpenCV. Getting from there to having AI paint with that data is no small feat, but it's certainly within the realm of the possible now, and getting easier every day.

If you limit it to source content that can accept delay (aka games are out), the task is way easier. If you add, say, a 2-second delay on channel change, the AI isn't racing the clock to produce and validate each frame. The major risk with smaller windows is when you go from an Olsen twin to the shot of the bridge: the AI needs to realize that the context of what it's supposed to paint is different and that its previously tracked points are invalid. If the AI gets caught out (er, probably shouldn't say "with its pants down", lol), you end up with the actor's face (the upscaler's painted additions) as part of the bridge, or some weird jumbled mess in between the two frames, or what would be a new twist on the term "compression artifact". The more frames the AI can churn through ahead of what is being blasted to your eyes, the less likely this is to happen.
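The "realize the context changed" step described above has a well-known cheap first pass: a scene-cut detector that compares consecutive frames and resets tracking when the difference spikes. A toy version over grayscale frames (flat lists of 0-255 values; the threshold of 40 is an arbitrary illustration, not a standard value):

```python
def is_scene_cut(prev_frame, next_frame, threshold=40.0):
    """Flag a hard cut when the mean absolute pixel difference jumps."""
    total = sum(abs(a - b) for a, b in zip(prev_frame, next_frame))
    return total / len(prev_frame) > threshold

steady = [100] * 64   # frame of mid-gray pixels
drift  = [105] * 64   # nearly identical next frame: no cut
cut_to = [220] * 64   # abrupt brightness change: cut detected

print(is_scene_cut(steady, drift))    # False (mean diff = 5)
print(is_scene_cut(steady, cut_to))   # True  (mean diff = 120)
```

On a detected cut, an AI upscaler would discard its tracked points and start painting from scratch, instead of smearing the actor's face onto the bridge.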

If the source content were compressed in a format that added vector info, I'd bet that at least the task of keeping things where they should be gets exponentially smaller.

I'm an AI noob, but I have a programming background and some friends who are anything but AI noobs. I'd think this is the sort of task that would be accomplished with something like a generative adversarial network, but I could be off base.
 
If you haven't tried, then you haven't seen it, and you do not know. 8K is a huge improvement. Your eyes may often fail to detect difference in individual pixels, but the mind perceives the whole as another level in realism.
Did you just say the mind perceives what the eyes cannot see?
 