The world's first 8K TV channel has kicked off with masterpiece '2001: A Space Odyssey'

mongeese

Through the looking glass: Japan's national broadcaster NHK launched the world's first 8K TV channel yesterday. Fittingly, NHK provided 8K televisions and dedicated satellite dishes to guests at the launch ceremony, which celebrated the beginning of 8K television with the "masterpiece of film history," 2001: A Space Odyssey.

Adoption of 8K television is still a pipe dream for most of the world, but in Japan the government has pushed networks to be at the forefront. NHK has happily complied, and its research and development over the last few years has culminated in a dedicated 8K TV channel.

For the first-ever 8K television broadcast, NHK wanted to show a classic, something everyone could respect. Unfortunately, only the most recent films have been shot in 8K, and none of them suited the occasion. To work around this, NHK asked Warner Bros. to scan the original negatives of the 1968 sci-fi classic 2001: A Space Odyssey, which happened to have been shot on 70mm film, the highest-quality format of its day.

According to NHK, "the many famous scenes become even more vivid, with the attention to detail of director Stanley Kubrick expressed in the exquisite images, creating the feeling of really being on a trip in space, allowing the film to be enjoyed for the first time at home."

In addition to scanning the film in 8K, NHK also upgraded the audio to support the 22.2-channel system it will be broadcasting with from now on. The audio is of such high fidelity that it requires a cable of its own, on top of the four HDMI cables the picture needs. Plugging into one end of the cables is Sharp's $2,200 satellite receiver, and at the other end is Sharp's $14,000 8K Aquos TV.
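For those curious why a single cable can't carry the picture, here is a rough back-of-the-envelope sketch in Python. The frame rate and pixel format below are illustrative assumptions, not NHK's published transmission spec, but they show the order of magnitude involved.

```python
# Rough sketch of why one HDMI 2.0 link can't carry the 8K picture on its own.
# The 60 fps frame rate and 10-bit 4:2:2 pixel format are assumptions for
# illustration, not NHK's published spec.

width, height = 7680, 4320          # 8K resolution
fps = 60                            # assumed frame rate
bits_per_pixel = 20                 # assumed 10-bit luma + subsampled chroma (4:2:2)

video_gbps = width * height * fps * bits_per_pixel / 1e9
hdmi20_gbps = 18.0 * 8 / 10         # 18 Gbps link rate minus 8b/10b encoding overhead

print(f"Raw 8K video payload: {video_gbps:.1f} Gbps")       # ~39.8 Gbps
print(f"One HDMI 2.0 cable:   {hdmi20_gbps:.1f} Gbps")      # ~14.4 Gbps
print(f"Cables needed (min):  {video_gbps / hdmi20_gbps:.1f}")  # ~2.8; blanking overhead pushes it to 4
```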

The new 8K channel will broadcast exclusively in Japan for around 12 hours a day. Rather than upscaling 4K video as Samsung has been hoping to do, NHK will be filming new content and events and playing recent high-budget 8K movies. Starting soon, it will also send a film crew to Italy to present "popular tourist attractions from Rome, as well as food, culture and history." In March, it will show the 1964 hit My Fair Lady starring Audrey Hepburn, another film shot on 70mm.

All this is of course in preparation for the 2020 Tokyo Olympics, which NHK will be broadcasting live in 8K. A major part of the Japanese government’s plan, the Tokyo Olympics may finally make 8K something worth considering. Communications Minister Masatoshi Ishida says that he hopes Japan can play a leading role in 8K broadcasting.

Will 8K ever become mainstream? In time, almost certainly, but it may be adopted first on VR displays and close-up screens like tablets, where the extra pixels are easier to appreciate. On televisions it's harder to see a difference from other high resolutions like 4K or 6K at typical viewing distances. For reference, Sharp's 70" 8K TV has 126 pixels per inch, nearly 40% more than a 24" 1080p monitor, a screen you'd sit at least six feet closer to.
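The pixel-density comparison is easy to verify. A minimal sketch, using the display sizes and resolutions quoted above:

```python
import math

def ppi(diagonal_inches: float, h_pixels: int, v_pixels: int) -> float:
    """Pixels per inch of a display, given its diagonal size and resolution."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

tv_ppi = ppi(70, 7680, 4320)       # Sharp's 70" 8K Aquos
monitor_ppi = ppi(24, 1920, 1080)  # a typical 24" 1080p desktop monitor

print(f'70" 8K TV:         {tv_ppi:.0f} ppi')                       # ~126 ppi
print(f'24" 1080p monitor: {monitor_ppi:.0f} ppi')                  # ~92 ppi
print(f"Difference:        {tv_ppi / monitor_ppi - 1:.0%} higher")  # ~37%, i.e. "nearly 40%"
```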

The irony of choosing a sci-fi classic to demonstrate what is nearly sci-fi technology hasn’t gone unnoticed. But by making 8K content available, NHK has finally created a use case that may bring enough consumers to the table to merit the development of cheaper 8K televisions.


 
Yet more proof the entire mindless pixel rat-race is entirely marketing driven... The last time I checked, digital scans of older 35mm films essentially maxed out at or just below 4K due to limitations of the silver-halide medium itself. In other words, you could build a 16,777,216K TV, saturate it in gushing over-hype, and you still wouldn't be able to extract or display more than 4K's worth of perceivable pixels from anything originally shot on decades-old 35mm without the grain that's ingrained in the film itself washing out the supposed extra detail. It's like scanning an old photo: 150 to 300 to 600 ppi may show improvements, but there comes a point where 1200, 2400, 4800, 9600 ppi is just a waste of space. This effect is already blatantly obvious on existing 1080p Blu-rays of older movies, e.g. Terminator 1.
 
Very true.... but 2001 was shot on 70mm film, not 35mm....

The thing is, if you’re watching on a TV, the human eye cannot perceive the difference between 1080p and 4K (let alone 8k) unless the screen is like 80” or more - and even then it’s a very small difference!

The main reason to buy a 4K TV is simply that if you want HDR and all the new features, you have no choice but to get 4K. Alas, that will probably be the case years from now once 8K is mainstream as well. Any new TV sold 15 years from now will be 8K and we won't have a choice.
 
You guys really need to get your eyes checked. Everyone under 50 can see the difference between SD, HD, UHD and 8K, at any distance from the screen. Every time there is an improvement in technology, the Luddites come out of the woodwork complaining about... what exactly? Go back to your standard-definition CRT and rabbit ears. The world sees the difference that you don't, and we are happy with it.
 
I feel like a mutant freak when people say stuff like that, because I can absolutely see the difference on any screen over 24-25 inches or so.

Only on phones and tablets would it not be useful.
 
The thing is, only a few people think they need this, and even those don't. What do you want to see, nipple hair?
 
I'm 26 years old and recently got lucky: I managed to nab Sony's highest-end (non-OLED) 55-inch TV with a free UHD Blu-ray player (it also came bundled with the latest Spider-Man movie on UHD).

I have four UHD Blu-rays at the moment; Solo: A Star Wars Story came with both the 4K and 1080p versions of the movie. Honestly, with the exception of the HDR effects, it's genuinely hard to tell them apart. If I get up close to the screen, sure, I can see the difference, but if I sit back on my sofa like normal and swap discs to compare, image-quality-wise it's extremely difficult to see any difference.

It's not just me either; my girlfriend and a bunch of my mates have been round to take a look and done the same thing. Pretty much all of them noticed the difference when HDR was enabled, but in terms of actual image quality, pretty much all of them thought it was the same disc just with HDR disabled.
 
Several years ago the Japanese said they were going to broadcast the Tokyo Olympics in 8K, and by golly it looks like they're following through, even though the technology has barely begun to roll out. In such matters the Japanese are far-sighted and enthusiastic, and they'll make it happen.

My Fair Lady is a 1964 movie. The stage play was 1956.
 
That's odd. I'm not visually acute, but I definitely notice a difference even between 2K and 4K. Are you sure it's a genuine 4K Blu-ray player and not one that advertises it but really just upscales? It could also be the movies.
 
I love this development. It opens up the opportunity to buy TVs larger than 70", be able to see the difference, and have no pixelation. Even at 85", 8K already makes sense, as the difference is already there.

And when it comes to a desktop PC, 8K does make a difference when sitting right in front of it. Check out this video... the image of the steak does it justice :)

only the most recent films have been shot in 8K, and they didn’t suit the occasion
Meaning, the movies weren't worth the equipment they used. That's how Hollywood does it these days, getting crappier every year.
 
If you want to see a difference you obviously need to be closer. When setting up a living-room TV, the size of the TV and the distance you sit from it should complement each other. Otherwise you are definitely overpaying on the UHD premium.
 
Yeah, but let those suckersssssss buy those 8K TVs so it drives competition and means cheaper TVs for us peasants with 1080p and 2K. When technology hits a consumer limit, so to speak, it ends up stagnating. Those consumers are the carrot and the company is the rabbit. Without the carrot, the rabbits have no motivation, direction or reason.
 
The more empty the mind, the bigger the screen, I guess. Who can afford a 70"+ screen, and a living room of appropriate size, nowadays? And why? Given that movie production gets dumber and dumber, it's not even worth it.
 
Great point. It looks like movies shot entirely on IMAX film are the only ones that will scan properly in 8K. The rest are gimmicks.
 
I noticed the difference from 1080p to 4K after getting my eyes tested (I have low astigmatism); once the contacts or glasses are on, boom, the difference is real!
 
So if you ever went to a cinema, for that huge screen, you must have a completely empty mind then, according to your own logic. What a dumb comment!
Of course, the cinema's screen is far larger than anyone's TV... yet it "only" has 4K resolution, at most.
 
Isn't it amazing the number of people who always post comments on articles about higher TV display resolutions claiming they can't see the need for it? They must live in a world of straight horizontal and vertical lines.
There also appears to be a group of people who like seeing individual pixels, I'm surprised if they use anything above 768x576.
 
I suggest you stand about 8 feet from a 60" 1080p TV and then 8 feet from a 60" 4K TV and be honest with yourself... yes, you can see the difference when you are a foot or two from the screen, but there have been plenty of studies showing that the vast majority of people cannot tell the difference at the "optimal viewing distance" unless the screen is really big. HDR is the reason to go 4K, because it's virtually impossible to get a 1080p TV that supports HDR10... 8K, unless your screen is crazy big, will be even less useful.

Not to mention that 4K media is still pretty rare; when upscaling content, the difference becomes even smaller.
 
I'm not enthralled with the move to 4K via sheer pixel count, but there is a visible benefit to fully supporting all the features in Ultra HD Premium certification via HDR10 and/or Dolby Vision.
 
I suggest you stand about 8 feet from a 60” 1080p tv and then 8 feet from a 60” 4K tv and be honest with yourself... yes, you can see the difference ...
Should leave it at that, unless you need your eyes testing. If you can't make out pixels that are almost 1 mm across on a 60" FHD TV from 8 feet away, your eyesight definitely needs correction. Or maybe you're one of those oddities who likes looking at pixels rather than pictures.
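For anyone who would rather settle the distance argument with numbers than eyesight, here is a minimal sketch using the common 1-arcminute rule of thumb for 20/20 acuity (itself a simplification). It puts a 60" 1080p pixel at roughly 0.7 mm, which subtends almost exactly one arcminute at 8 feet, right at the threshold of what a 20/20 eye can resolve, which may be why both camps are so sure they're right.

```python
import math

# Pixel pitch of a 60" 1080p TV and its angular size at 8 feet, checked
# against the ~1 arcminute rule of thumb for 20/20 visual acuity.

diagonal_in = 60
h_px, v_px = 1920, 1080
viewing_distance_m = 8 * 0.3048                      # 8 feet in metres

pixel_pitch_mm = diagonal_in * 25.4 / math.hypot(h_px, v_px)
angle_arcmin = math.degrees(math.atan((pixel_pitch_mm / 1000) / viewing_distance_m)) * 60

print(f"Pixel pitch:            {pixel_pitch_mm:.2f} mm")    # ~0.69 mm
print(f"Angular size at 8 feet: {angle_arcmin:.2f} arcmin")  # ~0.97', right at the 20/20 limit
```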
 