Game of Thrones cinematographer says it's your fault The Long Night was too dark

BS! I've been a broadcast engineer for 20 years, around 10 of them as an MCR (Master Control Room) engineer, and I know video and audio standards in every format, including QC. I saw the 40-minute documentary on how the episode was shot, and there is enough light on the set; for obvious CINEMATIC REASONS they decided to tone it down, and it is too dark, with the blacks crushed beyond reasonable explanation. I believe the raw material is completely OK and all this darkness was added in post-production. I understand what they wanted to express, but that was really way too much, and believe me, I know how to correctly set up my two LG screens, one LED + HDR, the other OLED Dolby Vision. It's too freaking dark, and if you play with brightness or contrast you just make it worse: either everything goes grayish and the colors wash out, or you leave it as intended and then it's 75% of the time so dark, with some extremely bright spots (fire sources), which is very annoying. No balance at all. I call it a sloppy ****ery job, and the HBO/GOT team should take into account that this is meant for TV/streaming, where the cinematic part doesn't really fit. In other words, they ****ed up massively in post-production and QC.
Well, if you'll take notice, (but who really reads those damned credits anyway?), there will always be a credit for someone known as the "color timer".

Herewith my Google search results to the question, "what does a movie color timer do".

https://www.google.com/search?client=opera&q=What+does+a+movie+"color+timer"+do%3F&sourceid=opera&ie=UTF-8&oe=UTF-8

As you already know, movie stock is color negative, and the process of balancing the color in the final positive print is very similar to color-negative printing on paper, save for a special machine used to flip the negative to positive without the need for a test print.

My other point is this. In reference to "available contrast", black is always black, but the maximum contrast available is always dependent on the maximum white level, or "absolute brightness".

The lower the maximum incident light, the closer it is to black in F-stops or exposure value. Consequently, less contrast can be realized in the scene.

Photoshop measures this on a 0 to 255 scale, representing absolute black to absolute brightness, or 8 photographic F-stops. Should the maximum brightness be less than 255, Photoshop can stretch it out to full brightness, at a severe sacrifice in intermediate values.

In simpler terms, if you only have 4 F-stops of light level above black, you can either tolerate the low contrast available, or you can add contrast artificially, at the expense of lost shadow detail and washed-out highlights.
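To put a number on that trade-off, here's a toy sketch (my own, in Python; the pixel values are invented) of what stretching an under-exposed 8-bit image to full brightness does to its intermediate tones:

```python
# A dim 8-bit frame whose brightest pixel is only 64, i.e. it uses a quarter
# of the 0-255 scale. (Toy data; the episode's real levels are unknown to me.)
dim_levels = list(range(0, 65))            # every tone actually present: 0..64

# Stretch to full brightness the way a levels tool would.
stretched = sorted({round(v * 255 / 64) for v in dim_levels})

# The range now spans 0..255, but only 65 distinct tones survive, so 191 of
# the 256 possible intermediate values are simply missing (posterization).
print(stretched[0], stretched[-1], len(stretched))   # 0 255 65
```

The stretched image looks brighter, but the gaps between its remaining tones show up on screen as banding and lost gradation.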

The techniques used in architectural photography, such as using very long exposures to capture existing lighting in the scene, and then lighting up the area with massive doses of flash in a secondary short exposure, are well beyond merely impractical for motion picture work. Trying to light a movie set by firelight alone would obviously present the same types of difficulties.

But you probably know all that.
 
Last edited:
Maybe someone has brought this up in the comments already (I haven't read them all yet), but I can't believe the article didn't mention the use of low budget TVs which will not show proper blacker-than-black scale no matter how well the TV is adjusted.
I just went through this with a colleague two weeks ago. He knows that I am the local audio/video 'know-it-all' so he asked for my suggestion for the latest top 80" TVs. I gave him a list that ranged in price from $4000-7500, which he guffawed at. He wanted a selection of top 80" TVs that were priced well below $1000. I told him to just go to Walmart and pick whatever he wanted off the shelf; at that price it didn't matter, they just suck. Then he said that he planned to get it from Sam's because they would have a better $800 80" TV than Walmart would have. "OK, you do that, then go hang it above your fireplace at 7' above the floor."
 
Maybe someone has brought this up in the comments already (I haven't read them all yet), (statement of the obvious) edit but I can't believe the article didn't mention the use of low budget TVs which will not show proper blacker-than-black scale no matter how well the TV is adjusted.
Well first of all Mr S, there is no such thing as "blacker than black". The term "black" is an absolute, meaning the absence of ALL light, and hence all color.
I just went through this with a colleague two weeks ago.
At this point in your post, one has to wonder if the opposite were true, and this poor man had to "go through it with you", suffering greatly in the process.

He knows that I am the local audio/video 'know-it-all' (*), so he asked for my suggestion for the latest top 80" TVs. I gave him a list that ranged in price from $4000-7500 which he guffawed at.
(I got quite a kick out of your price range as well). You apparently read something in an AV or PC oriented magazine about "the most expensive TVs you can find", and found it obligatory to pass it on so as to present yourself as more knowledgeable than you in reality are. In fact, people like you are a buyer's worst nightmare: the "salesman" or "friend" who simply doesn't give a crap how much of someone else's money they spend. Does that give you some sort of perverse and vicarious pleasure?

He wanted a selection of top 80" TVs that were priced well below $1000. I told him to just go to Walmart and pick whatever he wanted off the shelf, at that price it didn't matter, they just suck. Then he said that he planned to get it from Sam's because they would have a better $800 80" TV then Walmart would have
(There was absolutely no need to add the "would have" at the end of the sentence. The price comparison had already been made; the rest is obviously poor syntax). It is a fair assumption that Sam's Club (which is owned by Walmart; "Sam Walton" is the origin of both "Walmart" and "Sam's Club", get it?) might have a slightly better price on the same item, since Sam's Club is really just an "Amazon Prime" for walk-in traffic.

The truth of the matter is, most of the panels for TVs are made by a limited number of manufacturers.

I have read about people whining bitterly over their just-purchased high end TVs breaking. You know, sometimes you're not paying for quality, just the name. Whereas, I have a low end "Dynex", (former Best Buy house brand, supposedly lower quality than their "Insignia" offerings). It came out of the box working perfectly, and has done so, with a great picture, for the past 15 years. (Hook it up to a computer and you'll be told it was made by "Funai", a huge Japanese OEM).

Just to clarify, I also have 3 other (budget) flat panel TVs of varying sizes. I've never had a stitch of trouble with them either.

Then the old adage kicks in, "ask a stupid question, get a stupid answer". You're fairly haughty in selecting a price point for someone else that you imagine is acceptable. Do you own an $8,000.00 TV? Because if you do, congratulations, you just spent $7,000.00 too much to watch one stinking episode of a TV show.

"OK, you do that, then go hang it above your fireplace at 7' above the floor."
Clever quip, at once completely condescending, and a warning that you're someone to avoid at all cost in the future. (y) (Y)

BTW, "welcome to Techspot". (Do you understand the significance of the shock quotes I've placed around that salutation?)


(*) You are of course aware that the meme, "know it all", is rarely used in anything but a derogatory context, um don't you?
 
Well first of all Mr S, there is no such thing as "blacker than black". The term "black" is an absolute, meaning the absence of ALL light, and hence all color.
In the bottom right hand corner of the SMPTE bars are three vertical gray stripes called PLUGE bars. The PLUGE bar on the left represents 3.5 IRE level known as 'blacker than black,' the middle one is 7.5 IRE level known as 'black,' the right bar is 11.5 IRE level known as 'lighter than black.'
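If it helps to see those IRE numbers as digital levels, here's a rough sketch of my own (assuming NTSC's 7.5 IRE setup mapped onto Rec. 601 limited-range 8-bit video, black at code 16, white at code 235 — a common but not universal mapping), showing why the 3.5 IRE bar lands below digital black:

```python
def ire_to_code(ire: float) -> int:
    """Map an IRE level to an 8-bit luma code under the assumptions above:
    7.5 IRE -> code 16 (black), 100 IRE -> code 235 (white)."""
    return round(16 + (ire - 7.5) * (235 - 16) / (100 - 7.5))

# The three PLUGE bars:
print(ire_to_code(3.5))    # 7   -> below code 16: "blacker than black"
print(ire_to_code(7.5))    # 16  -> reference black
print(ire_to_code(11.5))   # 25  -> just above black
print(ire_to_code(100))    # 235 -> reference white
```

A display that clips everything below code 16 simply cannot show the left-hand PLUGE bar, which is the whole point of the test pattern.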

(I got quite a kick out of your price range as well). You apparently read something in a AV or PC oriented magazine about "the most expensive TVs you can find"...
My colleague asked for the top 80" TVs of 2018/2019, my go to source is the Value Electronics annual TV shootout.

Do you own an $8,000.00 TV? Because if you do, congratulations, you just spent $7,000.00 too much to watch one stinking episode of a TV show.
I have the Value Electronics annual TV shootout winner from 2013.

(*) You are of course aware that the meme, "know it all", is rarely used in anything but a derogatory context, um don't you?
Um, so that's what you are known as at this site?
 
Whereas, I have a low end "Dynex", (former Best Buy house brand, supposedly lower quality than their "Insignia" offerings). It came out of the box working perfectly, and has done so, with a great picture, for the past 15 years. (Hook it up to a computer and you'll be told it was made by "Funai", a huge Japanese OEM).
Just to clarify, I also have 3 other (budget) flat panel TVs of varying sizes. I've never had a stitch of trouble with them either.
My point about the budget TVs and 'blacker than black' is that many budget TVs will not show black levels down to 3.5 IRE, so in the three PLUGE bars they will only see two gray bars. They are missing any video content below 7.5 IRE, which may be a problem for this GoT episode.
 
In the bottom right hand corner of the SMPTE bars are three vertical gray stripes called PLUGE bars. The PLUGE bar on the left represents 3.5 IRE level known as 'blacker than black,' the middle one is 7.5 IRE level known as 'black,' the right bar is 11.5 IRE level known as 'lighter than black.'
In the real world, there isn't anything "blacker than black", period. And that's not just some techie's perennial need to constantly coin clever clichés; it's a point of physical fact. "Known as" doesn't really count. "Our TVs go blacker than black" actually sounds like Madison Avenue drivel.

The reality is, "our TVs are blacker than that which we were passing off as black before".

My colleague asked for the top 80" TVs of 2018/2019, my go to source is the Value Electronics annual TV shootout.
So all this supposed "wisdom" you're brandishing about is coming directly from a >retail sales outlet< ?

You should realize that the more expensive the TV they sell you, the more money they take from you. The average 14 year old knows that.

I have the Value Electronics annual TV shootout winner from 2013.
Well, this is 2019, so you're 6 years behind the times. Remember, the guy with the 2019 "Value Electronics Annual TV Shootout Winner" has the luxury of explaining to you how much TVs have advanced in the past 6 years, and can, in a leisurely and matter-of-fact way, declare your TV a "turd" by today's standards. :poop:

See, that's the problem: if you want to talk about how much you "know", you should at least not be plagiarizing some self-serving electronics box store's material to do it.

(*) You are of course aware that the meme, "know it all", is rarely used in anything but a derogatory context, um don't you?
Um, so that's what you are known as at this site?
Oh dear, "me too, I know you are", isn't really a valid comeback, well unless you're 13 years old that is.
 
My point about the budget TVs and 'blacker than black' is that many budget TVs will not show black levels down to 3.5 IRE, so in the three PLUGE bars they will only see two gray bars. They are missing any video content below 7.5 IRE, which may be a problem for this GoT episode.
No, the point is that video content created for streaming should have taken into consideration that not everyone has an $8,000.00 TV.

There's plenty of blame to spread around here, but a blanket response from HBO to the effect of "you're holding your TV wrong" is pure crap.

Quite frankly, if that episode had been on broadcast TV, then a Walmart-purchased TV would have worked out just fine.

In all sincerity, I think they screwed it up on purpose, so they can sell you the Blu-ray set later, claiming the episode has been "remastered".

In turn, the consumers will have to double down on the money they've already spent on their subscription to HBO, and not have to buy "better TVs"...
 
@wiyosaya Good sir, please refer to post #72, for my take on this issue.
I think I mentioned that somewhere in my posts, too, but maybe not. ;)

To me, it is pretty obvious from the comments that even having a calibrated set did not guarantee that the episode would be seen as intended. Which leaves this -

Other outlets are reporting
The episode's Cinematographer says that the scene was dark intentionally, to make it extra intense, claustrophobic and disorienting. However he also blames the compression, and the display settings and viewing environment of most users.
https://www.oled-info.com/latest-game-throne-episodes-gives-boost-interest-oled-tvs
Emphasis mine, of course, and it appears that articles like this, on TS, that sound like they are blaming the viewers are rather incomplete.

Compression seems prime among those and could easily account for those with calibrated displays not seeing the Prime GoT picture - and that means that one GoT to have those UHD or BR disks to see this as intended. And HBO has GoT everyone on this - as you mentioned.

BTW - there is also mention of a Starbucks cup in an episode. :laughing:
 
Here is an interesting discussion of calibration to blacker than black. Essentially, as long as the display is properly calibrated to either level, the picture should look no different.

https://www.hometheaterforum.com/community/threads/black-level-7-5-ire-0-ire.114994/

EDIT: The author does, at the end of the post, get into how the "standard" is variably applied. Which is the problem and is another instance of Joe Kane's Never Twice the Same Color. :laughing:
 
@wiyosaya From your link:

"Viewing the episode on an OLED TVs however makes for a good viewing experience with its high contrast and HDR settings. According to reports from the US, this has increased the interest in OLED TVs. Popular Mechanics, for example, ran an article titled "Games of Thrones Proves Why You Need an OLED TV" and Consumer Reports and CNET both recommended an OLED TV over an LCD for the specific episode" <(Right, we should all rush out and buy new TVs, to watch this one episode, spare me).

From M$, Intel, & AMD:

"Future processors will only be fully compatible with Windows 10".

If anyone doesn't think these corporate moguls are having a circle jerk in a bathroom or around a campfire, planning (read "conspiring toward an inter-corporate mega-monopoly") ways to screw you out of your last dime, raise your hand. (At the risk of public shaming).

From me:

"sometime in the not too distant future, OLED TVs will be available at Walmart for about five hundred bucks, and that's when I'll make my move".

Although I do have a Blu-ray player, I prefer to play with DVDs. I suppose these old eyes don't have the necessary resolution to be fully impacted by 4K, nor does my wallet have the necessary stamina in my old age either.

Funny thing though, straight DVD players seem incapable of providing a "properly scaled" signal to my 4K TV. Coming from my Blu-ray player they're perfect. I wonder why.

It kind of makes me wonder if I shouldn't have stuck with my 46" full HD TV.

After all, I don't even own a cell phone, and I'm coming to you "live" on a Win 7 32-bit machine.

My good man, if you have a few moments to spare, skim through posts #77 to #82. (Pretty please).
 
Here is an interesting discussion of calibration to blacker than black. Essentially, as long as the display is properly calibrated to either level, the picture should look no different.

https://www.hometheaterforum.com/community/threads/black-level-7-5-ire-0-ire.114994/

EDIT: The author does, at the end of the post, get into how the "standard" is variably applied. Which is the problem and is another instance of Joe Kane's Never Twice the Same Color. :laughing:
Interestingly enough, there is a difference between how color film and digital sensors "see" color.

I have an old Nikon AF-D 80-200 mm f/2.8. It functions as a 120-300 mm equivalent on a digital body, which is great, but it does some goofy things with color and exposure. So, on digital-specific lenses, they change the coating's color transmission to match the sensor's spectral wavelength distribution.

And BTW, there still is no such thing as "blacker than black". It's a clever sales-promotional way of getting around the fact that the prior panels weren't as good as they should have been (?), and of selling you another TV while appearing to be "doing you a favor", and coming out smelling like a rose.

In all honesty, "our blacks are now 'blacker" than they were before. But the truth is a heavy burden to bear, especially when ad time is so expensive.

I guess, "our blacks are now approaching actual black, but we still have a ways to go", wouldn't fly either.

While I'm at it, I should add some rustic wisdom, "a bit of manure in the soil will make the roses smell better".

Afterthought: Adobe RGB is, and has been, an industry-standard color gamut for monitors. So, panels either meet, fail to meet, or exceed that gamut.

But I suppose it could be said of newer, brighter monitors, that they give you whites which are, "whiter than white". But do you see how stupid that sounds applied to the opposite end of the contrast spectrum? After all, everyone knows that only "Tide", can give you, "whiter whites".
 
@captaincranky
Interestingly enough, there is a difference between how color film and digital sensors "see" color.

I have an old Nikon AF-D 80-200 mm f/2.8. It functions as a 120-300 mm equivalent on a digital body, which is great, but it does some goofy things with color and exposure. So, on digital-specific lenses, they change the coating's color transmission to match the sensor's spectral wavelength distribution.

And BTW, there still is no such thing as "blacker than black". It's a clever sales-promotional way of getting around the fact that the prior panels weren't as good as they should have been (?), and of selling you another TV while appearing to be "doing you a favor", and coming out smelling like a rose.
You read the first link, I guess the second one was TL;DR.

I get that black is black, but from a calibration (read: speciously applied standards) standpoint, US black was, at least at the time of the article, 7.5 IRE, while in Japan black is 0 IRE.

As that TL;DR article states, the point is to calibrate to one or the other and then there is no difference in dynamic range between the two.

The problems come when material mastered to the other standard somehow gets to your display at a different level than what your display is calibrated for. If you are calibrated to 7.5 as black and the material has 0 as black, then the display will not reproduce the full dynamic range of the source material.

EDIT: The other problem comes if the display itself can only go down to 7.5 IRE as black while the material has 0 IRE as black. The result is the same as above; however, this time it is the display that is limited, rather than the calibration.
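A toy illustration of that mismatch (the IRE values are invented for the example): material authored with black at 0 IRE, shown on a display whose effective black point is 7.5 IRE.

```python
# Source shadow tones authored against a 0 IRE black point.
source_ire = [0, 2, 5, 7.5, 20, 100]

def displayed(ire, display_black=7.5):
    """Everything at or below the display's black point collapses to one black."""
    return max(ire, display_black)

shown = [displayed(v) for v in source_ire]
print(shown)   # [7.5, 7.5, 7.5, 7.5, 20, 100] - four distinct tones become one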

So, perhaps I should have said, Never Twice the Same Black.

And Yes, black is black. I want my baby back. :laughing:

But, whatever....

Different coatings on the digital lenses does not surprise me.
 
@wiyosaya captaincranky can't come to the computer right now, he's out shopping for a new TV.

BTW, he got to 1:14 of that video before he finally had to throw up his hands in disgust shouting, "no mas". :eek: That is the worst lip sync and most piercing treble mix he could remember.

Instead of "Los Bravos", they should have named themselves "Weebles Wobble But They Don't Fall Down". https://en.wikipedia.org/wiki/Weeble

No hard feelings though. Perhaps you and he can do a couple of pages about how he doesn't have his computer's sound calibrated correctly, and/or needs to buy a new computer to listen to that video properly..

And just because you're special to cranky, and there are no hard feelings for dragging him so harshly back to the 60's, here's a lovely metal anthem with a lovely Celtic lilt. (y) (Y)


(In fact, captaincranky told me he might buy you Dragonforce's greatest hits for Christmas, but doesn't want to spoil the surprise).
 
@captaincranky (y) (Y)
Always good for a laugh. My apologies, I just could not resist that musical, er, uh, somewhat musical?, reference.

BTW - I am right there with you on OLED prices except, for now while I have the wife's approval, my price point will be under $1K.

Do you promise not to tell my wife that your price point for an OLED is $500? :laughing:

With all the variety of "standards" it almost seems not worth it to take the time to calibrate a display these days. But if I did not, my eyes would pop and go boom with vivid mode. :laughing: Then again vivid mode was why I bought my last TV after having seen it in the store. I was hypnotized and just could not resist that TV telling me buy me, buy me, buy me. :scream:

I kind of like the idea in that second link of having calibration frames in THX referenced material, but that would certainly make it more expensive. Apparently, the industry balked at that, too. Us poor consumers are left suffering with instances like this episode of GoT, but wait, I did not watch it, so I am not suffering. Fizz Fizz, what a relief it is.

Maybe Dragonforce should be the GoT theme!!

Edit: But alas, our problems are not as great as the paper coffee cup in GoT https://www.cbsnews.com/news/game-of-thrones-starbucks-cup-the-last-of-the-starks-coffee/
 
Well, I went back and rewatched the episode with the brightness all the way up (from 0 to +30) and the contrast doubled (from 0 to +10). I know the numbers are meaningless but they were almost maxed out. And the experience was still poor. Marginally better at times but still nowhere near clear. Then again, I am streaming as I don't have Sky or HBO. I guess I'll just have to buy a blu-ray player and the box set in blu-ray and maybe a 4K TV, to see this one episode the way it was intended.
 
@captaincranky (y) (Y)
Always good for a laugh. My apologies, I just could not resist that musical, er, uh, somewhat musical?, reference.
Well at least it wasn't rap. I get quite enough of that here in da hood

BTW - I am right there with you on OLED prices except, for now while I have the wife's approval, my price point will be under $1K.

Do you promise not to tell my wife that your price point for an OLED is $500? :laughing:
Oh, I'm going to rat you out alright, but I promise to hold the line @ $750.00

With all the variety of "standards" it almost seems not worth it to take the time to calibrate a display these days. But if I did not, my eyes would pop and go boom with vivid mode. :laughing: Then again vivid mode was why I bought my last TV after having seen it in the store. I was hypnotized and just could not resist that TV telling me buy me, buy me, buy me. :scream:

I kind of like the idea in that second link of having calibration frames in THX referenced material, but that would certainly make it more expensive. Apparently, the industry balked at that, too. Us poor consumers are left suffering with instances like this episode of GoT, but wait, I did not watch it, so I am not suffering. Fizz Fizz, what a relief it is.
Try to follow. Whatever alphanumeric value the lab rats chose to call black simply doesn't matter, if we agree on a practical definition of black as the point at which a given panel can no longer distinguish or differentiate between tones. Or, the point at which all shadow detail disappears.

The problem with the GoT episode isn't the problem of not enough black, but "not enough white".

The maximum white level available is coincident with the maximum contrast available. The less light on the scene, the closer "white" is to "black". Higher, or "more available", contrast is equivalent to "HDR" as the term "High Dynamic Range" is used in an AV context.

Which is why, in a nutshell, a panel with 4000 nits max brightness will always have more contrast than a panel of 400 nits. Notice the 10:1 illumination ratio. But no matter which panel you view an under-lit scene on, it still won't have decent contrast. That's because the available potential contrast hasn't changed, only the set's ability to reproduce it, were it there.
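A back-of-the-envelope sketch of that point (all the numbers are hypothetical, not measurements of any real panel): the brighter panel keeps its 10:1 advantage, but an under-lit scene drags the realized contrast down on both.

```python
# Toy model: on-screen contrast ratio = brightest displayed luminance / black level.
def contrast_ratio(peak_nits, black_nits, scene_white_fraction=1.0):
    """scene_white_fraction: how much of the panel's peak the scene's brightest
    element actually uses (1.0 = full white, 0.25 = a dim, under-lit frame)."""
    return (peak_nits * scene_white_fraction) / black_nits

# Two hypothetical panels with the same 0.5-nit black floor:
print(contrast_ratio(4000, 0.5))          # 8000.0 - bright panel, full-range scene
print(contrast_ratio(400, 0.5))           # 800.0  - a panel 10x dimmer
# An under-lit scene that never rises above 25% of peak drags both down:
print(contrast_ratio(4000, 0.5, 0.25))    # 2000.0
print(contrast_ratio(400, 0.5, 0.25))     # 200.0
```

Whatever you paid for the panel, the scene's own lighting caps the contrast you can actually see.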

Maybe Dragonforce should be the GoT theme!!
Well that's kinda their thing. Pretty much all of their material is epic "sturm und drang", video game anthems, and battle scores. As pompous, pretentious, and overblown as their music may be, they're a huge guilty pleasure for me.

I'm going to proffer the term "martial heavy metal" as a description of the genre. If you remember Davis Laine, he had a band called "Sabaton" as his guilty pleasure. One day he would be posting about how he "almost got all the way through 'Race with the Devil on Spanish Highway'" (Al Di Meola), and the next day saying he was listening to Sabaton. To which I said, "huh?" Sabaton is "martial heavy metal" too, but mostly grounded in the real world, as a lot of their music deals with actual battles fought.

Dragonforce has been called "chip metal", since their guitar styling sounds like the old 8-bit chip music and sounds made by early video games. Herman Li makes his guitar sound like a point being scored by lifting it by only the whammy bar while wiggling it.

As I'm assuming the entire Wiki page on them won't interest you, scroll down to the paragraph, "musical style". https://en.wikipedia.org/wiki/DragonForce

But alas, our problems are not as great as the paper coffee cup in GoT https://www.cbsnews.com/news/game-of-thrones-starbucks-cup-the-last-of-the-starks-coffee/
Here again, I'm going to point out the extreme lack of contrast brought on by the inordinately meager lighting levels.

Whoever captured those photos had to push the brightness just to see the cup, which caused the scene to wash out even more.

BTW, a "nit" is a bit less than a third of a "foot-lambert" (1 fL ≈ 3.43 nits). So figures quoted in nits obviously come out as bigger numbers.

Here's a cute comparison chart of different light measurements:

Now you know what to use in which situation. Here are a few real world scenarios:

  • Computer monitors can have a luminance level of 80 to 300 nits.
  • HDTV monitors can have a luminance level of 500 to 1000 nits.
  • The SMPTE requires cinema screens to have a luminance level of 55 nits (16 fL).
  • The sun has a luminance of about 1.6 billion nits. This is about 10,000 fc or about 100,000-130,000 lux.
  • Typical studio lighting is about 1,000 lux.
  • Golden hour is about 400 lux.
  • Office lighting is about 300-500 lux.
  • A living room is about 50-100 lux.
  • A full moon on a clear night is about 0.27 lux.
  • A moonless clear night sky is 0.002 lux.
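To tie those units together, a small conversion sketch (the constants are the standard definitions; the function names are my own):

```python
# 1 foot-lambert = (1/pi) candela per square foot ≈ 3.426 nits (cd/m^2)
# 1 foot-candle  = 1 lumen per square foot ≈ 10.764 lux
NITS_PER_FOOTLAMBERT = 3.426
LUX_PER_FOOTCANDLE = 10.764

def fl_to_nits(fl):
    return fl * NITS_PER_FOOTLAMBERT

def fc_to_lux(fc):
    return fc * LUX_PER_FOOTCANDLE

# The SMPTE cinema figure from the list: 16 fL is indeed about 55 nits.
print(round(fl_to_nits(16), 1))    # 54.8
# And the sunlight figure: 10,000 fc lands in the 100,000+ lux ballpark.
print(round(fc_to_lux(10000)))     # 107640
```

Note that nits and foot-lamberts measure luminance (light coming off a surface), while lux and foot-candles measure illuminance (light falling on one), so the two pairs aren't interchangeable.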
BTW, I think the "vivid" picture mode, would likely work out best on animation. I would jack up the brightness and perhaps drop the color saturation a bit, to yield a compromise setting. (Not really standard, but not quite full vivid either. Obviously only if the vivid mode allows user tampering.).
 
Oh, I'm going to rat you out alright, but I promise to hold the line @ $750.00
:mad:

Try to follow. Whatever alphanumeric value the lab rats chose to call black simply doesn't matter, if we agree on a practical definition of black as the point at which a given panel can no longer distinguish or differentiate between tones. Or, the point at which all shadow detail disappears.

The problem with the GoT episode isn't the problem of not enough black, but "not enough white".
Exacerbated by compression, of course.

The maximum white level available is coincident with the maximum contrast available.
Exactly why the "contrast" control is called "white level" on Joe Kane's calibration disks. As I see it, "white level" would have been a term more understandable to the layman, however, marketing...:facepalm:

The less light on the scene, the closer "white" is to "black". Higher, or "more available", contrast is equivalent to "HDR" as the term "High Dynamic Range" is used in an AV context.

Which is why, in a nutshell, a panel with 4000 nits max brightness will always have more contrast than a panel of 400 nits. Notice the 10:1 illumination ratio. But no matter which panel you view an under-lit scene on, it still won't have decent contrast. That's because the available potential contrast hasn't changed, only the set's ability to reproduce it, were it there.
:dizzy:That math is just too simple. It needs to have a few summations, derivatives, and integrals, not to mention a few extra dimensions, in it for me to understand it.:laughing:

All joking aside - Good explanations, all, Sir! (y) (Y)

Well that's kinda their thing. Pretty much all of their material is epic "sturm und drang", video game anthems, and battle scores. As pompous, pretentious, and overblown as their music may be, they're a huge guilty pleasure for me.

I'm going to proffer the term "martial heavy metal" as a description of the genre. If you remember Davis Laine, he had a band called "Sabaton" as his guilty pleasure. One day he would be posting about how he "almost got all the way through 'Race with the Devil on Spanish Highway'" (Al Di Meola), and the next day saying he was listening to Sabaton. To which I said, "huh?" Sabaton is "martial heavy metal" too, but mostly grounded in the real world, as a lot of their music deals with actual battles fought.

Dragonforce has been called "chip metal", since their guitar styling sounds like the old 8-bit chip music and sounds made by early video games. Herman Li makes his guitar sound like a point being scored by lifting it by only the whammy bar while wiggling it.

As I'm assuming the entire Wiki page on them won't interest you, scroll down to the paragraph, "musical style". https://en.wikipedia.org/wiki/DragonForce
I thought the guitar solos were well-done, but the vocals, well I understand why you call it a guilty pleasure. :laughing:

Here again, I'm going to point out the extreme lack of contrast brought on by the inordinately meager lighting levels.

Whoever captured those photos had to push the brightness just to see the cup, which caused the scene to wash out even more.

BTW, a "nit" is a bit less than a third of a "foot-lambert" (1 fL ≈ 3.43 nits). So figures quoted in nits obviously come out as bigger numbers.

Here's a cute comparison chart of different light measurements:

Now you know what to use in which situation. Here are a few real world scenarios:

  • Computer monitors can have a luminance level of 80 to 300 nits.
  • HDTV monitors can have a luminance level of 500 to 1000 nits.
  • The SMPTE requires cinema screens to have a luminance level of 55 nits (16 fL).
  • The sun has a luminance of about 1.6 billion nits. This is about 10,000 fc or about 100,000-130,000 lux.
  • Typical studio lighting is about 1,000 lux.
  • Golden hour is about 400 lux.
  • Office lighting is about 300-500 lux.
  • A living room is about 50-100 lux.
  • A full moon on a clear night is about 0.27 lux.
  • A moonless clear night sky is 0.002 lux.
BTW, I think the "vivid" picture mode, would likely work out best on animation. I would jack up the brightness and perhaps drop the color saturation a bit, to yield a compromise setting. (Not really standard, but not quite full vivid either. Obviously only if the vivid mode allows user tampering.).
You should write a wikipedia page!

Just joking about the Vivid mode. Yes, animation would be good material for that, but my display never leaves its calibration settings - mostly. I've got an older LG plasma display that has some "image persistence problems" and I am pining for the day when OLEDs are down to my price range. (Unless by some happenstance of fate, an affordable version of Crystal LED shows up on the market - doubtful, though.)

For those who don't have the time or access to calibration materials, the "Movie" mode in the TV's settings is recommended, as are a few others whose names vary by manufacturer but essentially amount to the same thing.

Some people do not realize that TVs out-of-the-box are set way out of line with calibration standards because the manufacturer wants the TV to catch your eye in the store, and hypnotize you so that you hear "take me home, take me home, take me home," coming from their speakers while you are viewing them in the store. ;) As you previously said, marketing...:facepalm:
 
@wiyosaya OK, here's a page from CBS news, (via Twitter, I think), about the coffee cups in the GoT scene: https://www.cbsnews.com/news/game-of-thrones-starbucks-cup-the-last-of-the-starks-coffee/.

Before we start, "contrast" is not really "white level". That may sound like a "because I said so" explanation, but if white level were "contrast", why would they bother to put a separate "contrast" control on a TV's picture menu?

Never mind that for now.

I'm going to explain the different picture adjustments on the uploads of Daenerys and "her Starbucks chalice".

Image 1, by "Hollie Baggins-Kenobi"

Lots of push in the contrast. Danny's face and the cup lid are very bright. However, her coat has no shadow detail whatsoever, nor is there any in the entire right side of the image. So, source failure.

Image 2, by Svede Kurt

The image was manipulated almost by brightness alone. There's almost no contrast, but plenty of shadow detail, as you can see in the piping around the sleeves of Danny's jacket.

The pale yellow color balance smacks of cat urine.

Image 3, by Ira Madison

Not quite as bright as #2 and still flat in contrast, but with excess red color saturation and very little in the way of highlights in Danny's face.

My conclusion: source failure resisted all attempts at post-transmission correction.
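To make the "crushed blacks" complaint concrete: an easy way to spot it in any still is to count how many pixels sit at or below video black. If a huge fraction of the frame lands there, the shadow detail is gone at the source and no amount of brightness pushing will bring it back. A minimal sketch in plain Python over a flat list of 8-bit luma values (the threshold of 16 is the standard video black level; everything else is my own naming):

```python
def shadow_clipping_fraction(pixels, threshold=16):
    """Fraction of 8-bit luma samples at or below video black.

    A big spike here suggests crushed blacks: shadow detail has
    been mapped down to flat black, with nothing left to recover.
    """
    if not pixels:
        return 0.0
    clipped = sum(1 for p in pixels if p <= threshold)
    return clipped / len(pixels)
```

For example, a frame where 80 of 100 samples sit at black reports 0.8, which is roughly what the uploaded stills of that episode look like on a histogram.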

BTW, the CBS page wouldn't allow me to extract the individual images, so I had to link the whole page.

Some people do not realize that TVs out-of-the-box are set way out of line with calibration standards because the manufacturer wants the TV to catch your eye in the store, and hypnotize you so that you hear "take me home, take me home, take me home," coming from their speakers while you are viewing them in the store. ;) As you previously said, marketing...:facepalm:

Neither you nor I would watch TV in a room anywhere near as bright as a Walmart. The light levels in those stores are set to get people awake and moving, and the psychology behind it is that people will "buy and get out". Gone are the natural-wood, dimly lit record stores of the '70s. Give us all your money, and get out. See you next week, same time, same ripoff.:rolleyes:

It is reasonable to suggest that manufacturers are aware of what type of environment their sets will likely first be viewed in, and, at least in part, adjust for that eventuality.

Which is not to say you're wrong about the "come hither" mentality of the picture settings, but just one more factor likely being considered.

Don't you have a copy of Photoshop so you can d!ck around with the levels controls? I think experiencing that type of image manipulation strategy, would open new worlds of understanding to you.

Here's a short, but very good video about working with levels, and you'll see that it also addresses some of the issues I've pointed out in the GoT uploaded stills.
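For the curious, the black-point / white-point / gamma sliders in a levels panel boil down to very little math. Here's a minimal sketch of what a Photoshop-style levels adjustment does to a single 8-bit value (my own simplified version, not Adobe's actual code; the 16/235 defaults are the standard video black and white levels):

```python
def apply_levels(value, black=16, white=235, gamma=1.0):
    """Map an 8-bit input through black/white points and a gamma curve.

    Everything at or below `black` becomes 0, everything at or above
    `white` becomes 255, and `gamma` > 1 brightens the midtones
    (the input is raised to 1/gamma, matching the usual slider convention).
    """
    v = min(max(value, black), white)       # clip to the input range
    v = (v - black) / (white - black)       # normalize to 0..1
    v = v ** (1.0 / gamma)                  # midtone (gamma) adjustment
    return round(v * 255)                   # back to 8-bit output
```

Pushing the black point up is exactly what "crushes" shadows: every value below it collapses to 0, and once that's baked into the broadcast master, no TV control can undo it.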

 
@wiyosaya OK, here's a page from CBS news, (via Twitter, I think), about the coffee cups in the GoT scene: https://www.cbsnews.com/news/game-of-thrones-starbucks-cup-the-last-of-the-starks-coffee/.

Before we start, "contrast" is not really "white level". That may sound like a "because I said so" explanation, but if white level were "contrast", why would they bother to put a separate "contrast" control on a TV's picture menu?

Never mind that for now.
@captaincranky Well, in the context of calibration of TVs, you will have to argue that with Joe Kane - http://www.videoessentials.com/
There are others out there that think he is FOS when it comes to his reasoning on HDR https://www.avsforum.com/forum/138-avs-foruma-podcasts/2440346-joe-kane-hdr.html
So why not? ;) Sounds like it might be fun for you. :laughing:

Don't you have a copy of Photoshop so you can d!ck around with the levels controls? I think experiencing that type of image manipulation strategy, would open new worlds of understanding to you.

Here's a short, but very good video about working with levels, and you'll see that it also addresses some of the issues I've pointed out in the GoT uploaded stills.

Honestly, right now I have an ancient PS version 2.0, and I have not opened it in years. I have no compelling reason to, and even less time. Maybe when I break out my telescope in some future year and get serious about astrophotography, I will have a reason to get and learn an image-manipulation program, but not right now.

Still, IMO, the only job of a marketer is to dupe you into buying their products. The only place that I know that will calibrate at least some displays is the home town Photo/Audio/Video dealer that I frequent when I want to buy. I trust them because they literally do not give me a load of :poop: when I want to buy something.

In fact, I once recommended that a friend go there. That friend did and came back to me saying something that was very close to "They did not tell me what I wanted, so I did not buy anything." :facepalm:

If they don't calibrate, they probably pick the default setting that is closest to what a calibrated display will look like. From my previous trips there, there is very little variation between their displays when showing the same reference material.

Aside from that, I've bought the bill of marketing goods in the past, so I have no reason to trust marketers, and the job of most display makers is to sell displays. I've walked through the rows at Walmart and BestBuy, and it's plain to tell that their displays are not calibrated. The variation from set to set showing the same demo material is just far too great, which, to me, indicates they really do not care whether their customers are getting an accurate representation of what any particular display can do, as long as they make the sale.

Aside from that, I highly doubt that most people working at Walmart, especially, would have any clue where to even start a calibration procedure on any display they sell. And the case for BestBuy is likely only marginally better.

So I think we are in agreement on that.

Thanks for the video, I am sure it will prove interesting and edifying. :yum
 
@captaincranky Just wondering, have you ever calibrated a TV? Feel free to use any expletives you choose in your reply to me as you would like. ;)

From viewing that video, I think it is different than adjusting an image in Photoshop.

Consider this a return of the favor https://www.howtogeek.com/299838/how-to-get-the-best-picture-quality-from-your-hdtv/

It is consistent with Joe Kane's calibration material. You may, or may not, find the information on adjusting the contrast interesting. Try to pay attention. ;) You might understand why I suggested that the contrast control be renamed.

Here is another link just in case the contrast adjustment in the above link was too far down the page. ;) http://spearsandmunsil.com/portfolio-item/setting-the-contrast-control/
 
@captaincranky Just wondering, have you ever calibrated a TV? Feel free to use any expletives you choose in your reply to me as you would like. ;)
I set up every TV I have ever owned by eye, using manual picture mode. The first step I take is to dial the color temperature back to 5500K. That's the same approach I've taken with every computer monitor I've ever owned.

Guess what, that takes most of the excess blue out of the screen, so you don't need some a**hole in marketing to explain how you should buy a new monitor because "this year's model has a special 'blue light reduction' feature".

From viewing that video, I think it is different than adjusting an image in Photoshop.
Don't be too sure. The levels panel in the video I posted is more than likely in keeping with current professional methods of preparing a video clip for distribution, since digital cameras, as opposed to color negative film, are most likely employed for the initial capture. The "color timer's" job description has likely been redefined to include evaluating digital captures with levels-style histograms, and balancing that information against what his eyes tell him to do with the image.

Now, the primary reason I posted the levels video, was to show you what parameters affected what outcome in a print, (or movie clip). All of those parameters are, (or should have been) adjusted BEFORE the material is transmitted.

The alternative would be like handing a DVD of a person's wedding photos to the happy bride and groom, and telling them to "adjust the color to your liking before you have them printed". Which is exactly what HBO did, by handing viewers that turd of an episode and telling them, "the pictures you are about to see are true; it's you and your TV that are in 'the Twilight Zone'"!

Now, since you have gone wildly off topic before with your calibration fanaticism, with the side trip into creating "printer profiles", which absolutely do depend on a reference-standard calibrated monitor, I can't possibly be that far afield by explaining the parameters for adjusting a movie for distribution.

While the methodologies for reaching the same qualities in an image might differ, the outcomes, both objective and subjective, are to be judged using the same criteria.

Consider this a return of the favor https://www.howtogeek.com/299838/how-to-get-the-best-picture-quality-from-your-hdtv/

It is consistent with Joe Kane's calibration material. You may, or may not, find the information on adjusting the contrast interesting. Try to pay attention. ;) You might understand why I suggested that the contrast control be renamed.

Here is another link just in case the contrast adjustment in the above link was too far down the page. ;) http://spearsandmunsil.com/portfolio-item/setting-the-contrast-control/

Let me say first and foremost, "fu*k Joe Kane; he should shove his DVD where the 'white level' doesn't shine". Next, "contrast" doesn't need to be renamed; it's fine the way it is.

Now, I've had 2 years of college, taking still photos and movies and printing them for juried exhibition. (I got mostly all A's, and had a few prints selected for the school's collection.) So, you won't come over to watch TV at my house and see anything but a decent picture. There'll be no green faces, no blown-out highlights, no missing shadow detail, no washed-out color, no oversaturated color, and I did it all by eye, the way I was trained. As someone who is literally afraid to touch his picture controls, do you have the hubris to say the same?

And BTW, my daddy taught me what the 'tint', 'color', and 'brightness' controls did on a color TV, some 60-odd years ago.

With that in mind, professional photographers wouldn't touch an LCD panel with a ten foot pole when they were first introduced. They stuck with CRT, for a good number of years, until flat panel technology caught up.

Anyway; make of this what you will:


Heavy Horses
Jethro Tull
Iron-clad feather-feet pounding the dust
An October's day, towards evening
Sweat-embossed veins standing proud to the plough
Salt on a deep chest seasoning
Last of the line at an honest day's toil
Turning the deep sod under
Flint at the fetlock, chasing the bone
Flies at the nostrils plunder
The Suffolk, the Clydesdale, the Percheron vie
With the Shire on his feathers floating
Hauling soft timber into the dusk
To bed on a warm straw coating

Heavy horses, move the land under me
Behind the plough gliding, slipping and sliding free
And now you're down to the few and there's no work to do
The tractor is on its way

Let me find you a filly for your proud stallion seed
To keep the old line going
And we'll stand you abreast at the back of the wood
Behind the young trees growing
To hide you from eyes that mock at your girth
Your eighteen hands at the shoulder
And one day when the oil barons have all dripped dry
And the nights are seen to draw colder
They'll beg for your strength, your gentle power
Your noble grace and your bearing
And you'll strain once again to the sound of the gulls
In the wake of the deep plough, sharing
Heavy horses, move the land under me
Behind the plough gliding, slipping and sliding free
And now you're down to the few and there's no work to do
The tractor is on its way

Standing like tanks on the brow of the hill
Up into the cold wind facing
In stiff battle harness, chained to the world
Against the low sun racing
Bring me a wheel of oaken wood
A rein of polished leather
A heavy horse and a tumbling sky
Brewing heavy weather

Bring a song for the evening
Clean brass to flash the dawn
Across these acres glistening
Like dew on a carpet lawn
In these dark towns folk lie sleeping
As the heavy horses thunder by
To wake the dying city
With the living horseman's cry
At once the old hands quicken
Bring pick and wisp and curry comb
Thrill to the sound of all the heavy horses coming home

Iron-clad feather-feet pounding the dust
An October's day, towards evening
Sweat-embossed veins standing proud to the plough
Salt on a deep chest seasoning
Bring me a wheel of oaken wood
A rein of polished leather
A heavy horse and a tumbling sky
Brewing heavy weather

Heavy horses, move the land under me
Behind the plough gliding, slipping and sliding free
And now you're down to the few and there's no work to do
The tractor is on its way
Oh, heavy horses, move the land under me
Behind the plough gliding, slipping and sliding free
And now you're down to the few and there's no work to do
The tractor is on its way
Oh, heavy horses, move the land under me
Behind the plough gliding, slipping and sliding free
And now you're down to the few and there's no work to do
The tractor is on its way
Oh, heavy horses, move the land under me
Behind the plough gliding, slipping and sliding free
Songwriters: Ian Anderson
Heavy Horses lyrics © BMG Rights Management
 
I set every TV I have ever owned up by eye, using manual picture mode. I first step I take, is to dial the color temperature back to 5500K...
Guess what, that takes most of the excess blue out of the screen,..
Since you set the temp to 5500K instead of the 6500K standard, you like your screens with a reddish tinge instead of a blue one. You don't want proper white?
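For what it's worth, the reddish-vs-bluish point falls straight out of blackbody physics: a 6500K source really does carry relatively more blue than a 5500K one, so dialing down to 5500K warms the white point. A quick sketch using Planck's law (standard SI constants; the particular "blue" and "red" wavelengths are my own picks for illustration):

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of a blackbody at a given wavelength and temperature."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0
    return a / b

def blue_red_ratio(temp_k, blue=450e-9, red=650e-9):
    """Relative energy at a blue vs. a red wavelength for a given color temp.

    A higher ratio means the source looks bluer; the ratio grows
    with temperature, which is why 6500K is cooler-looking than 5500K.
    """
    return planck_radiance(blue, temp_k) / planck_radiance(red, temp_k)
```

Evaluating the ratio at 6500K and at 5500K shows the 6500K value is clearly larger, which is the whole "takes the excess blue out" effect in one number.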
 
Since you set temp to 5500K instead of the standard of 6500K...
Oh, and another disadvantage of cheaper budget TVs is that you may set the color temp to 5500K (or 6500K) at a certain brightness level, but then as the brightness level changes, so does your color temp. Higher priced TVs have better processing which allow them to keep the same color temp throughout brightness changes. But yes I know, that $2000 TV can't be any better than that entry level $800 TV, it's just marketing gobbledygook (sp?)(or maybe you would throw in a curse word that is misspelled so it doesn't get edited out, shows maturity).
 