Game of Thrones cinematographer says it's your fault The Long Night was too dark

My plasma TV handled the blacks great. Not sure what everyone is complaining about. It was dark, but clear!!!
 
As I understand the ideal "home theater room," one should be able to darken the room rather than brighten the set. Barring that, yes, a brighter set would be good as long as it is calibrated.
As I mentioned earlier in my reply to Julio's post, I'm seriously thinking about blacking out the wall behind my home theater rig, especially since I got rid of my nasty silver-grilled JBL "Stadiums" and bought a pair of (sort of) equivalent Klipsch towers (jet black, 2 x 8" + horn) to replace them.

Since I consider daytime TV a waste of time, and never draw my blinds anyway, maximum brightness from my TV isn't much of a concern.

That said, HDR would probably help, too, but most sets these days do not have the 4,000 nits peak brightness levels to take advantage of what HDR can really do.
Maybe, but light value measured photographically is logarithmic in base 2: each f-stop doubles or halves the light. So a 7-stop difference would be either 128 times the light, or the reciprocal, 1/128th. Noon daylight equals an exposure of roughly 1/100 sec @ f/16 at ISO 100. You can sort of extrapolate that the lower the maximum light value you start with, the less contrast is available before you hit black.
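To make that stop arithmetic concrete, here is a minimal Python sketch (just the math described above, nothing from any grading pipeline):

import math

def light_ratio(stops):
    # Each f-stop doubles (or halves) the light, so the ratio is 2**stops.
    return 2.0 ** stops

def exposure_value(f_number, shutter_seconds):
    # APEX exposure value at ISO 100: EV = log2(N^2 / t).
    return math.log2(f_number ** 2 / shutter_seconds)

print(light_ratio(7))              # 128.0 -> 7 stops brighter = 128x the light
print(light_ratio(-7))             # 0.0078125 -> 7 stops darker = 1/128th
print(exposure_value(16, 1 / 100)) # ~14.6, the noon f/16 @ 1/100 sec exposure above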

Accordingly, an under-lit set would be very "flat," or low contrast. In the picture, the high reflectance of Daenerys's white coat (white can reflect 90+ % of the light that hits it) works out fine in low light. But for the ground and the combatants in dark clothing, the reflectance drops radically, quickly leaving no differentiation between them and maximum black.

Basically, while you could go out at noon, underexpose by 7 stops, and still get some detail in the highlights, that would in no way hold true doing the same thing by moonlight. A full moon in and of itself is fairly bright, something like f/8 @ 1/125 sec at ISO 100, but since it only reflects light and doesn't generate it, the inverse square law kicks in and we get very little illumination from it here on earth. By comparison, the sun is a burn-the-eyeballs-right-out-of-your-head light source.
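For a rough sense of just how big that gap is, here is a back-of-the-envelope Python sketch; the illuminance figures (around 100,000 lux for direct noon sun, roughly 0.2 lux under a full moon) are generic textbook ballpark numbers, not anything measured on the production:

import math

sunlight_lux = 100_000   # direct noon sun, ballpark
moonlight_lux = 0.2      # full-moon illumination on the ground, ballpark

ratio = sunlight_lux / moonlight_lux
stops = math.log2(ratio)
print(f"Sun vs. moonlight: about {ratio:,.0f}x, or roughly {stops:.0f} stops")
# about 500,000x, roughly 19 stops -- far beyond the 7-stop underexposure example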

Contrast is the difference between the maximum exposure value, the "highlights," and the darkest part of the scene, the "shadows." In other words, black is always black, but arguably "white" is the variable, and the available contrast ratio diminishes as the value of white decreases.
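A quick numerical illustration of that last point, using invented but plausible numbers (a fixed black floor of 0.05 nits and a falling white level):

black_floor_nits = 0.05   # hypothetical residual black level of the display/room

for white_nits in (100, 10, 1, 0.5):
    contrast = white_nits / black_floor_nits
    print(f"white = {white_nits:>5} nits -> contrast ratio {contrast:,.0f}:1")
# 100 nits gives 2,000:1, but at 0.5 nits only 10:1 -- almost no range left to
# separate dark costumes and ground from maximum black.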

Point being, if this fool at HBO wants to try photographing by moonlight, then starts blowing smoke about how "nobody was holding their TV correctly," he's full of sh!t.

It's also quite disingenuous, and dare I say snobbish, to photograph the scene so that only those with the newest and best equipment can view the program properly, as those individuals likely make up a fairly small percentage of the 18,000,000-odd viewers who watched it. Methinks it might be better to tell those with HDR OLED sets that if they want true moonlight action, they should turn down their brightness.

Once upon a time, music studio engineers would make their final mix on some kind of garbage table radio, since the majority of the music would be played back on the same kind of device, or on a trashy single-speaker car radio. It made no sense to mix for the high-end audio of the period when the teens listening to it didn't have access to such luxuries. I think the same principle applies to this issue.

I hope I succeeded in somehow making some of that make sense.
 
Well, I am now in the 'vocal minority'. I watched GoT S8E3 on my old monitor. I will re-watch it on my old entry-level Samsung 4K TV and see if there is an improvement.

Edit: I played with my old Acer monitor's settings (switched the profile from the default 'User' to 'Movie') and the 'dark scenes' are now 'more visible'. Dafuq! HBO is right.

We who are supposed to be knowledgeable should set our monitors and TV screens according to the content (movie, still-picture slideshow, normal desktop, gaming, etc.) and NOT use one setting for all content.
 
Watched this on a Sony XE9000 (UK) on Sky Q (Sky Atlantic) in HD. Had the curtains drawn and it was fine.

Only one problem, and one that Julio mentions: the colour banding in the darkest images was pretty awful. Huge black space taking up half the screen at times, made of three colours at most, and those were hugely pixelated. I look forward to seeing it in HDR at some point, if they ever release a 4K UHD version. Or just a Blu-ray for less compression.
 
I think the darkness worked very well for the dramatic tension; it's the final battle with death personified in the long winter's night, right? Usually it's a cheap horror-style trick, but here I liked it.

It doesn't work well with H.264 High Profile (or similar codec) streaming, because the dark parts of the image are where the bits are saved; the viewer shouldn't be able to see that, but when the brightness is cranked up, it comes out.
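A toy Python illustration of why those crushed, heavily compressed shadows band (this only simulates coarse quantization of a dark ramp; real H.264 behaviour is block-based and far more involved):

# Pretend only 16 levels survive across the full signal range after compression.
step = 1 / 16
ramp = [i / 255 for i in range(40)]                # a smooth dark gradient, 0..~15% signal
banded = sorted({round(v / step) * step for v in ramp})
print(banded)   # [0.0, 0.0625, 0.125] -- 40 distinct shades collapse to 3 visible bands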

I watch movies in a darkened room with a small light behind the screen. The factory calibration of the cinema profile on my Panasonic TV seems pretty good with just a few adjustments to the brightness and backlight.

Professional calibration should be the way to go, but if it's not an option, try to avoid any "dynamic" or "intelligent" display profiles, since they alter the image and have nothing in common with how professional mastering monitors display it.

I also think the very good sound is part of why this episode worked so well, but that's another story...
 
Probably they are right. We should have already gone out and bought a 100" OLED TV, set the brightness at 200%, and put gamma at the highest possible value. That way we can enjoy the new "The Long Day" episode.

You don't need to do that either. You can calibrate any TV set to view the show properly.
 
Point being, if this fool at HBO wants to try photographing by moonlight, then starts blowing smoke about how "nobody was holding their TV correctly," he's full of sh!t.
I've seen some folks say that they made it dark so they wouldn't have to spend so much money on CGI. I get that darkness can add to the tension but there were many times I had no idea what was happening. Who just got pulled to their death by the wights? Whose dragon is that in the sky? What the heck is Arya doing running around inside a dark castle?
Maybe they wanted me to be all confused but I'm not buying that explanation. They could have lit the battle a lot better or had random explosions to briefly show the scale of the enemy but it would have cost more to show that - better to show them as pixelated blurred shapes rushing past everyone at Mach 2.
And since when can wights move that fast anyway?
SPOILERS AHEAD!!!
Do not get me started on the defenders' battle tactics. There isn't a decent general among them. Cavalry wiped out in a frontal charge against an invisible but vastly superior enemy. Main elite force wiped out by being unable to retreat. No defence on their flanks, where the cavalry should have been. No way to light up the battlefield with their mighty trebuchets. No archers to weaken the enemy hitting the main force. And no way to repel the enemy scaling the walls, with burning pitch or rocks. You'd swear the producers wanted all of the defenders to be wiped out.
 
I've seen some folks say that they made it dark so they wouldn't have to spend so much money on CGI. I get that darkness can add to the tension but there were many times I had no idea what was happening. Who just got pulled to their death by the wights? Whose dragon is that in the sky? What the heck is Arya doing running around inside a dark castle?
Maybe they wanted me to be all confused but I'm not buying that explanation. They could have lit the battle a lot better or had random explosions to briefly show the scale of the enemy but it would have cost more to show that - better to show them as pixelated blurred shapes rushing past everyone at Mach 2.
And since when can wights move that fast anyway?
SPOILERS AHEAD!!!
Do not get me started on the defenders' battle tactics. There isn't a decent general among them. Cavalry wiped out in a frontal charge against an invisible but vastly superior enemy. Main elite force wiped out by being unable to retreat. No defence on their flanks, where the cavalry should have been. No way to light up the battlefield with their mighty trebuchets. No archers to weaken the enemy hitting the main force. And no way to repel the enemy scaling the walls, with burning pitch or rocks. You'd swear the producers wanted all of the defenders to be wiped out.

All you say seems reasonable; I'd just like to point out that it's a TV show where dragons fight animated corpses :)
 
@captaincranky My statement about 4,000 nits apparently comes from the fact that that is the peak brightness of the monitors currently used to master Dolby Vision. Here is some reading material on the subject from one of the big names in display calibration, SpectraCal: http://files.spectracal.com/Documents/White Papers/HDR_Demystified.pdf
It is rather technical, but a decent read.

And I agree with you that the director is a nit - especially given that he knows it was on HBO and that various cable/streaming services apply varying amounts of compression and thus degrade his pristine version before it hits the viewer's display.
We who are supposed to be knowledgeable should set our monitors and TV screens according to the content (movie, still-picture slideshow, normal desktop, gaming, etc.) and NOT use one setting for all content.
Speaking from my experience in the industry -

With the current technology in the industry in general, that simply cannot be done, because each broadcaster (read that as Cox, Spectrum, Verizon, HBO Go, Netflix, etc.) applies their own level of compression and, to use their marketing term, "optimization" to the signal before it is sent out to you, the consumer.

What would be required is some sort of added reference frame that you could show on your display, measure with a spectrophotometer, and use to develop a profile that could then be fed to the monitor to calibrate it.

That simply does not exist at this point, so this is impossible to do.

You don't need to do that either. You can calibrate any TV set to view the show properly.
What can be done is to calibrate to one source, say using a calibration disk. While that gets you in the ballpark, so to speak, because of the above (i.e., every broadcaster applies their own massaging of the material before they rebroadcast it), as soon as you go to another source (heck, even another disk in the exact same player), the display is no longer truly calibrated to that source material.

I agree: it is possible to calibrate the display, but that calibration will only be in the ballpark. For it to match each source, or for that matter each movie, TV show, etc., one would need the calibration reference that was used to master whatever it is you are watching.

Right now, we, as consumers, simply do not have access to that material, and thus calibrating so that each and every movie, TV show, whatever, is seen on our displays as it was intended by the director is impossible.

So here is another example that perhaps will make this clear.

To truly calibrate a scanner to display properly on a monitor, you need to scan a reference target in the scanner, display that on the monitor, then take precise measurements of it, while it is displayed, with a photometer attached to your screen. Those measurements are then translated into an ICC profile that your monitor can use so that it accurately matches your scanner. Such a target is pictured at this link: https://www.filmscanner.info/en/Scannerkalibrierung.html

Doing anything less than this means that the monitor is not truly calibrated.

So, now you have calibrated your monitor to that scanner. Will your monitor's calibration now hold for other sources, say, your DSLR?

Of course, that is a loaded question. The answer is no, it will not. To calibrate your monitor to your DSLR, you need to photograph that target with your DSLR and repeat the process of measurements to obtain an ICC profile that can then be fed to your monitor so that your photos from your DSLR will display accurately on your monitor - with the caveat that as soon as you change something in your DSLR like white balance, your calibration is no longer good.

So, one more time: since we do not have access to similar calibration targets from directors, HBO, Netflix, or whatever your source is, the absolute best that we as consumers can do is calibrate from a disk. That gives ballpark calibration at best, which is better than nothing, but not ideal.

Edit: Good calibration materials, like Video Essentials, explain exactly this. It is included in the reference material in Video Essentials (and NO, I am not an employee, just a satisfied user of Video Essentials).
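For anyone curious what that measure-then-correct idea looks like in miniature, here is a purely illustrative Python sketch with invented gray-patch numbers (a real ICC profile is far more sophisticated than a single gray ramp):

# Reference gray patches (target values) vs. what a device actually measured,
# both normalized 0..1. All numbers here are made up for illustration.
reference = [0.0, 0.25, 0.50, 0.75, 1.0]
measured  = [0.0, 0.31, 0.58, 0.80, 1.0]   # this device renders midtones too bright

def correct(value):
    # Piecewise-linear lookup: map a measured value back toward the reference.
    for i in range(len(measured) - 1):
        lo, hi = measured[i], measured[i + 1]
        if lo <= value <= hi:
            t = (value - lo) / (hi - lo)
            return reference[i] + t * (reference[i + 1] - reference[i])
    return value

print(round(correct(0.58), 3))   # 0.5 -- the over-bright midtone gets pulled back down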
 
I am with you; I watched on DirecTV. So much of it was screwed up by their compression. All the clouds and smoke turned out really horrible. I watch in a black room with the TV as my only light source. With fast motion it can't keep up either, and the picture gets heavily pixelated. It happens on most action shows.
 
What can be done is to calibrate to one source, say using a calibration disk. While that gets you in the ballpark, so to speak, because of the above (i.e., every broadcaster applies their own massaging of the material before they rebroadcast it), as soon as you go to another source (heck, even another disk in the exact same player), the display is no longer truly calibrated to that source material.

I agree: it is possible to calibrate the display, but that calibration will only be in the ballpark. For it to match each source, or for that matter each movie, TV show, etc., one would need the calibration reference that was used to master whatever it is you are watching.

Right now, we, as consumers, simply do not have access to that material, and thus calibrating so that each and every movie, TV show, whatever, is seen on our displays as it was intended by the director is impossible.

So here is another example that perhaps will make this clear.

To truly calibrate a scanner to display properly on a monitor, you need to scan a reference target in the scanner, display that on the monitor, then take precise measurements of it, while it is displayed, with a photometer attached to your screen. Those measurements are then translated into an ICC profile that your monitor can use so that it accurately matches your scanner. Such a target is pictured at this link: https://www.filmscanner.info/en/Scannerkalibrierung.html

Doing anything less than this means that the monitor is not truly calibrated.

So, now you have calibrated your monitor to that scanner. Will your monitor's calibration now hold for other sources, say, your DSLR?

Of course, that is a loaded question. The answer is no, it will not. To calibrate your monitor to your DSLR, you need to photograph that target with your DSLR and repeat the process of measurements to obtain an ICC profile that can then be fed to your monitor so that your photos from your DSLR will display accurately on your monitor - with the caveat that as soon as you change something in your DSLR like white balance, your calibration is no longer good.

So, one more time: since we do not have access to similar calibration targets from directors, HBO, Netflix, or whatever your source is, the absolute best that we as consumers can do is calibrate from a disk. That gives ballpark calibration at best, which is better than nothing, but not ideal.

Edit: Good calibration materials, like Video Essentials, explain exactly this. It is included in the reference material in Video Essentials (and NO, I am not an employee, just a satisfied user of Video Essentials).

I don't know why you bothered going through professional calibration examples. This isn't a professional printing studio trying to get their screen to exactly match their commercial printer's output. Some variance on consumer equipment is perfectly acceptable and fine.

My point was that a screen with basic reference calibration is far better than an uncalibrated one. I never said it was perfect for professional use.

You should consider the context before posting; very few people care that the maroon drapes in the Red Keep are off by Delta E 1, a difference that is barely perceivable even by a graphics professional staring directly at the drapes vs. the reference.
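For anyone wondering what that Delta E figure actually is: it's just a distance in L*a*b* color space. A minimal Python version of the original CIE76 formula (newer formulas like Delta E 2000 are more involved) looks like this, with two made-up "maroon drape" readings:

import math

def delta_e_76(lab1, lab2):
    # CIE76 color difference: Euclidean distance between two L*a*b* colors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

print(round(delta_e_76((32.0, 45.0, 20.0), (32.5, 45.6, 20.6)), 2))
# 0.98 -- around Delta E 1, right at the edge of what a trained eye can spot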
 
I don't know why you bothered going through professional calibration examples. This isn't a professional printing studio trying to get their screen to exactly match their commercial printer's output. Some variance on consumer equipment is perfectly acceptable and fine.

My point was that a screen with basic reference calibration is far better than an uncalibrated one. I never said it was perfect for professional use.

You should consider the context before posting; very few people care that the maroon drapes in the Red Keep are off by Delta E 1, a difference that is barely perceivable even by a graphics professional staring directly at the drapes vs. the reference.
Granted, both of you have made some excellent points.

However, the "loose nut between the steering wheel and the seat" is generally the studio engineer. I had two years of college photography (actually 13th & 14th grades), which was fairly rigorous in determining correct color balance by eye and in printing on photographic color paper, to the point where the instructor agrees with your assessment. The factors which affect the end result are the same across all visual disciplines.

OK, myself and any number of different color engineers at various broadcast studios are likely to differ in taste. So, no two simultaneous newscasts on different channels look the same, a situation aggravated by the fact that their basic color themes differ in and of themselves.

As for me sitting at home and diving into the picture controls each time I change channels, that is patently and completely out of the question. I make an overall adjustment to brightness, contrast, color saturation, and tonal balance, and flip through various source material until I feel I have it nailed down in a "one picture adjustment fits all" (for the most part) kind of way.

Broadcast TV is in many ways better than cable or streaming, since it doesn't suffer from all the over-compression nonsense, until of course you get into the small-town and secondary channels. FWIW, it's also either 720p or 1080i on the primary channels.

But standards must be met at some level. I am even more unwilling to d!ck around with my TV's picture controls when I go from one episode of the same show to the next.

In perhaps simpler terms, if I watch "Law & Order: Special Victims Unit" one week, I expect Ms. Hargitay's skin tone and color to be damned near identical the next time I turn the show on.

The same should apply to GoT without exception.
 
Granted, both of you have made some excellent points.

However, the "loose nut between the steering wheel and the seat" is generally the studio engineer. I had two years of college photography (actually 13th & 14th grades), which was fairly rigorous in determining correct color balance by eye and in printing on photographic color paper, to the point where the instructor agrees with your assessment. The factors which affect the end result are the same across all visual disciplines.

OK, myself and any number of different color engineers at various broadcast studios are likely to differ in taste. So, no two simultaneous newscasts on different channels look the same, a situation aggravated by the fact that their basic color themes differ in and of themselves.

As for me sitting at home and diving into the picture controls each time I change channels, that is patently and completely out of the question. I make an overall adjustment to brightness, contrast, color saturation, and tonal balance, and flip through various source material until I feel I have it nailed down in a "one picture adjustment fits all" (for the most part) kind of way.

Broadcast TV is in many ways better than cable or streaming, since it doesn't suffer from all the over-compression nonsense, until of course you get into the small-town and secondary channels. FWIW, it's also either 720p or 1080i on the primary channels.

But standards must be met at some level. I am even more unwilling to d!ck around with my TV's picture controls when I go from one episode of the same show to the next.

In perhaps simpler terms, if I watch "Law & Order: Special Victims Unit" one week, I expect Ms. Hargitay's skin tone and color to be damned near identical the next time I turn the show on.

The same should apply to GoT without exception.

I got you. Thanks for the backstory and excellent explanation. (y) (Y)
 
Any TV that has backlight bleed is not very good for watching dark scenes. My 34" AH-IPS 3412UM is one of them.

I would wait, buy the whole GoT series on 4K Blu-ray, and get a Panasonic professional-grade OLED TV to watch it.

Something like the GZ2000 OLED, because Korean TVs like Samsung often exaggerate the colors to give a punchy, vibrant image to WOW you, but they're inaccurate when it comes to cinematic accuracy, shadow detail, etc.

2018 TV Shootout Results: Panasonic, LG & Sony OLED vs Samsung QLED
Look at 7:16
 
BS! I've been a broadcast engineer for 20 years, around 10 of them in MCR (Master Control), and I know video and audio standards in all formats, including QC. I saw the 40-minute documentary on how the episode was shot, and there is enough light on the set; for obvious CINEMATIC REASONS they decided to tone it down, and it is too dark, the black crushed beyond reasonable explanation. I believe the raw material is completely OK and all this darkness was added in post production. I understand what they wanted to express, but it was really way too much, and believe me, I know how to set up my two LG screens correctly, one LED + HDR, the other OLED Dolby Vision. It's too freaking dark: if you play with brightness or contrast you just make it worse; either everything goes grayish and the colors get washed out, or it looks as it should but then it's so dark 75% of the time, with some extremely bright spots (fire sources), that it's very annoying. No balance at all. I call it a sloppy ****ery of a job, and the HBO/GoT team should take into account that this is meant for TV/streaming, where the cinematic approach doesn't really fit. In other words, they ****ed up massively in post production and QC.
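As a rough illustration of the "crushed blacks" complaint (this is not the broadcast QC tooling being described above, just a crude Python check with made-up numbers), one simple metric is how much of a frame sits at or below video black:

def crushed_fraction(luma_samples, threshold=16):
    # Fraction of 8-bit luma samples at or below a near-black threshold.
    # 16 is black in limited-range video, so anything pinned there carries no detail.
    samples = list(luma_samples)
    return sum(1 for y in samples if y <= threshold) / len(samples)

# Hypothetical frame: three quarters of the samples pinned at video black.
fake_frame = [16] * 750 + [60] * 200 + [235] * 50
print(f"{crushed_fraction(fake_frame):.0%} of the frame is at or below black")   # 75%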
 
BS! I've been a broadcast engineer for 20 years, around 10 of them in MCR (Master Control), and I know video and audio standards in all formats, including QC. I saw the 40-minute documentary on how the episode was shot, and there is enough light on the set; for obvious CINEMATIC REASONS they decided to tone it down, and it is too dark, the black crushed beyond reasonable explanation. I believe the raw material is completely OK and all this darkness was added in post production. I understand what they wanted to express, but it was really way too much, and believe me, I know how to set up my two LG screens correctly, one LED + HDR, the other OLED Dolby Vision. It's too freaking dark: if you play with brightness or contrast you just make it worse; either everything goes grayish and the colors get washed out, or it looks as it should but then it's so dark 75% of the time, with some extremely bright spots (fire sources), that it's very annoying. No balance at all. I call it a sloppy ****ery of a job, and the HBO/GoT team should take into account that this is meant for TV/streaming, where the cinematic approach doesn't really fit. In other words, they ****ed up massively in post production and QC.
Thank you!!!!!!!!!
 
BS! I've been a broadcast engineer for 20 years, around 10 of them in MCR (Master Control), and I know video and audio standards in all formats, including QC. I saw the 40-minute documentary on how the episode was shot, and there is enough light on the set; for obvious CINEMATIC REASONS they decided to tone it down, and it is too dark, the black crushed beyond reasonable explanation. I believe the raw material is completely OK and all this darkness was added in post production. I understand what they wanted to express, but it was really way too much, and believe me, I know how to set up my two LG screens correctly, one LED + HDR, the other OLED Dolby Vision. It's too freaking dark: if you play with brightness or contrast you just make it worse; either everything goes grayish and the colors get washed out, or it looks as it should but then it's so dark 75% of the time, with some extremely bright spots (fire sources), that it's very annoying. No balance at all. I call it a sloppy ****ery of a job, and the HBO/GoT team should take into account that this is meant for TV/streaming, where the cinematic approach doesn't really fit. In other words, they ****ed up massively in post production and QC.

I agree. It's nice when people know what they are talking about.

It reminds me of one episode of "Air Crash Investigations" when a minuscule movement of the control stick switched off the autopilot. The AP played a very quiet audible alert to notify the crew. Also, its little indicator light went off. But the pilots didn't notice, because there are like 500 indicators lit in the cockpit.

The plane crashed because the pilots thought the autopilot was still on, and when they figured out it wasn't, they were too low.

Now... some pilots argued that it was pilot error, because they should have been flying the airplane. Which sounds logical. On the other hand, as a Windows GUI developer, I'm pretty sure it's the airplane's design fault. The autopilot is something like a third pilot: you assign it a task and you expect it to carry out that task.

If it can't, then it should clearly and loudly inform you that it's not in control anymore and that you should take over. Sure... maybe the designers wanted to make the "third crew member" a bit more subtle, but often it's better to be clear and loud than to count on subtle artistic effects.

I think in this HBO episode they just went too artistic and unrealistic. They were making a video for their superb-quality internal theaters, to impress the management and the test audience, instead of taking into account the limitations of the average viewer's display. Decoupled from reality, a frequent problem with artists.
 
....[ ]...I think in this HBO episode they just went too artistic and unrealistic. They were making a video for their superb-quality internal theaters, to impress the management and the test audience, instead of taking into account the limitations of the average viewer's display. Decoupled from reality, a frequent problem with artists.
Or we could forgo your altogether too abstract air crash analogy, which is in and of itself "detached from reality," and concentrate on an altogether more plausible "conspiracy theory," which is:

HBO is more than likely aware of issues of streaming compression, and how many times their show was, and will be, pirated.

Accordingly, "if you want to see the show as the artist intended", you'll have to double down, and pay for the full hi-def, (minimum), Blu-ray set, with the possibility it might also be made available in 4K as well.

So, suckas, they "GoT" you to pay the streaming and subscription fees, and now you've "GoT" to pay twice to see what amounts to a "rerun," shown the way it should have been in the first place... :rolleyes:
 
OK, myself and any number of different color engineers at various broadcast studios are likely to differ in taste. So, no two simultaneous newscasts on different channels look the same, a situation aggravated by the fact that their basic color themes differ in and of themselves.
Ah, Joe Kane's quip in so many words. Never Twice the Same Color. :eek:

From the responses in the thread, it's obvious that there were differing experiences with this episode, from "it looked great to me" to "it looked like :poop:".

To me, it sounds like even a calibrated TV was not going to guarantee that this episode showed all the director had intended.

I don't know why you bothered going through professional calibration examples. This isn't a professional printing studio trying to get their screen to exactly match their commercial printer's output. Some variance on consumer equipment is perfectly acceptable and fine.

My point was that a screen with basic reference calibration is far better than an uncalibrated one. I never said it was perfect for professional use.

You should consider the context before posting; very few people care that the maroon drapes in the Red Keep are off by Delta E 1, a difference that is barely perceivable even by a graphics professional staring directly at the drapes vs. the reference.
I am sorry you feel that way.

We are on the same page about basic calibration being better than nothing.

I said what I said to emphasize that every source is different, even at a professional level; broadcasters certainly are, or at least should be, IMO, at that level.

Unfortunately, with this episode, it sounds like even on a calibrated display, there was no guarantee that the episode was going to look good, and telling people who may be less knowledgeable about the topic that calibrating their display is all they need to get a great picture is, to me, misleading.

In a way, it is unfortunate that stations no longer broadcast the color bar pattern during "off hours," as that could be used to calibrate a display to each channel, if one so desired, though personally I cannot imagine anyone going to all that trouble.
 
I've seen some folks say that they made it dark so they wouldn't have to spend so much money on CGI. I get that darkness can add to the tension but there were many times I had no idea what was happening. Who just got pulled to their death by the wights? Whose dragon is that in the sky? What the heck is Arya doing running around inside a dark castle?
Maybe they wanted me to be all confused but I'm not buying that explanation. They could have lit the battle a lot better or had random explosions to briefly show the scale of the enemy but it would have cost more to show that - better to show them as pixelated blurred shapes rushing past everyone at Mach 2.
And since when can wights move that fast anyway?
My TV is tuned and actually professionally calibrated. I'm not saying the whole episode was bad, but there were areas with too much motion blur and darkness, like they were trying to hide bad CGI.
I see you both came to very similar conclusions about CGI, and its expense, in conjunction with motion blur and its utility.

IMHO, the classic example of using motion blur to cover for elaborate FX strategies and/or impossibilities is ALL of the movies in the "Transformers" series. While the much simpler children's toys do actually turn from car to robot, a lot of liberty with size, shape, and fit is being taken on the silver screen. It's very, very noticeable if you look closely at the "transformations".

Do not get me started on the defenders' battle tactics. There isn't a decent general among them. Cavalry wiped out in a frontal charge against an invisible but vastly superior enemy. Main elite force wiped out by being unable to retreat. No defence on their flanks, where the cavalry should have been. No way to light up the battlefield with their mighty trebuchets. No archers to weaken the enemy hitting the main force. And no way to repel the enemy scaling the walls, with burning pitch or rocks. You'd swear the producers wanted all of the defenders to be wiped out.
Since the motto of the series is "all men must die," and the show is very, very dark, I have to wonder why this comes as a surprise to you.

One possible finale is that the king of the White Walkers sits on the Iron Throne. Hey, could happen.
 