As I mentioned earlier in my reply to Julio's post, I'm seriously thinking about blacking out the wall behind my home theater rig, now that I've gotten rid of my nasty silver-grilled JBL "Stadiums" and bought a pair of (sort of) equivalent Klipsch towers (jet black, 2 x 8" + horn) to replace them.

As I understand the ideal "home theater room," one should be able to darken the room rather than brighten the set. Barring that, yes, a brighter set would be good, as long as it is calibrated.
Maybe, but light value measured photographically is logarithmic, base 2: each f-stop doubles or halves the light. So a 7-stop difference would be either 128 times the light, or the reciprocal, 1/128th. Noon daylight equals an exposure of 1/100 sec @ f/16. You can sort of interpolate that the lower the maximum light value you start with, the less contrast is available before you hit black.

That said, HDR would probably help too, but most sets these days do not have the 4,000-nit peak brightness to take advantage of what HDR can really do.
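To put numbers on the f-stop arithmetic above, here's a tiny sketch (plain Python; the 7-stop figure is from the post, the rest is standard exposure math):

```python
# Each f-stop doubles (or halves) the amount of light, so an N-stop
# difference corresponds to a factor of 2**N.

def light_ratio(stops: float) -> float:
    """Light ratio for a given difference in f-stops."""
    return 2.0 ** stops

# The post's example: a 7-stop difference is 2^7 = 128x the light,
# or 1/128th going the other direction.
print(light_ratio(7))      # 128.0
print(1 / light_ratio(7))  # 0.0078125 (= 1/128)
```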
They are probably right. We should all have gone out and bought a 100" OLED TV, set the brightness to 200%, and cranked gamma to the highest possible value. That way we could enjoy the new "The Long Night" episode.
Point being, if this fool at HBO wants to try photographing by moonlight, then starts blowing smoke about how "nobody was holding their TV correctly," he's full of sh!t.
I've seen some folks say that they made it dark so they wouldn't have to spend so much money on CGI. I get that darkness can add to the tension but there were many times I had no idea what was happening. Who just got pulled to their death by the wights? Whose dragon is that in the sky? What the heck is Arya doing running around inside a dark castle?
Maybe they wanted me to be all confused but I'm not buying that explanation. They could have lit the battle a lot better or had random explosions to briefly show the scale of the enemy but it would have cost more to show that - better to show them as pixelated blurred shapes rushing past everyone at Mach 2.
And since when can wights move that fast anyway?
SPOILERS AHEAD!!!
Do not get me started on the defenders' battle tactics. There isn't a decent general among them. Cavalry wiped out in a frontal charge against an invisible but vastly superior enemy. Main elite force wiped out by being unable to retreat. No defence on their flanks, where the cavalry should have been. No way to light up the battlefield with their mighty trebuchets. No archers to weaken the enemy hitting the main force. And no way to repel the enemy scaling the walls, with burning pitch or rocks. You'd swear the producers wanted all of the defenders to be wiped out.
Speaking from my experience in the industry: we who are supposed to be knowledgeable should set our monitors and TV screens according to content (movie, still-picture slideshow, normal desktop, gaming, etc.) and NOT use one setting for all content.
You don't need to do that either. You can calibrate any TV set to view the show properly.
Same here! I haven't yet seen a single episode of GoT... ooops...
What can be done is to calibrate to one source, say using a calibration disk. While that gets you in the ballpark, so to speak, every broadcaster applies their own massaging of the material before they rebroadcast it, so as soon as you go to another source (heck, even another disk in the exact same player), the display is no longer truly calibrated to that source material.
I agree - it is possible to calibrate the display, but that calibration will only be in the ballpark. For it to match each source, or for that matter each movie, TV show, etc., one would need the calibration reference that was used to master whatever it is you are watching.
Right now, we, as consumers, simply do not have access to that material, and thus calibrating so that each and every movie, TV show, whatever, is seen on our displays as it was intended by the director is impossible.
So here is another example that perhaps will make this clear.
To truly calibrate a scanner to display properly on a monitor, you need to scan a reference target in the scanner, display it on the monitor, then take precise measurements of that reference with a photometer attached to your screen while it is displayed. Those measurements are then translated into an ICC profile that your monitor can use so that it accurately matches your scanner. Such a target is pictured at this link: https://www.filmscanner.info/en/Scannerkalibrierung.html
Doing anything less than this means that the monitor is not truly calibrated.
So, now you have calibrated your monitor to that scanner. Will your monitor's calibration now hold for other sources, say, your DSLR?
Of course, that is a loaded question. The answer is no, it will not. To calibrate your monitor to your DSLR, you need to photograph that target with your DSLR and repeat the measurement process to obtain an ICC profile that can then be fed to your monitor, so that photos from your DSLR display accurately - with the caveat that as soon as you change something in your DSLR, like white balance, your calibration is no longer good.
So, one more time: since we do not have access to similar calibration targets from directors, HBO, Netflix, you name your source, the absolute best that we as consumers can do is calibrate from a disk. That gives ballpark calibration at best - which is better than nothing, but not ideal.
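The measure-a-reference idea above can be sketched in a few lines. This is only a toy illustration of the principle behind profiling (a real ICC profile models full 3D color, not a single gray ramp), and all patch values are invented:

```python
# Toy "profiling": given the drive levels sent for each reference patch and
# what the photometer actually measured, build a correction that tells you
# which drive level to send to hit a desired measured value.
# All patch values below are invented for illustration.

def build_correction(reference, measured):
    """Return a function mapping a target measured value to the drive
    level needed, by linear interpolation between measured patches."""
    pairs = sorted(zip(measured, reference))
    def correct(target):
        for (m0, r0), (m1, r1) in zip(pairs, pairs[1:]):
            if m0 <= target <= m1:
                t = (target - m0) / (m1 - m0)
                return r0 + t * (r1 - r0)
        raise ValueError("target outside measured range")
    return correct

reference = [0, 64, 128, 192, 255]  # drive levels sent to the display
measured  = [0, 50, 110, 180, 255]  # what the photometer read back
correct = build_correction(reference, measured)

# Mid-grays measure too dark, so hitting a measured 110 needs drive 128:
print(correct(110))  # 128.0
```

As soon as the display (or the source) changes, these measurements are stale - which is exactly the point being made above.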
Edit: Good calibration materials, like Video Essentials, explain exactly this. It is included in the reference material in Video Essentials (and NO, I am not an employee, just a satisfied user of Video Essentials).
I don't know why you bothered going through professional calibration examples. This isn't a professional printing studio trying to get their screen to exactly match their commercial printer's output. Some variance on consumer equipment is perfectly acceptable and fine.
My point was that a screen with basic reference calibration is far better than an uncalibrated one. I never said it was perfect for professional use.
You should consider the context before posting. Very few people care that the maroon drapes in the Red Keep are within Delta E 1 of reference - a difference that is barely perceivable even by a graphics professional staring directly at the drapes next to the reference.
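For anyone wondering what "Delta E 1" means: Delta E, in its original CIE76 form, is just the Euclidean distance between two colors in CIELAB space, and a value around 1 is roughly the smallest difference a trained eye can spot side by side. A quick sketch (the Lab values are invented):

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 Delta E: Euclidean distance between two CIELAB colors.
    (Newer formulas like CIEDE2000 weight the terms differently;
    CIE76 is the simple original.)"""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Two invented, nearly identical maroons in (L*, a*, b*):
onscreen  = (30.0, 45.0, 20.0)
reference = (30.5, 45.5, 19.5)
print(round(delta_e_cie76(onscreen, reference), 3))  # 0.866 -> under Delta E 1
```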
granted that both of you have made some excellent points.
However, the "loose nut between the steering wheel and the seat" is generally the studio engineer. I had two years of college photography (actually 13th & 14th grades), which was fairly rigorous in determining correct color balance by eye and in printing on photographic color paper, getting to the point where the instructor agrees with your assessment. The factors which affect the end result are the same through all visual disciplines.
OK: myself, and any one of a number of different color engineers at various broadcast studios, are likely to differ in taste. So no two simultaneous newscasts on different channels look the same, a situation aggravated by the fact that their basic color themes are different in and of themselves.
As for me sitting at home and flying into the picture controls each time I change channels - that is patently and completely out of the question. I make an overall adjustment in brightness, contrast, color saturation, and tonal balance, and flip through various source material until I feel I have it nailed down in a "one picture adjustment fits all" (for the most part) kind of way.
Broadcast TV is in many ways better than cable or streaming, since it doesn't suffer from all the over-compression nonsense - until, of course, you get into the small-town and secondary channels. FWIW, it's also either 720p or 1080i on the primary channels.
But standards must be met at some level. I am even more unwilling to d!ck around with my TV's picture controls when I go from one episode of the same show to the next.
In perhaps simpler terms, if I watch "Law and Order: Special Victims Unit" one week, I expect Ms. Hargitay's skin tone and color to be damned near identical the next time I turn the show on.
The same should apply to GoT, without exception.
My TV is tuned and actually professionally calibrated. I'm not saying the whole episode was bad, but there were areas of too much motion blur and darkness, like they were trying to hide bad CGI.

"I didn't have an issue on my television or my iPad. Just tune your TVs, guys."
Thank you!!!
BS! I've been a broadcast engineer for 20 years, around 10 of those as an MCR (Master Control) engineer, and I know video and audio standards in all formats, including QC. I saw the 40-minute documentary on how the episode was shot, and there is enough light on the set; for obvious CINEMATIC REASONS they decided to tone it down, and it is too dark - the black is crushed beyond reasonable explanation. I believe the raw material is completely OK and all this darkness was added in post-production.

I understand what they wanted to express, but that was really way too much. And believe me, I know how to set up my two LG screens correctly - one LED + HDR, the other OLED Dolby Vision. It's too freaking dark, and if you play with brightness or contrast you just make it worse: either everything goes grayish and the colors wash out, or you leave it as intended and 75% of the time it is so dark, with some extremely bright spots (fire sources), which is very annoying - no balance at all. I call it a slappy ****ery job, and the HBO/GoT team should take into account that it is meant for TV/streaming, where the "cinematic" part doesn't really fit. In other words, they ****ed up massively in post-production and QC.
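"Crushed black" is measurable, not just a matter of taste. In 8-bit limited-range video, reference black is code 16; if the grade clipped shadow detail, pixels pile up at or below that point. A crude check on a decoded luma plane (the sample "frame" here is synthetic, not an actual GoT frame):

```python
def crushed_black_fraction(luma, black_point=16):
    """Fraction of luma samples at or below the video black point.
    A large pile-up there suggests clipped (crushed) shadow detail."""
    dark = sum(1 for v in luma if v <= black_point)
    return dark / len(luma)

# Synthetic frame: mostly at/below black, with some fire-like highlights.
frame = [0] * 700 + [16] * 50 + [60] * 150 + [235] * 100
print(crushed_black_fraction(frame))  # 0.75
```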
Or we could forgo your altogether too abstract air-crash analogy, which is in and of itself "detached from reality," and concentrate on an altogether more plausible "conspiracy theory," which is:

...[ ]...

I think in this HBO episode, they just went too artistic and unrealistic. They were making a video for their superb-quality internal theaters, to impress the management and test audience, instead of taking into account the limitations of their average viewer's display devices. Decoupled from reality - a frequent problem with artists.
Ah, Joe Kane's quip in so many words. Never Twice the Same Color.
I am sorry you feel that way.
I see you both came to very similar conclusions about CGI, and its expense, in conjunction with motion blur and its utility.
Since the motto of the series is "all men must die," and the show is very, very dark, I have to wonder why this comes as a surprise to you.
@wiyosaya Good sir, please refer to post #72 for my take on this issue.

...[ ]...

To me, it sounds like even a calibrated TV was not going to guarantee that this episode showed all the director had intended.

...[ ]...