captaincranky
Posts: 19,663 +8,799
Apparent brightness diminishes with distance from the source. For a point source, illumination drops by half when the distance increases by a factor of about 1.4 (the approximate square root of 2). And if anybody wants to watch a movie in a fully lit room, I wouldn't be able to help them with set adjustment.

Higher brightness != higher contrast. It does lead to brighter colors, as you were explaining. The contrast range actually decreases once you go over 220 cd/m2, as the display is no longer able to show darker shades. TVs are expected to be viewed in rooms with light, so most sets compromise by increasing brightness to suit those viewing conditions, but they lose contrast as a result.
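That inverse-square relationship can be sketched in a few lines of Python (the function name is mine, purely for illustration):

```python
import math

# Inverse-square law for a point source: illuminance E is proportional to 1/d^2.
# Doubling the distance quarters the illuminance; increasing the distance
# by sqrt(2) (~1.4) halves it, as described above.
def relative_illuminance(d1: float, d2: float) -> float:
    """Ratio of illuminance at distance d2 versus distance d1."""
    return (d1 / d2) ** 2

print(round(relative_illuminance(1.0, math.sqrt(2)), 3))  # → 0.5
print(round(relative_illuminance(1.0, 2.0), 3))           # → 0.25
```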
As for loss of contrast at higher brightness levels, you have to increase the set's contrast, along with color saturation, then use the brightness control to compensate for the apparent brightness loss. In other words, bright crimson blocks more of the backlight than pale pink does. A color's density affects the transmitted or reflected light value, expressed as a percentage.
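As a rough sketch of that point (the transmittance percentages below are made-up illustrative numbers, not measurements): a dense color passes a smaller fraction of the backlight than a pale one, and it's the ratio between the two, not the absolute level, that carries the contrast.

```python
# Illustrative only: a color filter passes some fraction of the backlight.
def transmitted_nits(backlight_nits: float, transmittance: float) -> float:
    """Luminance that makes it through a filter of the given transmittance."""
    return backlight_nits * transmittance

backlight = 300.0                            # nits, hypothetical panel
crimson = transmitted_nits(backlight, 0.15)  # dense color: 15% passes -> 45 nits
pink = transmitted_nits(backlight, 0.60)     # pale color: 60% passes -> 180 nits
print(crimson, pink, pink / crimson)         # the 4:1 ratio is backlight-independent
```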
Now I'm not such a dolt as to increase brightness to the point where I can't get black on screen. Accordingly, if the available gamma range shrinks, I'll gladly suffer the loss of shadow detail rather than look at the washed-out screen you're describing.
As far as the maximum nits available on any screen, I'll gladly turn a too-bright screen down, as opposed to pegging a brand-new monitor at all elevens and watching it go to sh!t from there. So, if a monitor isn't going to come out of the box at 300 to 350 nits, Newegg can keep it.
Which is also true of my cheap a** Walmart (Vizio). The trouble with "energy saving features" is, the picture looks dull, flat, and washed out. Just saying, "Energy Saving" is something to write on the box, but nothing anybody in their right mind would want to watch.

The energy saving features of every monitor I've ever had could be turned off in the OSD, but I admittedly only purchase higher-end displays.
Why would anybody need a "video player" to watch broadcast TV on an ATSC-tuner-equipped TV? The tuner merely repeats the color space it receives without tweaking it.

Could you elaborate on that last bit? I'm not sure if you're saying this is a problem for just monitors, or something else. TV shows are broadcast in a variety of formats and may use a variety of color spaces, but that issue was solved some time ago. Any video player on the market can already translate them all accurately to your PC monitor and compensate for things like interlacing. The only thing that will alter the colors is your monitor, assuming it's uncalibrated and/or low quality.
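To make that concrete, here's a minimal sketch (matrix coefficients are the standard BT.601 and BT.709 luma constants; the sample pixel values are arbitrary) of why a decoder has to know which color space the source used: the same YCbCr triplet decodes to different RGB under SD versus HD coefficients.

```python
# Convert a normalized YCbCr pixel (Y in [0,1], Cb/Cr in [-0.5, 0.5]) to RGB
# using the standard matrix built from luma coefficients Kr and Kb.
def ycbcr_to_rgb(y, cb, cr, kr, kb):
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    g = y - (2.0 * kb * (1.0 - kb) / kg) * cb - (2.0 * kr * (1.0 - kr) / kg) * cr
    return r, g, b

sample = (0.5, 0.1, 0.2)                               # arbitrary test pixel
print(ycbcr_to_rgb(*sample, kr=0.299, kb=0.114))       # BT.601 (SD broadcast)
print(ycbcr_to_rgb(*sample, kr=0.2126, kb=0.0722))     # BT.709 (HD broadcast)
```

Decode an HD stream with the SD matrix (or vice versa) and every colored pixel shifts, which is exactly the kind of silent tint error a correct player avoids.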
If a tuner did modify the color space of different content, networks would be getting their pants sued off for modifying the "mood" and "substance" of the content. A beer ad is sparkly. A lobster is saturated red. A child coming home from school with a bad report card, from not having attended whatever preschool, is shades of gray.
Now, take a few moments to watch this piece of sh!t propaganda ad for "Rexulti". Notice first that the woman at the beginning looks like a basset hound (albeit a muted blue and gray basset hound), and further note that as "the drug kicks in", the color balance shifts from muted grays and blues to sunshine yellow. They even drop the lighting contrast to soften the lines on her face, thus making her look "happy".
Now, every broadcast technician has his or her own opinion of how everything should look on THEIR screen, which isn't necessarily calibrated with YOURS. You go through a half dozen commercials, you get a half dozen interpretations of color balance, saturation, brightness, and contrast. Whether or not they're mixing for the same color space matters not; the end results vary wildly.