Consumer Reports: Your smart TV is tracking you and so are hackers

Higher brightness != higher contrast. It does lead to brighter colors, as you were explaining, but the contrast range actually decreases once you go over 220 cd/m², as the display is no longer able to show darker shades. TVs are expected to be viewed in rooms with light, so most sets compromise: they improve viewing conditions by increasing brightness, but they lose contrast as a result.
Apparent brightness diminishes with distance from the source. For a point source, illumination drops by half when the distance increases by a factor of about 1.4 (the approximate square root of 2). And if anybody wants to watch a movie in a fully lit room, I wouldn't be able to help them with set adjustment.
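The point-source rule above (double the area, half the light) falls straight out of the inverse-square law. A minimal sketch, with the 1.4 figure rounded to an exact √2 for illustration:

```python
# Inverse-square law: point-source illumination falls off as 1/d^2,
# so increasing viewing distance by a factor of sqrt(2) ≈ 1.414 halves it.
import math

def relative_illumination(d1: float, d2: float) -> float:
    """Illumination at distance d2 relative to distance d1 (point source)."""
    return (d1 / d2) ** 2

print(round(relative_illumination(1.0, math.sqrt(2)), 3))  # → 0.5
print(round(relative_illumination(1.0, 2.0), 3))           # → 0.25 at double distance
```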

As for loss of contrast at higher brightness levels, you have to increase set contrast along with color saturation, then use the brightness control to compensate for apparent brightness loss. In other words, bright crimson blocks more of the backlight than pale pink. A color's density affects its transmitted or reflected light value (expressed as a percentage).
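That density-to-percentage relationship is the standard optical-density definition, D = -log10(T). A quick sketch (the "pale pink" and "crimson" density values are invented for illustration):

```python
# Optical density vs. transmission: D = -log10(T), so T = 10 ** -D.
# Expressed as a percentage of the backlight that gets through.

def transmission_percent(density: float) -> float:
    """Percentage of light transmitted through a filter of the given density."""
    return 100.0 * 10 ** -density

print(round(transmission_percent(0.3), 1))  # a thin "pale pink" at D=0.3 passes ~50%
print(round(transmission_percent(2.0), 1))  # a dense "crimson" at D=2.0 passes only 1%
```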

Now, I'm not such a dolt as to increase brightness to the point where I can't get black on screen. Accordingly, if the available gamma shrinks, I'll gladly suffer the loss of shadow detail, as opposed to looking at the washed-out screen you're describing.

As far as the maximum nits available with any screen, I'll gladly turn a too-bright screen down, as opposed to pegging a brand-new monitor at eleven and watching it go to sh!t from there. So, if a monitor isn't going to come out of the box at 300 to 350 nits, Newegg can keep it.

The energy saving features of every monitor I've ever had could be turned off in the OSD but I admittedly only purchase higher end displays.
Which is also true of my cheap a** Walmart (Vizio) set. The trouble with "energy saving features" is, the picture looks dull, flat, and washed out. Just saying, "Energy Saving" is something to write on the box, but nothing anybody in their right mind would want to watch.

Could you elaborate on that last bit? I'm not sure whether you're saying this is a problem just for monitors, or something else. TV shows are broadcast in a variety of formats and may use a variety of color spaces, but that issue has been solved for some time now. Any video player on the market can already translate them all to your PC monitor accurately and compensate for things like interlacing. The only thing that's going to alter the colors is your monitor, assuming it's uncalibrated and/or low quality.
Why would anybody need a "video player" to watch broadcast TV on an ATSC-tuner-equipped TV? The tuner merely repeats the color space it receives without tweaking it.

If a tuner did modify the color space of different content, networks would be getting their pants sued off for modifying the "mood" and "substance" of the content. A beer ad is sparkly. A lobster is saturated red. A child coming home from school with a bad report card, from not having attended whatever preschool, is shades of gray.

Now, take a few moments to watch this piece of sh!t propaganda ad for "Rexulti". Notice first that the woman at the beginning looks like a basset hound (albeit a muted blue-and-gray basset hound), and further note that as "the drug kicks in", the color balance is shifted from muted grays and blues to sunshine yellow. They even drop the lighting contrast to soften the lines on her face, thus making her look "happy".


Now, every broadcast technician has his or her own opinion of how everything should look on THEIR screen, which isn't necessarily calibrated with YOURS. You go through a half dozen commercials, you get a half dozen interpretations of color balance, saturation, brightness, and contrast. Whether or not they're mixing for the same color space matters not; the end results vary wildly.
 
help me out - - what systems have TCP/UDP ports in the BIOS??
oops; misread. NetBIOS is not the same as the BIOS. Its ports are ordinary TCP/UDP ports - - basically 137, 138, and 139 - - the foundations of file sharing.
 
great idea

help me out - - what systems have TCP/UDP ports in the BIOS??
Here's a good description of it. https://en.wikipedia.org/wiki/NetBIOS
It is not in the BIOS. It is an API that sits behind things like Windows shares. If you have Windows PCs on your network and the ports it operates over are not blocked from the internet, those PCs can send out tons of crap, basically broadcasting to the world that there's a computer there. The port numbers for it are often listed as SMB ports. There are both UDP/TCP and "secure" UDP/TCP ports for the API.

If the NetBIOS ports were blocked, that also ensured that any PCs behind the firewall blocking them were immune to WannaCry, which, before the patch for it, spread via those ports.
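If you want to sanity-check a host from the outside, a minimal sketch of probing the classic NetBIOS/SMB TCP ports follows. The host `192.0.2.1` is a placeholder (TEST-NET) that you'd replace with your own machine's address; note that 137 and 138 are UDP services, so a plain TCP connect only meaningfully tests 139 and 445:

```python
# Sketch: probe the classic NetBIOS-session/SMB TCP ports on a host.
# 137 (name service) and 138 (datagram service) are UDP; 139 (session)
# and 445 (SMB over TCP) can be checked with a simple TCP connect.
import socket

NETBIOS_SMB_TCP_PORTS = [139, 445]

def tcp_port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in NETBIOS_SMB_TCP_PORTS:
    state = "open" if tcp_port_open("192.0.2.1", port) else "closed/filtered"
    print(port, state)
```

A "closed/filtered" result from outside your network is what you want to see; "open" means the firewall is letting SMB through.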
 
Well, TVs in general have higher light output. More nits equals more opportunity for high color saturation and plenty of contrast.
High nits are also important for HDR, but from what I have heard, there are no TVs yet that can reach the nit level of the HDR standard.

So in general, it's easier to "candy up" the screen output.
Joe Kane highly advises against the candy settings. ;)
 
Now, every broadcast technician has his or her own opinion of how everything should look on THEIR screen, which isn't necessarily calibrated with YOURS. You go through a half dozen commercials, you get a half dozen interpretations of color balance, saturation, brightness, and contrast. Whether or not they're mixing for the same color space matters not; the end results vary wildly.
Joe Kane also says that NTSC stands for Never Twice the Same Color!
 
Joe Kane also says that NTSC stands for Never Twice the Same Color!
Given that some people can't even tell green from red, why would you expect it to be? The color temperature of the illumination source is the downfall of our films and digital cameras.

Our brains reinterpret color temperature, while our machines don't. So, should you walk into a candlelit scene, the flame will appear to you as pale yellow to white, and the reflected light on faces will also "cool off"; we perceive more blue in the light than is actually present. However, take a picture with daylight-balanced film or a daylight-balanced camera, and you get results that are a fairly hideous reddish orange.
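The correction a camera's white balance applies is roughly what the brain does automatically: scale each channel so a known-neutral surface comes out gray. A minimal von Kries-style sketch; the "gray card under candlelight" RGB reading below is an invented example value, not measured data:

```python
# Minimal white-balance sketch (von Kries-style per-channel scaling):
# scale R, G, B so a patch known to be neutral comes out equal in all channels.

def white_balance(pixel, neutral):
    """Scale pixel's channels so `neutral` maps to gray (pinned to its green value)."""
    r_n, g_n, b_n = neutral
    gains = (g_n / r_n, 1.0, g_n / b_n)
    return tuple(c * g for c, g in zip(pixel, gains))

# A gray card shot on daylight-balanced film under warm candlelight
# might read reddish-orange (invented numbers):
candle_neutral = (200.0, 140.0, 80.0)
print(white_balance(candle_neutral, candle_neutral))  # → (140.0, 140.0, 140.0)
```

After correction the card is neutral again, which is why the corrected scene no longer looks reddish orange.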

While people might visit an audiologist in their lifetime, they'll never visit a "color visionologist". An eye doctor will only check your color response on a perfunctory level. You might be asked which is red and which is green, but you'll never get a trace graph indicating the actual "frequency response" of your eyeballs. So, my idea of what emerald green looks like may not jibe with yours, but that is something both unknowable and indescribable.

I have had a fair amount of training in color photography, and one overarching prerequisite was being able to balance for skin tones and neutrals by eye in a printing-lab situation. Obviously, scientific accuracy requires specialized equipment, such as colorimeters, to render a scene's parameters identically on film, monitor, or paper.

Kodak's color printing guides used to come with a test pattern (like the one you might see on TV) and a set of graduated viewing filters in the 3 primary and 3 secondary colors, for evaluating color balance in a print. With a reasonable amount of experience, you should no longer need to use them. In a more "last 20 years" context, you should know which lever to diddle first in the Adobe "Levels" window.

Wow, now we're, oops, I meant "I'm", really off topic.
 
Personally, I avoid all of this nonsense by simply not connecting my TV to the Internet in the first place. I used to, but after a bad firmware update that never got fixed, I quit connecting and haven't looked back.
I am not sure it's a good idea to employ a Luddite at a technology news site (kidding... a little).
ROFL How does not connecting my television to the internet make me a Luddite? Haha
 
ROFL How does not connecting my television to the internet make me a Luddite? Haha
You're only a Luddite if you're simply making excuses because your analog CRT TV doesn't have an Ethernet (CAT) port.

Now, if you have a "smart TV", and refuse to connect it, it makes you, "a master of your own destiny"! (y):D
 