
Opinion: The Beauty of 4K

By Julio Franco · 14 replies
Aug 7, 2018
  1. Whether it's the cinematographic nuance of a well-lit scene, the fine-grained detail of a high-resolution photo, or just a razor-sharp image of whatever you happen to be viewing, few things are as satisfying as taking in the glories of a beautiful 4K display, whether it's on a TV or a PC. That's particularly true if you can enjoy the enormous color range of an HDR (High Dynamic Range)-enabled screen, which offers the billion-plus colors possible with 10-bit color instead of the traditional 16.7 million colors of 8-bit color.
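    Those color counts follow directly from the bit depth. As a quick sanity check (a sketch in Python; this is just the bit-depth arithmetic, not anything display-specific): an 8-bit panel stores 2^8 levels per RGB channel, a 10-bit panel stores 2^10, and the total palette is the per-channel count cubed.

```python
def total_colors(bits_per_channel: int) -> int:
    """Number of distinct RGB colors for a given bit depth per channel."""
    levels = 2 ** bits_per_channel  # e.g. 256 levels for 8-bit
    return levels ** 3              # three channels: red, green, blue

print(f"8-bit:  {total_colors(8):,} colors")   # 16,777,216 (~16.7 million)
print(f"10-bit: {total_colors(10):,} colors")  # 1,073,741,824 (~1.07 billion)
```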

    In fact, US consumers seem to agree. As discussed on a recent Techpinions podcast (US Consumer Electronics Trends: PCs, TVs, Headphones, Smart Home and Wearables), the very mature TV industry is still one of the largest categories in all of consumer electronics, with sales expected to increase this year thanks in large part to the move to 4K TVs. Not only that, but the often-overlooked PC monitor market is also very robust. Admittedly, not all PC monitors offer 4K resolution, but many of the large monitors driving sales in the category do. Plus, others offer higher resolutions than we’ve been accustomed to, because consumers are hungry for larger (and finer) screen real estate.

    In the world of TVs, the market activity is all about 4K and, for many, 4K HDR. There’s been an explosion of low-cost, large-screen (50”+) options offering this resolution from many different vendors, as well as continued refinement and enhancements for higher-end models.

    Some of the latest offerings in the higher-end TV market come from Sony, which introduced its new Master Series line last week in New York. The A9F OLED-based model (available in both 55” and 65”) and the Z9F LCD-based device (available in both 65” and 75”) are top-of-the-line 4K HDR TVs that feature the company’s new X1 Ultimate processor—a Sony-designed piece of silicon that optimizes the image quality for the unique characteristics of the display panels integrated into these sets. Though not always recognized for its silicon expertise, Sony has actually been designing key semiconductor chips for integration into its devices for more than four decades. (Sony is also a leader in image sensors for smartphones, digital cameras, robots, IoT devices, and increasingly, autonomous cars.)

    The X1 Ultimate, in particular, delivers on the full color-range potential of HDR. Unlike 4K resolution, which is a fixed pixel count of 3,840 x 2,160 on any 4K TV, HDR can be implemented in several ways. As a result, not all implementations of HDR are equivalent—even on TVs that use the same raw display panels. With the X1 Ultimate, Sony offers resolution and color enhancements dynamically on a per-object (not just per-scene) basis. The X1 Ultimate can also leverage the LED backlights on the LCD-based model to deliver a higher-contrast image and, on OLED panels, use what Sony calls Pixel Booster technology for a broader range of colors.

    In addition to their imaging enhancements, the new Sony Master Series TVs also have a number of refinements related to calibration, both for location and content. Calibration is the process of ensuring that the detailed color and brightness settings are optimized for the physical environment in which the TV is located—accounting for local lighting, etc.—as well as the content being played. One capability unique to the new Master Series is a Netflix Calibrated Mode—a Sony-exclusive feature that optimizes the display for each piece of Netflix-originated content with appropriate metadata embedded in the signal. (Like most recent Sony TVs, the Master Series are smart TVs based on Google's Android TV platform and feature a built-in Netflix app—as well as apps from many other over-the-top (OTT) content providers.) Basically, this mode ensures that you automatically view any Netflix-originated material exactly as its creators intended—a bit of geeky TV tech, but definitely cool if you want accurate color and brightness renditions.

    Of course, as mentioned earlier, the benefits of 4K go well beyond TVs. I’ve recently been enjoying Dell’s new XPS 15 2-in-1 notebook with a 4K resolution, 10-bit color display that’s powered by the unique Intel/AMD collaboration chip uncreatively titled the Intel 8th Generation Core Processor With AMD Radeon RX Vega M Graphics. In an industry first, the chip combines an Intel CPU with a discrete AMD Radeon GPU integrated into a single module and connected by a new high-speed bus called Embedded Multi-Die Interconnect Bridge, or EMIB. The net result is a powerful (though also power-hungry) notebook with extremely responsive graphics suited for the most demanding applications and games. HP also offers a version of their popular Spectre notebook line, the x360 15T, with this new Intel/AMD combo chip and a 4K display.

    For standalone monitors attached to desktops, or functioning as additional displays for notebooks, there is a wide range of 4K HDR monitors from Dell, Samsung, LG, Asus, HP, Benq and others. Beware that not all graphics cards or notebooks offer the ability to drive a 4K HDR display, so you have to do your homework. However, if you have a PC that does support it, the visual results of a 4K HDR monitor are well worth it.

    Looking ahead to the future of both entertainment and computing, there are certainly going to be other means of consuming content, interacting with our data, and manipulating applications than large, high-resolution displays. You’ll be hard-pressed, however, to find something quite as beautiful and compelling as a big 4K screen.

    Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions.

    Image credit: Nate Grant via Unsplash


  2. TomSEA

    TomSEA TechSpot Chancellor Posts: 3,124   +1,617

    There's no doubt 4K looks gorgeous. But when it comes to gaming - is the cost worth the end result? Between the expensive monitor and the high-end graphics card required to run it at decent FPS, you're looking at around $1,500 even at bargain prices. For a high-end 4K gaming setup, the monitor alone can hit $2,000.

    I use a Dell 27" LED QHD G-Sync monitor (2560 x 1440, 144Hz) and can push it plenty with an overclocked GTX 980 Ti (soon to be an 1180 when they come out). Total cost for both was about $900. And the visual difference between that setup - especially with the G-Sync - and 4K is almost indiscernible.

    For strictly gaming, I think 4k is still a couple of years out before you can really get value for your money.
  3. Vulcanproject

    Vulcanproject TS Evangelist Posts: 737   +1,065

    For sure. The common HD display resolution prior to 4K was 1920 x 1080, and it moved very quickly to 3840 x 2160 with no mid-point transition. 1080p HDTVs, for example, virtually disappeared from all model ranges in the space of just 2 years, except for the very lowest-end sets. Cheap 4K displays just exploded.

    Going from a 2 megapixel resolution to 8 megapixels is a vast leap, and the hardware hasn't really caught up with it yet. It's been a sharp transition for display technology.

    The evolution from 640 x 480 to 1280 x 720 was only a 3x resolution bump. The hardware already existed to push those gaming resolutions, particularly on PC, where by the mid-2000s 1280 x 1024 was very common. So no problem there. A mid-range 6600GT in 2004 was expected to deal with that resolution, and it did very well.

    Moving to 1920 x 1080 was another relatively easy step, a little over twice the resolution of 1280 x 720. Again, by the time mid-range priced 1080p HDTVs were available in the latter 2000s, graphics hardware was starting to be able to push it comfortably. Prime example: the 8800GT was a mainstream card at the end of 2007, and very capable.

    4K has been different. It's a step of 4 times the resolution over the previous most common standard, a bigger jump than any prior. And it's happening at a time when graphics hardware is at its most expensive, Moore's law is dead, and gains are smaller each generation and slower to arrive.

    Affordable GPU hardware for 4K games will catch up fairly soon, but it is behind the curve.
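    For what it's worth, the resolution jumps described above can be tallied in a few lines of Python (a rough sketch; "megapixels" here is simply width times height divided by a million):

```python
# Each common display standard and its pixel dimensions.
steps = [
    ("VGA",    640,  480),
    ("720p",  1280,  720),
    ("1080p", 1920, 1080),
    ("4K",    3840, 2160),
]

prev = None
for name, w, h in steps:
    pixels = w * h
    # Ratio versus the previous step, skipped for the first entry.
    ratio = f" ({pixels / prev:.2f}x the previous step)" if prev else ""
    print(f"{name:>5}: {w}x{h} = {pixels / 1e6:.2f} MP{ratio}")
    prev = pixels
```

    Running it shows the 3x, ~2.25x, and 4x jumps: 4K really is the biggest single step of the lot.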
    Last edited: Aug 7, 2018
  4. VitalyT

    VitalyT Russ-Puss Posts: 4,478   +3,036

    My two cents...

    HDR is very overrated. I have an HDR 4K TV, plus a lot of HDR content now, and I will tell you that I prefer the non-HDR versions of movies most of the time.

    Movies are shot and lit in such a way that makes them look best on a regular TV. Granted, the sun shines brighter, faces may show more red, and nature may seem even greener than it is. But that's what sets a happy mood in a lot of movies.

    When I switch over to HDR, the all-familiar bright movies suddenly look colorless, as all the artificial colors are gone from them, and the all-natural look is just sad.

    I have been comparing a lot of movies side by side this way, and it is just frustrating. I have stopped getting the HDR versions of movies.
    Edito and BSim500 like this.
  5. Evernessince

    Evernessince TS Evangelist Posts: 3,995   +3,481

    Typically most TVs have a movie preset, which is what you are looking for. If yours doesn't have that, you can always adjust the color settings manually.

    Whether an HDR video appears "fake" is really an artistic choice by the company making the video. If they want to make an oversaturated HDR movie, they easily can.
    fps4ever and VitalyT like this.
  6. pan1c

    pan1c TS Member Posts: 19   +6

    Well I disagree.

    HDR content looks amazing on my screen. I'm using a 55 inch LG OLED 2017 model, with support for HDR and Dolby Vision.

    The combination of deep blacks and really bright pixels amazes me every time.
    lumbeeman likes this.
  7. VitalyT

    VitalyT Russ-Puss Posts: 4,478   +3,036

    It doesn't contradict what I said. HDR does look higher quality, but also sad, because the artificial colors that make movies look happier are all gone. Reality is a lot more gray than what's shown in movies, and I prefer the movie version of it.
  8. AleXopf

    AleXopf TS Rookie

    The problem is that you have a panel that isn't capable of showing true HDR.
    lumbeeman likes this.
  9. H3llion

    H3llion TechSpot Paladin Posts: 1,694   +438

    21:9, 4k, hdr, 120hz qdot ips or oled gimmiiiie
  10. lumbeeman

    lumbeeman TS Enthusiast Posts: 25   +24

    This, 1000x. Most of the time that is the case: people have equipment that can't take full advantage of it, or have one weak point in the system that negates the HDR experience.
  11. Badelhas

    Badelhas TS Booster Posts: 97   +46

    I own a Hisense H55N6800 55-inch 4K HDR TV, but the only content I watch is from Netflix. I don't know if it's because it has a low bitrate and the image quality is reduced or something, but I can't see any difference between a 1080p and a 4K HDR movie.
    I am a bit disappointed, to be honest.
  12. Edito

    Edito TS Addict Posts: 101   +33

    I used to think that HDR is overrated, but that's because when you get used to it, it seems like a normal image; once you go back, you realize it's something really amazing in the details, and it changes the experience. I do agree, though, that it's still hard to convince someone that 1080p or 1440p is not better than 4K just by the look of it... Most people don't care, and I understand them.
  13. Sausagemeat

    Sausagemeat TS Maniac Posts: 409   +205

    I currently own a 4K monitor and I absolutely love it. However, with my CrossFire 280X setup there are very few games that I can run at 4K, but those that do look absolutely incredible. I can't seem to get 4K Netflix running on it, though. I haven't put a huge amount of effort into trying to get it to work; does anyone here know if it's even possible, and if so, how I might go about getting it? My TV runs 4K Netflix and it looks superb, and I'd love to be able to get that on my desktop system.
  14. Vulcanproject

    Vulcanproject TS Evangelist Posts: 737   +1,065

    You will need a Polaris or Vega GPU, AMD's latest drivers, an HDCP 2.2 compliant monitor, and all the Windows 10 updates, and you have to use Microsoft Edge to view it. So a 280X is too old.
  15. lumbeeman

    lumbeeman TS Enthusiast Posts: 25   +24

    I think it is because your particular model only has a peak brightness of 450 nits; 1000 nits is what you really need. My first 4K TV was a Samsung 6300 model, and needless to say it was nowhere near 1000 nits and did not have a wide color gamut. My second was an 8000 model, and wow, what a difference the much better brightness and wide color gamut made.
    All these "4K" TVs being put out, and so many lack the specs for a true experience; more pixels alone doesn't mean a better picture.
