Op-ed: Are we ready for 4K?


The rest of the industry is finally catching up to what smartphone and tablet owners already knew: people want high resolution, high pixel density displays. It’s ridiculous that even in 2014, the majority of mainstream laptops being sold feature a display resolution that’s not only lower than mid-to-high-end smartphones with screens a fraction of the size, but they employ vastly inferior panel technology to boot. Thus, the great white hope is 4K.

The term “4K” is something of a misnomer: it rounds up the horizontal resolution, while the actual display resolution is typically 3840x2160. It maintains the 16:9 aspect ratio that a lot of us still chafe under, but it quadruples the pixel count of a 1080p display, doubling the resolution in each dimension.
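
If you want to sanity-check that quadrupling claim, the arithmetic fits in a few lines. This is a throwaway Python sketch of my own, not anything from the article:

```python
# Pixel counts for common desktop resolutions, relative to 1080p.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K UHD (3840x2160)": (3840, 2160),
}

base = 1920 * 1080  # 2,073,600 pixels
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 4K UHD comes out to 8,294,400 pixels -- exactly 4.00x 1080p.
```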

For now we can ignore manufacturing costs for these new, higher resolution panels, since those will eventually come down along with the overall price to the end user as economies of scale kick in. Yet there are still a few complicating factors that surface when we talk about making the move to a 4K display, especially if you're a gamer, and they're the kinds of things that keep me using conventional 1080p/1200p displays at home.

Editor’s Note:
This guest post by Dustin Sklavos was originally published on the Corsair blog. Dustin is a Technical Marketing Specialist at Corsair and has been writing in the industry since 2005.

New technology often materializes before the infrastructure is fully in place. Intel launched their Ultrabook initiative with Sandy Bridge processors that weren’t really designed to hit those low voltages. Each transition to a new memory technology was very gradual. 4K displays have a similar chicken and egg problem, and there are three key points where we’re just not quite ready.

Performance

Our mid-to-high-end graphics cards have been able to handle games at 1080p for a couple of generations now. As game technology somewhat stagnated, being held back by the PS3/Xbox 360 era, console ports just weren't pushing our graphics cards. AMD even brought Eyefinity to market with the Radeon HD 5870 to essentially give the hardware room to stretch its legs, since the card itself was hilariously overpowered for 1080p at the time of launch.

Today we have the AMD Radeon R9 290X, architected specifically for 4K. The problem is that the GPU is monstrously large, pulls close to 300W under load, and still can’t push maximum or often even high settings at 60fps at 4K.

Adding a second card mitigates the issue, but doesn’t solve it, and leaves you at the mercy of AMD’s driver team. Meanwhile, Nvidia’s beastly GeForce GTX 780 Ti eats the 290X alive at conventional resolutions but cedes ground at 4K, and the 3GB framebuffer actually becomes problematic at 4K resolution. SLI again mitigates, but doesn’t solve the problem.

Replicating at 4K the kind of experience a single graphics card delivers at 1080p can actually require as many as four of those GPUs, and that's again assuming the drivers scale properly. Suddenly your 4K gaming experience requires a 1500W power supply; that's an unrealistic bar to clear.
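
To put rough numbers behind that power supply figure, here's a back-of-the-envelope sketch. The ~300W-per-card and ~150W-platform figures are my own ballpark assumptions, not measurements from the article:

```python
# Rough PSU sizing for a multi-GPU 4K rig (illustrative ballpark only).
GPU_LOAD_W = 300      # assumed draw for an R9 290X-class card under load
PLATFORM_W = 150      # assumed CPU, motherboard, drives and fans
HEADROOM = 1.15       # ~15% margin so the PSU isn't pinned at its limit

def recommended_psu_watts(gpu_count: int) -> float:
    return (gpu_count * GPU_LOAD_W + PLATFORM_W) * HEADROOM

for gpus in (1, 2, 4):
    print(f"{gpus} GPU(s): ~{recommended_psu_watts(gpus):.0f} W PSU")
# With four cards this lands around 1550 W -- the same ballpark as the
# 1500W figure above.
```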

This problem will likely solve itself in one to two generations... of manufacturing process nodes. But 20nm GPUs aren’t due until 2015, and I wouldn’t expect the next step to be available until 2017 at the earliest.

Interface

The majority of our existing display interfaces were designed to top out at about 1920x1200 at 60Hz. Dual-link DVI will go up to 2560x1600, but that’s still not enough to drive 3840x2160 at 60Hz. 30Hz can be done over HDMI, but 30fps looks horrendous on a PC. HDMI 2.0 can do 3840x2160 at 60Hz and was ratified last September, but it’s very rare in the wild.
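
A rough bandwidth check makes the interface problem concrete. The link capacities below are the commonly quoted pixel-data rates, and the ~10% blanking overhead is an assumption on my part (real timings vary), so treat this as a ballpark sketch rather than a spec calculation:

```python
# Ballpark check: which display links can carry which modes?
# Assumes 24 bits per pixel and ~10% blanking overhead (reduced blanking).
# Real-world support also depends on pixel-clock limits and what GPUs and
# monitors actually implement, so this is bandwidth-only, not a guarantee.
BLANKING_OVERHEAD = 1.10

# Approximate usable pixel-data capacity, in Gbit/s.
LINKS = {
    "Dual-link DVI":        7.92,   # 2 x 165 MHz x 24 bpp
    "HDMI 1.4":             8.16,   # 340 MHz TMDS x 24 bpp
    "HDMI 2.0":            14.4,    # 600 MHz TMDS x 24 bpp
    "DisplayPort 1.2 HBR2": 17.28,  # 4 lanes x 5.4 Gbps after 8b/10b
}

def required_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    return width * height * hz * bpp * BLANKING_OVERHEAD / 1e9

for w, h, hz in [(2560, 1600, 60), (3840, 2160, 30), (3840, 2160, 60)]:
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in LINKS.items() if cap >= need]
    print(f"{w}x{h}@{hz}Hz needs ~{need:.1f} Gbps -> fits on: {', '.join(fits)}")
# 3840x2160@60Hz works out to roughly 13 Gbps, which only HDMI 2.0 and
# DisplayPort 1.2 can carry.
```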

4K monitors use one of three methods to produce a 4K, 60Hz image. They’ll either use two cables and connect to two outputs on your card, essentially bridging the image in software; they’ll use a single DisplayPort connection that will reveal itself to Windows as two 1920x2160 panels and again have you stitch together the image in software; or – and this is the most ideal method – they’ll use a single DisplayPort 1.4 connection and have it reveal as a single 4K monitor. Hilariously, this last most efficient method is also the rarest one.

DisplayPort is fantastic for this, but the problem is that DisplayPort has also often been a second-class citizen on graphics cards and monitors alike. Current-generation Nvidia cards include only a single DisplayPort connection, so multi-monitor 4K setups are right out. AMD cards often include more, but again, you'll be dealing with AMD's persistent driver and frame-pacing issues.

This technology exists, but is not widespread yet and may need a generation or two. It does effectively sound the death knell for DVI, which just plain can’t handle 4K at 60Hz and has no plan on the books to change.

Software

The final issue is software. Since we basically need two graphics cards just to have a decent 4K gaming experience, we’re left at the mercy of the GPU vendors to straighten out their multi-GPU solutions. AMD and Nvidia both have issues in this department, but this isn’t even the biggest barrier.

Where 4K and high-ppi displays in general have real issues is in Windows itself. Microsoft has made strides toward improving scaling on high-ppi displays, but third-party software vendors have failed miserably to pick up the slack. Scaling doesn't even work properly in Google Chrome.
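
For a sense of what "picking up the slack" actually asks of a Windows developer, here's a minimal sketch (my own illustration in Python via ctypes, not anything from the article): the app declares itself DPI-aware and reads the system DPI so it can scale its own UI. Applications that skip the first step get bitmap-stretched and blurry; applications that opt in but never apply the scale factor are the ones that render comically tiny on a 4K panel.

```python
# Minimal sketch of per-application DPI handling on Windows (illustrative only).
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

# Step 1: opt in to DPI awareness (available since Windows Vista).
user32.SetProcessDPIAware()

# Step 2: query the effective DPI of the primary display.
LOGPIXELSX = 88                      # GetDeviceCaps index for horizontal DPI
hdc = user32.GetDC(0)                # device context for the whole screen
dpi = gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
user32.ReleaseDC(0, hdc)

# Step 3: scale UI metrics by dpi / 96 (96 DPI is the 100% baseline).
scale = dpi / 96.0
print(f"System DPI: {dpi} -> scale fonts, icons and layouts by {scale:.2f}x")
```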

So we now have our incredibly high resolution monitors that we asked for, but the software ecosystem is so hosed that they’re actually lousy for productivity. Adobe is particularly guilty of this, with none of their software scaling up on high-ppi displays, resulting in extremely tiny images. Gamers are really out of luck, too, because Steam doesn’t scale either.

The other issue is that if you've been avoiding Windows 8.1 altogether, you're extra screwed, because Windows 7's scaling is even worse. So if you want to use a high-ppi display, Windows 8.1 is your best bet.

This is the direction the industry is moving, though, so at a certain point pressure will set in and developers will have to either get on the bus or get left behind. In the meantime, you’re still stuck with a display that’s beautiful to behold, but troublesome for gaming or productivity. What’s really left?


 
4K probably won't be a viable choice for the gaming industry until 2020. I'm honestly satisfied with 1080p for a standard TV; however, I do understand why PC enthusiasts want that extra pixel boost, being so close to their monitors.
 
It's at least a few years away from being a viable option, in my opinion, which is why we haven't included 4K results in our GPU reviews. That said, we might start to include them next year.
 
The only difference between 1080p and 4K is purely aesthetics. Look at the detail of the leaves in the wind... to die for... apparently. The upgrade from SD to HD was night and day. Suddenly we had enough pixels to make more useful gaming interfaces without taking up half the screen. The upgrade to 4K is such a shallow upgrade. We don't need 4K. It's not that pretty. The money spent creating artistic detail is money that could be spent on game designers and writers that don't suck.
 
That is so true. It's useless, and I feel it's only happening because Intel wanted it to happen, not because anyone else wanted it. That's why the hardware is there and nothing else is there to support it.
 
4K is a solution in search of a problem. The human eye can't even distinguish added resolution above 2400 or so DPI. It's a pointless waste of energy and computing resources meant solely to push more hardware down the channel. It's going to be a toy of the rich for years to come. Oh, and there's another major bit of infrastructure missing from the formula: bandwidth. 4K streaming is a non-starter for at least 5 years and probably longer, especially in the US with its greedy ISPs and their second-rate service. You also have to consider that games which actually use 4K-worthy textures will double in size. Who wants to spend the money, time or hard drive space on a standard that you won't even notice on anything smaller than a 46" TV?
 
The human eye can't even distinguish added resolution above 2400 or so DPI.
Just FYI, 4K on a desktop monitor isn't even remotely close to 2400 DPI. Even on a 5-inch smartphone screen it's nowhere near that mark.
Unless you meant 240 DPI, in which case that is just wrong.
 
DPI and PPI are different. PPI is the standard unit of measurement for pixel density these days. Humans can make out jaggies on a 24inch 2560x1440 monitor.

It's called progress. The 1% will buy this, allowing it to eventually become mainstream. If you're against progress, what are you doing reading a tech blog?
 
The only difference between 720p and 1080p is purely aesthetics. Look at the detail of the leaves in the wind... to die for... apparently. The upgrade from SD to HD was night and day. Suddenly we had enough pixels to make more useful gaming interfaces without taking up half the screen. The upgrade to 1080p is such a shallow upgrade. We don't need Full HD. It's not that pretty. The money spent creating artistic detail is money that could be spent on game designers and writers that don't suck.
 
I'm still getting plenty out of my 1080p display. I've seen 4K, and it does look great, but the graphics quality in video games currently gives us no motivation to move to 4K. I'd rather have a 1080p AMOLED display than a 4K IPS panel.
 
Perhaps adaptive-sync would go a long way towards diminishing the nasty effects of low fps on UHD monitors... maybe as low as 25fps. I'm just waiting for someone to write a review on such a setup.

BTW, I didn't know there was a DisplayPort 1.4. I thought the latest was 1.2a.
 
DPI and PPI are different. PPI is the standard unit of measurement for pixel density these days. Humans can make out jaggies on a 24inch 2560x1440 monitor.

It's called progress. The 1% will buy this, allowing it to eventually become mainstream. If you're against progress, what are you doing reading a tech blog?

Not just that, but with 4K you have more workspace to work with and everything is crystal clear with no jagged edges. Remember when we switched from 800x600 to 1080p? It was a massive difference! That's why 4K is here to stay.

Sure, the technology might not be here right now, but it's still playable at 4K if you're willing to shell out a few bucks. Paying a premium for a chance to use something 99.9% of the world won't be able to try for years to come is worth it, tbh.
 
Well, my friend with far too much cash on hand (just kidding) bought one of Samsung's first 4K TVs to be released in 65". He is replacing his 1-year-old (sad, I know) 1080p Samsung. Both are 65", and honestly I couldn't tell too much difference, and he was sad because neither could he. We were even pushing 4K video to the 4K TV and comparing it with the same video in 1080p.

As far as the market goes, we are at least 5 years away from 4K being mainstream. A good 55" 1080p TV from a good brand can be had for around $1,400; a 4K TV in the same size would cost 2-3 times that. Our systems are just now getting to the point where 1080p can flourish: cable/satellite can reliably deliver it, streaming services can as well, our ISPs can deliver the bandwidth needed to stream 1080p at more affordable prices, and our home networks can now stream 1080p to multiple devices thanks to advances in wireless networking.

720p had a good run as the main "HD" for the last 7 years, and now it's 1080p's time to truly shine: consoles can game in 1080p and Blu-ray is still growing. I see 1080p enjoying a good 5 years as king, with 4K finally becoming more mainstream in 2017-18 and 8K being where 1080p was 7 years ago. Then by 2025, 8K could be the standard; just thinking about what computing power we will have in 2025, and how small a package it will come in, begins to blow my mind.

With computer screens in mind, I see 1080p remaining the standard, and I could see lower-end laptops finally moving to 1080p. The high end will be a mix of 1600p and 4K monitors for those that need it, but I would say scaling won't truly be there for another 2-3 years. For PC gaming in 4K, I would say another 2-3 generations will have it down perfectly, with good FPS and good performance/watt. That puts us 3 years away from it not being an enthusiast-budget-only affair, and it puts console gamers probably around 6 years out. I don't see Sony/Microsoft keeping these consoles going as long as the last gen; probably one refresh and then a new design, unless there's another big economic tank.

As for me, I'm going to wait a good 3 years before I get a 4K monitor (I don't use TVs). I just finally replaced my 24" Acer 1080p monitor that died 2 years ago (I'd been using a cheapo HP LCD monitor) with an HP 23xi IPS display.
 
I'll be sticking with 2560x1440 for quite a while, at least until a large amount of media becomes available. Also, web pages are going to have to be written with a bit more thought to spanning these massive resolutions; at present, even on 1920x1080 displays, most pages have big borders down the sides. A full-screen web page on a 4K monitor is going to look ridiculous. They need text and picture content auto-formatting that works well.
 
What people don't realise is this: TVs are 1080p, monitors are 1080p, and on a TV, images look like real life, practically indistinguishable from it. And this is at 1920x1080. So WHY do we need 4K?? The real reason computer-generated games don't look like TV is that things like ambient occlusion have to be rendered in real time on a computer, whereas on TV they are stored frames. To generate in real time the level of ambient occlusion present in a TV image, your GPU would need to be many, many times more powerful. That is why game developers stick with a compromise, which ends up leaving the game looking like... a game and not reality, despite being the same resolution as TV images.
 
So, if a tiny single Snapdragon 805 can provide playable FPS for 1440p screens, how come we need two vacuum cleaners to crunch a single 4K desktop?

Does this mean that desktop solutions got behind the curve due to stale competition?
 
Very well written article, covers essentially every problem you may face if you try to jump to 4K today.

I'm not sure about jumping to even 1440p right now because of gaming performance. I'm happy with 1080p for now; I get great gaming performance and it's 1:1 for movies/YouTube/TV shows/etc.
 
Even though part of the editorial is dedicated to PC gaming and some commenters have extended this to compare HD vs. 4K TVs (which is far from the same), the #1 reason I would move my computing to 4K ASAP is productivity.

For the past two years I've been working interchangeably on a MacBook with a Retina display, and the difference between using it and any other lower-res laptop is immense (and painful). On the desktop it's not as critical because the monitor sits farther away, but I would love to have the same sharp image on my desktop monitor regardless.

Besides, you don't need the same hardware muscle to drive a 4K monitor for desktop use as you do for gaming, but unfortunately, Windows support is subpar (particularly in third-party apps, as mentioned in the article above), while Apple has moved very slowly to support 4K on the desktop side, not to mention the cost involved.
 
So, if a tiny single Snapdragon 805 can provide playable FPS for 1440p screens, how come we need two vacuum cleaners to crunch a single 4K desktop?

Does this mean that desktop solutions got behind the curve due to stale competition?

No. Cellphone games don't use as many shaders and effects as PC and console games, not to mention that traditional games push artificial intelligence much further. Try comparing Combat Arms with Battlefield, or Batman: Arkham Origins with Blackgate. See the difference?! That is why.
 
Frame pacing? Welcome to 2014!! The author seems to have forgotten that the 295X and 290 series solved the pacing issues almost entirely in hardware; even Tom's Hardware acknowledged that. Scaling issues are rare either way, regardless of whether it's SLI or CF, plus AMD tends to do better at 4K because it has more ROP power, which is very important as pixel count increases. Look at the 295X reviews on every site and see for yourself.
 
The only reason for 4K is that TV/monitor manufacturers need to sell; if people are satisfied with Full HD, that means they won't upgrade their monitors for years. So, with the intention of making them look obsolete, they launch 4K. Personally, I would like to see FreeSync supported in next or current gen monitors.
There is no urgency for 4K at this moment.
 
So what's next after 4K? 6K or maybe 8K? When will it end? At what point do we reach the limits of what the human eye can distinguish before it becomes just another selling point? The fact is 1080p is fine for most people (myself included). 4K, at this point, is more for marketing than anything else.
 
DPI and PPI are different. PPI is the standard unit of measurement for pixel density these days. Humans can make out jaggies on a 24inch 2560x1440 monitor.

It's called progress. The 1% will buy this, allowing it to eventually become mainstream. If you're against progress, what are you doing reading a tech blog?
The only reason for 4K is that TV/monitor manufacturers need to sell; if people are satisfied with Full HD, that means they won't upgrade their monitors for years. So, with the intention of making them look obsolete, they launch 4K. Personally, I would like to see FreeSync supported in next or current gen monitors.
There is no urgency for 4K at this moment.


I am not against progress. But we just got gaming graphics to an awesome place. Now we have to start over and need $6,000 machines just to game: dual Titans and 2,000W power supplies.
 
We don't need this, just like all that 3D stuff in home TV screens. The thing is, companies don't have anything else to offer us, so they're trying to convince people that 4K is what we all want and need. It's just a marketing strategy.
 