The rest of the industry is finally catching up to what smartphone and tablet owners already knew: people want high resolution, high pixel density displays. It's ridiculous that even in 2014, the majority of mainstream laptops being sold not only feature display resolutions lower than those of mid-to-high-end smartphones with screens a fraction of the size, but employ vastly inferior panel technology to boot. Thus, the great white hope is 4K.

"4K" is something of a misnomer: the name essentially rounds up the horizontal resolution, and the actual display resolution is typically 3840x2160. It maintains the 16:9 aspect ratio that a lot of us still chafe under, but quadruples the pixel count of a 1080p display by doubling it in each dimension.
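
To put that jump in perspective, here's a quick back-of-the-envelope comparison of the two resolutions (a simple Python sketch, nothing vendor-specific):

```python
# Quick pixel-count comparison between 1080p and 4K (UHD).
w_1080p, h_1080p = 1920, 1080
w_4k, h_4k = 3840, 2160

pixels_1080p = w_1080p * h_1080p   # 2,073,600 pixels
pixels_4k = w_4k * h_4k            # 8,294,400 pixels

print(f"1080p: {pixels_1080p:,} pixels")
print(f"4K:    {pixels_4k:,} pixels")
print(f"4K has {pixels_4k / pixels_1080p:.0f}x the pixels of 1080p")  # 4x
```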

For now we can ignore manufacturing costs for these new, higher resolution panels, since those will eventually come down along with the price to the end user as economies of scale kick in. Yet there are still a few complicating factors that surface when we talk about making the move to a 4K display, especially if you're a gamer, and they're the kinds of things that keep me using conventional 1080p/1200p displays at home.

Editor's Note:
This guest post by Dustin Sklavos was originally published on the Corsair blog. Dustin is a Technical Marketing Specialist at Corsair and has been writing in the industry since 2005.

New technology often materializes before the infrastructure is fully in place. Intel launched their Ultrabook initiative with Sandy Bridge processors that weren't really designed to hit those low voltages. Each transition to a new memory technology was very gradual. 4K displays have a similar chicken and egg problem, and there are three key points where we're just not quite ready.

Performance

Our mid-to-high-end graphics cards have been able to handle games at 1080p for a couple of generations now. With game technology somewhat stagnant, held back by the PS3/Xbox 360 era, console ports just weren't pushing our graphics cards. AMD even brought Eyefinity to market with the Radeon HD 5870 essentially to give the hardware room to stretch its legs, since the card itself was hilariously overpowered for 1080p at the time of launch.

Today we have the AMD Radeon R9 290X, architected specifically for 4K. The problem is that the GPU is monstrously large, pulls close to 300W under load, and still can't push maximum or often even high settings at 60fps at 4K.
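
A rough illustration of why: if you naively assume frame rate scales inversely with pixel count, a card that manages 60fps at 1080p lands around 15fps at 4K before you touch a single setting. Real games aren't purely fill-rate bound, so treat the sketch below as a ballpark, not a benchmark:

```python
# Naive frame-rate scaling estimate: assumes performance is limited purely
# by pixel throughput, which is only a rough approximation of real games.
def estimated_fps(fps_at_1080p, target_width, target_height):
    pixels_1080p = 1920 * 1080
    target_pixels = target_width * target_height
    return fps_at_1080p * pixels_1080p / target_pixels

print(estimated_fps(60, 2560, 1440))  # ~33.8 fps at 1440p
print(estimated_fps(60, 3840, 2160))  # 15.0 fps at 4K
```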

Adding a second card mitigates the issue, but doesn't solve it, and leaves you at the mercy of AMD's driver team. Meanwhile, Nvidia's beastly GeForce GTX 780 Ti eats the 290X alive at conventional resolutions but cedes ground at 4K, and the 3GB framebuffer actually becomes problematic at 4K resolution. SLI again mitigates, but doesn't solve the problem.

Replicating at 4K the kind of experience a single graphics card delivers at 1080p can actually require as many as four of those GPUs, and that's again assuming the drivers scale properly. Suddenly your 4K gaming experience requires a 1500W power supply; that's an unrealistic bar to clear.
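
That power supply figure isn't plucked out of the air. Here's a hypothetical tally using rough load figures rather than official specifications; the exact wattages are assumptions for illustration:

```python
# Back-of-the-envelope power budget for a hypothetical quad-GPU 4K build.
# All figures are rough estimates, not official specifications.
gpu_load_watts = 290        # approximate load draw of one high-end GPU
gpu_count = 4
rest_of_system_watts = 250  # CPU, motherboard, RAM, storage, fans (rough)

total_load = gpu_load_watts * gpu_count + rest_of_system_watts
print(f"Estimated system load: ~{total_load}W")  # ~1410W
# A PSU in the 1500W class is what you'd realistically be shopping for.
```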

This problem will likely solve itself in one to two generations... of manufacturing process nodes. But 20nm GPUs aren't due until 2015, and I wouldn't expect the next step to be available until 2017 at the earliest.

Interface

The majority of our existing display interfaces were designed to top out at about 1920x1200 at 60Hz. Dual-link DVI will go up to 2560x1600, but that's still not enough to drive 3840x2160 at 60Hz. 30Hz can be done over HDMI, but 30fps looks horrendous on a PC. HDMI 2.0 can do 3840x2160 at 60Hz and was ratified last September, but it's very rare in the wild.
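
The numbers behind that limitation are straightforward. Ignoring blanking intervals (which push the real requirement somewhat higher), here's a rough sketch of what 4K at 60Hz asks of a link versus the approximate capacity of the common interfaces:

```python
# Rough link-bandwidth check for driving 3840x2160 at 60Hz with 24-bit color.
# Blanking intervals are ignored, so the real requirement is somewhat higher.
width, height, refresh, bits_per_pixel = 3840, 2160, 60, 24

required_gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"4K at 60Hz needs roughly {required_gbps:.1f} Gbps of video data")  # ~11.9 Gbps

# Approximate effective video bandwidth of common interfaces, in Gbps:
interfaces = {
    "Dual-link DVI": 7.9,            # ~330 MHz pixel clock at 24bpp
    "HDMI 1.4": 8.2,                 # enough for 4K at 30Hz only
    "DisplayPort 1.2 (HBR2)": 17.3,
    "HDMI 2.0": 14.4,
}
for name, capacity in interfaces.items():
    verdict = "OK" if capacity >= required_gbps else "not enough"
    print(f"{name}: ~{capacity} Gbps -> {verdict}")
```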

4K monitors use one of three methods to produce a 4K, 60Hz image. They'll either use two cables and connect to two outputs on your card, essentially stitching the image together in software; they'll use a single DisplayPort connection that presents itself to Windows as two 1920x2160 panels and again has you stitch the image together in software; or - and this is the ideal method - they'll use a single DisplayPort 1.2 connection that presents itself as a single 4K monitor. Hilariously, this last, most efficient method is also the rarest one.
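
A hypothetical sketch of why those tiled monitors behave oddly: the operating system enumerates two narrow panels, and it's up to the driver (or you) to treat them as one surface. The numbers are the only real content here:

```python
# Hypothetical model of a tiled 4K monitor: the OS sees two 1920x2160 halves
# and the driver has to stitch them into one desktop.
left_tile = (1920, 2160)
right_tile = (1920, 2160)

stitched_width = left_tile[0] + right_tile[0]       # 3840
stitched_height = max(left_tile[1], right_tile[1])  # 2160
print(f"Stitched desktop: {stitched_width}x{stitched_height}")

# If the stitching isn't configured correctly, full-screen games and apps can
# end up rendering to a single 1920x2160 tile instead of the whole panel.
```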

DisplayPort is fantastic for this, but it has often been a second-class citizen on graphics cards and monitors alike. Current-generation Nvidia cards only include a single DisplayPort connection, so multi-monitor 4K setups are right out. AMD cards often include more, but again, you'll be dealing with AMD's persistent driver and frame-pacing issues.

This technology exists, but is not widespread yet and may need a generation or two. It does effectively sound the death knell for DVI, which just plain can't handle 4K at 60Hz and has no plan on the books to change.

Software

The final issue is software. Since we basically need two graphics cards just to have a decent 4K gaming experience, we're left at the mercy of the GPU vendors to straighten out their multi-GPU solutions. AMD and Nvidia both have issues in this department, but this isn't even the biggest barrier.

Where 4K and high-ppi displays in general have real issues is in Windows itself. Microsoft has made strides toward improving scaling on high-ppi displays, but third-party software vendors have failed miserably to pick up the slack. Scaling doesn't even work properly in Google Chrome.
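
To see why scaling matters so much at 4K, consider the pixel density of a typical desktop panel versus the ~96 PPI Windows has historically assumed. At 100% scaling, UI elements shrink in proportion to how far past that figure the density climbs. A rough sketch; the monitor sizes are just example assumptions:

```python
# Rough pixel-density comparison, and the scaling factor needed to keep
# UI elements a familiar physical size (Windows historically assumes ~96 PPI).
import math

def ppi(width_px, height_px, diagonal_inches):
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

for name, w, h, size in [
    ("24-inch 1080p", 1920, 1080, 24),
    ("28-inch 4K", 3840, 2160, 28),
    ("24-inch 4K", 3840, 2160, 24),
]:
    density = ppi(w, h, size)
    print(f"{name}: ~{density:.0f} PPI, ~{density / 96:.0%} scaling to match 96 PPI")
```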

So we now have the incredibly high resolution monitors we asked for, but the software ecosystem is so hosed that they're actually lousy for productivity. Adobe is particularly guilty of this, with none of their software scaling up on high-ppi displays, resulting in extremely tiny interface elements. Gamers are really out of luck, too, because Steam doesn't scale either.

The other issue is that if you've been avoiding Windows 8.1, you're extra screwed, because Windows 7's scaling is even worse. If you want to use a high-ppi display, Windows 8.1 is your best bet.

This is the direction the industry is moving, though, so at a certain point pressure will set in and developers will have to either get on the bus or get left behind. In the meantime, you're still stuck with a display that's beautiful to behold, but troublesome for gaming or productivity. What's really left?