DisplayPort 2.0 officially revealed, supports 16K resolutions at 60Hz

midian182

What just happened? As multiple 4K monitor setups and 6K/8K displays become a reality, the DisplayPort spec was definitely in need of an upgrade. Now, three years after version 1.4 arrived, DisplayPort 2.0 has been introduced, offering almost triple the bandwidth of its predecessor.

It was back in 2016 when the Video Electronics Standards Association (VESA) introduced DisplayPort 1.4, the last major spec update (revised to 1.4a in 2018), which offers 25.92 Gbps of effective bandwidth. With the 2.0 standard, that jumps to 77.37 Gbps.

That fat pipe is good news for users of high-resolution displays. DisplayPort 2.0 is the first standard that can support 8K at a 60Hz refresh rate with full 4:4:4 color and no compression, at 30 bits per pixel (bpp) for HDR10 support. With Display Stream Compression (DSC), it can power two 8K displays at 120Hz.

Few people will be able to take advantage of it, but DisplayPort 2.0 can also handle a 16K display (15,360 x 8,640) at 60Hz with DSC, while those using multi-monitor setups will appreciate support for three uncompressed 4K displays at 90Hz.
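
To put those figures in perspective, here's a minimal back-of-the-envelope check in Python. It counts payload bits only, ignoring blanking intervals and link overhead, so real link budgets are somewhat tighter; the bandwidth constants are the effective rates quoted above.

```python
# Rough payload math behind the claims above (blanking/overhead ignored).
DP_1_4A_GBPS = 25.92   # effective bandwidth, 4 lanes of HBR3
DP_2_0_GBPS = 77.37    # effective bandwidth, 4 lanes of UHBR20 (~3x jump)

def stream_gbps(width, height, hz, bpp):
    """Uncompressed video payload in Gbit/s."""
    return width * height * hz * bpp / 1e9

# 8K @ 60Hz, 30 bpp HDR: too big for DP 1.4a, fits DP 2.0 uncompressed
print(f"8K60 HDR: {stream_gbps(7680, 4320, 60, 30):.1f} Gbps")         # ~59.7

# 16K @ 60Hz, 30 bpp HDR: needs DSC at roughly a 3.1:1 ratio
raw_16k = stream_gbps(15360, 8640, 60, 30)
print(f"16K60 HDR: {raw_16k:.1f} Gbps, ~{raw_16k / DP_2_0_GBPS:.1f}:1 DSC")

# Three 4K @ 90Hz, 30 bpp HDR: fits uncompressed
print(f"3x 4K90 HDR: {3 * stream_gbps(3840, 2160, 90, 30):.1f} Gbps")  # ~67.2
```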

Here’s a full list of setups:

Single display resolutions

  • One 16K (15360×8640) display @60Hz and 30 bpp 4:4:4 HDR (with DSC)
  • One 10K (10240×4320) display @60Hz and 24 bpp 4:4:4 (no compression)

Dual display resolutions

  • Two 8K (7680×4320) displays @120Hz and 30 bpp 4:4:4 HDR (with DSC)
  • Two 4K (3840×2160) displays @144Hz and 24 bpp 4:4:4 (no compression)

Triple display resolutions

  • Three 10K (10240×4320) displays @60Hz and 30 bpp 4:4:4 HDR (with DSC)
  • Three 4K (3840×2160) displays @90Hz and 30 bpp 4:4:4 HDR (no compression)

DisplayPort 2.0 is backward compatible with previous versions of the standard and carries over their main features, such as Forward Error Correction (FEC) and HDR metadata transport, along with support for USB Type-C and Thunderbolt 3 for carrying data.

When using USB-C via DP Alt Mode, which allows simultaneous SuperSpeed USB data and video, the following configurations are possible (a rough bandwidth check follows the list):

  • Three 4K (3840×2160) displays @144Hz and 30 bpp 4:4:4 HDR (with DSC)
  • Two 4Kx4K (4096×4096) displays (for AR/VR headsets) @120Hz and 30 bpp 4:4:4 HDR (with DSC)
  • Three QHD (2560×1440) @120Hz and 24 bpp 4:4:4 (no compression)
  • One 8K (7680×4320) display @30Hz and 30 bpp 4:4:4 HDR (no compression)
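
A sketch of the math behind those Alt Mode figures, assuming two of the four lanes are given over to SuperSpeed USB data (halving the effective video bandwidth; the exact lane split isn't stated here), and again counting payload only:

```python
# Assumption: 2 of 4 lanes carry USB data, halving video bandwidth.
VIDEO_GBPS = 77.37 / 2   # ~38.7 Gbps left for the display stream(s)

def stream_gbps(w, h, hz, bpp):
    return w * h * hz * bpp / 1e9   # payload only, blanking ignored

print(f"3x QHD120: {3 * stream_gbps(2560, 1440, 120, 24):.1f} Gbps")  # ~31.9
print(f"8K30 HDR:  {stream_gbps(7680, 4320, 30, 30):.1f} Gbps")       # ~29.9
# Both fit under VIDEO_GBPS without compression, matching the list above.
```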

VESA says the first products to incorporate DisplayPort 2.0 will arrive on the market by late 2020.

Image credit: Mehaniq via Shutterstock


 
Quite the leap, hope it's true.
They could have called it DP 3.0 and no one would have complained.
 
I can smell Thunderbolt 4 being announced soon, with 80 Gbit/s and DP 2.0 compatibility.
 
If only more products and technologies would make improvements of similar proportions across generations rather than serial marginal increments when the potential exists to take a huge leap all at once...
 
If only more products and technologies would make improvements of similar proportions across generations rather than serial marginal increments when the potential exists to take a huge leap all at once...
It's like keeping processors on 4 cores for a decade...
 
It looks like HDMI 2.1 can do 4K120.

Wish SLI/CF was still a thing.

Hell, HDMI 2.1 supports 8K/120. Bandwidth is *not* going to be a problem for a very long time. And with the new HDMI spec forcing support of VRR, HDMI gets the win this go-around.

As for SLI/CF, as I predicted, it died to low level APIs. It's simply too much work for developers to implement on their end. SLI/CF was *always* going to be limited to using similar GPUs and managed by drivers; once those two assumptions were no longer the case, it became impossible to support.
 
DisplayPort is much better than HDMI and always has been. But for some reason, it seems most software vendors don't want to support it, even though it does support the latest DRM schemes and such.

Biggest example is Netflix. I have two monitors connected to my 1080 Ti via DisplayPort. One is an LG UD69P-W 4K monitor and the other is an older (but still very great) Dell UltraSharp U2410 with a 1200p resolution.

Both of these monitors are professional-grade monitors with advanced IPS displays (well, the Dell was advanced at the time with its H-IPS panel). The LG uses a really nice AS-IPS panel that's very capable of being accurate and even fast enough for gaming. It even has FreeSync support, and G-Sync actually works pretty damn well on it now that Nvidia has opened up the standard to every monitor. Everything just works better when I use DP connections, but there is one nagging problem and that's Netflix.

Netflix will only stream in 4K to my LG display over HDMI, PLUS I have to make sure no other monitor is connected or it won't work in 4K. Even though, hardware-wise, it's completely possible to stream at 4K over DP, for whatever reason Netflix blocks it. This may be due to the PlayReady DRM it uses along with HDCP, I'm not sure.

All I know is I have to switch inputs from DP to HDMI when I want to enjoy Netflix in 4K on my PC. It's annoying because I also have to completely disable my Dell. Simply turning the Dell off isn't enough. I can pull the connection, but that's even more invasive. The easiest way is to just disable it completely from the NV control panel. So... I have to do this ritual every time I want to watch Netflix in 4K.

I wish they'd just support DP already. I wish the industry would settle on one standard, even if that means one standard for PCs and one for HDTVs, since all video cards support both anyway. But if you read through the specs and see all the features, DP has HDMI beat hands down, which is why it hasn't needed updating in years.
 
Dual 4K screens at 144Hz will be the most common use for this. Perfect without going overboard.


Just an FYI, this is *per port*. Many people get confused and think these DP specs are per *system* since most Nvidia cards come with 3+ DP ports. But one of those Nvidia cards would support 3 times what these specs are showing, since the specs are per port / per cable (as in daisy-chained configurations).
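
To make that concrete, here's a quick sketch (the 3-port card is hypothetical, and the math is payload-only, ignoring blanking):

```python
# Hypothetical card with 3 DP 2.0 ports: each cable has its own budget,
# while daisy-chained displays share a single cable's 77.37 Gbps.
PORT_GBPS = 77.37

def stream_gbps(w, h, hz, bpp):
    return w * h * hz * bpp / 1e9   # payload only, blanking ignored

one_4k144 = stream_gbps(3840, 2160, 144, 24)             # ~28.7 Gbps
print(f"Two daisy-chained 4K144: {2 * one_4k144:.1f} of {PORT_GBPS} Gbps")
print(f"Aggregate across 3 ports: {3 * PORT_GBPS:.1f} Gbps")
```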
 
DisplayPort is much better than HDMI and always has been.

Your own experiences counter your argument.

DP has typically offered more bandwidth, but has offered significantly less feature-wise compared to HDMI. Case in point: VRR is being made standard in HDMI 2.1, but remains optional in DP 2.0.
 
Your own experiences counter your argument.

DP has typically offered more bandwidth, but has offered significantly less feature-wise compared to HDMI. Case in point: VRR is being made standard in HDMI 2.1, but remains optional in DP 2.0.

VESA Adaptive Sync has been available in DP for years. I'm not sure you completely understand how they are using the word "standard". VRR support is "standard" on the "source" (the video card) but not on the "sink" (the monitor). Early this year Nvidia started supporting VAS on DP, so it's not really a big feature anymore on the PC side (it will be for TVs and consoles, though).

DP also supports carrying HDMI over the DP connector. I believe that's why most modern video cards have 3-4 DP ports and 1 HDMI port.
 
VESA Adaptive Sync has been available in DP for years. I'm not sure you completely understand how they are using the word "standard". VRR support is "standard" on the "source" (the video card) but not on the "sink" (the monitor). Early this year Nvidia started supporting VAS on DP, so it's not really a big feature anymore on the PC side (it will be for TVs and consoles, though).

Not correct.

The fact is, VESA Adaptive Sync is not required to be supported; it remains an optional part of the DP 2.0 specification. Display manufacturers do not have to support it. This contrasts directly with HDMI 2.1, which is requiring VRR support for HDMI 2.1-capable displays.

DP also supports carrying HDMI over the DP connector. I believe that's why most modern video cards have 3-4 DP ports and 1 HDMI port.

You forget the connector has to be active; you can't passively convert DP 1.4 into a full-bandwidth HDMI signal. And the only reason GPUs carry more DP ports is because that is what the PC display industry has settled on, even as everyone else uses HDMI. And speaking as someone who uses an LG OLED TV as his primary PC display, that decision sucks.
 
Your own experiences counter your argument.

DP has typically offered more bandwidth, but has offered significantly less feature-wise compared to HDMI. Case in point: VRR is being made standard in HDMI 2.1, but remains optional in DP 2.0.

Optional still means supported. DP is simply the better interface in every way EXCEPT widespread adoption, since HDMI is used on all home theater equipment. But when it comes to PCs, DP should be treated as king, seeing as it's simply superior and much better suited to a platform that needs this type of interface.
 
Optional still means supported. DP is simply the better interface in every way EXCEPT widespread adoption, since HDMI is used on all home theater equipment. But when it comes to PCs, DP should be treated as king, seeing as it's simply superior and much better suited to a platform that needs this type of interface.

Except it really isn't. Optional support means that there's no requirement to support VRR, which means that the majority of displays won't.

Likewise, DP has typically been late to support things like HDCP and CEC commands. HDMI also allows for much longer cables, which is one major reason why DP is DOA outside of use for computer displays.
 
Not correct.

The fact is, VESA Adaptive Sync is not required to be supported; it remains an optional part of the DP 2.0 specification. Display manufacturers do not have to support it. This contrasts directly with HDMI 2.1, which is requiring VRR support for HDMI 2.1-capable displays.



You forget the connector has to be active; you can't passively convert DP 1.4 into a full-bandwidth HDMI signal. And the only reason GPUs carry more DP ports is because that is what the PC display industry has settled on, even as everyone else uses HDMI. And speaking as someone who uses an LG OLED TV as his primary PC display, that decision sucks.

I don't know where you're getting the idea that HDMI is *requiring* display manufacturers to support VRR, because according to their own media and this article from CNET, it is optional, not a required standard feature:

www.cnet.com/google-amp/news/hdmi-2-1-what-you-need-to-know/

(Quote from article)
However, not all TVs that claim HDMI 2.1-compatibility are actually capable of everything we've discussed. HDMI Licensing, the organization in charge of the HDMI specification, is allowing companies to claim 2.1 compatibility even if they don't support every aspect. So a TV that can't accept 8K/60, but has eARC and Variable Refresh Rate, still can claim it's 2.1... as long as the company specifies what aspects of 2.1 it can support.
(End quote)

You also mention cable length, but you misunderstand there too. Increasing speed and bandwidth is almost never free. DP demanded shorter cables due to its higher bandwidth. HDMI is being forced down the same path. Passive cables for 2.1 are currently 2 to 3 meters max (and, as usual, I suspect only the very highest-quality cable makers will be able to meet the 3-meter length at maximum bandwidth). Anything longer will require active cables (or optical cables with media converters).

As for not many monitor makers supporting VRR over DP, that's not really true. They do, they just call it FreeSync. FreeSync is essentially VRR as outlined for DP 1.2a. Since the only two actual players in the game were Nvidia and AMD (Intel didn't support VRR in any way previously), there wasn't any incentive to call it anything else. Now that HDMI will support its own VRR, will that change? Maybe, or since I believe MS has already labeled their Xboxes as "FreeSync" compatible and FreeSync is license-free, we might simply see them all adopting the AMD naming system just like PC monitor makers have.
 