DisplayPort vs HDMI: What's Best for High Refresh Rate Gaming?

That's a good point. The only reason I skipped multi-monitor configurations is that they're not usually associated with high refresh rate gaming. Of course, with new monitors typically supporting high refresh rates, and motorsport/flight sim rigs often running two or three monitors, it's a valid topic for the article. I'll add it as soon as I can.


What graphics card and monitor do you have? As noted in the article, some monitors only support their advertised refresh rate using DP.


The GeForce RTX 30 series has DP 2.0 and HDMI 2.1, so there's no reason for the 40 series not to have DP 2.0 - it's monitors that don't have it.
Nope, the RTX 30 series uses DP 1.4

I just confirmed that on my own 3080 12GB and the newest 3090 Ti from both Asus and EVGA.
 
HDMI doesn't let me run 170Hz on my monitor at 1440p, just 144Hz max. Is that an HDMI limitation, or do I just need a better cable? The one I'm using came in the box with the monitor. DisplayPort runs 170Hz no problem, so that's what I'm using.
It's a version limitation: with industry standard timings, 1440p at 170 Hz needs more bandwidth than anything below HDMI 2.1 can carry.
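For anyone curious, here's a rough back-of-the-envelope check (blanking and encoding overhead ignored, so treat the numbers as approximate) comparing what each mode needs against the commonly quoted HDMI payload rates:

```python
# Rough uncompressed data rate of a video mode vs. HDMI payload rates.
# Payload figures are the commonly quoted ones: HDMI 1.4 carries ~8.16 Gbps
# of video data, HDMI 2.0 ~14.4 Gbps (18 Gbps raw, 8b/10b encoded), and
# HDMI 2.1 FRL ~42.6 Gbps (48 Gbps raw). Blanking intervals are ignored,
# so real requirements are a little higher than these estimates.

def mode_gbps(width, height, hz, bits_per_pixel=24):
    """Active-pixel data rate in Gbps, ignoring blanking."""
    return width * height * hz * bits_per_pixel / 1e9

HDMI_PAYLOAD_GBPS = {"1.4": 8.16, "2.0": 14.4, "2.1": 42.6}

for hz in (144, 170):
    need = mode_gbps(2560, 1440, hz)  # 1440p, 8-bit RGB
    fits = [ver for ver, cap in HDMI_PAYLOAD_GBPS.items() if cap >= need]
    print(f"1440p @ {hz} Hz: ~{need:.1f} Gbps -> fits HDMI {fits}")
```

On those numbers, 1440p at 170 Hz (~15 Gbps before blanking is even counted) slightly exceeds HDMI 2.0's payload too, so a better cable alone won't help unless both the port and the monitor's HDMI input are 2.1-class - while DisplayPort has the headroom, which matches what you're seeing.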
 
Nope, the RTX 30 series uses DP 1.4

I just confirmed that on my own 3080 12GB and the newest 3090 Ti from both Asus and EVGA.
You’re quite right, and apologies to @VitalyT too. I’ll amend the article today to reflect this and make it clear that only AMD’s Radeon 600M GPUs offer DP 2.0 (not that it’s much use at the moment). Funny thing is, I could have sworn I had an old press release from Nvidia claiming DP 2.0 in Ampere - I must be losing it… 🫣
 
I'm seeing a couple of technical omissions here:

1: The maximum supported resolution/refresh combos given for HDMI 2.1 are flat-out wrong, and the article fails to mention that HDMI *also* supports DSC for higher resolutions, much like DisplayPort.

The maximum officially supported combos for HDMI 2.1:

Standard: 4K/240 (DSC) or 8K/120 (DSC)
HDR: 4K/144* or 8K/120 (DSC)

*While not listed as supported, 4K/240 HDR should be possible via DSC, as it needs less bandwidth than 8K/120 HDR.
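For reference, the footnote's reasoning is easy to sanity-check with some quick arithmetic (blanking ignored, and HDR assumed to mean 10-bit color, i.e. 30 bits per RGB pixel):

```python
# Compare raw data rates of the two HDR modes in question (blanking ignored).

def gbps(w, h, hz, bpp):
    return w * h * hz * bpp / 1e9

hdr_4k240 = gbps(3840, 2160, 240, 30)   # ~59.7 Gbps
hdr_8k120 = gbps(7680, 4320, 120, 30)   # ~119.4 Gbps

print(f"4K/240 HDR: ~{hdr_4k240:.1f} Gbps")
print(f"8K/120 HDR: ~{hdr_8k120:.1f} Gbps")
```

8K/120 pushes exactly twice the pixels per second of 4K/240, so if 8K/120 HDR is workable with DSC, 4K/240 HDR should be too.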

The limits given in the chart in the article fail to account for DSC on HDMI, even as the chart gives DSC-supported resolutions for DisplayPort.

2: The VRR range limitation is not an issue with HDMI itself; top-end displays are fully capable of 20-120 Hz VRR ranges (or more) without issue. Yell at panel manufacturers who don't bother to spend the resources to actually support HDMI properly.

Also note that HDMI has its own implementation of VRR, which (more often than not) falls under Nvidia's "G-Sync Compatible" umbrella. And FreeSync over HDMI is totally a thing, so even a G-Sync-vs-FreeSync comparison is hardly fair given HDMI supports both (which makes sense; it's just timed frame data).
 
1: The maximum supported resolution/refresh combos given for HDMI 2.1 are flat-out wrong, and the article fails to mention that HDMI *also* supports DSC for higher resolutions, much like DisplayPort.

The maximum officially supported combos for HDMI 2.1:

Standard: 4K/240 (DSC) or 8K/120 (DSC)
HDR: 4K/144* or 8K/120 (DSC)

*While not listed as supported, 4K/240 HDR should be possible via DSC, as it needs less bandwidth than 8K/120 HDR.

The limits given in the chart in the article fail to account for DSC on HDMI, even as the chart gives DSC-supported resolutions for DisplayPort.
The table wasn't claimed to be a comprehensive calculation of all possible resolution and refresh rate combinations. From the article:

"An entry of 'Yes' in the table means that the display system supports that resolution and refresh rate, without resorting to the use of image compression. It's worth noting that a 'No' entry doesn't necessarily mean that it won't work -- it's just that there's insufficient transmission bandwidth for that resolution and refresh rate, using industry standard display timings."

The timings in question, used to produce the table, are CVT-RBv2; vendor-custom timings will often permit specific resolution+refresh combos that aren't possible with CVT-RBv2.

Note that the table has an entry for 4K 240 Hz - with 4:2:2 chroma subsampling (or DSC) it's supported, ergo other combinations with a similar bandwidth requirement are also supported. The paragraph above the table literally says that HDMI 2.1 has DSC; however, I did leave it off the table by accident, so I'll add that in shortly.
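For the curious, here's a rough sketch of how a CVT-RBv2-style bandwidth figure is derived. It assumes the commonly cited reduced-blanking parameters (a fixed 80-pixel horizontal blank and a 460 µs minimum vertical blank); the actual VESA spec has more detail, so treat the outputs as estimates:

```python
import math

# Approximate CVT-RBv2 timing parameters (assumed, per the published
# reduced-blanking values): fixed 80-pixel horizontal blanking and a
# minimum vertical blanking interval of 460 microseconds.
H_BLANK_PX = 80
MIN_V_BLANK_S = 460e-6

def cvt_rb2_gbps(h_active, v_active, refresh_hz, bits_per_pixel=24):
    """Estimate the uncompressed data rate for a mode with CVT-RBv2 timings."""
    h_total = h_active + H_BLANK_PX
    # Smallest v_total whose blanking lasts >= 460 us, from:
    # line_time = 1 / (refresh * v_total)
    # blank_time = (v_total - v_active) * line_time
    v_total = math.ceil(v_active / (1 - MIN_V_BLANK_S * refresh_hz))
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * bits_per_pixel / 1e9

print(f"4K 240 Hz, 8-bit: ~{cvt_rb2_gbps(3840, 2160, 240):.1f} Gbps")
print(f"4K 144 Hz, 10-bit: ~{cvt_rb2_gbps(3840, 2160, 144, 30):.1f} Gbps")
```

On those assumptions, 4K 240 Hz at 8 bits per channel works out to roughly 55 Gbps - well beyond HDMI 2.1's ~42 Gbps of payload, hence the DSC/4:2:2 caveat in the table - while 4K 144 Hz at 10-bit lands around 39 Gbps and fits uncompressed.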

2: The VRR range limitation is not an issue with HDMI itself; top-end displays are fully capable of 20-120 Hz VRR ranges (or more) without issue. Yell at panel manufacturers who don't bother to spend the resources to actually support HDMI properly.

Also note that HDMI has its own implementation of VRR, which (more often than not) falls under Nvidia's "G-Sync Compatible" umbrella. And FreeSync over HDMI is totally a thing, so even a G-Sync-vs-FreeSync comparison is hardly fair given HDMI supports both (which makes sense; it's just timed frame data).
It doesn't matter whether the fault lies with HDMI or the monitor vendors; at the end of the day, one gets better VRR support using DP. HDMI does indeed have its own VRR (it's the Adaptive-Sync entry in the second table), and the table also makes it clear that FreeSync over HDMI is perfectly possible. The second paragraph underneath the table also points out that VRR often works outside the advertised ranges.

The same is true for the HDR refresh rates in the second table - they're not indicative of what the interface is capable of, simply what the manufacturer limits it to.
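As a footnote on why advertised VRR ranges aren't the whole story: below a panel's minimum VRR rate, drivers typically fall back to frame multiplication (AMD brands this LFC, and G-Sync does something similar). A toy sketch of the idea, with made-up range numbers:

```python
# Toy model of low-framerate compensation (LFC): when the game's frame rate
# drops below the display's minimum VRR rate, the driver repeats each frame
# enough times to push the effective refresh back into the supported window.

def lfc_refresh(fps, vrr_min=48, vrr_max=120):
    """Return (multiplier, refresh_hz) used to present a given frame rate."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)
    multiplier = -(-vrr_min // fps)   # ceiling division
    return multiplier, min(fps * multiplier, vrr_max)

for fps in (100, 60, 30, 20):
    mult, hz = lfc_refresh(fps)
    print(f"{fps} fps -> each frame shown {mult}x, panel runs at {hz} Hz")
```

So a 48-120 Hz range can still present 20 fps smoothly by tripling frames, which is part of why displays often behave fine outside their quoted range.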
 
No mention of HDCP support on HDMI?? I think that's the reason TVs, video players, and consoles only include HDMI rather than DP, which is clearly license-free. The HDCP playback support required as DRM by media like Blu-ray is why manufacturers prefer the otherwise inferior HDMI over DP.
 
1) HDMI is by far the most versatile/compatible standard when video+audio is the most important thing. Most video services and devices use the HDCP handshake over HDMI.

2) DP is still the best standard for high-bandwidth image needs, like high-framerate games.

3) HDMI 2.x is catching up, BUT the standard is too loose: most vendors don't meet the high bar and don't support most of the advanced functions. So although HDMI 2.1 is very versatile and powerful on paper, real-world implementations leave much to be desired.

4) I think in the near future we'll see a single common standard - for example, a "USB-C 5.0" with 80 Gbps of bandwidth mixing everything from USB + HDMI + DP 2.x. All USB-C video cables would be somewhat more expensive, but they'd definitely reach 80 Gbps. USB4 already supports DP 2.0, so a new version that ends the zoo of different connectors would be great.
 
This article is wrong in more ways than one. First off, there are no GPUs on the market that have DisplayPort 2.0. The latest AMD 6000 series and Nvidia 3000 series support DP 1.4a and HDMI 2.1. HDMI 2.1 monitors only started showing up on the market last year, so no DP 2.0 for at least 2-3 years. HDMI 2.1 (48 Gbps) > DP 1.4a, since it can do 4K 144Hz HDR full RGB without DSC. Hope this makes it clear for everyone.
 
You’re quite right, and apologies to @VitalyT too. I’ll amend the article today to reflect this and make it clear that only AMD’s Radeon 600M GPUs offer DP 2.0 (not that it’s much use at the moment). Funny thing is, I could have sworn I had an old press release from Nvidia claiming DP 2.0 in Ampere - I must be losing it… 🫣
You're not losing it, and no need to apologize. No one here is perfect ;)
 
Great article, thank you. It'd be really nice to see one on the importance of VRR (and thus DisplayPort) for a smooth and genuine emulation and retro-gaming experience.

In general, I'd love retro-gaming coverage on TechSpot. Doesn't need to be massive.
 
This is a great article. However, there is one important point that is missed - the cost of using HDMI to the manufacturer. There is an annual fee to use HDMI, plus a royalty per unit and/or HDMI key. There is no cost to use DisplayPort...
 
DP 2.0 is the only interface today that supports 8K monitors without loss of quality, at up to 36-bit color depth at 60 Hz. HDMI 2.1, even in its full version with 42 Gbps of payload (useful data only, excluding link overhead), cannot even handle 8K@60Hz at 24 bits. Its destiny is 4K monitors and household appliances.

To date, despite DP 2.0 in its full version having been adopted as a standard almost 3 years ago, all the hardware makers have let consumers down: none of the Intel/AMD/Nvidia trio has delivered DP 2.0 at 80 Gb/s (77 Gb/s payload).
Zen 4 (and current discrete GPUs) only supports the weak UHBR10 mode of DP 2.0 (i.e. 40 Gb/s including link overhead, and even less without it - weaker than full HDMI 2.1).

Alas, today there is no hardware that actually supports lossless 8K transmission at 60 Hz (let alone higher refresh rates) at color depths of 24 to 36 bits with the full UHBR20 (20 Gbit/s per lane) mode. The 8K future never happened. And yet, formally, even mid-range discrete video chips from 12 years ago could pump enough data through memory to drive 8K monitors in 2D mode - exactly what you need for high-quality output of text, vector graphics, and photos on 27-32" screens, where 4K is clearly not enough in terms of pixel density (PPI) at 50-60 cm from the screen.
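The arithmetic behind those claims is easy to reproduce. A minimal check, ignoring blanking and using the usual quoted payload figures (~42.6 Gbps for HDMI 2.1 FRL, ~77.4 Gbps for DP 2.0 UHBR20):

```python
# Uncompressed 8K@60 data rates vs. each link's payload (blanking ignored).
PAYLOAD_GBPS = {"HDMI 2.1 (FRL)": 42.6, "DP 2.0 (UHBR20)": 77.4}

for bpp, label in ((24, "8-bit"), (30, "10-bit"), (36, "12-bit")):
    need = 7680 * 4320 * 60 * bpp / 1e9
    verdict = ", ".join(
        f"{name}: {'OK' if cap >= need else 'needs DSC'}"
        for name, cap in PAYLOAD_GBPS.items()
    )
    print(f"8K@60 {label}: ~{need:.1f} Gbps -> {verdict}")
```

Even 8K@60 at 8-bit (~47.8 Gbps before blanking) overshoots HDMI 2.1's payload, while full UHBR20 would cover it uncompressed all the way up to 12-bit color.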

In terms of cable length, both HDMI 2.1 and DP 2.0 (which they don't put in projectors or TVs) have long been impractical without switching to optical cable. Projector owners know this well - typical distances there are 8-15 m.

The data signal should have been moved to optical cable long ago, which removes all these restrictions at once (for example, a noisy gaming rig could be banished to a back room of the house, with a 15-30 m optical cable run to the monitor, so you can work and play in complete silence). But the industry keeps pretending it isn't needed. Consumers do need it.
 