Nvidia's GTX 1070 and 1080 DisplayPorts incompatible with HTC Vive

Scorpus


This is a strange one. According to the good folks over at Tom's Hardware, the DisplayPorts on both the Nvidia GeForce GTX 1080 and GeForce GTX 1070 are incompatible with the HTC Vive headset.

The HTC Vive has two options for connecting your graphics card to the headset through the HMD's link box: HDMI or mini-DisplayPort. Most people will end up using the HDMI connection as it's more common and an HDMI cable is included in the box, but DisplayPort is there for those who have already used the HDMI port on their graphics card for another display.

When hooking up a GTX 1070 or GTX 1080 to the Vive through DisplayPort, Tom's Hardware reports that SteamVR does not detect the headset. The headset works fine over HDMI, but not over DisplayPort, on both the latest drivers and the drivers that launched with either card.
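For readers who want to check what SteamVR sees on their own setup, a minimal detection check against Valve's OpenVR API (the interface SteamVR exposes to applications) looks roughly like the sketch below; the printed messages are illustrative, not output captured from Tom's Hardware's testing.

```cpp
// Minimal sketch: ask the SteamVR/OpenVR runtime whether it can see an HMD.
// Build against the OpenVR SDK (openvr_api); message text is illustrative only.
#include <openvr.h>
#include <cstdio>

int main() {
    if (!vr::VR_IsRuntimeInstalled()) {
        std::printf("SteamVR runtime is not installed.\n");
        return 1;
    }
    if (!vr::VR_IsHmdPresent()) {
        // This is the state described over DisplayPort: the headset is
        // plugged in, but the runtime never reports an HMD.
        std::printf("No HMD detected by the runtime.\n");
        return 1;
    }
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem *system = vr::VR_Init(&err, vr::VRApplication_Other);
    if (err != vr::VRInitError_None) {
        std::printf("VR_Init failed: %s\n",
                    vr::VR_GetVRInitErrorAsEnglishDescription(err));
        return 1;
    }
    std::printf("HMD detected and runtime initialised.\n");
    vr::VR_Shutdown();
    return 0;
}
```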

This is an unusual issue for Nvidia and something that has clearly slipped through the cracks during testing. It's not clear whether the problem can be addressed through a software update, and the only word that's come from Nvidia since the issue first surfaced is that they're "still investigating the issue".

And it's not as if this issue affects all Nvidia graphics cards: you can connect a GeForce GTX 980 Ti to an HTC Vive through DisplayPort and it works just fine. It's just the 10-series cards that are experiencing the issue, so hopefully Nvidia can identify and resolve the problem in the near future.


 
I am very mad. I just bought an Inno3D iChill X3 1070 (the cheapest I could find, 500 euros, a much higher price than Nvidia promised and led us to believe it would cost, which is itself already pretty shameful) to work with the HTC Vive, which I was going to set up tonight. I also had to spend 18 euros on a mini-DisplayPort to DisplayPort cable, since Nvidia decided to put three DisplayPorts and ONLY ONE HDMI port on the card, a decision I don't agree with.
I need the only HDMI port for my projector, which is in my living room (where the HTC Vive will be used), and the dual-link DVI port for my monitor (it's the only connection type it has), which is where the PC is. Now they are saying it won't even work?! This means it's 18 euros down the drain... I am truly furious!

Cheers
 

DisplayPort adapts easily to HDMI and DVI with adapters, but the inverse is not true. That's why they elected to use DisplayPort.

And if your monitor only has DVI, there's a good chance you're missing out on what your GPU has to offer. A top-tier GPU with an old monitor is like a luxury sports car with old, worn-out tires.
 
Let me add a little sarcasm to your sarcasm. TROLL! Everyone run for your life.
And if your monitor only has DVI, there's a good chance you're missing out on what your GPU has to offer. A top-tier GPU with an old monitor is like a luxury sports car with old, worn-out tires.
Now, if only we didn't have speed limits and stop signs, we could get the full potential out of our cars. Much like gaming: we can't get the full potential out of our games because of crappy graphics and gameplay. The monitor is of little concern to me.
 
DisplayPort adapts easily to HDMI and DVI with adapters, but the inverse is not true. That's why they elected to use DisplayPort.

And if your monitor only has DVI, there's a good chance you're missing out on what your GPU has to offer. A top-tier GPU with an old monitor is like a luxury sports car with old, worn-out tires.
My monitor is a 27-inch 1440p IPS screen overclocked to 120 Hz, so it's exactly the opposite of what you said. And dual-link DVI has plenty of bandwidth, for your information. I suggest you read up on it.
 
Good thing I cancelled my Vive order then. I'll buy a 1080 Ti and hope this gets fixed by then. I use HDMI for the TV and two DisplayPorts for monitors, so the only free ports are a DisplayPort and DVI. Also good to know that it doesn't ship with a DisplayPort cable.
 
And if your monitor only has DVI, there's a good chance you're missing out on what your GPU has to offer. A top-tier GPU with an old monitor is like a luxury sports car with old, worn-out tires.
Pretty asinine comment. If you're only taking gaming monitors into account, then you might have a point, but dual-link DVI still has more than enough bandwidth and is still digital, and while most monitors are released with DVI-D and HDMI, DisplayPort is actually still the least-used interface (even compared to VGA).
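To put rough numbers on the bandwidth side of this argument, here's a back-of-the-envelope sketch; the figures are nominal link ceilings that ignore blanking, audio and protocol details, so treat them as approximations rather than spec-exact values.

```cpp
// Rough sketch: compare the raw video payload a display mode needs against
// the nominal data ceilings of common links. Blanking intervals, audio and
// protocol overhead are ignored, so these are approximations only.
#include <cstdio>

int main() {
    const double kGbit = 1e9;

    // Approximate effective video bandwidth of each link:
    const double dl_dvi  = 2 * 165e6 * 24;        // dual-link DVI: 2 x 165 MHz pixel clock, 24 bpp (~7.9 Gbit/s)
    const double hdmi_14 = 340e6 * 24;            // HDMI 1.4: 340 MHz TMDS clock of pixel data (~8.2 Gbit/s)
    const double dp_12   = 4 * 5.4e9 * 8.0 / 10;  // DisplayPort 1.2: 4 lanes x 5.4 Gbit/s, 8b/10b (~17.3 Gbit/s)

    // Example mode from the thread: 2560x1440 at 60 Hz, 24-bit colour.
    const double need = 2560.0 * 1440 * 60 * 24;  // ~5.3 Gbit/s before blanking

    std::printf("Mode needs        ~%.1f Gbit/s\n", need / kGbit);
    std::printf("Dual-link DVI     ~%.1f Gbit/s\n", dl_dvi / kGbit);
    std::printf("HDMI 1.4          ~%.1f Gbit/s\n", hdmi_14 / kGbit);
    std::printf("DisplayPort 1.2   ~%.1f Gbit/s\n", dp_12 / kGbit);
    return 0;
}
```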
 
Asus is the only card manufacturer I have seen that puts 2 HDMI and 2 DP ports on its 1080s. All of the others appear to be 1 HDMI and 3 DP. Just FYI in case someone was looking for different port options.
 
Chuckle all you will, this is a very valid point. Compare performance vs. price, availability and pricing issues, and bugs like this one that weren't discovered until weeks after an 'in-depth' review, and this is definitely not a 100/100 product. Just a lazy reviewer, jaded by four generations of graphics chips with diminishing returns, surprised to get a new one that actually improves performance.
 
Pretty asinine comment. If you're only taking gaming monitors into account, then you might have a point, but dual-link DVI still has more than enough bandwidth and is still digital, and while most monitors are released with DVI-D and HDMI, DisplayPort is actually still the least-used interface (even compared to VGA).

There you have it. The only way to promote DisplayPort over legacy connectors is to restrict legacy connectors on video cards. There are always people who prefer legacy over modern, and the only way to get rid of legacy is to stop supporting it. There are still many "modern" video projectors with only a VGA connector because it's 50 cents (or something) cheaper than DVI.

Unlike HDMI, DisplayPort is royalty free and has been around since 2007. There is really no good reason why DisplayPort is still so little used, aside from the fact that adding a DisplayPort connector to a device costs around 50 cents or something. To make DisplayPort more widely adopted, the only way seems to be stripping legacy connectors from video cards.

Customers had the chance to buy more DisplayPort devices. As usual, customers were too stupid, so now DisplayPort is becoming not so much an option as a necessity.
 
Chuckle all you will, this is a very valid point. Compare performance vs. price, availability and pricing issues, and bugs like this one that weren't discovered until weeks after an 'in-depth' review, and this is definitely not a 100/100 product. Just a lazy reviewer, jaded by four generations of graphics chips with diminishing returns, surprised to get a new one that actually improves performance.

Steve did open himself up for flames when he gave the 1080/1070 perfect scores.
I would have given them both low 90s, but not a perfect score.
The temps were too high (almost AMD-bad) and the Founders Edition cards were a dumb idea by Nvidia.
But the performance jump from a 980 to a 1080 is inarguable; we haven't seen such a massive leap in five years. I don't care about rebrands or the similar Maxwell/Pascal tech used. Those are weightless points; the entire IT industry is about shrinking die sizes.

I see others on here bitching about pricing, but that is decided by the market/reseller and the quality of the product, not the manufacturer. When a user mentions this point (and some others about features or new tech), I can almost automatically guess their noob GPU-enthusiast experience and age. This has happened since the Voodoo days, kiddies. I had to pay out of my @ss for my GTX 280 due to limited availability (oh 2009, how I loathe thee). Two months later, it was available for $550. So pissed!

And lastly, nothing is perfect.
While Nvidia has had the best reputation for quality software and hardware for the past 10 years, you will never see me or any true fan sugarcoat issues. They have their fair share of problems. We are all victims of diminishing returns in this business; it's the nature of the beast. There are lots of people on here who come in and act like they know a lot about GPU hardware and software because they've been doing it for a whole five years. A few of TechSpot's staff and a very few select posters are at the same knowledge level as me, Steve or dividebyzero when it comes to GPU topics.
I'll never forget trying to explain 333 Megatexels/sec (Voodoo3 3000 reference FTW) and pixel shaders to my professor at RIT.
 
I see others on here bitching about pricing, but that is decided by the market/reseller and the quality of the product, not the manufacturer. When a user mentions this point (and some others about features or new tech), I can almost automatically guess their noob GPU-enthusiast experience and age. This has happened since the Voodoo days, kiddies. I had to pay out of my @ss for my GTX 280 due to limited availability (oh 2009, how I loathe thee). Two months later, it was available for $550. So pissed!

And lastly, nothing is perfect.
Bang for buck should always be a consideration, and this price point is poor bang for buck, which should preclude a perfect score. They always price gouge at the top end of town, particularly when AMD has been a bit lacklustre with the competition there.
 
There are lots of people on here who come in and act like they know a lot about GPU hardware and software because they've been doing it for a whole five years. A few of TechSpot's staff and a very few select posters are at the same knowledge level as me, Steve or dividebyzero when it comes to GPU topics.
I'll never forget trying to explain 333 Megatexels/sec (Voodoo3 3000 reference FTW) and pixel shaders to my professor at RIT.
I still have my first graphics card -- an ancient Trident modded up to a whopping 8 MB. And then my Voodoo1 add-in for Glide support -- the first 'real' 3D! Those were the days when you still had to tweak your DOS high memory values constantly just to play games. Then there was my Riva TNT, then my ATI 64 DDR, then the first Radeon (now, that was a huge step forward for the times), even my dual-port Matrox VGA card. THOSE were all products worthy of a 100/100 rating, if any products ever were. They were simply revolutionary steps forward that pulled the rest of the industry with them.

Nobody is disputing the performance of the 10-series -- those gains are much more than you'd expect from a simple die shrink (unlike the RX 480), showing much work done on the underlying architecture. Is it revolutionary, though? Not by any means.

If it had the fastest tech available (namely HBM2) and supported DX12 fully at a hardware level, then maybe that 100/100 would feel better earned.
 
Nobody is disputing the performance of the 10-series -- those gains are much more than you'd expect from a simple die shrink (unlike the RX 480), showing much work done on the underlying architecture.

Quite the opposite, in fact. The 10-series is very much just a die shrink, while the RX 480 brings many architectural changes.

Compare the theoretical performance of the GTX 1070 and the GTX 970 and you can easily see that the theoretical difference is very close to the real-world difference. That indicates Pascal is essentially just a die shrink of Maxwell.
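As a rough illustration of that comparison, here's a back-of-the-envelope sketch using Nvidia's published CUDA core counts and reference boost clocks (one common, if simplistic, way to define theoretical throughput):

```cpp
// Rough sketch: theoretical FP32 throughput = shader cores x 2 ops (FMA) x clock.
// Reference boost clocks are used; actual cards boost higher or lower.
#include <cstdio>

int main() {
    // GTX 970: 1664 CUDA cores, ~1178 MHz reference boost.
    const double gtx970 = 1664 * 2 * 1.178e9;   // ~3.9 TFLOPS
    // GTX 1070: 1920 CUDA cores, ~1683 MHz reference boost.
    const double gtx1070 = 1920 * 2 * 1.683e9;  // ~6.5 TFLOPS

    std::printf("GTX 970  ~%.1f TFLOPS\n", gtx970 / 1e12);
    std::printf("GTX 1070 ~%.1f TFLOPS\n", gtx1070 / 1e12);
    std::printf("Theoretical ratio ~%.2fx\n", gtx1070 / gtx970);
    return 0;
}
```

That theoretical ratio of roughly 1.65x is in the same ballpark as the real-world gap reviewers reported, which is the comparison being made above.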

TSMC's process just seems to work much better than GF's.
 
And if your monitor only has DVI, there's a good chance you're missing out on what your GPU has to offer. A top-tier GPU with an old monitor is like a luxury sports car with old, worn-out tires.
Pretty asinine comment. If you're only taking gaming monitors into account, then you might have a point, but dual-link DVI still has more than enough bandwidth and is still digital, and while most monitors are released with DVI-D and HDMI, DisplayPort is actually still the least-used interface (even compared to VGA).
I'm very happy with HDMI/DVI-D; my experience of DisplayPort, mainly on laptops, has been less good. It's a horrible interface and seems mechanically unsound.
 