Nvidia RTX 5000 graphics cards to offer DisplayPort 2.1 and PCIe 5.0 support, built on...

midian182

Rumor mill: We might still be more than a year away from Nvidia's next-gen RTX 5000-series graphics cards hitting the market, but that doesn't mean rumors aren't already arriving. The latest of these claims the cards will be built on TSMC's 3nm process node and come with features such as DisplayPort 2.1, which AMD already offers, and PCI Express 5.0.

Prolific and usually reliable leaker Kopite7kimi made the RTX 5000 claims on X. They might not be the most earth-shattering rumors, but their mundanity makes them all the more likely to be accurate.

Moving from TSMC's 5nm process node used for the Lovelace generation to 3nm for its successor is an obvious step for Nvidia. Apple's recent M3 SoCs utilize the 3nm node, which TSMC says brings a 15% performance improvement compared to 5nm, along with better efficiency (~30%) while shrinking the die size by around 42%. No word on whether the RTX 5000 will use a custom node or one of TSMC's announced N3 family: N3E, N3P, and N3A.

The RTX 5000 Blackwell graphics cards are also predicted to offer DisplayPort 2.1. Some have been disappointed with the RTX 4000 series sticking with DisplayPort 1.4a connections while AMD's RX 7000 line uses DP 2.1. AMD's standard, UHBR 13.5, supports a link bandwidth of up to 54 Gbps – not the full 80 Gbps (UHBR20) that DisplayPort 2.1 can provide, but a huge bandwidth upgrade over DP 1.4. This allows for future display types such as 8K at 165Hz and 4K at a blistering 480Hz. It's not clear which DisplayPort 2.1 standard Nvidia will choose.
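As a rough sanity check, those headline display modes only fit within DP 2.1's link rates with Display Stream Compression. This is a back-of-envelope sketch that ignores blanking intervals and protocol overhead, and the ~3:1 DSC ratio is an assumption:

```python
# Back-of-envelope check of why DP 2.1's higher link rates matter for
# the display modes mentioned above (ignores blanking/protocol overhead).
def raw_gbps(width, height, hz, bpp=24):
    """Uncompressed video data rate in Gbit/s."""
    return width * height * hz * bpp / 1e9

UHBR13_5 = 54.0   # Gbit/s link bandwidth (AMD RX 7000)
UHBR20   = 80.0   # Gbit/s max DP 2.1 link bandwidth
DSC      = 3.0    # assumed Display Stream Compression ratio

for name, w, h, hz in [("4K @ 480Hz", 3840, 2160, 480),
                       ("8K @ 165Hz", 7680, 4320, 165)]:
    raw = raw_gbps(w, h, hz)
    print(f"{name}: {raw:.1f} Gbps raw, {raw / DSC:.1f} Gbps with ~3:1 DSC")
```

Both modes blow past even UHBR20 uncompressed, but land comfortably under UHBR 13.5 once DSC is applied, which is why the new link rates plus compression make them plausible.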

PCIe 5.0 compatibility is another rumored feature in the RTX 5000 series. The cards are also expected to keep the 16-pin power connector, albeit using the revised 12V-2x6 connector. The specter of the melting RTX 4090s remains a problem for buyers and Team Green, but there have been no reported issues since the company updated its flagship to 12V-2x6 adapters earlier this year. Tests have shown that the revised power cables are safe even when not fully connected.
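For context, the usable per-direction bandwidth of a 16-lane link can be estimated from the transfer rate and the 128b/130b encoding PCIe has used since 3.0 (a quick sketch, ignoring higher-level protocol overhead):

```python
# Rough per-direction bandwidth of a PCIe x16 link.
def x16_gbytes_per_s(gt_per_s, enc=128 / 130):
    """Usable GB/s for a 16-lane link with 128b/130b encoding (PCIe 3.0+)."""
    return gt_per_s * 16 * enc / 8  # 8 bits per byte

print(f"PCIe 4.0 x16: {x16_gbytes_per_s(16):.1f} GB/s")  # 16 GT/s per lane
print(f"PCIe 5.0 x16: {x16_gbytes_per_s(32):.1f} GB/s")  # 32 GT/s per lane
```

That doubling to roughly 63 GB/s each way is the headline benefit, even if today's games rarely saturate a PCIe 4.0 x16 link.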

There are plenty of other rumors about Blackwell, including its use of GDDR7 memory, but it's best to wait until more compelling evidence of these features arrives.

The big question everyone wants answered is when the RTX 5000 series will land. Many believe it won't be here until 2025, but the next-gen cards could launch late next year, after the Blackwell HPC GPUs start shipping.

Something that suggests the RTX 5000 cards won't launch until 2025 is what appears to be the RTX 4000 Super variants' imminent arrival. There have been a slew of rumors that the RTX 4080 Super, 4070 Super, and 4070 Ti Super will be unveiled at CES in January. If they prove accurate, Nvidia might not want to launch a new card generation in the same year. Let's hope the company prices the RTX 5000 line a bit more sensibly, too.


 
DP 2.1, built on TSMC 3nm, with GDDR7 and PCIe 5.0. I expect a price bump of at least 50%.
Now how about that frame buffer? Still 8GB for 1080p cards?

Today the mighty 4090 can't drive the latest AAA RT titles at 4K 60FPS without FG and DLSS. Going to take at least
a decade for 8K 165Hz. Didn't Jensen tell you that Moore's law is dead? Don't expect more than 3-5% more perf from a 5090 vs 4090.

 
If display port carries audio, too, maybe. HDMI is the way to go when you need audio, however.
DisplayPort can be used to transmit audio and video simultaneously, although each can be transmitted without the other. The video signal path can range from six to sixteen bits per color channel, and the audio path can have up to eight channels of 24-bit, 192 kHz uncompressed PCM audio.
Wikipedia
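For scale, the audio ceiling quoted above is a tiny fraction of the video link's bandwidth (quick check, assuming fully uncompressed PCM):

```python
# Peak uncompressed PCM audio rate DisplayPort supports, per the
# quoted figures: 8 channels of 24-bit samples at 192 kHz.
channels, bit_depth, sample_rate = 8, 24, 192_000
audio_mbps = channels * bit_depth * sample_rate / 1e6
print(f"Max uncompressed PCM audio: {audio_mbps:.1f} Mbit/s")
```

Under 40 Mbit/s, versus tens of Gbit/s for the video signal, so carrying audio is essentially free for the link.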
 
Curious to see how long the 3090 lasts at 3840x1600 for 60fps games. I wonder if I can make it last until RTX9090 and have a whole system overhaul then.
 
I have no use for 8K displays or anything 480Hz but I look forward to more of the great innovations that Nvidia brings to each new GPU generation. I'm sure nothing will touch Blackwell, just like Ada, Ampere and so many before them.
 
If display port carries audio, too, maybe. HDMI is the way to go when you need audio, however.
From what I've read, the reason they don't include DisplayPort on TVs is that there isn't any DRM on DisplayPort, which would make watching pirated content in HDR easier on a TV. Streaming services don't allow you to watch 4K or high-bitrate content on a PC. Not saying this applies exclusively, but I know that I have a home theater system connected to my TV and haven't used my TV's audio in years. Most people with high-end TVs at least have a sound bar.

I'm just annoyed because I want my next TV to be 8K120 because I use it as a desktop monitor. I don't expect to play games at 8K120, at least not anytime soon. Just saying, 8K120 would be nice for smoothness in desktop applications. I remember it took forever to get 4K120 in a TV due to HDMI limitations, and I think it's still stuck at 4:2:2 instead of 4:4:4 for some arbitrary reason.
 
It is absolutely a sense of entitlement to complain about the price of luxury items. Nobody needs to play video games of any kind, let alone technologically advanced and high fidelity games that require uber powerful hardware. Nvidia can charge whatever the market will bear and time has proven that people will pay to play. I'm far from wealthy and dislike the high prices as much as anyone, but do not subscribe to the notion that prices 'should' be lower.
 
Are we ever going to get displayport on TVs?
No because it's incomaptible with the HDMI Consortium Requirements that if you license HDMI, you can't provide any other digital connection. VGA is fine as that's Analog.

It's this reason I wont buy any new TV today and will stick with a Computer Display with DP as it's a better standard then HDMI since it is fully digital. Yes HDMI is still Analog, which is why it's crap
 
No because it's incomaptible with the HDMI Consortium Requirements that if you license HDMI, you can't provide any other digital connection. VGA is fine as that's Analog.
This is wrong. Please explain the multitude of monitors with HDMI that support DRMed HDR content and ALSO have DisplayPort as an available option.

It's this reason I wont buy any new TV today and will stick with a Computer Display with DP as it's a better standard then HDMI since it is fully digital. Yes HDMI is still Analog, which is why it's crap
.....what? How on EARTH do you figure HDMI is analog? HDMI doesn't support analog at ALL, which is why HDMI to VGA converters have boxes on them to actively convert the signal. Now DISPLAYPORT can output in either analog or digital, which is why it doesn't need an active converter to do VGA out.

I think you may be drunk.
 
I have no use for 8K displays or anything 480Hz but I look forward to more of the great innovations that Nvidia brings to each new GPU generation. I'm sure nothing will touch Blackwell, just like Ada, Ampere and so many before them.


Did you get that Rah Rah Rah approved by Nvidia first ?

Nvidia's ego of course means - if AMD bring out an expensive monster then Nvidia just has to bring out RTX5090 Ultra Titan

GPUs scale - so just have to throw more real estate

Lets be real - people will buy the RTX 5090 even if $3000 and 1000w power draw - I assume it will be $2000 and say 600W though

The Toyota supercar is nice, the well-heeled will buy the Lexus - but most of us just want the Camry or Corolla

So the main competition is in 5080 and 4060 range - where most people buy
Plus the new ARM GPUs/APUs that will come as a cheap but pretty good solution to play casual games

Lots of smart gamers saved big on AMD or a 4080 - seeing just as good graphics as some users of 4090 with smart settings - on lots of games - the differences aren't relevant 200fps vs 280fps on non FPS game

That's $500 saved they can have fun elsewhere in their lives, plus power bill , extra build cost , noise reduction or louder noise

Even more savvy people buy last year's stock at a big discount and buy AAA games of last year deeply discounted

End of the day, what's a game developer to do - sales come from Xbox/PS5 versus a few percent on the RTX 5090 - when 4060-class cards will be the Steam average next year

TL;DR - power-expensive flagship GPUs are interesting - but most of us don't care and will never buy one, even with deep enough pockets
 
No because it's incomaptible with the HDMI Consortium Requirements that if you license HDMI, you can't provide any other digital connection. VGA is fine as that's Analog.

It's this reason I wont buy any new TV today and will stick with a Computer Display with DP as it's a better standard then HDMI since it is fully digital. Yes HDMI is still Analog, which is why it's crap
Absolutely everything in both these sentences is 100% false. This must be a troll post.
 
The Toyota supercar is nice, the well-heeled will buy the Lexus - but most of us just want the Camry or Corolla
I think that’s very true. Makes me wonder if the attitudes are similar in other industries.

Do people buy a Camry and then spend their time pontificating on car forums about how evil Lexus is and circle jerking about how much smarter they are than Lexus customers? I honestly have no idea as cars are not something I spend a lot of time thinking about.
 
"15% performance improvement compared to 5nm, along with better efficiency (~30%)"

In this context performance IS efficiency, how can these numbers be different?!

At best I can interpret it as: at the same performance level it is 30% more efficient than 5nm, but if you push it to run at the same power as the quoted 5nm comparison, it is only 15% faster, due to diminishing power efficiency at higher levels. Or is it 30% more efficient at an optimal power envelope, with 15% higher performance as an upper limit?
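One plausible reading (my assumption, mirroring how foundries typically quote node gains) is that the two figures describe two different operating points on the same frequency/power curve, iso-power versus iso-performance:

```python
# Toy illustration (assumed interpretation): a node shrink improves the
# frequency/power curve, and vendors quote gains at two separate points.
baseline_perf, baseline_power = 1.00, 1.00  # normalized 5nm part

# Option A: spend the whole budget on speed (iso-power)
iso_power_perf = baseline_perf * 1.15         # "15% performance improvement"

# Option B: hold speed constant and pocket the savings (iso-performance)
iso_perf_power = baseline_power * (1 - 0.30)  # "~30% better efficiency"

print(f"Iso-power: {iso_power_perf:.2f}x perf at {baseline_power:.2f}x power")
print(f"Iso-perf:  {baseline_perf:.2f}x perf at {iso_perf_power:.2f}x power")
```

On that reading the numbers aren't contradictory; you simply can't get the +15% speed and the -30% power at the same time.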
 