Cyberpunk 2077 met with second delay, release set for November

Shawn Knight

Rumor mill: Given the revised launch date of mid-November, some are speculating that CD Projekt Red’s decision could have more to do with timing than polish. That is, a mid-November launch is right around the time that next-gen consoles from Microsoft and Sony are expected to go on sale. Having a mega game like Cyberpunk 2077 as a launch title would certainly give gamers yet another reason to hop aboard the early adopter bandwagon.

CD Projekt Red has made good on its promise to share more information about the pending release of Cyberpunk 2077. Unfortunately, it probably isn’t the sort of news that most want to hear.

In a press release issued on Thursday, CD Projekt Red said it has decided to push the launch of Cyberpunk 2077 back to November 19. The developer said the game is, as of now, finished in terms of both content and gameplay: from the quests and cutscenes to skills and items, everything is in place. But with such a complex game, the team felt it needed a little extra time to balance gameplay mechanics and iron out bugs.

“Ready when it’s done” is not just a phrase we say because it sounds right, it’s something we live by even when we know we’ll take the heat for it. At the same time, we are fully aware that making such a decision costs us your trust and trading trust for additional time is one of the hardest decisions a game developer can make. And despite we think it’s the right decision for the game, we’d still like to apologize for making you wait longer.

Even so, a two-month delay isn't fun, especially considering this is the second time the game has been bumped. In January, it was revealed that the original April 16 launch had been postponed until September 17. As recently as April, it appeared as though the game was still on track to launch before the official start of fall.

On the bright side, CD Projekt Red noted that journalists are starting to play the game this week and should have previews ready to go right after its June 25 “Night City Wire” presentation.


 
It would definitely be better if Cyberpunk was demanding enough to require the power of PS5 or Scarlett.

Pretty sure there will be plenty of rendering features in this game that will only manifest themselves properly on a high-end PC or a next-gen console. It has looked that way from the moment we saw gameplay. Extensive ray tracing, for a start.
 
Wonder if the push is to line up more closely to the console launches this fall. Makes it easier to hold out for the next cards from NVIDIA/AMD too. Special edition Cyberpunk hardware perhaps?
 
Well, I consider Them to be a national treasure of Poland, but if They did that to line up with the release of the new consoles, They just shot Themselves in the knee (Yeah, I used to be an adventurer like You).
It's gonna be Black Friday week. People will have lots of stuff to buy, lots of games among other things. If Microsoft made Them do This, It had better compensate Them for the possibility of lost revenue.
 
This has been the most anticipated game for a few years now, they aren't going to lose sales. Especially considering how frustrated gamers are as a whole at the stereotypical AAA gaming industry right now.

The Last of Us 2 is trying to claim the "game of this generation" title, but I still think that's going to go to Cyberpunk.
 
Pretty sure there will be plenty of rendering features in this game that will only manifest themselves properly on a high-end PC or a next-gen console. It has looked that way from the moment we saw gameplay. Extensive ray tracing, for a start.

Extensive ray tracing isn't happening in modern games anytime soon. You can barely run Minecraft with RTX on at 1080p with a $1,200 graphics card. A modern title like Cyberpunk is completely out of the question. Not even Nvidia's next-gen cards will come close. You might see a single effect like shadows or reflections, but that's about it. Entirely replacing a game's lighting system is also a problem: if you started making a game with regular lighting and then switched to RTX, or decided to do both, you'd have to go back and ensure every level / cell still conveys the lighting the developer originally intended. IMO the current hardware limitations restrict the effects to the point where the benefit is questionable at best, while the cost is 80% of your performance, which is ridiculous.
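To make the percentages being argued here concrete, here's a rough sketch in Python (the baseline frame rates are made up purely for illustration, not benchmark results) of what a given "performance hit" means in actual frames per second:

# Rough illustration: what an "X percent performance hit" means in frame rates.
# The baseline figures below are arbitrary examples, not measured results.

def fps_after_hit(baseline_fps, hit_percent):
    """Frame rate left after losing hit_percent of performance."""
    return baseline_fps * (1 - hit_percent / 100)

for baseline in (60, 100, 144):
    for hit in (50, 65, 80):
        print(f"{baseline} fps with a {hit}% hit -> {fps_after_hit(baseline, hit):.0f} fps")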


It would definitely be better if Cyberpunk was demanding enough to require the power of PS5 or Scarlett.

Require? No. That would be insanely stupid of them, cutting out 87% of the market with such a ridiculous hardware requirement. I think you meant "make use of".
 
I said from the beginning that it wouldn't make it until November. Even after the first announcement, I didn't think it would make sense to release two months before a new console. This delay likely gives them two extra months to polish, if that's what's needed, and I would assume it is.
 
Extensive ray tracing isn't happening in modern games anytime soon. You can barely run Minecraft with RTX on at 1080p with a $1,200 graphics card. A modern title like Cyberpunk is completely out of the question. Not even Nvidia's next-gen cards will come close.

I wouldn't be so sure of that.

It depends on how you personally define extensive. Minecraft was entirely path traced, which is extremely demanding. Cyberpunk will be ray traced AFAIK, and will probably have quite a lot of it.

Mainly because it's going to be a showcase for Nvidia's next gen cards. I think it's a given those cards will have significantly increased ray tracing performance. The RT hardware on Turing now wasn't a huge area of the die, doubling it on every SKU probably wouldn't be beyond Nvidia for the RTX3000 series. They won't have double the raw raster performance, but they could double the dedicated RT performance.

Imagine a scenario where a 2080 Ti takes a 50 percent hit to frames with RT enabled but an RTX 3080 only takes a 25 percent hit, for example.
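Just to put rough numbers on that hypothetical (the RT-off frame rates and the gap between the two cards below are my own assumptions, not benchmarks):

# Hypothetical comparison from the scenario above. Every number is speculative.

def with_rt(baseline_fps, hit_percent):
    """Frame rate remaining once the stated RT penalty is applied."""
    return baseline_fps * (1 - hit_percent / 100)

fps_2080ti = 80    # assumed RT-off average for the 2080 Ti
fps_3080 = 100     # assumed RT-off average for a hypothetical RTX 3080

print("2080 Ti with a 50% RT hit:", with_rt(fps_2080ti, 50))   # 40.0 fps
print("RTX 3080 with a 25% RT hit:", with_rt(fps_3080, 25))    # 75.0 fps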

You're also looking at a second generation RT implementation in software. It's bound to be better and run faster than the initial stuff two years ago like Control and Battlefield 5.

Minecraft just demonstrated how powerful DLSS can be. Metro showed how the resolution of ray tracing is decoupled from the completed frame and capable of also being upscaled in the render pipeline. Any combination of DLSS and types of RT upscaling are going to boost performance.

In short, I think we'll see quite a lot of ray tracing going on in Cyberpunk 2077. It's going to require the heftiest RTX 3000 parts to push it at the highest resolutions. However, I think you'll see midrange RTX 3000 cards deal with the game just fine at more mainstream resolutions.
 

The 2080 Ti takes an 80% performance hit on average with ray tracing enabled. Let's stick with what we know. If Cyberpunk has more RTX features, I highly doubt the performance hit is going to shrink. Nvidia also has zero incentive to backport any software improvements to older cards, so don't expect them to improve RTX 20 series performance once the 3000 series launches. Nvidia has released, and continues to release, software features (like integer scaling) that would work perfectly fine on older cards but are soft-locked to newer ones simply to push sales.

Even if you assume that RTX performance doubles on the 3000 series, you'd still be looking at a massive performance hit for a single RTX effect.

I'm sure Nvidia will make a lot of claims about RTX on the 3000 series, just like it did with the 2000 series, but I'll wait for third-party benchmarks. I personally don't think we will see reasonable performance until the 4000 or 5000 series.
 

80 percent is severely exaggerated. I don't know where you got that figure from, but it is inaccurate. TechSpot themselves found that the hit was significant: first-gen games like Metro Exodus could definitely take up to a 50 percent performance hit for the maximum quality offered. However, even the RTX 2060 Super at 1080p didn't suffer an 80 percent performance loss.

That's what we know.

The article also points out that 50 percent is far too much, which we all agree on. However, if the penalty were much lower, at the quoted 20 percent, there would be more justification for using the technology. I have to agree with that assessment.

As for Turing, what I suppose I am proposing, as you tacitly observed, is that yes, it could become rapidly obsolete for ray tracing if Nvidia doubled the performance for the RTX 3000 series. As a first-gen technology, that's hardly a surprise, though, if a midrange card ends up coping a lot better than even an RTX 2080 Ti when you turn on heavier ray-traced effects.

Nvidia haven't made many claims at all about the RTX 3000 series; all we have is idle speculation. All we can do from here on is wait and hope they deliver, and that AMD delivers something competitive. That's the incentive you talk about to push this forward.

My hopes are already raised by the fact we have confirmed PS5 games using ray tracing quite nicely even if it is at a lower level. That bodes well for the future of the tech on PC in both software and hardware, which should considerably outperform either console in this area.
 

I don't think so: https://www.techspot.com/review/1759-ray-tracing-benchmarks-vol-2/

At 1080p, BFV takes a 65% performance hit, and at 1440p it's 75%.

Unless you are going to sit here and tell me people are spending $1,200 on a graphics card to play at 1080p in non-esports titles, you are presenting a best-case scenario for the card, and one that isn't even realistic. It's a questionable increase in the quality of either shadows or GI in return for a massive decrease in frame rate or resolution, and that's still only a single effect. A gross exaggeration it was not. Realistic, it was. I don't know a single PC gamer asking for low-FPS 1080p.
 

I think your figures are confused. Your link is BF5 in December 2018, very early days for RT on Turing. Even then, it shows that after the first patch performance is basically halved. That's 50 percent. Going from a 128 FPS average with DXR off post-patch to a 68 FPS average with ultra DXR is a 50 percent reduction, give or take. A bit less, actually.

Not an 80 percent reduction.

My link was an entire year later, and it shows that on that very same game the hit is still about 50 percent. It also shows that on Metro Exodus, Control and the other ray-traced games, the hit is 50 percent or less. That is no doubt A LOT. It's not 80 percent though.

So, to my original point: if you can reduce the hit from 50 percent down to, say, 25 percent within one generation, you're already making significant progress. If the game that runs at 100 FPS with DXR off now does 75 FPS with it maxed out instead of 50 FPS, you have strong gains.

If you can imbue a midrange graphics card with the same sort of overall ray tracing performance as a 2080 Ti, you're starting to make inroads.

Taking that to the upper tiers of GPUs, you would be looking at cards capable of 4K resolution with good frame rates and ray tracing enabled. Perhaps not at extreme levels, but at least better than what has been demonstrated by the early ray-traced console titles we have seen.
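For anyone checking the arithmetic, the reductions quoted above work out as follows (the 128/68 figures are the BF5 averages cited earlier in this post; the 100/75/50 figures are the hypothetical example):

# Percentage reduction check for the figures quoted in this post.

def reduction_percent(fps_off, fps_on):
    """Percent of performance lost going from fps_off to fps_on."""
    return (fps_off - fps_on) / fps_off * 100

print(f"128 -> 68 fps: {reduction_percent(128, 68):.1f}% reduction")   # ~46.9%
print(f"100 -> 50 fps: {reduction_percent(100, 50):.1f}% reduction")   # 50.0%
print(f"100 -> 75 fps: {reduction_percent(100, 75):.1f}% reduction")   # 25.0%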
 

Like I pointed out earlier, that's at 1080p with a 2080 Ti, which is a best-case scenario. People do not buy a $1,200 GPU to play at 1080p.
 

No, they don't. Which is why the RTX 2080 Ti could quickly become obsolete, at least for ray tracing.

Only about 10 percent of the die is dedicated ray tracing hardware, which means on a 7nm shrink you could fit that much RT hardware on a midrange card, if not more.
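As a back-of-the-envelope on that die-area point (the 10 percent figure comes from the post above; the die sizes and density-scaling factor below are assumptions plugged in purely for illustration):

# Back-of-the-envelope: how much of a smaller midrange die the same amount of
# RT hardware might occupy after a node shrink. Every input here is an assumption.

big_die_mm2 = 750        # assumed size of a large current-gen die
rt_fraction = 0.10       # the ~10% RT-hardware estimate from the post above
density_gain = 2.0       # assumed transistor density improvement from the shrink
midrange_die_mm2 = 400   # assumed size of a midrange die on the newer node

rt_area_now = big_die_mm2 * rt_fraction       # ~75 mm^2 of RT hardware today
rt_area_shrunk = rt_area_now / density_gain   # ~38 mm^2 for the same hardware after the shrink

print(f"RT hardware today: ~{rt_area_now:.0f} mm^2")
print(f"After the shrink: ~{rt_area_shrunk:.0f} mm^2, roughly "
      f"{rt_area_shrunk / midrange_die_mm2:.0%} of a {midrange_die_mm2} mm^2 midrange die")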
 
Very unlikely.

Not at all. First-generation technologies often become rapidly obsolete if consumer acceptance grows. Remember PhysX cards? Remember PhysX on a dedicated second GPU?

Performance expanded so rapidly that a single GPU could do it. Both techniques were dead and buried within a couple of years.
 
Try 6 years. That's not quickly. The RTX 2080 Ti will be a relevant GPU for RTRT for at least two more years.

More like 6 years from the first dedicated cards, which had no market penetration. So Nvidia bought them up in 2008, integrated the technology into their drivers and suggested using a dedicated second GPU for physics. That was when it went mainstream.

It took more years for them to kill the feature, but the reality is that nobody used it for more than a couple of years, because by the time Fermi launched in 2010 the higher-end GPUs could do it on their own.

The RTX 2080 Ti has a very limited amount of ray tracing hardware because Nvidia knew adding any more was a severe risk if there was no acceptance for it.

Now that we know it is folded into Direct3D, with console support as well, there is no doubt the technology has a major future.

Nvidia can go ahead and include much more ray tracing performance in its next architecture, and it'll be a saleable feature even for mainstream cards.

That could easily leave Turing's ray tracing performance at such a low end for the technology that even the fastest card will suffer.

I'm willing to bet a midrange next-gen card with ray tracing on will put up a fight against first-gen cards, even at the higher end. Wait and see; it won't be too long. 3-4 months, perhaps.
 
That's an interesting perspective. While I disagree with the details you're stating, I respect your point of view. It will be interesting to see how things play out.
 