Playing Cyberpunk 2077 on first-gen PS4 and Xbox One units is proving troublesome

The Jaguar CPU in these consoles is actually two quad-core modules. Because of this, latency between the two modules is pretty high, so the second module is rarely used for anything more than the OS and some side tasks. The main game thread and all the extra game threads tend to run only on those 4 cores. It was a really poor choice for a console CPU, and I fully blame AMD for pushing that design choice. The PS4 would have held up so much better if it had a better CPU. It really should have used a 3-4 module Bulldozer setup. Jaguar has such high latency between its two quad-core modules that scaling a game engine past 4 cores on these devices is impractical; Bulldozer wouldn't have had this issue. Plus the mid-cycle console refresh would have been a good deal more powerful - so much so that we probably wouldn't have gotten the PS5/XSX this year, which would have been a great thing. IMO this generation of consoles should have had a DLSS type of hardware solution.
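If you want to see that cross-module penalty for yourself rather than take it on faith, here's a minimal sketch (plain Linux/pthreads, not console SDK code) that pins two threads to different cores and bounces an atomic flag between them to time the round trip. The core IDs are assumptions: 0 and 4 would sit in different modules on a Jaguar-style part, while 0 and 1 would give you the same-module figure to compare against.

```cpp
// Cross-core "ping-pong" latency sketch. Build: g++ -O2 -pthread pingpong.cpp
#include <pthread.h>
#include <sched.h>
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

// Pin the calling thread to a single core (Linux-specific, for illustration).
static void pin(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

int main() {
    constexpr int kIters = 1'000'000;
    std::atomic<int> flag{0};

    std::thread pong([&] {
        pin(4);  // assumed to be a core in the second module
        for (int i = 0; i < kIters; ++i) {
            while (flag.load(std::memory_order_acquire) != 1) {}
            flag.store(0, std::memory_order_release);
        }
    });

    pin(0);      // assumed to be a core in the first module
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < kIters; ++i) {
        flag.store(1, std::memory_order_release);
        while (flag.load(std::memory_order_acquire) != 0) {}
    }
    auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(
                  std::chrono::steady_clock::now() - start).count();
    pong.join();

    std::printf("average round trip: %.1f ns\n", double(ns) / kIters);
}
```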

Putting aside the fact that Bulldozer wasn't great either, it never would have been used due to heat constraints and the need to also develop an off-die GPU, increasing both costs and power draw.

Let's also remember that prior to the Witcher 3 release, CDPR openly criticized the CPU horsepower on both consoles. So I'm not shocked in the least that a far more graphically heavy game is running into significant issues.
 
Bulldozer wasn't so bad at low clock speeds. Even though it wouldn't have been a big jump from Jaguar in overall IPC, it would have been a good deal better for later-in-life performance. IMO a 2.6 or 2.8 GHz clock speed would have been more than enough and would have kept the consoles in decent power ranges.

Sadly, both the PS4 and XB1 were pretty low power from day one.
 
I wonder (and would place money on it) whether a lot of the texture pop-in issues only appear on PS4s running HDDs, and whether SSD-equipped ones are fine.
 
Even though I play games on a PS4 Pro myself, I can only laugh. NEVER PREORDER GAMES - I guess some people will never learn.
 
People are tired of delays, they're getting mad at the devs for "crunching", and then this happens. They said weeks before release that they were having trouble optimizing the game for all platforms - it was actually the cause of the last delay.

Gamers are never happy. They yell at the devs to release the game when it's not ready and then get mad when it doesn't run right.

Know what, they deserve what they get this time. It's one thing when EA or some other greedy dev releases a buggy game as a cash cow, but CDPR told us about this. The hypocrisy of the gamers complaining is simply astounding.
People are tired of what? Not having a product to consume? Must be so tough, let me get my violin.
 
The Jaguar CPU in these consoles is actually two quad-core modules. Because of this, latency between the two modules is pretty high, so the second module is rarely used for anything more than the OS and some side tasks. The main game thread and all the extra game threads tend to run only on those 4 cores.
The SoC in the Series X/S & PS5 is also a split-module CPU, so that in itself isn't the cause of the dire cross-module latency problem on the XBO/PS4 - it's everything else in the design. The 4 cores in a module shared a miserly 2 MB of L2 cache, with no higher-level cache at all - at the time, an Intel i7-3770K sported 256 KB of L2 per core and 8 MB of shared L3. All core connections were routed via the L2 cache, so while it's fully inclusive and sports 16-way associativity, the constant hammering of the cache for any core access didn't help; neither did the 128-bit bus to the cache. As a comparison, that i7-3770K has 256-bit buses for its L2 and L3 caches. All of which basically means you want as few cross-module data transfers as possible, which is why game developers ran their engines almost exclusively on the one module.
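To make that last point concrete, this is roughly what "keep the engine on one module" looks like in practice - a minimal sketch using the Linux pthread affinity API purely for illustration (console SDKs expose their own equivalents), with the assumption that cores 0-3 make up the first module.

```cpp
// Pin the main game thread and its workers to cores 0-3 so no game work ever
// crosses the module boundary. Build: g++ -O2 -pthread affinity_sketch.cpp
#include <pthread.h>
#include <sched.h>
#include <cstdio>
#include <thread>
#include <vector>

// Restrict a thread to cores 0-3 (assumed to be the first quad-core module).
static void pin_to_first_module(pthread_t handle) {
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int core = 0; core < 4; ++core)
        CPU_SET(core, &set);
    pthread_setaffinity_np(handle, sizeof(set), &set);
}

int main() {
    pin_to_first_module(pthread_self());  // the main game thread

    std::vector<std::thread> workers;
    for (int i = 0; i < 3; ++i) {         // three job/worker threads
        workers.emplace_back([i] { std::printf("worker %d on module 0\n", i); });
        pin_to_first_module(workers.back().native_handle());
    }
    for (auto& t : workers) t.join();
}
```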

It was a really poor choice for a console CPU, and I fully blame AMD for pushing that design choice. The PS4 would have held up so much better if it had a better CPU. It really should have used a 3-4 module Bulldozer setup.
The choice came from Microsoft and Sony - they wanted a super cheap, ultra-low-power SoC with a decent GPU. AMD didn't offer any Bulldozer SoCs simply because the design was big and power hungry: the twin-module FX-4100 had a TDP of 95 W and a die area of 316 mm2. That's not much smaller than the SoC in the Xbox One/PS4 (360 mm2), and the total power draw of those consoles was in the region of 120 W.

IMO this generation of consoles should have had a DLSS type of hardware solution.
The RDNA 2 CUs fully support tensor operations, so they can be used to do neural network-based temporal upscaling. Not as good as having dedicated hardware for the task, but doing it this way is a more effective use of the available die space.
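For a sense of what that shader-based route boils down to, the workhorse of low-precision inference is a packed 8-bit dot-product-and-accumulate (the DP4a-style operation). Below is a plain C++ reference of that pattern, as a sketch only - on the GPU each call would be a single packed instruction rather than a loop.

```cpp
// Reference of the packed int8 dot-product-and-accumulate pattern used by
// low-precision neural-network kernels: c += dot(a, b), where a and b each
// pack four signed 8-bit values into a 32-bit word.
#include <cstdint>
#include <cstdio>

static int32_t dp4a(uint32_t a, uint32_t b, int32_t c) {
    for (int i = 0; i < 4; ++i) {
        int8_t ai = int8_t(a >> (8 * i));  // extract byte i as signed
        int8_t bi = int8_t(b >> (8 * i));
        c += int32_t(ai) * int32_t(bi);
    }
    return c;
}

int main() {
    // A made-up activation/weight pair, e.g. one tap of a tiny convolution.
    uint32_t activations = 0x01'02'03'04;  // bytes (low to high): 4, 3, 2, 1
    uint32_t weights     = 0xFF'01'FF'01;  // bytes (low to high): 1, -1, 1, -1
    int32_t acc = dp4a(activations, weights, 0);
    std::printf("accumulator = %d\n", acc);  // 4*1 + 3*(-1) + 2*1 + 1*(-1) = 2
}
```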
 