Black Myth: Wukong GPU Benchmark - 43 GPUs Tested!

For sure 5-10% worse quality, but hey, it's OK. If we compare WRC 7 at max quality vs NFS Unbound with frame generation, the latter wins in my opinion.
Why have you got such a semi for frame generation, when you even said above your card at native res and refresh rate runs the game only 68% utilised?

You shouldn't even be using frame generation IMO. Literally pointless.
 
So, a 4080 or 7900 XTX is the minimum for this game...
if I consider playing at native 4K...
I guess I'll try another game first...
 
Why have you got such a semi for frame generation, when you even said above your card at native res and refresh rate runs the game only 68% utilised?

You shouldn't even be using frame generation IMO. Literally pointless.
1. For fun's sake.
2. It draws far less power.
3. The GPU runs cooler; it's a 33°C summer outdoors right now, and while even 58°C is fine, the cooler the better.
4. It may reduce latency.
 
1.4 million concurrent players on Steam right now. This game is destroying DEI consultancy :)
Good finding, but I'm no sheep. I'll upvote your post.

Edit: I'm not going to debate it any more. Everyone can test for themselves. Also, I've made too many posts already.
 
Full Cinematic Very High RT with RTX 4080 in performance DLSS mode returns 48fps average with no FG. It's pretty easy to tweak from here to get a 60fps average (which is fine for this type of game). The game scales really well. It's also the first implementation of FG that is not horrible. As Steve mentions in his optimization video, enabling FG is better than using motion blur. The RT here doesn't cause the framerate to go out of control with very low drops, it's pretty stable and 1% lows stay fairly tight with average fps.

Playing this on an RTX 4080 paired with a 7800X3D and 32 GB of DDR5-6000 CL30.
 
I actually don’t mind that games have “ultra” settings that require “ultra” hardware.

Too many games are artificially capped down to the current midrange and offer little to be excited for when turning the settings up.

As others have said, as long as the game is reasonably well optimized, I don’t mind this scenario one bit.
 
4. It may reduce latency.
Wat? The opposite is true. Frame generation adds more latency by default, since it needs to buffer frames to generate the in-betweens before displaying anything to you, and the slower the real rendered frames, the worse it gets. At a displayed 60 fps with FG it should be very noticeable unless you're extremely tolerant of sluggish performance.

FG is great tech, but it's best used to improve visual motion quality on high refresh rate monitors, and when your base rendered frame rate is already reasonably high. Your use case is about the worst possible application for it.
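A rough back-of-the-envelope sketch of that buffering cost, in Python; the frame times and the ~2 ms generation overhead are illustrative assumptions, not measurements from this game:

def frame_time_ms(fps):
    """Time per real rendered frame, in milliseconds."""
    return 1000.0 / fps

def latency_native(real_fps):
    """Rough best-case input-to-photon latency: about one real frame time."""
    return frame_time_ms(real_fps)

def latency_with_fg(real_fps, fg_overhead_ms=2.0):
    """Interpolation has to hold the newest real frame until the *next* real
    frame arrives before it can build and show the in-between frame, so
    latency grows by roughly one extra real frame time plus some overhead."""
    return 2.0 * frame_time_ms(real_fps) + fg_overhead_ms

for real_fps in (30, 60, 120):
    print(f"{real_fps:>3} real fps: ~{latency_native(real_fps):.1f} ms native, "
          f"~{latency_with_fg(real_fps):.1f} ms with FG "
          f"(displayed ~{2 * real_fps} fps)")

At 30 real fps (displayed ~60 with FG) that's roughly 33 ms vs 69 ms, which is why a low base frame rate is the worst place to lean on it.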
 
I read a lot of GPU reviews before buying and a lot of them said AMD cards are less expensive but very good.
What happened?

This benchmark placed an emphasis on testing two-thirds of it with various ray tracing settings enabled. That is still a largely Nvidia-optimized and Nvidia-dominated niche setting, similar to the sporadic PhysX effects you used to see back in the day. Is it a more impactful setting? Sure, it makes a bigger difference to visual quality. Is it worth the obscene performance hit that comes with running it? To the vast majority of users, absolutely not. It's something to tout with your new GPU purchase so you can brag to your friends. In reality, anyone buying $1000+ GPUs is using a high refresh rate monitor, and there is zero chance in hell they'd rather play a game at 52-64 fps for vaguely noticeable lighting and shadow effects than run at 100+ fps with scaling cranked up for better eye candy on what people are actually looking at the vast majority of the time, i.e. overall texture quality.

This game is also mostly optimized for Nvidia cards, so it's obviously going to run better on that hardware, just as any AAA Sony title is going to run better on AMD hardware, since that's what the current consoles (and the vast majority of gaming consumers) are running on.
 
Not the people who were around when Quake and Crysis came out, when flagship hardware struggled to achieve playable rates at high settings.

Assuming Wukong isn't an unoptimised mess doomed to a life of patching, maybe it's intended to be a hero game for hardware to 'grow into' or a target for other devs to aim for.

(Also remember that's only a $2000 GPU because nVidia decided it would be and the market rolled over for it. Better performance/price ratios are available.)
1000% agree on the last paragraph. Nvidia peaked with the 10-series and it's been downhill ever since. Unfortunately, with them making such a large share of their income from GPUs dominating the generative AI field, they really don't have to bother caring about the core consumer base that made them who they are for decades.

Even back in the Crysis and Quake days, GPUs weren't priced as obnoxiously as they are today. Take the GTX 580, which launched on 11/9/2010 at an MSRP of $500. That WAS THE TOP-OF-THE-LINE Nvidia card available on the market. The previous-gen Nvidia king, the GTX 480, launched on 3/26/2010 at an MSRP of... $500. Factoring in inflation, $500 in 2010 works out to about $719 in 2024 via the US government's inflation calculator.

The 1080? Launched at an MSRP of $600. The 2080? $700. The 3080? $700. The 4080? $1200.
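For anyone who wants to reproduce that inflation math, here's a minimal Python sketch; the CPI figures are approximate annual averages assumed for illustration, so the official BLS calculator will land slightly differently:

# Approximate annual-average CPI-U values (assumed for illustration).
CPI = {2010: 218.1, 2016: 240.0, 2018: 251.1, 2020: 258.8, 2022: 292.7, 2024: 313.7}

def in_2024_dollars(msrp, launch_year):
    """Scale a launch MSRP into 2024 dollars using the CPI ratio."""
    return msrp * CPI[2024] / CPI[launch_year]

launches = [
    ("GTX 480", 2010, 500),
    ("GTX 580", 2010, 500),
    ("GTX 1080", 2016, 600),
    ("RTX 2080", 2018, 700),
    ("RTX 3080", 2020, 700),
    ("RTX 4080", 2022, 1200),
]

for name, year, msrp in launches:
    print(f"{name:>8}: ${msrp} in {year} is roughly ${in_2024_dollars(msrp, year):.0f} in 2024 dollars")

The $500 2010 flagships come out to roughly $719 in 2024 dollars, which is where the figure above comes from.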

This chart shows a good data set on the lack of overall performance increases gen over gen relative to the price hikes per gen. I'd like to pause and state that yes, there are many variables in play here (manufacturing costs, material costs, etc.), but even taking these into account, they make a very poor argument for the price increases, even with inflation.


TL;DR: Nvidia has data center and AI money now, it doesn't need any of us anymore, and it has priced the long-running best-performance-per-dollar segment into the grave. Now it's explicitly a case of spending ever more dollars for ever more incremental performance gains. Don't like Nvidia's pricing structure? They don't care. Cry about it.
 
So, a 4080 or 7900 XTX is the minimum for this game...
if I consider playing at native 4K...
I guess I'll try another game first...

No they aren't; with intelligent PC usage and settings, even a 2060 can get good results. This is a benchmark showing unrealistic settings with hardly any scaling; it's useful for higher-end users but has no bearing on the lower end. Settings are there to be changed and tweaked, otherwise buy a console.
 
Hehehe, why spend a lot on a GPU to get more than 90 fps when human visual acuity ranges from 60 to 90 fps? The thing is, not even the RTX 4090 can manage 90 fps natively in BMW.
 
A game to play in 10 years, maybe. For now it's all about benchmarking and making people believe they have to replace their hardware to play it.
 
It's impressive to see how well the RTX 4090 performs across different settings, but it's clear that high-end GPUs are almost a necessity for an optimal experience with such demanding games. For those of us with less powerful setups, it’s helpful to know the benchmarks so we can adjust our expectations and settings accordingly. Thanks for the detailed analysis and data!
 
Full Cinematic Very High RT with RTX 4080 in performance DLSS mode returns 48fps average with no FG. It's pretty easy to tweak from here to get a 60fps average (which is fine for this type of game).

I feel it's exactly this type of game (action combat) where you want more than 60 fps. 60 fps is fine for like, Crusader Kings 3 or something. But this kind of game will be more fun with 120fps.
 
A GPU benchmark with upscaling and fake frame generation? Sorry... it sounds like a joke.

Show the public the harsh reality that none of the GPUs reach playable framerates with RT @ max. quality. 30fps master race:
How is that reality "harsh"? Your own benchmark shows 27 to 37 fps, which for a non-competitive single-player game is eminently playable -- I'm not a gamer, but I defy anyone to tell the difference between 27 and 30 fps without a counter running.
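As a quick sanity check on that, the per-frame time gap is only a few milliseconds (trivial Python arithmetic):

for fps in (27, 30, 37):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
# 27 fps is ~37.0 ms per frame vs ~33.3 ms at 30 fps: a gap of about 3.7 ms
# (and even 37 fps is only ~27.0 ms per frame).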

Further, "max settings" is an entirely arbitrary metric: the studio could have easily omitted a couple bells and whistles from that setting. Performance would have been better, but the game itself would not be. And the 5090 is released in four months, and your objection vanishes entirely. Would you prefer if they didn't future proof the title?

The real clue to the non-logic here is the "fake frames" catchphrase. You realize that all visual frames -- in this game and all others -- are equally fake: each and every pixel is generated by somewhat arbitrary algorithms.
 
I feel it's exactly this type of game (action combat) where you want more than 60 fps. 60 fps is fine for like, Crusader Kings 3 or something. But this kind of game will be more fun with 120fps.
IMO, in this type of game the need for extremely low latency just doesn't seem to be there. It definitely doesn't require the kind of accuracy you need in a shooter. I agree that 120 fps just looks better, but it's a balance between that and all the other visuals. I typically hate FG; in Starfield, for example, it's just awful and I would never turn it on. However, in this game it seems to work really well, so after I got the average frame rate above 60, I turned on FG for 90-100 fps, which improves the motion without hurting the gameplay. It doesn't feel sluggish and you don't feel any additional latency like you would in a shooter.
 
So, I came in here looking for some benchmarks, but I must say the number of settings you guys are showing is a bit overwhelming. Ultimately, it seems we need those 5080s/5070s to get 60+ fps at the highest quality settings.
 
TL;DR: Nvidia has data center and AI money now, it doesn't need any of us anymore, and it has priced the long-running best-performance-per-dollar segment into the grave. Now it's explicitly a case of spending ever more dollars for ever more incremental performance gains. Don't like Nvidia's pricing structure? They don't care. Cry about it.

Accurate assessment. But it's not that Nvidia doesn't care, it's that they're making money because people are willing to pay these prices. Also, it's astonishing that there's really no competition for them. AMD? Intel? Nope. And they're shifting their focus to AI too, so they don't care about the gaming market either.
 
I got 3-4 times higher results with the same settings:
Cinematic quality, 75% upscaling + frame gen, RT Very High.
Fake test? Or did I do something wrong?
 