I compared the benchmark results for Assassin's Creed Mirage, focusing on the Intel Core Ultra 285K numbers. Using the same highest settings at FHD resolution, my setup with an RTX 4080 recorded 199 FPS.
Had I used the same RTX 4090, my system would likely have come close to or matched TechSpot's Ryzen 9800X3D result. For further verification, I downclocked the DDR5-6400 memory, yet the score still held at 190 FPS.
It is also worth noting that my system runs Windows 11 version 24H2 (build 26100.2605). In theory, 24H2 should perform worse, but in my environment it runs stably and without issues.
This raises the core question: is it really possible to get such a low score from a 285K + RTX 4090 configuration? Were the benchmarks intentionally tuned to make the 9800X3D look better?
It seems unlikely that an RTX 4080 would outperform the more powerful RTX 4090 under identical conditions, as that would undermine the fundamental premise of GPU scaling. With the RTX 5000 series on the horizon, this would imply a rather strange scenario.