It is not cool to promote the consumption of illegal drugs online.
I bet you're no fun at parties.
I don't really see the need for overclocking anymore. Most chips are run right at the edge of stability ANYWAY. I wouldn't want to overclock the X3D chips AT ALL because of what happened with the last-gen chips. They were boosting too high and the LGA de-laminated from the bottom of the chip.
The extra cache will make the chip faster than any OC on a standard chip anyway.
Frankly, I don't understand why people keep chasing ever-higher FPS numbers. Going BACK to 75-90 FPS was an obvious improvement. I don't know if people remember the dark days when LCDs took over from CRTs and we were stuck at 60 FPS for nearly a decade. 90-120 is nice to have. I can't tell the difference between 120 and 144. My buddy has a 240 Hz OLED, but I think that's more the response time of the OLED than the actual refresh rate.
I'm getting away from the point I was trying to make. Anyway, overclocking these chips isn't going to yield any noticeable performance gains. With the risks associated with what happened with the 7000X3D series, I don't think chasing bragging rights is worth killing a $500 CPU. If I wanted to waste $500 I'd spend it on cocaine and have a lot more fun doing it.
Couldn't agree more! I cap my frames at around 90-100 fps on my 120 Hz monitor, as I can't tell the difference. I also undervolt my GPU just by reducing its power limit to 75%, and all up I keep it very cool. I guess measuring high frame rates is the only effective way to benchmark capability and make comparisons, if I'm correct. Don't agree with the drugs comment though - not appropriate. Keep away from drugs and promote a happy and healthy lifestyle.
It's interesting you bring up reducing power, because I find it more impressive to reach advertised speeds while lowering the TDP than to pump as much power as possible into something to gain a few percentage points of performance. People end up increasing their system TDP by 20% or more and don't really gain anything practical.
If you enable PBO then your chip will run at the edge of stability, and if you tweak the settings you can get more performance while pulling less power.
The benefits of high FPS depend heavily on the games you are playing. Slow-paced games like RTS titles and most single-player games will see little to no benefit from high FPS. Fast-paced games like CoD, BF, Counter-Strike, racing sims, etc. do see a massive benefit though, as the latency between your actions and the results on screen decreases (see the rough frame-time numbers sketched below). The pixel response times of OLEDs also play well with higher FPS, because the screen can update pixels faster even at high refresh rates, which reduces any smearing and ghosting that you may see. The only downside of OLED displays is their tendency to experience screen burn-in.
We don't know if there are any improvements in the stacking process which may help mitigate the issues seen with the 7000-series X3D chips. One of the major downsides of previous-generation X3D chips was the decrease in maximum clock speeds, which made the chips perform worse in computational workloads; hopefully the improvements in the 9000 series will help with that.
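To put the frame-rate and latency points in this thread into rough numbers, here is a minimal sketch in plain Python (no external libraries). It assumes, purely for illustration, that the only delay that changes with frame rate is the length of one frame; real input-to-photon latency has more stages than that.
```python
# Rough frame-time arithmetic for the FPS/latency discussion above.
# Assumption: the only latency component considered is one frame interval;
# real input-to-photon latency involves CPU, GPU, and display stages too.

def frame_time_ms(fps: float) -> float:
    """Time budget for a single frame, in milliseconds."""
    return 1000.0 / fps

if __name__ == "__main__":
    rates = [60, 90, 120, 144, 240, 360, 480]
    prev = None
    print(f"{'fps':>5}  {'frame time (ms)':>15}  {'saved vs previous (ms)':>22}")
    for fps in rates:
        ft = frame_time_ms(fps)
        saved = "" if prev is None else f"{prev - ft:.2f}"
        print(f"{fps:>5}  {ft:>15.2f}  {saved:>22}")
        prev = ft
```
Going from 60 to 90 fps shaves about 5.6 ms off every frame, while going from 360 to 480 fps shaves only about 0.7 ms, which is one way to see why the gains feel smaller the higher you go.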
It's a recommended option by MSI using Afterburner, without doing a full, more involved undervolt. But I actually just do it now through the new Nvidia App, which is really simple. It certainly reduces power usage and runs cooler with no loss of frames.
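For anyone who would rather script the power cap than drag the slider in Afterburner or the Nvidia App, here is a minimal sketch using the nvidia-ml-py bindings (import name pynvml). The 75% figure simply mirrors the comments above, and it is an assumption that GPU index 0 is the card in question; actually applying the limit normally needs admin/root rights, so the set call is left commented out.
```python
# Sketch: propose a power cap of ~75% of the GPU's default limit, similar to
# lowering the power slider in MSI Afterburner or the Nvidia App.
# Requires the nvidia-ml-py package and an NVIDIA driver.
import pynvml

TARGET_FRACTION = 0.75  # 75% power target, as in the comments above

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    target_mw = max(min_mw, int(default_mw * TARGET_FRACTION))  # stay in the allowed range
    print(f"default limit: {default_mw / 1000:.0f} W, "
          f"allowed range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W, "
          f"proposed cap: {target_mw / 1000:.0f} W")
    # pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs admin/root
finally:
    pynvml.nvmlShutdown()
```
Power limiting and undervolting are not quite the same thing (a proper undervolt adjusts the voltage/frequency curve), but as the comments above note, a simple power cap already gets most of the heat and power benefit with very little effort.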
I do feel it is similar to hot-rodding in its own way, so I can see it as a hobby, but I don't consider it necessary.
I can tell a difference in the non-FPS games Spider-Man and Everspace. Below 90 FPS feels sluggish to me, while 90-120 feels slightly sluggish in those games.
I think some people believe it doesn't matter because they haven't experienced truly smooth gameplay. If you've never ridden in a luxury car, an Accord's ride seems great. And compared to a Taurus it is. It's all about the reference point of your own experience.
I originally had my rig uncapped and also had it capped at 120 fps with my 120 Hz monitor. I then dropped it to 100 fps or even 90 and noticed no difference, but a lot less power and heat. I do sim racing only, so maybe it's not as intense and high frame rates aren't as necessary for smooth gameplay.
In gaming? Nah, it will probably still lose to 9800X3D even if it gets 3D cache on both CCDs. Yet most leakers say AMD is going to repeat itself, with cache on only a single CCD.
Why? Because you will be gimping both CCDs in terms of clockspeed then, and you buy 16 cores for productivity, not gaming.
I firmly expect 9800X3D to beat 9950X3D overall in gaming, by 5-10%.
Just like 7800X3D beat 7950X3D by 5-10%.
Gamers don't need more than 8 cores and this won't change anytime soon. By the time AMD can put 16 cores on a single CCD, it might be worth doing. Till then, nah, it's not worth going dual-CCD and eating the latency hit, especially not if AMD only puts 3D cache on a single CCD again with 9900X3D and 9950X3D.
7900X3D was pretty bad because it only had 6 cores with 3D cache. It was closer to 7600X3D than 7800X3D in gaming. 7900X3D did not really make sense for anyone. "Bad" for gaming and "mediocre" for productivity - the chip did not excel at anything.
I would rather have bought a 7950X and enabled PBO; then you would get way better MT perf and gaming perf would still be close.
Expect to see the same thing repeated with 9900X3D. 9950X will make more sense. Splashing out for 9950X3D is better for people who actually need productivity perf and still want close to top-tier gaming perf. 9800X3D will win in gaming tho, while costing less, running cooler and drawing less power.
They're not gimping clockspeeds on 9800X3D - we might see the same with 9950X3D - and if they can get the scheduler working correctly... well, anyways, they will be very interesting chips.
I never said they were gimping 9800X3D clockspeeds, but those dual-CCD chips are going to struggle with inter-CCD latency like all other dual-CCD chips, meaning they will perform worse in gaming than the single-CCD ones overall.
I will disagree with you on one aspect of that. 60 FPS is the minimum tolerable frame rate that gamers target and think about often.
But, yeah, I am happy with anything over 75; 90 is my target and everything after that is just a bonus. There was my buddy's 240 Hz OLED, but I swear that was a pixel response time thing and had very little to do with the FPS. Now that I think about it, he was showing me Cyberpunk with ray tracing on his 3080, so I doubt he was getting more than 70 FPS at 1440p.
Leaks show something like a 13-16% increase over 7800X3D in Far Cry 6, which is heavily CPU bound.
26% up in Blender too. 9800X3D shows a 10% uplift over 9700X there.
In Cinebench, 20% increased ST perf and 30% increased MT perf. 9800X3D will be no joke for applications and will fix 7800X3D's biggest weak point.
Also, being unlocked for OC will increase this even further for tuners.
Last gen, 7800X3D was at about 7700 non-X level in terms of productivity performance. This won't be the case for 9800X3D. Leaks show 9700X application perf or even better.
Some applications like the cache as well.
Placing the cache below the CCDs was clever. 2nd-gen 3D cache looks to fix many of the issues. 9000X3D will have vastly higher clockspeeds, and clock speed was the weakest point for 5000X3D and 7000X3D.
Can't wait for reviews. My 7800X3D might be going in the HTPC once 9800X3D hits.
I am a 360 Hz user and high fps gamer. I need every drop of CPU performance I can get. Especially when I am going 480 Hz OLED next year.
People that don't know how to tune for high-fps gaming don't need a golden gaming CPU like this. 240 Hz is the bare minimum too. This is not for casual gamers.
If you are using a 100-144 Hz monitor, just get a cheaper CPU. You won't need this kind of CPU power for gaming at those fps. Even 7800X3D is overkill here.
The pixel response time of OLEDs makes them the best gaming monitors.
As such, OLEDs do not need 266Hz+ for motion clarity like LCD/LED panels do, so that is a massive bonus. But pixel response/clarity aside... a true gamer still wants at least a 144 Hz or higher panel if they play ANY multiplayer shooter.
So, I disagree with you, and feel 60 Hz is garbage for gaming. For the last decade even TVs have been 120 Hz, so why would ANY gamer be touting an old 60 Hz monitor to game on? That is absurd.
In Cyberpunk, it does not matter if you are getting 45 frames or 120 frames... because frames do not matter in that game. It's for enjoyment only and there is no element of precise timing, FPS-style movement, etc. So I can understand how you can feel that frames don't matter...
Maybe for multiplayer, competitive shooter games, but for sim racing 60 fps is fine at Ultra settings. I'm capping mine at about 90 on a 120 Hz 4K TV and it's fantastic, and there's no way I could go back to a 1440p monitor.
Well, I did say 60 fps is the minimum tolerable frame rate, not that 60 Hz is a monitor people should get. Even for productivity purposes, the feel of a 120/144 Hz display is something I wouldn't want to be without. That said, I will tolerate 60 FPS, but I won't tolerate 60 Hz in anything but a work laptop for typing and spreadsheets.
60 fps = 60 Hz on a VRR monitor, even if it tops out at 480 Hz.
I was thinking about my friend's 240 Hz monitor, and I think VRR makes the "feel" of a monitor a grey area. A 240 Hz VRR monitor can more accurately display a frame at the time it is rendered than a 120 Hz display can. The precision with which a 240 Hz VRR monitor can display a frame is twice that of a 120 Hz monitor. I'm curious if that's what I was feeling when using my friend's OLED.
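If it helps to see that "twice the precision" point as arithmetic, here is a toy calculation. It assumes the limiting factor on a VRR panel is simply the minimum interval between refresh cycles (one over the maximum refresh rate) and ignores scanout time, LFC, and frame pacing, so treat it as a rough bound rather than a measurement.
```python
# Toy numbers for the VRR timing point above: how long a finished frame can be
# held up waiting for the panel, assuming the panel can start a new refresh at
# most once per 1/max_refresh seconds (scanout time and LFC ignored).

def worst_case_display_delay_ms(max_refresh_hz: float) -> float:
    return 1000.0 / max_refresh_hz

for hz in (120, 240, 360, 480):
    print(f"{hz:>3} Hz panel: frame shown within ~{worst_case_display_delay_ms(hz):.2f} ms of finishing")
```
That gives roughly 8.3 ms at 120 Hz versus 4.2 ms at 240 Hz, which lines up with the "twice the precision" intuition in the comment above.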
I think anything over 80 to 90 fps must be a placebo effect, as it can't be seen by the human eye apparently. I much prefer to turn up the quality settings than chase pointless frames. Really, it's just a way for companies to measure and promote gaming performance. But I'm on 4K and wouldn't go back to anything less.
I am absolutely not playing any games at less than 100 fps myself, and I prefer 200+ fps.
60 fps might do for console/TV gaming. Not for PC gaming.
Again with "the human eye can't see past x frames per second" trolling! I thought that was rendered obsolete about the same time when 120 Hz monitors became prevalent.I think anything over 80 to 90 fps must be a placebo effect, as it can't be seen by the human eye apparently. I much prefer to turn up the quality settings then chase pointless frames. Really, it's just a way for companies to measure and promote gaming performance. But, I'm on 4K and wouldn't go back to anything less.
Not sure what you mean by "trolling"; I'm just interested in the technology behind it. You can look these studies up, but make sure it is peer-reviewed, healthcare-related research, not silly reviews by YouTubers. Looking at some of the studies, the consensus seems to be that your eye can see more than 60 fps, but anything over 80 is just diminishing returns as far as what you are actually seeing.
You must be blind. I can easily see the difference.
As long as you believe that.
Stop that human eye misconception haha. Pilots are tested in 200 fps scenarios.
120 vs 240 fps is very noticeable.
360-480 fps on a 360-480 Hz monitor is next-level smoothness.
Or rather "as far as what I am actually seeing". Because you have not seen what I have seen.Not sure what you mean by "Trolling", just interested in the technology behind it, but you can look these studies up, but make sure it is Peer Reviewed research and healthcare related, not silly reviews by Youtubers. Looking at some of the studies, the consensus seems to be that your eye can see more than 60fps, but anything over 80 is just diminishing returns as far as what you are actually seeing.
Well, "I have actually seen" what you are seeing. I have a powerful GPU and have had uncapped my frames etc etc and achieved over 200fps, but couldn't see jack **** of a difference from 90fps! Have you actually seen any research on this? I'm going to take a guess, that you are still running on 1080p or 1440p, you have your frames uncapped with no - Gsync, so what you are possibly seeing is micro stutter and keep chasing higher frames thinking you can overcome this to find the Holly Grail! Going to a higher resolution (4K) is also regarded as a way of improving overall visual effect and this issue. If at all, higher frames rates (and it would have to be apparently 20,000 FPS +) did present much smoother gameplay, it's no longer what the eye sees in real life, as we do see a level of natural motion blur and stutter if you like, through peripheral vision, so what's wrong with this? As I said, most research suggests 90fps is about the limit of what is recognised by the human eye. Of course organisations like Nvidia, CPU manufacturers etc want us to believe higher frames rates matter, to sell higher end products, as they having no other value proposition! On top of this manufacturers keep producing outdated lower than 4K and 8K monitors/TVs, as gamers are chasing FPS and they won't achieve this at these resolutions as of yet.Or rather "as far as what I am actually seeing". Because you have not seen what I have seen.
There is a notable difference above 90 fps - but it depends on the games you're playing. Any top FPS player would notice it in games that move really fast. I don't think we need anything above the 165 Hz mark - but then again, I'm no longer 18 with mad reaction and perception.
You don't know what you're talking about.
Oh, yes I do. You obviously don't.
It was a joke and a reference to a line spoken by a famous character from a certain fantasy series which has some rings in it.
Just so you know: