AMD Ryzen 9 7950X3D Memory Scaling: Does DDR5 Performance Matter?

For anyone playing at 4K with DLSS set to Quality or Performance, the 1440p and 1080p data does matter; it affects even the 4090 owners using upscaling techniques.
While the Zen 4 3D parts are quite expensive, everything else has crazy deals now, like this one at Microcenter:

AMD Ryzen 7 7700X, MSI B650-P Pro WiFi, G.Skill Flare X5 Series 32GB DDR5-6000 Kit, Computer Build Combo: $499.99 ($689.96 list, SAVE $189.97)
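For context on why the lower-resolution data carries over: DLSS renders internally at a fraction of the output resolution, so 4K with Quality is roughly a 1440p render and Performance is roughly a 1080p render. A quick sketch of the arithmetic, using the commonly cited DLSS 2.x per-axis scale factors:

```python
# Internal render resolution for common DLSS modes at a 4K output.
# Scale factors are the commonly cited per-axis values for DLSS 2.x.

DLSS_SCALE = {
    "Quality": 2 / 3,       # ~66.7% per axis -> 2560x1440 at 4K output
    "Balanced": 0.58,       # ~58% per axis
    "Performance": 0.5,     # 50% per axis -> 1920x1080 at 4K output
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Return the resolution the GPU actually renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K + DLSS {mode}: renders at {w}x{h}")
```

So the 1440p and 1080p CPU-scaling charts are a decent proxy for a 4090 running 4K with upscaling.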
 
You have a 4090 and you're using upscaling? Do you hate yourself, or do you have poor vision and don't care if games look slightly blurry in the first place?
 
Actually, it's choice 3: I like a balance between visuals and actually playable frame rates. Questions?
 
Why are 4090 owners complaining about the 7950X3D like they didn't spend $2k for a GPU? Seems ironic. And yes, like I said, all the X3D chips are sold out. Why is it a surprise? Stock will be in and out in the coming weeks but should stabilize when the 7800X3D arrives.

Again, the majority does not own a $1k+ GPU, or even a $500 one, or a $300+ CPU, nor should anyone spend an exorbitant amount to play video games. An R5 1600/2600/3600/5600 paired with a 6650 XT/6750 XT is well capable of pushing high fps. I have to imagine there are less fortunate people out there.
 
Who is complaining? You said the 12-core parts are sold out and I proved they were not. What are you on about?
Also, I did mention a 4090 could benefit from the Zen 4 3D parts if playing at 4K with DLSS set to Quality or lower, due to the 1440p and 1080p samples. What does this have to do with anything you are talking about? Hmm?
Update: plus, the 16-core part is losing to the upcoming 8-core 7800X3D when forced to 1 active CCD, so there is that as well in gaming.
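For anyone curious how the forced-1-active-CCD scenario is usually approximated without a BIOS toggle: pinning the game's process affinity to the V-cache CCD gets most of the way there. A minimal sketch, assuming psutil is installed, that CCD0 carries the 3D V-cache, and that Windows enumerates it as logical CPUs 0-15 (verify the mapping on your own system; the executable name is a placeholder):

```python
import psutil

CCD0_LOGICAL_CPUS = list(range(16))  # assumption: CCD0 = logical CPUs 0..15 (8 cores x 2 SMT)

def pin_to_ccd0(process_name):
    """Set CPU affinity of all matching processes to the V-cache CCD."""
    for proc in psutil.process_iter(["name"]):
        try:
            if proc.info["name"] == process_name:
                proc.cpu_affinity(CCD0_LOGICAL_CPUS)
                print(f"Pinned PID {proc.pid} to CCD0")
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # process exited or needs elevation; skip it

pin_to_ccd0("game.exe")  # hypothetical executable name
```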
 
You and others are here complaining. Not hard to see.

And no, you proved nothing. WHAT ARE YOU ON ABOUT? It's sold out. Check again. 😂
 
Without DLSS on, how many fps do you get / what games are you playing?
Vermintide 2, max settings at 4K, no upscaling: about 180 to 240 fps.
Darktide, DLSS set to Quality at 4K, max settings without RT: about 160 fps; with RT at max settings, about half that (the only problem with RT here is that the game crashes with RT on).
Finished A Plague Tale: Requiem at max settings, DLDSR 2x of 4K with DLSS set to Quality: got 85 fps.
Cyberpunk, max settings with RT set to Psycho, 4K DLSS set to Quality, Reflex set to Boost: around 60 fps, the minimum playable experience.
Red Dead Redemption 2, max settings, RT on with all the bells and whistles like water reflections, at native resolution: 80 to 90 fps.
Control, with my previous 3090 XC3, at max settings with RT on and DLSS set to Quality at 4K: about 80 fps; about to continue the game on the 4090 with the new HDR and texture mod.
Hitman 3, max settings with RT on at native 4K: 80 fps.
Hogwarts, RT on, max settings at native resolution but with frame generation on: around 80 fps with Reflex set to Boost mode.
Some games look pretty good at 4K with DLSS set to Quality and RT on, like Cyberpunk and Control, but some look better at native (RDR2, Hitman 3, Vermintide 2, Darktide, Hogwarts), and even better with DLDSR above native, like A Plague Tale: Requiem.
I specifically held off on Cyberpunk with the 3090 to get playable frame rates, which is now possible with my 4090 with RT on.
This is all subjective, obviously, and that's the take-home message.
 
99% of the market does not have a Microcenter. Funny.
And yet the market that is in reach is rejecting them. My specific location is a heavy PC-enthusiast market in the Flushing area of NYC, where the 4090s and i9s sold out instantly, as did the 7950X3D and the 7900 XTX. Sure, let's believe that people want the 7900X3D. From my observation on launch day, before the store opened, the 7950X3D had 16 units per store and the 7900X3D about 50% more, 24 units per store, and as you can still see, there is plenty of stock of those.
 
I can't walk into a Microcenter and pick up a 7950X3D, bud, because I don't have one near me. What are you on about?
 
So then it doesn't apply to you. The 12-core 7900X3D is available, just in case you want one. I mean, if you are patient enough, they will likely restock these soon everywhere. It's not like this is a paper launch to trigger and create hype for the upcoming 7800X3D for a whole month and a half. Right? Right?
 
How many Hz is your monitor?

I'm having a hard time wrapping my head around having that much power to run max settings, just to upscale to lower image quality for a little extra fps. I'd rather have it on high than ultra if I needed the fps. If you had a 2070, I'd understand.

You do you, but I just don't get it. I hate the fuzziness. I still think DLSS was a marketing tool for the 2000/3000 series not being able to hit higher fps.

Awesome system, though. I'm obviously not throwing shade, quite the opposite.
 
CX OLED, 4K 120 Hz. I agree the image softness in some implementations is not worth the image quality loss, but this is obviously with RT on. In pure rasterization titles I don't use upscaling, because the performance is often very good to ideal. In some RT titles, the quality loss from running without RT at native is greater than running 4K DLSS set to Quality with RT on, like Cyberpunk. Don't get me wrong, nothing beats native 4K with RT on, but the performance is not playable in Cyberpunk even with frame generation. Hogwarts is the opposite: at 4K native with RT on and frame generation, nothing beats the image quality with DLAA on, in terms of both image quality and playable latency, imo.

The only thing that beats native 4K in image quality is downsampling from a higher resolution, but currently this is mostly viable in games without RT, pure rasterization; for me, A Plague Tale: Requiem at DLDSR 2x of 4K with DLSS set to Quality is actually better than native in both performance and image quality, imo.

Do I use a blanket approach to my gaming quality? Obviously not. Could I go through each setting and see which one gives significant performance gains with the least visual loss? Probably yes, but with my busy schedule it would be too cumbersome. I could try lowering settings across the board by one tier and not using upscaling, but from what I recall, upscaling alone at 4K DLSS set to Quality was a better balance for me. Every game is different, which emphasizes the P in personal computers.
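The DLDSR-plus-DLSS combination has a tidy numeric explanation: DLDSR factors apply to total pixel count, so 2.25x (the nearest real factor to the "2x" above) of 4K is a 5760x3240 target, and DLSS Quality on top of that renders internally at roughly native 4K before being downsampled, which is where the extra sharpness comes from. A small sketch of that math:

```python
# DLDSR factor applies to total pixel count, so each axis scales by sqrt(factor).
# Stacking DLSS Quality (~2/3 per axis) on a DLDSR target shows the real render size.
from math import sqrt

def dldsr_target(w, h, pixel_factor):
    s = sqrt(pixel_factor)
    return round(w * s), round(h * s)

w, h = dldsr_target(3840, 2160, 2.25)        # -> 5760x3240
rw, rh = round(w * 2 / 3), round(h * 2 / 3)  # DLSS Quality internal render
print(f"DLDSR 2.25x of 4K: {w}x{h}, DLSS Quality renders at {rw}x{rh}")
# -> renders at 3840x2160: effectively native 4K, then downsampled for extra AA
```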
 
I'm curious what the difference is between 3200, 3600, 4000, and say 4400 DDR4 RAM. The 5800X3D, I think, could really only do 3200, maybe 3600, but Intel 12th and 13th gens should be able to reach 4000 and even 4400. Is it still worth getting the DDR4 boards? And if so, how much performance is lost versus the DDR5 builds?

I'm still on the sidelines, especially with all the Ethernet and other board issues that seem to plague both Intel and AMD motherboards (adding insult to injury considering their high pricing). These recent gens are quite disappointing to me.
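One first-order way to frame the DDR4 speed question while waiting for proper scaling tests: compare theoretical bandwidth and true CAS latency, keeping in mind that above roughly 3600 MT/s, Zen 3 drops out of 1:1 FCLK (and Intel shifts to Gear 2), which usually eats into the gains. A back-of-the-envelope sketch with typical CLs for each speed (the pairings are illustrative, not specific kits):

```python
# First-order DDR4 comparison: theoretical dual-channel bandwidth and CAS latency in ns.
# true latency (ns) = CL / (MT/s / 2) * 1000; bandwidth = MT/s * 8 bytes * channels.
kits = [  # (MT/s, CL) - typical CLs for each speed, not specific SKUs
    (3200, 16),
    (3600, 18),
    (4000, 18),
    (4400, 19),
]

for mts, cl in kits:
    bw_gbs = mts * 8 * 2 / 1000          # dual channel, GB/s
    latency_ns = cl / (mts / 2) * 1000   # first-word CAS latency
    print(f"DDR4-{mts} CL{cl}: {bw_gbs:.1f} GB/s, {latency_ns:.2f} ns")
```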
 
If in doubt, take shadows down a notch; if you need a bit more, post-processing and effects too. Not saying go down to medium, just a notch from ultra to high. I've spent far, far too much time tweaking settings in multiplayer fps games while still wanting them to look good on a budget, especially since I like decent AA and textures. Worth it if you ever need a quick boost without investing much time, even with big-baller hardware.
 
No mention of DDR5-7200 prices for a fair comparison between the Intel and AMD systems, though I know that's not the official purpose of the article. Still, since the comparison is with an Intel CPU, the clearer the better.
 
If you have heavy workloads and like efficiency, they are damn nice. Granted, I'd jump on a 7600X3D (which doesn't exist) without thinking twice. I don't need a space heater; I want an efficient gaming CPU. A heck of a lot more appealing to me than a 13900K.

Are the DDR5-6000 CL30 and CL32 kits widely available, or are they like some of the low-CAS DDR4 modules that were available early on and then skyrocketed in price later?
The i9-13900K is pretty terrible in terms of perf per dollar and per watt; you will see the same gaming performance (or 1% lower) with an i5-13600K, which is waaay more efficient.

However, I personally want 8 performance cores, so the i7-13700K would be my choice if I bought anything from 13th gen.

I would never buy a 13900K or 13900KS for a gaming PC when the i7-13700K exists.
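To put rough numbers on the perf-per-dollar and perf-per-watt point: with gaming performance within a few percent of each other, the ratios are dominated by price and power draw. A toy calculation; the fps and wattage figures below are placeholders to show the math, not benchmark results:

```python
# Toy perf-per-dollar / perf-per-watt comparison. All numbers are placeholders
# chosen to illustrate the ratio math, not measurements.
chips = {
    "i5-13600K": {"fps": 99, "price": 320, "gaming_watts": 110},
    "i9-13900K": {"fps": 100, "price": 590, "gaming_watts": 170},
}

for name, c in chips.items():
    print(f"{name}: {c['fps'] / c['price']:.3f} fps/$, "
          f"{c['fps'] / c['gaming_watts']:.3f} fps/W")
# Near-identical fps means the cheaper, lower-power part wins both ratios.
```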
 
I was wondering, will there be any difference at 4K and in general gaming between:

F5-6000J3038F16GX2-TZ5NR and F5-6000J3040G32GX2-TZ5NR

Are the 32 GB kit's tighter timings better than the 64 GB kit's, or does it not matter for the 7000X3D, and I should just get the capacity I need?
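Decoding those part numbers, assuming G.Skill's usual SKU scheme (J3038 → 30-38-38, J3040 → 30-40-40; F16GX2 and G32GX2 being the 2x16 GB and 2x32 GB kits): both run CL30 at 6000 MT/s, so first-word latency is identical, and the 64 GB kit only gives up a couple of cycles on tRCD/tRP. A quick sketch of the latency math:

```python
# Compare the two kits' first-word latency and row-activate overhead in ns.
# Timings decoded from the SKUs (an assumption about G.Skill's naming scheme).
def cycles_to_ns(cycles, mts):
    return cycles / (mts / 2) * 1000

kits = {
    "F5-6000J3038F16GX2-TZ5NR (2x16GB)": (30, 38),  # (CL, tRCD)
    "F5-6000J3040G32GX2-TZ5NR (2x32GB)": (30, 40),
}

for name, (cl, trcd) in kits.items():
    print(f"{name}: CAS {cycles_to_ns(cl, 6000):.1f} ns, "
          f"tRCD {cycles_to_ns(trcd, 6000):.2f} ns")
# CAS is identical (10 ns); the 64GB kit's tRCD is ~0.67 ns slower per row activate.
```

That difference applies per row activation and is well within noise in games, so capacity is arguably the more meaningful differentiator.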
 
AMD pulled the same pricing strategy: two parts launched $100 apart, with the one everyone wants limited in units compared to the cheaper variant that no one wants?

This has to stop. AMD priced the 7900 XT at $899 and the 7900 XTX at $999, then produced 90% 7900 XTs and 10% 7900 XTXs.

This is not how they will win back market share.

I bet they delayed the 7800X3D because it's going to deliver the same if not better gaming performance than the 7900X3D and 7950X3D. The higher clock speed on the more expensive models is "up to" and only applies to the CCD without 3D cache. I bet the cores with 3D cache are going to run at the same clock speed on all 7000-series 3D chips...

The 7800X3D has 8 cores with 3D cache on a single CCD. The 7900X3D has 6 cores with 3D cache in comparison, and 6 without, plus 32 MB more cache that likely won't matter anyway.

With the 7800X3D you don't have to worry about games choosing the wrong cores, and let's be honest, I'd rather have 8 cores with 3D cache than 6. The 7900X3D is a weird chip, and I don't see many reviews of it; it's mostly 7950X3D reviews.

Just release the 7800X3D already..
 
The 7900X3D was released because otherwise the web would be full of comments like "AMD only released the top model and forces users to buy the best and most expensive one, blah blah." The 7900X3D also sold out quickly; no one wants that, yeah :D

They delayed the 7800X3D because there simply are not enough 3D cache chips available. We now know that while the 3D cache is made on a 7 nm process, it's a different 7 nm than the one used for the cache chips on the 5800X3D. If AMD cannot make enough cache chips for a $699 CPU, how would there be enough for a $449 CPU?
 