AMD Ryzen 5 7500F: The Most Affordable Zen 4 CPU

That's quite a significant gap in gaming, tbf. AM5 also offers longer platform support, so chances are the Ryzen 5 8600 will also work in the same motherboard. I do hope AMD supports AM5 as long as they did AM4. I just wish the motherboard prices were more affordable.
 
It's nice to see cheap Ryzen parts, but even $180 is kinda high, especially when AM5 platform costs are taken into consideration.

Right now I'm looking at rebuilding my media PC to use that sweet low-profile 4060 that came out, and an Intel board is only $127 open box; I can get them for $140 new.

Meanwhile, on Newegg, AM5 mITX boards are all well north of $200, except for a SINGULAR Gigabyte A620 board, which apparently has board-warping issues.
 
Damn, whereabouts in the world are you? In the UK, B650 motherboards start at around £130-£140 these days, with the cheaper A620s coming in around £80.
 
Crazy how 1080p was introduced in the late 90s and gained popularity in the mid-2000s, and here we are 20 years later and it's still the default gaming resolution.
 
I haven't used that resolution since the late 1990s. That in 2023 we have GPUs that can struggle with 1080p is a sad indictment of the whole industry.
 
No one used 1080p even in the 90s because it was still the era of 4:3. We went from 320x240 > 640x480 > 800x600 > 1024x768 > 1280x1024 > 1600x1200 (CRT). The early LCDs were also still 4:3 (1024x768, 1280x1024, 1600x1200). Then, when non-4:3 widescreen TFTs were introduced, they were 1920x1200 for a couple of years (they took the largest mainstream 1600x1200 resolution and added extra width, the same way 3440x1440 does vs 2560x1440 today).

It took until 2003 for LCDs to start outselling CRTs in general, and until the mid-to-late 2000s for "1200p" to be mainstreamed into 1080p in line with the HDTV push. And 2000-2001 era DirectX 6 GPUs like the Voodoo 5 or Radeon LE were still all analogue VGA only. DVI wasn't even invented until late 1999 (and HDMI until 2002), and both 15-pin analogue VGA and the DirectX 7 API max out at 2048x1536, so the largest mainstream consumer widescreen resolution in practice was 1920x1200 well into the 2000s.
 
I think there's a typo in the second "performance per watt" chart title, as it still states "Total system power usage". It should perhaps also say how the value is calculated.
 
Yeah, that's a boo-boo. The graph should say something like watts per average fps. Take the figures for The Last of Us: the Intel system averaged 141 fps and consumed 446 W, giving the indicated figure of 446/141 = 3.16.
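For anyone who wants to sanity-check the other bars on the chart, here's a minimal sketch of that calculation in Python. The function name is just illustrative; the only figures used are the Last of Us numbers quoted above.

# Sketch of the metric the chart is presumably showing:
# total system power draw divided by average fps.
def watts_per_fps(total_system_watts: float, average_fps: float) -> float:
    # Lower is better: fewer watts consumed per frame rendered each second.
    return total_system_watts / average_fps

# The Last of Us figures quoted above for the Intel system:
print(round(watts_per_fps(446, 141), 2))  # prints 3.16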
 