The CPU/GPU suggestions... from the Dark Side

Since this is my first post: my further posts *may* contain a *bit* of roleplaying - unless a mod politely forbids it - but I either answer with firm facts or the whole post is a joke (hopefully fun for all). I've had a PC since 1987, have been a silent reader of this place for years, and so on...

I would like to state my opinion on how CPU/GPU reviews are generally done on most sites (with exceptions - it has happened here several times): it's SAD. It goes something like this:

  • We're reviewing a low-end, under-200g CPU. 'To eliminate the GPU bottleneck, we use this NVIDIA Titan++ GTX Megalodon(tm) 64Gb HDB4 card', which alone costs at least 5 times the whole configuration.
    • it's very unlikely that any such configuration was ever built outside a testing lab, and it certainly doesn't give a 'clear picture'
    • then we use it to measure 1080p gaming in DOUBLEPLUSULTRA mode, which basically exists for showing off - double-(plus-)blind tests have proven that players can't distinguish that mode from the next one down, you know, 'Very High'
    • 1080p gaming is slowly dying off (still the major share, but shrinking), and the point is that a player buying a new CPU probably won't pair it with doubleplusultra at 1080p; he is more interested in QHD (which I'm using, for example) or 4k (which doesn't work well on any card except this Megalodon thing)
  • We're reviewing a variety of low- to high-end GPUs on - guess what - the most expensive Intel CPU, 'to eliminate the CPU bottleneck, which is provably the biggest at... 1080p doubleplusgood'
    • it's very unlikely...
    • 1080p doubleplusgood at 300 FPS exceeds the monitor refresh rate of a high-end configuration
    • laptops typically fail such tests miserably, while in fact they're OK
    • 1080p gaming is slowly dying off
  • lower-end cards show 20-something FPS, which is mostly due to doubleplusgood settings - and whoever owns such a card isn't eager to max out settings; he cares whether the card supports a stable 60FPS and at what settings level
Take my example: I have a QHD IPS monitor with a 60Hz refresh rate. I'm not much of a high-end gamer and the stuff I play isn't that demanding, but say it gives the general picture. For me, anything capable of delivering a steady 60+FPS will do (I know it's low, but where would I go - perhaps to a bigger screen with 75Hz or something like that) - a GPU/CPU combination delivering 60FPS in the next-best mode is what I'm looking for. I may spend more, but - it's been tested - 'Ultra quality' is mainly a myth. No matter, let it also be in a review. But when I recommend something to friends, I definitely don't take 4k textures with 32x+ AA as relevant.
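A side note on what 'steady 60+FPS' really means: it's a frame-time statement - every frame has to fit a 1000/60 ≈ 16.7 ms budget - and an average FPS number can hide the frames that don't. A minimal sketch, with frame times I invented for illustration:

```python
def steady_fps(frame_times_ms, target_fps=60):
    """A run is 'steady' only if every frame fits the per-frame budget,
    not merely if the average does."""
    budget = 1000.0 / target_fps  # 16.7 ms per frame at 60 FPS
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    misses = sum(1 for t in frame_times_ms if t > budget)
    return avg_fps, misses

# Hypothetical capture: mostly quick 14 ms frames with three 40 ms stutters.
times = [14.0] * 57 + [40.0] * 3
avg, misses = steady_fps(times)
print(round(avg))  # 65 -- a healthy-looking "average FPS"...
print(misses)      # 3  -- ...yet three frames blow the 16.7 ms budget
```

That's why per-frame numbers (the '1% lows' some reviews do publish) say more about 'steady' than the average does.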

Players in general are interested - at least in my opinion, even if they don't know it themselves - in a good combination that does its duty fairly on their configuration - FHD, QHD, 4k or some ultra-wide abomination.

For me, a GPU with a stable 75+FPS in QHD is the goal, and I'm currently circling around real reviews (when they come) of the 5700.

For those cats and dogs who can catch the difference on 120/144Hz monitors, at their resolution of choice, it's the GPU/CPU combination with a stable (say) 150+FPS - probably reachable only at 1080p for now.

For 4k players - 35FPS, 45FPS and likely 55FPS are essentially the same. They WILL have some tearing and whatnot. They need a combination with at least 60+FPS, and the more the better. They are probably the only ones who need BOTH a high-end CPU and GPU.



Quality - aimed mainly at Mr. Huang, who is especially prone to tricks, but some of this goes to AMD too.
Yes, I was 'in the biz' a long time ago and know some stuff.
  • Ray tracing, ray scattering, radiosity and so on - none of those things were invented by NVIDIA. They have existed for 30-50 years. A short list of what CAN ray-trace:
    • New NVIDIA generations that cost... appropriately
    • Old NVIDIA generations
    • Any NVIDIA card since the GeForce
    • Any AMD graphics card since forever
    • S3, Tseng Labs, Trident, Voodoo 1 and all the forgotten graphics cards
    • Any CPU ever produced
They are not equally good, but they are not equally priced either. That Mr. Huang proclaimed ray tracing an NVIDIA exclusive and locked the drivers for certain models doesn't mean those can't do it. At best - and probably what they really did, let's not be malicious - they optimized the new GPUs a bit for the specific kind of math this rendering technique needs. Everything else about how only this-and-that card can ray-trace is a lie.

Google 'render farms' and you'll find a ton about 80386 CPUs and Macs with (I think) 68030s - hundreds of them, rendering Hollywood CGI IN RAY-TRACE. Later came different CPUs, expensive as hell, and now... who knows. But it's nothing but the truth that CPUs started this, and anything capable of calculation can still do it.
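To make the point concrete, here's a minimal sketch of a ray tracer in plain Python - one sphere, one light, ASCII output, no GPU or special hardware anywhere. The scene and all the numbers are mine, purely for illustration; real renderers differ in scale, not in kind.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return distance t along the ray to the nearest sphere hit, or None.
    Assumes `direction` is unit-length, so the quadratic's a == 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width=40, height=20):
    """Trace one primary ray per pixel at a unit sphere; Lambert-shade the hits."""
    eye = (0.0, 0.0, -3.0)          # camera sits in front of the sphere
    light = (0.577, 0.577, -0.577)  # unit direction toward the light
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel to a point on an image plane one unit ahead
            x = (i + 0.5) / width * 2 - 1
            y = 1 - (j + 0.5) / height * 2
            d = (x, y, 1.0)
            norm = math.sqrt(sum(v * v for v in d))
            d = tuple(v / norm for v in d)
            t = ray_sphere(eye, d, (0.0, 0.0, 0.0), 1.0)
            if t is None:
                row += " "
            else:
                hit = tuple(o + t * v for o, v in zip(eye, d))
                n = hit  # sphere centered at origin: normal == hit point
                lam = max(0.0, sum(a * b for a, b in zip(n, light)))
                row += " .:-=+*#%@"[min(9, int(lam * 10))]
        rows.append(row)
    return "\n".join(rows)

print(render())  # a shaded ASCII sphere, computed entirely on the CPU
```

Slow? Sure. But it ray-traces, on anything that can run Python - which was the whole point.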
  • RTX - 'AI improves textures and saves time, increasing quality.' Right. This stinks from a mile away once you know the AI is upscaling frames rendered 4x smaller than the game producer intended. It DECREASES quality but increases FPS - that's the only truth.
  • Maybe you've noticed that the same card supposedly increases and decreases quality at the same time - that result I simply HAVE to see before I decide
  • Doubleplusultra mode vs. Very High - I mentioned some studies. The only fair comparison is quality in some in-game demo, you know - with the picture split in halves, showing both qualities
  • 16x AA is a cake, and also a lie. It's not a winning argument nowadays, but let's say that we (we, at that time, me as a member of a team doing... something) had a VERY serious discussion - different types of pictures, different monitors, different lighting conditions etc. - just to decide whether 4x AA was needed at all. Let's say that it is, and that the need increases with picture size, but 32x, 16x - placebo. Anti-aliasing at 8x is more than you need - and again, would you notice the difference?
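Why the extra AA samples hit a placebo wall: supersampling just averages subpixel coverage, and that average converges quickly. A toy sketch of my own (not anyone's production AA) - one pixel crossed by a diagonal edge, estimated with an N x N grid of subsamples:

```python
def edge_coverage(samples_per_axis):
    """Estimate how much of a unit pixel lies below the diagonal y < x
    by averaging an N x N grid of centered subsamples (supersampling AA)."""
    n = samples_per_axis
    hits = sum(
        1
        for j in range(n)
        for i in range(n)
        if (j + 0.5) / n < (i + 0.5) / n  # is this subsample below the edge?
    )
    return hits / (n * n)

# True coverage is 0.5; each quadrupling of samples halves the remaining error.
for n in (1, 2, 4, 8):  # i.e. 1x, 4x, 16x, 64x samples per pixel
    print(n * n, edge_coverage(n))
```

Going from 4x to 16x moves the estimate by 0.125 of a pixel; from 16x to 64x, by only 0.0625 - and that's for a worst-case edge, before your monitor and your eyes get a vote.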
This was very long - does it make any sense?

Yes. A bit. Overkill on either the CPU or the GPU may actually CREATE a bottleneck and give false results.
Advertised stuff can often be snake oil; it might really do something... which could be noticed by a cat, or by that fish that sees 6 primary colours. But not by humans.
Not all of us have 20/20 vision, and sometimes the action in a video game is so frantic that we really, really can't notice the difference.

I merely suggest that the maxed-out CPU/GPU practice be avoided in certain reviews - not in something like that Megareview, 30-something GPUs/CPUs tested for... I like that stuff! Back to the topic: *sometimes* a review could concentrate more on the gains for the requirements of the major player groups - low-, medium- and high-need users - not on rare breeds pairing an integrated-to-low GPU with a few-thousand-g CPU, or vice versa.

And those 'divided' videos - maybe in 4 parts - could show the same scene and FPS in 'AI improved', ray-traced, ultra and next-best-to-ultra modes.
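One more thing on the 'AI improved' part: whatever the reconstruction network does afterwards, the speedup itself is plain pixel arithmetic - shade fewer pixels, then fill in the rest. A toy sketch under my own assumptions (nearest-neighbour stand-in instead of any neural network, values made up):

```python
def nearest_upscale(frame, factor=2):
    """Nearest-neighbour upscale: every source pixel is copied
    factor * factor times into the output frame."""
    return [
        [frame[y // factor][x // factor]
         for x in range(len(frame[0]) * factor)]
        for y in range(len(frame) * factor)
    ]

# Shading cost is roughly proportional to pixel count: a half-resolution
# frame means a quarter of the shading work, whatever runs afterwards.
low = [[1, 2],
       [3, 4]]  # pretend this is a tiny 2x2 "rendered" frame
high = nearest_upscale(low, factor=2)
print(high)  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
shaded = len(low) * len(low[0])
output = len(high) * len(high[0])
print(output // shaded)  # 4 -- four output pixels per actually-shaded pixel
```

The real thing reconstructs far more cleverly than this, of course - but the FPS gain comes from the smaller render, not from the cleverness.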

Thank you if you read this, especially all of it - it's long, but I think it has some use.

PR of The Master:
By the endless generosity of the All-Seeing Eye, who can spot a difference on any screenshot where it actually exists, His idea of using 6 or more primary colours during rendering, including deep IR and perhaps even microwaves, is FREE for ALL GPU producers to use. He is above patent trolling, as He is above everything.
 