dividebyzero
"It would be interesting to see 2GB compared to 4GB cards in SLI/CF configurations and tested at 4K."

Kind of sounds like academic interest only IMO. It's trying to make a case to show a difference* in what amounts to a basically unrealistic scenario. Crossfired R9 380s would be extremely marginal for 4K use, and they generally cost more ($360-400) than a single R9 390 ($275-300), which is a better performer and comes without the headache of multi-GPU driver profile issues.
The same could be said for the GTX 960. A couple of 4GB cards are around $400, yet an AIB OC'd GTX 970 at $285-320 comfortably outperforms them given the vagaries of SLI scaling.
I'm actually trying to think of a scenario where someone would voluntarily choose two comparatively expensive, lower-performing cards over a single card that is more powerful, cheaper, and probably less power-hungry. I could understand the dilemma if the user already had a single 2GB/4GB 380/GTX 960 and wanted to double up, but then a comparison is pretty superfluous since they'd just get another of what they already own. The only other scenario doesn't make a lot of sense: owning a 2GB card and then deciding whether to buy another 2GB card, or ditch the one you have to buy two 4GB cards that cost more than a single, more powerful card. Even if you managed to score a couple of 380s/960s for the price of the single 390/GTX 970 OC card, why would anyone compromise after plunking down a sizable amount of cash on a 4K panel?
Just seems all academic really, unless I'm missing a scenario that is "real world".
* There will obviously be differences. A 2GB pair will need to swap textures in/out of system RAM more often, and moving those textures over the PCIe bus causes GPU stalls (framerate dips). But at 4K (or when using FSAA/downsampling) the bigger issue will always be the lack of GPU power (fillrate).