Having fun beating your head against a wall?
I think that regardless of how many times you explain that the GPU runs out of gas before the vRAM becomes the limiting factor, some people are still going to obsess over a few percentage points being a deal breaker even when the game is borderline playable at best.
Really? Most comparisons I've seen using HD texture packs on both cards tend to back up what Steve is valiantly trying to get across. At 1920x1080 and 2560x1600, vRAM capacity has virtually no impact; separation only starts at 4K and above, and by then the game is running at a framerate virtually no one would willingly subject themselves to. In practice, I suspect people would turn down the image quality to get a better gameplay experience, which again nullifies the framebuffer disparity.
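Just as a rough back-of-envelope sketch of why resolution alone doesn't swing vRAM use very far: the buffer layout and per-pixel byte counts below are made-up illustrative assumptions, not Shadow of Mordor's actual render pipeline.

```python
# Rough estimate of resolution-dependent render-target memory.
# BYTES_PER_PIXEL is a hypothetical figure (colour + depth + a couple of
# G-buffer targets), chosen only to illustrate the scaling with resolution.

RESOLUTIONS = {
    "1920x1080": (1920, 1080),
    "2560x1600": (2560, 1600),
    "3840x2160 (4K)": (3840, 2160),
    "5120x2880 (5K)": (5120, 2880),
}

BYTES_PER_PIXEL = 4 + 4 + (2 * 8)  # 24 bytes/pixel, purely illustrative

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{name:>16}: ~{mib:6.0f} MiB of render targets")
```

Even with generous assumptions the resolution-dependent buffers only grow by a few hundred MiB going from 1080p to 4K, while the multi-gigabyte HD texture data doesn't change with resolution at all, which fits with the 4GB/8GB split only showing up at extreme settings.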
Here's an example of Shadow of Mordor scaling, 4GB vs 8GB, using the aforementioned HD texture pack, put together by Scott Wasson, formerly of The Tech Report and now an employee of AMD's RTG. Please note how the separation between the 4GB and 8GB cards really only increases once the cards are well past 4K resolution and below 30 f.p.s.
And I'll quote Scott here - who mirrors Steve's own observations. Quelle surprise!
[Source]
So well done, Hammayon, you have definitely proven that hardware vendors' marketing is indeed very effective. I can now see why they went to the trouble of offering 4GB versions of the GT 640 and R7 250E.