I don't personally believe it's Nvidia's "fault" to only have 1.5GB per card. I think it was done that way on purpose.
A) It suits pretty much any bezel-corrected resolution across 3 1920x???? monitors, which is far more mainstream in the Eyefinity/Surround community.
B) 3x 30" monitors is a niche among niches. Just a guess here, but I would say there are probably fewer than a thousand users running 3x 30" displays, not including companies and the like.
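To put the gap between those two setups in perspective, here's a quick back-of-the-envelope sketch of the raw pixel counts a card has to push per frame. The numbers are before bezel correction, which adds a bit more, so treat them as lower bounds:

```python
# Rough per-frame pixel counts for common triple-monitor spans.
# Bezel correction adds extra virtual pixels, so these are lower bounds.

def surround_pixels(width, height, monitors=3):
    """Total pixels the GPU must render per frame for a multi-monitor span."""
    return width * height * monitors

setups = {
    "3x 1920x1080": surround_pixels(1920, 1080),
    "3x 1920x1200": surround_pixels(1920, 1200),
    "3x 2560x1600 (30-inch)": surround_pixels(2560, 1600),
}

for name, px in setups.items():
    print(f"{name}: {px:,} pixels ({px / 1e6:.1f} MP)")
```

Three 1920x1200 panels land right around the ~7 megapixels mentioned below, while three 30" panels nearly double that, which is exactly why the framebuffer question only really bites for the 3x 30" crowd.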
Why would Nvidia spend more money when the majority of the niche surround community isn't going to see any benefit? So the few who run 3x 30" monitors have to buy ATi. Wow, Nvidia just lost a thousand customers? Big deal.
If you look at drivers as an example, most games are patched in the drivers either before they come out or directly after. ATi and Nvidia both try to keep the mass of users happy by doing this. But if you look at Eyefinity/Surround support, both in the drivers and in games, there is a chasm. Game devs don't take the time to add support if the engine doesn't already have it, because it's not cost-effective and their user base isn't running it anyway, so why should they care? The same effect shows up in driver support.
There just aren't that many people with the dosh to splash out on three monitors and two video cards, or even one video card and a decent-sized single monitor, though with prices dropping this is changing somewhat.
What really bothers me personally, as an Eyefinity/Surround owner, is that this "feature" is something both ATi and Nvidia tout as 'the next great thing', and yet their own support for it is atrocious. Let's just hope it gets better, not worse.
All that being said, though, I wouldn't change a thing. It was well worth the money in my opinion and adds incredible immersiveness to games, not to mention making desktop tasks much more enjoyable and efficient, even if you only use two of the monitors for the latter.
As far as the article goes, I'm not ridiculing TechSpot for comparing the two cards. They generally do an excellent job in the review department, usually better than most. Quite the contrary: I commend them for even attempting the subject.
Who ELSE has even broached the topic of how a video card handles 7 megapixels at over 30fps? Other than WideScreenForums, none that I know of. Testing Eyefinity/Surround resolutions should be a staple for all review sites when a card is released, alongside the usual bevy of other resolutions. If they don't, they're just doing what the devs, ATi, and Nvidia are doing: ignoring the customers who get the most from their products and catering to everyone else who doesn't. That seems very counterintuitive from a customer standpoint.
I was merely trying to draw attention to this for the Eyefinity/Surround newcomers who haven't run into it yet.