If you are rocking a 16-year-old GPU, you probably aren't interested in gaming, or in modern computing for that matter. You're talking about a DX7 part here that uses an interface I haven't seen in a decade. How would we even go about making a meaningful comparison?
Outside of people building "classic" computers, nobody is. But that's not the point. There's value in charting progress for its own sake. Or, as you put it, to "appreciate the statistical significance of the data presented".
You can't compare directly via fps, but you can do it relatively. Take the lowest couple of cards, like the 480 and 580, or even the 680, and retest them along with older cards like the 280, 9800 GTX, etc. (whatever makes sense in terms of Direct3D version or interface) against a different set of games from 2007-08, like Company of Heroes, BioShock, etc., going down until performance drops off. Then pick the few lowest generations of that set and retest them again with another, still older set of games. Those couple of shared cards serve as a bridge to stitch together one graph of performance along the X axis, comparing all the cards all the way back to the 256, like the chart on page 6 does now. Except that the Y axis couldn't show an unbroken fps scale; it would need a separate fps counter for each set of games.
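To make the stitching idea concrete, here's a minimal sketch of how the bridge cards could link the sets numerically. All card names and fps numbers below are made-up placeholders, not real benchmark results; the point is just the mechanics: each older set is rescaled onto the newest set's scale using the average ratio measured on the cards the two sets share.

```python
# Sketch of the "bridge card" stitching idea. All fps values are
# hypothetical placeholders, not real benchmark data.
from math import prod

# Average fps per card, per game set (newest set first).
sets = [
    {"GTX 480": 90.0, "GTX 580": 110.0, "GTX 280": 45.0, "9800 GTX": 30.0},   # ~2010 games
    {"GTX 280": 60.0, "9800 GTX": 40.0, "8800 GTX": 32.0, "7900 GTX": 15.0},  # 2007-08 games
    {"7900 GTX": 55.0, "6800 Ultra": 35.0, "FX 5950": 18.0},                  # 2004-05 games
]

def stitch(sets):
    """Chain game sets into one relative-performance index via shared bridge cards."""
    index = dict(sets[0])  # the newest set anchors the scale
    for older in sets[1:]:
        bridges = [card for card in older if card in index]
        # Geometric mean of the per-bridge-card ratios between the two scales.
        ratios = [index[card] / older[card] for card in bridges]
        scale = prod(ratios) ** (1 / len(ratios))
        for card, fps in older.items():
            index.setdefault(card, fps * scale)  # rescale new (older) cards only
    return index

relative = stitch(sets)
```

The resulting numbers aren't fps anymore, just relative performance on the newest set's scale, which is exactly why the real chart would still need a separate fps axis per game set.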
That's basically the GeForce 256, 2, 3, 4, 5, 6, 7, 8, 9 and 200 series, plus whatever equivalent releases exist within each series. It shouldn't take more than a few different game sets, unless I'm horrifically wrong about AGP/Direct3D/etc. support in cards and games. I'm sure it's a pain to source all those old cards, motherboards, CPUs and drivers, but if successful, the comparison would be just so epic.