Ah.. Well, maybe it will be bundled.
I quoted a German mag which said that D3 would stay in testing until after summer. id then went on to say that the entire article was false, meaning that D3 might very well come out sooner.
So it's a bit unsure if it'll be bundled with NV40 or not.
Sorry about that.
But I can say that currently all Nvidia's got is a sticker to put on the boxes of their cards saying that id says the cards will perform OK. [size=tiny]I wonder how much that cost Nvidia![/size]
As for the topic,
THE card for D3 will be either the R500 or the NV50, since D3 is built to scale to future hardware. But those are at least 18 months away, so I wouldn't wait for them just for D3.
It should also be noted that a GF3 will be able to run the game (though whether it'll run at 1024x768 like the video shows is another matter).
What is of interest (imo) is the R9700 Pro or higher, since those cards can run the ARB2 (Carmack's name) path, which means everything turned on and acceptable framerates.
The FX5900 and higher will be able to run either the ARB2 or the NV30 path, which might mean a bit higher precision (32-bit compared to 24-bit, though that shouldn't result in any visible difference!), or a bit more speed at lower precision (16-bit vs 24-bit, but once again hardly a visible difference unless you know what to look for).
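Just to put a rough number on that "hardly visible" claim: here's a quick sketch of my own (nothing from the game, and I'm using numpy as an assumption; it has no 24-bit float, so I'm comparing half (16-bit) against single (32-bit) precision, which brackets ATI's 24-bit format).

```python
import numpy as np

# Round-trip an arbitrary colour/intensity value through 16-bit and
# 32-bit floats to see how big the rounding error actually is.
value = 0.7231847  # some colour value in the 0..1 range (made up for illustration)

fp16 = float(np.float16(value))  # stored at half precision (10 mantissa bits)
fp32 = float(np.float32(value))  # stored at single precision (23 mantissa bits)

print("fp16:", fp16, "error:", abs(fp16 - value))
print("fp32:", fp32, "error:", abs(fp32 - value))

# For values near 1.0 the fp16 rounding error is at most about 0.0002,
# while one step of an 8-bit monitor channel is 1/255 (about 0.004) --
# so a single low-precision value is well below what you can see on screen.
```

Shader math does accumulate error over several operations, so this only shows the scale for a single stored value, but it gives an idea of why the precision difference is hard to spot.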
If you're going to base your next purchase on D3, I'd suggest waiting for the next-gen hardware (R420 and NV40) and seeing which of them is best suited. As it stands now, it's impossible to tell.
If, however, you don't want to wait, I suggest getting an ATI card (9800 Pro or XT), as they are the best available on the market today. The FX 5950 Ultra isn't that far behind, but ATI has also been the de facto standard for developers when it comes to DX9, so you can be sure that most games will run without a hitch.
You'll also be sure that whatever settings you choose in the control panel are what you get in the game, rather than hoping that Nvidia actually applies your settings. (Check around the net for info about Nvidia's driver "optimizations" for, among others, 3DMark '03.)
Hope this will be of some help