Upgrade to ge2 mx400

Status
Not open for further replies.

actionn

Upgrade

Well, that's it boys, I'm finally tossing my Voodoo 5 out the window. I just bought an Xtasy GF2 MX400, and I am happy as a lark. The Voodoo cannot run XP well, and I hate 3rd-party drivers. What do you think about my upgrade?
 
Well, you should have got a DirectX 8 card like the GeForce3 Ti 200 for $100 USD or the GeForce4 Ti 4200 for $199. Your upgrade isn't bad; you just could have done better. If you are waiting for a DirectX 9 card*, then the GeForce2 MX will do you fine.

*will come late august/early sept 2002
 
Actionn,

That was my choice for a PCI card too & I'm also happy with it. :grinthumb

I've installed the "Intel Application Accelerator" to improve my drives (Ultra ATA & Ultra DMA), used "CPUCool" to up the FSB speed (& the PCI) for improved processor performance & used "Riva Tuner" to oc the MX400. :cool:

I've installed a case cooling fan to blow across the PCI MX400 & suggest you might do the same. If you need help w/the card or the programs, lemme know. ;) I'm using the 29.20 driver available @ www.guru3d.com along with the other files I mentioned. All are working very well for me.

Enjoy! :D
 
Please do not run back here to complain about Image Quality.;)

You probably have more speed, but other than that I can't say you've gained a lot. One thing you did gain is a nice little bug concerning S3TC.:dead:

It's a good thing you're not getting random crashes or anything; many people forget to uninstall Voodoo drivers before upgrading to a GeForce.
 
CS fps boasting.... ;)
1.4Ghz Athlon @ 1.503Ghz
Asus Geforce 3 (Original)
256mb 2100DDR
1024x768, 4x AA, Level 2 Anisotropic filtering...
99 fps

Remember to set fps_max 100, otherwise the default is 72.
Just type "fps_max 100" in the console...
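If you don't want to retype it every session, the same console command can live in a config file instead (a sketch; the exact file name and folder depend on your Half-Life/CS install):

```
// autoexec.cfg in your cstrike folder -- lines here run at startup
fps_max 100
```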

My old TNT2 in my 1Ghz Athlon Rig with 384mb PC133 got about 35 fps in 1024x768, and my Voodoo 3 2000 got about the same but in 16 bit colour.
 
Actionn,

Yes I use it for games, but I'm into racing sims. NASCAR, Evo 4x4, etc. I couldn't tell ya what fps I'd get for CS, Quake, Payne, RCW, etc. I run 1024x768, 32bit, high detail w/all but 'tire tracks'/'particles' on & avg 35+fps in my games. Not killer, but for a MX card & 500 Celeron/320 PC-133 it is working fine for now. :cool:

I recently dl'd a new demo: "Pro Race Driver" by Codemasters. Cranked the graphics settings to full & opened it up. Beautiful graphics. Very realistic renderings. Detailed cars & scenery. 10fps! Yep, it started at 10 fps. I had to turn things down due to the 'lag' between the joystick & the picture. This is a DX8 game. It isn't 'out' yet. It 'worked' my system to the max. 'New Generation' game. ;)

For PCI, all we have is the MX cards. For todays games: they do a decent job, IMHO. The price can't be beat either! :grinthumb

FWIW: I haven't had a single problem w/my VisionTek GF2MX400 64mb PCI. I read comments about 'problems' with the cards, but I haven't experienced any. I must stress that I do have a cooling fan blowing across the HS-equipped PCI card. :cool:

Enjoy your card,
 
Could you possibly explain why I might want a cooling fan, how much of a difference it would make if I have a heatsink, and where I could get a compatible one.:rolleyes: Sorry about all the questions.:rolleyes:
 
Actionn,

Video cards generate heat. The more intensive their use, the higher the heat. Heat increases resistance. Resistance slows the card down & heat stresses the components. Active cooling (moving air) is 'better' than simple heat sinks (radiative cooling). Since this card doesn't have any RAM HS's & only a GPU HS, & because it will be 'stressed' to render future games, active cooling is a wise choice, IMHO. ;)

Because there aren't any RAM HS's I chose to use a case cooling fan to actively cool the whole card & not just get a GPU fan that wouldn't cool the RAM. It's cheap insurance for the stability & longevity of the card. Since I am also oc'ing the card > it's a necessity to keep it as cool as practical. :grinthumb

Since cases are different, I can't say *which* fan will work for you. The higher the CFM the better tho'. Here's what I have & where I got it:


http://www.compusa.com/products/product_info.asp?product_code=280286

Hope this helps,


Vehementi,

Devil's advocate here: how is the GF4 MX420 'better'? Granted it is DX8.1 compliant, but it runs at the same 166MHz memory clock, has the same 2.7GB/sec memory bandwidth & is presently equipped w/the same SDRAM as the GF2 MX400. Now if those PNY cards *actually* have DDR like the 440/460 ... :cool:

C'mon Ti4200 PCI!
 
Originally posted by JAV
Devil's advocate here: how is the GF4 MX420 'better'? Granted it is DX8.1 compliant, but it runs at the same 166MHz memory clock, has the same 2.7GB/sec memory bandwidth & is presently equipped w/the same SDRAM as the GF2 MX400. Now if those PNY cards *actually* have DDR like the 440/460 ... :cool:

The GF4 MXs aren't DX8.1 compliant. They're just speed-bumped GF2 MX cards with that new type of FSAA (Accuview AA) & a tweaked memory architecture (Lightspeed Memory Architecture II).

Visiontek Xtasy GeForce4 MX440 review
 
If 'compliant' means it has drivers allowing it to run DX8.1 games, then yes, it is compliant, but it doesn't have any DX8.1 hardware features such as vertex shaders.
 
After a quick comparison between GF4 Titanium and MX chips, we noticed the most important feature missing on the MX is what NVIDIA likes to call the “nfiniteFX II” engine. Originally introduced on GeForce3 chips, the pixel and vertex shaders that compose this on-hardware feature were designed for games and other graphics-intensive applications so that developers could specify personalized combinations of graphics operations to create their own custom effects instead of choosing from the same hard-coded palette of effects and ending up with a generic look and feel.

Enjoy.;)
 
Didou,

What does the nfiniteFX II engine have to do with DX8.1 support? Are you saying that ATI, Matrox, etc w/o nfiniteFX II engines CAN'T support DX8.1 either? How 'bout GF3 Ti's?

http://www.nvidia.com/docs/lo/1050/SUPP/gf3ti_overview.pdf

Here's the Ti page from nVidia. It says the EXACT same thing (concerning DX8.1 support) as the MX page. Nothing more & nothing less.

http://www.nvidia.com/docs/lo/1467/SUPP/PO_GF4Ti_2.05.02.pdf

I *think* some assumptions are being made. The GF3 Ti page says: "DX8.1 & lower", the GF4 MX & Ti pages say: "Complete Direct X support, including DirectX 8.1" Both *exactly* the same statement.

*IS* nVidia lying? :confused:
 
Devil's advocate here: how is the GF4 MX420 'better'? Granted it is DX8.1 compliant, but it runs at the same 166MHz memory clock, has the same 2.7GB/sec memory bandwidth & is presently equipped w/the same SDRAM as the GF2 MX400. Now if those PNY cards *actually* have DDR like the 440/460 ...

The GF4MX is no more DX8.1 compliant than any GF2 (MX, GTS, Ultra, you name it); that's what I'm stating. The only advantage you could find for the GF4MX over the GF2MX is wrong; that's what I'm stating.

You said it yourself, the MX420 has the same bandwidth & fillrate. I then proceeded to add that the only extra functions the GF4MX has are Accuview AA & LMA2; that's it.

If you play a DX8.1 game out now (such as Aquanox), the GF4MX won't be able to do anything more than a GF2, whereas a GF3 will be able to use pixel & vertex shaders.

I'm not trying to be a ****** here ( OK maybe just a little bit ;) ) it's just that people might get the wrong idea when reading some of the things posted in this thread.

* Edited by Arris....
 
Didou,

Hmmm.
The GF4MX is no more DX8.1 compliant than any GF2 (MX, GTS, Ultra, you name it); that's what I'm stating. The only advantage you could find for the GF4MX over the GF2MX is wrong; that's what I'm stating.

I'm not trying to be a ****** either (just happens), but that is a mighty strong statement. I wonder if you can prove it. You see, I'm a paralegal living in California & nVidia's corporate offices are kinda local. I enjoy 'Consumers' Rights' actions.

If you have proof to back up your statements, I'll take the appropriate steps towards a 'Deceptive & Misleading Advertising' Action on behalf of consumers. With pleasure. :D

I am serious.

* Edited by moderator
 
Hmm, you have just given me some valuable info there. Not so much on the GF front (I have a GF3 so I don't really care about a GF4MX :D ), but I'm very apprehensive about all the digital laws being passed right now.

Have a look here -> DigitalConsumer.org

You could come in very handy.;)

As for the GF4MX issue, I'm sure nVIDIA's marketing department has sealed all possibilities. Their DX8.1 compliance is just like Intel claiming the Internet will go 3x faster if you own a P4 (didn't they say the same thing with the P3 already?). I'm afraid the GF4MX cards have had more marketing brainstorming than technical. Just make sure to avoid it like the plague.

To get some hard evidence I would have to buy a GF4MX, & I ain't touching that thing with a 50-foot pole.;)
 
Basically, the GF4 MX works with DirectX 8 games.
But it is not fully compliant with the DirectX 8 API, i.e. it cannot do anything with pixel or vertex shading, as Didou points out.
There isn't much of a problem understanding that there are features of DirectX 8 the GF4 MX can't take advantage of; it can still run DirectX 8 games without a hitch.
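This "runs DX8 games vs. has DX8 features" distinction is exactly what a game engine checks at startup: it reads `D3DCAPS8.VertexShaderVersion` / `PixelShaderVersion` from `IDirect3D8::GetDeviceCaps` and compares them against the `D3DVS_VERSION` / `D3DPS_VERSION` macro encoding from `d3d8caps.h`. A rough Python sketch of that encoding (the function names here are my own, only the bit layout is from the SDK headers):

```python
# The D3DVS_VERSION(major, minor) macro packs a vertex shader version
# into a DWORD as 0xFFFE0000 | (major << 8) | minor.
def d3dvs_version(major: int, minor: int) -> int:
    """Mirror of the D3DVS_VERSION(major, minor) macro from d3d8caps.h."""
    return 0xFFFE0000 | (major << 8) | minor

def decode_shader_version(caps_dword: int) -> tuple[int, int]:
    """Pull (major, minor) back out of a caps DWORD."""
    return (caps_dword >> 8) & 0xFF, caps_dword & 0xFF

# A shader-capable card like a GF3/GF4 Ti reports version 1.1;
# a card with no hardware shaders reports 0.0, and the game falls
# back to the fixed-function pipeline (or software emulation).
assert decode_shader_version(d3dvs_version(1, 1)) == (1, 1)
assert decode_shader_version(d3dvs_version(0, 0)) == (0, 0)
```

So a GF4 MX can happily run a DX8 title, but when the game asks for shader support this way, it gets the same answer a GF2 gives.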

Tomshardware on the Pros and Cons of Vertex and Pixel shaders
Tomshardware VGA card charts - Check out the placing of the GF2 MX400 and the GF4 MX440...
 
Arris.

That's what I'm trying to clear up: GF4MX's support DX8.x (tho' naturally not to the fullest extent of the GF4Ti's), but the GF2MX's only support up to DX7. Is that a true statement, IYO?

I'm impressed w/the GF2Ultra running up w/the GF3Ti200's on the Aquanox BM! Too bad there isn't a 420 in the ratings. The 440 sure looks weak. :rolleyes:

C'mon Ti4200 PCI! :angel:


Didou,

I can research & supply the info I find. I can NOT advise; only an attorney can do that. I'm familiar w/Cal State laws & (luckily) have some knowledge of Federal Copyright/Patent laws.

Lemme know what you had in mind.
 
That's what I'm trying to clear up: GF4MX's support DX8.x (tho' naturally not to the fullest extent of the GF4Ti's) but the GF2MX's only support up to DX7. Is that a true statement, IYO?


GeForce 2's & GF4MX's can run DirectX 8 software, but they have to emulate hardware functions which they do not have (i.e. they take a performance hit). In 3DMark this is more apparent, as Direct3D Pure Hardware T&L cards (i.e. fully DirectX 8 hardware) will outperform Direct3D Hardware T&L and Direct3D Software T&L based cards.

IMHO the best-value graphics cards at the minute are the GF3Ti200 at around £100 or the GF4Ti4200 64MB at around £140; both fully support all features of DirectX 8, and are around 1/2 - 1/3 the price of the top-end cards.
 