Well i would say xp 64 kinda sucks, vista 64 is better...
Why would you need more than 3.3 gigs of RAM on XP?
That was a suggestion given they wanted 64-bit, wanted nothing to do with Vista, but still wanted to use Windows.
The biggest thing to know about FPS and gaming is that anything over 30 FPS is for videophiles who get a hard-on seeing those numbers. Human eyes cannot see faster than 24 FPS, period, no matter what you do or what you use.
I know, it's just that XP doesn't need that many gigs.
The thing is, I have seen a difference between 24 FPS and 60 FPS in NFS Undercover.
There is a hugely noticeable difference between 30 fps and 100 fps. Almost anyone can tell the difference between 30 fps and 60 fps, and any skilled gamer could tell the difference between 60 and 120 fps.
It doesn't matter if human eyes can only see 24 fps (which I'm not so sure about, though that's what I hear a lot): every single frame taken in by the eye must exactly match a frame output by your monitor, or you will still be able to see a difference.
The sweet spot for any game is one that matches the refresh rate of the screen you're playing it on.
Also, 60 FPS happens to be the limit beyond which the eye cannot distinguish individual frames, which is why it is considered an average "sweet spot" for most games.
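Side note: the fps numbers being thrown around here map directly to per-frame render times, which is what actually makes the refresh-rate match feel smooth. A quick Python sketch of that arithmetic (the printed values are just the math, not a claim about perception):

```python
# Convert a frames-per-second rate into milliseconds per frame.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# A 60 Hz monitor refreshes roughly every 16.7 ms, so a game holding
# exactly 60 fps delivers one fresh frame per refresh -- the "sweet
# spot" where frames and refreshes line up one-to-one.
```

At 24 fps each frame sits on screen for ~41.7 ms, so a 60 Hz monitor has to repeat frames unevenly, which is part of why mismatched rates look juddery.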
hold on there bro
The difference between 24 fps and 60 fps is an easy catch for my eye.
I won't get into that argument here; it's the same as the speaker-wire arguments and would just end up going on for pages. I'll just go with: if you feel like you see it, you must, regardless of what's really happening.
Let's just say it's an opinion and not dwell on it further, even though I'm right.
The fact is, 18 - 30 fps is NOT good enough. 30+ could be acceptable for some; the poster did say 30 - 45 fps, which is reasonable.
More RAM wouldn't even benefit fps much at all.
I agree with this.
If we are speaking in averages, then in no way, shape, or form is 24 acceptable to me.
Let's take my last 3 video cards and one game (Guild Wars) for example:
ati x1300
evga 8800gts 320
evga gtx 280
the x1300 averaged about 20fps in guildwars
the 8800gts averaged about 50fps
the gtx 280 averages over 200fps
With the x1300, that 20fps average was just too choppy for me. The dips from low to high in fps were just too much: it would dip into the single digits during heavy play and boost up to about 30fps during light play. That wasn't good enough.
With the 8800gts, boy was that a nice upgrade over the x1300.
Even with the average of about 50fps, the min fps was still not enough for my tastes, dipping below 20 at some points and maxing out @ 60. But most of the time it ran over 50fps.
Now I have the gtx 280, and it's just flawless @ over 200fps average, never dipping below 100 and sometimes reaching over 400. I know this may be big overkill, but this is how I like my games run: flawless.
The only lag I get is from the server or my connection (ping).
So saying a 24fps average is just as good as a 60fps average to the eye is just not right, because you have to take the min vs. max into consideration.
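That min-vs-average point is easy to show with a toy example: two fps logs can share the exact same average while one of them stutters badly. A minimal Python sketch with made-up numbers:

```python
# Two hypothetical per-second fps logs with identical averages
# but very different minimums (illustrative numbers, not benchmarks).
steady = [58, 60, 62, 60, 60]   # smooth: never far from 60
spiky  = [9, 100, 31, 80, 80]   # same average, but drops to 9 fps

for name, log in (("steady", steady), ("spiky", spiky)):
    avg = sum(log) / len(log)
    print(f"{name}: avg {avg:.0f} fps, min {min(log)} fps, max {max(log)} fps")
```

Both logs average 60 fps, but the second one spends a full second at 9 fps, which is exactly the kind of dip-into-single-digits choppiness described with the x1300 above.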