Does RAM Latency Matter?


Julio Franco

Staff member
System memory is often the forgotten cousin among components when you're building a new PC. A lot of PC builders just buy whatever's out there, as long as it works in their systems. Some enthusiasts take the opposite route. They get expensive ultra-high-frequency or low-latency memory, hoping it will give them a big performance boost.

You can pay from 30 to 100 percent more for these low-latency offerings, but are they worth the extra money? ExtremeTech examines the effects of low-latency memory on two high-end systems to determine its value to PC builders.
 
Interesting article. I always thought it made a big difference, although I never tried it myself. Too expensive ;)
 
heck yea its important!

But my Kingston Value RAM is rated at 3-3-3-6, and it runs perfectly at 2-2-2-5.

Another friend got some Kingston Value RAM, also rated at 3-3-3-6, but it would only run at 2.5-3-2-5.

My point: just get some 2.5-3-3-6 RAM; most of it can POST at the more expensive 2-2-2-5 timings anyway.
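
For anyone curious what those timing numbers actually work out to in time, here's a rough sketch (it assumes DDR400 with a 200MHz memory clock, which is just an example; plug in your own module's clock):

[code]
# Rough sketch: convert CAS latency from clock cycles into nanoseconds.
# Assumes DDR400 with a 200MHz memory clock -- example values only.

def cas_latency_ns(cas_cycles: float, memory_clock_mhz: float) -> float:
    """Time the memory makes you wait before data comes back, in nanoseconds."""
    return cas_cycles / memory_clock_mhz * 1000  # cycles / (cycles per microsecond) -> ns

for cas in (2.0, 2.5, 3.0):
    print(f"CAS {cas} @ 200MHz = {cas_latency_ns(cas, 200):.1f} ns")
# CAS 2.0 -> 10.0 ns, CAS 2.5 -> 12.5 ns, CAS 3.0 -> 15.0 ns
[/code]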
 
Originally posted by ---agissi---
heck yea its important!

Did you read the article, Agissi? It basically says it's not really worth it unless you have money to throw around... you're better off buying a new graphics card or sound card with the money than eking out a 1-2% performance increase...
 
Ok, that is true... but it's already in the 90 FPS range, so that's just overkill... You can get more FPS from upgrading your video card than from tweaking memory CAS and risking system stability.

That's my opinion, let it go ;)
 
Good discussion, I reckon. Nothing critical.

I wish I could buy a video card for the price difference between CAS 2.5 and CAS 2 :)

For heavy-duty games that would normally pull 15fps, tweaking your CAS could pull you up to 25fps; that's the difference between really choppy and smooth. (To the human eye, anything over 24fps looks the same; 24fps is 100% smooth, below that it's choppy/jerky/"laggy" looking)
 
agissi, just because you can't perceive something doesn't make it useless. I can't perceive the difference between 25fps and 100fps, but I'd rather have the latter when I'm in a massive firefight. With 100fps I have some wiggle room in that my system can handle more graphical goodness (in case I'm ambushed, for example). At 25fps I'll be lucky to bleed without causing a perceivable slowdown.
 
Anything below 60fps _can_ appear jerky/slow to your eye; this is due to the monitors we use and the fact that there is also a delay in the input devices we use (mice).

There are good articles about this if you look on Google... Sorry, just got a bit carried away ;-)

And Agissi: the increase was just 5%, so in your example a game running at 15fps would only go up to about 15.75fps...
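
Spelling that arithmetic out as a quick sketch (the 5% gain and the framerates are just example figures):

[code]
# Apply a percentage performance gain to a baseline framerate.
def scaled_fps(baseline_fps: float, gain_percent: float) -> float:
    return baseline_fps * (1 + gain_percent / 100)

print(scaled_fps(15, 5))  # 15.75 -> still choppy
print(scaled_fps(90, 5))  # 94.5  -> barely noticeable
[/code]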
 
To the human eye, anything over 24fps looks the same; 24fps is 100% smooth, below that it's choppy/jerky/"laggy" looking

ROFL!! this 'urban internet legend' STILL persists lol!!

Do some reading up on the subject before you start spouting mindlessly, and make sure you READ up on the differences between how a television streams its data to you compared to how a PC graphics system streams it to you.

30fps is jerky as hell; hell, I can tell subtle differences right up to about 75Hz.

If a monitor is refreshing at 75Hz I can 'see' it, and it gives me a killer headache and eyestrain. I can't use a monitor unless it's at a minimum of 85Hz.

I've been playing the Unreal series of games competitively for 5 years, so I've become very sensitive to fps and jerky movement.

- Watch your language on these forums
Didou
 
I run Halo at 1280x1024 with Vsync turned on and it runs as smoothly as a baby's bottom. (not that I would know or anything)
So how does Vsync work? I've tried Halo at 30fps and it seems jerkier than with Vsync on, so what is the framerate with Vsync?

Thanks for the help in advance. :grinthumb
 
The "human eye can't see past 24 fps" thing is indeed a myth. It all comes from the fact that in the early days of film they decided that 24 fps was ENOUGH to trick the mind into seeing fluid motion. They never said more wouldn't look any better. But all of that is null and void when you're talking about anything other than motion photography because one of the factors involved is the motion blur.

Each frame of motion photography has a natural motion blur, so in essence each frame really represents more than just a single image in one exact place. It's showing the smeared image of everywhere the subject went in the time the shutter was open.

There is no natural motion blur in video games, so the higher the fps the better. I can see a huge difference between 60fps and 100fps in a game (provided my monitor is set to a 100Hz refresh rate).
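
To put a number on that blur (a rough sketch; the conventional 180-degree film shutter is an assumption here, and the figures are only illustrative):

[code]
# How much real-world motion a single film frame "smears" together,
# versus a rendered game frame, which captures one instant with no blur.
# Assumes a conventional 180-degree film shutter -- illustrative only.

def exposure_ms(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Time the film shutter is open per frame, in milliseconds."""
    return (shutter_angle_deg / 360.0) / fps * 1000

print(f"Film at 24fps: each frame blurs ~{exposure_ms(24):.1f} ms of motion")
print("Game at 24fps: each frame blurs 0 ms of motion (a single instant)")
[/code]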
 
Yes, the 30fps of a movie on TV is really shown as 60 (interlaced) fields per second.

So 60fps for a game should be enough, but it isn't, due to the latency introduced by your input devices.

I myself need 100Hz at 1152x864, otherwise I get eye strain. The same is true for my games; I use refresh rate fixes to force my games to 100Hz, since Windows 2000/XP defaults to 60Hz otherwise.

And I see a big difference between 60 and 100fps, though it also depends on the game you play. In Far Cry I've noticed I can drop further in fps without noticing it; that's because of the slower action compared to, for example, the Unreal games...

And about V-sync: turning it on caps your fps to the Hz your monitor is set to. The reason to do this is that otherwise the image will tear if the game renders more frames than the monitor can draw... (tear as in the image being split or out of sync...)
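
A toy model of that idea (this is not how any particular driver implements it, and the numbers are just examples): a finished frame is only put on screen at the monitor's next refresh tick, so the effective framerate can never exceed the refresh rate and no refresh ever shows pieces of two different frames.

[code]
import math
import time

# Toy model of V-sync: frames are presented only on the monitor's refresh ticks.
# Purely illustrative -- no real graphics API is used; all numbers are examples.

REFRESH_HZ = 100                  # example monitor refresh rate
TICK = 1.0 / REFRESH_HZ

def run_with_vsync(render_time_s: float, duration_s: float = 1.0) -> int:
    """Return how many frames get presented in duration_s with V-sync on."""
    start = time.perf_counter()
    presented = 0
    while time.perf_counter() - start < duration_s:
        time.sleep(render_time_s)                     # pretend to render a frame
        elapsed = time.perf_counter() - start
        next_tick = (math.floor(elapsed / TICK) + 1) * TICK
        time.sleep(max(0.0, start + next_tick - time.perf_counter()))  # wait for the refresh
        presented += 1                                # frame shown, no tearing
    return presented

# A GPU fast enough for ~500fps still only gets ~100 frames on screen per second:
print(run_with_vsync(render_time_s=0.002))
[/code]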
 
Hz and fps are not the same thing, and the difference between refresh rate and framerate shows up at lower refresh rates. Even at higher framerates you will notice that the refresh rate is crap if it is 65Hz, but don't blame that on the fps.
The actual number is a little closer to 55fps, but it is NOT a myth; the myth is that higher framerates are good for anything except maybe helping out in "heavier" graphics areas.

According to the many documents I have read on the topic (mostly from video broadcasting classes in college), a human with perfect vision can only see about 55fps; anyone who can see more than that is either from Krypton or lying (more than likely they are placing too much faith in whatever is showing them the framerate they are supposed to be seeing).
 
Originally posted by StormBringer
Hz and fps are not the same thing, and the difference between refresh rate and framerate shows up at lower refresh rates. Even at higher framerates you will notice that the refresh rate is crap if it is 65Hz, but don't blame that on the fps.
The actual number is a little closer to 55fps, but it is NOT a myth; the myth is that higher framerates are good for anything except maybe helping out in "heavier" graphics areas.

According to the many documents I have read on the topic (mostly from video broadcasting classes in college), a human with perfect vision can only see about 55fps; anyone who can see more than that is either from Krypton or lying (more than likely they are placing too much faith in whatever is showing them the framerate they are supposed to be seeing).

So how come a monitor with a refresh rate of 75Hz gives me a headache and eyestrain? I can 'see' the refresh, and I'm talking about when it's on the desktop, not while playing games.

It's the fact that I can 'see' the refresh that causes the eyestrain and headaches.

I run my 19" Iiyama at 1152x864 @ 100Hz, and I run all my games the same way.
 
Originally posted by StormBringer
According to the many documents I have read on the topic (mostly from video broadcasting classes in college), a human with perfect vision can only see about 55fps; anyone who can see more than that is either from Krypton or lying (more than likely they are placing too much faith in whatever is showing them the framerate they are supposed to be seeing).
Are you talking about movie or game fps? As I already explained, there is a big difference; a movie at 55fps would really be 110fps...

A game at 60fps should be comparable to a movie at 30, but it is not, due to the other slowdowns that interactivity brings; i.e. the input from our mice and keyboards slows it down... (or rather, makes it look slower to our eyes)
 
You guys buy into too much propaganda from the gaming industry. I'm not sure what you are talking about with the refresh rate, but refresh rate is the rate at which the screen is rescanned, and of course a lower frequency is going to be more noticeable. A 75Hz refresh rate means the entire screen is scanned 75 times a second; that's fairly slow and yes, it can be seen, especially if the image is dark. It has nothing to do, however, with the fps of the game you are playing.

As for this movie vs. game fps: a basic CRT TV shows video at about 35fps, though the source is broadcast at 60fps (this is assuming you are watching a movie or TV show on a network and not from a video disc). The end result is 30-35fps as viewed by the user.
 
I don't need any tech specs or theories, because I can see with my own eyes a very large difference between 55fps and 100fps. I also see a large difference between an 85Hz refresh rate and a 100Hz refresh rate.

One thing that's very important to remember is that your visible fps will never exceed your refresh rate. You can't see 100fps on a 60Hz refresh rate. It's just not possible.
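
To put that as a tiny sketch (the numbers below are just examples): the rate you actually see is whichever is smaller, what the GPU renders or what the monitor refreshes.

[code]
# Visible framerate is limited by both the renderer and the display.
def visible_fps(rendered_fps: float, refresh_hz: float) -> float:
    return min(rendered_fps, refresh_hz)

print(visible_fps(100, 60))   # 60 -> the monitor is the bottleneck
print(visible_fps(55, 100))   # 55 -> the GPU is the bottleneck
[/code]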

The whole motion blur thing is also really important. In motion photography the blur connects the range of movement from one frame to the next. Rendered graphics don't have that blur, so if in one frame the subject is on the left side of the screen and in the next frame it's at the center of the screen, there's going to be quite a large jump in its motion. As you go up in framerate with rendered graphics, I've noticed that a moving image starts to look more and more like a solid object moving across your screen. That is to say, more of its distance travelled over time is accounted for visually.
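
One way to see why that is: with no blur to bridge the gap, the only thing that shrinks the jump between frames is more frames. A rough sketch with made-up numbers (an object crossing a 1280-pixel-wide screen in one second):

[code]
# How far a moving object jumps between consecutive rendered frames.
def pixels_per_frame(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

for fps in (24, 60, 100):
    jump = pixels_per_frame(1280, fps)  # object crosses the screen in one second
    print(f"{fps:3d} fps -> jumps {jump:5.1f} px per frame")
# At 24fps the object leaps ~53px per frame; at 100fps only ~13px, so motion looks far more continuous.
[/code]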
 
Originally posted by Caxus
I don't need any tech specs or theories, because I can see with my own eyes a very large difference between 55fps and 100fps. I also see a large difference between an 85Hz refresh rate and a 100Hz refresh rate.

I can't see a difference in the fps at all, but I can see a difference between 60Hz and 72Hz. Anything above 72Hz looks the same to me, though.

So I don't care if my game is at 24fps or 240fps, as long as it's above 23 ;) And if my CAS latencies can take a game from 14 to 24, that's the difference between needing an upgrade and not needing one :) or lowering the in-game resolution, for that matter.
 