AA Blur.


vilex

Does anyone else find that enabling anti-aliasing, either in 2x or 4x, causes the image to be too blurry to enjoy looking at? Apparently enabling this has become a standard for all games...so it seems like others don't really have a problem looking at it.
 
It MAY become blurry if you enable AA while using a 640 x 480 resolution. That happened to me.

Then I upped the res to 1024 x 768, and the blurring disappeared :knock:
 
Well compared to without AA, definitely. Even just 2x AA makes it rather unpleasant to look at after a while.
 
What is your graphics card and what game? I think the blurriness you're referring to is called artifacting.


I run all my games @ 1280 x 1024 @ 75Hz, with AA @ 8xS and anisotropic filtering @ 16x, and in all my games it looks awesome. I get about 65 to 500 fps (depending on what game it is).
 
Well I'm really not new to video cards, or tweaking them extensively...I just notice that having AA makes the screen more blurry than without--which is actually what AA does in order to create that smoother image. It's just that it doesn't seem to bother many people.

I will note that you're using a refresh rate under 85hz, which could lead to eye strain. If that doesn't bother you, I'm not sure AA would appear blurry to you either.
 
vilex said:
I will note that you're using a refresh rate under 85hz, which could lead to eye strain. If that doesn't bother you, I'm not sure AA would appear blurry to you either.


Ever heard of an LCD?? They run beautifully at 60 hertz. lol, save the incoherent jests for yourself lol. *smashes big box* :knock: :knock: :blackeye: :rolleyes:
 
...

Yeah. First of all, I don't have an LCD and I don't find them easier on the eyes than CRTs. Second of all, what I said wasn't meant to be offensive and didn't really deserve your assholish response. And third, there really isn't a third, it just sounds more official.
 
Hi vilex-
AA itself does not make an image more blurry- at least not standard or conventional methods of AA.

AA, in a nutshell, simply means "anti-aliasing"... so yes, applying some sort of blur filter or other method of masking high-contrast pixels or visual artifacts can fall under the (loose) description of an "AA" algorithm.

Example- Quincunx AA on Nvidia cards simply applies a blur filter, so details are lost and the image becomes a bit blurrier overall, trading sharp detail for a reduction in aliasing and display artifacts.
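
Just to illustrate the idea, here's a rough Python sketch of a quincunx-style resolve; the 1/2 centre and 1/8 corner weights are the commonly quoted ones and only an approximation, not NVIDIA's exact filter:

```python
def quincunx_blur(img):
    """Blend each pixel with its four diagonal neighbours (clamped at borders)."""
    h, w = len(img), len(img[0])

    def px(y, x):
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    return [[0.5 * px(y, x)
             + 0.125 * (px(y - 1, x - 1) + px(y - 1, x + 1)
                        + px(y + 1, x - 1) + px(y + 1, x + 1))
             for x in range(w)] for y in range(h)]

# A sharp edge gets softened -- detail is traded for fewer visible "jaggies".
row = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(quincunx_blur([row, row, row])[1])
```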

Multisampling and supersampling do not blur the image. In fact, they are an oversampled/more accurate depiction of the rendering in pixel-space as there is more information per pixel (i.e. the averaged result of multiple samples).
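
As a toy illustration of why supersampling adds information rather than smearing (the scene function and 2x2 sample offsets here are made up for the example, not any vendor's resolve code): the final pixel is simply the average of several samples taken inside that one pixel's footprint:

```python
def scene(x, y):
    # A hard diagonal edge: white above the line y = x, black below.
    return 1.0 if y > x else 0.0

def supersample_pixel(px, py, grid=2):
    samples = []
    for sy in range(grid):
        for sx in range(grid):
            # Sample positions spread inside this single pixel's footprint.
            samples.append(scene(px + (sx + 0.5) / grid, py + (sy + 0.5) / grid))
    return sum(samples) / len(samples)

# Pixels straddling the edge get intermediate coverage values instead of
# snapping to 0 or 1 -- that's the "smoothing", not a blur of neighbours.
print([round(supersample_pixel(x, 2.0), 2) for x in range(5)])
```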

Some forms of AA that perform supersampling (such as Adaptive, Transparent, Super AA or Crossfire AA) may need the LOD bias adjusted in order to restore prior texture detail. So this is a simple case of drivers not automatically adapting the LOD Bias to compensate for the new pixel space. Most 3rd party tools (RivaTuner for NV/ATI or ATI Tray Tool for ATI) have adjustments where you can simply bump LOD Bias downwards (towards negative or into the negatives) to compensate for this LOD shift from multiple texture samples.
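
If you're curious what that knob maps to at the API level, OpenGL exposes a per-texture LOD bias. Here's a minimal sketch using PyOpenGL, assuming a live GL context and a bound texture (the -0.5 value is just an example; the tools above apply the same idea globally through the driver):

```python
# Minimal PyOpenGL sketch of a per-texture LOD bias (assumes an existing GL
# context and a bound texture). A negative bias makes the hardware pick
# sharper mipmap levels, compensating for the extra sampling some AA modes do.
from OpenGL.GL import GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, glTexParameterf

def apply_lod_bias(bias=-0.5):
    # -0.5 is only an example value; driver-level tools expose the same
    # control globally instead of per texture.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, bias)
```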

Multisampling? Most MS routines do not make an image blurrier at all unless there is some sort of gamma correction or gamma shift in the algorithm. This again is adding something on top of normal multisampling AA- the hypothesis being that adjusting pixel gamma (another form of detail LOSS) in certain ways looks more aesthetically pleasing, which may not be true in all cases. Again, there are usually ways to disable the gamma alteration added to multisampling techniques.
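
To make the gamma point concrete, here's a toy comparison using a plain 2.2 power curve (a simplification of the real sRGB transfer function, and not NVIDIA's actual algorithm). Averaging edge samples as stored sRGB values gives a noticeably different result than averaging in linear light, which is why a gamma-corrected resolve can read as softer on some displays:

```python
# Toy illustration (plain 2.2 power curve, not the exact sRGB transfer
# function): averaging edge samples as stored sRGB values vs. in linear
# light gives a different edge brightness after the AA resolve.
def blend_srgb(a, b):
    return (a + b) / 2.0                     # naive average of stored values

def blend_linear(a, b):
    lin = (a ** 2.2 + b ** 2.2) / 2.0        # average in linear light...
    return lin ** (1.0 / 2.2)                # ...then re-encode for display

# A half-covered edge pixel between black (0.0) and white (1.0):
print(round(blend_srgb(0.0, 1.0), 2))        # 0.5
print(round(blend_linear(0.0, 1.0), 2))      # ~0.73 -- the edge reads brighter
```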

Hope this helps and makes sense to you. :)
 
Well I didn't quite find it offensive, because I quickly proved myself wrong. Just in the future, don't jump to conclusions (I would say I'm sorry, but you took it too far, and before the mods get into it let's just stop it now). It's really just the user's opinion in the end whether it looks blurry. Play with the graphics settings until you find what you like (sometimes for a particular setup some settings will look better off and others on). Anyway, what game is it and what hardware do you have? Because like I said, some settings look better off and some look better on (and maxed out). Just take a good 30 minutes playing with all the settings.
 
Funny...all I asked was if anti-aliasing appeared blurry to anyone else. That's all I wanted to know.
 
vilex-
My reply was basically: "Nope, it doesn't look blurry.. although there are a number of ways it *can* look blurry, but using tools to correct problems removes this easily."

So if my reply wasn't straightforward enough, perhaps I can make it a little simpler:
1) Don't use blur-filter AA methods (i.e. Quincunx)
2) With supersampling AA methods, it's possible your IHV (i.e. ATI or NVIDIA) didn't shift the LOD Bias to compensate for the higher sub-sample space per pixel. Adjust the LOD value downwards.
3) With multisampling AA, disable gamma correction if this appears blurry.

Again, IHVs implement "AA + some stuff".. where the "some stuff" can differ per mode, per hardware maker, per display and per game. This is what 3rd party tools help with- disabling things that may make your experience a bit "blurry" hehe. Makers of video cards/drivers try to make simple modes for ease of use, but there is no "one size fits all" with AA modes. Many may look deliciously smooth and detailed in 90% of games, but like blurry eyesores in the other 10%. So for that remaining 10%, you can turn off/tune the "something else" your IHV/drivers have added.

If you have some specific questions- such as a particular card, particular resolution, or particular AA settings in a particular game that you find blurry- feel free to share them. We may be able to point you towards what settings or things you can apply to sharpen things up while still enjoying the benefits of AA and increased visual detail/accuracy.

Good luck and happy gaming!
 
Sharkfood,

Thanks for the detailed and helpful replies. I've found a page that describes nicely the kind of effect that AA seems to be having on my graphics: http://www.grafx-design.com/02gen.html

Whether I enable AA from the particular game (I play tons of games...the blurriness is the same across the board) or force it from the nvidia control panel, it has the same effect. I already have everything sharpened on my monitor via the nvidia control panel because this CRT seems to be blurry to me otherwise...(I suppose it's getting rather old) so the LOD Bias reduction isn't going to help. I tried it anyway, -1.0 and then -0.5, and it made everything far too sharp. But still, the image is fine and detailed to me normally until AA is applied...then not only do I notice it's more blurry, but my eyes seem to be bothered by the change. It doesn't matter though...in a way it's good that AA seems to bother me--means I don't have to have it on all the time and have lower framerates because of it.

If I have AA on 2x in a higher resolution like 1280x960 it doesn't seem to be as blurry. But that causes lower framerates...I'd rather just use 1024x768 with no AA and play smoothly. And I have an MSI Geforce 6600GT...which seems to be widely popular. The monitor is a 19" Philips 109B CRT.
 
Hi again vilex!

As you're on a 6600GT, your AA options are going to be limited. It's only an 8-pipeline card with a 128-bit memory interface.. whereas most games with AA are better suited to 16-, 20- or 24-pipeline cards with a 256-bit memory interface, for reasons of samples vs. performance.

As you're talking specifically about enabling normal AA (or in-game AA), this means you are limited to multisample AA. LOD Bias will have no effect on this, as multisampling does not affect textures at all- only polygon edges. You can use anisotropic filtering to sharpen/modify textures.
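
For reference, the API-level equivalent of that anisotropic filtering suggestion looks something like the PyOpenGL sketch below (assuming a GL context, a bound texture and driver support for the EXT_texture_filter_anisotropic extension; in practice you'd normally just flip it in the driver panel or the game):

```python
# PyOpenGL sketch of enabling anisotropic filtering on the currently bound
# texture via the EXT_texture_filter_anisotropic extension (assumes a GL
# context, a bound texture and driver support; 16.0 is just an example).
from OpenGL.GL import GL_TEXTURE_2D, glTexParameterf
from OpenGL.GL.EXT.texture_filter_anisotropic import GL_TEXTURE_MAX_ANISOTROPY_EXT

def enable_anisotropic_filtering(degree=16.0):
    # Real code would clamp 'degree' to the GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT
    # limit reported by the driver.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, degree)
```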

What you are describing, though, is the SAME thing that bothers the heck out of me and is specific to NVIDIA hardware- it uses "Intellisample" for its multisampling, which creates a gamma correction shift. ATI also has gamma correction shifts for their AA, but theirs is far, far superior to NV's. As you're also a CRT user (such as myself), it makes a bigger difference here.. and it's not noticeable at all on LCDs/flat panel displays as they already have a fixed-gamma pixel space.

There is also a bug with gamma corrected AA in the 81.xx series Forceware drivers so, more often than not, you get very blurry/inaccurate results. Later Forceware versions are better, but not great.

In the Global Driver Settings, Advanced Settings of the Forceware control panel, you should find an option for "Gamma correct antialiasing". Be sure to try this in its "Off" setting and also reboot to ensure the registry is re-read. If your drivers are installed correctly and your registry is clean of foreign/past NV drivers, this setting should sharpen up edges greatly for you. If it has little impact, it's possible it's not taking effect or you have a set of drivers with the gamma bug.

In all, I really don't think a 6600GT is best suited for AA in modern games. It's a great entry-level/budget 3D gaming card, but using AA and advanced AA modes is best suited to 6800-7900 series cards, or Radeons with 12 or more pipelines + a 256-bit memory interface.

Hope this helps!
 
Again, thanks for the detailed reply. I couldn't find the "gamma correct anti-aliasing" option that you spoke of...I have the official 84.21 drivers...and I'm not seeing this option anywhere in the global settings. Is there any outside software to disable it, or perhaps there's an overall better driver to use with the 6600GT? I know that when I had a GeForce4 Ti 4200 64MB, there was one older, universal driver everyone swore by...
 