Hi vilex-
AA itself does not make an image blurrier - at least, not standard or conventional methods of AA.
That said, AA in a nutshell simply means "anti-aliasing"... so yes, applying some sort of blur filter, or any other method of masking high-contrast pixels or visual artifacts, can fall under the (loose) description of an "AA" algorithm.
Example: Quincunx AA on Nvidia cards simply applies a blur filter, so details are lost and the image becomes a bit blurrier overall, trading sharp detail for a reduction in aliasing and display artifacts.
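To make the "blur filter" point concrete, here is a minimal C sketch of a quincunx-style weighted average (centre sample weighted 1/2, four diagonal neighbours 1/8 each). The image layout and values are made up for illustration - the real Quincunx hardware works on subpixel samples shared between neighbouring pixels - but the detail-smearing effect is the same:

```c
/* Quincunx-style 5-tap filter: centre weighted 1/2, four diagonal
 * neighbours weighted 1/8 each. Hypothetical image layout; real
 * Quincunx shares subpixel samples between adjacent pixels. */
#include <stdio.h>

#define W 4
#define H 4

static float quincunx(float img[H][W], int x, int y)
{
    /* Clamp coordinates at the image border. */
    int xl = x > 0 ? x - 1 : 0, xr = x < W - 1 ? x + 1 : W - 1;
    int yt = y > 0 ? y - 1 : 0, yb = y < H - 1 ? y + 1 : H - 1;

    return 0.500f * img[y][x]
         + 0.125f * (img[yt][xl] + img[yt][xr]
                   + img[yb][xl] + img[yb][xr]);
}

int main(void)
{
    /* A hard vertical 0/1 edge: the filter smears it into
     * intermediate values - less aliasing, but less sharpness. */
    float img[H][W] = {
        {0, 0, 1, 1}, {0, 0, 1, 1}, {0, 0, 1, 1}, {0, 0, 1, 1}
    };
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++)
            printf("%.3f ", quincunx(img, x, y));
        printf("\n");
    }
    return 0;
}
```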
Multisampling and supersampling do not blur the image. In fact, they produce an oversampled, more accurate depiction of the rendering in pixel space, since there is more information per pixel (i.e. each pixel is the averaged result of multiple samples).
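As a rough illustration of what "more information per pixel" means, here is a small C sketch of a supersample resolve: render at twice the target resolution on each axis, then box-average each 2x2 block down to one output pixel. The resolution, sample count and coverage values are all hypothetical:

```c
/* Supersample resolve sketch: each output pixel is the box average
 * of a 2x2 block of higher-resolution samples. Nothing is blurred
 * away - each pixel just carries more information about its area. */
#include <stdio.h>

#define OW 2           /* output width/height  */
#define SS 2           /* samples per axis     */
#define IW (OW * SS)   /* supersampled width   */

int main(void)
{
    /* Hypothetical 4x4 supersampled coverage of a triangle edge. */
    float hi[IW][IW] = {
        {1, 1, 1, 0}, {1, 1, 0, 0}, {1, 0, 0, 0}, {0, 0, 0, 0}
    };
    for (int y = 0; y < OW; y++) {
        for (int x = 0; x < OW; x++) {
            float sum = 0.0f;
            for (int sy = 0; sy < SS; sy++)
                for (int sx = 0; sx < SS; sx++)
                    sum += hi[y * SS + sy][x * SS + sx];
            printf("%.2f ", sum / (SS * SS)); /* averaged result */
        }
        printf("\n");
    }
    return 0;
}
```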
Some forms of AA that perform supersampling (such as Adaptive, Transparency, Super AA or Crossfire AA) may need the LOD bias adjusted in order to restore the prior texture detail. This is simply a case of the drivers not automatically adapting the LOD bias to compensate for the new pixel space. Most 3rd-party tools (RivaTuner for NV/ATI, or ATI Tray Tools for ATI) have adjustments where you can simply bump the LOD bias downwards (toward, or into, negative values) to compensate for this LOD shift from the extra texture samples.
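For a feel of why a negative bias restores sharpness, here is a simplified C sketch of mip-level selection: the hardware picks a mip level of roughly log2(texel footprint per pixel) plus the bias, so pushing the bias negative selects a lower-numbered (sharper) mip level. The footprint value is hypothetical and the formula is simplified - real hardware clamps and blends between levels:

```c
/* Simplified mip selection: level ~ log2(texels per pixel) + bias.
 * A negative LOD bias selects a lower, sharper mip level. */
#include <stdio.h>
#include <math.h>

static float mip_level(float texels_per_pixel, float lod_bias)
{
    float level = log2f(texels_per_pixel) + lod_bias;
    return level < 0.0f ? 0.0f : level;  /* clamp to the base level */
}

int main(void)
{
    float footprint = 4.0f;  /* hypothetical: 4 texels per pixel */
    printf("bias  0.0 -> mip %.2f\n", mip_level(footprint,  0.0f));
    printf("bias -0.5 -> mip %.2f\n", mip_level(footprint, -0.5f));
    printf("bias -1.0 -> mip %.2f\n", mip_level(footprint, -1.0f));
    return 0;
}
```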
Multisampling? Most MS routines do not make the image blurrier at all unless there is some sort of gamma correction or gamma shift in the algorithm. That, again, is adding something on top of normal multisampling AA - the hypothesis being that adjusting pixel gamma (another form of detail LOSS) in certain ways looks more aesthetically pleasing to the eye, which may not be true in all cases. Again, there are usually ways to disable the gamma alteration added to multisampling techniques.
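Here is a small C sketch of the difference, assuming a display gamma of 2.2: a plain resolve averages the stored framebuffer values directly, while a gamma-"corrected" resolve converts to linear light, averages, and converts back, which shifts edge pixels brighter:

```c
/* Plain MSAA resolve vs gamma-corrected resolve on one edge pixel.
 * Gamma 2.2 is an assumption standing in for the display response. */
#include <stdio.h>
#include <math.h>

#define GAMMA 2.2f

int main(void)
{
    /* Two subpixel samples straddling a black/white edge. */
    float a = 0.0f, b = 1.0f;

    /* Plain resolve: average the framebuffer values directly. */
    float plain = (a + b) / 2.0f;

    /* Gamma-corrected resolve: average in linear light instead. */
    float lin = (powf(a, GAMMA) + powf(b, GAMMA)) / 2.0f;
    float corrected = powf(lin, 1.0f / GAMMA);

    printf("plain resolve:           %.3f\n", plain);      /* 0.500  */
    printf("gamma-corrected resolve: %.3f\n", corrected);  /* ~0.730 */
    return 0;
}
```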
Hope this helps and makes sense to you.