16-bit vs 32-bit color difference?

Status
Not open for further replies.

cigarman

Is there really a visual difference to the naked eye between 16-bit color and 32-bit color? I can see none. When I fix other people's computers, particularly older, under-powered units, I usually tweak down their color setting to 16-bit and find that their unit runs a bit faster. A while back I read that there was little visual difference to the naked eye, and so I continue to tweak older computers' color settings down to 16-bit.

Any comments or differing opinions?
 
People have different sensitivities: some people can't stand 16-bit color, while others won't mind. If you can't tell the difference, you might as well use 16-bit. I myself couldn't use it for more than five minutes. :p
 
It really depends on many factors and what specifically you're talking about.

If you mean the Windows desktop color depth, yes- you will see noticeable graininess/banding in true-color images. If you pull up an image with many shades of a single color, you'll see color banding at 16-bit that will be much smoother at 32-bit.

For example- try viewing the linked image below for an example of the color banding that appears at 16-bit but not at 32-bit:
http://img452.imageshack.us/img452/8343/gradient32bit3ze.png
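If you want to see where the banding comes from, here's a quick plain-Python sketch (no libraries needed): 16-bit color is usually RGB565, meaning 5 bits for red/blue and 6 for green, so a smooth 256-step ramp collapses to far fewer shades per channel.

```python
# Sketch: quantize an 8-bit-per-channel gradient down to 16-bit color
# (RGB565: 5 bits red/blue, 6 bits green) and count surviving shades.

def quantize(value, bits):
    """Reduce an 8-bit channel value (0-255) to `bits` bits, scaled back."""
    levels = (1 << bits) - 1
    step = round(value * levels / 255)
    return round(step * 255 / levels)

gradient = list(range(256))                       # smooth 256-step ramp
red_16bit = {quantize(v, 5) for v in gradient}    # 5-bit red/blue channel
green_16bit = {quantize(v, 6) for v in gradient}  # 6-bit green channel

print(len(set(gradient)))  # 256 shades at 32-bit (8 bits per channel)
print(len(red_16bit))      # 32 shades -> visible banding steps
print(len(green_16bit))    # 64 shades
```

Each of those 32 red levels has to cover 8 adjacent true-color values, and that jump between levels is exactly the band edge you see in the gradient image.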

Now, if you mean in games- some games use dithering and other tricks to reduce banding. Plus, many older games don't even ship true 32-bit textures, but 16-bit ones instead. So while running games at 32-bit can improve shading/colors, some titles gain very little.
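The dithering trick works roughly like this (a hypothetical sketch using a 2x2 Bayer threshold matrix, not any particular game's code): nudge each pixel by a small position-dependent offset before quantizing, so a hard band edge breaks up into a fine checker pattern the eye averages out.

```python
# Sketch of ordered dithering before 16-bit quantization.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # 2x2 Bayer threshold pattern, values 0..3

def dither_quantize(value, x, y, bits=5):
    """Quantize an 8-bit value to `bits` bits with an ordered-dither offset."""
    levels = (1 << bits) - 1
    step = 255 / levels                                  # one quantization step
    offset = (BAYER_2X2[y % 2][x % 2] / 4 - 0.5) * step  # -step/2 .. +step/4
    dithered = min(255, max(0, value + offset))
    return round(round(dithered * levels / 255) * 255 / levels)

# A value near the midpoint of two 5-bit levels alternates between them
# across neighboring pixels instead of snapping to one side:
print([dither_quantize(46, x, 0) for x in range(4)])  # [41, 49, 41, 49]
```

Averaged over a few pixels that checkerboard reads as an in-between shade, which is why dithered 16-bit can look much closer to 32-bit than the raw math suggests.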

Overall though, most "modern" video cards no longer gain a big performance improvement going from 32-bit down to 16-bit. Even lower-end 2D and 3D cards should show little measurable difference between the two, provided they have sufficient frame buffer/video RAM for the resolution used.
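The back-of-the-envelope math behind that last point: for a plain 2D desktop, the frame buffer just needs width x height x bytes-per-pixel (the 1024x768 resolution below is only an example).

```python
# Frame buffer size for a single screen surface, in megabytes.
def framebuffer_mb(width, height, bits_per_pixel):
    return width * height * (bits_per_pixel // 8) / (1024 * 1024)

for bpp in (16, 32):
    print(f"1024x768 @ {bpp}-bit: {framebuffer_mb(1024, 768, bpp):.1f} MB")
# 1.5 MB at 16-bit vs 3.0 MB at 32-bit -- either fits easily in even
# an old 8 MB card, which is why the speed gap has mostly disappeared.
```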
 