I may be wrong, but this is my take on the bit system:
A bit is the smallest unit of digital information. Most internet connections are measured in kilobits (1,000 bits), such as 28.8k. You may notice that you don't actually download at 28.8 kilobytes per second, as download managers measure, but rather at 3.6 kilobytes per second, which is the equivalent of 28.8 kilobits: a byte is 8 bits, and 28.8 divided by 8 is 3.6. A byte also holds one normal character (e.g. one ASCII letter, number, or sign), so an 8-letter word takes 8 bytes.
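Here's a quick sketch of that math in Python, just to make it concrete (the function name is mine, purely for illustration):

    BITS_PER_BYTE = 8

    def kilobits_to_kilobytes(kbits):
        # A byte is 8 bits, so divide the bit count by 8.
        return kbits / BITS_PER_BYTE

    # A 28.8k modem moves 28.8 kilobits per second...
    print(kilobits_to_kilobytes(28.8))  # -> 3.6 (kilobytes per second)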
But, in relation to computer processors, I think the default '32-bit' or '64-bit' label pertains to how much data the processor handles at once: its registers are either 32 or 64 bits wide.* An operating system is also built for one of these architectures, which is why you hear about the 64-bit version of Windows XP. 64-bit processors are backwards compatible with 32-bit software, but a 32-bit processor cannot run 64-bit code.
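If it helps, here's a tiny Python sketch of one concrete consequence of register width (assuming that's what the label refers to), namely the largest unsigned value each size can hold:

    # Largest unsigned integer that fits in an n-bit register: 2**n - 1.
    for width in (32, 64):
        print(width, "bits ->", 2**width - 1)
    # 32 bits -> 4294967295  (about 4.29 billion; also why 32-bit systems
    #                          top out around 4 GB of addressable memory)
    # 64 bits -> 18446744073709551615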
As for the 128-bit GPUs, I think that's a different usage of 'bit' entirely, but I really can't explain it.
I think I butchered that. I never really learned about the bit architecture system, and these are all just logical assumptions. I'd gladly welcome anyone who can explain it to me better/correctly.
*I'm really not sure about that.