In this thread, I'm thinking out loud. I honestly don't know much about this topic, but I'm trying to reason out ideas based on what I think I understand, and I'm hoping to get some useful information and perhaps a discussion about networking. Facts only, please; if you don't know something for sure, please say you aren't positive. Here are my assumptions and my questions. Perhaps someone with more network-cabling knowledge can point out where I'm right or wrong and provide answers to my questions. This thread got me thinking about this stuff.

100Mbps and 1Gbps use the same two pairs of wires. Is my assumption valid? I've read online that 1Gbps networks use all 8 wires, but I don't feel like this is correct. I do know that 100Mbps networks use 4 wires of a CAT 5/6 cable (8 wires total): two wires for TX (transmit) and two for RX (receive). I believe the additional wires are used for other applications such as Power over Ethernet and possibly other proprietary uses. Regardless, I've always included all the wires when crimping and will continue to do so, but I'd like to know for certain.

CAT cables must meet the following MHz ratings: CAT-5 (100MHz), CAT-5e (125MHz) and CAT-6 (250MHz). This is my understanding; are these numbers correct?

One pair for TX and one pair for RX, with one wire positive and one wire negative? I'm not an electronics engineer, so I don't really know much about currents, components and how they work together to make things happen, but my assumption is that these two wires create a single, serial data path. Is this true? Does this make sense? What's really going on?

Assuming network data is serial, how does MHz tie into all of this? Serial implies one bit of data at a time, yes? Certified CAT-5e is capable of 1000Mbps, the 'standard' being 125MHz. 1000Mbps is 1,000,000,000 bits per second; 125MHz is 125,000,000Hz. Coincidentally, that works out to 8 bits per cycle.
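The division in that last paragraph can be sketched in a few lines of Python. To be clear, this is just my naive "bits divided by cycles" arithmetic, not a claim about how the actual line signaling works:

```python
# Back-of-the-envelope check: divide the line rate by the cable's rated
# bandwidth to get an implied "bits per cycle" figure. This is a naive
# model (my own assumption), not how the PHY actually encodes data.

line_rate_bps = 1_000_000_000   # 1000 Mbps (Gigabit Ethernet)
bandwidth_hz = 125_000_000      # 125 MHz (the Cat-5e figure quoted above)

bits_per_cycle = line_rate_bps / bandwidth_hz
print(bits_per_cycle)  # 8.0 -- eight bits, i.e. one byte, per cycle
```

That 8.0 is the "coincidence" I'm asking about: whether one byte per cycle is a real mechanism or just numbers lining up.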
8 bits is a byte, so 1 byte of data per clock cycle might have been my assumption, but the MHz ratings for cabling don't scale that way: CAT-5e (1000Mbps capable) is 125MHz, while CAT-5 (merely 100Mbps capable) is only 25MHz less at 100MHz. CAT-6 is 250MHz. Can anyone explain how MHz and data transmission work together to achieve a maximum bandwidth? Is my assumption of 1 byte per cycle correct, meaning cable such as CAT-5 was under-utilized? Does this mean, theoretically, that CAT-6 can achieve 2Gbps or greater? Is it as simple as 1 byte per clock cycle? Are there other factors, like compression or other technologies, which make this number fuzzier? Any other thoughts?
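To show why the scaling bothers me, here's the "1 byte per cycle" assumption applied to each category's MHz rating (using the figures I quoted above, which I'm not positive are exact):

```python
# Apply my "1 byte (8 bits) per cycle" assumption to each cable
# category's rated bandwidth and see what throughput it would imply.
# The MHz figures are the ones quoted earlier in this post.
ratings_mhz = {"Cat-5": 100, "Cat-5e": 125, "Cat-6": 250}

for category, mhz in ratings_mhz.items():
    implied_mbps = mhz * 8  # 8 bits per cycle
    print(f"{category}: {mhz} MHz -> {implied_mbps} Mbps implied")

# Cat-5:  100 MHz ->  800 Mbps implied (yet it's rated for only 100 Mbps)
# Cat-5e: 125 MHz -> 1000 Mbps implied (matches the 1000 Mbps rating)
# Cat-6:  250 MHz -> 2000 Mbps implied (would suggest 2 Gbps)
```

Only Cat-5e lines up with the assumption; Cat-5 would be massively under-utilized and Cat-6 would imply 2Gbps, which is exactly the discrepancy I'm asking about.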