nanako
Member level 5
A general knowledge question: I'm mostly a digital person and would like to understand how such high data transfer rates are achieved on the analogue side. An example is Serial ATA, whose line rate is 1.5 Gbps (the 10-bit rate, i.e. after 8b/10b encoding). Since the data is serial, that means a bit is driven onto the wire at 1.5 GHz, roughly 0.67 ns per bit, which seems impossible to me with ordinary digital logic. I believe it is the analogue portion that finally makes the thing work, and I'd like to know the theory behind it.
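To put numbers on my question, here's a quick sanity-check of the bit timing and the payload rate implied by the 8b/10b line coding (the "10-bit rate" I mentioned):

```python
# SATA gen-1 numbers: 1.5 Gbps line rate, 8b/10b coding (10 line bits per data byte)
line_rate_bps = 1.5e9

# Time available per bit on the wire
bit_period_s = 1.0 / line_rate_bps
print(f"bit period: {bit_period_s * 1e12:.0f} ps")  # about 667 ps per bit

# Effective payload rate after removing the 8b/10b coding overhead
payload_rate_bps = line_rate_bps * 8 / 10
print(f"payload rate: {payload_rate_bps / 1e9:.1f} Gb/s")  # 1.2 Gb/s of user data
```

So each bit gets only about two-thirds of a nanosecond on the wire, which is why I suspect the answer lies in the analogue front end (SerDes, PLLs, equalization) rather than in the digital logic itself.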