Asking a knowledge question here. I'm pretty much a digital fellow and would like to know how such a high data transfer rate is achieved in the analogue domain. An example would be Serial ATA: the data rate there is 1.5 Gbps (at the 10-bit line rate). Since this is serial data, that means a bit has to go out every 1/1.5 GHz (approx. 0.67 ns per bit, which seems impossible). I believe it is the analogue portion that finally makes the thing work, and I would like to know the theory behind it.
Look for papers or patents in this field. If the data is processed in parallel internally, the internal frequency is not as high as the serial data rate. Only the front-end and a few circuits work at the bus clock rate. The 1.5 Gbps stream is assembled from 10-bit words, which means the parallel side only has to run at 150 MHz. The core frequency can be far below the I/O frequency.
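Just to make that arithmetic concrete, here is a rough back-of-the-envelope sketch (plain Python, using only the SATA gen-1 numbers quoted above):

```python
# Rough arithmetic for the SATA gen-1 figures discussed above.
line_rate_bps = 1.5e9          # serial line rate: 1.5 Gbps
bits_per_word = 10             # serializer width (one 10-bit encoded word)

bit_period_s = 1 / line_rate_bps           # time per bit on the wire
word_clock_hz = line_rate_bps / bits_per_word  # clock seen by the parallel core

print(f"bit period on the wire : {bit_period_s * 1e9:.3f} ns")   # ~0.667 ns
print(f"parallel word clock    : {word_clock_hz / 1e6:.0f} MHz")  # 150 MHz
```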
About 0.67 ns per bit is not impossible - that's how Serial ATA works!
Each low-voltage differential pair (one for transmit, one for receive) is clocked at 1.5 GHz. The stream is 8b/10b encoded, so you get 150 megabytes per second of payload. Maximum cable length is one meter, if I recall correctly.
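To back up that 150 MB/s figure, here is the overhead calculation, assuming 8b/10b encoding (8 payload bits carried in every 10 line bits):

```python
# Payload throughput after 8b/10b encoding on a 1.5 Gbps line.
line_rate_bps = 1.5e9
payload_fraction = 8 / 10      # 8b/10b: 8 data bits per 10 encoded line bits

payload_bps = line_rate_bps * payload_fraction   # 1.2 Gbps of actual data
payload_bytes_per_s = payload_bps / 8            # divide by 8 bits per byte

print(f"payload rate: {payload_bytes_per_s / 1e6:.0f} MB/s")   # 150 MB/s
```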
So am I right to say that the digital circuit operates at 150 MHz (which is a very reasonable frequency), each cycle produces a 10-bit word, and that word is then passed to the front-end block (analogue) which converts it to serial data and outputs it at 1.5 Gbps?
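That's the idea. A toy software model of that parallel-to-serial hand-off is sketched below; it's purely illustrative (the bit ordering and word value are made up, and the real serializer is a high-speed shift register driven by a PLL-multiplied clock, not software):

```python
# Toy model of the SerDes front end: the digital core delivers one 10-bit word
# per 150 MHz cycle, and the serializer shifts it out one bit per 1.5 GHz
# bit period.  Bit order here (LSB-first) is an arbitrary choice for the demo.

def serialize(words, bits_per_word=10):
    """Yield the bits of each parallel word, one bit per serial clock tick."""
    for word in words:
        for i in range(bits_per_word):
            yield (word >> i) & 1

# One 150 MHz cycle's worth of data: a single (already encoded) 10-bit word.
parallel_words = [0b1010011010]

serial_bits = list(serialize(parallel_words))
print(serial_bits)       # the 10 bits pushed onto the wire for that word
print(len(serial_bits))  # -> 10: the serial clock runs 10x the word clock
```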