The relation between jitter and data rate


ssuchitav

Hi,

I would like to know the relation between jitter and data rate. Does jitter increase if we increase the data rate?

Thanks in advance.
 

Re: Jitter Vs Data Rate

Once the bit interval becomes shorter than the transmission
line's round-trip time of flight, things get a lot nastier.
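
For a sense of scale, here is a back-of-envelope sketch. The numbers are mine, not from the post: a 30 cm FR-4 trace and a propagation velocity of roughly half the speed of light.

```python
# Back-of-envelope: the data rate at which one bit interval
# equals the round-trip time of flight on a PCB trace.
# Assumed numbers: 30 cm trace, signal velocity ~ c/2 in FR-4.

C = 3.0e8                # speed of light, m/s
v = C / 2                # rough propagation velocity in FR-4
length_m = 0.3           # assumed trace length, m

t_round_trip = 2 * length_m / v      # round-trip time of flight, s
rate_bps = 1.0 / t_round_trip        # rate where bit interval == round trip

print(f"round trip: {t_round_trip * 1e9:.2f} ns")
print(f"bit interval matches round trip at ~{rate_bps / 1e6:.0f} Mbps")
```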

If your data path is not allowed to fully settle within the bit
interval, there will also be data-dependent jitter. Subtler
effects are possible too, such as bias rails being perturbed
and "remembering" data history short-term.
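
To illustrate the settling effect (my own sketch, not the poster's model), the snippet below drives random NRZ data through a single-pole RC channel and measures how far the threshold-crossing time wanders with data history. The bit interval and tau values are arbitrary assumptions; when tau is small relative to the bit interval the channel fully settles and the data-dependent jitter collapses toward zero.

```python
# Data-dependent jitter from incomplete settling, modeled as a
# single-pole RC channel driven by random NRZ data. The voltage at
# the start of each bit depends on recent history, so the
# threshold-crossing time wanders with the data pattern.

import math
import random

def ddj_pp(bit_interval, tau, n_bits=100_000, seed=1):
    """Peak-to-peak spread of threshold-crossing times at transitions."""
    rng = random.Random(seed)
    v = 0.0          # channel output at the current bit boundary
    prev = 0
    crossings = []
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        target = float(bit)
        # a 0.5 threshold crossing happens only on a real transition
        if bit != prev and (v - 0.5) * (target - 0.5) < 0:
            # settling is exponential: v(t) = target + (v - target)*exp(-t/tau)
            # solve v(t_cross) = 0.5 for the crossing time
            t_cross = tau * math.log((v - target) / (0.5 - target))
            crossings.append(t_cross)
        # voltage reached by the end of this bit interval
        v = target + (v - target) * math.exp(-bit_interval / tau)
        prev = bit
    return max(crossings) - min(crossings)

T = 1.0  # bit interval, arbitrary units
for tau in (0.05, 0.1, 0.2, 0.3):
    print(f"tau = {tau:.2f}*T  ->  p-p DDJ = {ddj_pp(T, tau):.4f}*T")
```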

But from zero up to the frequencies where those kinds of
effects start to roll on, I don't think jitter follows data rate.
Sample size, and the statistics you can acquire in a reasonable
time, will: you'll have to wait a whole lot longer to catch your
+4-sigma outlier at 1 Mbps than at 1 Gbps. So short-term
'scope-type peak-to-peak jitter stats may -look- better at
1 Mbps than at 1 Gbps, if the link is otherwise capable.
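
To put numbers on the sample-size point, here is a quick Monte Carlo sketch. It assumes purely Gaussian random jitter with the same sigma at both rates; the edge counts stand in for what each link delivers in the same observation window.

```python
# Observed peak-to-peak jitter vs. number of captured edges.
# Assumption: identical Gaussian random jitter (sigma = 1) at both
# rates; only the number of edges collected in a fixed window differs.

import random

def observed_pp(n_edges, sigma=1.0, seed=2):
    rng = random.Random(seed)
    samples = [rng.gauss(0.0, sigma) for _ in range(n_edges)]
    return max(samples) - min(samples)

# edges captured in roughly the same observation window at each rate
for rate, n_edges in (("1 Mbps", 1_000), ("1 Gbps", 1_000_000)):
    pp = observed_pp(n_edges)
    print(f"{rate}: {n_edges:>9,} edges -> observed p-p = {pp:.2f} sigma")
```

The underlying distribution is identical in both runs; only the chance of catching a far-tail sample differs, which is why short-term peak-to-peak readings flatter the slower link.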
 
