One can recycle the pattern you wish to test and rely on a stable clock for any choice of bit length, or use single-shot triggers derived from the clock and enabled by the data pattern. The phase-amplitude margin can also be measured in other ways. One method is to delay the clock with fixed or adjustable delay lines to produce early, nominal (PLL), and late sampling pulses, then count correct data at all three sample points. The same delays can instead be applied to the data to produce a pseudo-random early/nominal/late bit shift, increasing the delay until errors occur over one packet, one burst, or any chosen measurement length.
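The early/nominal/late counting idea above can be sketched in simulation. This is a minimal, hypothetical model, not a real BERT: the eye edge is assumed to be Gaussian-jittered, and a sample taken at a given clock offset from eye center is in error when it lands outside that edge. All constants (eye half-width, RMS jitter, offsets) are illustrative assumptions.

```python
import random

# Assumed channel model for illustration only.
EYE_HALF_WIDTH_PS = 50.0   # assumed half eye opening at the slicer
RMS_JITTER_PS = 5.0        # assumed random jitter (1 sigma)

def sample_error(offset_ps: float) -> bool:
    """True if one sample at this clock offset from eye center errs."""
    # The instantaneous eye edge wanders with Gaussian jitter.
    edge = random.gauss(EYE_HALF_WIDTH_PS, RMS_JITTER_PS)
    return abs(offset_ps) >= edge

def count_errors(offset_ps: float, n_bits: int) -> int:
    """Count errored samples over n_bits at a fixed clock offset."""
    return sum(sample_error(offset_ps) for _ in range(n_bits))

random.seed(1)  # repeatable demo
for label, off in [("early", -40.0), ("nominal", 0.0), ("late", +40.0)]:
    errs = count_errors(off, 100_000)
    print(f"{label:8s} offset {off:+6.1f} ps: {errs} errors / 100000 bits")
```

Sweeping the offset until errors appear, as the text describes, recovers the timing margin directly: the nominal sample point should be error-free while the early/late points begin to fail as they approach the jittered eye edge.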
Normally for clock and data recovery you have a budget for all the effects of amplitude and phase distortion. In logic systems this is measured as window margin at the slicer input, seen in the eye pattern as margin versus number of bits for a particular pattern.
- In RLL codes, the minimum and maximum run lengths see different group delays in the channel, which must be accounted for in the design. This can be minimized with raised-cosine channel filters.
- The data slicer will have asymmetry that contributes to margin loss.
- The clock itself has jitter. This can be minimized with PLL loop-filter compensation to suppress random noise, jitter, and offset.
- Mismatched impedances produce amplitude echoes and ripple, such as in your plots. This can be minimized with an active terminator or pull-up/pull-down resistors matched to the channel Zo, unless there are excessive stubs in the MIMO channel.
- There are asymmetric propagation delays caused by asymmetric impedances or push/pull drive currents, among other root causes. These show up in different worst-case data patterns, and each RLL code has different worst-case patterns.
- Phase-margin analyzers come in many types depending on the bandwidth, bit rate, and protocol.
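The budget items listed above can be combined in the usual way: bounded, pattern-dependent (deterministic) terms add linearly, while independent Gaussian (random) terms add root-sum-square and are scaled by a Q factor for the target BER. The sketch below is a hypothetical budget with made-up numbers, only to show the arithmetic:

```python
import math

# Illustrative budget only - every number here is an assumption.
UI_PS = 200.0  # one bit period (unit interval)

deterministic_ps = {           # bounded, pattern-dependent terms (pk-pk)
    "ISI / group delay":     18.0,
    "slicer asymmetry":       6.0,
    "reflections / ripple":  10.0,
    "asymmetric prop delay":  8.0,
}
random_rms_ps = {              # independent Gaussian terms, 1 sigma each
    "PLL clock jitter": 3.0,
    "thermal noise":    2.0,
}

Q = 7.0  # roughly corresponds to a 1e-12 BER target for Gaussian jitter
dj_total = sum(deterministic_ps.values())                       # linear sum
rj_total = Q * 2 * math.sqrt(sum(v * v for v in random_rms_ps.values()))
margin = UI_PS - dj_total - rj_total                            # what's left

print(f"DJ {dj_total:.1f} ps + RJ(pk-pk) {rj_total:.1f} ps "
      f"-> window margin {margin:.1f} ps of {UI_PS:.0f} ps UI")
```

If the remaining margin is positive at the target Q, the design meets the budget; each bullet above then has a number attached that can be verified independently.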
Once you have a design budget and verify that the design meets it, the BER can be characterized versus the bit shift and projected in ps per decade against the number of bits, or as the uncorrected or corrected error rate based on the SNR and window margin.
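The ps-per-decade projection works by measuring BER at a few clock offsets where errors are plentiful, fitting log10(BER) versus offset as a straight line, and extrapolating to the target BER. A minimal sketch, with invented measurement points purely for illustration:

```python
import math

# Invented (offset from eye edge in ps, measured BER) pairs - not real data.
measured = [
    (0.0, 1e-2),
    (5.0, 1e-4),
    (10.0, 1e-6),
]

# Least-squares fit of log10(BER) = a + b * offset.
n = len(measured)
sx = sum(x for x, _ in measured)
sy = sum(math.log10(y) for _, y in measured)
sxx = sum(x * x for x, _ in measured)
sxy = sum(x * math.log10(y) for x, y in measured)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

ps_per_decade = -1.0 / b           # extra timing margin per decade of BER
offset_at_1e12 = (-12.0 - a) / b   # projected offset where BER hits 1e-12

print(f"slope: {ps_per_decade:.2f} ps/decade, "
      f"BER 1e-12 projected at {offset_at_1e12:.1f} ps from the eye edge")
```

This is the same bathtub-curve extrapolation a BERT scan performs: measuring at BERs you can reach quickly, then projecting down to rates too slow to measure directly.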