faisal78
Member level 3

Hi
I am interested in the details and technicalities of automated memory controller phase tuning. As eMMC/DDR memories gain speed, the timing constraints and signal integrity become more and more critical. I have some information from JEDEC, but the details are not clear on how the memory controller in an Application Processor "tunes" the DLL Tx and Rx phases to the ideal spots...
Here is what I understood; experts, please correct me...
1. The memory controller has 16 DLL phase tuning points for a particular IO driver strength.
2. A fixed data pattern is used:
Drive strength: DLL Phase <0..15>
where P = pass, F=fail
2mA : F,F,F,F,F,F,P,P,P,F,F,F,F,F,F,F
4mA : F,F,F,F,F,P,P,P,P,P,P,F,F,F,F,F
6mA : F,F,P,P,P,P,P,P,P,P,P,F,F,F,F,F
8mA : F,F,F,F,F,P,P,P,P,P,F,F,F,F,F,F
10mA : F,F,F,F,F,F,P,P,P,F,F,F,F,F,F,F
So the memory controller would start at a default drive strength of 2mA, write a fixed test pattern to the memory device, and the memory device would somehow compare it against an internal pattern and reply pass or fail...
This repeats for each of the 0..15 DLL phase delays.
Then the next IO driver strength is run.
Eventually a pass/fail array is achieved and a center point is chosen.
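The sweep-and-center step described above can be sketched in a few lines. This is a hypothetical illustration of the selection logic, not actual controller firmware: for each drive strength, find the widest contiguous run of passing phase taps, then pick the midpoint of the widest window overall.

```python
# Hypothetical sketch of the center-point selection described above.
# results: dict mapping drive strength (mA) -> list of 16 booleans,
# True = the test pattern read back correctly at that DLL phase tap.

def best_phase(results):
    """Return (drive_strength, center_tap) of the widest passing window."""
    best = None  # (window_width, drive_strength, center_tap)
    for strength, taps in results.items():
        width, start, run_start = 0, 0, None
        # Append a False sentinel so a window ending at tap 15 is closed.
        for i, ok in enumerate(list(taps) + [False]):
            if ok and run_start is None:
                run_start = i                       # window opens
            elif not ok and run_start is not None:
                if i - run_start > width:           # window closes; keep widest
                    width, start = i - run_start, run_start
                run_start = None
        if width and (best is None or width > best[0]):
            best = (width, strength, start + width // 2)
    if best is None:
        raise RuntimeError("no passing phase window found")
    return best[1], best[2]

# The pass/fail table from the post (P=True, F=False):
table = {
    2:  [c == "P" for c in "FFFFFFPPPFFFFFFF"],
    4:  [c == "P" for c in "FFFFFPPPPPPFFFFF"],
    6:  [c == "P" for c in "FFPPPPPPPPPFFFFF"],
    8:  [c == "P" for c in "FFFFFPPPPPFFFFFF"],
    10: [c == "P" for c in "FFFFFFPPPFFFFFFF"],
}
print(best_phase(table))  # 6 mA has the widest window (taps 2..10), center tap 6
```

With the table above, the 6mA row passes at nine consecutive taps, so a controller following this logic would land on 6mA with the DLL phase centered in that run.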
Is this correct?