# Metastability and data loss

1. ## Metastability and data loss

Hi all,

After reading many posts on the forum, I have become confused.
My understanding of metastability is that it causes the output to be X, meaning it can take any value.
With a high MTBF, we have a high probability of avoiding metastability.
My question is: does this mean we will get a correct output value? Or will it just not be an intermediate value between "0" and "1", but could still be a wrong value?
If the output value is wrong, it might still lead to data loss.

2. ## Re: Metastability and data loss

Originally Posted by sherline123
Hi all,

After reading many posts on the forum, I have become confused.
My understanding of metastability is that it causes the output to be X, meaning it can take any value.
With a high MTBF, we have a high probability of avoiding metastability.
My question is: does this mean we will get a correct output value? Or will it just not be an intermediate value between "0" and "1", but could still be a wrong value?
If the output value is wrong, it might still lead to data loss.

Let me try to simplify things. An output going to X is just a simulation artifact. In silicon, the output always settles to a definite value after passing through intermediate levels between 0 and 1. With synchronizers, FIFOs, or other mechanisms that mitigate metastability, you cannot guarantee a correct output, but you can make sure the output is stable. Generally, when you are crossing clock domains, the sender should keep the data stable for a certain number of clock cycles so it can be sampled by the receiver.

The golden rule is:
A synchronizer adds delay, but you get a stable value at the output.

As a result of the above golden rule, downstream logic will be safe.
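
The "delay but stable output" behavior of a two-flop synchronizer can be sketched in a small behavioral model. This is a minimal simulation, not real hardware: the assumption is that a sample taken during an input transition (a setup/hold violation) resolves randomly to 0 or 1 in the first flop, while the second flop only ever passes on a settled value.

```python
import random

def two_flop_sync(samples, violation_flags, seed=0):
    """Behavioral model of a 2-stage synchronizer. `samples` is the async
    input seen at each destination clock edge; `violation_flags[i]` is True
    when that sample landed on an input transition (setup/hold violated).
    A violated sample resolves randomly to 0 or 1 in the first flop, but
    the second flop always outputs a settled value, one cycle later."""
    rng = random.Random(seed)
    ff1 = 0  # first synchronizer flop (may capture an ambiguous sample)
    ff2 = 0  # second flop: its output is always a settled 0 or 1
    out = []
    for bit, violated in zip(samples, violation_flags):
        ff2 = ff1                      # second stage copies a settled value
        ff1 = rng.choice((0, 1)) if violated else bit
        out.append(ff2)
    return out

# The third sample lands on a transition, so it may come out old or new,
# but every output is a clean 0 or 1 (never X), one extra cycle late.
result = two_flop_sync([0, 1, 1, 0, 0], [False, False, True, False, False])
```

The point of the sketch is the golden rule above: the violated sample may end up as either the old or the new value, but downstream logic never sees an undefined level.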

3. ## Re: Metastability and data loss

You should ask yourself: what do you consider wrong data?

1. Single-bit data, toggling at arbitrary times. You'll read either the old or the new data. Both are equally correct, just different.

2. A multi-bit entity. The individual bits can be expected to have a certain skew. If you happen to sample a data word while the value is changing, you may get a wrong data value. Example: 0x7ff changing to 0x800. You may possibly sample 0x000, 0xfff or any value in between. Multi-bit data must be transferred consistently, e.g. by using a clock-domain-crossing FIFO or some kind of handshake.
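
The 0x7ff-to-0x800 example can be made concrete with a small enumeration. The assumption (as in the post above) is that each bit of the word is independently caught as either its old or its new value when the word is sampled mid-transition:

```python
import itertools

OLD, NEW, WIDTH = 0x7FF, 0x800, 12

def possible_samples(old, new, width):
    """Enumerate every word that could be sampled if each bit
    independently resolves to its old or its new value."""
    words = set()
    for picks in itertools.product((0, 1), repeat=width):
        w = 0
        for i, pick in enumerate(picks):
            src = new if pick else old
            w |= src & (1 << i)   # take bit i from either old or new word
        words.add(w)
    return words

samples = possible_samples(OLD, NEW, WIDTH)
# Because every one of the 12 bits differs between 0x7ff and 0x800,
# any 12-bit word is a possible sample, including 0x000 and 0xfff.
```

This is the worst case precisely because all bits flip at once; it is why multi-bit values need a CDC FIFO, handshake, or Gray-coded encoding rather than plain per-bit synchronizers.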


4. ## Re: Metastability and data loss

Originally Posted by FvM
You should ask yourself: what do you consider wrong data?

1. Single-bit data, toggling at arbitrary times. You'll read either the old or the new data. Both are equally correct, just different.

2. A multi-bit entity. The individual bits can be expected to have a certain skew. If you happen to sample a data word while the value is changing, you may get a wrong data value. Example: 0x7ff changing to 0x800. You may possibly sample 0x000, 0xfff or any value in between. Multi-bit data must be transferred consistently, e.g. by using a clock-domain-crossing FIFO or some kind of handshake.
Wrong data to me is unexpected output. Let's say I am sending a '1' into the synchronizer and the output is '0'. To me, that is a wrong output value.
The reason I am asking this question is that IF the input data frequency differs from the destination clock frequency, then when you fail to sample the data you simply miss it, and that leads to data loss.

5. ## Re: Metastability and data loss

Hi,

Wrong data to me is unexpected output. Let's say I am sending a '1' into the synchronizer and the output is '0'. To me, that is a wrong output value.
X means the state is undefined = unknown.
So if you send a "1" and you can't be sure the output is "1", then the output has to be considered "false".
It doesn't matter whether the output really is "0", something "intermediate" (which in the end will still be interpreted as "0" or "1"), or by accident a "1" ... you can't rely on the output.

The reason I am asking this question is that IF the input data frequency differs from the destination clock frequency, then when you fail to sample the data you simply miss it, and that leads to data loss.
This has nothing to do with "metastability". This is rather "undersampling".

Thus your target clock frequency has to be synchronous to the data, or, according to Nyquist, higher than twice the data rate.
The details depend on the interface and its specification.

Klaus

6. ## Re: Metastability and data loss

Wrong data to me is unexpected output. Let's say I am sending a '1' into the synchronizer and the output is '0'.
That's not possible.


7. ## Re: Metastability and data loss

Originally Posted by FvM
That's not possible.
But when metastability happens, then it is possible?

Originally Posted by KlausST
Hi,

X means the state is undefined = unknown.
So if you send a "1" and you can't be sure the output is "1", then the output has to be considered "false".
It doesn't matter whether the output really is "0", something "intermediate" (which in the end will still be interpreted as "0" or "1"), or by accident a "1" ... you can't rely on the output.

This has nothing to do with "metastability". This is rather "undersampling".

Thus your target clock frequency has to be synchronous to the data, or, according to Nyquist, higher than twice the data rate.
The details depend on the interface and its specification.

Klaus
This is what I am asking. A synchronizer only guarantees that the output is a defined signal (0 or 1), but it does not guarantee that it is the correct output.
If metastability happens, the output can still be wrong, but it is a defined signal. Is this correct?

8. ## Re: Metastability and data loss

Hi,

If metastability happens, the output can still be wrong, but it is a defined signal. Is this correct?
No, metastability = X = undefined

But a synchronizer output should be considered a defined logic level.
AFAIK a synchronizer cannot eliminate metastability at its output completely, 100%, but it greatly reduces the chance of metastability.
Many papers recommend a two-stage synchronizer to bring the chance of metastability very close to zero.

Klaus
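
That "very close to zero" can be quantified with the standard flip-flop MTBF model, MTBF = exp(t_r / tau) / (W * f_clk * f_data). The device constants below (tau, metastability window W) are made-up illustrative values, not from any real datasheet; the point is only how the extra resolution time of a second stage enters the exponent.

```python
import math

def mtbf(t_r, tau, window, f_clk, f_data):
    """Mean time between synchronization failures, in seconds.
    t_r    : time available for the flop to resolve before it is re-sampled
    tau    : regeneration time constant of the flop (assumed)
    window : metastability aperture around the clock edge (assumed)"""
    return math.exp(t_r / tau) / (window * f_clk * f_data)

tau, window = 50e-12, 100e-12        # assumed device constants
f_clk, f_data = 100e6, 10e6          # 100 MHz clock, 10 MHz async data

one_stage = mtbf(0.5e-9, tau, window, f_clk, f_data)   # ~0.5 ns of slack
two_stage = mtbf(10.5e-9, tau, window, f_clk, f_data)  # plus a full 10 ns cycle
# The second stage multiplies the exponent: MTBF jumps from a fraction of
# a second to an astronomically long time, which is why two (or more)
# stages are the standard recommendation.
```

With these assumed numbers, a single stage fails roughly every few hundred milliseconds, while the two-stage version is safe for far longer than the lifetime of the device.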

9. ## Re: Metastability and data loss

Originally Posted by KlausST
Hi,

No, metastability = X = undefined

But a synchronizer output should be considered a defined logic level.
AFAIK a synchronizer cannot eliminate metastability at its output completely, 100%, but it greatly reduces the chance of metastability.
Many papers recommend a two-stage synchronizer to bring the chance of metastability very close to zero.

Klaus
Okay, sorry, I overlooked that when typing. A metastable output will be X.
A synchronizer only lowers the probability of metastability, but the output can still be wrong (as a defined logic level).
Is this correct?

10. ## Re: Metastability and data loss

Hi,

Metastability is caused by violating setup and hold timing.
One may avoid it by using a synchronous signal, but this may be impossible.

Then yes: reduce the chance of metastability by using synchronizers.

But you may also use oversampling and filtering techniques in a way that completely avoids wrong signals.
With two signals of known, different clock frequencies, there is no chance that metastability happens on two consecutive clock edges.

Metastability does not happen randomly (although it sometimes may look like it).
Metastability does not happen while the input signal is HIGH or LOW.
Metastability happens only when the input signal is in transition between HIGH and LOW while the flip-flop's active clock edge also arrives.

Draw this situation on paper and let random people decide what the expected output is. Some will say HIGH, some will say LOW, some will say "don't know". --> There is no "right" and "wrong" ... but "don't know" is not acceptable in a digital system.
The synchronizer just reduces the "don't know" to almost zero. But there is still the "HIGH or LOW" ambiguity. Your receiver should be able to handle it.

Klaus

1 members found this post helpful.
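
The oversampling-and-filtering idea above can be sketched as a simple majority-vote filter. This is an illustrative software model, assuming that a single ambiguous sample near a transition can be outvoted by its stable neighbors:

```python
from collections import deque

def majority_filter(samples, depth=3):
    """Debounce-style filter: the output follows the majority of the last
    `depth` raw samples, so one bad sample near a transition is outvoted."""
    window = deque([0] * depth, maxlen=depth)
    out = []
    for s in samples:
        window.append(s)
        out.append(1 if sum(window) * 2 > depth else 0)
    return out

# A 0->1 transition with one ambiguous sample (here captured as 0) in it:
raw = [0, 0, 0, 1, 0, 1, 1, 1, 1]
filtered = majority_filter(raw)
# filtered == [0, 0, 0, 0, 0, 1, 1, 1, 1]: the glitchy sample is
# outvoted and the output makes a single, clean 0->1 transition,
# a couple of samples later than the raw edge.
```

Just as the post says, the filter cannot tell you whether the edge "really" happened at the earlier or the later sample, but it removes the "don't know" case entirely; the receiver only ever sees one clean transition.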

11. ## Re: Metastability and data loss

Originally Posted by sherline123
Okay, sorry, I overlooked that when typing. A metastable output will be X.
A synchronizer only lowers the probability of metastability, but the output can still be wrong (as a defined logic level).
Is this correct?
Yes. Even with a synchronizer, if bits are sampled as they're changing, you'll randomly end up with either the old value or the new value.

In a multi-bit vector you can end up with a mix of old bits and new bits, giving you a value that's 'wrong' (it is neither the old value nor the new value).

1 members found this post helpful.

12. ## Re: Metastability and data loss

Asynchronously sampled single-bit data won't be incorrect. By the nature of asynchronous processing, the data changes at arbitrary times relative to the sample clock. Accordingly, you don't know in advance whether a value sampled near a transition is the old one or the new one.

A simple example is an asynchronously sampled UART signal. The sample rate must be high enough to reproduce the bit stream, e.g. four times the bit rate. Samples near the edges may be either low or high; you still get the correct bit pattern.
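
The UART example can be sketched as follows. The assumption is a 4x-oversampled line where the sample nearest each bit edge may read wrong, while samples near the middle of each bit cell are stable, so picking the mid-cell sample recovers the correct pattern:

```python
def recover_bits(samples, oversample=4):
    """Take the sample closest to the middle of each bit cell."""
    return [samples[i + oversample // 2]
            for i in range(0, len(samples) - oversample // 2, oversample)]

bits = [1, 0, 0, 1]

# Build a 4x-oversampled stream where the first sample of each bit cell
# (the one nearest the edge) is deliberately corrupted:
stream = []
for b in bits:
    stream.extend([1 - b, b, b, b])   # edge sample wrong, rest stable

# recover_bits(stream) picks the mid-cell samples and returns [1, 0, 0, 1]:
# the corrupted edge samples never matter.
```

A real UART receiver additionally hunts for the start-bit edge to phase-align the mid-cell sample point; this sketch only shows why edge-sample ambiguity is harmless once the sample point sits mid-bit.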


13. ## Re: Metastability and data loss

I'm not sure how to understand multi-bit data if I can't even understand how single-bit data works.