s.b.f
Newbie level 6
Joined Jun 14, 2004
I am trying to design a clip-detector circuit that can detect either voltage or current clipping. The detector is needed for a differential driver that can deliver up to 2 App of current at 8 Vpp output swing. The supply is 12 V, and the driver is band-limited to 70 MHz; however, the input signal never exceeds 20 MHz.
First, we tried attenuating the output voltage and comparing it with the input signal to look for any inconsistency between the two (a difference within a predefined range was accepted). The problem with this structure is that the gain of the driver varies with load and frequency, so it is not feasible to do this with a fixed gain/attenuator stage.
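For what it's worth, the attenuate-and-compare scheme above can be sketched as a simple behavioral model. Everything here is an illustrative assumption, not a value from the actual design: the nominal gain, clip level, tolerance band, and test amplitude are made up, and the driver is modeled as an ideal hard clipper.

```python
import numpy as np

# Behavioral sketch of the attenuate-and-compare clip check.
# All constants below are illustrative assumptions.
FS = 1e9        # simulation sample rate, 1 GS/s
F_IN = 20e6     # 20 MHz test tone (the stated maximum input frequency)
GAIN = 4.0      # assumed nominal driver gain (e.g. 8 Vpp out from 2 Vpp in)
CLIP_V = 3.5    # assumed output clip level in volts
TOL = 0.05      # accepted relative mismatch between input and scaled output

t = np.arange(0, 1e-6, 1 / FS)
vin = 1.2 * np.sin(2 * np.pi * F_IN * t)       # drive hard enough to clip
vout = np.clip(GAIN * vin, -CLIP_V, CLIP_V)    # idealized clipping driver

# Attenuate the output by the nominal gain and compare with the input;
# samples whose mismatch exceeds the tolerance band are flagged.
err = np.abs(vout / GAIN - vin)
clipped = err > TOL * np.max(np.abs(vin))
print("clipping detected:", bool(clipped.any()))
```

Note that this model fails for exactly the reason described in the paragraph above: `GAIN` is a fixed constant, while the real driver's gain varies with load and frequency, so the comparison drifts out of the tolerance band even without clipping.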
I was wondering whether there is an alternative way to detect voltage or current clipping. I would appreciate any help in this regard.
Thanks
S.B.F