zorro
Advanced Member level 4
jasmin_123 said: Hi, zorro,
Why does the output not necessarily have to grow? If a signal at some frequency is growing while traveling around the loop and stays in phase (G>1 at zero phase implies just this), then the output does necessarily have to grow.
---
The output always has to grow for G>1, P(G)=0. Why, for the same G>1, P(G)=0, does it grow in some cases and not grow in others? Why?
Jasmine
Hi Jasmine,
Some preliminaries (variations on my previous post):
The output remains stable if it is exactly the same as the input (G=1 exactly). As I said, if G is real and G>1 at some frequency, the output is no longer the same as the input, and that is not an equilibrium point.
The equilibrium point is at that complex frequency at which the output is exactly the same as the input at all times. If at some frequency it happens that G>1, then G=1 does not hold there and that is not an equilibrium point. Then, where is the equilibrium point? If there is one for some growing complex exponential (closed-loop poles in the RHP), the circuit is unstable; otherwise it is stable.
The fact that G>1 for some sinusoidal waveform guarantees that the equilibrium point is not there, but it does not guarantee that the equilibrium point is in the RHP.
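A small numerical sketch of that last point. The loop gain below is my own illustrative example (not from this thread): a conditionally stable loop L(s) = (s+1)²/s³ in a standard negative-feedback connection. At ω = 1 rad/s the signal comes back around the loop exactly in phase with gain 2 (i.e., G = 2 > 1 at zero phase), and yet every closed-loop pole sits in the left half-plane:

```python
import numpy as np

# Illustrative loop gain (my own example, not from the thread):
#   L(s) = (s+1)^2 / s^3, negative feedback, characteristic eq. 1 + L(s) = 0.
# At w = 1 rad/s the loop gain is -2: the returned signal is in phase with
# magnitude 2, i.e. "G > 1 at zero phase" in the notation of this thread.
w = 1.0
L = (1j * w + 1)**2 / (1j * w)**3
print(L)            # (-2+0j): gain 2 with the signal returning in phase

# Closed-loop denominator: s^3 + (s+1)^2 = s^3 + s^2 + 2s + 1
poles = np.roots([1, 1, 2, 1])
print(poles.real)   # all negative: no RHP pole, the loop is stable
```

So G>1 at zero phase rules out an equilibrium at that sinusoid, but the actual equilibrium (the closed-loop poles) can still be safely in the LHP.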
Now, let’s try to resolve the paradox: why does the output not necessarily have to grow?
This example will be helpful:
Once a student asked me, astonished, how an amplifier with gain –2 at all frequencies could be stable when its output is connected to its input. His reasoning was as follows:
Imagine that the input is 1. Then the output is –2. As it is connected to the input, the input becomes –2 and the output becomes +4, then –8, and so on…
The problem is that he was thinking in discrete time, not in continuous time. Implicitly he put a delay in the loop.
In continuous time, the only equilibrium point is output 0 (the solution of output = input at all times).
In discrete time with a one-sample delay (or in continuous time with a pure delay in the loop), the situation really would be unstable. Plot the root locus of such a simple system in discrete time with a one-sample delay: in open loop there is a pole at z=0. Increasing the gain moves the pole to the left, and for any G>1 (A<–1) the pole goes outside the unit circle => instability.
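The same gain with a one-sample delay can be simulated in a couple of lines; this is exactly the student's 1, –2, 4, –8 sequence:

```python
# One-sample delay in the loop: y[n] = A * y[n-1].
# The closed-loop pole is at z = A = -2, outside the unit circle -> unstable.
A, y = -2.0, 1.0
history = [y]
for _ in range(10):
    y = A * y                  # 1, -2, 4, -8, ...: doubles every sample
    history.append(y)
print(history[-1])             # (-2)^10 = 1024.0
```

So the divergent picture is correct for the delayed loop and wrong for the delay-free one; the delay is what turns G>1 into guaranteed growth.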
You are making that same kind of reasoning. You think something like this: “if some cycle has amplitude 1, then the following one has amplitude G, then G^2, and so on”.
You are placing an additional delay in the loop. That is what makes you think the output necessarily has to grow. It is very intuitive, but incorrect.
Is it clear?
Regards
Z