pifouille
Junior Member level 1
Hi everybody.
I'm trying to simulate a tuned amplifier in LTSpice in order to get a better grasp on its operation.
This is a first step in my studying of oscillators.
Anyway, I am getting weird results that I am not able to explain.
I would be very grateful if some of you guys could help.
I'd love to attach my .asc file but apparently, I can't. But here is a picture of the setup:
I am using the MOS models from cmosedu.com, available here; in particular the N_1u model.
There are actually several things I don't understand in the results I get from those simulations.
For now, I'm going to focus on the one that annoys me most: the result of the transient simulation for V(out)
depends (a lot) on the max timestep (dTmax, the 4th parameter of the .TRAN command line), as shown below.
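For reference, the only thing I change between runs is that fourth (dTmax) argument of the directive, something like this (with `<Tstop>` standing in for my actual stop time):

```
.tran 0 <Tstop> 0 1n
.tran 0 <Tstop> 0 50n
```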
With a max timestep of 1ns, I get:
Here, in the steady state, the peak-to-peak amplitude of V(out) is 11.95mV, which
is approximately what I expected. In my understanding, at resonance, the gain of the tuned amp
should be slightly higher than that of the plain common-source stage (which gives 11.76mV), since the DC level of Vds is higher.
Now, with a max timestep of 50ns (still 40 times shorter than the signal period), I get:
Not only is the transient part of the response quite different,
but the steady-state amplitude is now only 1.2mV!
Here is a zoom on the steady-state part (same plot as above, only zoomed):
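To convince myself that a high-Q tank really can lose amplitude to the integrator when the timestep is a non-negligible fraction of the period, I tried a toy simulation in Python. This is only a crude stand-in: I'm using backward Euler because its numerical damping is easy to demonstrate, whereas LTSpice's default integrator is modified trapezoidal, so the magnitude of the effect won't match; the 500kHz frequency is just my guess to match the "40 points per period at 50ns" numbers above.

```python
import math

def simulate_lc(dt, t_end, f0=500e3):
    """Backward-Euler integration of an ideal (lossless) LC resonance.

    Integrates x'' = -w^2 x starting from x=1, v=0 and returns the peak
    of |x| seen during the last full period before t_end. For an exact
    solution this peak would stay at 1.0 forever; backward Euler shrinks
    it by roughly 1/sqrt(1 + (w*dt)^2) per step.
    """
    w = 2 * math.pi * f0
    x, v = 1.0, 0.0
    n_steps = int(t_end / dt)
    last_period_start = t_end - 1.0 / f0
    peak = 0.0
    for n in range(n_steps):
        # Backward Euler on (x' = v, v' = -w^2 x), implicit step solved
        # in closed form: x_new * (1 + w^2 dt^2) = x + v*dt
        x_new = (x + v * dt) / (1.0 + (w * dt) ** 2)
        v = v - w ** 2 * x_new * dt
        x = x_new
        t = (n + 1) * dt
        if t >= last_period_start:
            peak = max(peak, abs(x))
    return peak

amp_fine = simulate_lc(dt=1e-9, t_end=10e-6)    # like dTmax = 1ns
amp_coarse = simulate_lc(dt=50e-9, t_end=10e-6) # like dTmax = 50ns
# The coarse run ends up with a much smaller steady amplitude,
# even though 50ns is still ~40 steps per period.
print(amp_fine, amp_coarse)
```

So purely numerical damping can eat most of the amplitude of an undamped resonance at 40 steps per period, which is at least qualitatively the kind of timestep dependence I'm seeing.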
Does anyone have any ideas about why this is happening? I am a bit confused.
Thanks all in advance.
Pif