Mike@Malta
Newbie level 3
Hello,
I am trying to design and simulate a linear ramp generator based on an op-amp, to be used as part of a timing circuit.
The basic block is an integrator. The input voltage is either 0 V or 5 V, supplied by a micro-controller.
When 5 V is applied, the output voltage should ramp down linearly until it reaches the ground rail and stay saturated there until the input returns to 0 V, at which point the output should saturate back to the positive rail.
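For reference, the ideal behavior I am expecting follows dVout/dt = -Vin/(R·C), i.e. the output should start ramping the moment the input steps, with no delay. Below is a minimal Python sketch of that ideal response; the R and C values are placeholders and not the ones in my actual schematic:

```python
# Ideal inverting-integrator ramp: dVout/dt = -Vin / (R * C)
# R and C are placeholder values, not the ones used in the attached schematic.
import numpy as np
import matplotlib.pyplot as plt

R = 100e3    # placeholder integrator resistor, ohms
C = 1e-6     # placeholder integrator capacitor, farads
VIN = 5.0    # step input from the micro-controller, volts
VCC = 5.0    # positive supply rail, volts

t = np.linspace(0, 0.2, 2000)         # time axis, seconds
vout = VCC - (VIN / (R * C)) * t      # ramp starts at the positive rail
vout = np.clip(vout, 0.0, VCC)        # output saturates at the ground rail

plt.plot(t * 1e3, vout)
plt.xlabel("time (ms)")
plt.ylabel("Vout (V)")
plt.title("Ideal integrator ramp (no start-up delay)")
plt.show()
```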
Attached is the circuit and simulation results (using TINA-TI).
[Attachment: Op-Amp Integrator.bmp]
[Attachment: Op-Amp Integrator Sim Results.bmp]
As you can see in the results, there is a delay between the input voltage changing and the output voltage starting to ramp.
I would like to understand where this delay comes from.
Any insights would be appreciated.