Often convergence errors occur when I have 3-terminal devices powered from one supply voltage while the base/gate is driven from a different voltage. In real life this is not necessarily an error, but the simulator gets confused.
In particular, your buck converter has a 12V supply, but the MOSFET gates are driven by 20V. There's a chance this causes the error. It would be easier if the simulator would tell us, wouldn't it?
Another possibility: MOSFET M2 is N-type, and its gate voltage is referenced to its source terminal. For it to turn on, its source needs to sit at a definitely lower voltage than the gate, but there are components intervening in the path from that source node to ground.
Furthermore, once it turns on it conducts 12V to the node below it, and that node rises to almost 12V, eating into the gate-source voltage. With real hardware we solve this by driving the gate with 20V... but how the simulator interprets it, only the simulator knows.
It may help if you use a P-MOS for the high side.
When the error is larger the pulse width should be larger, but here it is smaller for a larger error and becomes larger for a small error. I am not getting where to make the changes for the PWM; I tried changing the ramp amplitude, but that feels like trial and error. Are there any design equations for this?
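There are standard relations for a voltage-mode PWM modulator that take some of the trial and error out of this. A minimal sketch, with all component values below being illustrative assumptions rather than values from your schematic: the steady-state duty cycle of a buck is set by the voltage ratio, and the ramp amplitude sets the modulator gain (how much duty cycle you get per volt of error-amplifier output). Note also that if the error signal enters the comparator on the wrong input, duty cycle moves in the wrong direction, which matches the inverted behavior you describe.

```python
# Hedged sketch: basic voltage-mode PWM design relations for a buck converter.
# All numeric values are assumptions for illustration, not from the thread.

Vin = 12.0     # input voltage (V)
Vout = 5.0     # desired output voltage (V), assumed target
Vramp = 2.5    # peak-to-peak PWM ramp amplitude (V), assumed

# Ideal steady-state duty cycle of a buck converter: D = Vout / Vin
D = Vout / Vin

# The comparator outputs this duty cycle when the error-amplifier output
# sits at Vc above the ramp's valley:
Vc = D * Vramp

# Modulator gain: duty-cycle change per volt of control-voltage change.
# A larger ramp amplitude lowers this gain, which is one knob for
# taming oscillation (at the cost of slower response).
Kmod = 1.0 / Vramp

# Control-to-output DC gain of modulator plus power stage:
Gain_power = Vin / Vramp

print(D, Vc, Kmod, Gain_power)
```

So changing the ramp amplitude is not pure trial and error: it directly scales the loop gain by 1/Vramp, and the loop gain (together with the LC filter and the error-amplifier compensation) determines whether the output oscillates.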
I tried it the way you told me, but I did not get why applying the sense voltage to the non-inverting input matters.
Do I now need gate drivers for my design, or can I do without them? Is there a simple way to do this?
To be honest, I went on the old saying: 'If it doesn't work the way you hooked it up, then try reversing the connection.'
The inverting input is tricky that way.
An alternate solution is to switch the inputs at your second op amp instead. What you have then might agree better with your intuitive sense of how the two op amps cooperate.
Screenshot:
The big question mark is whether your MOSFETs will turn on fully in response to a 5V pulse. If they will, then you don't need to step it up.
The problem is that my circuit regulates well for these values, but when I make variations the output starts oscillating.
I read in a paper that for, say, a 12V/1.5V, 20A voltage regulator module, a single-stage synchronous buck requires a large output capacitor, which eats up too much space on the motherboard and makes the design impractical. The solution is to use a multiphase synchronous buck, which reduces the required output capacitance.
My question is: if you use a multiphase VRM to save board space, it will have more MOSFETs than the single-phase design. Won't that again increase the size?
Isn't the reduction in capacitor size cancelled by the increased number of MOSFETs?
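A small numeric sketch (my own illustration, with assumed values, not from the paper you mention) of why the capacitor shrinks so much with interleaving: the summed ripple current of N phase-shifted phases partially cancels and repeats at N times the switching frequency, so far less capacitance is needed for the same output voltage ripple. MOSFETs, by contrast, are small packages, which is why the trade usually still wins on board area.

```python
# Hedged sketch: numerically sum phase-shifted triangular inductor ripple
# currents to show interleaving cancellation. Values are assumptions.

def phase_ripple(t, D, T, amp):
    """Triangular ripple of one phase: ramps up for D*T, down for (1-D)*T."""
    tt = t % T
    if tt < D * T:
        return -amp / 2 + amp * tt / (D * T)
    return amp / 2 - amp * (tt - D * T) / ((1 - D) * T)

def summed_ripple_pp(n_phases, D, T=1.0, amp=1.0, samples=4000):
    """Peak-to-peak ripple of n phases interleaved by T/n."""
    vals = []
    for i in range(samples):
        t = i * T / samples
        vals.append(sum(phase_ripple(t + k * T / n_phases, D, T, amp)
                        for k in range(n_phases)))
    return max(vals) - min(vals)

D = 1.5 / 12.0                  # duty cycle of the 12V -> 1.5V VRM
r1 = summed_ripple_pp(1, D)     # single-phase peak-to-peak ripple
r4 = summed_ripple_pp(4, D)     # four interleaved phases: noticeably less
print(r1, r4)
```

On top of the smaller amplitude, the summed ripple repeats at four times the per-phase frequency, so the capacitor has to hold the output up for a much shorter interval; both effects shrink the required capacitance.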
What changes occur in the design when the load changes ?
In a multiphase buck converter, what is the need for giving phase-shifted signals to the high-side MOSFETs?
That way the supply does not need to provide a single high peak, but two lesser peaks. This method is useful if the supply current is limited due to internal impedance (represented by a 3 ohm resistor in the screenshot below).
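To make the "peaks" concrete, here is a rough sketch (assumed, idealized values): a buck draws current from the supply only during its on-time, so the input current is pulsed. One 20A phase pulls 20A pulses from the supply; two interleaved 10A phases pull staggered 10A pulses instead, halving the peak demand.

```python
# Hedged sketch: idealized pulsed input current of a buck, one phase vs.
# two interleaved phases. All values are illustrative assumptions.

D = 1.5 / 12.0      # duty cycle (12V -> 1.5V)
I_LOAD = 20.0       # total load current (A)
T = 1.0             # normalized switching period

def input_current(t, n_phases):
    """Supply current: each phase draws I_LOAD/n during its own on-time."""
    i = 0.0
    for k in range(n_phases):
        tt = (t - k * T / n_phases) % T
        if tt < D * T:
            i += I_LOAD / n_phases
    return i

samples = [j * T / 1000 for j in range(1000)]
peak_1ph = max(input_current(t, 1) for t in samples)
peak_2ph = max(input_current(t, 2) for t in samples)
print(peak_1ph, peak_2ph)  # the two-phase peak is half the single-phase peak
```

With a low duty cycle like this the two on-times never overlap, so the supply sees 10A peaks instead of one 20A peak, and the voltage drop across its internal impedance is halved.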
I did not get what "a single high peak" means. Please elaborate on that.
Also, what is the impact of the input and output voltages on the transient response of a buck converter?