
# MOSFET gate drive capacity

Status
Not open for further replies.

#### abhishek.2138

For calculating the gate-driver power supply capacity, I am using the equations below:

Average gate current from driver = peak source current × (MOSFET turn-on time / switching period)

Average gate voltage from driver = maximum gate voltage × (MOSFET turn-on time / switching period)

So, average power = average gate current × average gate voltage.

The peak gate source current flows only for nanoseconds, so the gate-drive power supply should be designed for the average power capacity.
Are these calculations correct for determining the gate-drive power supply capacity?

Hi,

There are many good application notes available on the internet.
They discuss this problem in detail. (Good ANs: the semiconductor manufacturers.)

I think none of your formulas is generally true.

> So, average power = average gate current × average gate voltage.

Take this as an example.
Let's consider a 10 V DC power supply, a switch at 50% duty cycle, and a 10 Ohm resistor (almost ideal parts).
Do you agree that the power that comes out of the power supply has to be the same as the power dissipated in the resistor? (Where else would the difference go?)
Hopefully yes.
Do you also agree
* that the current during ON is 10 V / 10 Ohms = 1 A?
* that the average of 1 A at 50% duty cycle is 0.5 A?
* that the average current through the resistor is the same as the current from the power supply?
* that the average voltage of the power supply is 10 V?
* that the average voltage across the resistor is 5 V?

So "average voltage × average current" for the power supply gives: 10 V × 0.5 A = 5 W (this one happens to be correct, because the supply voltage is constant),
while for the resistor it gives: 5 V × 0.5 A = 2.5 W.

You see there is a mismatch in power: the resistor really dissipates 5 W, not 2.5 W.
Even RMS values do not fix this (RMS gives the correct result for the resistor, but not for the power supply).

Klaus
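The 10 V / 50% duty / 10 Ohm example above can be checked numerically. A minimal sketch (Python, assuming an ideal switch and a finely sampled square wave):

```python
# Numerically check the 10 V / 50% duty / 10 Ohm example.
import numpy as np

V_supply = 10.0   # supply voltage, constant
R = 10.0          # load resistance, Ohms
duty = 0.5        # switch duty cycle

t = np.linspace(0, 1, 100_000, endpoint=False)   # one normalized period
on = t < duty
i = np.where(on, V_supply / R, 0.0)   # 1 A during ON, 0 A otherwise
v_R = np.where(on, V_supply, 0.0)     # resistor sees 10 V only during ON

# True average power: time-average of instantaneous v * i
p_supply = np.mean(V_supply * i)   # supply delivers at a constant 10 V
p_resistor = np.mean(v_R * i)      # resistor dissipation

# Naive "average V x average I" products
p_naive_supply = V_supply * np.mean(i)        # 10 V * 0.5 A = 5 W
p_naive_res = np.mean(v_R) * np.mean(i)       # 5 V * 0.5 A = 2.5 W

print(p_supply, p_resistor)          # both 5.0 W: power balances
print(p_naive_supply, p_naive_res)   # 5.0 W vs 2.5 W: the mismatch
```

Averaging voltage and current separately only works for the supply here because its voltage is constant; for the resistor the product of averages understates the real dissipation.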

### kabeer02

> Peak gate source current will flow only for nanoseconds, so to design the gate-drive power supply, the average power capacity is required.

The power sunk from the device gate into the gate driver is insignificant; what really matters is the peak current capability of the gate driver.
Coincidentally, there is another thread on this forum dealing with a similar issue.
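The peak current is set mainly by the drive voltage and the total resistance in the gate loop. A rough sketch (all component values here are hypothetical examples, not from this thread):

```python
# Rough peak gate-current estimate, assuming a simple resistive drive path.
# All values below are hypothetical datasheet-style numbers.
V_drive = 12.0   # gate-driver supply voltage, V
R_drv = 1.5      # driver output resistance, Ohms
R_gate = 2.2     # external gate resistor, Ohms
R_g_int = 1.0    # MOSFET internal gate resistance, Ohms

I_peak = V_drive / (R_drv + R_gate + R_g_int)
print(f"Peak source current ~ {I_peak:.2f} A")  # the driver must be rated for this
```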

> So "average voltage × average current" for the power supply gives: 10 V × 0.5 A = 5 W (this one happens to be correct).
So how do I find the average power supplied by the power supply? Is 5 W the average power?

P = Qg × Vdrv × fsw is the usual calculation for average gate-drive power (Qg × fsw alone gives the average gate current).
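With the datasheet total gate charge and the switching frequency, the average drive current and power fall out directly. A quick sketch with hypothetical example values:

```python
# Average gate-drive current and power from total gate charge.
# Example values are hypothetical, not from any specific device.
Qg = 60e-9     # total gate charge at the drive voltage, C (datasheet)
V_drv = 12.0   # gate-drive voltage, V
f_sw = 100e3   # switching frequency, Hz

I_avg = Qg * f_sw           # average gate current: 6 mA
P_avg = Qg * V_drv * f_sw   # average drive power: 72 mW
print(I_avg, P_avg)
```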

...plus the losses in the driver circuitry itself, which can add up at higher frequencies.

Some MOSFET drivers have substantial shoot-through current,
which adds to the Qg charge/discharge current.

Peak current flows only at the beginning of the transition;
the current waveform is somewhat triangular / "lumpy" (for example, it may
sit at a drive-limited level during the Miller plateau).

An accurate MOSFET and MOSFET-driver model will
serve you better than simple calculations.
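For intuition only, a first-order RC sketch of the gate current shows why the peak exists only at the start of the transition. It deliberately ignores the Miller plateau (which is exactly why a proper model is preferable), and all values are hypothetical:

```python
# First-order sketch: gate current into a fixed input capacitance Ciss
# through the total gate-loop resistance. This ignores the Miller plateau,
# so it only shows the initial peak and its exponential decay.
# All component values are hypothetical.
import math

V_drv = 12.0      # drive voltage, V
R_total = 4.7     # total gate-loop resistance, Ohms
Ciss = 3e-9       # input capacitance, F

tau = R_total * Ciss   # RC time constant, ~14 ns here
for t_ns in (0, 5, 10, 20, 40):
    i_g = (V_drv / R_total) * math.exp(-t_ns * 1e-9 / tau)
    print(f"t = {t_ns:3d} ns  i_g = {i_g:.2f} A")
```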
