chinuhark
Member level 5
I just went over the concept of power dissipation and thermal resistance calculations for MOSFETs, and opened a few IGBT datasheets to repeat the exercise for IGBTs.
Now for MOSFETs, at 24 V and 35 A with 20 kHz switching, I got losses of, say, 3 W max for many cheap MOSFETs (RDS(on) about 2 mΩ). So even a cheap heatsink, say 18 K/W, would put the device only about 54 K above ambient.
For the IGBTs I looked at, a typical value of total switching energy Eon+Eoff is 5 mJ. So switching losses at 4 kHz are 20 W. Add to that about 2 V × 5 A = 10 W of conduction losses and you have 30 W per IGBT...
Is this right, or am I making a mistake (hopefully)?
At 30 W, even a 3 K/W thermal resistance gives a 90 K rise, which would make the IGBT unusable.
What am I doing wrong?
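For reference, here is a minimal sketch of the arithmetic above in Python. All the numbers are the example figures from this post (2 mΩ, 5 mJ, 2 V, 18 K/W, 3 K/W), not datasheet values for any specific part, and the 2 V / 5 A conduction term is taken as given:

```python
# Rough loss/thermal estimates for a switching transistor.
# Values are the illustrative figures from the post above.

def conduction_loss_mosfet(i_rms, rds_on):
    """P = I^2 * R, MOSFET conducting in the ohmic region."""
    return i_rms ** 2 * rds_on

def switching_loss(e_on_plus_off, f_sw):
    """P = (Eon + Eoff) * fsw."""
    return e_on_plus_off * f_sw

def temp_rise(p_diss, r_th):
    """Delta-T above ambient = P * Rth (total junction-to-ambient)."""
    return p_diss * r_th

# MOSFET case: 35 A, 2 mOhm, assumed ~3 W total, 18 K/W heatsink
p_cond_mos = conduction_loss_mosfet(35, 0.002)   # about 2.45 W
print(p_cond_mos, temp_rise(3, 18))              # 3 W -> 54 K rise

# IGBT case: Eon+Eoff = 5 mJ at 4 kHz, plus 2 V x 5 A conduction
p_igbt = switching_loss(5e-3, 4e3) + 2 * 5       # 20 W + 10 W = 30 W
print(p_igbt, temp_rise(p_igbt, 3))              # 30 W on 3 K/W -> 90 K rise
```

This reproduces the numbers in the post: the MOSFET stays comfortable on a cheap heatsink, while the IGBT's 30 W on 3 K/W implies a 90 K rise.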