yanc
Newbie
Hi, I have a query regarding a dual MOSFET delay circuit I'm trying to implement.
I need to switch a 16 V, 5 A load using a 16 V control signal, and once switched there needs to be a 3 second delay before the MOSFET actually turns off.
I thought I could do this with an N- and P-channel MOSFET configuration in a dual-device package, with an R/C delay circuit on the gate. The circuit does work, but the time delay changes considerably depending on how long the MOSFET has been conducting. I have tried different capacitor types (tantalum and ceramic) and values with the same effect. Initially the time delay is approx 10 sec, but if the circuit is left on it gradually reduces to less than 3 sec.
I'm assuming this is due to the gate resistance changing in Q1A as it warms up. I didn't think this would be critical, as Q1B is the device doing the main switching, but maybe since they are in the same package there is enough thermal transfer between the two devices to have this effect?
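In case it helps to show the numbers I'm working from: here's a rough sketch of the RC discharge math behind the delay. All component values below are illustrative placeholders, not the actual parts in my circuit. It also shows why even a small leakage change matters: with a megohm-range bleed resistor, a few microamps of gate leakage acts like a parallel resistance and shortens the delay.

```python
import math

# Assumed (placeholder) values for illustration only:
V_start = 16.0   # initial gate-capacitor voltage (V), from the 16 V supply
V_th = 2.0       # assumed gate-source threshold where the MOSFET turns off (V)
R = 1.0e6        # assumed bleed resistor (ohms)
C = 1.5e-6       # assumed delay capacitor (farads)

# Capacitor discharging through R: v(t) = V_start * exp(-t / (R*C)).
# Solving v(t) = V_th for t gives the turn-off delay:
t_off = R * C * math.log(V_start / V_th)
print(f"predicted turn-off delay: {t_off:.2f} s")

# If temperature-dependent leakage adds an effective parallel resistance
# R_leak, the time constant drops to (R || R_leak) * C:
R_leak = 2.0e6   # assumed warm-device leakage path (ohms)
R_eff = (R * R_leak) / (R + R_leak)
t_off_warm = R_eff * C * math.log(V_start / V_th)
print(f"predicted delay with leakage: {t_off_warm:.2f} s")
```

With these placeholder numbers the delay drops by a third once the leakage path appears, which is the same direction of drift I'm seeing (long delay cold, shorter delay warm).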
Any help would be most appreciated..
Thanks