
Diode as temperature sensor

When using an Si PN diode as a temperature sensor, I think it would be best to keep the bias current low in order to minimize self-heating and the effect of the ohmic resistance. Is this correct?

For example, would biasing a 1N4148 at around 0.1mA be OK to sense a narrow temperature range, roughly 0-10 deg C? (I know there are dedicated temperature sensors but, for reasons that are not relevant here, it's more convenient for me to use a diode in this project.)
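
As a rough sanity check (these are assumed ballpark figures, not datasheet values), the self-heating and ohmic errors at 0.1 mA come out tiny:

```python
# Rough sanity check of self-heating and ohmic drop for a 1N4148-style diode
# biased at 0.1 mA.  All figures are assumptions, not datasheet values: Vf,
# the series resistance and the thermal resistance vary by part and package.

I_bias = 100e-6     # 0.1 mA bias current
Vf     = 0.55       # assumed forward voltage at this current, volts
R_s    = 1.0        # assumed ohmic (bulk + lead) resistance, ohms
R_th   = 400.0      # assumed junction-to-ambient thermal resistance, K/W

P_diss     = I_bias * Vf      # dissipated power, ~55 uW
self_heat  = P_diss * R_th    # temperature rise from self-heating, ~20 mK
ohmic_drop = I_bias * R_s     # voltage error from series resistance, ~0.1 mV

print(f"dissipation  : {P_diss*1e6:.0f} uW")
print(f"self-heating : {self_heat*1e3:.0f} mK")
print(f"ohmic drop   : {ohmic_drop*1e3:.2f} mV")
```

At roughly -2 mV/°C the 0.1 mV ohmic term corresponds to only about 0.05 °C, so both effects look negligible at this current.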
 

Are you using a constant current and measuring the voltage drop?
 

A constant current source and voltage measurement would be easiest to implement, but over a zero to 10 degree temperature range the voltage difference would be very small, maybe 25mV at best, and it would be within the spread of absolute forward voltage due to manufacturing variance. You would have to be very careful to eliminate the effects of temperature on the remainder of your circuit.

Brian.
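
To put rough numbers on that point (the tempco and the part-to-part spread below are assumed typical values, not guaranteed figures):

```python
# Back-of-envelope for the point above: signal size over a 0-10 degC span
# versus part-to-part spread.  Both figures are typical assumptions.

tempco    = -2.2e-3   # V per degC, typical for a silicon diode
span      = 10.0      # degC, the control range of interest
vf_spread = 50e-3     # V, assumed part-to-part Vf spread at a fixed current

signal = abs(tempco) * span
print(f"signal over span : {signal*1e3:.0f} mV")    # ~22 mV
print(f"assumed Vf spread: {vf_spread*1e3:.0f} mV") # larger than the signal
```

So the absolute set-point would need trimming or per-unit calibration, exactly as cautioned above.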
 

I should have made one thing clear: the circuit is for controlling, not for measuring, temperature. The control range doesn't have to have a linear scale, so the variation of forward voltage drop with temperature doesn't have to be perfectly linear.

I'll be using a TL431 both as the reference (through a voltage divider) and to bias the diode via a resistor. The voltage divider is variable for setting the desired temperature, and there's also a preset pot to set the range. The diode drop and the reference voltage go to a comparator that switches the controller with about ±1 deg of hysteresis.

I think a true constant current source would be superfluous given that linearity isn't required. Do you agree?
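
As a rough sketch of the numbers behind that arrangement (the 5 °C / 550 mV operating point, the comparator output swing and the resistor values below are all assumptions, not the actual design values):

```python
# Rough numbers for the comparator side: a divider from the 2.5 V TL431
# reference sets the threshold, the diode drop is the sense signal, and
# positive feedback gives the hysteresis.  All values are assumptions.

tempco      = -2.2e-3   # V per degC
vf_at_5C    = 0.55      # assumed diode drop at 5 degC and ~90 uA
v_out_swing = 5.0       # assumed comparator output swing, volts

def setpoint_voltage(t_set_c):
    """Divider voltage that should match the diode drop at t_set_c (degC)."""
    return vf_at_5C + tempco * (t_set_c - 5.0)

# +/-1 degC of hysteresis is a window of about 2 * 2.2 mV at the input.
hyst_window = abs(tempco) * 2.0

# With a feedback resistor Rf from the output into a threshold node of source
# resistance Rs, the window is roughly v_out_swing * Rs / (Rs + Rf).
Rs = 10e3                                   # assumed divider source resistance
Rf = v_out_swing * Rs / hyst_window - Rs    # resistor for the chosen window

print(f"set-point @ 2 degC : {setpoint_voltage(2)*1e3:.1f} mV")
print(f"hysteresis window  : {hyst_window*1e3:.1f} mV")
print(f"feedback resistor  : {Rf/1e6:.1f} Mohm")
```

The window is only a few millivolts, so the feedback resistor comes out in the megohm range; that is one reason the thresholds are so sensitive to any drift elsewhere in the circuit.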
 


I don't agree. Roughly speaking, you will find no perfectly linear relationship between the voltage drop and the temperature in any case.
So, even if you don't have strict specs, you should provide a constant current source to ensure that you are measuring (or using as the reference signal) the correct physical quantity.
 

A diode that is biased with an accurate current source will have a Vd related to the absolute temperature as follows.

Vd(T)=-2mV/C

So if we assume the biasing current is constant and accurate, Vd will fall with a roughly constant negative slope as the absolute temperature rises.
You can use this relationship in your system.
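
As a minimal sketch of how that figure would be applied (not from the post above: the tempco is only a slope, so a reference point is needed as well; the 550 mV at 5 °C used below is an assumption):

```python
# Minimal sketch: turning the -2 mV/degC slope into an explicit Vd(T) requires
# a reference point Vd(T_REF).  Both numbers below are assumptions.

VD_REF = 0.55      # assumed diode drop at the reference temperature, volts
T_REF  = 5.0       # reference temperature, degC
TEMPCO = -2.0e-3   # V per degC, the figure quoted above

def vd(temp_c):
    """Approximate diode drop at temp_c, linearised around T_REF."""
    return VD_REF + TEMPCO * (temp_c - T_REF)

for t in (0, 5, 10):
    print(f"Vd({t:>2} degC) ~ {vd(t)*1e3:.0f} mV")
```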
 

Let's do some sample calculations (with some simplifications) and see what we come up with.

Say we start out at the center temp of 5°C. The diode is biased through 22k from 2.5V (TL431). Assume a diode Vf of 550mV at that temp and current. Ibias = (2.5-0.55)V/22k = 88.6µA. (550mV is approximately what I got from actual measurements).

Let's say the diode has a tempco of -2.2mV/°C. Over a temperature range of ±5°C, the voltage drop across the diode, and therefore across the bias resistor, will change by ±11mV which translates to a change in bias current of ±0.5µA.

If my memory of basic semiconductor physics is correct, the diode should have a dynamic resistance of roughly 350 Ω at that temperature and current level. A variation of 0.5µA in bias current will then introduce an error of about 0.175mV. This results in an error of 0.08°C which is more than acceptable.

Can you spot any flaws in my logic?
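
For what it's worth, the same arithmetic in script form (the only added assumption is an ideality factor of about 1.3, chosen so that the small-signal formula r_d = n·kT/(q·I) lands near the 350 Ω quoted above):

```python
# Re-running the numbers above, with the dynamic resistance taken from the
# small-signal diode model r_d = n*k*T/(q*I).  The ideality factor n is an
# assumption chosen to land near the 350 ohm figure.

k_over_q = 8.617e-5   # V/K (Boltzmann constant over electron charge)
T        = 278.15     # 5 degC in kelvin
n        = 1.3        # assumed ideality factor for a small-signal diode
V_ref    = 2.5        # TL431 reference, volts
R_bias   = 22e3       # bias resistor, ohms
Vf       = 0.55       # measured diode drop at 5 degC, volts
tempco   = -2.2e-3    # V per degC

I_bias = (V_ref - Vf) / R_bias        # ~88.6 uA
dVf    = abs(tempco) * 5.0            # ~11 mV over a 5 degC excursion
dI     = dVf / R_bias                 # ~0.5 uA change in bias current
r_d    = n * k_over_q * T / I_bias    # ~350 ohm dynamic resistance
v_err  = dI * r_d                     # ~0.18 mV extra shift in Vf
t_err  = v_err / abs(tempco)          # ~0.08 degC equivalent error

print(f"I_bias = {I_bias*1e6:.1f} uA, r_d = {r_d:.0f} ohm")
print(f"error  = {v_err*1e3:.3f} mV -> {t_err:.2f} degC over a 5 degC swing")
```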
 


To expand on why I think a constant current source still matters:

1. A constant current source is essential because changes in current will cause a change in the voltage drop that will appear as an error signal.

2. Because the temp range is small, it is fairly accurate to treat the relationship as linear (proportional parts; I forget the exact theorem, but it holds if the function is well behaved). A quick numerical check is sketched below.

3. The diode body must be small, with a low heat capacity, or else it will not respond rapidly and you cannot correct until it is too late.

Of course it all depends on YOUR application.
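
On point 2, a quick numerical check (not from this thread, and under textbook assumptions: ideal-diode law with Is ∝ T³·exp(-Vg/VT), Vg = 1.12 V, calibrated to the 550 mV / 5 °C / ~89 µA operating point used earlier):

```python
# How far does Vf(T) deviate from a straight line over 0-10 degC?
# Ideal-diode model with Is(T) = C * T^3 * exp(-Vg/Vt), calibrated so that
# Vf = 550 mV at 5 degC and 88.6 uA.  Vg and the calibration are assumptions.
import math

k_over_q = 8.617e-5   # V/K
Vg       = 1.12       # silicon bandgap voltage, volts
I        = 88.6e-6    # bias current from the earlier calculation

def Vt(T):            # thermal voltage at T kelvin
    return k_over_q * T

# Calibrate the saturation-current prefactor from the 5 degC reference point.
T0, Vf0 = 278.15, 0.550
Is0 = I * math.exp(-Vf0 / Vt(T0))
C   = Is0 / (T0**3 * math.exp(-Vg / Vt(T0)))

def Vf(T):
    Is = C * T**3 * math.exp(-Vg / Vt(T))
    return Vt(T) * math.log(I / Is)

v0, v10 = Vf(273.15), Vf(283.15)
slope = (v10 - v0) / 10.0                 # roughly -2.3 mV/degC in this model
bow   = Vf(T0) - (v0 + v10) / 2.0         # deviation from linear interpolation
print(f"average slope : {slope*1e3:.2f} mV/degC")
print(f"mid-range bow : {bow*1e6:.0f} uV (~{abs(bow/slope)*1e3:.0f} mK)")
```

The curvature over such a narrow span works out to only a few millikelvin, which supports treating the relationship as linear here.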

- - - Updated - - -

...Vd(T)=-2mV/C....

The right hand side does not contain T explicitly.

- - - Updated - - -


Your analysis appears to be correct; the 22k is effectively the impedance of the constant-current source.

If you want to resolve 1°C, you need an accuracy of ~2mV, and your analysis is correct. But if you want to control the temperature to within 0.1°C, then it may not be adequate.
 

Your analysis appears to be correct; the 22k is effectively the impedance of the constant-current source.
That was my point. Supplying from a well-regulated voltage through 22k into a 350 Ω dynamic impedance is essentially a constant current.

If you want to resolve 1°C, you need an accuracy of ~2mV, and your analysis is correct. But if you want to control the temperature to within 0.1°C, then it may not be adequate.
As I said in my second post, the target temperature regulation is not very tight - ±1°C or at most ±0.5°C. The error introduced by the not-quite-constant nature of the current source is less than 0.02°C per 1°C change.
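
For reference, that figure checks out with the same assumed numbers as before:

```python
# Quick check of the "less than 0.02 degC per 1 degC" figure, using the same
# assumed values as earlier: 22k bias resistor, ~350 ohm dynamic resistance,
# -2.2 mV/degC tempco.

tempco = 2.2e-3    # V per degC (magnitude)
R_bias = 22e3      # ohms
r_d    = 350.0     # ohms

dI    = tempco / R_bias   # bias-current change per 1 degC of diode temperature
v_err = dI * r_d          # resulting extra shift in Vf
t_err = v_err / tempco    # same error expressed in degC
print(f"{t_err:.3f} degC of error per 1 degC of real change")   # ~0.016
```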
 

It would seem that a voltage bias for the diode through a resistor would be quite adequate for your application.
 
