
Determining Proper Voltage For Desired Temperature Utilizing RTD Sensor

Status
Not open for further replies.

Zak28

Hello, I will be designing a circuit which requires constant temperature, and I have chosen an RTD sensor with adequate characteristics. I do not know the arithmetic for determining the proper voltage for a desired temperature. Chosen sensor:

https://www.mouser.com/ds/2/719/PPG101C1-769683.pdf
 

The datasheet specifies 1 mA as the maximum applied current. Don't allow more than that to flow through the sensor; as a safety cushion, keep it under 0.5 mA. Therefore, always keep a resistor in line with it.

I do not know the arithmetic for determining proper voltage for desired temperature.

For a simple way to take readings, measure the voltage at the node between the RTD and the series resistor. This is not necessarily the most accurate method, nor the most linear. You'll need to try a few methods.
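As an illustrative sketch of that divider readout (the supply voltage and series resistor value here are assumptions for the example, not from the thread or the datasheet):

```python
# Voltage-divider readout sketch: series resistor on top, RTD on the bottom.
# V_SUPPLY and R_TOP are illustrative values chosen to keep the RTD current
# under the 0.5 mA safety cushion mentioned above.
V_SUPPLY = 5.0      # excitation voltage (V), assumed
R_TOP = 10_000.0    # series resistor (ohms), assumed

def divider_voltage(r_rtd):
    """Voltage at the node between the series resistor and the RTD."""
    return V_SUPPLY * r_rtd / (R_TOP + r_rtd)

def rtd_current(r_rtd):
    """Current through the RTD for this divider."""
    return V_SUPPLY / (R_TOP + r_rtd)

# Pt100 at 0 degC is 100 ohms:
print(divider_voltage(100.0))  # ~0.0495 V
print(rtd_current(100.0))      # ~0.495 mA, under the 0.5 mA cushion
```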
 

There are standard tables available for platinum RTDs.
Here is the correct one for a 100 ohm platinum device (Pt100):

**broken link removed**

Feed it with a very precise 1mA and the voltage generated will be 1mV per ohm.
In other words exactly 100mV at zero degrees Celsius.

You can then read the voltage straight off the chart for your desired temperature set point.
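If a chart isn't handy, the standard Pt100 table (for temperatures at or above 0 °C) can be reproduced from the IEC 60751 Callendar-Van Dusen coefficients; a minimal sketch:

```python
# Standard IEC 60751 coefficients for platinum RTDs, valid for T >= 0 degC:
# R(T) = R0 * (1 + A*T + B*T^2)
A = 3.9083e-3   # 1/degC
B = -5.775e-7   # 1/degC^2
R0 = 100.0      # Pt100: 100 ohms at 0 degC
I_EXC = 1.0e-3  # 1 mA excitation, as suggested above

def pt100_resistance(t_c):
    """Pt100 resistance in ohms for t_c >= 0 degC (IEC 60751)."""
    return R0 * (1 + A * t_c + B * t_c ** 2)

def setpoint_voltage(t_c):
    """Voltage across the RTD at 1 mA excitation: 1 mV per ohm."""
    return I_EXC * pt100_resistance(t_c)

print(setpoint_voltage(0.0))    # 0.100 V at 0 degC
print(setpoint_voltage(100.0))  # ~0.1385 V at 100 degC
```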
 

What is the maximum current for a Pt1000? Is it also 10 times that of the Pt100, i.e. 10 mA?
 

Hi,

not likely.

Because the current generates heat.

With 1mA @ 100 Ohms this is: P = I * I * R = 0.1mW
With 10mA @ 1000 Ohms this is 100mW, 1000 times the value before. --> Expect error in temperature reading.

To get the same dissipated power you should go for: I = sqrt( P / R) = sqrt ( 0.1mW / 1000 Ohms) = 316uA.

For a more precise answer you should look at Pt1000 datasheets and/or general Pt1000 information documents.

Klaus
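The dissipation comparison and the equal-power current above can be checked with a couple of lines:

```python
import math

def power_mw(i_a, r_ohm):
    """Dissipated power in mW: P = I^2 * R."""
    return i_a ** 2 * r_ohm * 1e3

def equal_power_current(p_mw, r_ohm):
    """Current (A) that dissipates the same power in a different resistance."""
    return math.sqrt(p_mw * 1e-3 / r_ohm)

print(power_mw(1e-3, 100.0))             # 0.1 mW: Pt100 at 1 mA
print(power_mw(10e-3, 1000.0))           # 100 mW: Pt1000 at 10 mA
print(equal_power_current(0.1, 1000.0))  # ~316 uA for the same 0.1 mW
```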
 

The data sheet under the following link does not provide information about maximum current.
It actually does by giving a self heating specification. It's your job to calculate a maximum current for the acceptable temperature error in your application.
 

What an acquaintance of mine did was to use the resistor divider method, where the bottom "resistor" was the RTD sensor and the top one was a precision, low-tempco 0.1% resistor. The top resistor was fed from the same reference voltage used for the 16-bit ADC. That way the output value was ratiometric rather than absolute.

Of course, the voltage output will be non-linear. But with an Excel spreadsheet you can create a calibration table which is fed to the microcontroller. A large table with many data points can be quite tedious, so my acquaintance only calculated it in 1 °C increments and then used a simple interpolation algorithm for the finer resolution.
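A sketch of that table-plus-interpolation approach (the reference resistor value is an assumption, and the IEC 60751 Pt coefficients are used for the sensor curve, valid for 0 °C and above):

```python
# Calibration-table approach: tabulate the divider ratio at 1 degC steps,
# then linearly interpolate between table entries at run time.
A = 3.9083e-3   # IEC 60751 Pt coefficients, T >= 0 degC
B = -5.775e-7
R0 = 1000.0     # Pt1000
R_REF = 1000.0  # precision top resistor (assumed value)

def ratio(t_c):
    """Ratiometric divider output V_out / V_ref = Rt / (Rref + Rt)."""
    rt = R0 * (1 + A * t_c + B * t_c ** 2)
    return rt / (R_REF + rt)

# Table: divider ratio at every whole degree from 0..200 degC
TABLE = [ratio(t) for t in range(201)]

def temperature(x):
    """Invert a measured ratio x by scanning the table and interpolating."""
    for t in range(200):
        if TABLE[t] <= x <= TABLE[t + 1]:
            frac = (x - TABLE[t]) / (TABLE[t + 1] - TABLE[t])
            return t + frac
    raise ValueError("ratio out of table range")

print(round(temperature(ratio(42.5)), 2))  # ~42.5
```

On a microcontroller the linear scan would typically be replaced by a binary search, but the interpolation step is the same.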
 

Hi,

What an acquaintance of mine did was to use the resistor divider method, where the bottom "resistor" was the RTD sensor and the top one was a precision, low-tempco 0.1% resistor. The top resistor was fed from the same reference voltage used for the 16-bit ADC. That way the output value was ratiometric rather than absolute.

Good design. I think this is the most precise method. And it is simple hardware, too.

Usually you have a V_ref error and a current error.
Both cancel out; only the resistor error remains. --> The absolute-value error may be calibrated out, and then only the drift error remains.
It will be less than the Pt1000 error.

For sure the table is some effort....

Klaus
 

There's a nice ADI application note about RTD linearization methods https://www.analog.com/media/en/technical-documentation/application-notes/AN709_0.pdf

The result is that piecewise linear interpolation with a moderate number of points, e.g. every 5 or 10 °C, gives superior performance over the often-used polynomial interpolation. The additional Rt/(Rt + Rref) characteristic of a ratiometric voltage divider configuration (also my favorite topology) should be inverted in software before applying the RTD linearization.
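The inversion from divider ratio back to resistance is one line of algebra; a minimal sketch (the reference resistor value is an assumption for the example):

```python
# Invert the ratiometric divider characteristic x = Rt / (Rt + Rref)
# to recover the RTD resistance before applying any linearization.
R_REF = 1000.0  # reference (top) resistor, assumed value

def ratio_to_resistance(x):
    """Solve x = Rt / (Rt + Rref) for Rt:  Rt = Rref * x / (1 - x)."""
    return R_REF * x / (1.0 - x)

# Round trip: a 1385 ohm Pt1000 (roughly 100 degC) comes back correctly
x = 1385.0 / (1385.0 + R_REF)
print(ratio_to_resistance(x))  # ~1385.0
```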
 

Sorry, I come up with the same question regarding biasing of a Pt1000. Which of the following methods is more practical for measuring temperature between -50 °C and 200 °C? I have to use approximately 2 m of cable for the Pt1000.

A 12 V battery source connected to a 1 kOhm resistor, with the other end of the resistor connected to the Pt1000, whose other terminal is grounded. The output voltage is measured across the Pt1000, later amplified with an operational amplifier, and fed to the microcontroller.

The other configuration is a 10 kOhm resistor in series with 10 V, which serves as an approximate 1 mA current source. Connect the Pt1000 to this current source and ground its other terminal. Measure the output across the Pt1000 and send it to the microcontroller via an operational amplifier in order to scale the signal to the full range of the ADC.
 

a resistor 10 K ohm with 10 V in series which serves as 1 mA current source
A resistor isn't a current source. The two configurations differ only in resistor value; both are basically the same voltage divider. 12 V / 1 kOhm gives too high a current for a Pt1000; see the previous discussion about self-heating.
 

The datasheet says 0.5 °C/mW self-heating. Could you please suggest which configuration I should use for the Pt1000 to measure temperature between -50 °C and 200 °C?
 

I already answered your question in post #10. I would use voltage divider configuration.
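Plugging the quoted 0.5 °C/mW figure into the two proposed bias circuits makes the difference concrete (the Pt1000 is taken as 1000 Ω, i.e. near 0 °C; the exact numbers shift with temperature):

```python
# Rough self-heating comparison of the two proposed bias circuits,
# using the 0.5 degC/mW self-heating figure quoted above.
SELF_HEAT = 0.5  # degC per mW, from the datasheet quote in the thread

def self_heating_error(v_supply, r_series, r_rtd=1000.0):
    """Temperature error (degC) caused by power dissipated in the RTD."""
    i = v_supply / (r_series + r_rtd)   # divider current
    p_mw = i ** 2 * r_rtd * 1e3         # power in the RTD, mW
    return SELF_HEAT * p_mw

print(self_heating_error(12.0, 1_000.0))   # ~18 degC   -> unusable
print(self_heating_error(10.0, 10_000.0))  # ~0.41 degC -> far better
```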
 
