Maker Pro

°K to °C conversion: hardware or software?


__frank__

I have to measure two temperatures:
the first one is outdoor and the
second one is indoor.

So I was thinking of using two very common
sensors: an AD590 for the outdoor and
an LM335 for the indoor.

The measurement will be managed by
a National Instruments USB DAQ card;
the temperature will be displayed on a PC.

The unit to use must be °C; the sensors
are in °K so, of course, I have to
make a conversion.

Then I have two options, the first
"software" and the second "hardware"

1 - Acquire the voltage related to °K
(10mV/°K) - for the AD590 I have
to use an I/V converter (of course) -
then convert it to °C with a simple
software statement in the source code
(see the sketch below)

2 - Acquire voltage "scaled" to
0...5V, -20°C->0V and 40°C->5V
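
For reference, the "simple software statement" in option 1 would be
something like this (just a rough Python sketch; the names are
placeholders and reading the voltage from the DAQ is left out):

# Minimal sketch of option 1: the K-to-C conversion is one statement.
MV_PER_K = 10.0  # both sensors give 10 mV per kelvin (AD590 via the I/V converter)

def volts_to_celsius(v_sensor):
    """Convert a 10 mV/K sensor voltage into degrees Celsius."""
    kelvin = v_sensor * 1000.0 / MV_PER_K   # volts -> millivolts -> kelvin
    return kelvin - 273.15

print(round(volts_to_celsius(2.9815), 2))   # 298.15 K reads as 25.0 C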

Thanks to Google I saw that the first solution
is rarely used while the second is more
common, but of course it involves more
components.

I was wondering which solution is better
and why the "software" solution is rarely used.

Thanks
 

Roger Hamlett

__frank__ said:
I have to measure two temperatures:
the first one is outdoor and the
second one is indoor.

So I was thinking of using two very common
sensors: an AD590 for the outdoor and
an LM335 for the indoor.

The measurement will be managed by
a National Instruments USB DAQ card;
the temperature will be displayed on a PC.

The unit to use must be °C; the sensors
are in °K so, of course, I have to
make a conversion.

Then I have two options, the first
"software" and the second "hardware"

1 - Acquire the voltage related to °K
(10mV/°K) - for the AD590 I have
to use an I/V converter (of course) -
then convert it to °C with a simple
software statement in the source code

2 - Acquire voltage "scaled" to
0...5V, -20°C->0V and 40°C->5V

Thanks to Google I saw that the first solution
is rarely used while the second is more
common, but of course it involves more
components.

I was wondering which solution is better
and why the "software" solution is rarely used.
Actually you use both.
No 2 _does not_ perform °K to °C conversion, but simply scales the output
range to make best use of the ADC. So the resulting reading from the ADC
will be '0' for -20, and a software subtraction or addition will then be
used to generate the required temperature reading. So the software
solution _is_ being used.
What is being done is that the I/V conversion factor is chosen to give an
output covering the full span of the ADC for the required reading range,
to make best use of the available resolution of the ADC. This is
independent of the need to scale the temperature. The resulting output
range then has a voltage subtracted, to bring the bottom of the used
range to the bottom of the ADC range. If (for instance) you have a 10-bit
ADC, you have potentially (ignoring ADC errors) 1024 'points' across the
required 60°C range. If instead you wanted to do the complete solution in
software, and set the I/V factor so that the maximum 40°C 'point' was at
5V, with 0K at 0V, you would only have about 196 points (1024 * 60/313)
across the same -20 to +40 range.
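
To put numbers on it (a rough Python check of the arithmetic, assuming
an ideal 10-bit, 0-5V ADC as above; nothing more):

ADC_COUNTS = 1024
SPAN_K = 60.0          # the -20 C ... +40 C range of interest

# Hardware-scaled case: the 60-degree span fills the whole ADC range.
points_scaled = ADC_COUNTS

# Software-only case: 0 K = 0 V and +40 C (313.15 K) = 5 V, so only part
# of the ADC range ever carries useful readings.
points_unscaled = ADC_COUNTS * SPAN_K / 313.15

print(points_scaled, round(points_unscaled))   # 1024 vs ~196 points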

Best Wishes
 
__frank__ said:
2 - Acquire voltage "scaled" to
0...5V, -20°C->0V and 40°C->5V

This is not used for conversion but for normalisation. We normalise our
input ranges because we want better, more precise measurement. For
example, at 10mV/K, assuming 0V represents 0K, 5V would represent 500K.
That's roughly 227°C. If what we want to measure never reaches that
temperature then we are wasting a lot of resolution on a range which
will never be measured. Software cannot rectify this since the data is
already sampled. How is the software supposed to know if 120K is really
120.2K or 119.9K if the hardware can only tell it 120K? Also, very few
things which we usually want to measure reach down to zero K. Again we
are wasting resolution by building hardware that can measure what will
never be measured. So, people 'scale' the hardware to pick an
appropriate range, for example:

-20C to 80C for room temperature
-40C to 10C for freezers
50C to 400C for ovens

then you scale again in software to convert to a human-readable form.
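
As a sketch of that second, software step (Python; the -20C to 80C room
range above is just the example being reused):

def scaled_volts_to_celsius(v, t_min=-20.0, t_max=80.0, v_full_scale=5.0):
    # Map a normalised 0...5 V reading back onto the range it was scaled for.
    return t_min + (v / v_full_scale) * (t_max - t_min)

print(scaled_volts_to_celsius(2.5))   # mid-scale -> 30.0 C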
 

Mike Harrison

I have to measure two temperatures:
the first one is outdoor and the
second one is indoor.

So I was thinking of using two very common
sensors: an AD590 for the outdoor and
an LM335 for the indoor.

The measurement will be managed by
a National Instruments USB DAQ card;
the temperature will be displayed on a PC.

The unit to use must be °C; the sensors
are in °K so, of course, I have to
make a conversion.

Then I have two options, the first
"software" and the second "hardware"

1 - Acquire the voltage related to °K
(10mV/°K) - for the AD590 I have
to use an I/V converter (of course) -
then convert it to °C with a simple
software statement in the source code

2 - Acquire voltage "scaled" to
0...5V, -20°C->0V and 40°C->5V

Thanks to Google I saw that the first solution
is rarely used while the second is more
common, but of course it involves more
components.

I was wondering which solution is better
and why the "software" solution is rarely used.

Because the datasheets & appnotes for the devices were probably written when software was expensive
due to the cost of the hardware to run it on, and most instruments were all analog and did not
contain microcontrollers. Doing it in hardware nowadays would be silly.
 

Spehro Pefhany

I have to measure two temperatures:
the first one is outdoor and the
second one is indoor.

So I was thinking of using two very common
sensors: an AD590 for the outdoor and
an LM335 for the indoor.

The measurement will be managed by
a National Instruments USB DAQ card;
the temperature will be displayed on a PC.

The unit to use must be °C; the sensors
are in °K so, of course, I have to
make a conversion.

Then I have two options, the first
"software" and the second "hardware"

1 - Acquire the voltage related to °K
(10mV/°K) - for the AD590 I have
to use an I/V converter (of course) -
then convert it to °C with a simple
software statement in the source code

2 - Acquire voltage "scaled" to
0...5V, -20°C->0V and 40°C->5V

Thanks to Google I saw that the first solution
is rarely used while the second is more
common, but of course it involves more
components.

I was wondering which solution is better
and why the "software" solution is rarely used.

Thanks

BTW, note that there's no '°' used with Kelvins.

Anyway, if you need a 60°C span, then you have a span of 600mV with
10mV/K. If you scale and offset the voltage with a good op-amp,
precision resistors, and an accurate and stable voltage reference, you
can have a span of 5000mV, so roughly eight times (close to an order of
magnitude) better resolution with a given ADC. OTOH, you have new
sources of error to consider.
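
For the -20°C to +40°C example, the gain and offset such a stage has to
realise work out roughly like this (just the arithmetic in Python, not a
circuit design; an ideal 10mV/K front end is assumed):

# Scale/offset numbers for a -20 C ... +40 C span into 0...5 V.
V_AT_MINUS_20C = (273.15 - 20.0) * 0.010   # 2.5315 V at 10 mV/K
V_AT_PLUS_40C  = (273.15 + 40.0) * 0.010   # 3.1315 V

gain = 5.0 / (V_AT_PLUS_40C - V_AT_MINUS_20C)   # 5 V / 0.6 V, about 8.3

def scaled(v_sensor):
    return gain * (v_sensor - V_AT_MINUS_20C)   # 0 V at -20 C, 5 V at +40 C

print(round(gain, 2), round(scaled(V_AT_PLUS_40C), 3))   # 8.33  5.0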

Whether it makes any difference for your application is something you
have to decide-- it's not a matter of taking a popularity poll. High
resolution (and low noise) helps in control applications in allowing
you to approach an analog circuit in performance. For example, if
you're implementing digital PID control and you have crap ADC
resolution then your derivative signal will be crap. You can filter it
so the actuator doesn't flap all over the place, but that has other
undesirable effects-- once the resolution is gone, it's gone.


Best regards,
Spehro Pefhany
 

John Woodgate

I read in sci.electronics.design that "[email protected]" wrote:
It's degrees Celcius but it's just Kelvins (without degrees).

Well, it's 'Celsius' and it does have the degree sign. But 'K' can have
it or not, and they mean different things. '100°K' is 100 degrees above
absolute zero, whereas '100 K' is simply any temperature difference of
100 kelvin.
 

John Woodgate

I read in sci.electronics.design that John Devereux wrote:
That is counter-intuitive to me. (It is also contrary to 3 years of a
physics degree course, where we were told never to write "°K").

I think they changed the rules at some point, to distinguish clearly
between absolute temperatures and temperature differences.
Surely it should be the other way around?

No, degrees ('degrees Kelvin') are associated with a *scale* - the Kelvin
scale in this case. Plain units are not associated with a scale, which
is why expressing temperature differences in degrees is incorrect.

There is a similar thing with weighted noise levels. The difference
between 80 dB A-weighted and 90 dB A-weighted is 10 dB, not 10 dB
A-weighted. It's not weighted at all; it's a constant 10 dB at all
frequencies.
 

John Woodgate

I read in sci.electronics.design that John Devereux wrote:
It still seems backwards to me.

I sympathise, without agreeing.
Kelvin is special because it is zero-based, such that Kelvins *are*
units. You can, for example, take the ratio of two Kelvin temperatures.

You are confusing the scale and the unit still. I don't find it easy to
explain. The kelvin is certainly a unit. The ratio of two temperature
differences is potentially just as meaningful, in an appropriate
context, as the ratio of two Kelvin temperatures.
So I don't see the distinction between a scale and a unit in this case.
The other temperature "scales" are not zero based, so are always
expressed relative to a base temperature, e.g. that of the triple point
of water.

I don't know of any scale based on that, but that's a side issue.

You can look at the matter like this. Any temperature difference can be
expressed in kelvins. But a temperature difference from absolute zero is
distinguished by having the ° sign included.
 
John said:
I read in sci.electronics.design that John Devereux


I sympathise, without agreeing.


You are confusing the scale and the unit still. I don't find it easy to
explain. The kelvin is certainly a unit. The ratio of two temperature
differences is potentially just as meaningful, in an appropriate
context, as the ratio of two Kelvin temperatures.


I don't know of any scale based on that, but that's a side issue.

You can look at the matter like this. Any temperature difference can be
expressed in kelvins. But a temperature difference from absolute zero is
distinguished by having the ° sign included.

OK but I've certainly never seen or heard of Celsius without the
'degrees' before. Even when talking about temperature differentials it's
always been degrees Celsius. But I have seen Kelvins without degrees.

Wikipedia seems to refer to Celsius as "degree Celsius":

http://en.wikipedia.org/wiki/Celsius

I don't think Celsius alone, without the 'degree', is a unit. But there
is a difference between degrees Celsius and Celsius degrees, which
unfortunately share the same SI notation. See:

http://www.islandnet.com/~see/weather/whys/tempconv.htm
 

John Woodgate

I read in sci.electronics.design that "[email protected]" wrote:
OK but I've certainly never seen or heard of Celsius without the
'degrees' before.

Did anyone say they had? Such a notation could have been introduced IF
the idea of distinguishing between temperature differences and actual
temperatures had preceded the move to the Kelvin scale and the kelvin
unit.
Even when talking about temperature differentials it's always been
degrees Celsius. But I have seen Kelvins without degrees.

Yes, and I tried to explain why.
Wikipedia seems to refer to Celsius as "degree Celsius":

Yes, which is not quite rigorous. But to make it rigorous, the concept
of a unit 'celsius' would have to be introduced, identical to the
kelvin.
http://en.wikipedia.org/wiki/Celsius

I don't think Celsius alone, without the 'degree' is a unit.

See the explanation above.
But there is a difference between degrees Celsius and Celsius degrees
which unfortunately has the same SI notation. See:

http://www.islandnet.com/~see/weather/whys/tempconv.htm

This (C°) is a notation used only (AFAIK) by meteorologists. Everyone
else uses K, except for Americans, of course. (;-)
 