Maker Pro

Energy conversion efficiency of lamps


Ingo Thies

Hi all,

In Wikipedia and many other places, the efficiency of artificial light
sources, e.g. incandescent bulbs, is stated in per cent and is thus
treated as an energy conversion efficiency. The typical values given for
an incandescent lamp (non-halogen, with a filament temperature of about
2700 K) are around 5%, with 95% being wasted as "heat".

I am familiar with the definitions of luminous efficacy and efficiency,
both well explained at [1]. According to this, however, a light bulb has
a luminous efficacy of about 12-15 lumens per watt, corresponding to a
luminous efficiency of about 2% (lumens per watt divided by the
theoretical maximum of 683 lm/W for monochromatic light near 555
nanometres). A value of five per cent never occurs; it would instead
imply that 100 per cent efficiency corresponds to somewhere between 240
and 300 lm/W. The maximum for a truncated Planck spectrum (i.e. one
clipped to the 400-700 nm range) does lie in this range [2], but it is
never clearly cited as such. Furthermore, this truncated-Planck figure
is ambiguous, since it strongly depends on the truncation wavelengths
and on the reference spectrum (the reference paper uses a 5800 K
blackbody and 400-700 nm).
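As a quick sanity check, the roughly 2% follows directly from the definition (a minimal Python sketch; the only constant assumed is the 683 lm/W photopic maximum):

```python
# Luminous efficiency = luminous efficacy / maximum possible efficacy.
K_MAX = 683.0  # lm/W, photopic maximum at 555 nm

for efficacy in (12.0, 15.0):  # typical incandescent range, lm/W
    efficiency = efficacy / K_MAX
    print(f"{efficacy:.0f} lm/W -> {efficiency * 100:.1f}%")  # 1.8% and 2.2%
```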

So, is there any "official" definition (maybe ISO) of the energy
conversion efficiency of a light source, and if so, how is it related to
luminous efficiency/efficacy?

[1] http://en.wikipedia.org/wiki/Luminous_efficacy
[2] http://physics.ucsd.edu/~tmurphy/papers/lumens-per-watt.pdf

Best wishes,

Ingo
 

boxman

Any standard photometric unit will be based on the SI base unit, the
candela. The candela refers to the CIE 1931 2° standard observer
photopic response function. A tabulation of that response curve (the y2
column) can be found here:

http://www.cis.rit.edu/research/mcsl2/online/CIE/StdObsFuncs.htm

Based on that, you can see that truncating to 400-700 nm does not
introduce a significant amount of error, so when talking about luminous
efficiency, the method of calculation used in [2] above would be
considered standard practice. In rigorous form, the calculation could
be performed over the interval of 380-780 nm, as that is the range
defined by the CIE; however, that is unlikely to change the calculated
result by any significant amount, given the extremely low responses at
the shortest and longest wavelengths.
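As an illustration of such a calculation (my own sketch, not standard code: it substitutes the well-known Gaussian approximation V(lambda) ~ 1.019*exp(-285.4*(lambda_um - 0.559)^2) for the tabulated CIE curve, so the result is only approximate):

```python
import math

H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e8        # speed of light, m/s
KB = 1.380649e-23       # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_exitance(lam_m, temp):
    """Blackbody spectral exitance at wavelength lam_m (metres), W/m^3."""
    return (2.0 * math.pi * H * C**2 / lam_m**5) / math.expm1(H * C / (lam_m * KB * temp))

def v_approx(lam_nm):
    """Gaussian approximation to the CIE 1931 photopic response V(lambda)."""
    lam_um = lam_nm / 1000.0
    return 1.019 * math.exp(-285.4 * (lam_um - 0.559) ** 2)

def luminous_efficacy(temp, lo_nm=380, hi_nm=780):
    """Luminous efficacy of blackbody radiation (lm/W), 1 nm Riemann sum."""
    lum = sum(683.0 * v_approx(i + 0.5) * planck_exitance((i + 0.5) * 1e-9, temp) * 1e-9
              for i in range(lo_nm, hi_nm))
    return lum / (SIGMA * temp**4)  # divide by total radiated power

print(f"2700 K blackbody: {luminous_efficacy(2700.0):.1f} lm/W")
```

For 2700 K this lands in the low teens of lm/W, consistent with the 12-15 lm/W quoted above for an incandescent filament.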

Granted, there are numerous arguments about whether the CIE 1931
response curve accurately describes the human response to light in all
situations, at all color temperatures, etc.; nevertheless, standard
photometric measurements must be reported as a function of that curve.

Hope that helps answer your question.
 

Ingo Thies

You are right that the difference in terms of total visual brightness
won't change much between [400:700] and [380:780] nanometres. But if a
truncated Planck curve within these intervals is taken as the reference
for "100% efficiency", the difference will be large, because the larger
interval contains much more light in the low-response regime of the
retina, i.e. light which contributes little to brightness but a great
deal to energy consumption.

I have tested this: the smaller interval, taken as 100%, would give
the light bulb an efficiency of about 5% (247 lm/W vs. 12.4 lm/W at
2700 K), while the larger interval gives it about 8% (148 lm/W vs.
12.4 lm/W; values from my own computation using full and truncated
Planck spectra and the CIE 1931 model). These percentages are, if I'm
not mistaken, equivalent to the ratios of the integrals of the
energetic flux (i.e. without the CIE weighting function); that is, a
2700 K blackbody emits about 5% of its power within 400-700 nm and
about 8% within 380-780 nm, further about 20% between 100 and 1000 nm
and, trivially, 100% between 0 and infinity.
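Those band fractions are easy to check numerically (again my own sketch: a plain 1 nm Riemann sum over the Planck exitance, with the total power taken from the Stefan-Boltzmann law):

```python
import math

H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e8        # speed of light, m/s
KB = 1.380649e-23       # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_exitance(lam_m, temp):
    """Blackbody spectral exitance at wavelength lam_m (metres), W/m^3."""
    return (2.0 * math.pi * H * C**2 / lam_m**5) / math.expm1(H * C / (lam_m * KB * temp))

def band_fraction(temp, lo_nm, hi_nm):
    """Fraction of the total radiated power emitted between lo_nm and hi_nm."""
    power = sum(planck_exitance((i + 0.5) * 1e-9, temp) * 1e-9
                for i in range(lo_nm, hi_nm))
    return power / (SIGMA * temp**4)

for lo, hi in ((400, 700), (380, 780)):
    print(f"{lo}-{hi} nm at 2700 K: {band_fraction(2700.0, lo, hi) * 100:.1f}%")
```

This reproduces the roughly 5% (400-700 nm) and 8% (380-780 nm) figures above.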

Even worse, one can easily increase the luminous efficacy by using a
non-ideal spectrum. For example, a 2700 K chromaticity can be matched
by two spectral lines at 445.7 and 580.4 nm with an amazing 516 lm/W
(and a color rendering index of about the elevation of Hell in
metres...).

But the worst problem is that the output quantity is not an energy
flux but a luminous flux, i.e. you simply can't define an energy
conversion factor of the form eta = P_out/P_in, because P_out has a
different dimension. Any percentage would require an arbitrary
definition like the examples above. There is no natural reason why 250
lm/W (rather than 150 lm/W or 683 lm/W) should be equivalent to
"100%".

So we can reduce the problem to this question: is the truncated Planck
spectrum over the 400-700 nm interval, which best fits the popular 5%
figure, an official standard for lighting efficiency?

Anyway, the value in lumens per watt is in any case the better choice,
since it does not depend on such an arbitrary reference spectrum.

Best wishes,

Ingo
 

boxman

The short answer is that I don't believe there is a standard. The
original experiments used to establish the photometric response curve
contained in the CIE 1931 standard limited the actual testing to
between 400 and 700 nm. The data outside of this range is the result
of fitting a curve model to the experimental data.

Standard color calculations are expected to be integrated over the
range of 380-780 nm, although I believe measurements have shown
responses from 360-830 nm. I suppose the key point is to apply the
same range whenever you make comparisons between different sources,
and to ask about the range used for the calculation when using other
people's efficiency data.

As for using combined monochromatic sources to increase the
efficiency: clearly that is a manipulation permitted by the
mathematical models used to describe the efficiency. In the end,
however, someone who wants to produce light that is useful and can be
marketed for a profit is unlikely to do well by manufacturing products
based on maximum-efficiency claims that disregard the other aspects of
the human response to light.
 