I was trying to establish whether the amps I was repairing could be
correctly rated at 100W rms output, variously stated as 150 or 180W
"music power" into 8 ohms, with a THD of 10 percent at maximum power.
What would the likely thermal loss in watts be for such a 100W rms rated
amp at, say, 20 and 50 watts rms output? As part of dummy-load testing I
was monitoring the vented output air temperature at different power
levels, for reference the next time I come across such amps.
I appreciate they are thermally more efficient, but I could not find what
temperatures to expect for a given ducted and vented cooling airflow.
Class D can be done quite efficiently - around 80 percent in a lot of
off-the-shelf designs - so I would not expect the dissipation to be much
more than 10-12 watts at 50 watts output. (Efficiency isn't flat over
the entire power range, but from around half power up to full power
you'll be in the sweet spot.)
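As a quick sanity check, here is a rough sketch of that arithmetic. It
assumes a flat 80 percent efficiency, which is a simplification, since
as noted the real figure varies with level:

# Heat dissipated = input power minus output power, assuming a flat
# 80 percent efficiency (illustrative - real efficiency varies with level).
def dissipation_watts(p_out_watts, efficiency=0.80):
    p_in = p_out_watts / efficiency
    return p_in - p_out_watts

for p_out in (20, 50, 100):
    print(f"{p_out} W out -> ~{dissipation_watts(p_out):.1f} W of heat")

That puts your 20 W and 50 W cases at roughly 5 W and 12.5 W of heat
respectively, and about 25 W at full rated output.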
Class-D efficiency can reach 90-98 percent in some applications, but most
real-world commercial audio amps seem to top out around 80 percent.
Vented air temperature by itself cannot be directly mapped into power
dissipation without a metric buttload of other parameters and thermal
and hydrodynamic/convection modeling. But if you want, just put a
power resistor in the same box and see how much power it's dissipating
when the vent temperature matches. It's not a perfect comparison, but
with so many variables in play it's about as good as you'll get.
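If you do know the airflow through the duct, you can at least get a
first-order figure for the expected vent temperature rise. Rough sketch
below - the constants are standard air properties, but the 5 CFM airflow
is just an illustrative assumption, and it pretends all the heat leaves
in the exhaust air rather than through the case:

# First-order vent temperature rise: P = rho * Q * cp * dT, solved for dT.
# Assumes all dissipated heat is carried away by the exhaust airflow.
RHO_AIR = 1.2     # kg/m^3, air density near room temperature
CP_AIR = 1005.0   # J/(kg*K), specific heat of air

def vent_temp_rise_c(p_dissipated_w, airflow_m3_per_s):
    mass_flow = RHO_AIR * airflow_m3_per_s         # kg/s
    return p_dissipated_w / (mass_flow * CP_AIR)   # degrees C above intake

# Illustrative only: 12.5 W of heat carried away by 5 CFM (~0.00236 m^3/s)
print(vent_temp_rise_c(12.5, 0.00236))             # about 4.4 C rise

In practice some heat escapes through the chassis, so the measured rise
will come in lower than this, which is part of why the resistor-in-the-
same-box comparison is the more trustworthy check.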
Tim.