Good answers, good answers….
I guess my question is: why is Ohm's law correct and so different from common sense? If I knew nothing about electronics, common sense would say that the power consumption of one circuit illuminating a light bulb and a second circuit illuminating a light bulb (albeit dimmer) *and* heating up a resistor (of the same resistance as the bulb) in series would be equal, or even more.
You need to upgrade your common sense. Ohm's law is very basic and
fundamental to understanding electronics. The resistance (or lack of
it) determines the amount of current that can flow in a circuit. The
voltage dropped across a resistive component multiplied by the current
flowing through it determines how much power it will dissipate.
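In code form, the two relations above look like this (the 6 V / 6 ohm
values are just illustrative, not from any particular circuit):

```python
def current(voltage, resistance):
    """Ohm's law: I = V / R."""
    return voltage / resistance

def power(voltage, current):
    """Power dissipated in a component: P = V * I."""
    return voltage * current

# Illustrative example: 6 V across a 6 ohm load
i = current(6.0, 6.0)   # 1.0 A
p = power(6.0, i)       # 6.0 W
print(i, p)
```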
At constant applied voltage the current is halved by adding an equal
value series resistance to your bulb (actually it isn't quite, because
at a lower current the cooler bulb filament offers less resistance). An
examiners' marking scheme once got this one wrong in an exam: I turned
up with a page of algebra and a colleague turned up with a plank with
the actual circuit nailed to it! It was agreed to give marks for both
the official marking scheme answer *and* the correct answer.
(ISTR the syllabus had been teaching that their answer was right!)
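As a rough sketch of the halving effect (treating the bulb as a fixed
linear resistance, which the aside above notes a real filament isn't):

```python
V = 6.0        # supply voltage (illustrative)
R_bulb = 6.0   # bulb resistance in ohms (illustrative)

# Bulb alone across the supply
i1 = V / R_bulb                 # 1.0 A
p_bulb_alone = i1**2 * R_bulb   # 6.0 W in the bulb

# Bulb plus an equal series resistor: total resistance doubles,
# so the current halves...
i2 = V / (R_bulb + R_bulb)      # 0.5 A
# ...and the bulb's own dissipation (I^2 * R) drops to a quarter
p_bulb_series = i2**2 * R_bulb  # 1.5 W in the bulb

print(i1, i2, p_bulb_alone, p_bulb_series)
```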
My second question (not really related) is: are resistors evil things? It seems like nobody would want to waste energy by creating heat. Is there no way to "throttle" the flow of electrons without wasting power by turning it into heat?
An electric fire bar or fan heater is the most common application of
deliberately turning electricity into heat - an air source heat pump
would be more efficient, but turning electricity into heat is
certainly done on purpose.
Ultimately all high grade energy ends up being dissipated as heat as an
inevitable consequence of thermodynamics.
For many applications you can get away with switching the maximum
voltage on and off very quickly as a form of pulse width modulation to
avoid having any significant resistive losses. Class-D amplifiers use
this method to get high efficiency with less waste heat.
http://en.wikipedia.org/wiki/Class_D_Amplifier
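A minimal sketch of the PWM idea: the switch is either fully on
(almost no voltage drop across it) or fully off (no current through
it), so ideally it dissipates nothing itself, and the average delivered
voltage is set purely by the duty cycle (values here are illustrative):

```python
def pwm_average_voltage(v_supply, duty_cycle):
    """Average voltage from an ideal on/off switch.

    duty_cycle is the fraction of each period the switch is on (0..1).
    The switch itself dissipates nothing in this idealised model.
    """
    return v_supply * duty_cycle

# Illustrative: a 12 V supply switched at 50% duty averages 6 V
print(pwm_average_voltage(12.0, 0.5))
```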
Along those lines, if I run a 6 V circuit with a 6 ohm bulb (pretending bulbs are linear resistors) it would consume 6 W. If I want to power the same bulb from a 12 V battery I would have to add a 6 ohm resistor to have the bulb illuminate at the same brightness. This circuit would consume 12 W of power. That seems like an incredible waste of power. What am I missing here?
Thanks a lot for your input.
These days if you wanted to do that you would probably use PWM or a DC
to DC converter to provide 6 V from the 12 V source at the right
current and high efficiency. Using a power rheostat (variable
resistor) to control the current flowing in a motor or other load is
much rarer these days than it was in the past. See for example:
http://en.wikipedia.org/wiki/Pulse-width_modulation
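To put rough numbers on the 6 V / 12 V question above (assuming an
idealised, lossless converter, which real parts only approach): the
series dropper resistor wastes as much power as the bulb uses, while
an ideal converter draws from the battery only what the load needs.

```python
V_in, V_load, R_bulb = 12.0, 6.0, 6.0   # illustrative values

i_load = V_load / R_bulb      # 1.0 A through the bulb
p_load = V_load * i_load      # 6.0 W of useful output

# Series dropper resistor: the full load current is drawn at 12 V,
# so half the input power is burned in the resistor
p_dropper_total = V_in * i_load   # 12.0 W in, 50% efficiency

# Ideal DC-DC converter: input power equals output power,
# so the battery current halves
p_converter_total = p_load              # 6.0 W in (lossless assumption)
i_in = p_converter_total / V_in         # 0.5 A from the battery

print(p_dropper_total, p_converter_total, i_in)
```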