My problem is this: you are distributing 100 watts. You start out with
10 volts at 10 amps. You increase the voltage by a factor of 2 through a
transformer, so now you must have 20 volts at 5 amps, right? Using Ohm's
Law, 10 volts = 10 amps * 1 ohm, so in the first case there is 1 ohm.
But in the second case you have 20 volts = 5 amps * 4 ohms. So why did
the resistance change? When you use a transformer to increase voltage,
does that mean the resistance must increase because the current decreases?
I am thoroughly confused. I know I am thinking about it the wrong way;
can someone explain it to me? If you want me to clarify, let me know.
Basically, my question is this: how do you create more voltage without
creating more current?
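
Here is the arithmetic written out as a small Python sketch, assuming an
ideal (lossless) transformer with a fixed resistor on the secondary side.
The variable names and the turns ratio n = 2 are just my own labels for
the numbers above, not anything official:

# Sketch of the numbers in my question, assuming an ideal (lossless)
# transformer and a purely resistive load on the secondary.
n = 2.0                 # step-up turns ratio (secondary / primary), my assumption
load_ohms = 4.0         # resistor actually connected to the secondary

v_secondary = 20.0                         # volts on the secondary side
i_secondary = v_secondary / load_ohms      # 5 A
p_secondary = v_secondary * i_secondary    # 100 W

# An ideal transformer conserves power, so the primary supplies the same
# 100 W at 1/n times the voltage and n times the current.
v_primary = v_secondary / n                # 10 V
i_primary = i_secondary * n                # 10 A
p_primary = v_primary * i_primary          # 100 W

# The V/I ratio on the primary side works out to load_ohms / n**2 (1 ohm),
# not the 4 ohms sitting on the secondary -- which is exactly the mismatch
# that confuses me.
r_seen_by_source = v_primary / i_primary        # 1 ohm
r_on_secondary = v_secondary / i_secondary      # 4 ohms

print(p_primary, p_secondary, r_seen_by_source, r_on_secondary)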