I am setting up a remote observatory and I'm building a data and power station for it. I am not an electrical engineer, so I've been learning a lot as I've progressed. Here's my current conundrum...
I have a remote DC Power Switch that uses two 12V DC power supplies for input power. Each one powers 4 channels (8 total) that I can turn on/off via an internet connection. There is a common ground (- wire) for all the channels and the two power supplies.
I put a multimeter on each DC power supply: Power Supply 1 reads 14.0 V, Power Supply 2 reads 14.3 V. With only Power Supply 1 on and a few items loaded on its circuit, I see a 2 A reading. If I then turn on the 2nd power supply with even a very small load (0.1 A) on its channels, things get weird: the 1st meter's amperage drops to 1.8 A and the 2nd one rises to about 1.4 A. But my original reading was only 2 A, and now, just by turning on the 2nd supply, I'm showing 1.8 A + 1.4 A = 3.2 A.
These are cheap meters, so there is a calibration risk that I haven't ruled out.
These meters effectively sit in parallel between the load and the negative terminal of each power supply. I'm sure there is a formula for what a parallel reading should be. If the readings are additive, then I'd expect each meter to show half the total current, or roughly 1 A each.
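To make my reasoning concrete, here is a small sketch of the arithmetic I'm assuming. The "expected" figure treats the two negative-lead meters as a simple parallel split of one total load current; the "measured" figures are what the meters actually report. All the numbers are just the readings from my post, so this only illustrates the mismatch, not an explanation of it.

```python
# Arithmetic sketch of expected vs. measured meter readings.
load_ps1 = 2.0      # A, measured with only power supply 1 on
load_ps2 = 0.1      # A, the small load added on power supply 2's channels
total_load = load_ps1 + load_ps2

# If the two meters simply split the total return current in parallel,
# I'd naively expect about half on each:
expected_each = total_load / 2

# What the meters actually report once both supplies are on:
measured_ps1 = 1.8
measured_ps2 = 1.4
measured_sum = measured_ps1 + measured_ps2  # more than the 2.1 A total load

print(f"total load:            {total_load:.2f} A")
print(f"naive parallel split:  {expected_each:.2f} A per meter")
print(f"sum of meter readings: {measured_sum:.2f} A")
```

So under my "parallel split" assumption each meter should read about 1.05 A, yet together they report 3.2 A, which is where I'm stuck.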
What's going on here? How many amps should I be reading?
Thanks,
Ian