I'm doing some troubleshooting on 12-volt sealed lead-acid (SLA) batteries (out of APC battery backup units).
I've got a battery that gives me 13.2 volts.
It is an Exide PowerSafe EP1229W. On the side, it reads:
Charge Parameters: Constant-Voltage Charge with Voltage Regulation (27 °C)
Standby Use: 13.6 V–13.8 V
Cycle Use: 14.1 V–14.4 V
Max. Initial Current: 1.4 A
This battery is rejected by the battery backup unit as bad, and I'm trying to figure out why. I figured I'd start by trying to determine the battery's internal resistance. (If you think there's a better way to test these batteries, please let me know!)
I have one DMM and four 10-watt resistors (two 100-ohm and two 50-ohm, all 10% tolerance).
First, I test the voltage of the battery with no load. Open-circuit voltage is 13.2 volts.
I only have one DMM at the moment, so I'm making one measurement at a time. When I hook one 100-ohm resistor across the battery, the terminal voltage drops slowly and seems to settle at 12.79 V. The current reads about 140 mA at that point. The resistor I'm using has a measured value of 99.3 ohms.
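For reference, here's a quick sketch (in Python, with my measured values plugged in, taken as exact) of what Ohm's law alone predicts for the current through the resistor:

```python
# Ohm's-law check using my measured values (assumed exact here)
v_load = 12.79   # terminal voltage under load, volts
r_load = 99.3    # measured resistor value, ohms

i_expected = v_load / r_load  # current implied by V and R alone
print(f"Expected current: {i_expected * 1000:.1f} mA")  # about 128.8 mA
```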
So, wouldn't it be a simple case of applying Ohm's law, V = I * (R + Rbat)?

12.79 = 0.14 * (99.3 + X)
X = -7.94 ohms?
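Here's that arithmetic spelled out in Python, exactly as I set it up above, solving for X:

```python
# My calculation as stated: 12.79 = 0.14 * (99.3 + X)
v = 12.79   # terminal voltage under load, volts
i = 0.14    # measured current, amps
r = 99.3    # measured resistor value, ohms

x = v / i - r   # solving for the battery's internal resistance
print(f"X = {x:.2f} ohms")  # comes out negative, which can't be right
```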
Am I doing something wrong?
Thank you for your time.