Please help me understand how to use my digital multimeter to measure amps. My meter has two DC current ranges: 200 mA and 10 A. When I measure a current on the 200 mA range, I get about 10 mA, but on the 10 A range it reads 0.10 A. Since there are 1,000 mA in 1 A, shouldn't it read either 10 mA / 0.01 A or 100 mA / 0.10 A? I don't get it.
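To make the discrepancy concrete, here is a small sketch of the unit conversion behind the question. The two reading values are the hypothetical numbers from the question, not calibrated measurements:

```python
# Unit-conversion sanity check for the two readings described above.
# The reading values are taken from the question, not from a real meter.

MA_PER_A = 1000  # 1 A = 1,000 mA

reading_200ma_range = 10   # mA, as displayed on the 200 mA range
reading_10a_range = 0.10   # A, as displayed on the 10 A range

# Express both readings in the same unit for a direct comparison.
reading_200ma_in_a = reading_200ma_range / MA_PER_A  # 10 mA = 0.01 A
reading_10a_in_ma = reading_10a_range * MA_PER_A     # 0.10 A = 100 mA

print(f"200 mA range: {reading_200ma_range} mA = {reading_200ma_in_a} A")
print(f"10 A range:   {reading_10a_range} A ≈ {reading_10a_in_ma:.0f} mA")

# If both ranges agreed, these two values would match; here they differ
# by a factor of ten, which is exactly the puzzle in the question.
```

Converting both readings to amps shows the factor-of-ten disagreement clearly: 0.01 A on one range versus 0.10 A on the other.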