I have an electronic device that uses an OEM ($$$) wall wart rated at
5VDC and 220mA. In my collection of transformers I found one rated at
5VDC with the same plug/polarity, except it's rated at 500mA. Would it
be okay to use, or should the mA ratings match more closely? Thanks.
Wall plug-in transformer adapters are a good, economical source of
transformers. Beware: most are only half-wave rectified, but that is no
problem if you feed the output into a full-wave bridge. You can find
them for little cost at swap meets, garage sales, surplus stores, etc.
Some things to look for on the transformer case are the voltage, the
current, whether the output is DC or AC, and whether it is positive or
negative ground -- usually indicated on the outside ring of the plug
symbol. AC is our first choice.
When measuring the output voltage with no load applied, the voltage can
read as much as 40% higher than the rating. To determine the true
voltage and current rating of the unit, apply a resistive load at the
output and measure the voltage.
For example, to find the load resistance for a rating of 12VDC @
300mA, use Ohm's Law: R = E/I = 12V / 0.300A = 40 ohms (0.040k ohms).
The wattage of the resistor should be P = E x I = 12V x 0.300A =
3.6 watts. That's the wattage required for a permanent application,
but for a quick voltage measurement a one-watt resistor could be used.
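If you'd rather script the arithmetic than work it by hand, here is a
minimal Python sketch of the same calculation. The 12 V / 300 mA values
are just the example figures above, and the function name is made up
for illustration:

# Sketch of the dummy-load arithmetic: pick a resistor that draws the
# adapter's rated current at its rated voltage, and size its wattage.
def dummy_load(rated_volts, rated_amps):
    """Return (resistance in ohms, dissipation in watts) for a test load."""
    resistance = rated_volts / rated_amps   # Ohm's law: R = E / I
    power = rated_volts * rated_amps        # P = E * I
    return resistance, power

r, p = dummy_load(12.0, 0.300)
print(f"Load resistor: {r:.0f} ohms, dissipating {p:.1f} W at full load")
# prints: Load resistor: 40 ohms, dissipating 3.6 W at full load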
roma