The simple constant-current battery chargers IMHO all suffer from an inadequate means of terminating the charge cycle: time. If you leave the batteries in the charger for too long, they will be overcharged and their lifetime will be reduced.
A good charger detects end of charge, e.g. by the -dV/dt method and/or by registering the temperature rise of the battery near end of charge. You can find many dedicated NiMH charger ICs (often doubling for the obsolete NiCd chemistry, too).
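To make the -dV/dt idea concrete, here is a minimal sketch of the detection loop. The ADC and delay functions are placeholders to be replaced with real hardware access, and the 30 s sample interval and 5 mV drop threshold are illustrative assumptions, not datasheet values:

```c
#include <stdbool.h>
#include <stdint.h>

/* Placeholder: replace with an actual ADC reading of the cell, in millivolts. */
static uint16_t read_cell_millivolts(void) { return 1400; }

/* Placeholder: replace with a real delay or timer on the target MCU. */
static void delay_seconds(unsigned s) { (void)s; }

/* Returns once the cell voltage has dropped a few millivolts below its peak,
 * the classic -dV/dt end-of-charge signature of a NiMH cell.
 * (With the constant stub above it loops forever; on real hardware the
 * loop ends when the cell voltage actually drops.) */
bool charge_complete(void)
{
    uint16_t peak_mv = 0;

    for (;;) {
        uint16_t v = read_cell_millivolts();

        if (v > peak_mv)
            peak_mv = v;            /* still rising: remember the peak    */
        else if (peak_mv - v >= 5)
            return true;            /* fell below the peak: stop charging */

        delay_seconds(30);          /* slow sampling averages out noise   */
    }
}
```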
Or you build one of the simple circuits and live with the drawback of having to time the charge yourself.
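If you go the timed route, the usual rule of thumb is charge time ≈ capacity / current × ~1.4, the factor covering NiMH charge inefficiency. The cell capacity and current below are just example figures; check your cell's datasheet for the recommended rate and time:

```c
#include <stdio.h>

int main(void)
{
    double capacity_mah = 2000.0;   /* example AA NiMH cell          */
    double current_ma   = 200.0;    /* example charge current, ~C/10 */

    /* Rule-of-thumb timed charge: capacity / current * ~1.4 */
    double hours = capacity_mah / current_ma * 1.4;

    printf("Charge for about %.1f hours, then stop.\n", hours);
    return 0;
}
```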
To the question whether any AC/DC power supply with an output of 12 V DC, 200 mA is usable:
A rather standard question with a rather standard answer. The output rating of any typical power supply means:
- voltage is output as stated, in this case 12 V.
- current output is defined by the load, with a maximum current draw allowed as stated, in this case 200 mA.
By no means does this imply that the output current will be 200 mA for every load, nor that the output is safely limited to 200 mA. You may be able to draw more than 200 mA, but doing so overloads the supply and in the long term leads to premature failure due to overheating.
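A quick Ohm's law illustration of "current is defined by the load", using two arbitrarily chosen resistive loads against the 12 V, 200 mA rating:

```c
#include <stdio.h>

int main(void)
{
    double v_out = 12.0;                    /* rated output voltage     */
    double i_max = 0.2;                     /* rated maximum current, A */
    double loads_ohm[] = { 120.0, 47.0 };   /* example resistive loads  */

    for (int i = 0; i < 2; i++) {
        double amps = v_out / loads_ohm[i];            /* I = V / R     */
        printf("%.0f ohm load draws %.0f mA -> %s\n",
               loads_ohm[i], amps * 1000.0,
               amps <= i_max ? "within rating" : "overloads the supply");
    }
    return 0;
}
```

The 120 Ω load draws 100 mA and is fine; the 47 Ω load would try to draw about 255 mA, more than the supply is rated for.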
My favorite comparison is a mains outlet in your home. You'll get 115 V regardless of the load. A small desk lamp will draw only a fraction of an ampere whereas an appliance may draw a few amperes - all from the same outlet. Current is limited only by the fuse in your distribution panel.