This is a just-curious question for the down-under folks: why do your RCDs trip at 30 mA? This was stated in this thread.
It's generally agreed that at around 10 mA a person can no longer let go of a conductor (the "let-go" threshold). I was supposing that half that value is what the US manufacturers chose for their GFI trip points (the ones I've measured have all tripped at 4.5 to 5 mA). (A similar argument is used for the typical AC leakage current spec of 500 uA, except there the reference is the 1 mA level generally agreed to be the perception threshold for humans.) Now, of course, if a GFI/RCD trips in 40 ms or so, the current level probably isn't terribly important. I was just curious why the Aussies chose 30 mA. Anyone know? Or is it one of those bureaucratic things designed by committee so that no one is responsible?