This is not necessarily so, although in many if not most cases you're right.
A high input impedance allows a source (e.g. an MP3 player) to deliver the signal as a voltage while the current stays very low (I = V/R). A low current minimizes the power the source needs to drive the input.
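A quick numeric sketch of this point, with made-up values (a 1 V signal into a typical 10 kΩ amplifier input versus a hypothetical 8 Ω input):

```python
# Assumed numbers: a 1 V signal from the source into two input impedances.
V = 1.0  # source signal voltage in volts

for R_in in (10_000.0, 8.0):  # high vs. low input impedance in ohms
    I = V / R_in              # current the source must supply (I = V/R)
    P = V * I                 # power the source must deliver (P = V*I)
    print(f"R_in = {R_in:>8.0f} ohm: I = {I*1e3:.3f} mA, P = {P*1e3:.3f} mW")
```

With the high input impedance the source supplies 0.1 mA and 0.1 mW; with the low one it would need 125 mA and 125 mW, far beyond what a small battery-powered player can comfortably provide.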
A low output impedance allows the amplifier to drive a load (e.g. a speaker) with high power, which typically goes hand in hand with high current (P = I²·Rload), while keeping the losses inside the amplifier low (P = I²·Routput).
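To put rough numbers on the loss argument (the 1 A load current and the two output impedances below are assumptions, not values from any particular amplifier):

```python
# Assumed numbers: power delivered to an 8-ohm speaker vs. power lost in the
# amplifier's own output impedance, for the same load current.
I = 1.0        # load current in amperes (assumed)
R_load = 8.0   # speaker impedance in ohms

for R_out in (0.1, 4.0):    # low vs. high amplifier output impedance in ohms
    P_load = I**2 * R_load  # power in the load (P = I^2 * R_load)
    P_loss = I**2 * R_out   # power wasted in the amplifier (P = I^2 * R_out)
    eff = P_load / (P_load + P_loss)
    print(f"R_out = {R_out} ohm: load {P_load} W, loss {P_loss} W, efficiency {eff:.1%}")
```

With 0.1 Ω output impedance almost all the power ends up in the speaker; with 4 Ω, a third of it heats the amplifier instead.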
So much for your scenario. There are cases, however, where a source has a high impedance and acts more like a current source than a voltage source (you know you can replace any non-ideal voltage source by an equivalent non-ideal current source and vice versa, don't you?). If you were to drive a high-impedance input from such a source, the source would have to supply a high voltage (its compliance voltage). This is often impractical (consider an MP3 player powered by a 1.5 V battery). In this (admittedly rare) case you would want the amplifier to have a low input impedance, and you would control the amplifier by the current from the source, not by the voltage.
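The source equivalence mentioned above (Norton to Thevenin) also shows why the compliance voltage becomes impractical. The numbers here are invented for illustration:

```python
# Assumed numbers: a high-impedance source modeled as a Norton equivalent.
R_s = 100_000.0  # source impedance in ohms (assumed)
I_n = 0.001      # Norton short-circuit current, 1 mA (assumed)

# Equivalent Thevenin open-circuit voltage: V_th = I_n * R_s
V_th = I_n * R_s
print(f"Equivalent Thevenin voltage: {V_th} V")
```

The equivalent voltage source would need to swing 100 V into a matching high-impedance input, which a 1.5 V-powered device obviously cannot do; feeding the current into a low-impedance input sidesteps the problem.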
Similarly, if the load impedance seen by the amplifier is high, you do not need a high current to drive the load, so you can afford a higher output impedance for the amplifier, possibly easing the design.
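To illustrate how little current a high-impedance load actually demands (again with assumed numbers: a 2 V swing into an 8 Ω speaker versus a hypothetical 10 kΩ input of a following stage):

```python
# Assumed numbers: same voltage swing into a low- and a high-impedance load.
V = 2.0  # output voltage swing in volts (assumed)

for R_load in (8.0, 10_000.0):  # speaker vs. high-impedance next stage
    I = V / R_load              # current required to drive the load (I = V/R)
    print(f"R_load = {R_load:>7.0f} ohm: I = {I*1e3:.3f} mA")
```

250 mA versus 0.2 mA: the high-impedance load is trivial to drive even through a sizable output impedance.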
In a system design you will also have to look at the mutual influence of the different impedances, e.g. in connection with feedback mechanisms, stability considerations and noise, to name a few.