Huh??
Your comment makes no sense to me. First you say the voltage must be greater, then you say it doesn't have to, just amplify it???
Confused here.
Perhaps another, simpler explanation will help.
An amplifier can be thought of as having connections to three things:
1) an input
2) an output
3) a power supply
Ignoring some very special cases, the output cannot exceed the bounds of the power supply. For example, if the power supply were a single 30 V rail, the output could not go below 0 V and could not rise above 30 V.
How the amplifier actually transforms a small voltage (say 1 V) on the input into a higher voltage (say 10 V) on the output is another story, but essentially it involves (as Davenn says) the output being a controlled version of the power supply, with the input signal doing the controlling.
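If it helps to see that idea as plain arithmetic, here is a toy sketch using the numbers from above: a gain of 10 and a single 0 to 30 V supply. It is not a circuit simulation, just "scale the input, then clamp to the rails"; the function name and the test inputs are made up for the example.

```python
# Toy model of a voltage amplifier running from a single 0-30 V supply.
# Not a circuit simulation -- just the idea that the output is a scaled
# copy of the input, limited to the bounds of the power supply.

GAIN = 10.0                # example voltage gain from the text (1 V in -> 10 V out)
V_MIN, V_MAX = 0.0, 30.0   # single 30 V supply: output stays inside 0..30 V

def amplifier_output(v_in: float) -> float:
    """Scale the input, then clamp to the supply rails."""
    ideal = GAIN * v_in
    return max(V_MIN, min(V_MAX, ideal))

for v_in in (0.5, 1.0, 2.5, 4.0):
    print(f"{v_in:4.1f} V in -> {amplifier_output(v_in):5.1f} V out")
# 4.0 V in "wants" 40 V out, but the output clips at the 30 V rail.
```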
Davenn also points out that this is voltage amplification. It's worth pointing out (although you possibly don't need to understand it right now) that the output voltage does not have to increase for the device to amplify: a unity-gain buffer, for example, leaves the voltage alone but can drive much more current into a load, so it still amplifies power.
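As a back-of-envelope illustration of that point (the figures here are hypothetical, chosen only to make the arithmetic obvious):

```python
# Power gain of a unity-voltage-gain buffer, with made-up numbers.
# The voltage never increases, yet the power delivered does.

v_in, i_in = 1.0, 1e-6     # 1 V signal; the buffer's input draws ~1 uA
v_out, i_out = 1.0, 0.1    # same 1 V out, but driving 100 mA into a load

p_in = v_in * i_in         # power taken from the signal source
p_out = v_out * i_out      # power delivered to the load
print(f"power gain = {p_out / p_in:.0f}x with no voltage gain at all")
```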
None of these things is particular to an op-amp. John Monks describes some things that are particular to op-amps, but it's worth getting comfortable with a much simpler amplifier first.
What is also important is the theoretical model of the op-amp. The simplest theoretical model (the "ideal" op-amp) allows the output to exceed the power supply, because it ignores the power supply entirely. If you are using a simulation where the power supply to the op-amp is not shown (especially where it cannot be specified at all), you may be getting results that cannot happen in practice.
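Here is that difference in sketch form. The gain of 10 and the ±15 V rails are assumptions for the example, not figures from any particular part or simulator.

```python
# Ideal-model op-amp vs. one limited by its supply, side by side.

GAIN = 10.0
V_NEG, V_POS = -15.0, 15.0   # assumed +/-15 V supply rails

def ideal_opamp(v_in: float) -> float:
    """Textbook ideal model: no supply, so no limit on the output."""
    return GAIN * v_in

def real_opamp(v_in: float) -> float:
    """Same gain, but the output can never leave the supply rails."""
    return max(V_NEG, min(V_POS, GAIN * v_in))

for v_in in (0.5, 1.0, 2.0, 5.0):
    print(f"{v_in:3.1f} V in: ideal {ideal_opamp(v_in):6.1f} V, "
          f"real {real_opamp(v_in):6.1f} V")
# The ideal model happily reports 50 V out for a 5 V input -- exactly the
# kind of result a supply-less simulation can give that a real part cannot.
```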