I assume you've already read the response here.
Added to that, my question is "What do you want calibrated?"
If you have other calibrated equipment (say a voltmeter and a thermometer) then you can create your own calibration procedure.
You might, for example, do the following:
1) Turn the unit on and leave it powered up for 1 hour at an ambient temperature of 25 °C.
2) Connect your calibrated multimeter, on the 20 V range, directly to the 5 V output of the power supply.
3) Take a reading with no load.
4) Connect a 10 kΩ load and take another reading.
5) Connect a 10 Ω load and take a reading immediately after the load is connected.
6) Leave the load connected for 10 minutes, taking readings after 2, 4, 6, 8, and 10 minutes.
7) Disconnect the 10 Ω load and take another reading.
8) Connect a 10 kΩ load and take another reading.
What you'll get is a series of readings that you can use to determine the characteristics of your power supply over a variety of load conditions.
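As a minimal sketch of what you might do with those readings (the sample values and variable names are hypothetical, chosen only to match the steps above), you could reduce them to a few summary figures like load regulation and drift:

```python
# Hypothetical readings from the procedure above, in volts.
NOMINAL = 5.0

no_load = 5.021             # step 3: no load
light_load = 5.019          # step 4: 10 kohm load
heavy_load_initial = 4.987  # step 5: 10 ohm load, immediately on connection
heavy_load_final = 4.975    # step 6: after the 10-minute soak
recovery = 5.020            # step 8: back on the 10 kohm load

# Load regulation: drop from no load to heavy load, as a percent of nominal.
load_regulation = (no_load - heavy_load_initial) / NOMINAL * 100

# Thermal drift under heavy load over the 10-minute soak.
drift = heavy_load_initial - heavy_load_final

# Recovery: does the light-load reading come back to where it was?
recovery_error = abs(recovery - light_load)

print(f"Load regulation:  {load_regulation:.2f} %")
print(f"Drift under load: {drift * 1000:.1f} mV")
print(f"Recovery error:   {recovery_error * 1000:.1f} mV")
```

You would then compare each figure against whatever limits you decided matter for your use of the supply.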
Your 10 kΩ and 10 Ω resistances could also be calibrated, and the 10 Ω "standard" should be generously rated so it doesn't get excessively hot.
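The rating question is just Ohm's law; a quick sanity check (the 10 W suggestion is my own rule of thumb, not a spec) shows why the 10 Ω load needs a hefty part:

```python
# Dissipation in the 10 ohm load resistor at the 5 V output.
V = 5.0    # supply voltage, volts
R = 10.0   # load resistance, ohms

I = V / R        # current drawn: 0.5 A
P = V ** 2 / R   # power dissipated: 2.5 W

print(f"Current: {I:.2f} A, dissipation: {P:.2f} W")
# Rating the resistor at several times the dissipated power
# (e.g. a 10 W part here) keeps it cool enough that its value
# doesn't drift noticeably during the test.
```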
Then you determine a schedule for running this calibration. As long as the output of your power supply stays within the specifications you decide are important, you can claim it is calibrated. You can note corrections (such as "the output voltage is 0.15% high") if that is important.
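The in/out-of-spec decision itself is trivial to write down. A sketch, assuming a hypothetical ±0.5% tolerance (pick your own figure based on what the supply actually feeds):

```python
NOMINAL = 5.0      # nominal output, volts
TOLERANCE = 0.5    # hypothetical acceptance limit, +/- percent of nominal

measured = 5.0075  # example reading that is 0.15% high

error_pct = (measured - NOMINAL) / NOMINAL * 100
in_spec = abs(error_pct) <= TOLERANCE

print(f"Error: {error_pct:+.2f} % -> {'in spec' if in_spec else 'OUT OF SPEC'}")
```

The recorded `error_pct` is the correction you would note against that calibration date.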
If the output is "out of spec" then you need to do something (and for that you'll need more information).
I work for a company that does a significant amount of internal "calibration" (I just happen to be in the middle of auditing it right now). In our case it's in the medical device field, and we get our calibration procedures essentially from NATA where the manufacturer doesn't supply them. Do you have a professional testing authority which can provide you with advice? (I'm now curious enough that I'm going to ask what we can get from NATA -- though I'm sure I wouldn't be able to pass it along.)
There's not a lot of magic involved. You aim to remove as many variables as you can, and do tests with the equipment operating in the range(s) in which you will actually be using it.
In your case, if you use the 5V rail for loads between 100mA and 250mA, then it is pointless to calibrate at 10mA and 1A; you would do better to choose points within and near the ends of your operating range.
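Turning those currents into load resistors is again just Ohm's law. A sketch, assuming a 100-250mA range on the 5V rail (the specific test currents are my own choice, meant to sit within and just bracket the range):

```python
# Load resistors for chosen test currents on the 5 V rail.
V = 5.0

# Hypothetical test points: just below, within, and just above 100-250 mA.
currents_a = [0.090, 0.100, 0.175, 0.250, 0.275]

for i in currents_a:
    r = V / i  # required load resistance, ohms
    print(f"{i * 1000:.0f} mA -> {r:.1f} ohm load")
```

For example, 100 mA needs a 50 Ω load and 250 mA needs 20 Ω; each resistor's power rating should be checked the same way as the 10 Ω one above.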
You may be able to take procedures for another similar piece of equipment and modify those procedures.
I hope this is of some help...