Yes, but the idea is also to read voltages from the digital display
(1.55 Vpp, for example).
In such a case one could even use continuous variable gain to set sensitivity.
This sounds like a good idea from the cost viewpoint; a cascade
of three AGC-type amplifiers (Gilbert cells) will give
logarithmic gain programmed by a voltage source (an 8-bit DAC?).
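The point about cascading is that if each Gilbert-cell stage is roughly
dB-linear in its control voltage, the stage gains add in dB, so one DAC
LSB steps the total gain by a constant number of dB. A rough sketch of
the arithmetic (the stage slope, DAC reference, and names here are my
own illustrative assumptions, not anybody's actual design):

```python
# Hypothetical sketch: three cascaded dB-linear (Gilbert cell) gain
# stages sharing one control voltage from an 8-bit DAC. All constants
# below are assumed for illustration.

DAC_BITS = 8
VREF = 2.56            # assumed DAC full-scale, volts
DB_PER_VOLT = 20.0     # assumed dB-linear slope of ONE stage
STAGES = 3

def gain_db(dac_code: int) -> float:
    """Total cascade gain in dB for a given DAC code."""
    vctl = dac_code / (2**DAC_BITS - 1) * VREF
    return STAGES * DB_PER_VOLT * vctl   # stage gains add in dB

def gain_linear(dac_code: int) -> float:
    """Same gain as a linear voltage ratio."""
    return 10 ** (gain_db(dac_code) / 20)

# One DAC LSB = one fixed dB step, anywhere on the range:
step_db = gain_db(1) - gain_db(0)
```

With these made-up numbers the range is 0 to about 154 dB in equal
steps of roughly 0.6 dB per code, which is why a plain 8-bit DAC is
enough to cover a very wide sensitivity range.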
You can use a phantom input channel to feed a calibration
source, to get past the thermal drift, and a sample/hold
circuit to hold the gain constant during an accumulate/calibrate
cycle.
But for human interface reasons, you want a human-set screen
range that DOESN'T change until the human changes it. It's
OK to have the internals do some gain-riding with the signal,
but I'd get dizzy watching the screen readjust all the scale values
in real time.
That little grid overlay on the oscilloscope screen is a GREAT
simplification for lots of uses. I'm not happy thinking I have
to digitize with a cursor instead of just reading it off (no hands!).