You corrected the number of PLLs in the FPGA, how do they get the
clocks' phases aligned?
Beats me.
Could be as simple as a fixed, tuned gate delay inside the FPGA, or it could be
some more complex system that auto-calibrates, perhaps.
I'm no expert on digital scopes, but I suppose
the clock frequency changes depending on the time base?
Only when required, i.e. when the memory is full.
In short memory mode (1GS/s, 16KB) a relay clicks in at 100ns/div, and it
tells you it's doing 1GS/s from 50ns/div down to 5ns/div.
No such click occurs in long memory mode (500MS/s, 1Mpoints), and it
stays at 500MS/s up to 100ns/div.
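A rough way to see why the rate has to drop at longer timebases is the usual capture-window arithmetic (a sketch only; the max rates and memory depths are the ones mentioned above, the formula is generic scope behaviour, not anything specific to Rigol's firmware):

```python
def actual_sample_rate(max_rate_hz, memory_samples, time_per_div_s, divs=10):
    """Cap the sample rate so the whole screen (divs * time/div)
    fits into the acquisition memory."""
    window_s = divs * time_per_div_s
    return min(max_rate_hz, memory_samples / window_s)

# Long memory mode: 500 MS/s max, 1 Mpoint
print(actual_sample_rate(500e6, 1e6, 100e-9))  # 500 MS/s, full rate at 100 ns/div
print(actual_sample_rate(500e6, 1e6, 1e-2))    # 10 MS/s, memory-limited at 10 ms/div
```

So at fast timebases the converter speed is the limit, and only at slow timebases does the memory depth force the rate down.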
The relay is thus most likely switching the input signal between either 5 or
10 ADCs depending upon the requirement.
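The arithmetic behind that guess (a sketch; the 100 MS/s per-ADC rate is my assumption, only the 1GS/s and 500MS/s totals come from the scope itself):

```python
def interleaved_rate(n_adcs, per_adc_rate_hz):
    """Time-interleaved ADCs sample the input in turn, so the
    effective rate is the sum of the individual converter rates."""
    return n_adcs * per_adc_rate_hz

# Assuming each ADC runs at 100 MS/s:
print(interleaved_rate(10, 100e6))  # 1 GS/s   (short memory mode)
print(interleaved_rate(5, 100e6))   # 500 MS/s (long memory mode)
```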
Do they just drop some ADCs and keep the frequency, thereby permitting
the use of trace delays to get the 100ps delays or what?
Are the GS/s figures real time or equivalent time?
1GS/s real time of course, that's been the selling point of these Rigol
scopes for the last 5 years or so.
I think it is possible Rigol tests the ADCs and bins them in-house. Do
you think they might run the power supply a bit "hotter" to get the
parts to work at higher clocks?
I don't think so; more likely they have just done exhaustive testing of these
parts to ensure they work. Binning each one in-house would be possible, but
messy.
Perhaps AD are the ones pulling the swifty and charging a huge premium
(almost 3 times) for the exact same part?
Dave.