Maker Pro

DIY PC Oscilloscope

rickman

That has more to do with how the software shows the signal.
Interpolation can solve a lot.

That may be. The OP was talking about debugging the sinc reconstruction
and I've been thinking a little bit about just how useful that is. Is
there a downside to sinc reconstruction, other than the work required?
I was thinking an aliased signal might interfere with this, but now that
I give it some thought, I realize they are two separate issues. If you
have an aliased tone, it will just be a tone in the display whether you
use sinc reconstruction or not.
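
The folding arithmetic behind that observation can be sketched in a few lines; a minimal illustration, where the function name and the example frequencies are my own and not from the thread:

```python
# Where an aliased tone lands on screen: fold the input frequency
# into the first Nyquist zone [0, fs/2]. This happens with or without
# sinc reconstruction, which is why the two issues are separate.
def alias_frequency(f_in, fs):
    """Fold input frequency f_in (Hz) into [0, fs/2]."""
    f = f_in % fs          # wrap into [0, fs)
    return min(f, fs - f)  # reflect the upper half back down

# e.g. a 180 MHz tone sampled at 250 MHz displays as a clean 70 MHz tone
print(alias_frequency(180e6, 250e6))  # 70000000.0
```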

In fact, the *very* low-end scope I was using may have had a limitation
in the display itself! 8 bits is 256 steps, which shouldn't be too big
a problem.
 
John Devereux

rickman said:
You say that the 16 bit converters are expensive, then talk about
using a 20 GHz 8 bit ADC. Is that not expensive, not to mention the
clocking, the board for the high speed signals and the power supply to
make all this happen?

Yes, it is very expensive. I did say "high end scope", by which I mean
"really expensive". They usually need to go to GHz anyway, so already
have the high speed digitizer. At lower bandwidths they can utilise the
excess samples to increase the apparent resolution.
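
That trade of excess samples for apparent resolution can be sketched as simple block averaging; a rough illustration only, with the function name and sample counts chosen by me (for uncorrelated noise, each 4x of oversampling buys roughly one effective bit):

```python
import numpy as np

# Resolution enhancement by decimation: average N consecutive raw
# samples per displayed point, trading sample rate for effective bits.
def decimate_average(samples, n):
    """Average non-overlapping blocks of n samples."""
    samples = np.asarray(samples, dtype=float)
    trimmed = samples[: len(samples) // n * n]  # drop the ragged tail
    return trimmed.reshape(-1, n).mean(axis=1)

# e.g. an 8-bit front end oversampled 16x gains about 2 effective bits
out = decimate_average([0, 2, 0, 2, 1], 2)  # -> [1.0, 1.0]
```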
rickman said:
I can't imagine this is actually a better approach to designing a
scope with a stated goal of 20-25 MHz bandwidth. I would like to see
at least 300 MHz, but the OP says 25 is good enough.

Absolutely, to me the only point of a 25 MHz scope would be if it were
higher resolution, 16+ bit ideally. Otherwise you may as well just use
one of those cheap USB gadgets. A "dynamic signal analyser" that goes
above 100 kHz seems to be missing from the market AFAIK. So it could do
good spectrum analysis, evaluate noise and servo loops, have a tracking
generator and plot filter responses, that sort of thing.
 
Nico Coesel

rickman said:
That may be. The OP was talking about debugging the sinc reconstruction
and I've been thinking a little bit about just how useful that is. Is
there a downside to sinc reconstruction, other than the work required?

It is usable if you have at least 5 samples per period. So that is
0.2fs. The whole problem though is not the number of samples per
period. According to sampling theory the signal is there but it just
needs to be displayed properly so the operator can see a signal
instead of some 'random' dots. With the proper signal reconstruction
algorithm you can display signals up to the Nyquist limit (0.5fs).
I was thinking an aliased signal might interfere with this, but now that
I give it some thought, I realize they are two separate issues. If you
have an aliased tone, it will just be a tone in the display whether you
use sinc reconstruction or not.

In my design I used a fixed sample rate (250MHz) and a standard PC
memory module. 1GB already provides for more than 2 seconds of storage
for 2 channels. That solves the whole interference issue and it allows
the use of a proper anti-aliasing filter. With polynomial approximation I
could reconstruct a signal even when it's close to the Nyquist limit. I
tested it and I could get it to work for frequencies up to 0.45fs.
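
The reconstruction idea can be sketched with a truncated-sinc kernel; this is a minimal illustration of the general technique, not Nico's actual polynomial implementation, and the function name, tap count, and test frequencies are my own:

```python
import numpy as np

# Truncated-sinc reconstruction: each output point is a weighted sum of
# nearby samples, with sinc weights centred on the requested time.
def sinc_interpolate(samples, t, fs, half_taps=10):
    """Reconstruct the sampled signal's value at time t (seconds)."""
    samples = np.asarray(samples, dtype=float)
    n0 = int(round(t * fs))  # nearest sample index
    n = np.arange(max(0, n0 - half_taps),
                  min(len(samples), n0 + half_taps + 1))
    # np.sinc(x) = sin(pi*x)/(pi*x); weight neighbours by their
    # distance from t measured in sample periods
    return float(np.dot(samples[n], np.sinc(t * fs - n)))

# At an exact sample instant the kernel reduces to 1, 0, 0, ... so the
# original sample comes back; between samples it fills in the curve
# instead of leaving 'random' dots.
```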
 
rickman

Nico Coesel said:
It is usable if you have at least 5 samples per period. So that is
0.2fs. The whole problem though is not the number of samples per
period. According to sampling theory the signal is there but it just
needs to be displayed properly so the operator can see a signal
instead of some 'random' dots. With the proper signal reconstruction
algorithm you can display signals up to the Nyquist limit (0.5fs).

In my design I used a fixed sample rate (250MHz) and a standard PC
memory module. 1GB already provides for more than 2 seconds of storage
for 2 channels. That solves the whole interference issue and it allows
the use of a proper anti-aliasing filter. With polynomial approximation I
could reconstruct a signal even when it's close to the Nyquist limit. I
tested it and I could get it to work for frequencies up to 0.45fs.

I'm not following. Are you saying you need a long buffer of data in
order to reconstruct the signal properly?
 
Nico Coesel

rickman said:
I'm not following. Are you saying you need a long buffer of data in
order to reconstruct the signal properly?

You need about 10 samples extra at the beginning and end to do a
proper reconstruction. Lots of audio editing software does exactly the
same, BTW. Using a fixed sample rate solves a lot of signal processing
problems, but it also dictates that a lot of processing needs to be
done in hardware to keep the speed reasonable.
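
The guard-sample idea sketches out very simply; the names and the exact guard count here are illustrative, following the "about 10 samples extra" figure above:

```python
# Capture more than you display: reconstruction kernels need samples on
# both sides of each output point, so only the interior of the capture
# buffer can be reconstructed without edge artifacts.
GUARD = 10  # extra samples kept at each buffer edge (illustrative)

def displayable_range(n_captured, guard=GUARD):
    """Half-open index range (lo, hi) that reconstructs cleanly."""
    return guard, n_captured - guard

lo, hi = displayable_range(1000)  # -> (10, 990)
```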
 
Jamie

John said:
Interesting stuff can be done with a really long record. You can do signal
averaging of a periodic waveform with no trigger. Our new monster LeCroy scope
can take a long record of a differential PCI Express lane (2.5 Gbps NRZ data),
simulate a PLL data recovery loop of various dynamics, and plot an eye diagram,
again without any trigger. Well, if it doesn't crash.
Oh, you have that problem too? Our LeCroy goes belly up now and then
for no apparent reason. It appears to me there is some random
hardware-to-software issue.

Kind of reminds me back in the days when Visual Basic was first put on
us in the Windows 3.xx days. A serious app designed to operate fabric
cutting machines for intricate designs would simply fault a plug-in
component because it would get stuck on some missed signal from the
hardware and then time out or do a stack overflow. The app was built
on VB controls that simply were not resource-friendly or properly
controlled.

On top of that, this app cost clients upwards of $10k. I was offered a
job where this app was developed and was allowed to see it in
operation and saw its random failures; they wanted me to join the
debugging team to resolve it and move forward. I declined the offer.


Jamie
 