Maker Pro

Why do single-ended to differential conversion for coax cables?

Hi everyone,

I am somewhat confused about the importance of single-ended to
differential conversion for the case of a coax cable input to the ADC.
Suppose I have a single-ended cable (SMA or BNC coax) carrying the
analog signal to be digitized via an ADC with differential inputs.
Aside from buffering the ADC input, why is it necessary to do single-
ended to differential conversion?

If the cable were differential, that would make sense, since
common-mode noise is already suppressed in the cable.

But here is what I have trouble with:
If the cable is coax (single-ended), won't all the noise make it into
the ADC anyway, regardless of whether I do differential conversion or
just ground A- and let the input sit at mid-supply DC?

Thanks,
Tele
 

John Larkin

Most differential-input ADCs work better - less distortion, better
signal-to-noise ratio - if fed differentially.

You can usually drive them single-ended with good results. A typical
video ADC will want both inputs biased at some common-mode voltage off
ground, +1.5 volts or something like that. One input can be held stiff
at that voltage, bypassed to ground, and you can apply the signal to
the other. One usually wouldn't DC-ground an input.

John
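
As a rough illustration of the biasing John describes, here is a
minimal sketch (assumed supply and part values, not taken from any
particular ADC datasheet) of holding the unused input stiff at roughly
+1.5 V with a divider and a bypass capacitor:

import math

# Assumed values for illustration only.
supply = 3.3                     # analog supply, volts
r_top, r_bot = 1.2e3, 1.0e3      # divider from supply to ground, ohms
c_bypass = 10e-6                 # bypass cap from divider tap to ground, farads

v_cm = supply * r_bot / (r_top + r_bot)          # ~1.5 V common-mode bias
r_th = r_top * r_bot / (r_top + r_bot)           # Thevenin impedance of the divider
f_corner = 1 / (2 * math.pi * r_th * c_bypass)   # bypass corner frequency, Hz

print(f"Vcm = {v_cm:.2f} V, Rth = {r_th:.0f} ohm, corner ~ {f_corner:.0f} Hz")

Above that corner frequency the bypass capacitor makes the biased input
look stiff (low impedance), which is the point.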
 

Fred Bloggs

Most single-ended to differential conversion techniques provide
common-mode rejection. Single-ended coupling does not, and is
ill-advised. Cheap wideband transformers with center-tapped secondaries
are readily available and provide total isolation between the signal
source and ADC circuit grounds.
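
To put some idealized numbers on the transformer approach (assuming a
1:1 turns ratio, 50-ohm coax, and a lossless transformer; real parts
will differ), a center-tapped secondary splits a single-ended signal
into a balanced pair around the ADC's common-mode voltage:

# Assumed, idealized values for illustration.
v_in_pk  = 1.0     # single-ended signal on the coax, volts peak
turns    = 1.0     # secondary-to-primary turns ratio (assumed 1:1)
z_source = 50.0    # coax / source impedance, ohms
v_cm     = 1.5     # ADC common-mode voltage applied at the center tap, volts

v_diff_pk = v_in_pk * turns       # differential swing across the full secondary
v_leg_pk  = v_diff_pk / 2         # each leg swings this much around v_cm, in antiphase
z_term    = z_source * turns**2   # secondary termination that reflects back as 50 ohm

print(f"A+ swings {v_cm} V +/- {v_leg_pk} V, A- swings in antiphase")
print(f"differential swing {v_diff_pk} Vpk; terminate the secondary with ~{z_term:.0f} ohm")

The center tap is also a convenient place to inject the ADC's
common-mode bias, and since only the magnetic coupling crosses the
transformer, the source and ADC grounds stay isolated.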
 

Jamie

For those types of inputs, I just tie one input to the common through
a low-value non-inductive resistor and use the other for the
single-ended feed. Some devices already have enough input resistance
that you can simply tie that input directly to the input common.

The choice is yours, but since the differential inputs are provided, it
is better to use a differential line if you can.
 

Tony

Applying the signal to just one input usually only lets you drive the ADC to half
full-scale. If that doesn't matter, then it's fine.
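
That half-scale penalty works out to about 6 dB of lost signal swing.
A quick back-of-the-envelope check (the assumed full-scale value is
arbitrary and cancels out):

import math

full_scale_diff  = 2.0                  # assumed differential full-scale, Vpp
single_ended_max = full_scale_diff / 2  # one input held fixed, so half the range is usable
loss_db = 20 * math.log10(full_scale_diff / single_ended_max)

print(f"swing lost vs. full differential drive: {loss_db:.1f} dB")  # ~6.0 dB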
 

Vladimir Vassilevsky

It depends on your accuracy and SNR requirements. A single-ended
connection typically can drive the ADC only to 1/2 of the scale, so
there is 6 dB of degradation right there. Also, common-mode ground
bounce would be fed to the input of the ADC.

If you connect the ADC like that, with A- grounded, the paths of the
"hot" and "cold" inputs are going to be different. Thus the common-mode
noise from the ADC GND will be seen at the ADC input.

Vladimir Vassilevsky
DSP and Mixed Signal Consultant
www.abvolt.com
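
For a sense of scale on the ground-bounce point: with a symmetric,
truly differential drive the bounce appears as common mode and is
attenuated by the front end's CMRR. A rough sketch with assumed
numbers (10 mV of bounce, 60 dB of CMRR; neither figure comes from a
specific part):

v_bounce = 10e-3                     # assumed common-mode ground bounce, volts
cmrr_db  = 60.0                      # assumed common-mode rejection ratio, dB
v_error  = v_bounce / (10 ** (cmrr_db / 20))   # residual differential error

print(f"residual error from {v_bounce*1e3:.0f} mV of bounce: {v_error*1e6:.0f} uV")  # ~10 uV

With an asymmetric connection (A- simply grounded) there is essentially
no rejection, and much of that bounce appears in series with the
signal.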
 