David Farber said:
Ian Jackson wrote:
In message <gJknl.13371$i42.7428@newsfe17.iad>, David Farber
Ian Jackson wrote:
In message <EZhnl.5083$Sv4.2066@newsfe15.iad>, David Farber
I have a trusty old Sound Technology ST1000A FM generator and I
have a question regarding the variable FM output level. The
owner's manual states that the output impedance is 50 ohms,
VSWR<1.3, 200Vdc isolation. My question is if I am using this
device to check the sensitivity of an FM tuner, will the output
level be affected if I connect the output terminal to an RG-58
cable and then use a 75 ohm to 300 ohm matching transformer to
make it compatible with the old FM tuners? It seems to me the
2:1 step-up ratio of the matching transformer should affect the
level, even before considering that the 75 ohm input of the
matching transformer does not match the 50 ohm output of the
cable. Thanks for your reply.
The fact that the generator 50 ohm output and coax has a 75 ohm
load will give you a voltage of 0.6 of the generator open circuit
voltage. If the load had been 50 ohms, you would get 0.5 of the
open circuit voltage. The increase is 0.6/0.5 = x1.2 (+1.58dB).
75-to-300 ohm transformer should give you a 2-to-1 voltage step-up
(+6dB).
The transformer will have some loss. This should not be more than
about 0.5dB.
So, the voltage at the FM tuner 300 ohm input will be the
generator output (into 50 ohms) + 1.58dB + 6dB -0.5dB = Vout +
7.08dB.
This assumes, of course, that the tuner input impedance really IS
300 ohms (which it probably isn't!).
I believe the output level dial is calibrated for a 50 ohm load. So
using your numbers if the output dial is set to 10µV then I
calculated a 7.08dB gain to be 22.6µV. In effect, the sensitivity
of the tuner at this point is 7 dB worse than what the dial
indicates. Is that right? Thanks for your reply.
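To put numbers on the whole chain, here is a quick Python sketch. The 0.5dB transformer loss is the assumed figure from above; note the exact sum comes to 7.10dB, while the 7.08 figure comes from adding the already-rounded terms.

```python
import math

def db(voltage_ratio):
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(voltage_ratio)

Z_SRC, Z_LOAD = 50.0, 75.0               # generator output, transformer input

# Fraction of the open-circuit voltage appearing across the load:
v_into_75 = Z_LOAD / (Z_SRC + Z_LOAD)    # 0.6 into a 75 ohm load
v_into_50 = 0.5                          # matched case the dial assumes
mismatch_db = db(v_into_75 / v_into_50)  # ~ +1.58 dB

transformer_db = db(2.0)  # ideal 75-to-300 ohm 2:1 voltage step-up, ~ +6.02 dB
loss_db = -0.5            # assumed transformer insertion loss

total_db = mismatch_db + transformer_db + loss_db  # ~ +7.10 dB
tuner_uv = 10.0 * 10 ** (total_db / 20)            # dial at 10uV -> ~22.6uV
```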
The calculation looks OK. However, I suppose it depends on what your
'standard' impedance is.
If you have a 'normal' halfwave dipole at (say) 100MHz, the
impedance at the centre will be about 75 ohms. It would be normal
to connect it (via a 75 ohm feeder) to a tuner with a 75 ohm input
impedance. [Note: It might be more correct to say that the tuner is
designed to work best when fed from a 75 ohms source. In practice,
it might not have a very good 75 ohm input.] Anyway, let us assume
that the level of a received 100MHz FM radio signal level (into 75
ohm) is 1µV.
Now, if you replace the 'normal' dipole with a folded dipole, you
would use 300 ohm feeder and connect it to a tuner with a 300 ohm
input impedance. Ignoring distractions like differences in feeder
loss, the tuner input voltage will be 2µV.
However, despite being fed with twice the voltage, the 300 ohm tuner
won't work any better with the folded dipole than the 75 ohm tuner
works with the 75 ohm 'normal' dipole. In both cases, the input
power is the same. Internally, the electronics will be basically
the same. The only difference will be in the matching circuit
between the input and the RF stage.
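The equal-power point is easy to verify, assuming purely resistive inputs:

```python
# Same received power, different impedance: 1uV into 75 ohms
# versus 2uV into 300 ohms.
def power_w(v_volts, r_ohms):
    """Power dissipated in a resistive load: P = V^2 / R."""
    return v_volts ** 2 / r_ohms

p_75 = power_w(1e-6, 75.0)    # 'normal' dipole into a 75 ohm tuner
p_300 = power_w(2e-6, 300.0)  # folded dipole into a 300 ohm tuner
# Both come to ~1.33e-14 W: the receivers see identical input power.
```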
So, it is reasonable to conclude that, when you specify the
sensitivity of a receiver, you have to specify the impedance. In
your test, if your standard is 300 ohms, then you would say that
your tuner was receiving 22.6µV (7.08dB more than indicated on the
generator dial). If it was 75 ohms, it would be 11.3µV (6dB less).
Well, I think I'm correct! What do you reckon?
I don't think you need to mention the impedance of the receiver when
you mention FM sensitivity. If it has the correct antenna and the
correct impedance matching circuit, the sensitivity of the circuitry
should be enough. No? That way I can compare one receiver/tuner to
another and not care what the input impedance is.
No. If you have a 75 ohm receiver which requires 'V' volts for a given
SNR, a 300 ohm receiver would require '2V' volts to give the same SNR.
If you were comparing the two, you would need to mention what
impedance applied.
For all you know, the 300 ohm receiver is simply the same model as a
75 ohm version, except that it has an internal 300-to-75 ohm step-down
transformer. For the same performance, the 300 ohm model would
definitely need twice the input voltage.
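The general rule behind the 'twice the voltage' figure: for equal input power (and hence equal SNR), the required signal voltage scales as the square root of the input impedance. A one-line sketch:

```python
import math

def v_required(v_ref, r_ref, r_new):
    """Voltage needed into r_new for the same power as v_ref into r_ref."""
    return v_ref * math.sqrt(r_new / r_ref)

v_300 = v_required(1.0, 75.0, 300.0)  # sqrt(300/75) = 2: the 300 ohm
                                      # receiver needs exactly twice the volts
```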
If you have 75 ohm signal source and a 300 ohm receiver, you can get
'free' gain by simply using a step-up transformer to give you more
source voltage. The limit is when the transformer matches the source
to the load. If you try to use a step-up transformer between a 75 ohm
source and a 75 ohm receiver, you will actually get LESS into the
receiver than you would with direct connection.
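The 'free gain' limit can be sketched with an ideal (lossless) 1:n transformer, which makes the receiver look like R/n² to the source. Delivered power peaks where that reflected impedance matches the source:

```python
# Power from a 75 ohm source into a receiver through an ideal 1:n
# voltage step-up transformer. The load reflects back as r_load / n^2.
def delivered_power(n, vs=1.0, r_src=75.0, r_load=300.0):
    r_reflected = r_load / n ** 2       # impedance the source actually sees
    i = vs / (r_src + r_reflected)      # source-side current
    return i ** 2 * r_reflected         # power into the receiver (lossless)

# Maximum is at n = 2, where 300/4 = 75 ohms matches the source.
# And with a 75 ohm receiver, a 2:1 step-up delivers LESS than direct:
worse = delivered_power(2, r_load=75.0) < delivered_power(1, r_load=75.0)
```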
It's kind of like stating that the power
of a speaker is independent of what type of amp it's hooked up to.
Ah, but....
With audio systems, the output impedance of the source is very low
compared with the load impedance. The power fed into the speaker only
really depends on the source voltage and the speaker impedance.
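To see how little the amplifier's output impedance matters, compare an ideal zero-ohm source with a typical fraction-of-an-ohm one (the 0.05 ohm figure here is an illustrative assumption):

```python
# Power into a speaker from a voltage source with output impedance r_out.
def speaker_power(v, r_out, r_spk):
    i = v / (r_out + r_spk)
    return i ** 2 * r_spk

p_ideal = speaker_power(10.0, 0.0, 8.0)   # 12.5 W: the simple V^2/R figure
p_real = speaker_power(10.0, 0.05, 8.0)   # ~12.3 W: barely changed
```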
I did a little more digging in the owner's manual. There is an
intermediate, almost lossless, antenna matching network (which I
don't have) that is supposed to be hooked up between the tuner and
the generator to make things go together nicely. There is also a
simple schematic diagram of how to construct one yourself if you
don't have the Sound Technology matching device but the drawback is
the ~6dB attenuation. So to go from the 50 ohm unbalanced output of
the generator to the 300 ohm balanced input of the tuner would
require three resistors connected as follows:
R1 130 ohms from the center terminal output of the generator to one
of the 300 ohm inputs of the tuner.
R2 150 ohms from the shield side of the generator to the other 300
ohm input of the tuner.
R3 62 ohms which goes directly across the generator output.
This will probably be OK if the receiver input is completely floating,
with no reference to ground (including a centre tap on receiver input
coil or transformer - if it has one).
In this configuration, the readings on the dial are approximately
twice the actual output. In other words, if the dial is reading
10µV, then there is only 5µV going to the tuner.
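A quick check of that three-resistor pad, assuming the tuner really is a floating resistive 300 ohms, bears out the 'dial reads roughly twice the actual level' rule, and shows both ports see close to their design impedances:

```python
# Resistive pad from the manual: R1 = 130 (generator centre to one tuner
# input), R2 = 150 (shield side to the other), R3 = 62 across the output.
R1, R2, R3, R_GEN, R_TUNER = 130.0, 150.0, 62.0, 50.0, 300.0

def par(a, b):
    """Two resistors in parallel."""
    return a * b / (a + b)

# Impedance the generator sees: R3 in parallel with (R1 + tuner + R2).
z_in = par(R3, R1 + R2 + R_TUNER)          # ~56 ohms, close to 50

# Impedance the tuner sees looking back into the pad:
z_back = R1 + R2 + par(R3, R_GEN)          # ~308 ohms, close to 300

# Voltage at the tuner, as a fraction of the generator open-circuit voltage:
v_out = z_in / (R_GEN + z_in)              # divider at the generator terminals
v_tuner = v_out * R_TUNER / (R1 + R2 + R_TUNER)

# The dial assumes a matched 50 ohm load (0.5 of open-circuit voltage):
ratio = v_tuner / 0.5   # ~0.55, i.e. the dial reads about twice the actual level
```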
I reckon that you are better investing in a simple ferrite-cored
75-to-300 ohm balun / matching transformer (or make one - dead easy)
and allow for the small corrections discussed previously. I see that
Tim Schwartz has suggested 50-to-300 ohms but, to me, 75-to-300 ohms
is much easier as it is a simple turns ratio of 2:1. If you really
want 50-to-300 ohms, it would be simpler to add a couple of resistors
to act as a resistive 50-to-75 matching pad. But, in over 40 years in
cable TV, I never worried too much about mixing 50 and 75 ohm
impedances (but only where it didn't matter, of course!).
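For the 50-to-75 ohm resistive pad mentioned above, the standard two-resistor minimum-loss L-pad is one way to do it (this is a textbook design, not something from the Sound Technology manual):

```python
import math

def minloss_pad(z_hi, z_lo):
    """Minimum-loss resistive L-pad matching z_hi down to z_lo (z_hi > z_lo):
    a series resistor on the high-impedance side, a shunt across the low side."""
    r_series = math.sqrt(z_hi * (z_hi - z_lo))
    r_shunt = math.sqrt(z_hi * z_lo ** 2 / (z_hi - z_lo))
    return r_series, r_shunt

rs, rp = minloss_pad(75.0, 50.0)  # ~43.3 ohms series, ~86.6 ohms shunt

def par(a, b):
    return a * b / (a + b)

# Each port sees its design impedance when the other is terminated:
z_hi_in = rs + par(rp, 50.0)      # 75 ohms looking into the high side
z_lo_in = par(rp, rs + 75.0)      # 50 ohms looking into the low side

# Insertion loss: power delivered vs. maximum available from a 75 ohm source.
v_load = 0.5 * par(rp, 50.0) / (rs + par(rp, 50.0))
loss_db = 10 * math.log10((v_load ** 2 / 50.0) / (1.0 / (4 * 75.0)))  # ~ -5.7 dB
```

The price of the resistive pad is that ~5.7dB of loss, which is why the simple 2:1 ferrite balun is the nicer route when 75-to-300 is acceptable.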