Maker Pro

Questions about equivalents of audio/video and digital/analog.

glen herrmannsfeldt

Don Pearce wrote:

(snip)
No, you haven't. You merely have a signal at a set of discrete levels.
You need an analogue to digital converter to take each of those
quantized levels and convert it into a digital word (of 1s and 0s).
Digital means "represented by digits", not "in discrete voltage
steps".

Now it is getting complicated. Once it is quantized it "could"
be represented by digits. Whether you actually have to do that,
I am not so sure.

I haven't followed quantum computing so carefully, but it might
be possible to do computing on discrete voltage levels that
haven't been converted to digits. (And note that the usual
representation of a digital signal is by voltages on wires.)

-- glen
 
Bob Myers

I don't mean to imply that there may not be idiot-savants
on the interweb. Al Einstein himself may easily have been
perceived as a troll if he were online :)

And let's not forget Alfred Nobel's half-brother
Ignatz, the benefactor behind the Ig Nobel prize,
awarded for outstanding contributions to that
very field...;-)

Bob M.
 
glen herrmannsfeldt

Scott Seidman wrote:

(snip)
Doesn't "analog" also imply that x(t) exists for all t in range, and not
just at nT for all n in range? Or would people just call that "sampled"?

Yes, that would be "sampled". Since analog tends to imply continuous
(non-sampled) it would probably be best to use "sampled analog" for
non-continuous non-quantized data.

-- glen
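[Editor's note: the three-way distinction glen draws here (sampled, quantized, digital) can be sketched numerically. The sine frequency, sample rate and step size below are arbitrary illustrative choices, not anything from the thread.]

```python
import math

def x(t):
    """A continuous-time, continuous-level sine: plain analogue."""
    return math.sin(2 * math.pi * 1000 * t)

fs = 8000  # sample rate in Hz (arbitrary choice)

# 1) "Sampled analog": discrete in time, still continuous in level.
sampled = [x(k / fs) for k in range(8)]

# 2) Quantized: every sample snapped to one of a limited set of levels.
step = 0.25
quantized = [round(v / step) * step for v in sampled]

# 3) Digital: each quantized level represented by a number (an integer code).
codes = [round(v / step) for v in sampled]

print(codes)  # -> [0, 3, 4, 3, 0, -3, -4, -3]
```

Whether stage 2 alone already deserves the name "digital" is, of course, exactly what the rest of the thread argues about.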
 
"Martin Heffels" wrote ...



I don't mean to imply that there may not be idiot-savants
on the interweb. Al Einstein himself may easily have been
perceived as a troll if he were online :)

There is NO mistaking Albert Einstein for Radium. Even
if you disagreed with Einstein, his math was impeccable
and self-consistent and provided a plausible explanation
for observed phenomena that was at variance with
Newtonian physics.

Radium, on the other hand, is simply a blithering idiot.
 
Floyd L. Davidson

No, you haven't. You merely have a signal at a set of discrete levels.

Sheesh! That *is*, by definition a digital signal.
You need an analogue to digital converter to take each of those
quantized levels and convert it into a digital word (of 1s and 0s).

Digital means "represented by digits", not "in discrete voltage
steps".

Bullshit son. Look it up. I've provided you with
quotes from an authoritative reference, twice now. You
don't have to take my word for it, that *is* the agreed
technical definition of the term.
 
Floyd L. Davidson

Sorry, but that is simply nonsense. A signal that is sampled in time,
but not quantized is an analogue signal. It is treated and processed
by analogue circuits. For a signal to be digital its sampled levels
must be represented by numbers, which are processed mathematically by
some sort of microprocessor.

That is, it must actually be quantized.

Perhaps that is what you meant to say earlier, but you
actually didn't, and said that the quantized signal has
to be represented by numbers, which it is by definition.
The signal can be reconverted to an
analogue one later by a D to A.

It's best to call that a quasi-analog signal...
 
Scott Seidman

[email protected] (Floyd L. Davidson) wrote in
That is by definition a digital signal. As soon as the possible values
are "constrained to a limited set", it is by definition digital data.

Wouldn't this make the output of a D/A converter digital by definition?
 
Scott Seidman

[email protected] (Floyd L. Davidson) wrote in
Sheesh! That *is*, by definition a digital signal.

Funny, that's just what my D/A converters put out, and the spec sheets
claim they're putting out analog signals. Perhaps I should return them.
 
Don Pearce

There's a gap in your understanding. The "segment" is the equivalent of
floating point's exponent, and the bits that divide the segment into
equal parts are like floating point's mantissa.

Jerry

No gap. The expressions are used simply to derive a set of
quantization points which, in the telephony systems that use them,
generate 8 bits of data - no floating point, which would need many more
bits to encompass a mantissa and an exponent. The result is just the
integer numbers -128 to 127.

d
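[Editor's note: Jerry's exponent/mantissa parallel and Don's 8-bit point can coexist, as a toy segmented encoder shows. The bit layout below (sign + 3-bit segment + 4-bit step) is a hypothetical sketch in the spirit of telephony companding, not the real G.711 code tables.]

```python
def toy_segment_encode(x):
    """Encode a linear value (|x| < 4096) into one 8-bit code.

    The 3-bit segment behaves like a floating-point exponent (each
    segment doubles the step size) and the 4 step bits like a mantissa,
    yet the output is still just an 8-bit integer.
    """
    sign = 0x80 if x < 0 else 0
    mag = min(abs(int(x)), 4095)
    seg = max(mag.bit_length() - 5, 0)   # which doubling range we are in
    step = (mag >> seg) & 0x0F if seg else (mag >> 1) & 0x0F
    return sign | (seg << 4) | step

def toy_segment_decode(code):
    """Decode back to the midpoint of the code's quantization cell."""
    sign = -1 if code & 0x80 else 1
    seg = (code >> 4) & 0x07
    step = code & 0x0F
    if seg:
        mag = ((16 + step) << seg) + (1 << (seg - 1))
    else:
        mag = (step << 1) + 1
    return sign * mag
```

Every one of the 256 codes survives a decode/encode round trip, and the code values grow monotonically with magnitude, which is all the integer arithmetic downstream ever sees.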
 
Don Pearce

That is, it must actually be quantized.

Perhaps that is what you meant to say earlier, but you
actually didn't, and said that the quantized signal has
to be represented by numbers, which it is by definition.
No it isn't - it can simply be a signal where the smooth curve has
been replaced by steps. If you want to process that signal you must do
so with analogue circuitry - amplifiers, filters etc. Representing
those steps by numbers is a different matter. Once you have done that,
you can no longer process in the analogue domain, you must use maths
on the numbers; that is what makes it digital.
It's best to call that a quasi-analog signal...
No it isn't. It is an analogue signal, because it is no longer
represented by digits.
d
 
Don Pearce

Sheesh! That *is*, by definition a digital signal.
If you put that signal through an analogue amplifier, it will be
amplified. That makes it an analogue signal. If you want to amplify a
signal in the digital domain, you must perform maths on the numbers.
Can you really not see the difference?
Bullshit son. Look it up. I've provided you with
quotes from an authoritative reference, twice now. You
don't have to take my word for it, that *is* the agreed
technical definition of the term.

Sorry, but you are wrong. And any reference you have found that makes
such a claim is not authoritative; it is also wrong.

d
 
Floyd L. Davidson

[email protected] (Don Pearce) said:
[...]
A "quantized analogue signal" is digital by definition.


No, you haven't. You merely have a signal at a set of discrete levels.
You need an analogue to digital converter to take each of those
quantized levels and convert it into a digital word (of 1s and 0s).

Digital means "represented by digits", not "in discrete voltage
steps".

I've never seen that definition, while I have seen the definition
Floyd is proposing, and I think it is a reasonable one.
No, it isn't. It misses the fact that sampled and digital are
different things. Digits are numbers.

Are you kidding? It is *the* industry standard
definition. It is not something that I made up, I
merely looked it up.

http://ntia.its.bldrdoc.gov/fs-1037/

That is, since you seem unable to grasp or investigate
it, the web site of the National Telecommunications and
Information Administration, a part of the US Federal
Department of Commerce, in Boulder Colorado. Which is
to say they are next door to and under that same
management as the NIST (the National Institute of
Standards and Technology), and NOAA (National Oceanic
and Atmospheric Administration) which you may also have
heard of...

Or, to put it another way, you will not find anywhere in
the world a valid definition that disagrees with that
one. If yours is not in agreement, you are *wrong*.
Really? Can you point me at something that does DSP on signals that
have been merely sampled in time? I've never come across any such
thing.


Logic will do.

Logically you are walking the plank. Such technical
definitions have nothing to do with logic. It is an
arbitrary decision that it means this or it means that.
If we all agree on the arbitrary decision then we have a
standard, and we can use it knowing that others will
understand what it means.

Until someone like you walks in and says they have their
own definition...
If you are doing digital signal processing, you are
doing arithmetic on the numbers that come out of an AtoD converter.

That is not necessarily true. Not all digital signals
originate as analog signals that require A->D
conversion.
You can't do that with some voltage levels out of a quantizer.

Out of a quantizer? You certainly can.
As for discrete time, that is simply sampled, like a class D
amplifier, and nothing to do with digits. There is plenty of laziness
in the use of nomenclature (as well as misuse by people who simply
have no idea what they are talking about).

I totally agree with that statement. ;-)
 
Don Pearce

[email protected] (Floyd L. Davidson) wrote in


Wouldn't this make the output of a D/A converter digital by definition?

It certainly would. But apparently there are those that can't see the
difference between a limited set of values, and a set of numbers
describing those values.

d
 
Floyd L. Davidson

Sorry, but that isn't DSP, it is just calculating the power. Let me
put this very simply. If you have a quantized signal and you want to
make it twice as big, can you do that with an amplifier, or do you do
it mathematically? If the signal is quantized, an amplifier will do
it. If it is digitized it won't. You can amplify 0110111001 all you
like, you will still have 0110111001.

You are confused. You are talking about sampling, not quantizing.
Current usage is just fine. A digital signal is one composed of
digits.

And a "digit" is nothing other than a discrete value. Hence
you have a valid definition, but don't recognize what it means.
 
Don Pearce

[email protected] (Don Pearce) said:
[email protected] (Don Pearce) writes:
[...]
A "quantized analogue signal" is digital by definition.


No, you haven't. You merely have a signal at a set of discrete levels.
You need an analogue to digital converter to take each of those
quantized levels and convert it into a digital word (of 1s and 0s).

Digital means "represented by digits", not "in discrete voltage
steps".

I've never seen that definition, while I have seen the definition
Floyd is proposing, and I think it is a reasonable one.
No, it isn't. It misses the fact that sampled and digital are
different things. Digits are numbers.

Are you kidding? It is *the* industry standard
definition. It is not something that I made up, I
merely looked it up.

http://ntia.its.bldrdoc.gov/fs-1037/

That is, since you seem unable to grasp or investigate
it, the web site of the National Telecommunications and
Information Administration, a part of the US Federal
Department of Commerce, in Boulder Colorado. Which is
to say they are next door to and under that same
management as the NIST (the National Institute of
Standards and Technology), and NOAA (National Oceanic
and Atmospheric Administration) which you may also have
heard of...

Or, to put it another way, you will not find anywhere in
the world a valid definition that disagrees with that
one. If yours is not in agreement, you are *wrong*.
You have misunderstood what is meant by the definition. It is not
intended to describe quantized signals, but data in which numbers
represent the values at sequential time steps. It is very clear that when those definitions
were being written, nobody on the committee was thinking of quantized
analogue signals.
Logically you are walking the plank. Such technical
definitions have nothing to do with logic. It is an
arbitrary decision that it means this or it means that.
If we all agree on the arbitrary decision then we have a
standard, and we can use it knowing that others will
understand what it means.

Until someone like you walks in and says they have their
own definition...


That is not necessarily true. Not all digital signals
originate as analog signals that require A->D
conversion.
Of course not. They can start life in a computer or whatever. Are you
trying to confuse the issue with a red herring?
Out of a quantizer? You certainly can.
You can do arithmetic on the output of a quantizer? How do you do
that, it is not in a numeric form. If you want to do arithmetic on it,
you must first digitize it.
I totally agree with that statement. ;-)

Well, that is a start!

d
 
Don Pearce

You are confused. You are talking about sampling, not quantizing.
No I'm not. Let me explain with an example. Suppose I have a ramp that
changes smoothly from 0 to 1 volt. Now I quantize it in steps of 0.1
volts. I now have a staircase that rises in 0.1V steps from 0 to 1
volt. If I put that through an amplifier with a gain of 2, I will get
a staircase from 0 to 2 volts. I can put it through an amplifier
because it is still an analogue signal.

If I digitize the signal, I will get a set of signals which might be
0000, 0001, 0010, 0011 etc. If I want to apply a gain of 2, I can't
use an amplifier, I will have to use maths. In the case of applying a
gain of 2 it is easy - the result is 0000, 0010, 0100, 0110 etc.

That is the difference between an analogue signal (whether quantized
or not) and a digital one.
And a "digit" is nothing other than a discrete value. Hence
you have a valid definition, but don't recognize what it means.

A digit is a number.

d
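[Editor's note: Don's staircase example can be sketched numerically. Working in integer millivolts is an arbitrary choice here, made only to avoid floating-point fuzz.]

```python
# A smooth 0-1 V ramp, represented here in millivolts.
ramp_mv = list(range(0, 1001, 10))        # 0, 10, ..., 1000 mV

# Quantize to 0.1 V (100 mV) steps: still a voltage waveform, a staircase.
quant_mv = [(v // 100) * 100 for v in ramp_mv]

# Analogue gain of 2: an amplifier simply scales every voltage level.
amplified_mv = [2 * v for v in quant_mv]  # staircase from 0 to 2000 mV

# Digitize: represent each step by a number instead of a voltage.
codes = [v // 100 for v in quant_mv]      # 0, 0, ..., 10

# Digital gain of 2: arithmetic on the numbers, e.g. 0011 -> 0110.
doubled = [2 * c for c in codes]

print(amplified_mv[-1], doubled[-1])      # -> 2000 20
```

Both routes double the signal; the dispute is over whether the staircase in `quant_mv` is already "digital" before the `codes` step.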
 
Jerry Avins

Scott said:
[email protected] (Floyd L. Davidson) wrote in


Wouldn't this make the output of a D/A converter digital by definition?

Of course it would. I think it's a bit silly (pretty stupid, actually)
to argue about what to call something and believe that's the same as
arguing about what it is. One could say that a continuous signal
measured with a 3.5-digit meter is quantized by the measurement even if
it's unchanged thereby. And if the signal is recorded hourly in a log
book, I suppose it becomes sampled. Is it worth trying to make a
definition that withstands all possible logical contortions? Probably
sometimes, but not here; not now.

Jerry
 
Floyd L. Davidson

Bob Myers said:
OK, it's not reasonable to ME, either, if you're impressed
by taking a vote on this sort of thing.

The problem with the definition that you and Floyd seem to
want to use is that it leads to several problems in both
theory and practice, in addition to the fact that there are
numerous counter-examples one can point to.

It doesn't lead to any such problems.

What you need to get straight is that it is not *my*
definition. It is the *standard* technical definition
recognized by virtually *every* standards organization.
I quoted the NTIA's Federal Standard 1037C.
"Reasonable" would seem (at least to me) to mean that you

It makes no difference what you think is or is not
reasonable, unless we want to discuss *you*. If you
disagree with the standard definition then you don't
understand the term, and we can determine how far off
you are by how much your definition differs from that
one! ;-)
can justify your definition *through reason*, which Don has
done.

Which proves that he doesn't understand it. It says
nothing about what the National Telecommunications
and Information Administration knows, or what the MilStd
specification says.
Simply pointing to a published work, including a
standard, as a reference to support your definition is what's
called an "argument from authority," and it has exactly zero

That is a logical fallacy on your part. An "argument
from authority" has great weight if it is valid. To
be valid it must pass three tests:

1) The authority cited must actually be an authority.
2) All authorities must agree on the topic.
3) The authority cannot be misquoted, taken out of
context, or be joking.

Clearly citing the NTIA and MilStd definition is indeed
a *very* strong appeal to authority, and no mere opinion
can even come close to invalidating it.
weight in light of an opposing argument based on evidence
and logic.

What evidence? And the logic is clearly invalid and
based on false assumptions.

You know one way to be absolutely positive that your
logic is not good is to do a reality check and find that
the answer you have is wrong. In this case that is very
easy to do, which is why *standard* definitions are
quoted from authoritative sources. If you disagree,
then clearly you *don't* have the logic right!
However, if you like, I can also point to several
references which support the definition that Don and I (and

So cite even one such valid reference! (You *cannot*,
because there are none.)

(And recognize that if you think you have one, then
there is one of two things clearly true: Either 1) you
do not understand that the other definition is not
actually different, or 2) your reference is not a valid
one.)
I believe others) are proposing. You might claim the list to
be invalid, however, since it would contain works that I
myself wrote for publication. Which is, of course, the whole

You are not a valid reference. You don't even come
close to being equal to the NTIA.

And it is *hilarious* that you would (again, because
this isn't the first time) try to convince anyone that
you are.
point - simply having your statements published does NOT
make them any more or less correct; the deciding factor is
whether or not they can be shown to be true through evidence
and logic.

Except technical definitions are sometimes merely
arbitrary agreements on one of many possible logical
ways to define a term. We could have decided that
"digital" means binary, or a decimal system. We didn't,
but both would be logical.
A common misuse or misunderstanding does not become
less so merely because it IS common.

Hmmm...
 
Jerry Avins

Don said:
No gap. The expressions are used simply to derive a set of
quantization points which, in the telephony systems that use them,
generate 8 bits of data - no floating point, which would need many more
bits to encompass a mantissa and an exponent. The result is just the
integer numbers -128 to 127.

Oh? The concept of floating point prescribes a certain number of bits?
That you fail to see the parallel doesn't mean there isn't one.

Jerry
 
Floyd L. Davidson

Quantization isn't important. If you don't quantize all it means is
that you are dealing with floating point rather than integer numbers.
Still digital of course. I can't think of any floating point ADC's off
hand, of course.

Floating point is analog, integer is digital.

....
We're all guilty of sloppiness. What is important is that we are able
to understand and work with the fine differences when they matter.

That is only true if we use standard definitions.
 