Bob Myers said:
The definitions according to yourself and your
selected "authorities"? Yet again? How about some
reasoning, instead, for a change?
Every authority uses the same definitions. I've shown
you a list of URLs where, if you had looked, you would
have found exactly the same definition from each
different source.
I note that even though you deny it and claim there are
other definitions, you can't find even one that is
different.
See Shannon, "A Mathematical Theory of Communication,"
section V (27). Have someone help you with the big words,
as needed. When you're finished, find us an example of
a noiseless channel, and demonstrate to us what you're
saying.
You clearly need some help understanding it...
Shannon never used the word "digital," though, and used
the term "bits" simply because that is the commonly-used
unit of information (in ANY system), per information and
Wrong. Shannon didn't use the word "digital" because
it was not a popular term for what he was working with
until *after* he published his work and the study of
digital signals became popular. Nobody except Shannon
was, at that time, thinking in terms of digitizing data,
signals, channels, and everything else. (Okay, not totally
true, as people like Alec Reeves certainly were too...)
Shannon didn't use the word "bit" because of any common
usage! John Tukey, a co-worker of Shannon's at Bell
Labs, had coined the term "bit" as a short form for
"binary digit" only the year before Shannon first used
it in his 1948 paper. It fit their needs, and later
became popular with others too.
In fact, as a "commonly-used unit of information (in
ANY system)", Claude Shannon was the very *first* person
to use the term that way! He certainly did not use it
because of any common usage, given it had been used by
only one other person, and then with a slightly different
meaning.
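For what it's worth, the reason "bit" is the natural unit is Shannon's entropy formula, H = -sum(p * log2(p)): with base-2 logarithms, the answer comes out in binary digits. A minimal sketch (plain Python; the example distributions are made up for illustration):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss;
# one of four equally likely symbols carries exactly 2 bits.
print(entropy_bits([0.5, 0.5]))              # 1.0
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

The same formula applies to ANY source alphabet, which is exactly why the unit is not tied to any particular system.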
communications theory as it was established at the time
The theory had not been established at the time. It was
Shannon's work that started "Information Theory". While
Hartley and Nyquist had theorized about various things,
and demonstrated certain effects, Claude Shannon was the
one who mathematically *proved* them, thus providing
various theorems for future research.
(and is still in use today). Given that you've admitted to
not being an authority in the field, I wouldn't expect that
One thing that is quite obvious, Bob, is that I am
significantly more authoritative in the field of
communications than you are. Not that that is saying
much...
you understood that before, so I'm more than happy to
give you this chance to learn. You're welcome.
It would have been nice if you had known what you
were talking about before you again spouted nonsense.
Right - he defined a DISCRETE channel. The only
Yes. You'll remember that the definition of a digital
signal is that it has discrete values. Guess what
Shannon discussed... a channel for transmission of
discrete values. The way we describe that today is
"digital", and Shannon's "discrete channel" is exactly
what a digital channel is.
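To make the "discrete channel" notion concrete: in the noiseless case Shannon defines capacity as C = lim (log2 N(T))/T, and when the channel carries one of n equally long symbols per unit time that collapses to log2(n) bits per symbol. A minimal sketch (plain Python; the alphabet sizes are just illustrative):

```python
import math

def noiseless_capacity(num_symbols):
    """Capacity of a noiseless discrete channel carrying one of
    num_symbols equally long symbols per unit time, in bits."""
    return math.log2(num_symbols)

print(noiseless_capacity(2))   # binary channel: 1.0 bit per symbol
print(noiseless_capacity(32))  # 32-symbol teletype-like alphabet: 5.0 bits
```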
tie between "discrete" and "digital" appears to exist solely
in your own mind, since your cherished definitions of
That is one of the most hilarious things you've said
in all of this. You flat deny the very reason that
these terms exist, ignore all of their history, and
proclaim yourself an expert. Astounding.
"quantized" also unfortunately neglect to use the word
"digital" at any time. Shannon did NOT refer to either
"digital" or "analog" channels at all.
Of course he didn't. And in 1948 nobody at all was
using those terms to describe a communications channel,
of any kind.
That usage did not become popular until after Shannon's
papers were published. And it should be obvious to
anyone who can read English that the standard
definitions of "digital" and "analog" (as applied to
communications, transmission systems, signals, data,
etc., etc.) were selected to precisely match what
Shannon's papers analyzed, and were taken directly from
those papers. (Heh, you wanted evidence and reasoning;
well, there you are. The evidence and the reasoning.)
Wrong again. He discussed the theory of CONTINUOUS
channels, which is how they are consistently referenced in
his paper. In fact, the term "analog/analogue" does not appear
even once in Shannon's paper.
Exactly. The terms digital and analog only became
popular *after* Shannon's paper was published, and as
noted above the standardized definitions are
specifically intended to reference the papers that
Shannon published.
"Analogous" appears a grand
total of six times, each time with the clearly-accepted meaning
of "similar to," as opposed to referring to any specific class of
signals. Similarly, "digital" does not appear at all. Shannon
correctly did not make any distinction between "analog" and
"digital" encoding in terms of information capacity or content,
as his theorems apply to any and all systems.
Nobody else was using those terms in that way at the time. It
was *because* of Shannon's work that such terms came into
existence *after* Information Theory became a popular topic
for research.
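And for the continuous channel, the result everyone now standardizes on is Shannon's capacity for a band-limited channel with white noise: C = W * log2(1 + S/N). A minimal sketch (plain Python; the 3000 Hz / 30 dB figures are just illustrative, roughly a voice-grade telephone line):

```python
import math

def continuous_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = W * log2(1 + S/N) for a band-limited
    channel with additive white Gaussian noise, in bits/second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative voice-grade line: 3000 Hz bandwidth, 30 dB SNR (1000:1)
print(continuous_capacity(3000, 1000))  # roughly 30 kbit/s
```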
Or it would have been, had Shannon actually provided
any such definitions. Too bad he didn't.
Too bad you didn't realize the direct connection!
Are you actually so brazen as to claim that the "discrete
channel" term that Shannon used is not precisely
describing what we call a "digital channel" today? Or
that his "continuous channel" is not what we call an
"analog channel"? I don't mean something roughly the
same, close, or similar... I mean the terms are
*exactly* identical, in all respects.