Bob Myers said:
No, Don had it right. A quantized analog signal
You can repeat that all you like, but you are wrong
every time you do.
By *definition* it is a digital signal.
quantization:
A process in which the continuous range of values
of an analog signal is sampled and divided into
nonoverlapping (but not necessarily equal)
subranges, and a discrete, unique value is assigned
to each subrange.
A _sampled_ signal is still analog. A _quantized_ signal is
digital by definition.
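To make that distinction concrete, here is a minimal Python
sketch; the sine input, the 8000 Hz sampling rate, the
16-level quantizer, and every name in it are illustrative
assumptions, not anything quoted in this thread. sample()
produces discrete-time values whose amplitudes are still
continuous, while quantize() divides the amplitude range
into nonoverlapping subranges and assigns each sample the
discrete index of its subrange, as the definition above
describes.

    # Illustrative sketch only; names and parameters are assumptions.
    import math

    FS = 8000.0        # assumed sampling rate, Hz
    N_LEVELS = 16      # assumed number of quantization subranges

    def sample(n):
        """Discrete-time sample of a 1 kHz sine; the amplitude
        is still continuous-valued."""
        return math.sin(2 * math.pi * 1000.0 * n / FS)

    def quantize(x):
        """Divide the range [-1, 1] into nonoverlapping subranges
        and assign a discrete, unique value (the subrange index) to x."""
        index = int((x + 1.0) / 2.0 * N_LEVELS)
        return min(index, N_LEVELS - 1)   # clamp the x == 1.0 edge

    sampled = [sample(n) for n in range(8)]      # sampled: amplitudes continuous
    quantized = [quantize(x) for x in sampled]   # quantized: discrete level indices

Whether that second list is already "digital" or merely
"quantized" is, of course, exactly the point in dispute here.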
If you do not stay with standard definitions, it is
impossible to discuss anything rationally.
remains analog as long as the relative values of the
quantization levels, one to the other, have significance;
they thus can carry information, which is the fundamental
goal of any such system.
The quantization levels are digital. By definition.
If that isn't what you mean, then you need to use other
words because you are confusing the issue by misuse of
standard terms.
Now, we could certainly assign values to those levels
which (for instance) are NOT in order from "top to
bottom" (or whichever direction you choose to use);
that might be done to distribute the noise
susceptibility of any given "bit" in the value evenly.
In this case, the levels MUST be interpreted as the
intended numeric values in order to recover the
original information, and hence this would be a
"digital" encoding system.
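As one concrete, purely illustrative reading of that
paragraph, a Gray-code mapping is a well-known way to
assign code words to quantization levels "out of order":
a one-level slip caused by noise then changes only a
single bit, and the receiver must decode each code word
back into the intended numeric value to recover the
original information. The names below are hypothetical,
not taken from any post in this thread.

    def level_to_gray(level):
        """Map a quantization level index to its Gray code word."""
        return level ^ (level >> 1)

    def gray_to_level(code):
        """Invert the mapping: interpret the code word as the
        intended numeric value of the level."""
        level = 0
        while code:
            level ^= code
            code >>= 1
        return level

    levels = list(range(8))                    # a 3-bit quantizer: levels 0..7
    sent = [level_to_gray(v) for v in levels]  # 0,1,3,2,6,7,5,4 -- not in order
    recovered = [gray_to_level(c) for c in sent]
    assert recovered == levels                 # recovered only by decoding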
Exactly. But mere quantization by itself does not
suffice to render a signal "digitally encoded," no
matter what a given government "expert" may claim.
The quantization of a signal makes it digital.
(It *is* encoded, too, BTW. But until you understand
what makes it digital, there is little point in trying
to define what "encoded" means.)
No matter how dense you want to be about it, that
government "expert" happens to be right. And you cannot
find *any* expert who will disagree. That is the
*standard* definition, and virtually *everyone* agrees
that it is correct.