Jim Kelley said:
Here are some valid standard definitions:
Actually, they are good definitions, but they are *not*
"valid standard definitions" for this discussion. You
are citing a dictionary of _common_ English, as spoken
by the general population. But we are discussing what
is called a "term of art".
Term of Art:
technical word: a word or phrase with a special
meaning, used in a specific field of knowledge
In other words, the meaning may or may not be the same
when the term is used in the information or
communications industry as when it is used by the
general population of English speakers.
It does happen that in this case there is no significant
difference, and your definitions are useful
illustrations, but they are not very precise, while the
term of art definitions are *very* precise.
"quantize - to subdivide into small but measurable increments."
(Merriam Webster's Collegiate Dictionary, Tenth Edition)
Note that in the definition, there appears no mention of
assigning a value.
It says "into small but *measurable* increments". That
is assigning a value, no more and no less. (Indeed,
it would be worthless otherwise.)
Whatever; here is what WordNet says:
quantize
v 1: telecommunications: approximate (a signal varying
continuously in amplitude) by one whose amplitude is
restricted to a prescribed set of discrete values [syn:
quantise]
2: apply quantum theory to; restrict the number of possible
values of (a quantity) or states of (a physical entity or
system) so that certain variables can assume only certain
discrete magnitudes that are integral multiples of a
common factor; "Quantize gravity" [syn: quantise]
They provide both a term of art definition and a common
usage definition. Both make it very clear that the
result is digital. They both use the word "discrete",
and *that* is indeed the key to defining "digital".
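To make that concrete, here is a rough Python sketch of
the telecom sense quoted above: each sample of a
continuously varying signal is approximated by the
nearest member of a prescribed set of discrete values.
The particular level set is my own invention, purely
for illustration.

    # Approximate each continuous-amplitude sample by the nearest
    # member of a prescribed set of discrete values.
    LEVELS = [-1.0, -0.5, 0.0, 0.5, 1.0]   # the prescribed discrete set

    def quantize(sample):
        # pick whichever level lies closest to the sample
        return min(LEVELS, key=lambda level: abs(level - sample))

    signal = [0.12, -0.73, 0.98, 0.31]      # continuously varying amplitudes
    print([quantize(s) for s in signal])    # -> [0.0, -0.5, 1.0, 0.5]

Note that the output can only ever take one of five
values; that restriction to a discrete set is exactly
what makes the result digital.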
Assigning a value would then be
considered a part of a separate and distinct process of
converting to digital form, as in
Well, except that it is clearly an intrinsic part of
quantization, you are right. Of course, that also
clearly negates your point.
Indeed, if we do look at a "valid standard definition"
for the term of art,
quantization:
A process in which the continuous range of values
of an analog signal is sampled and divided into
nonoverlapping (but not necessarily equal)
subranges, and a discrete, unique value is assigned
to each subrange.
From Federal Standard 1037C.
We can see that it *clearly* does mean making the signal
digital. That is the *only* purpose of quantization.
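That definition maps directly to code. A minimal Python
sketch, with a made-up 0-to-1 volt range split into four
subranges (my numbers, not 1037C's):

    import bisect

    # Nonoverlapping subranges of the 0..1 V analog range, per the
    # 1037C wording; each subrange gets a discrete, unique value
    # (here simply its index).
    BOUNDARIES = [0.25, 0.5, 0.75]

    def quantize(sample):
        # the subrange index the sample falls into *is* the assigned value
        return bisect.bisect(BOUNDARIES, sample)

    for v in [0.1, 0.3, 0.49, 0.9]:
        print(v, "->", quantize(v))   # 0.1->0, 0.3->1, 0.49->1, 0.9->3

Both steps the standard names, the division into
subranges and the assignment of a value, happen
together; neither is a "separate and distinct process".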
"digital - of, or relating to data in the form of numerical digits",
That is one of several common English definitions.
It is rather poorly stated if one is thinking of the
term of art used in the communications/information
industries, simply because it will confuse people (just
as you were confused above by "measurable increments"
as opposed to an explicitly stated "value").
Not all things that are in the *form* of numerical
digits are obviously so. For example, the data might be
encoded as differences between flags: round, square, and
triangular. That would in fact be a digital signaling
system, and the flags are in fact "in the form of
numerical digits", though it might not be immediately
obvious.
and as opposed to
"analog - of, relating to, or being a mechanism in
which data is represented by continuously variable
physical quantities."
Again, that is close, but it is an imprecise common
usage definition. It does not make it clear that the
*value* of the data is continuous, and that merely being
represented using some physical characteristic that is
continuously varying is *not* what it means. It could
easily be misconstrued (and commonly is) to mean, for
example, that because a binary digital system using
voltage to encode data does not have *instant* rise and
fall times, it is in fact an analog system, which it is
not.
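A small Python sketch of why the finite rise time does
not matter: the receiver only asks which side of a
decision threshold the voltage is on at each sampling
instant, so the in-between voltages during an edge carry
no data at all. The threshold and sample values here are
invented for illustration.

    THRESHOLD = 1.5   # volts; halfway between nominal 0 V and 3 V levels

    def decide(volts):
        # all that matters is which side of the threshold we are on
        return 1 if volts > THRESHOLD else 0

    # samples taken mid-bit, after the (non-instant) edges have settled
    samples = [0.2, 2.9, 3.1, 0.1]
    print([decide(v) for v in samples])   # -> [0, 1, 1, 0]

The *values* the system can represent are still just 0
and 1, a discrete set, which is what makes it digital;
the continuously varying voltage is merely the carrier.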