John Fields said:
---
They really aren't, I think. If you buy the bit about the smallest
part of _any_ signal being discrete, then you must also buy the
premise that nothing composed of discrete particles can be
continuous, since the whole is the sum of its parts and the smallest
part comprising the whole is discrete.
True; eventually, you run into discrete "things" at SOME level.
My point was, though, that whether something is considered
"discrete" or "continuous" actually has very little to do with
whether it is "analog" or "digital," even though most "analog" forms
of communication are generally treated as "continuous," and most
"digital" forms as "discrete."
We watch movies and perceive them as being continuous, yet they're
presented to us one frame at a time (one pixel at a time if you're
watching TV or if you're reading this), and the pixels that make up
each frame are themselves composed of discrete numbers of photons,
either allowed to pass through or blocked by bunches of individual
dye molecules, which are made up of bunches of individual atoms,
which are made up of...
Sure; the distinction between "discrete" and "continuous" at pretty
much any level has to do with the limitations of human perception.
On the other hand, re the above, if you really want to get down to the
smallest levels, we're going to run into the whole wave/particle duality
thing - and I don't EVEN wanna go THERE...
I agree in the sense that if a zillion photons impinging on a surface
caused a measurable effect, and the measured effect doubled when two
zillion photons impinged on that same surface, then that relationship
might be termed linear. In other words, Y=kX.
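As a toy statement of that test in Python (the "zillion" figure and
the response functions here are made up purely for illustration):

    def is_linear(response, rel_tol=1e-9):
        # Y = kX property: doubling the input should double the output.
        x = 1.0e21                 # "a zillion" photons, arbitrarily
        y1 = response(x)           # measured effect at one zillion
        y2 = response(2 * x)       # measured effect at two zillion
        return abs(y2 - 2 * y1) <= rel_tol * abs(y2)

    print(is_linear(lambda x: 3.0 * x))    # True:  Y = kX holds
    print(is_linear(lambda x: x ** 0.45))  # False: a gamma-type response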
But "linear" is also irrelevant. There are a number of signalling
methods which are generally considered to be "analog" by any
reasonable, PRACTICAL definition of the word, and yet do
not involve linear encoding of the information. (Video is again
possibly the most obvious example; the transformation from
light levels to signal amplitude is REQUIRED to be non-linear
in just about all standards, for some very good reasons.)
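For the curious, here's a rough Python sketch of that sort of
transfer. The numbers follow the Rec. 709 camera curve, one common
broadcast standard; other standards differ in the details:

    def rec709_oetf(light):
        # Rec. 709 opto-electronic transfer function: a short linear
        # segment near black, then a power-law (gamma) segment above.
        # Input is a linear light level in 0.0..1.0; output is signal
        # amplitude in 0.0..1.0.
        if light < 0.018:
            return 4.5 * light
        return 1.099 * light ** 0.45 - 0.099

    # Doubling the light level does NOT double the signal amplitude:
    print(rec709_oetf(0.25))   # ~0.49
    print(rec709_oetf(0.50))   # ~0.71 -- clearly not Y=kX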
I don't think there's any way we can get around quantization (and its
attendant requirement for particulate quantities) unless you can,
somehow, show that there are infinitesimally subdividable states
which exist between the smallest infinitesimally divisible states
which can't be further divided but which must instead be smeared.
Sure - but again, mere "quantization" has nothing to do with
whether or not we consider a signal to be "analog" or "digital" in
any practical sense. Fundamentally, both analog and digital
methods of encoding information into ANY real-world signal run
into what is really the same limit, just viewed from two different
perspectives. That limit is the inability of the receiving device or
entity to distinguish "adjacent" meaningful states (symbols) in the
information stream, due to the corrupting effects of what must
generally be called "noise in the channel." It doesn't matter what
the source of that "noise" is - there is ALWAYS a fundamental
limit to the number of states that can be reliably distinguished, and
hence on the information capacity of the channel. That's the basic
notion behind the Shannon equation for information capacity, which
(despite typically giving a result in "bits/second") is NOT talking
about either "analog" or "digital" methods specifically. (Here,
"bits" is being used in the information theory sense as the smallest
possible unit of information - the answer to a yes/no question. It
is a concept which is as applicable to "analog" systems as to "digital,"
although admittedly the linkage is not as intuitive or obvious in the
"analog" case.)
My bottom-line point is that the TERMS "analog" and "digital"
really just apply to different methods of encoding or interpreting
information. They are not the same as "linear" or "continuous," or
"discrete" or "quantized," respectively. Each of these words has
a perfectly good meaning already, and we're just getting sloppy
in our thinking when we confuse them.
Bob M.