Maker Pro

Questions about equivalents of audio/video and digital/analog.


Bob Myers

Floyd L. Davidson said:
You don't appear to understand that the limited set of values makes
it digital, by definition. PERIOD.

More argument from authority. Yawn.

Bob M.
 

Bob Myers

It is in fact! It's a digital PAM signal. Indeed, v.90 modems
make use of it.

That's funny, so do the analog inputs of a PC monitor.
Ya just gotta wonder - how do they KNOW? :)
However, just as you can convert an analog signal to digital, you
can indeed convert digital to analog. One method is to produce a
digital PAM signal and run it through an analog channel.
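
As a rough sketch of that method (the four-level table and helper names below are purely illustrative, not taken from v.90 or any other standard), mapping bits onto a small set of pulse amplitudes and sending those amplitudes as ordinary voltages looks like this:

# Minimal sketch: map data bytes onto a small set of pulse amplitudes
# (PAM), which can then be sent over any channel that carries voltages.
# The level table and names here are illustrative, not from v.90 or any
# other standard.

import random

LEVELS = [-3.0, -1.0, 1.0, 3.0]  # 4-level PAM: 2 bits per symbol

def bytes_to_symbols(data: bytes) -> list:
    """Split each byte into four 2-bit groups and map each to a level."""
    symbols = []
    for b in data:
        for shift in (6, 4, 2, 0):
            symbols.append(LEVELS[(b >> shift) & 0b11])
    return symbols

def symbols_to_bytes(symbols: list) -> bytes:
    """Recover bytes by deciding which level each received sample is nearest to."""
    out = bytearray()
    for i in range(0, len(symbols), 4):
        b = 0
        for s in symbols[i:i + 4]:
            b = (b << 2) | min(range(4), key=lambda k: abs(LEVELS[k] - s))
        out.append(b)
    return bytes(out)

data = b"hello"
tx = bytes_to_symbols(data)
# Pretend the "analog channel" adds a little noise to each voltage.
rx = [s + random.uniform(-0.4, 0.4) for s in tx]
assert symbols_to_bytes(rx) == data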

Floyd, help me out here - is a length of coax an "analog
channel" or a "digital channel"? Mine don't seem to be
labelled....

Bob M.
 

Don Bowey

Arny Krueger wrote:

...


Scope? SCOPE? Isn't that an analog device? True digital work is done with
pencil, paper, and calculator. Theory rules!

Jerry

Since I see no smiley face to let us know you are jesting, I have to believe
you really mean it. In which case I have to believe you aren't as bright as
you were sounding. A final test of applied theory is..... Does it work as
intended? Where's my scope?
 

Bob Myers

Floyd L. Davidson said:
That is, since you seem unable to grasp or investigate
it, the web site of the National Telecommunications and
Information Administration, a part of the US Federal
Department of Commerce, in Boulder Colorado. Which is
to say they are next door to and under that same
management as the NIST (the National Institute of
Standards and Technology), and NOAA (National Oceanic
and Atmospheric Administration) which you may also have
heard of...

Or, to put it another way, you will not find anywhere in
the world a valid definition that disagrees with that
one. If yours is not in agreement, you are *wrong*.

And here, kids, we see the entire heart and soul of
Floyd's argument. "My definitions are correct, because
they come from a source that I considered to be correct.
Any that aren't in agreement with these definitions are
wrong, since they aren't what I consider correct."

Could it possibly GET any more circular than that?

By the way, NIST is just up the road from me: I've
contributed to (and corrected) several standards that
NIST personnel were developing. None of those people,
by the way, showed any evidence of halos or made any
claims of infallibility.

But by Gawd, Floyd will trust them implicitly, because,
after all, they're the GUMMINT!!!!!!
Logically you are walking the plank. Such technical
definitions have nothing to do with logic.

At least in your case, this is obvious.
It is an
arbitrary decision that it means this or it means that.
If we all agree on the arbitrary decision then we have a
standard, and we can use it knowing that others will
understand what it means.

Hardly arbitrary. Floyd, have you ever done ANY
standards work at all? Erroneous definitions in such
standards tend to stand until someone walks into the
committee meetings, notices the problem, and makes
a compelling *logical* argument as to what the problem
is and how to correct it. Then it gets cleaned up in
the next revision. This does NOT mean that the older
revision was correct up until the point of change, you
know...

Bob M.
 

Don Bowey

And you, like Arny, probably have no idea what you'd
see on a decent scope anyway.

I'd like Arny to explain how he can look at a scope and
tell if a single cycle of a sine wave is an analog signal
representing one cycle of a pure tone, or is just a digital
signal that represents 8000 different bytes of data from
a digital image.

At this point I need to say "who cares how it was generated?" It sure won't
need to be passed through any analog channel to make it analog. It is
analog by the time it gets off the board that generated it.
 

Bob Myers

You just digitized it. You can no longer have a value of 0.15 volts.

You can't have a value of 0.15 volts, but it's still an
"analog" signal and may be interpreted as such. Consider
the example of a gray-scale-bar pattern in an ANALOG
video system, mentioned earlier. The levels of the video
signal are ANALOGOUS to the desired luminance level,
and that's all it takes to be "analog."
No. By definition it is not. With an analogue signal you have
(technically) an infinite number of values between an input of
0.1 and 0.2. With digital you do not.

Well, by Definitions According To Floyd it's not, but
by any rational thought process Don is precisely right.
And there can never be an "infinite" number of values
available in any signal, digital OR analog, per the
Gospel According To St. Shannon.
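
The Shannon point can be put in numbers; the 3.1 kHz bandwidth and 30 dB signal-to-noise ratio below are just convenient illustrations, not claims about any particular channel:

# Back-of-the-envelope illustration: with any noise at all, only a
# finite number of amplitude levels can be told apart.  Roughly,
# M = sqrt(1 + S/N) levels per sample, and the capacity of a
# band-limited channel is C = B * log2(1 + S/N).

import math

bandwidth_hz = 3100.0          # illustrative voice-band width
snr_db = 30.0                  # illustrative signal-to-noise ratio
snr = 10 ** (snr_db / 10.0)    # 30 dB -> power ratio of 1000

levels = math.sqrt(1.0 + snr)                  # ~31.6 distinguishable levels
capacity = bandwidth_hz * math.log2(1 + snr)   # ~30.9 kbit/s

print(f"distinguishable levels per sample: {levels:.1f}")
print(f"channel capacity: {capacity / 1000:.1f} kbit/s")
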
Digital doesn't mean numbers, it means discrete values.

Seems like they would've called it "quantized," then,
rather than using a term which contained the root word
"digit" within it. Oh, wait - people DO use "quantized"
whenever THAT word is applicable. Guess you must
be confusing the two, huh?

Bob M.
 

Bob Myers

What you need to get straight is that it is not *my*
definition. It is the *standard* technical definition
recognized by virtually *every* standards organization.

Really? Then I shouldn't be able to find any standards
organizations which use a conflicting definition, right?
That is a logical fallacy on your part. An "argument
from authority" has great weight if it is valid. To
be valid it must pass three tests:

1) The authority cited must actually be an authority.
2) All authorities must agree on the topic.
3) The authority cannot be misquoted, taken out of
context, or be joking.

But ANY argument from authority always takes a back
seat to an argument from evidence and reason, since
those arguments directly undermine item (1) above.
Prior to the very late 19th century, all "authorities"
could be quoted as saying that it was impossible to
create a heavier-than-air flying machine. They were
all wrong. There is a nearly-endless supply of similar
examples.
Clearly citing the NTIA and MilStd definition is indeed
a *very* strong appeal to authority, and no mere opinion
can even come close to invalidating it.

Well, it's very strong, I suppose, if you're impressed by
something simply being an NTIA or MIL standard;
if you've actually seen such things being put together,
you tend to lose a lot of reverence for them, and
certainly would never consider them to be infallible.
Standards also have a tendency to enshrine common
but erroneous thoughts, simply because they ARE
common and no one stops to question them before
they get put into the standard, simply BECAUSE
"everyone knows this" or "everyone says it."
Arguments from authority have a nasty habit of
breeding more "authority," through cycles of
repeated reference to incorrect notions.
You know one way to be absolutely positive that your
logic is not good is to do a reality check and find that
the answer you have is wrong. In this case that is very
easy to do, which is why *standard* definitions are
quoted from authoritative sources. If you disagree,
then clearly you *don't* have the logic right!

You sound exactly as one who would be arguing, in
early 1904, against investing in those crazy Wright
brothers, since it's clear RIGHT HERE IN THIS
TEXT that a flying machine is impossible! Anyone
who says or even, God forbid, demonstrates otherwise
clearly MUST be wrong. (This is the Reality Must
Always Change to Conform To Established Thought
position.)
So cite even one such valid reference! (You *cannot*,
because there are none.)

(And recognize that if you think you have one, then
there is one of two things clearly true: Either 1) you
do not understand that the other definition is not
actually different, or 2) your reference is not a valid
one.)

Once again: "MY references are right, because they
agree with me - YOURS simply MUST be wrong, because
they don't!" What a wonderfully circular form of
argumentation you have there!
You are not a valid reference. You don't even come
close to being equal to the NTIA.

Floyd, who do you think makes up the NTIA or
any other standards body? Gods who have come
down from Olympus?

Bob M.
 

Bob Myers

Floyd L. Davidson said:
Floating point is analog, integer is digital.

This will be news to anyone designing floating-point processors....

Bob M.
 

Bob Myers

Floyd L. Davidson said:
You can repeat that all you like, but you are wrong
every time you do.

By *definition* it is a digital signal.

quantization:
A process in which the continuous range of values
of an analog signal is sampled and divided into
nonoverlapping (but not necessarily equal)
subranges, and a discrete, unique value is assigned
to each subrange.
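
To make the mechanics of that definition concrete (the subrange boundaries below are arbitrary, picked only for illustration), a quantizer per the quoted text is nothing more than this:

# Quantization exactly as the quoted definition describes it: divide the
# continuous input range into non-overlapping subranges and assign one
# discrete value to each.  The (unequal) subranges are arbitrary.

# (lower_bound, upper_bound, assigned_value) -- subranges need not be equal
SUBRANGES = [
    (0.0, 0.1, 0.05),
    (0.1, 0.3, 0.20),
    (0.3, 0.7, 0.50),
    (0.7, 1.0, 0.85),
]

def quantize(x: float) -> float:
    """Map a continuous value in [0, 1) to the discrete value of its subrange."""
    for lo, hi, value in SUBRANGES:
        if lo <= x < hi:
            return value
    raise ValueError(f"{x} is outside the quantizer's input range")

samples = [0.02, 0.15, 0.1499999, 0.64, 0.95]
print([quantize(s) for s in samples])   # only four output values are possible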

Funny, I don't see the word "digital" in there. Perhaps
you could point it out? No one is arguing that
"quantized" does not mean the above - but you seem
to be arguing that "quantized" is precisely equivalent
to "digital," while none of the definitions you provide
say that.
If you do not stay with standard definitions it is
impossible to discuss anything rationally.

Yes, you have made that quite evident.


Yes, you said that again; you repeat it as though it
were a mantra that would somehow make your particular
odd misunderstandings correct. Again, please show me
the word "digital" IN THIS DEFINITION.

No matter how dense you want to be about it, that
government "expert" happens to be right. And you cannot
find *any* expert that will disagree.

No one that you will accept as an "expert," at least,
since apparently "by definition," an "expert" is someone
who agrees with your position, and no one who disagrees
could possibly be an "expert." Or can you please tell
us some OTHER criteria that you would use to judge
"expertise," so that we can search for "experts" that
you would find authoritative?
That is the
*standard* definition, and virtually *everyone* agrees
that it is correct.

Since there are numerous respondents in this thread
who apparently do NOT agree with your claim that
this is the "standard definition," that statement is
prima facie incorrect.

Bob M.
 

Don Bowey

Yes. I've got the specs right here! :) Literally, I have
had a graph on my web site for several years now that I drew up
to illustrate something I wrote once upon a time:

http://www.apaflo.com/floyd_davidson/t1pulse.jpg

(snip)

The pulse for which you provided the link is not DSX-1, because it will not
fit within the DSX-1 envelope.

I posted the DSX-1 template, and a representative pulse within it (MS Word),
on a.b.s.e. The pulse shown was from equipment that generated the pulse
using an analog method. As you can see from the envelope, other pulses,
specifically those generated digitally, could be much more "square" if
given enough processing time.

Numeric points for plotting the template to a spreadsheet are available if
anyone wants them, but I will be away until next Saturday.
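
For anyone who wants to try the check in the meantime, here is a rough sketch of testing a measured pulse against a template; the corner points below are placeholders only, not the real DSX-1 numbers, which come from the spec (or the spreadsheet points mentioned above):

# Sketch of checking a measured pulse against a template ("mask").  The
# corner points are PLACEHOLDERS, not the DSX-1 values, but the
# mechanics are the same: at each instant the pulse must sit between
# the lower and upper curves.

import bisect

def interp(points, t):
    """Piecewise-linear interpolation through (time, level) corner points."""
    times = [p[0] for p in points]
    i = bisect.bisect_right(times, t)
    if i == 0:
        return points[0][1]
    if i == len(points):
        return points[-1][1]
    (t0, v0), (t1, v1) = points[i - 1], points[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Placeholder corner points: normalized time and normalized amplitude.
UPPER = [(0.0, 0.1), (0.4, 1.2), (0.6, 1.2), (1.0, 0.1)]
LOWER = [(0.0, -0.1), (0.45, 0.9), (0.55, 0.9), (1.0, -0.1)]

def fits_template(samples):
    """samples: list of (time, level) pairs measured off the pulse."""
    return all(interp(LOWER, t) <= v <= interp(UPPER, t) for t, v in samples)

measured = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]
print(fits_template(measured))   # True for this made-up pulse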
 

Bob Myers

Floyd L. Davidson said:
(Again, that is
the nature of arbitrary definitions, this time of what
"encode" and "modulate" mean.)

Definitions are arbitrary only to those who don't
truly understand them.

Bob M.
 

Bob Myers

Floyd L. Davidson said:
Your opinion of standard definitions is worthless.

....because it disagrees with Floyd's opinion, and
Floyd has somehow been granted Infallibility by the
Gods of Technology.

Or does that only apply when you are wearing the
big white hat and formally speaking ex cathedra?

Bob M.
 

Bob Myers

Again, not really true. Quantized is necessarily
digitized.

Why? And please, for a change, try to cite a REASON,
not merely a definition. After all, if you have the level
of understanding of this topic that you implicitly claim,
you should easily be able to do that.

Bob M.
 

Bob Myers

Floyd L. Davidson said:
Look up the definition of "quantization" again. It simply
makes no difference. If an analog signal is quantized, the
result is a digital signal. That is by definition, and you
cannot escape that with mumbo-jumbo and faulty logic.

But of course, you haven't yet even posted a definition
which says that, let alone provided any reasoning which
would support such a definition.

Bob M.
 

Bob Myers

Floyd L. Davidson said:
No, in fact it is not. Electrons do not necessarily all move
at the same speed...

Ummmm - now you have a problem with the definition
of the units used to quantify current? An Ampere (the
standard unit of current) is defined as 1 Coulomb of charge
passing a given point per second, and the Coulomb is most definitely
defined in terms of the fundamental unit of charge (which equals
the magnitude of charge on a single electron). Nothing in this
requires all the electrons to be moving at the same speed,
any more than a flow of 10 gallons/hour of water requires that
I move each ounce of water at the same rate.
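
Putting numbers on that: one ampere works out to roughly 6.24 x 10^18 elementary charges per second, with no assumption about how fast any one electron moves. A quick check:

# One ampere is one coulomb per second; dividing by the elementary charge
# gives the number of electrons passing the point each second, with no
# assumption at all about how fast any individual electron moves.

ELEMENTARY_CHARGE = 1.602176634e-19   # coulombs (exact since the 2019 SI)

current_amperes = 1.0
electrons_per_second = current_amperes / ELEMENTARY_CHARGE
print(f"{electrons_per_second:.3e} electrons per second")   # ~6.241e+18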

Geeze, Floyd, which is it? Either definitions are important,
or they're not.

Bob M.
 

isw

You don't appear to understand that the limited set of values makes
it digital, by definition. PERIOD.

So, does that make quantum physics "digital"?

Isaac
 

Jerry Avins

Floyd said:
And you, like Arny, probably have no idea what you'd
see on a decent scope anyway.

I'd like Arny to explain how he can look at a scope and
tell if a single cycle of a sine wave is an analog signal
representing one cycle of a pure tone, or is just a digital
signal that represents 8000 different bytes of data from
a digital image.

If it comes down a wire, it's analog. You wrote that yourself at one point.

Jerry
 

Jerry Avins

Floyd said:
If you don't understand what they said, you probably do
have a bridge that somebody sold you...

Maybe I didn't understand. Please clarify:
"The actual sampling rate required to reconstruct the original signal
will be somewhat higher than the Nyquist rate,
*because of quantization errors introduced by the sampling process*"
(emphasis added).

Jerry
 

Jerry Avins

Don said:
Since I see no smiley face to let us know you are jesting, I have to believe
you really mean it. In which case I have to believe you aren't as bright as
you were sounding. A final test of applied theory is..... Does it work as
intended? Where's my scope?

Sorry, Don. I thought the sarcasm would be obvious.

Jerry
 

Floyd L. Davidson

Arny Krueger said:
(1) A decent scope gives a pretty close approximation of what theory
predicts.

True. Theory predicts that you cannot look at a scope
and tell what kind of information, digital or analog, is
carried by a signal. So one wonders why you want to talk
about scopes.
(2) I was probably working with decent scopes before you were born.

Nobody made a "decent" scope for several years after I
was born.

Bringing up the distraction of looking at such signals
with a scope clearly indicates that you do not
understand it. You *cannot* distinguish between digital
and analog signals with a scope. (See below, for a very
good example of why that is true.)
I've seen both kinds of data many times. Imaging data almost never looks
like sine waves.

It *commonly* does. Every time anyone fires up a v.90
modem and downloads *anything*, the waveform on the
signal from the telco to the subscriber's v.90 modem
looks like sine waves. That is true whether it is
imaging data, text, voice, or whatever else you can
think of.

But in fact those "sine waves" are as much as 8000 bytes
per second of digital signaling.

And if you put your scope on the line and look at them,
and then hang up the phone and make a voice call, you
will not be able to see *any* difference between the
analog voice signal and the v.90 protocol digital
signal!

The reason is that there is absolutely no difference
at all. Both signals are generated in exactly the same
way by the exact same CODEC in the line card at the
telco. They necessarily will look identical on a scope.
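
For the curious, here is a rough sketch of the expansion such a line-card CODEC performs (the standard G.711 mu-law expansion; the payload bytes below are made up): whatever the eight-bit codes mean, each one simply becomes a pulse amplitude on the pair.

# Both an analog voice call and a v.90 "digital" download come off the
# same mu-law CODEC on the telco line card: 8000 eight-bit codes per
# second, each expanded to one of 255 discrete pulse amplitudes.

def ulaw_expand(code: int) -> int:
    """Expand one 8-bit mu-law code to a linear PCM sample (G.711)."""
    code = ~code & 0xFF
    sign = code & 0x80
    exponent = (code >> 4) & 0x07
    mantissa = code & 0x0F
    magnitude = (((mantissa << 3) + 0x84) << exponent) - 0x84
    return -magnitude if sign else magnitude

# Arbitrary "data" bytes, as a v.90 modem might deliver them.
payload = [0x47, 0x9C, 0xE1, 0x12, 0x7F, 0xB0, 0x05, 0xCA]
amplitudes = [ulaw_expand(b) for b in payload]
print(amplitudes)
# Whatever the bytes mean, the line card turns each one into a pulse
# amplitude; a scope on the pair sees only the resulting waveform, with
# nothing to mark it as "voice" or "data".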
 