For which I am extremely grateful.
CutePDF is a PDF creator; it works as a printer driver. It's so fast
that, the first few times you run it, you think it's not working. It is.
Sorry, I was looking at Foxit when I wrote that. My point is that
none of the open source PDF programs actually do what Acrobat does.
The commercial programs that do it are paid-for, and I have no idea
whether they work any better. Foxit puts stamps on each page it
modifies. I also see crappy behavior similar to Acrobat's. For
example, every time I switch focus away from the program and back, the
toolbars become disorganized and take up four rows instead of two. If
the bleeding program can't even remember where I put the toolbars, how
well is it going to handle the hard stuff?
So for the short term, I am stuck with Acrobat. As crappy as it is,
it is the devil I know.
I didn't say it was required. I said that it would, if properly done,
simplify the overall structure and make it easier to build a
crash-proof OS. Of course, anybody with enough programmers and
advanced tools can screw up any architecture.
I don't mean to be rude, but that is not saying anything. "IF
properly done" is a catch phrase that eliminates any meaning. If
operating systems were "properly done" we wouldn't even be having this
discussion. But if you have some basis for saying that multi-
processors actually facilitate correctly working software, I would like
to hear it. So far everything you have posted applies to any
software, on single processors as well as multi-processors.
Why? Transistors keep getting faster, and the real limit to CPU power
is thermal. A CPU doesn't take many transistors, so doing X amount of
computing dissipates about the same power whether it's done on 1 CPU
or 256. Less, if you avoid context switching overhead.
Transistors may get faster, but systems don't. The critical limits
are being reached where logic built from smaller geometries does *NOT*
get faster. I thought this was discussed already. The smaller
geometries require lower voltages, and the lower voltages are reaching
the point where the transistors can barely turn fully on and off.
This is limiting the improvements they can make in speed. Why else
have CPUs been limited to about 3 GHz for the last 5 or 6 years?
Transistors are getting smaller, but the circuits they are in are not
getting significantly faster.
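To put rough numbers on the power side of this, the usual first-order
CMOS relation is P ~ C * V^2 * f. The little sketch below is a
back-of-the-envelope illustration only; the capacitance and voltage
figures in it are invented, not measured from any real chip.

def dynamic_power(c_eff, vdd, freq_hz):
    """Approximate switching power of one core: P = C_eff * Vdd^2 * f."""
    return c_eff * vdd**2 * freq_hz

C_EFF = 1e-9      # effective switched capacitance per core (farads), invented
VDD = 1.0         # supply voltage (volts), invented
F_ONE = 3e9       # one fast core at 3 GHz
N = 256           # or many slower cores

one_fast = dynamic_power(C_EFF, VDD, F_ONE)
many_slow = N * dynamic_power(C_EFF, VDD, F_ONE / N)

print(f"1 core   @ 3 GHz     : {one_fast:.2f} W")
print(f"{N} cores @ {F_ONE / N / 1e6:.1f} MHz : {many_slow:.2f} W")
# Same aggregate clock cycles, roughly the same switching power -- and if
# the slower cores can also run at a lower Vdd, the V^2 term drops further.
# The catch is that once Vdd stops scaling down with the geometry, f stops
# climbing too, which is roughly the 3 GHz wall described above.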
I guess that if we are hitting a wall
Exactly my point. We don't need more speed, and we sure don't need
more complexity. Let's use all those transistors to make things
simpler and more reliable. Wintel was and still is decades behind
other architectures in applying fundamental hardware protections to
computing, hence atrocities like buffer overflow exploits.
I hate to tell you, but using hundreds of processors *is* complexity
compared to using one processor.
A request to print should start printing NOW. It would if a CPU was
assigned to do nothing but print your file.
Why do you say that? Printing on my 1.4 GHz machine is often limited
by the CPU speed. On a chip with hundreds of CPUs, each one will be
slower than a single CPU using the same technology. So that print job
will take longer to process.
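Here is a quick sketch of that arithmetic. The 50% serial fraction and
the 1/N clock speed are assumptions for illustration, not measurements
of any real printer driver.

def job_time(serial_fraction, work_units, core_speed, cores_used=1):
    """Time to finish: serial part on one core, parallel part split across cores."""
    serial = serial_fraction * work_units / core_speed
    parallel = (1 - serial_fraction) * work_units / (core_speed * cores_used)
    return serial + parallel

WORK = 1.0        # arbitrary units of rendering work
SERIAL = 0.5      # assume half the print job cannot be parallelized
N = 256           # many slow cores, each at 1/N the clock of one big core

big_core = job_time(SERIAL, WORK, core_speed=1.0)
many_small = job_time(SERIAL, WORK, core_speed=1.0 / N, cores_used=N)

print(f"1 fast core    : {big_core:.2f}")
print(f"{N} slow cores : {many_small:.2f}")
# 1 fast core    : 1.00
# 256 slow cores : 128.50   (the serial half dominates on the slow core)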
Does it really take
Acrobat and Windows are salient examples of what's broken about
computing. Imagine grannie trying to zap a trojan by editing the
registry!
I agree 100% with that statement. But nothing you have said relates
to multi-processors.
Your point about Granny is very valid. I have a friend who is in his
70s. He is the sort who is self-taught and has been a welder, AC
repairman, electrician, and general all-around mechanic and machinist.
So the man is not at all dumb. But he has no interest in learning
about computers. They are basically wicked, cantankerous, malodorous
piles of scrap metal. They don't follow any line of normal reasoning
and can only be handled in unique ways that require you to learn a new
mindset. There are some things he would like to do with a PC, such as
recording his cassette tapes onto CD. But I can't figure out a way to
educate him on how to do this sort of stuff.
Rick