The OS CPU will assign it a task, create its memory image, set up its
privileges, and kick it off. And snoop it regularly to make sure it's
behaving.
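Very roughly - and this is only a sketch, with ordinary pthreads standing
in for dedicated cores and every name invented just for illustration - the
assign-and-snoop loop I'm picturing looks like this:

/* Sketch only: a "supervisor" kicks off a worker and snoops a heartbeat.
 * A real OS CPU would load a memory image onto another core and use the
 * MMU plus a hardware watchdog instead of threads. */
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>
#include <unistd.h>

static atomic_long heartbeat;            /* bumped by the worker while healthy */

static void *worker_task(void *arg)
{
    (void)arg;
    for (;;) {
        /* ... the application's actual work ... */
        atomic_fetch_add(&heartbeat, 1);  /* "I'm still behaving" */
        usleep(100 * 1000);
    }
    return NULL;
}

int main(void)
{
    pthread_t worker;
    long last = 0;

    pthread_create(&worker, NULL, worker_task, NULL);   /* kick it off */

    for (;;) {                           /* snoop it regularly */
        sleep(1);
        long now = atomic_load(&heartbeat);
        if (now == last)
            fprintf(stderr, "worker looks hung; reset just that task\n");
        last = now;
    }
}

The point is that the watcher never depends on the watched task's
cooperation: if the heartbeat stops, only that one task gets reset, not
the whole box.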
You are missing my point. The fact that tasks run on separate
hardware does not mean they don't share memory or that they don't
communicate. You still have all the same issues that a single-processor
system has. It is **very** infrequent on my system that it
hangs in a mode where I can't get control of the CPU. I am running
Win2k and I let it run for a month or more between reboots. Usually
the issue that makes me reboot is that the system just doesn't behave
correctly, not that it is stuck in a tight loop with no interrupts
enabled. So multiple processors will do nothing to improve my
reliability.
How does it know it is a device driver rather
See above.
How does memory *not* get shared? Main memory
Hardware memory management can keep any given CPU from damaging
anything but its own application space. Intel has just begun to
recognize - duh - that it's not a good idea to execute data or stacks,
or to allow apps to punch holes in their own code. Allowing things
like buffer overflow exploits is simply criminal.
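For anyone who hasn't stared at one lately, the hole those features aim
at is depressingly simple. Illustrative C only, not lifted from any real
product:

/* Classic stack smash: no length check, so a long payload overwrites the
 * saved return address.  With an executable stack the payload itself can
 * be the code that gets run; NX/DEP makes that jump fault instead. */
#include <string.h>

void parse_packet(const char *payload)
{
    char buf[64];
    strcpy(buf, payload);   /* anything past 64 bytes lands on the stack frame */
    /* ... use buf ... */
}

Hardware that refuses to execute data or stack pages doesn't fix the
sloppy copy, but it does turn "attacker runs his own code" into "program
faults and dies," which is a much smaller disaster.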
So this indicates that multiple processors don't fix the problem. The
proper use of hardware memory management fixes the problem. No?
But speed is no longer the issue for most users. Reliability is. We
need to get past worrying about using every transistor, or even every
CPU core, 100%, and start making systems manageable and reliable.
Since nobody seems able to build reliability or security into complex
software systems, and the mess isn't getting any better (anybody for
Vista?), we need to let hardware - the thing that *does* work - assume
more responsibility for system integrity.
What else are we going to do with 256 CPUs on one chip?
That is the big question. I like the idea of having at least two
processors. I remember some 6 years ago, when I was building my
current computers, that dual-CPU motherboards were available.
People who do the kind of work that I do said they could start an HDL
compile and still use the PC, since the compile and their other work
each got a processor. I am tired of my CPU being sucked dry by my
tools, or even by Adobe Acrobat during a download, to the point where
nearly everything else grinds to a halt. Of course, another solution
is to ditch the Adobe tools. Next to Microsoft, they are one of the
very worst software makers.
Personally, I don't think we need to continue to increase processing
power at this geometric rate. Since we can't, I guess I won't be
disappointed. I see the processor market as maturing in a way that
will result in price becoming dominant and "speed" being relegated to
the same category as horsepower. The numbers don't need to keep
increasing all the time; they just need to match the size of the car
(or, for PCs, the use). The makers are not ready to give up the march
of progress just yet, but they have to listen to the market. Within 5
years nobody will care about processor speed or how many CPUs your
computer has. It will be about the utility. At that point the
software will become the focus as the "bottleneck" in speed,
reliability and functionality. Why else does my 1.4 GHz CPU seem to
respond to my keystrokes at the same speed as (or slower than) my
12 MHz 286 from over 10 years ago? "It's the software, stupid!" :^)
who just rebooted a hung XP. Had to power cycle the stupid thing. But
I'm grateful that it, at least, came back up.
I am ready to buy a laptop and I am going to get a Dell because they
will sell an XP system rather than Vista. Everything I have heard is
that XP is at least as good as Win2K. No?