Nigel
This thread seems less like a question for serious debate than a starting
point for soapbox environmental politics.
I think you underestimate Seti@Home.
Seti@Home beats the living crap out of your CPU, running it near 100 percent
capacity, with the floating point processor being beaten senseless 95
percent of that time.
This increases the CPU's power consumption by an average of 25 watts
(the exact figure depends on the speed and the die size of the CPU).
Normally the CPU idles most of the time and uses a lot less power.
Normally the FPU is not even enabled when the CPU is idle. Just enabling
it causes a respectable increase in the CPU's power consumption: it
accounts for around half of the CPU's silicon real estate.
This information is well documented and widely known, including the
power consumption figures for many different CPUs.
You see, overclockers love to use Seti@Home to test the stability of their
overclocked CPUs. If anything will crash a system that is even slightly
unstable, the demands of Seti@Home will do it. Check the overclocker sites
for the CPU temperature information, and you'll be amazed how much running
Seti@Home will heat up your silicon.
In other words, with roughly a million machines each drawing an extra 25
watts, Seti@Home wastes 25 million watts continuously, 12 hours a day.
25 megawatts is no small matter; the rolling brownouts and blackouts in
California hinge on smaller margins of excess power than that.
At 10 cents a kilowatt-hour, that means it costs $30,000 a day in
electricity to run Seti@Home. Thirty thousand dollars a day to search for a
one in 500 quadrillion chance of receiving transmissions from space
aliens... if space aliens found out we were doing that, they would never
contact our retarded carcasses in the first place, since no one wants to
hang out with fools.
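For anyone who wants to check that arithmetic, here is a quick
back-of-the-envelope sketch in Python (the one-million-machine count is
my inference from the 25 MW figure, not a number from the post):

# Back-of-the-envelope check of the Seti@Home power-cost claim.
# Assumptions: ~1,000,000 machines (inferred from the 25 MW figure),
# 25 W extra draw each, 12 hours a day, at 10 cents per kWh.
machines = 1000000
extra_watts = 25
hours_per_day = 12
price_per_kwh = 0.10

total_kw = machines * extra_watts / 1000.0    # 25,000 kW = 25 MW
kwh_per_day = total_kw * hours_per_day        # 300,000 kWh
cost_per_day = kwh_per_day * price_per_kwh    # $30,000

print(f"{total_kw / 1000:.0f} MW, ${cost_per_day:,.0f} per day")
# -> 25 MW, $30,000 per day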
AC/DCdude17 said:
No, get the envirowhiner liberals out of the office. I'd say outlaw
low-income utility subsidies and kill two birds with one stone. That
prevents them from getting cheaper rates courtesy of your tax money, and
it gives them an incentive to use less power. It will jack up their
utility cost and force them to save power or risk getting their power
cut off.
I think your idea gives very little return for the effort needed and
increases costs.
Getting incandescent lamps out of residential properties would be far
more significant. Almost all the lights in homes are incandescent. I'm
sure you've come across a bathroom with five or six globe lamps at 60W
each. These decorative lamps get perhaps 10 lumens to the watt. Mandating
motion-detector-activated lights in bathrooms might be effective.
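For scale, here's the arithmetic on that bathroom fixture as a small
Python sketch (the CFL comparison figures are typical values I'm
assuming, not numbers from the post):

# Bathroom globe-lamp arithmetic: six 60 W bulbs at ~10 lumens/W,
# versus compact fluorescents. CFL figures (15 W, ~60 lm/W) are
# assumed typical values, not from the thread.
bulbs = 6
inc_watts, inc_lm_per_w = 60, 10
cfl_watts, cfl_lm_per_w = 15, 60

lumens = bulbs * inc_watts * inc_lm_per_w          # 3600 lumens
cfl_count = lumens / (cfl_watts * cfl_lm_per_w)    # 4 lamps
print(f"{bulbs * inc_watts} W incandescent vs "
      f"{cfl_count * cfl_watts:.0f} W CFL for {lumens} lumens")
# -> 360 W incandescent vs 60 W CFL for 3600 lumens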
And while we're at it, let's get rid of bad weather and entropy.
feklar said:
I am not the expert, so maybe this is a stupid or ignorant idea. On the
other hand, maybe I'm right, so at the risk of flames I will either make
sense or make an ass of myself.
Resistors are used in circuits to step down voltages. The excess power is
dissipated as heat.
A question: Can any resistor that exists be replaced by an AC or DC
transformer (think miniature transformers for small-value parts)?
If I am right in presuming that, then the use of resistors must
continually flush probably at least a gigawatt down a toilet somewhere
in the USA alone every day, just as waste heat. (What's the total power
generation capacity of the USA, 13 point something gigawatts total?
13.6 GW?)
What are the heat losses for transformers vs. for resistors?
Can a transformer always replace a resistor, or does it create
insurmountable circuit design problems in frequency generation and control
circuits? Can inductors usually be used as replacements in those cases?
I have a sneaking suspicion that if a law were passed making it illegal
to use a resistor in a circuit as a voltage-dropping device, that law
would save at least a continuous half a gigawatt from being wasted in
the USA.
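To put numbers on feklar's question: a dropping resistor dissipates the
full voltage drop times the load current, while a transformer or
switching converter wastes only its conversion losses. A quick Python
sketch with assumed example values (the 90% converter efficiency is a
typical figure, not a measurement):

# Dropping 12 V to 5 V at 1 A: resistor versus transformer/converter.
v_in, v_out, i_load = 12.0, 5.0, 1.0   # assumed example circuit

p_load = v_out * i_load                # 5 W delivered either way
p_resistor = (v_in - v_out) * i_load   # 7 W burned in the resistor
eff_resistor = p_load / (p_load + p_resistor)

eff_converter = 0.90                   # assumed typical efficiency
p_converter = p_load / eff_converter - p_load

print(f"resistor wastes {p_resistor:.1f} W ({eff_resistor:.0%} efficient); "
      f"converter wastes {p_converter:.2f} W")
# -> resistor wastes 7.0 W (42% efficient); converter wastes 0.56 W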
Wade Hassler said:
And while we're at it, let's get rid of bad weather and entropy.
If you want further quantification, run it and look at your own CPU
temperature readouts.
Heat can be expressed as watts; figure it out.
If your motherboard is so old it doesn't have CPU temperature readouts, then
feel the CPU heat sink while the CPU has been idling, then run Seti@Home for
an hour, then make the mistake of touching the CPU heatsink again.
I can pretty much guarantee you won't ever make that mistake again.
That number was given on one of the overclocker sites. It's not a
life-and-death issue for me, so I will trust my memory rather than
invest many hours searching the hundreds of overclocker sites that may
or may not turn up the original source.
Pay me for the research and I will be more than happy to undertake it.
Until then, my memory, plus having once made the mistake of a fingertip
heat comparison, will have to suffice. I know the difference between
"cool to the touch" and "getting my fingertips burned". Personally, the
only difference I can find between that experience and touching a
25-watt light bulb is that after being on for a while, the light bulb is
cooler to the touch than the heatsink of my K6-2/400 was. (Yes, the CPU
fan was running.)
AC/DCdude17 said:
As I said, I have a true power meter. It plugs into the outlet, the
computer plugs into it, and it reads out the power consumption in watts.
Heat can be expressed as watts only when all the other coefficients are
known. Just knowing the temperature won't translate into wattage without
knowing the ambient temperature near the CPU, the heatsink's thermal
coefficient, its radiant coefficient, the air flow, etc.
My 20W soldering iron gets a whole lot hotter than a 100W lightbulb, so surely the
soldering iron must be using more power...
RIGHT!
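AC/DCdude17's point can be made concrete with the usual first-order
model, which lumps all those coefficients into one thermal resistance:
P = dT / theta. A minimal Python sketch, with theta and the temperatures
as purely illustrative assumptions:

# First-order thermal model: power = temperature rise / thermal
# resistance. All three numbers below are assumed for illustration;
# real values depend on the heatsink, airflow, mounting, etc.
theta = 1.2           # heatsink-to-ambient thermal resistance, degC/W
t_ambient = 25.0      # ambient temperature near the CPU, degC
t_heatsink = 55.0     # measured heatsink temperature, degC

power = (t_heatsink - t_ambient) / theta
print(f"~{power:.0f} W dissipated")   # -> ~25 W

Without knowing theta, the same 30 degC rise could mean 10 W or 60 W,
which is exactly why a temperature readout alone doesn't give you
wattage.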
feklar said:
making it illegal to use a resistor in a circuit
as a voltage-dropping device, that law would
save at least a continuous half a gigawatt from
being wasted in the USA.
And all those vampire appliances! A few days ago, as I was going to bed,
I turned off all the lights and was stunned at the sheer number of LEDs
illuminating the darkness of my house!
Let's count... Three computers, three monitors, two VCRs, two DVD
players, a dish receiver, two remote-operated sound systems, a DSL
modem, a hub, the microwave, the digital clock on the oven, four or five
wall warts permanently connected and warm to the touch, not counting the
clock radio and the fridge (which obviously need power around the
clock). Sheesh!
Sometimes I am tempted to shut everything off and flip the switches on
the numerous blinking power bars in my house. (BTW, why do the little
neon lamps blink after a while?)
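That list invites a rough tally. A small Python sketch of the standby
load, where the device count and the per-device draw are my assumptions
for illustration, not measurements:

# Rough "vampire" load estimate for ~20 always-on gadgets.
# 3 W per device is an assumed typical standby draw.
devices = 20
watts_each = 3
price_per_kwh = 0.10   # same rate used earlier in the thread

kwh_per_year = devices * watts_each * 24 * 365 / 1000.0
print(f"{devices * watts_each} W continuous, {kwh_per_year:.0f} kWh/yr, "
      f"${kwh_per_year * price_per_kwh:.0f}/yr")
# -> 60 W continuous, 526 kWh/yr, $53/yr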
CJT said:
Wade Hassler wrote:
And while we're at it, let's get rid of bad weather and entropy.
And friction. Things would run so much cooler without it.
Certainly modern circuits use fewer large resistors than they used to.
In the '70s, most TVs had large "dropper" resistors providing voltages
throughout the set. They consumed a lot of energy and gave out a lot of
heat. Today we have much more efficient power supplies.
Sofie said:
Instead of a lot of heat-producing, power-wasting, voltage-dropping
resistors, we NOW have a lot of heat-producing, power-dissipating
(wasting), large voltage regulator chips, SCRs, power transistors, zener
diodes, and other semiconductors. They do the voltage-dropping job
better, but there is still voltage dropped, power dissipated (wasted),
and heat produced.
This is progress for sure.
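Sofie's point is easy to quantify: a linear regulator dissipates the
full voltage drop times the load current, exactly as a dropping resistor
would. A short Python sketch with generic example values (the 12 V to
5 V, 0.5 A case is my illustration, not from the thread):

# Linear regulator dissipation: P = (Vin - Vout) * Iload, the same
# as a dropping resistor carrying the same current. Values assumed.
v_in, v_out, i_load = 12.0, 5.0, 0.5   # volts, volts, amps

p_wasted = (v_in - v_out) * i_load     # watts burned as heat
p_delivered = v_out * i_load           # watts reaching the load
efficiency = p_delivered / (p_delivered + p_wasted)
print(f"{p_wasted:.1f} W wasted, {efficiency:.0%} efficient")
# -> 3.5 W wasted, 42% efficient

A switching regulator avoids most of that loss, which is where the
genuine progress lies.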