So if a single CPU takes what it takes to light 1 LED, then a GPU with 2000 cores would take what it takes to light 2000 LEDs.
How much power does it take to run 1 LED?
It's a bit of a silly question, because I bet a computer (even an analogue one) could maybe run on 1000 times less power than it takes to light an LED (if it were 1 operational pathway). But you'd have to amplify the output to make it visible, or to make it able to spin a motor.
I've got 2,000,000 pathways planned for this "computer", so that would only come to the equivalent of 2000 LEDs, roughly a 45x45 matrix of visible light (not 50x50, which would be 2500). I wonder how much power that is.
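As a rough sanity check on the power question, here is a back-of-envelope sketch. The LED figures (20 mA at a ~2 V forward voltage) are assumed typical values for a small indicator LED, not numbers from the post:

```python
# Assumed typical values for a small indicator LED (not from the post):
forward_voltage_v = 2.0    # ~2 V forward drop for a red indicator LED
current_a = 0.020          # ~20 mA drive current

power_one_led_w = forward_voltage_v * current_a
print(f"one LED: {power_one_led_w * 1000:.0f} mW")

num_leds = 2000            # the 2000-LED equivalent from the post
total_w = num_leds * power_one_led_w
print(f"{num_leds} LEDs: {total_w:.0f} W")
```

Under those assumptions a single LED is around 40 mW, and 2000 of them come to around 80 W, which is in the same ballpark as a desktop GPU.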
Yes, I'm feeling silly now... and that's not the only problem. If I want to go 100 gigahertz, then I just computed that I need 25 amps to go back and forth that quickly.
wth?
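One common way to estimate the current needed to toggle a signal line is the dynamic-charging relation I ≈ C·V·f (charge C·V moved once per cycle). The capacitance and voltage swing below are assumed illustrative values, not numbers from the post:

```python
# Average current to charge/discharge a capacitive node at frequency f:
# I ≈ C * V * f (charge C*V moved once per cycle).
# Assumed illustrative values, not from the post:
capacitance_f = 1e-12      # 1 pF of line capacitance (assumed)
voltage_v = 1.0            # 1 V logic swing (assumed)
frequency_hz = 100e9       # the 100 GHz target from the post

current_a = capacitance_f * voltage_v * frequency_hz
print(f"~{current_a:.1f} A per line")
```

With these assumptions a single 1 pF line at 100 GHz needs on the order of 0.1 A, so getting to tens of amps would mean either much larger capacitance, a bigger voltage swing, or many lines switching at once.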