Hi guys,
I'm looking at Arduino code and learning about While statements. I've got the following code:
const int kPinLed = 13;

void setup()
{
  pinMode(kPinLed, OUTPUT);
}

int delayTime = 1000;

void loop()
{
  while (delayTime > 0) {  // while delayTime is greater than 0
    digitalWrite(kPinLed, HIGH);
    delay(delayTime);
    digitalWrite(kPinLed, LOW);
    delay(delayTime);
    delayTime = delayTime - 100;
  }
  while (delayTime < 1000) {
    delayTime = delayTime + 100;  // do this first so we don't delay with a time of 0
    digitalWrite(kPinLed, HIGH);
    delay(delayTime);
    digitalWrite(kPinLed, LOW);
    delay(delayTime);
  }
}
I understand its function is to flash the LED faster and then go back to flashing slower, but I don't understand why it works. It seems to me that the program would work in the following way:
1. The delay time starts at 1000, so the first while loop executes because the time is greater than 0. The light goes on and off, then the delay time is reduced to 900.
2. Now both while conditions are true because the time is both above 0 and below 1000. This means the LED will flash on and off and the time will be reduced to 800, then the time will be returned to 900 by the second while loop before the LED flashes at a delay of 900. This process will repeat over and over, which as far as I can see means the LED will always flash at a 900 ms delay.
However, I'm sitting here with an LED that flashes slower, then faster, then slower, so I've obviously got it wrong. Is anyone able to lay out the steps to show how the timer gets reduced?
Thanks.