Hi folks!
Having a strange issue with my PIR motion sensor that I could use some suggestions on.
Note: I always give the sensor at least 30 seconds to calibrate, but now I'm up to 45. Also, I have the sensor inside a cardboard tube (a paper towel tube) to narrow the FOV and isolate it from interference (works great).
The problem is that when the sensor is hooked up to the Arduino, something causes it to return false triggers after the first genuine trigger. Confusing, so let me explain another way...
When I hook up the PIR sensor on the breadboard, feed it a steady 4.5V (the sensor accepts anything between 3.3V and 5V), and let it calibrate, the output works correctly: it goes HIGH on motion and returns to LOW after a couple of seconds of no motion. When there is no motion, the sensor is pretty good about staying calm and not false triggering. So this suggests the sensor itself is in good working order.
When I hook the sensor up to the Arduino, though (5V, PIR output --> digital pin 6), a problem surfaces: it boots fine, initializes the sensor fine, and returns no false readings UNTIL I trigger the sensor once with motion. The sensor goes HIGH (as it should), returns LOW (as it should), then continues to oscillate between the two states every few (seemingly random) seconds, as if there is some kind of weird loop or noise on the line.
Here's the code:
int IRpin = 6;         // PIR output --> digital pin 6
int LEDpin = 13;       // onboard LED mirrors the sensor state
int initialBoot = 1;
int IRstateCur = 0;
int IRstateLast = 0;

void setup() {
  pinMode(LEDpin, OUTPUT);
  pinMode(IRpin, INPUT);
  digitalWrite(IRpin, LOW);   // make sure the internal pull-up is off
  digitalWrite(LEDpin, LOW);
  Serial.begin(115200);
}

void loop() {
  // one-time calibration countdown (25 + 10 + 5 + 5 = 45 seconds)
  if (initialBoot == 1) {
    Serial.println("Initializing IR Motion Sensor, 45 seconds...");
    delay(25000);
    Serial.println("20 seconds...");
    delay(10000);
    Serial.println("10 seconds...");
    delay(5000);
    Serial.println("5 seconds...");
    delay(5000);
    Serial.println("...initialized!");
    initialBoot = 0;
  }

  IRstateCur = digitalRead(IRpin);
  //Serial.println(IRstateCur); // use for debug, shows idle time between states

  // manage LED according to IR sensor
  if (IRstateCur == 0) {
    digitalWrite(LEDpin, LOW);
  } else if (IRstateCur == 1) {
    digitalWrite(LEDpin, HIGH);
  }

  // print only on a LOW->HIGH edge
  if (IRstateCur != IRstateLast && IRstateCur == 1) {
    Serial.println("Motion detected");
    IRstateLast = 1;
  } else if (IRstateCur != IRstateLast && IRstateCur == 0) {
    IRstateLast = 0;
  }

  delay(200); // wait 200 ms between reads
}
If I enable that one debug line, it shows that the time between false triggers is NOT always equal; it seems pretty random. What I can say is that it is NOT due to outside interference: I have successfully isolated the sensor in the tube, and I cannot get it to falsely trigger on the breadboard (very accurate).
Here's the output with the debug enabled:
(Sorry it's so long, but it needs to be to show how the sequence unfolds. Notice that at the beginning, after initialization, there is no false output until the first motion event, which was my hand in front of the sensor. After that, ALL the other 'motions' detected are actually false, and this is the oscillating problem I am describing.)
Initializing IR Motion Sensor, 45 seconds...
20 seconds...
10 seconds...
5 seconds...
...initialized!
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
Motion detected
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
0
0
0
0
0
1
Motion detected
1
1
1
1
1
1
1
1
1
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
Motion detected
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
0
0
0
0
0
0
0
0
1
Motion detected
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
Motion detected
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
Motion detected
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
I've experimented with various 100 ohm to 10K ohm resistors from the IR sensor output to ground to try to bleed off any 'noise' while the output is LOW, but no luck. The problem will not go away!!
What am I missing?!
Thank you!