Author Topic: Problem with using LED as photoresistor/light sensor  (Read 1289 times)


Offline notgivenTopic starter

  • Jr. Member
  • **
  • Posts: 23
  • Helpful? 0
Problem with using LED as photoresistor/light sensor
« on: January 23, 2013, 06:51:52 PM »
Hi. I have 3 LEDs here: a red, a blue, and a green. I'm trying to use any one of them (along with an Arduino Uno) to sense the presence of light, but I seem to be getting some noise. The serial output (which is related to the voltage the LED puts out) keeps fluctuating very erratically, more than I think it should in my dimly-lit room. Because of this, I can't go on to the next step in the experiment, which is to specify which serial outputs should be interpreted as light and dark.

For example, consider his baseline ("dark" or "less light"):

[screenshot of his serial monitor output]

...and consider my baseline (there doesn't seem to be one):

[screenshot of my serial monitor output]
I have the setup described in this video (except that I don't think my LEDs are IR because they don't glow purple):
How To Use An LED As a Light Sensor
. Like he said, we have GND to the - terminal of the LED, and the + terminal of the LED to analog input 1 (A1). I will draw up a diagram if anyone needs it, as I know it's hard to see from the video. And here's his code, which I modified so that there is a delay in the serial printout:

Code:
/*
  LED as light sensor
  Created by Sean Jonson

  Uses an IR LED (but it can be any kind) on analog pin 5 to turn off
  a regular LED on digital pin 13 when there is not much light on the
  sensing LED.

  The circuit:
  LED+ to analog pin 5
  LED- to GND

  A second LED+ to pin 13
  A second LED- to GND
 */

int sensorPin = A5;   // select the input pin for the sensing LED
int ledPin = 13;      // select the pin for the indicator LED
int sensorValue = 0;  // value read from the LED (analogRead returns an int, 0-1023)

void setup() {
  pinMode(ledPin, OUTPUT);  // declare the ledPin as an OUTPUT
  Serial.begin(9600);       // begin serial communication with the computer
}

void loop() {
  sensorValue = analogRead(sensorPin);  // read the value from the LED
  Serial.println(sensorValue);          // serially print the value from the LED
  delay(200);
  // If the LED "sees" less than 160, turn on the indicator. If you don't use
  // the same LED I'm using you might want to tweak this number; for a
  // red-glassed LED, "<160" is good.
  if (sensorValue < 160) {
    digitalWrite(ledPin, HIGH);  // turn on the indicator LED
  }
  else {
    digitalWrite(ledPin, LOW);   // turn it off
  }
}

The strangest thing is that when I move my hand towards the LED, the voltage goes up (actually, it's more like it spikes temporarily), and when I pull my hand away, it goes down (temporarily). This confuses me as to whether or not these LEDs are IR.
Also, the erratic behavior doesn't seem to change much when I shine light on the LED.
And one more thing: even when there's no connection to the LED, or when the LED is connected but covered by a black cover, the serial output still gives values in the 100's. Shouldn't it be zero in both cases?

So is there actually anything wrong with this scenario?  Should the LED fluctuate that much even when I'm not doing anything to it? Would a proper CdS photoresistor behave this way (I've never used one so I don't know)?

Thanks for any advice.
« Last Edit: January 23, 2013, 07:34:51 PM by notgiven »

Offline WilbertLopez

  • Beginner
  • *
  • Posts: 1
  • Helpful? 0
Re: Problem with using LED as photoresistor/light sensor
« Reply #1 on: July 24, 2013, 12:56:11 AM »
Quote from: notgiven on January 23, 2013, 06:51:52 PM


Hello, were you able to solve the problem? I will be working on a similar project in the future, so I'm trying to get as much information as I can. Please reply.
« Last Edit: July 24, 2013, 10:38:09 AM by WilbertLopez »

Offline MrWizard

  • Full Member
  • ***
  • Posts: 116
  • Helpful? 0
  • My cylon friend told me a killing joke......
Re: Problem with using LED as photoresistor/light sensor
« Reply #2 on: July 25, 2013, 10:08:36 AM »
Do you use any resistors? A 100-ohm potentiometer could perhaps be the best option for some quick prototyping.

Offline johnwarfin

  • Full Member
  • ***
  • Posts: 120
  • Helpful? 3
Re: Problem with using LED as photoresistor/light sensor
« Reply #3 on: July 25, 2013, 06:47:25 PM »
LEDs are photovoltaic devices, not photoresistive, so adding a resistor makes no sense, and their output is relatively low unless they're exposed to bright light. LDRs and photodiodes are far more sensitive and appropriate for sensing applications. I suggest hooking up a meter and nothing else to get a handle on this. If you don't see a voltage change, it means you have it connected backwards or it's a blown LED.

 


