I am working on a project that uses computer vision to locate an object. The idea is to recognize a green LED and a red LED in an image, and from their positions draw some conclusions about where the object is.
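For concreteness, the kind of routine I have in mind looks roughly like this (a minimal sketch using NumPy on an RGB frame; the threshold values are rough guesses I would tune against real images, and the synthetic frame just stands in for a webcam capture):

```python
import numpy as np

def find_led(frame, channel, threshold=200, other_max=120):
    """Return the (row, col) centroid of pixels that are bright in
    `channel` (0=R, 1=G, 2=B) and dim in the other two channels,
    or None if no such pixels exist."""
    target = frame[:, :, channel].astype(int)
    others = np.delete(frame, channel, axis=2).astype(int)
    mask = (target > threshold) & np.all(others < other_max, axis=2)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

# Synthetic stand-in frame: red blob top-left, green blob bottom-right.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[20:30, 30:40, 0] = 255     # "red LED"
frame[90:100, 120:130, 1] = 255  # "green LED"

red_pos = find_led(frame, channel=0)
green_pos = find_led(frame, channel=1)
```

This works fine on synthetic data like the above; the trouble described below is what happens when the input comes from a real webcam.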
OK, here's my problem:
Using a webcam, the image does not show the color properly: the LEDs appear mostly white, with color only around the edges, rather than as a solid color. I have tried a couple of different webcams with the same result. When I take a picture with my phone, however, the color comes out much better. I have also tried both clear and translucent covers over the LEDs in an attempt to disperse the light more evenly, but they still show as white; with the translucent cover, the spot was uniformly white.
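To illustrate what I mean by a white center with color at the edges, here's a quick simulation of what I assume is happening (my guess: the LED is bright enough that the sensor clips all three channels at the center, so only the dimmer halo keeps its hue; the numbers below are made up for illustration):

```python
import numpy as np

h, w = 101, 101
yy, xx = np.mgrid[0:h, 0:w]
r2 = (yy - 50) ** 2 + (xx - 50) ** 2

# Gaussian intensity profile whose peak is far above the sensor's
# 8-bit full-scale value of 255.
intensity = 3000 * np.exp(-r2 / (2 * 12.0 ** 2))

# A "red" LED: the red channel gets the full intensity, and the green
# and blue channels pick up a small amount of bleed.
raw = np.stack([intensity, 0.1 * intensity, 0.1 * intensity], axis=2)
img = np.clip(raw, 0, 255).astype(np.uint8)  # 8-bit sensor clips here

center = img[50, 50]  # all three channels clipped -> appears white
edge = img[50, 80]    # dimmer halo -> still clearly red
```

In the simulated image the center pixel comes out (255, 255, 255) while an edge pixel is strongly red, which matches what I see from the webcams.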
In the final application I intend to do the processing with a camera module and a microprocessor; I'm just using the webcams to get my software routines right. I understand that different sensors have different capabilities, but rather than simply buying a bunch of them and hoping one performs better than another, is there a specification or standard I should be looking for?
Does anyone have any suggestions on the best way to proceed?