You will want to calibrate each sensor so they all give the same reading under the same conditions as you build the 'bot. Once that is done there should be no need to recalibrate, provided nothing changes. As for failure, the algorithm should be sound ...but there are no guarantees. Build it and test it.
I have not tried this, but it just flashed into my mind: use another sensor to sense the light conditions of the robot's surroundings, and based on that calculate the line sensors' values.
If you store the first reading of each sensor and then subtract it from future readings, you will be getting a 'difference', so it should adapt to the initial light level. Things to be wary of: when powering on the robot, make sure that all sensors are either on or off the line, and that you aren't casting a shadow over some but not others.

The other alternative is to store the minimum and maximum ADC readings for each sensor on an ongoing basis. This may cope better, given that each resistor/device may not be identically matched. So one sensor may give 1V-4V and another may give 0.8V-4.2V, say, for the same light conditions. So what you could do is:

1. Take a new ADC reading
2. SensorMin = MIN(SensorMin, newReading); // keep smallest so far
3. SensorMax = MAX(SensorMax, newReading); // keep largest so far
4. Range = SensorMax - SensorMin; // the range of values it has seen
5. X = newReading - SensorMin; // the new value - between 0 and Range
6. Percent = (100 * X) / Range; // a value from 0 to 100%

Now you have a percentage value for each sensor, ie what the current value is, expressed as a percentage of the light values it has seen. So if all sensors have seen the same range of light values then they will all return 50%, for example, for the same light level.

You need to guard against a divide by 0 at step 6. This can be done in your startup code by taking an initial reading and setting SensorMin = that value - 'fudge' and SensorMax = that value + 'fudge', where the 'fudge' value represents a major shift in light level, ie black vs white rather than a small shadow.
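The steps above could be sketched in C roughly like this. The names (calibrate_init, sensor_percent) and the FUDGE value are my own for illustration, not from any Axon library; pick a FUDGE that matches a real black-vs-white swing on your ADC.

```c
#include <assert.h>

#define NUM_SENSORS 8
#define FUDGE 40   /* hypothetical: roughly a black-vs-white swing, in ADC counts */

static int sensorMin[NUM_SENSORS];
static int sensorMax[NUM_SENSORS];

/* Call once at startup with the first reading of each sensor.
   Seeding min/max +/- FUDGE guarantees Range is never 0 at step 6. */
void calibrate_init(int sensor, int firstReading)
{
    sensorMin[sensor] = firstReading - FUDGE;
    sensorMax[sensor] = firstReading + FUDGE;
}

/* Call on every new ADC reading; returns the reading as a
   percentage (0..100) of the range this sensor has seen so far. */
int sensor_percent(int sensor, int newReading)
{
    if (newReading < sensorMin[sensor]) sensorMin[sensor] = newReading; /* step 2 */
    if (newReading > sensorMax[sensor]) sensorMax[sensor] = newReading; /* step 3 */
    int range = sensorMax[sensor] - sensorMin[sensor];                  /* step 4 */
    int x = newReading - sensorMin[sensor];                             /* step 5 */
    return (100 * x) / range;                                           /* step 6 */
}
```

With FUDGE = 40, a sensor initialised at 500 returns 50% for another reading of 500, and the percentages spread out as it sees darker and lighter values.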
I thought of that too. You mean use a differential amplifier to subtract the sensor readings from the ambient light? Well, I tried that before on my fire-fighting robot, but it didn't give much success.
Hi, now I'm kind of confused. According to your method we: read all 8 sensors with the light on and off, subtract the two values to get an ambient-free reading for each sensor, calculate a threshold, and decide which sensors are on the line and which are not, based on that threshold.
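If I've understood the emitter-on/emitter-off method being described, a rough C sketch could look like the following. The function names and the assumption that the line is dark (so an on-line sensor reflects less and reads below the threshold) are mine, not from the thread.

```c
/* adcOn[i]  = ADC reading with the sensor's emitter LED on,
   adcOff[i] = reading with the emitter off (ambient light only).
   Subtracting cancels the ambient contribution. */
void ambient_free(const int adcOn[], const int adcOff[], int out[], int n)
{
    for (int i = 0; i < n; i++)
        out[i] = adcOn[i] - adcOff[i];
}

/* Assumed convention: a dark line reflects less emitter light,
   so an on-line sensor's ambient-free reading falls below the threshold. */
int on_line(int ambientFree, int threshold)
{
    return ambientFree < threshold;
}
```

This only removes ambient light; it does not by itself solve the "robot entirely off the line" case raised below, since every sensor then just reads the floor.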
But what if the robot goes off the line? It won't be able to identify that, because it still reads some values.
And in this article there is a different method. Can you compare your algorithm with this one: http://www.societyofrobots.com/programming_mobot.shtml - which one is better?
...are all these steps (the min/max values) for a single sensor? Thanks