I've been browsing YouTube for a while, and I noticed a guy who made IR beacons and used them to triangulate his robot's position.
I've had this idea for a long time (using sound, for that matter, which has a much longer wavelength), but I hadn't found it feasible.
With IR, the robot can broadcast a message asking something like "what beacons are around?"
The beacons will send their messages back (via IR, of course) after a preset delay. Each beacon will have an ID, and it will compute its delay from that somehow (for example, id * 10 ms, assuming each message is shorter than 10 ms), so the replies don't collide.
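That slot scheme is easy to sketch in C (the 10 ms slot width and the function name are just my assumptions for illustration, not code from the video):

```c
#include <stdint.h>

/* Hypothetical time-division reply scheme: the beacon with ID n waits
 * n * SLOT_MS milliseconds before answering, so replies from different
 * beacons never overlap, as long as each IR message fits in one slot. */
#define SLOT_MS 10u

/* Delay (in ms) a beacon with the given ID should wait before replying. */
static uint32_t reply_delay_ms(uint8_t beacon_id)
{
    return (uint32_t)beacon_id * SLOT_MS;
}
```

So beacon 0 answers immediately, beacon 3 answers 30 ms after the query, and so on.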
Then, the robot might ask a beacon to fire a continuous "locator" beam, and it would measure how quickly that beam comes back. The problem is that light is too fast, and I'm sure a PIC won't be up to the task.
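To put a number on "too fast" (assuming c ≈ 3×10⁸ m/s; plain desktop C, nothing PIC-specific):

```c
/* Rounded speed of light, m/s (assumed value for this sketch). */
#define C_LIGHT 3.0e8

/* One-way travel time in seconds for a distance in meters. */
static double light_travel_s(double distance_m)
{
    return distance_m / C_LIGHT;
}
```

One centimeter takes about 33 ps one way, and even a 5 m round trip is only ~33 ns. A 16F877 at its maximum 20 MHz clock executes one instruction every 200 ns, so it can't even begin to resolve that.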
With sound, though, it might work, but I might have to go into the audible range, or even infrasound (which might require a lot of power). Taking the average speed of sound as 340 m/s, computing the robot's position to centimeter precision would require a timing resolution of 1/34000 s, roughly 30 us. Using a built-in timer on a PIC 16F877, for instance, it might just work.
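The same arithmetic in C, to double-check the 30 us figure (340 m/s as in the text; the helper names are my own):

```c
/* Average speed of sound in air, m/s (the figure used in the text). */
#define V_SOUND 340.0

/* Timer resolution (s) needed to resolve a one-way distance step (m). */
static double sound_resolution_s(double step_m)
{
    return step_m / V_SOUND;
}

/* One-way distance (m) recovered from a measured flight time (s). */
static double sound_distance_m(double time_s)
{
    return time_s * V_SOUND;
}
```

One centimeter works out to 0.01 / 340 ≈ 29.4 us, so a timer ticking every ~30 us is enough, and a 10 ms flight time would mean the beacon is about 3.4 m away, which seems within reach of the 16F877's built-in timers.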
What do you guys think? Of course, we're considering everything ideal for the moment, not taking into account that, if the beacons are not at the receiver's level, the spherical "distortion" of the wavefront will skew the result. And no reflections, either.
Edit: this is not so much a question as me sharing ideas. I've read about odometry in the Robot Builder's Bonanza and in various stuff on this site, but that's not what I'm looking for.