I assembled and programmed this robot to reintroduce myself to robotics via the AXON and some of the latest robot sensors; I had last done some robotics about 15 years ago.
Using the AXON and a number of off-the-shelf components, I put together a robot that does the following:
- light following: it follows light by sampling 3 different directions in front with a turret-mounted sensor and picking a direction (partially working; needs a bit of tweaking)
- wandering: it wanders around, avoiding bumping into things, by keeping the turret centered and reading both the IR and SONAR sensors. Works as a random walk of the environment.
- singing: chooses a song to sing
- dancing: plays a sound effect and performs a small movement routine with various turns
- waiting: when waiting it randomly chooses and plays sound effects.
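The light-following behavior above can be sketched in a few lines. This is a hypothetical illustration (the function and names are mine, not from the robot's firmware), assuming the turret samples left, center, and right and the robot steers toward the brightest reading:

```c
/* Hypothetical sketch of the three-direction light scan.
   The turret points the light sensor left, center, and right;
   the robot then turns toward the brightest of the three samples. */

enum heading { LEFT = 0, CENTER = 1, RIGHT = 2 };

/* Pick the brightest of three samples; ties favor CENTER so the
   robot prefers to keep driving straight. */
enum heading pick_brightest(int left, int center, int right)
{
    if (center >= left && center >= right)
        return CENTER;
    return (left > right) ? LEFT : RIGHT;
}
```

In practice the servo has to settle before each reading, which is one reason the scan-then-move cycle is not yet continuous.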
Three Software Layers
All the above behaviors are available at all times and are drawn on by this highest-level motivation software layer. The 'bot randomly (though weighted) chooses a behavior to follow, and the behavior supplies a list of atomic operations (like "move forward for X tics" or "stop for so long to plan another forward move") to a batch command processor. The command processor takes the commands and executes each step for the supplied duration. This highest level of software follows a sense-plan-act strategy.
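The weighted random pick described above can be sketched like this. The behavior names match the list earlier in the post, but the weights and function are illustrative assumptions, not the robot's actual tuning:

```c
/* Sketch of the motivation layer's weighted behavior choice.
   Weights are made up for illustration: wandering dominates,
   songs and dances are rarer. */

enum behavior { WANDER, LIGHT_FOLLOW, SING, DANCE, WAIT, NUM_BEHAVIORS };

static const int weights[NUM_BEHAVIORS] = { 5, 3, 1, 1, 2 };

/* roll is a nonnegative random number; each behavior wins a slice of
   the total weight, so WANDER is picked 5/12 of the time here. */
enum behavior choose_behavior(int roll)
{
    int total = 0;
    for (int i = 0; i < NUM_BEHAVIORS; i++)
        total += weights[i];
    roll %= total;
    for (int i = 0; i < NUM_BEHAVIORS; i++) {
        if (roll < weights[i])
            return (enum behavior)i;
        roll -= weights[i];
    }
    return WAIT; /* unreachable */
}
```

The chosen behavior would then hand its list of atomic operations to the batch command processor.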
To avoid ramming headlong into things, I use a subsumption-light architecture that preempts the commands in the command processor. These overriding reflexes supply new orders to the command processor for a temporary period; the motivation behavior then takes back over and supplies new orders. Some of the sensors are read very often in the interrupt routine.
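A minimal sketch of that preemption, assuming the command processor is a small fixed queue (all names and the queue layout are illustrative, not the actual firmware):

```c
/* Sketch of the "subsumption-light" reflex preemption: a reflex
   discards whatever the motivation layer queued and loads a short
   batch of escape commands for a temporary period. */

#define QUEUE_LEN 8

struct command { int op; int tics; };   /* atomic operation + duration */

struct cmd_processor {
    struct command queue[QUEUE_LEN];
    int count;
    int index;
    int overridden;   /* nonzero while a reflex owns the queue */
};

/* Reflex fires: replace pending commands with escape moves. The
   motivation layer reloads the queue once 'overridden' clears. */
void reflex_preempt(struct cmd_processor *cp,
                    const struct command *escape, int n)
{
    cp->count = (n < QUEUE_LEN) ? n : QUEUE_LEN;
    for (int i = 0; i < cp->count; i++)
        cp->queue[i] = escape[i];
    cp->index = 0;
    cp->overridden = 1;
}
```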
The lowest level of software is an interrupt routine that reads various sensors, sets various conditions, and otherwise updates basic world and robot state. This state is used by the middle layer of reflexes to anticipate collisions. This layer also uses a sampling-and-smoothing algorithm to handle noise and spurious readings in the various sensors; the routine uses scaling and biasing to lower the influence that a single (or a few) faulty readings have.
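One common way to get that smoothing in an interrupt routine is an integer exponential moving average, where each new sample only nudges the running value. This is a sketch of the idea, not the robot's actual filter; the 1/8 weight is an assumed value:

```c
/* Sketch of ISR-side smoothing: an integer exponential moving
   average. A single spurious reading moves the smoothed value by
   at most 1/8 of the spike, so one bad sample can't fake a wall. */

int smooth(int avg, int sample)
{
    /* new_avg = avg + (sample - avg) / 8, all in integer math */
    return avg + (sample - avg) / 8;
}
```

Called on every timer tick with the raw IR or SONAR reading, this keeps the reflex layer's view of the world stable without floating-point math.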
Physical elements currently on board:
- 3-axis accelerometer - being read by interrupt routine but not otherwise utilized
- AXON controller
- Sharp IR sensor
- light level sensor
- light sensor
- Servo turret to point the IR/SONAR and light sensors in different directions
- 2X16 Serial LCD for various information
- amplified speakers for the uMP3
- Rogue Robotics uMP3
- continuous rotation servos for drive wheels
- 3-tier chassis
To be completed:
1) rear IR collision sensor
2) utilize the connected 3-axis accelerometer to determine whether movement matches the desired state, i.e. sense no forward acceleration when movement is expected, or sideways acceleration when motion should be forward only
3) better mapping and movement algorithm for both wander and light seek mode
4) connect touch sensors to the pins programmed for them (I have bump sensor reflexes programmed but not in use)
5) battery monitor circuit to either shut down or complain that a recharge is needed
6) capacitance touch circuits on large areas to sense 'petting' and react with sound
7) continuous motion; right now the robot moves forward a bit, stops to scan the environment, and then plans the next move
8) movement sensing; I had a movement sensor on the bot but it's been removed. I may put a couple back on to have some way to follow movement and torment the cats
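To-do item 2 above amounts to a stall check: the drive is commanded forward but the accelerometer reports nothing. A minimal sketch, with the threshold and axis mapping assumed purely for illustration:

```c
/* Sketch of accelerometer-based stall detection (to-do item 2):
   flag a stall when the wheels are commanded forward but the
   forward-axis accelerometer reading stays inside a deadband.
   The deadband value is an assumption, not a tuned number. */

#define ACCEL_DEADBAND 15   /* counts below which we call it "no motion" */

/* Returns 1 if the robot appears stuck: commanded forward, flat accel. */
int stall_detected(int commanded_forward, int accel_forward)
{
    int mag = (accel_forward < 0) ? -accel_forward : accel_forward;
    return commanded_forward && (mag < ACCEL_DEADBAND);
}
```

A real version would also want to average a few readings first, since the same interrupt-routine smoothing applies to the accelerometer.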
Further down the road:
1) Love to do a recharge station
2) Zigbee communication to a home base
3) line-of-sight follow/communication with another bot
I've spent a lot of time getting the sound and music to work, since this robot is meant to interact with people (especially children) quite a bit. It'll get some initial use in my daughter's Kindergarten class along with a bumpbot that I built.
Current Picture with all basic elements mounted: