I'd like to measure the velocity of a (hypothetical) robot's body relative to its environment (room). The robot is an unstable biped which will fall if left alone. Does anyone have any idea how this could be done?
My best idea so far is to use an accelerometer and integrate its output over time, but after a while the velocity estimate will drift very far from reality due to accumulated integration error. Is this correct?

Even worse, I think the accelerometer's output is affected by the direction of gravity. If I understand correctly, an accelerometer reports the real acceleration plus 1 g in the Z direction (up/down). This means that if the robot is motionless and level, I'll get a Z reading of 1 g, and I'm supposed to subtract this 1 g in order to get the real Z acceleration. But if the robot is tilted and motionless, the gravity vector won't be directed along the sensor's Z axis, so after subtracting 1 g from the Z reading it will look as if the robot is accelerating. This means that I can't really know how the robot is accelerating without knowing precisely how it's tilted.
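To make the tilt problem concrete, here's a minimal sketch (my own illustration, with an assumed 10° tilt and 100 Hz sample rate) of what happens if a motionless but tilted accelerometer's readings are "corrected" by subtracting 1 g from Z only and then integrated:

```python
import math

# Hypothetical illustration: a motionless accelerometer tilted by `tilt`
# radians about its X axis. At rest the sensor reads +1 g, split across
# its own Y and Z axes by the tilt.
G = 9.81                   # gravitational acceleration, m/s^2
tilt = math.radians(10)    # assumed 10-degree tilt
dt = 0.01                  # assumed 100 Hz sample period
samples = 1000             # 10 seconds of data

# Body-frame readings of the motionless, tilted sensor:
ay = G * math.sin(tilt)    # gravity component leaking into Y
az = G * math.cos(tilt)    # reduced gravity component along Z

# Naive correction: subtract 1 g from Z only, then integrate velocity.
vy = vz = 0.0
for _ in range(samples):
    vy += ay * dt          # all of the leaked gravity is treated as motion
    vz += (az - G) * dt    # Z is over-corrected by G * (1 - cos(tilt))

speed_error = math.hypot(vy, vz)
print(f"spurious speed after 10 s: {speed_error:.2f} m/s")  # about 17 m/s
```

Even at a modest 10° tilt, the integrated velocity error grows at roughly `G * sin(tilt)` per second, which is why orientation has to be estimated before accelerometer data can be integrated.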
Does anyone know a way to overcome these problems? Are these problems even real? Is there another way to estimate a robot's velocity?