IMO, discussing a somewhat complicated and textbook-oriented topic like a kalman filter on an open forum like this will make it easier to understand and implement for less advanced users (like me Wink), and maybe help trigger some grey cells which might actually lead to an optimal and more general solution of the SLAM problem.
oh, yes. i agree. i was just trying to explain why, although Benji had asked similar questions across a few different threads, he had not got any real answers.
the more focused a question, though, the more likely an answer. asking how to implement SLAM or how a kalman filter works is better answered by a reference to a textbook or .pdf.
on the other hand, explaining what you do understand about such a particular point and asking what is wrong with your understanding is far more likely to get a useful answer.
and at least one proprioceptive sensor, i.e. one which tries to detect its own change in location using actuator motion or something similar (encoder, gps, etc.). i think you don't have any sensors truly belonging to the second category.
i think Benji was planning to trust that his bot had gone the distance it was commanded to. you can get acceptable resolution this way as long as you have high build quality and are running on a smooth surface.
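that "trust the commanded motion" approach is just dead reckoning with no error model. a minimal sketch of the idea (the class and method names here are my own invention, not from any particular library):

```python
import math

class DeadReckoner:
    """Tracks an estimated pose by trusting commanded motion exactly.
    No error model: we pretend the robot moves precisely as told."""
    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x = x              # metres
        self.y = y              # metres
        self.heading = heading  # radians: 0 = east, pi/2 = north

    def move(self, distance, new_heading=None):
        """Apply a commanded move; optionally turn to a new heading first."""
        if new_heading is not None:
            self.heading = new_heading
        self.x += distance * math.cos(self.heading)
        self.y += distance * math.sin(self.heading)

bot = DeadReckoner()
bot.move(2.0, new_heading=math.pi / 2)   # "move 2 meters north"
print(round(bot.x, 6), round(bot.y, 6))  # → 0.0 2.0
```

in reality wheel slip and surface roughness make the true pose drift away from this estimate, which is exactly the error SLAM exists to correct.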
here is what i don't understand: are the filter's inputs multiple sensor readings of the same location, or readings at different locations?
both are valid.
if you have multiple sensor readings from the same location then you can assume the distance traveled = 0 (with no error), so the maths is far simpler as you only have to work out the correct way to average the sensor readings.
the examples in the paper mostly presume readings at different locations though.
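for the "distance traveled = 0" case, the "correct way to average" turns out to be weighting each reading by how uncertain it is. that is all a kalman update does when the state doesn't move. a rough sketch (the variance numbers are made up for illustration):

```python
def fuse(est, est_var, meas, meas_var):
    """Variance-weighted average of two estimates of the same
    stationary quantity - the kalman update with zero motion."""
    k = est_var / (est_var + meas_var)  # kalman gain: trust in new reading
    new_est = est + k * (meas - est)
    new_var = (1.0 - k) * est_var       # fused estimate is more certain
    return new_est, new_var

# two sonar readings of the same wall; second sensor is noisier
est, var = 1.00, 0.04                 # first reading: 1.00 m, variance 0.04
est, var = fuse(est, var, 1.20, 0.08)
print(est, var)
```

notice the fused variance is smaller than either input variance: combining readings always makes you more certain, never less.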
if you could explain the basic equation i would be glad.
this is the bit i feel the paper explains far better than i can (the guy who wrote the paper has a far better understanding of it than i do).
i'll tell you what i want:
i have a robot, and i should get it to map a room (so it's an indoor environment) and show this map in MATLAB.
the wavefront algorithm is an algorithm to navigate the robot to a designated target, so it's not exactly what i want.
the wavefront algorithm does not explain enough about how to generate the map of the room it's using to navigate, which is what i actually WANT to do.
so i only found SLAM to be capable of mapping, and i'm not allowed to use beacons. is there anything other than SLAM?
so as i said before, i would simplify the problem.
you want the robot to map the objects around it so it can navigate its way round them?
if i were you i would forget about SLAM for now. (make SLAM your next project...)
SLAM is used to take into account the error you always get as a robot moves.
for the time being i would ignore this error. pretend your robot is exactly where it thinks it is.
now your robot has a map that is pretty accurate for objects it has seen recently but gets less accurate for objects that are further away.
if you command your robot to "move 2 meters north", it has a compass to know with reasonable accuracy which way north is. it can now work out a path round the objects on its map to reach the destination. the objects close by on the map are probably going to be reasonably accurate, but as it moves further it may discover objects in different places than it thought. this doesn't really matter for its current goal of "move 2 meters north", as it will still be close to a possible path (presumably things have not moved *too* far since last time it was here...). simply overwrite the map with the new data, as it will be more accurate than the old.
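the "overwrite the map with the new data" idea can be sketched as a simple occupancy grid. this is my own guess at one workable shape, not how Admin's Wavefront code actually does it:

```python
class GridMap:
    """Occupancy grid: 0 = unknown/free, 1 = obstacle.
    New readings simply overwrite old ones - no SLAM, no error model."""
    def __init__(self, width, height, cell_size=0.1):
        self.cell_size = cell_size                    # metres per cell
        self.cells = [[0] * width for _ in range(height)]

    def update(self, x, y, occupied):
        """Record a sensor reading at world coords (x, y) in metres,
        overwriting whatever was stored for that cell before."""
        col = int(x / self.cell_size)
        row = int(y / self.cell_size)
        if 0 <= row < len(self.cells) and 0 <= col < len(self.cells[0]):
            self.cells[row][col] = 1 if occupied else 0

m = GridMap(20, 20)
m.update(0.55, 0.35, occupied=True)   # sonar saw an obstacle here
m.update(0.55, 0.35, occupied=False)  # next pass: it's gone - overwrite
```

a route planner (wavefront or otherwise) can then run over `m.cells` as if the map were exact, which is the "pretend your robot is exactly where it thinks it is" simplification.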
what i'm describing here is not SLAM. it's the same way Admin's Wavefront robot builds a map.
while it will allow your bot to take commands like "move 2 meters north" it will not allow you to build an accurate map with no errors.
i would leave that to a future project and first work on the simpler task of recording map data to the best of the robot's ability and making route-finding decisions based on it.
if you do need an accurate map of everything the robot has seen then please ignore my ramblings.
it should be possible with the limited sensors you are planning but it will take you a *long* time.
i did see an article about a university robot project that could locate itself on a predetermined map with multiple readings from only one sonar sensor, but i can't seem to find it now...