Hi all! I need a little help!
I am currently working on a robotics project where a differential-drive robot navigates autonomously using stereo vision. The robot uses two webcams to capture real-time images, which are transmitted to a PC and processed with the MATLAB Computer Vision Toolbox. My first goal is to detect nearby obstacles using the depth map.
I could obtain a depth map by comparing the stereo image pair, but the result was a bit noisy. I was able to reduce this noise with a Kalman filter.
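For context, the smoothing I apply is essentially a scalar Kalman update run independently on every pixel across successive frames. A minimal NumPy sketch of that idea (the function name and the q/r values are placeholders I chose, not anything standard):

```python
import numpy as np

def kalman_smooth(frames, q=1e-3, r=0.05):
    """Scalar Kalman filter applied independently to every pixel.

    frames : iterable of (H, W) depth maps from successive stereo frames
    q      : process noise variance (how fast true depth may change)
    r      : measurement noise variance (depth-map noise) -- placeholder value
    """
    frames = iter(frames)
    x = next(frames).astype(float)   # state estimate: start at first depth map
    p = np.full_like(x, 1.0)         # estimate variance, per pixel
    for z in frames:
        p = p + q                    # predict: uncertainty grows between frames
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update: pull estimate toward measurement
        p = (1.0 - k) * p
    return x
```

With a static scene, the per-pixel estimates settle close to the true depths while frame-to-frame noise is averaged out.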
Now I have a depth map similar to the following matrix (the numbers are just to give an idea):
0.1 0.5 0.6 1.2 1.5 1.8 3.2 0.2 ... 0.6 0.4
0.5 0.4 0.6 3.2 1.5 6.8 3.2 1.2 ... 0.6 0.4
0.3 0.1 0.6 1.2 1.5 1.8 3.2 0.2 ... 0.6 0.4
0.1 0.4 0.6 2.6 1.5 5.8 3.2 3.2 ... 0.6 0.4
0.3 0.2 0.6 1.2 1.5 1.8 3.2 0.2 ... 0.6 0.4
...
0.2 0.6 0.6 1.2 1.5 1.8 3.2 0.1 ... 0.6 0.4

(480 rows x 640 columns)
Each number represents the depth at that pixel (a high number means far away, a low number means close).
Since I am using two 640x480 video streams, the depth map has 640 columns and 480 rows.
Now I need to use a clustering or classification algorithm on these depth data so I can detect the area where an obstacle is. I tried the k-means clustering algorithm in MATLAB (kmeans), but I still don't have a clear idea of how to apply it to my application.
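To make the question concrete, here is what I think the clustering step should look like, written as a plain-NumPy sketch of 1-D k-means on the depth values (I believe this is roughly what MATLAB's `kmeans(depth(:), 3)` does; `cluster_depth` and the quantile initialization are my own choices):

```python
import numpy as np

def cluster_depth(depth, k=3, iters=20):
    """Cluster a depth map's values with 1-D k-means.
    Returns a label image (label 0 = nearest-depth cluster,
    label k-1 = farthest) and the sorted cluster centers."""
    vals = depth.reshape(-1, 1).astype(float)
    # initialize centers at spread-out quantiles instead of random picks
    centers = np.quantile(vals, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(vals - centers), axis=1)   # assign step
        for j in range(k):
            if np.any(labels == j):
                centers[j] = vals[labels == j].mean()        # update step
    order = np.argsort(centers)          # relabel so 0 is the nearest cluster
    remap = np.empty(k, dtype=int)
    remap[order] = np.arange(k)
    return remap[labels].reshape(depth.shape), centers[order]
```

The idea would then be to treat pixels in the nearest-depth cluster (`labels == 0`) as obstacle candidates, but I am not sure whether per-pixel k-means like this is the right approach, or whether I should cluster on (row, column, depth) together.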
What I need is a result like this (please check the attached image).

The nearby objects are shown in red. I need a clustering/classification algorithm to do this segmentation; then the robot can decide which direction to turn to avoid the obstacle.
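For the steering decision, the simplest policy I can think of is to split the segmented obstacle mask into left/center/right thirds and head for the third with the fewest obstacle pixels. A sketch, assuming a boolean mask where True marks a nearby obstacle (this is just one candidate policy, not something I have working):

```python
import numpy as np

def pick_direction(obstacle_mask):
    """Split a boolean obstacle mask (True = nearby obstacle) into
    left / center / right column thirds and steer toward the third
    containing the fewest obstacle pixels."""
    h, w = obstacle_mask.shape
    thirds = np.array_split(np.arange(w), 3)                 # column index groups
    counts = [obstacle_mask[:, cols].sum() for cols in thirds]
    return ["left", "straight", "right"][int(np.argmin(counts))]
```

I would appreciate advice on whether something this simple is workable or whether a proper occupancy-grid approach is needed.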
If there are experts in this field here, please give me some ideas on how to get through this.
Your suggestions are really appreciated. Thanks!