Wednesday, June 19, 2013

SLAM with imperfect sensors

SLAM from 5 poses joined into a best-belief map. Red is the particle filter's position likelihood estimate. Green lines are sonar readings and yellow are IR distance readings. The rectangles and line show the old and current poses. Circles are 10 cm apart.
I have been very sparse with my blogging, but I'm actually getting somewhere with my SLAM attempts. I can now combine sensor readings from multiple locations as the robot moves between poses.

The sensor readings come from IR and ultrasonic distance sensors mounted on a servo, taking readings 1 degree apart, resulting in 170 distance readings per pose. I have a compass but I'm not using it right now. These sensor readings are used to estimate the new position with a particle filter.
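To illustrate the particle filter idea, here is a minimal Python sketch, not my actual robot code: it estimates position along a 1D corridor from noisy range readings. The corridor length, noise parameters and function names are all assumptions for the example.

```python
import math
import random

WALL = 300.0  # cm; hypothetical corridor length, sensor measures distance to the far wall


def measure(x, noise=0.0):
    """Simulated distance reading from position x to the wall ahead."""
    return WALL - x + random.gauss(0.0, noise)


def particle_filter(readings, moves, n=500, motion_noise=2.0, sensor_noise=5.0):
    """Estimate position from noisy readings; readings[i] was taken after moves[i]."""
    particles = [random.uniform(0.0, WALL) for _ in range(n)]
    for move, z in zip(moves, readings):
        # Motion update: shift every particle, adding odometry noise.
        particles = [p + move + random.gauss(0.0, motion_noise) for p in particles]
        # Measurement update: weight by Gaussian likelihood of the reading.
        weights = [math.exp(-((measure(p) - z) ** 2) / (2 * sensor_noise ** 2))
                   for p in particles]
        total = sum(weights) or 1e-12
        weights = [w / total for w in weights]
        # Resample proportional to weight, so likely positions dominate.
        particles = random.choices(particles, weights=weights, k=n)
    return sum(particles) / n  # mean of the particle cloud = position estimate
```

The real problem is of course 2D with 170 readings per pose, but the loop is the same: move particles, weight them against the measurements, resample.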
Robot with LiPo battery. IR sensor and ultrasonic sensor on a servo at the front.
My biggest problem right now is that the sensors behave somewhat unpredictably: not just noisy, but the readings also vary with the material, angle and size of the object. I have tried to mitigate this by combining the IR with the ultrasonic.

IR has a narrow beam but is sensitive to reflective, transparent and luminescent materials. It also behaves badly on striped materials. My IR sensor also only works reliably between 20 and 100 cm.

The ultrasonic sensor has a wide beam, 15-20 degrees, is quite precise and works on most materials I know of, but is sensitive to steep angles.
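The way I combine the two can be sketched roughly like this. This is a hedged Python example, not my firmware: the variances and the disagreement threshold are made-up numbers you would tune per sensor. Only the 20-100 cm IR band comes from my actual setup.

```python
IR_MIN, IR_MAX = 20.0, 100.0  # usable IR range in cm
IR_VAR = 2.0 ** 2             # assumed reading variances; tune for your sensors
SONAR_VAR = 4.0 ** 2


def fuse(ir, sonar):
    """Combine one IR and one ultrasonic reading taken at the same servo angle.

    Outside the IR sensor's usable band, fall back to the sonar reading; inside
    it, an inverse-variance weighted average favours the narrow IR beam."""
    if ir is None or not (IR_MIN <= ir <= IR_MAX):
        return sonar
    if sonar is None:
        return ir
    # Gross disagreement: the sonar's wide cone probably caught a nearer edge,
    # so trust the narrow IR beam within its valid band.
    if abs(ir - sonar) > 30.0:
        return ir
    w_ir = 1.0 / IR_VAR
    w_sonar = 1.0 / SONAR_VAR
    return (w_ir * ir + w_sonar * sonar) / (w_ir + w_sonar)
```

For example, `fuse(50.0, 54.0)` gives a value pulled toward the IR reading, `fuse(150.0, 140.0)` falls back to the sonar because the IR is out of range, and `fuse(50.0, 120.0)` keeps the IR reading because the two disagree too much.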
