SLAM is the process of simultaneously building a map of an unknown environment and localizing a robot relative to that map using its on-board sensors. Most SLAM algorithms have relied on an expensive laser scanner, but range sensors alone cannot guarantee robustness. A vision sensor can compensate for the shortcomings of range sensors, but extracting visual features is time-consuming. Fusing range sensors and vision sensors is effective because it combines the merits of both. This paper proposes an EKF-based SLAM algorithm that uses stereo vision and a low-cost IR scanner. The rough pose estimate from the IR scanner is compensated for by object recognition, and the slow update rate of the vision information is overcome by the frequent update of the range information. A method to cope with the uncertainty of features is also proposed. Experimental results show that the estimated robot pose is accurate enough to be used for building an accurate map.
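The fusion scheme described above — a fast, noisy range update compensated by a slower, more accurate vision update within one EKF — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the unicycle motion model, direct pose observations, the noise levels, and the 5:1 update-rate ratio are all assumptions chosen for clarity.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Motion update for a planar pose x = [px, py, theta] (assumed unicycle model).
    u = [v, w]: linear and angular displacement over one time step."""
    px, py, th = x
    v, w = u
    x_new = np.array([px + v * np.cos(th), py + v * np.sin(th), th + w])
    # Jacobian of the motion model w.r.t. the state
    F = np.array([[1.0, 0.0, -v * np.sin(th)],
                  [0.0, 1.0,  v * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def ekf_update(x, P, z, H, R):
    """Standard EKF correction for a linear(ized) observation z = H x + noise."""
    y = z - H @ x                                  # innovation
    S = H @ P @ H.T + R                            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

# Illustrative noise settings: the IR range update is frequent but coarse,
# the vision update is infrequent but precise (both observe the pose directly here).
H = np.eye(3)
R_range  = 0.50 * np.eye(3)   # coarse IR scanner measurement noise (assumed)
R_vision = 0.05 * np.eye(3)   # precise stereo-vision measurement noise (assumed)
Q = 0.01 * np.eye(3)          # process noise (assumed)

x, P = np.zeros(3), np.eye(3)
for k in range(10):
    x, P = ekf_predict(x, P, np.array([0.1, 0.05]), Q)
    x, P = ekf_update(x, P, z=x.copy(), H=H, R=R_range)   # every step: IR update
    if k % 5 == 0:                                        # every 5th step: vision update
        x, P = ekf_update(x, P, z=x.copy(), H=H, R=R_vision)
print(np.trace(P))
```

The key point the sketch illustrates is that each sensor corrects the same state vector at its own rate: the filter never waits for the slow vision pipeline, yet its occasional precise corrections keep the covariance from growing under the coarse range-only updates.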