A major drawback of many robotics projects is their dependence on a specific environment and the otherwise uncertain behavior of the hardware. Even simple navigation tasks, such as driving in a straight line, can accumulate strong lateral drift over time in an unknown environment. In this paper we propose a fast and simple solution to the lateral drift problem for vision-guided robots based on real-time scene analysis. Without any environment-specific calibration of the robot's drive system, we balance the differential drive speeds on the fly. To this end, a feature detector is applied to consecutive images. The detected feature points determine the focus of expansion (FOE), which is used to locate and correct the robot's lateral drift. Results from an unmodified real-world indoor environment demonstrate that our method corrects most of the lateral drift, based solely on real-time vision processing.
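The FOE computation mentioned above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function `estimate_foe`, the least-squares formulation, and the synthetic flow field are our own assumptions. It relies on the fact that, under pure forward translation, every optical-flow vector points radially away from the FOE.

```python
# Sketch: estimating the focus of expansion (FOE) from sparse feature flow.
# Each matched feature yields a flow vector (u, v) at image point (x, y).
# For pure forward translation the flow points radially away from the FOE,
# so the 2D cross product (x - fx, y - fy) x (u, v) vanishes:
#     u*(y - fy) - v*(x - fx) = 0   ->   v*fx - u*fy = v*x - u*y
# Stacking one such equation per feature gives an overdetermined linear
# system A [fx, fy]^T = b, solved here via 2x2 normal equations.

def estimate_foe(points, flows):
    """points: list of (x, y); flows: list of (u, v). Returns (fx, fy)."""
    # Accumulate A^T A and A^T b, where each row of A is (v, -u).
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x, y), (u, v) in zip(points, flows):
        r1, r2 = v, -u
        rhs = v * x - u * y
        a11 += r1 * r1
        a12 += r1 * r2
        a22 += r2 * r2
        b1 += r1 * rhs
        b2 += r2 * rhs
    det = a11 * a22 - a12 * a12
    if abs(det) < 1e-12:
        raise ValueError("degenerate flow field: FOE not observable")
    # Closed-form solution of the 2x2 normal equations.
    fx = (a22 * b1 - a12 * b2) / det
    fy = (a11 * b2 - a12 * b1) / det
    return fx, fy

# Synthetic check: a noise-free flow field radiating from FOE at (320, 240).
foe = (320.0, 240.0)
pts = [(100.0, 50.0), (500.0, 400.0), (320.0, 10.0), (50.0, 240.0)]
flws = [(0.1 * (x - foe[0]), 0.1 * (y - foe[1])) for x, y in pts]
print(estimate_foe(pts, flws))  # recovers approximately (320.0, 240.0)
```

A horizontal offset of the estimated FOE from the image center then indicates lateral drift, which a controller can counteract by adjusting the differential drive speeds.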