Mobile Service Robots

An autonomous mobile robot should be able to represent its operating environment in an internal model and use it for navigation. The occupancy grid, an explicit environment model, is commonly used for global robot motion planning and can be built from sonar sensor data.
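As a rough illustration of the idea (not the specific method used in this project), an occupancy grid can be maintained with a Bayesian log-odds update per cell; the grid size, resolution, and probabilities in the sketch below are assumptions chosen only for the example.

    import numpy as np

    class OccupancyGrid:
        """Minimal occupancy grid with one log-odds value per cell (illustrative sketch)."""

        def __init__(self, width, height, resolution=0.1):
            self.resolution = resolution               # cell edge length in metres (assumed)
            self.log_odds = np.zeros((height, width))  # 0.0 corresponds to p(occupied) = 0.5

        def update_cell(self, ix, iy, p_occupied):
            # Bayesian update in log-odds form: evidence from independent readings adds up.
            self.log_odds[iy, ix] += np.log(p_occupied / (1.0 - p_occupied))

        def probabilities(self):
            # Convert log-odds back to occupancy probabilities, e.g. for motion planning.
            return 1.0 / (1.0 + np.exp(-self.log_odds))

    # Example: a 20 m x 20 m grid at 10 cm resolution, one cell marked as likely occupied.
    grid = OccupancyGrid(200, 200)
    grid.update_cell(50, 60, 0.7)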

The following sensors were used for mapping:

  • a ring of 24 Polaroid ultrasonic sensors mounted on an RWI B21 mobile robot
  • a "Sick LMS200" laser scanner

The sonar fusion method is based on a probabilistic four-dimensional model, which is generated from sonar data recorded in a real environment and from the beam pattern properties of the sensors. The model of the laser range finder takes errors and delays of the scanner and odometry data into account.
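The calibrated four-dimensional sonar model itself is not reproduced here. The sketch below only shows the general shape of an inverse sensor model for a single sonar cone; the opening angle, range tolerance, and probabilities are assumptions.

    import math

    def sonar_inverse_model(cell_range, cell_angle, measured_range,
                            beam_half_angle=math.radians(12.5),
                            range_tolerance=0.1):
        """Assumed occupancy probability for one grid cell given one sonar reading.

        cell_range, cell_angle: polar coordinates of the cell centre relative to the
        sensor; measured_range: distance of the returned echo. All thresholds are
        illustrative, not the calibrated beam-pattern model.
        """
        if abs(cell_angle) > beam_half_angle or cell_range > measured_range + range_tolerance:
            return 0.5   # outside the beam or behind the echo: no information
        if cell_range < measured_range - range_tolerance:
            return 0.3   # between sensor and echo: probably free
        return 0.7       # close to the measured range: probably occupied

Each reading would then be folded into the grid with a log-odds update as in the sketch above.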

To navigate in a real environment using an occupancy grid, a mobile robot must know its current position and orientation. The "dead reckoning" approach to this problem is easy to implement and gives accurate estimates over short distances. Unfortunately, the measurement errors accumulate with the distance travelled.
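A minimal dead-reckoning sketch, assuming odometry that reports a travelled distance and a heading change per time step; the noise levels are invented only to make the accumulating drift visible.

    import math
    import random

    def dead_reckoning(increments, dist_noise=0.01, heading_noise=0.005):
        """Integrate noisy odometry increments (distance, heading change) into a pose.

        The standard deviations are assumptions; the point is that each step adds
        a small error, so the pose estimate drifts with the distance travelled.
        """
        x = y = theta = 0.0
        for d, dtheta in increments:
            theta += dtheta + random.gauss(0.0, heading_noise)
            d += random.gauss(0.0, dist_noise)
            x += d * math.cos(theta)
            y += d * math.sin(theta)
        return x, y, theta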

The procedure presented here is based on the correlation between sequentially generated local grid maps. In addition to sonar mapping, 2D laser range finder data can also be integrated into the map. Thanks to its accuracy and larger measuring range, the laser scanner also permits position corrections over long travel distances (long straight passages and walls are represented accurately). Moreover, no pre-existing plan of the environment is required.
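The exact correlation procedure is not given here. As a sketch under simplifying assumptions, the pose correction can be pictured as a brute-force search for the small cell offset that maximizes the correlation between a freshly built local map and the corresponding patch of the existing map; rotation search and sub-cell interpolation are omitted, and the search window is an assumed parameter.

    import numpy as np

    def best_offset(local_map, global_patch, max_shift=5):
        """Find the (dx, dy) cell offset with the highest correlation score.

        local_map and global_patch are equally sized occupancy arrays; the
        wrap-around introduced by np.roll is ignored for brevity, which is
        acceptable when the borders of the local map are close to 'unknown'.
        """
        best, best_score = (0, 0), -np.inf
        for dx in range(-max_shift, max_shift + 1):
            for dy in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(local_map, dx, axis=1), dy, axis=0)
                score = float(np.sum(shifted * global_patch))  # simple correlation
                if score > best_score:
                    best, best_score = (dx, dy), score
        return best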

Experiments with the two RWI B21 mobile robots "Colin" and "Robin" demonstrate the high accuracy of the maps and of the position and orientation tracking. At exploration velocities of up to 0.5 m/s, collision avoidance, mapping, and position tracking run online at the same time. The grid maps provide an accurate representation of the environment.



Top right: map generated with the ultrasonic sensors. Top left: map generated with the laser scanner. Below: the correlated maps.


Contact

Alexander Mojaev, Tel.: (07071) 29-77174, alexander.mojaev (at) uni-tuebingen.de
Philipp Vorst, Tel.: (07071) 29-70439, philipp.vorst (at) uni-tuebingen.de


Autonomous Mobile Robots at the Cognitive Systems Dept.

Research focuses on how to make intelligent systems mobile and autonomous. The computer architecture in particular has to respect the boundary conditions: restricted space, battery operation, unreliable radio links, and the limitations of the robot's sensors. The central question is how, despite these restrictions, a hardware and software architecture can be designed that enables the mobile robot to cope with an unknown environment (navigation), to build maps of the environment, to recognize people as well as objects, and to interact with users as intelligently as possible.
The basic research platform is initially the mobile robot B21 by Real World Interface (RWI), equipped with a stereo camera system, ultrasonic, infrared and tactile sensors, as well as a laser range finder.
Conventional image processing algorithms and neural networks are used to recognize objects and persons. For optimization tasks, EvA can be used offline.

Below are some pictures of Robin:

Front and side view of Robin: the pan-tilt unit with the stereo camera system, the radio Ethernet antenna, and the white compass module are visible. The narrow front doors visible in the left picture open for the arm, which will be installed later.

Robin from the front; Robin from the side

Robin from above: the pan-tilt unit is difficult to see. The big red buttons at the back are emergency stop buttons; the colored ones are user-definable buttons.

Robin from above

Robin has two dual-processor Pentium/133 MHz PCs running Linux. The right PC is equipped with two frame grabbers and is mainly responsible for image processing; the left PC handles driving control, navigation, and planning. The base itself contains an MC68332 microcontroller for the low-level execution of driving commands, which are sent via a serial interface.

Robin with the left side open; Robin with the right side open

From a standing position Robin can move in any direction: the synchronous drive turns all four wheels to the same heading. The top turns along with the wheels, while the base always keeps the same orientation. The four batteries with a capacity of 1500 Wh can power the robot for 6 hours.

Robin's base (synchro drive)
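As a small aside, the synchro-drive kinematics are simple to write down: all wheels share one steering angle and one speed, so the base position integrates like a single unicycle while the base orientation never changes. The sketch below is an illustrative assumption, not Robin's actual control code.

    import math

    def synchro_drive_step(x, y, v, heading, dt):
        """One integration step of a synchro drive.

        All four wheels point in the same direction `heading` and move with speed
        `v`; only the position (x, y) changes, the base keeps its orientation.
        """
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        return x, y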