 

Robot Sensor and Command Stream Recordings
from RoboCup German Open 2007

1. Usage and Semantics

This page provides access to data that was collected in the competition arena of the Rescue Robot League during the RoboCup German Open at the Hannover Fair in April 2007. It consists of recorded streams of raw sensor data and motor commands of a so-called Jacobs rugbot (“rugged robot”); please refer to [1][2] for more detailed information on this robot and its operation. The main design purpose of this robot is research in rescue robotics [3]. The data and the related software can be freely used for academic purposes. Please cite the Jacobs robots [1][2] when you generate your own work based on this data. Please contact Robotics at Jacobs University (http://robotics.jacobs-university.de) if you want to use the data or the related software for non-academic purposes. We thank the German Research Foundation / Deutsche Forschungsgemeinschaft (DFG) for their support of the project 3D Mapping in Unstructured Environments, during which parts of the recording and playback software for this data were developed.

There are two zipped data collections of the streams from two different runs of the robot:

The data formats are described in README files. Nevertheless, to replay this data, it is easiest to use the Jacobs Robotics Sensor Replay Software Framework, which consists of example C++ code for accessing and processing the data. It can be obtained free of charge for academic purposes from the following link:

The recorded streams contain data from the following five sensors:

  • an Xsens MTi gyroscope,
  • two Hokuyo URG-04LX laser range-finders,
  • a CSEM SwissRanger 3D time-of-flight camera, and
  • a CubeSystem, an in-house development for fast robot prototyping.

Click on the robot image to see where the sensors are mounted on the robot. In the following, the sensor properties are explained in detail.

 

1.1 3D Gyroscope

The gyro delivers the roll, pitch, and yaw of the robot, as well as the initial yaw. The initial yaw is provided because yaw is given relative to local magnetic north, whereas most mapping algorithms expect the initial yaw to be 0; subtracting the initial yaw from each yaw reading yields orientations relative to the robot's starting pose. The unit is centidegrees, i.e. 1/100 of a degree.
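
Below is a minimal sketch of this conversion, assuming the yaw values are read as plain integers in centidegrees; the function names are illustrative and not part of the replay framework.

    // Minimal sketch (not part of the replay framework; names are illustrative):
    // converting the gyro's centidegree readings to radians and referencing yaw
    // to the initial yaw, as described above.
    const double kPi = 3.14159265358979323846;

    // Convert a reading in centidegrees (1/100 degree) to radians.
    double centidegToRad(int centideg) {
        return (centideg / 100.0) * kPi / 180.0;
    }

    // Yaw relative to the robot's starting orientation: subtract the initial yaw
    // (given relative to local magnetic north) and wrap the result into [-pi, pi).
    double relativeYaw(int yawCentideg, int initialYawCentideg) {
        double yaw = centidegToRad(yawCentideg - initialYawCentideg);
        while (yaw >= kPi) yaw -= 2.0 * kPi;
        while (yaw < -kPi) yaw += 2.0 * kPi;
        return yaw;
    }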

URL: http://xsens.com/

 

1.2 Laser-Range-Finder (LRF)

Both LRFs on the robot are of the same type. The opening angle is 240 degrees, which is covered by 682 beams. The unit is mm. Values below 20 are to be considered error beams; these are either beams whose target lies beyond the LRF's maximum range of 4 m, or beams which have hit a disadvantageous surface. A sketch for converting a scan into 2D points follows the list below.

  • Horizontal LRF: One LRF is mounted such that its scan plane is parallel to the floor. Its typical use is mapping.

  • Tilted LRF: Another LRF is mounted at an angle of 45 degrees to the floor, at a perpendicular distance of 420 mm. It can be used to prevent the robot from falling into abysses.
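
The following sketch shows one way to turn a horizontal scan into 2D points; it is not the official replay framework, and the angular convention (beam 0 at -120 degrees, last beam at +120 degrees) is an assumption that should be checked against the README files.

    // Minimal sketch (assumptions as stated above): turning one horizontal LRF
    // scan of 682 range readings [mm] into 2D points in the sensor frame.
    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct Point2D { double x; double y; };   // meters, sensor frame

    std::vector<Point2D> scanToPoints(const std::vector<int>& rangesMm) {
        const double kPi = 3.14159265358979323846;
        const double startDeg = -120.0;                           // assumed convention
        const double stepDeg = 240.0 / (rangesMm.size() - 1);     // 682 beams over 240 deg
        std::vector<Point2D> points;
        for (std::size_t i = 0; i < rangesMm.size(); ++i) {
            if (rangesMm[i] < 20) continue;                       // values below 20 are error beams
            double angle = (startDeg + i * stepDeg) * kPi / 180.0;
            double r = rangesMm[i] / 1000.0;                      // mm -> m
            points.push_back({r * std::cos(angle), r * std::sin(angle)});
        }
        return points;
    }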

URL: http://www.hokuyo-aut.jp/products/urg/urg.htm

 

1.3 SwissRanger

This is a 3D time-of-flight (TOF) camera which delivers Cartesian point clouds. The coordinate system is as follows: the x-axis goes into the camera, the y-axis goes to the right, and the z-axis goes up. The data can also be interpreted as a 176x144 pixel distance image by using just the x coordinates of the points. The pixels are stored line-wise, starting at the top-left. The unit is meters. Besides the distance information, the SwissRanger also provides brightness information, with values between 0 and 255.
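
A minimal sketch of this interpretation follows, assuming the point cloud is read into a flat array in the line-wise order described above; the types and names are illustrative only.

    // Minimal sketch (assumed layout, following the description above): treating
    // the SwissRanger point cloud as a 176x144 distance image, where the distance
    // value of a pixel is the x coordinate of the corresponding 3D point [m].
    #include <vector>

    struct Point3D { double x, y, z; };   // meters; x into the camera, y right, z up

    const int kWidth  = 176;
    const int kHeight = 144;

    // Distance [m] at image position (row, col), with (0, 0) being the top-left pixel.
    double distanceAt(const std::vector<Point3D>& cloud, int row, int col) {
        return cloud[row * kWidth + col].x;
    }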

URL: http://www.swissranger.ch/

 

1.4 Robot Control Data by the CubeSystem

The CubeSystem is a development kit for fast robot prototyping [4][5]. It is based on a Motorola 68k processor and takes care of low-level tasks like motor control and odometry. Hence, it can provide the (x, y) position of the robot in mm. In addition, it stores the values of the last commands given to the motors. These do not necessarily correspond to the current speed of the robot, but it can generally be assumed that the motors are turning if the commanded speeds change between updates.
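
The sketch below illustrates this heuristic; the record layout and field names are hypothetical and do not reflect the actual CubeSystem data format.

    // Minimal sketch (illustrative field names, not the actual CubeSystem format):
    // an odometry/command record and the heuristic from the text above, i.e. if
    // the commanded speeds change between updates, the motors are turning.
    struct CubeRecord {
        int xMm;           // odometry x position [mm]
        int yMm;           // odometry y position [mm]
        int leftCommand;   // last command given to the left motor
        int rightCommand;  // last command given to the right motor
    };

    bool motorsTurning(const CubeRecord& previous, const CubeRecord& current) {
        return previous.leftCommand  != current.leftCommand ||
               previous.rightCommand != current.rightCommand;
    }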

URL: http://robotics.jacobs-university.de/CubeSystem/


2. Examples

Here are two maps of the arena generated from the streams by using the LRF data as input to the SLAM algorithm by Grisetti et al. [6]:


Here are two videos in the form of animated GIFs, generated from the SwissRanger data as the robot moves through the arena:

Run 1:

Run 2:

Each video shows the distance information on the left (bright is near, dark is far, calculated per frame), annotated with a frame count, and the grayscale intensity image on the right. The first frames were dropped when creating the videos, since the robot was standing still for quite some time at the beginning of the runs.

 

3. References

[1] Andreas Birk, Kaustubh Pathak, Soeren Schwertfeger and Winai Chonnaparamutt
The IUB Rugbot: an intelligent, rugged mobile robot for search and rescue operations
International Workshop on Safety, Security, and Rescue Robotics (SSRR)
IEEE Press, 2006

[2] Andreas Birk, Stefan Markov, Ivan Delchev and Kaustubh Pathak
Autonomous Rescue Operations on the IUB Rugbot
International Workshop on Safety, Security, and Rescue Robotics (SSRR)
IEEE Press, 2006

[3] Andreas Birk and Stefano Carpin
Rescue Robotics - a crucial milestone on the road to autonomous systems
Advanced Robotics Journal, 20 (5),
VSP International Science Publishers, 2006

[4] Andreas Birk
Fast Robot Prototyping with the CubeSystem
International Conference on Robotics and Automation, ICRA,
IEEE Press, 2004

[5] Andreas Birk, Holger Kenn and Thomas Walle
On-board Control in the RoboCup Small Robots League
Advanced Robotics Journal,
Volume 14, Number 1, pp 27 - 36
VSP International Science Publishers, 2000

[6] Giorgio Grisetti, Cyrill Stachniss and Wolfram Burgard
Improving Grid-based SLAM with Rao-Blackwellized Particle Filters by Adaptive Proposals and Selective Resampling
International Conference on Robotics and Automation, ICRA,
IEEE Press, 2005


Note: IUB (International University Bremen) was renamed Jacobs University.