Wall-Following Robot Navigation Data


About

The data were collected as the SCITOS G5 robot navigated through a room following the wall in a clockwise direction, for 4 rounds, using 24 ultrasound sensors arranged circularly around its 'waist'. Sensor readings were sampled at a rate of 9 samples per second.

The provided files comprise three different data sets. The first contains the raw values of all 24 ultrasound sensor measurements and the corresponding class label (see Section 7). The second contains four sensor readings, named 'simplified distances', and the corresponding class label (see Section 7). These simplified distances are referred to as the 'front distance', 'left distance', 'right distance' and 'back distance'; they consist, respectively, of the minimum sensor reading among those within 60-degree arcs located at the front, left, right and back of the robot. The third contains only the front and left simplified distances and the corresponding class label. Because the 24 ultrasound readings and the simplified distances were collected at the same time steps, each file has the same number of rows (one per sampling time step).

The wall-following task and data gathering were designed to test the hypothesis that this apparently simple navigation task is in fact a non-linearly separable classification task. Thus, linear classifiers, such as the Perceptron network, are not able to learn the task and command the robot around the room without collisions, whereas nonlinear neural classifiers, such as the MLP network, are able to learn the task and command the robot successfully without collisions. If some kind of short-term memory mechanism is provided to the neural classifiers, their performance generally improves. For example, if past inputs are provided together with the current sensor readings, even the Perceptron becomes able to learn the task and command the robot successfully. If a recurrent neural network, such as the Elman network, is used to learn the task, the resulting dynamical classifier is able to learn the task with fewer hidden neurons than the MLP network. Files with different numbers of sensor readings were built in order to evaluate the performance of the classifiers with respect to the number of inputs.
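As a rough illustration of the description above, the sketch below (not the authors' original code) derives four simplified distances from the 24 circular readings and contrasts a linear Perceptron with a nonlinear MLP on the raw data. The file name, the sensor-to-arc index assignment (24 sensors spaced 15 degrees apart, so a 60-degree arc spans 4 sensors), and the classifier settings are assumptions for demonstration only.

```python
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical file name; assumed layout is 24 readings followed by the class label.
data = np.genfromtxt("sensor_readings_24.data", delimiter=",", dtype=str)
X = data[:, :24].astype(float)   # 24 ultrasound readings per time step
y = data[:, 24]                  # movement-command class label

def simplified_distances(readings):
    """Minimum reading within each 60-degree arc (front, left, right, back).

    The index ranges are an assumed sensor ordering, not taken from the
    dataset documentation.
    """
    front = readings[:, 0:4].min(axis=1)
    right = readings[:, 6:10].min(axis=1)
    back = readings[:, 12:16].min(axis=1)
    left = readings[:, 18:22].min(axis=1)
    return np.stack([front, left, right, back], axis=1)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear vs. nonlinear classifier, mirroring the hypothesis stated above.
for model in (Perceptron(max_iter=1000),
              MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000)):
    model.fit(X_train, y_train)
    print(type(model).__name__, model.score(X_test, y_test))
```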
Subject Area
Computer Science
Instances
5,456
Features
24
Data Types
Multivariate, Sequential
Tasks
Classification
Feature Types
Continuous

Additional Metadata

Authors
Ananda Freire
Marcus Veloso
Guilherme Barreto
Year Created
2009
License
CC BY 4.0