Supporting material for the paper:

Evolving homogeneous neuro-controllers for a group of heterogeneous robots:
coordinated motion, cooperation, and acoustic communication

by Elio Tuci, Christos Ampatzis, Federico Vicentini, and Marco Dorigo
August 2006


Table of Contents
  1. First post-evaluation tests: trials completion time graph
  2. Group n.9: movies of successful trials
  3. Group n.9: a description of the signalling behaviour
  4. Group n.9: the significance of the Interaural Intensity Differences (IIDs)


First post-evaluation tests: trials completion time graph


Box-and-whisker plots visualising the completion time of successful post-evaluation trials of groups (n. 1 to n. 10) whose controllers are built from the genotype with the highest fitness of each evolutionary simulation. Grey bars refer to Env L, and white bars refer to Env R. The horizontal line indicates the time-limit of a trial during evolution. Each box comprises observations ranging from the first to the third quartile. The median is indicated by a horizontal bar. The whiskers extend to the most extreme data point which is no more than 1.5 times the interquartile range from the box. Outliers are indicated as circles.

Group n.9: movies of successful trials


[Movie thumbnails: Env L (left) and Env R (right)]

By clicking on the pictures, it is possible to play the movies, which show the trajectories of the simulated robots during a successful trial in each environment. In the movies, the purple circle represents robot R_al (in the pictures, the black lines), and the green and blue circles represent the robots R_ir (in the pictures, the light and dark grey lines, respectively). The small yellow circle is the light. At the bottom of the window, the orange bars represent the activation of the IR sensors of robots R_ir; the yellow bars represent the activation of the ambient light sensors of robot R_al; the black bars represent the activation of the sound sensors of each robot.

Group n.9: a description of the signalling behaviour

Each robot of the group is required to coordinate its actions in order (i) to remain close to the other two agents without incurring collisions, and (ii) to take actions which bring the group closer to the target. What is the role of signalling in the achievement of these goals? Is signalling used by robot R_al to communicate to robots R_ir information concerning the relative position of the target? Similarly, is signalling used by robots R_ir to inform robot R_al about the position of obstacles with which it may collide? In order to answer these questions, we carried out a series of tests that look at the properties of the sound signals perceived by each robot during a successful trial in each environment. Our goal is to identify oscillatory phenomena or other distinctive features in sound production/perception whose properties can be exploited by the robots to coordinate their actions. Although our analysis is limited to two successful trials, one for each type of environment, we hope that the results help us to formulate "general" hypotheses concerning the role of signalling in the coordination of the group's actions.

Before proceeding further, we should remind the reader that the intensity of the sound perceived at each microphone results from the summation of two components, the "self" and the "non-self", plus noise. The "self" component (i.e., the agent's own signal) is determined only by the intensity of the sound emitted by the robot itself. The "non-self" component is determined by the intensity at which the sound is emitted from the loud-speaker of a sender, as well as by the relative distance and orientation of the loud-speaker with respect to the receiver's microphones. Although the agents have no means to distinguish between the "self" and the "non-self" components of the perceived sound, they can act in ways that produce patterns in the flow of sensations which are informative about their spatial relationships.
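As a minimal sketch of this perceptual model in Python (the attenuation law, the noise level, and the function name perceived_intensity are our own illustrative assumptions, not taken from the paper):

    import numpy as np

    def perceived_intensity(self_emission, sender_emission, distance,
                            rel_angle, noise_std=0.01, rng=None):
        """Toy model of the sound intensity read at one microphone: the
        sum of the "self" component, the "non-self" component (the
        sender's signal attenuated by distance and by the relative
        orientation of loud-speaker and microphone), and noise."""
        rng = np.random.default_rng() if rng is None else rng
        # Hypothetical attenuation law: cosine-shaped directional gain
        # (clipped at zero) and inverse-square decay with distance.
        directional_gain = max(0.0, np.cos(rel_angle))
        non_self = sender_emission * directional_gain / (1.0 + distance ** 2)
        noise = rng.normal(0.0, noise_std)
        # Readings are bounded by the sensor's receptive field [0, 1].
        return float(np.clip(self_emission + non_self + noise, 0.0, 1.0))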

For robot group n.9, we proceeded by separately recording the "self" and the "non-self" components of the sound perceived at each microphone, and the heading of each robot at each time-step, during a successful trial in each environment. As shown here, each robot of group n.9 combines phototaxis with a rotational movement. The latter, due to the simulated physics, may introduce rhythms in the perception of sound through its effects on the "non-self" component.

With a Fast Fourier Transform (FFT) analysis, we transform the sequences of heading and of the "self" and "non-self" components of the sound signal perceived by each robot at each microphone from the time domain into the frequency domain. By looking at the power spectral density (PSD), we observe that: (a) the "self" component of each robot does not display any harmonic (fn_i) at any frequency other than 0 Hz; (b) for all the robots, there are three principal harmonic components in the spectrum of the sequence of heading (see Table 2, columns 3, 4, 5); (c) the "non-self" component of each robot has only one principal harmonic (see Table 2, columns 6 and 7); (d) fn_1 of robot R_al differs from fn_1 of both robots R_ir.
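An analysis of this kind can be reproduced, for instance, with numpy and scipy; the sketch below (the helper principal_harmonics and the synthetic heading sequence are our own illustration, not the paper's code) extracts the frequencies of the principal harmonics of a recorded sequence from its PSD:

    import numpy as np
    from scipy.signal import periodogram

    def principal_harmonics(x, dt, n_peaks=3):
        """Return the n_peaks frequencies (Hz) with the largest power
        spectral density, excluding the 0 Hz (DC) bin."""
        freqs, psd = periodogram(x, fs=1.0 / dt)
        order = np.argsort(psd[1:])[::-1] + 1  # descending power, DC bin skipped
        return freqs[order[:n_peaks]]

    # Illustrative check: a heading-like sequence rotating at 0.5 Hz,
    # sampled every 0.1 s, yields a principal harmonic near 0.5 Hz.
    dt = 0.1
    t = np.arange(0.0, 60.0, dt)
    heading = np.sin(2.0 * np.pi * 0.5 * t) \
        + 0.05 * np.random.default_rng(0).normal(size=t.size)
    print(principal_harmonics(heading, dt))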

From point (a) we conclude that, for each robot, there are no oscillatory phenomena in sound production. Oscillations are instead observed in the perceived sound. From points (a), (b) and (c) we conclude that the oscillations of the perceived sound are produced by the rotational movement of each robot, through the effect that this movement has on the characteristics of the "non-self" components. Further evidence of the causal relationship between the rotational movement and the oscillation of the "non-self" components is given by the fact that the principal harmonic of the "non-self" components has a frequency very similar to that of the first harmonic of each robot's sequence of heading (see Table 2, columns 3 and 6). Moreover, the similarity of the first harmonic of the "non-self" component between robots R_ir, and the difference between robots R_ir and robot R_al, confirm that there is a dynamic speciation of the characteristics of the homogeneous controllers with respect to the physical properties of the robots. In particular, robot R_al rotates slightly faster than the other two robots.

So far, we have identified periodic phenomena and their relative frequencies in the sound signals and in the rotational movement of the robots. The next step of our analysis focuses on the characteristics of the "non-self" components. We use the frequencies of the principal harmonic fn_1 obtained from the PSD analysis to filter the sound signals. In particular, we apply a narrow bandpass filter at frequencies 0 Hz and fn_1. In this way, we transform the "non-self" components into the following sinusoidal signals:

    ns_i(t) = alpha + p * sin(2 * pi * fn_1 * t),    i in [1, 3]

where alpha is the DC offset of the signal, p is the peak amplitude, and fn_1 is the frequency of oscillation. Recall that the DC offset is the mean amplitude of a waveform; if the mean amplitude is zero, there is no DC offset.
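One possible realisation of this filtering step masks the FFT spectrum directly, keeping only the DC bin and a narrow band around fn_1; in the sketch below, the bandwidth bw and the helper name fit_sinusoid are our own choices, not taken from the paper:

    import numpy as np

    def fit_sinusoid(ns, dt, f1, bw=0.05):
        """Keep only the 0 Hz bin and a narrow band around f1 in the
        spectrum of the "non-self" sequence ns, then read off the DC
        offset (alpha) and the peak amplitude (p) of the result."""
        spectrum = np.fft.rfft(ns)
        freqs = np.fft.rfftfreq(ns.size, d=dt)
        keep = (freqs == 0.0) | (np.abs(freqs - f1) <= bw)
        filtered = np.fft.irfft(spectrum * keep, n=ns.size)
        alpha = filtered.mean()                      # DC offset
        p = (filtered.max() - filtered.min()) / 2.0  # peak amplitude
        return alpha, p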

From our analysis, it emerges that the average amplitude and standard deviation of the "self" components, and the DC offset and peak of ns_i with i in [1, 3], do not substantially differ (i) among the robots; (ii) between the two sensors (S1 and S2); and (iii) between the two environments (i.e., Env L and Env R). In particular, the mean value of the "self" components contributes more than 90% of the perceived sound (see Table 3, columns 2 and 8). The "self" components are described by referring to their average and standard deviation, since they do not present any periodic oscillations. Given the high intensity of the "self" component, the "non-self" component can only induce changes in the perception of sound that are less than 10% of the sensors' receptive field. By looking at the DC offset of the sinusoidal signals ns_i (see Table 3, columns 4, 6, 10 and 12), we can infer that the "non-self" components are already very "weak", possibly due to the relatively "far" robot-robot distances. Despite this, if we sum, for each robot (i), for each sensor (s), and for each environment, the average intensity of the "self" component, the DC offset, and the peak of ns_i, we obtain values that indicate what the reading of the sound sensors could be when the "non-self" components are at their highest intensity. Since these values are higher than 1, the readings of the sound sensors saturate. From this we infer that, if not attenuated by the shadowing effect, the "non-self" plus the "self" component may be sufficient to saturate the sensors' receptive field of the receiver.
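The saturation argument reduces to a simple sum; a sketch with illustrative numbers (not taken from Table 3):

    def saturates(self_mean, dc_offset, peak, ceiling=1.0):
        """True if the reading would exceed the sensor's receptive field
        when the "non-self" component is at its maximum intensity."""
        return self_mean + dc_offset + peak > ceiling

    # With the "self" mean above 0.9, even a weak "non-self" component
    # can push the combined reading past the ceiling of 1.0:
    print(saturates(self_mean=0.92, dc_offset=0.05, peak=0.06))  # True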

Group n.9: the significance of the Interaural Intensity Differences (IIDs)


[Figure: eight bar graphs, panels (a) to (h), described in the caption below.]

The graphs show the percentage of failures during 1200 trials with disruptions applied to: (a) robot R_al's sound sensor S1 in Env L; (b) robot R_al's sound sensor S2 in Env L; (c) robot R_al's sound sensor S1 in Env R; (d) robot R_al's sound sensor S2 in Env R; (e) robots R_ir's sound sensor S1 in Env L; (f) robots R_ir's sound sensor S2 in Env L; (g) robots R_ir's sound sensor S1 in Env R; (h) robots R_ir's sound sensor S2 in Env R. The disruptions consist of decreasing the IIDs by the percentage indicated on the horizontal axis. The black area of the bars refers to the percentage of trials terminated without collisions but with the group not having reached the target. The light grey area of the bars refers to the percentage of trials terminated due to robot-robot collisions. The dark grey area of the bars refers to the percentage of trials terminated due to robot-wall collisions.