MISS Meeting 2005-04-01

From IridiaWiki

What is this document?

This is the report of a MISS meeting held on Friday, April 1st, 2005. People present: Alexandre Campo, Anders Christensen, Roderich Gross, Halva Labella, Shervin Nouyan, Rehan O'Grady, David Tran-Dinh, Vito Trianni, Elio Tuci.

After David Tran-Dinh's hard work on a new version of MISS, IRIDIA's simulator, we held a big meeting with all the people interested in working with the swarmbots and in designing experiments in simulation before carrying them out in reality. The goal of this meeting was to describe the simulator to everybody, and to let people express their needs about it. From now on we can focus on tuning the work: we want users to be able to use the software easily and quickly to simulate experiments.

In this document we summarize the main features of this new version of the simulator, then the questions that were asked, and finally the resulting TODO list.

Features of the simulator

  • multi-platform: runs on Linux (probably other Unix systems too) and Windows.
  • 3D view: you can have a graphical display of your simulation.
  • .me and .xml files: these contain the world description and all the information needed to design an experiment. Everything was designed to spare you most of the programming. The .me files of Vortex are not compatible with the new ones, but the two formats agree well enough that converting them should not be too hard.
  • console: an interactive command line is provided in the graphical display. All commands can be issued to the simulator by this means.
  • key bindings: any key can be assigned to any command, thanks to a configuration file.
  • ray sensors: a general way to build sensors. They can be used as ground sensors or proximity sensors, for example.
  • gripper: the gripper is designed, but it still needs some improvement.
  • trajectory drawing: on the graphical view, it is possible to see the trajectory of the objects as a trace.
  • different proximity sensors: proximity sensors can be implemented in three different ways. The user is free to choose the best trade-off between speed and quality.
  • cloning: you cannot load an object once and replicate it 10 times; instead, you simply load it 10 times.
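The ray-sensor mechanism is not detailed further in these notes, so the following is only a minimal sketch of the underlying idea (cast a ray, measure the distance to the first surface it hits), written in Python. The function name and the plane representation are illustrative, not the simulator's actual API:

```python
def ray_distance_to_plane(origin, direction, plane_point, plane_normal):
    """Distance along a ray to an infinite plane, or None if the ray
    misses it. This ray cast is the core of a generic ray sensor."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(f * n for f, n in zip(diff, plane_normal)) / denom
    return t if t >= 0.0 else None

# Ground sensor: a ray pointing straight down from under the robot.
ground = ray_distance_to_plane((0.0, 0.0, 0.05), (0.0, 0.0, -1.0),
                               (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))

# Proximity sensor: a horizontal ray toward a wall at x = 0.5.
proximity = ray_distance_to_plane((0.0, 0.0, 0.05), (1.0, 0.0, 0.0),
                                  (0.5, 0.0, 0.0), (-1.0, 0.0, 0.0))
```

The same primitive serves both cases: pointing the ray downwards gives a ground sensor, pointing it horizontally gives a proximity sensor.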

About description files

Description files are .me and .xml files where you can specify almost everything about your experiments. It is also possible to add custom code to the simulator when the standard mechanisms provided are not enough. Several characteristics of the description files are reported here:

  • same body description as with Vortex
  • the maximal force can be chosen
  • due to numerical integration errors, there is always a possibility of numerical explosion, which would completely mess up experiments. To avoid this problem, one should take care with the metric employed: it is best to scale values as close to 1 as possible. It is also thought that, for the sake of clarity, people designing experiments should use experimental values (in agreement with reality) expressed in the International System of Units. We might decide to introduce a scaling factor, to allow both an accurate description of the experiment and a sensible way to deal with numerical explosions.
  • physical connections between objects are described with joints in ODE. Several actions can be associated with a joint in the description files. Two types of action are available: first, the joint action, which is one of a set of generic behaviours provided by the simulator; second, the user action, which is a piece of code written by the user. For example, the movement of the wheels of a vehicle can be set using a joint action; by contrast, the shooting action of a tank is implemented through a call to a user action.
  • the material table has changed; it tries to stay very close to the ODE documentation.
  • samples for proximity sensors are already available
  • a sound sensor is implemented and returns a signal that is a function of the distance to the emitter. This feature has to be improved: we would like to be able to emit at different frequencies and intensities. At the same time, we want a robot to perceive a range of frequencies, with the intensities of the sounds modulated by a function of the distance.
  • nothing is implemented for the camera so far.
  • proximity sensors can be implemented using samples (taken experimentally) or rays shot by the software.
  • ray sensors could be used as proximity sensors.
  • a joint feedback is available. It lets you know what force is applied between two jointed objects (actually, it is the length of the vector between the two jointed objects).
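The notes only say that the perceived sound signal is "a function of the distance" to the emitter. As an illustration of the desired improvement (several frequencies, with intensities modulated by distance), here is a Python sketch; the inverse-square attenuation law and all names are assumptions, not the simulator's actual model:

```python
import math

def perceived_intensity(emitted_intensity, distance, floor=1e-6):
    # Hypothetical inverse-square attenuation; the real simulator may
    # use a different function of the distance.
    return emitted_intensity / max(distance, floor) ** 2

def perceive(emissions, receiver_pos):
    """Sum what a robot hears, per frequency.
    emissions: list of (emitter_position, frequency, intensity)."""
    heard = {}
    for pos, freq, intensity in emissions:
        d = math.dist(pos, receiver_pos)
        heard[freq] = heard.get(freq, 0.0) + perceived_intensity(intensity, d)
    return heard

# Two emitters at 440 Hz, heard from the origin.
heard = perceive([((1.0, 0.0), 440.0, 1.0), ((3.0, 4.0), 440.0, 25.0)],
                 (0.0, 0.0))
```

With this kind of model, a robot perceives one accumulated intensity per frequency, which matches the "range of frequencies" requirement stated above.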

Questions and answers

  • Anders: What is the performance for a formation? How robust is it to errors? Is the simulator able to handle the exponentially growing complexity of that specific configuration? Answer: we don't have any data to answer those questions properly. A working example should be made to demonstrate the performance of the simulator in that situation.
  • Rodi: Is it possible to run the simulator without graphics? Answer: yes, you have to remove one line of code. This could easily be turned into a command-line option, for example.
  • Vito: How is gripping implemented? Answer: there is a "grippable" flag on objects. A collision mechanism is used to determine which object can be grasped when grasping is requested.
  • Rodi: Have any tests with a detailed model of the sbots been carried out? Answer: so far, no. No detailed model of the sbots is available yet. We could try to import the Vortex files of the previous simulator.
  • Alex: How is noise handled in the simulation? Answer: noise is handled using a home-made method. The range of the noise can be specified in the XML part. We should use a different random number generator to achieve statistical accuracy: let's use GSL, and let's specify the noise of the sensors in the XML.
  • Rodi: Is there any documentation available? Answer: not really. A manual needs to be written: the structure of the XML files, the basic mechanisms of the code, and how to build a custom simulation.
  • Halva: Is there support for emulating libsbotapi? Answer: no. We should do it, but it is a tough job.
  • Halva: About the camera sensor, what is available? Answer: nothing is available. It is possible to implement a samples-based mechanism.
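Regarding the noise question above: the suggestion is to specify each sensor's noise range in the XML and to draw the noise from a proper generator (GSL, in the C++ simulator). The following Python sketch only illustrates the idea with the standard library; the sigma value and the seeding policy are assumptions:

```python
import random

def noisy_reading(true_value, sigma, rng):
    # Zero-mean Gaussian noise added to an ideal sensor reading.
    # sigma would come from the sensor's noise entry in the XML
    # description file (hypothetical element, not a confirmed format).
    return true_value + rng.gauss(0.0, sigma)

rng = random.Random(42)  # fixed seed, for reproducible experiments
reading = noisy_reading(0.10, sigma=0.005, rng=rng)
```

Using one explicit generator object (rather than a global one) makes runs reproducible, which matters when comparing simulated experiments against real ones.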

List of sensors and actuators required in simulation

Much work has been done so far, but we need to take into account the needs of all the potential users of the swarmbots. This section therefore describes all the required sensors and actuators, and specifies the current state of the implementation for each item.


Sensors:
  • light sensors
  • sound sensors (todo): we should be able to perceive frequencies and to emit frequencies
  • ground sensors, through ray sensors (todo)
  • traction sensor (todo)


Actuators:
  • anything built with joints can be specified in XML

TODO list

At the end of this meeting we gathered many questions and criticisms, which led us to write a TODO list. Items are not yet assigned to developers. A wiki page will be set up so that we can distribute tasks among the swarmbots people. A schedule should also be decided, in order to get a usable simulator as fast as possible.

  • set coloured LEDs on the sbots. They must be dynamically modifiable, either the whole ring at once or LED by LED.
  • emulation layer for the sbotapi
  • camera sensor implemented with experimental samples (useful only for 2D experiments)
  • camera sensor implemented with environmental mapping (more general emulation, but not realistic)
  • try to run a simple evolution
  • write a standard sbot.me and provide a model of the sbots good enough to start working
  • design a simple example of a moving swarm
  • proximity sensors and handling of noise (complicated stuff between David and Vito...) -> use GSL for generating noise
  • benchmark of the new simulator -> video, physics, controller; accuracy tested against real experiments, especially for the sensors
  • write documentation in LaTeX (getting started, overview of the code, working examples)
  • replace FMOD, using SDL instead for example. At the very least, provide a compile-time flag to decide whether or not it is used.
  • ground sensors using ray sensors (find an easy way to specify the ray sensor position)
  • compute whether shadow is needed, and then compute the results of the shadow algorithm (the reporter didn't understand that point... sorry if it is a mistake)
  • implementation of sound : custom action for emitting (intensity and frequency) and something for receiving
  • wiki page and CVS; rename the project to ? (Vito, choose a name!)
  • put the ODE macros somewhere else, so that ODE is no longer part of the simulator. Right now we still need a modified version of ODE to have support for cylinders.
  • write a configure script (a bash script is enough; don't try to deal with the complications of automake/autoconf)
  • write a set of examples showing how to use each sensor and each actuator in the simulation
  • assign tasks and schedule to everybody
  • use CVS

Examples to write

We describe here some basic working examples that should be written. They will help demonstrate the capabilities of the simulator, and also provide an easy way to understand how to get started with it.

  • Braitenberg vehicle: demonstrates very simple obstacle avoidance
  • same as the previous one, but additionally moves toward a light
  • go toward a prey and grasp it.
  • make a swarm of 3x3 robots, all moving in the same direction except one.
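To make the first example concrete: a Braitenberg-style avoider simply cross-couples the proximity sensors to the wheels, so that the sensor facing the closer obstacle inhibits the opposite wheel and the robot turns away. Here is a minimal Python sketch of one control step; the sensor range, speeds and gains are made-up values, not the simulator's API:

```python
def braitenberg_step(left_prox, right_prox, base_speed=0.1, gain=0.5):
    """One control step of a Braitenberg-style obstacle avoider.
    left_prox / right_prox are proximity readings in [0, 1], where 1
    means an obstacle is very close. Each sensor inhibits the
    opposite wheel, so the robot steers away from the obstacle."""
    left_speed = base_speed - gain * right_prox
    right_speed = base_speed - gain * left_prox
    return left_speed, right_speed

# Obstacle on the left: the right wheel slows down, the robot turns right.
left, right = braitenberg_step(left_prox=1.0, right_prox=0.0)
```

Calling such a function once per simulation step, with the simulated proximity readings as input, is all the controller the first example needs.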