Software Project - Rolling Swarm

Lecturers:

  • Sanaz Mostaghim
  • Michael Preuss
  • Christoph Steup

Language: 

The course will be held in English or German as requested by the participating students.

Participants:

All students of the faculty's bachelor curricula are eligible to attend the course. However, it is beneficial if students are familiar with at least one of the following topics:

  • Robotics
  • Programming in C++ or Python
  • Technical Computer Science
  • Communications and Networks
  • Robot Operating System (ROS)

First Meeting: 

The first meeting will be used to organize the project. We will describe the available tasks and the system used, and form groups of the interested students. This kick-off meeting will be held on 12.04.17 at 13:00 in the SwarmLab (G29-035). If you want to participate, attendance at this meeting is mandatory.

Organization:

The course will be taken in groups of 3-4 students per topic. The students and the groups will be chosen by us depending on your background. The individual topics are not fully fixed; extensions and modifications are possible depending on the skills and interests of the participating students. This will be discussed in the first meeting. The result of each project is a working demonstration with commented source code and a written documentation describing the general concept and a how-to for starting the demo.

Hardware:

The robots are connected to one or more PCs using Bluetooth. The PCs run the Robot Operating System (ROS), which dynamically routes the sensory information and actuation commands between your programs and the lab equipment. The position of the robots is tracked by a camera system, which provides 2D positions with high accuracy and a high update rate.
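
To give a feeling for how a project node interacts with this setup, the following minimal Python/rospy sketch subscribes to a tracked 2D pose and publishes a velocity command. The topic names /sphero/pose2d and /sphero/cmd_vel are placeholders, not the lab's actual configuration.

#!/usr/bin/env python
# Minimal sketch of a project node talking to the lab setup via ROS.
# Topic names are placeholders; the real names depend on the SwarmLab configuration.
import rospy
from geometry_msgs.msg import Pose2D, Twist

def on_pose(pose):
    # React to the tracked 2D position delivered by the camera system.
    cmd = Twist()
    cmd.linear.x = 0.1 if pose.x < 1.0 else 0.0  # toy rule: roll until x reaches 1 m
    cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('rolling_swarm_example')
    cmd_pub = rospy.Publisher('/sphero/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/sphero/pose2d', Pose2D, on_pose)
    rospy.spin()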

Available Topics:


Update of Color-based Tracking of Robots in the Arena (Vision++)

We received a new version of the Spheros called SPRK+, which comes with a clear cover that enables more detailed detection of the LEDs with our tracking camera. This new information opens up new possibilities, such as detecting the orientation of the robots without moving them. Additionally, we want to enhance the tracking to cope with loss of tracking information in order to enable robust behaviour. The task will be to modify and enhance the current camera-based robot detection system to make it more reliable and to output orientation information even if robots are not moving. Additionally, the new robots need to be evaluated regarding their detectability and regarding the number of colors which are distinguishable to identify individual robots. These evaluations shall be done with both moving and non-moving robots.
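
One way the orientation could be derived from two differently colored LEDs is sketched below in Python with OpenCV; the HSV color ranges and the camera interface are assumptions, not the values used in the lab.

# Sketch of the color-based detection idea: threshold two LED colors in HSV,
# take the blob centroids and derive an orientation from the line between them.
# Color ranges are illustrative assumptions.
import cv2
import numpy as np

def led_centroid(hsv, lower, upper):
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    m = cv2.moments(mask)
    if m['m00'] == 0:
        return None  # LED not visible -> caller must handle tracking loss
    return (m['m10'] / m['m00'], m['m01'] / m['m00'])

def robot_pose(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    front = led_centroid(hsv, (50, 100, 100), (70, 255, 255))    # e.g. green LED
    back = led_centroid(hsv, (110, 100, 100), (130, 255, 255))   # e.g. blue LED
    if front is None or back is None:
        return None  # tracking loss for this robot
    cx, cy = (front[0] + back[0]) / 2.0, (front[1] + back[1]) / 2.0
    theta = np.arctan2(front[1] - back[1], front[0] - back[0])
    return cx, cy, theta  # position and orientation even while the robot stands still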

Adoption of the ROS Navigation Stack for the Robots (Nav-Stack)

ROS already provides an extensive software suite to navigate robots in known environments. This Nav-Stack is well adopted in lots of existing robots. However, the Spheros are different, because they do not contain any on-board obstacle detection sensors. The goal of this task is to use the tracking information of the camera together with a statically created map to enable the usage of the ROS Nav-Stack with the Spheros for trajectory planning and execution. This enables a Sphero to efficiently move from a point A to a point B in the arena without hitting static obstacles like walls. Additionally, we want to evaluate how well the Nav-Stack performs at avoiding collisions with moving obstacles, in our case other Spheros.
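
As a rough sketch of how point-to-point motion through the Nav-Stack could look, the following Python snippet sends a goal to move_base via actionlib, the usual entry point of the Nav-Stack; the frame and action names follow common defaults and may differ in the lab setup.

#!/usr/bin/env python
# Sketch: drive a Sphero from A to B through the ROS Nav-Stack by sending a
# goal to move_base via actionlib. Names follow the usual Nav-Stack defaults.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def drive_to(x, y):
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'        # statically created map of the arena
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0       # no preferred final heading
    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('nav_stack_example')
    drive_to(1.0, 0.5)  # point B in map coordinates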

Interactive Racing Scenario (Race)

For demonstration purposes we want to have an interactive scenario where users can compete against autonomous robots. For this project we want to have a racing scenario where Spheros autonomously follow a static race track and try to optimize their lap times. Users shall be able to compete against the robots by controlling a Sphero using a gamepad. The goal of this task is to implement the necessary functionalities, such as track following of the Spheros, acquisition of lap times for different Spheros on the same track, and the gamepad control of a single Sphero.
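
The gamepad control part could, for example, map joystick input to velocity commands as in the following Python/rospy sketch; topic names, axis indices and the speed scale are assumptions that have to be matched to the real gamepad and Sphero driver.

#!/usr/bin/env python
# Sketch of the gamepad control: map joystick axes (sensor_msgs/Joy) to
# velocity commands for one Sphero. Axis indices and limits are placeholders.
import rospy
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

MAX_SPEED = 0.5  # m/s, placeholder speed limit

def on_joy(joy):
    cmd = Twist()
    cmd.linear.x = MAX_SPEED * joy.axes[1]   # left stick up/down -> forward speed
    cmd.linear.y = MAX_SPEED * joy.axes[0]   # left stick left/right -> sideways speed
    cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('race_gamepad_control')
    cmd_pub = rospy.Publisher('/sphero/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/joy', Joy, on_joy)
    rospy.spin()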


Last Modification: 28.03.2017 - Contact Person: Webmaster