Research group: Machine vision and mobile robotics

When autonomous or partially autonomous mobile systems, such as robots or automobiles, interact with their environment, a number of problems arise that these systems have to solve. How do I perceive the state of my environment? What state am I currently in? How do I predict my own state and that of my environment? How do I navigate through my environment? How do I solve specific tasks in this environment? And so on.

To solve these problems, autonomous mobile systems must be equipped with sensors, enabling them to receive data from the environment and to extract information about it. The special aspect of active sensor systems, such as cameras or microphones mounted on a mobile robot, is that they can collect sensor data at different times, in different locations, and under different observation conditions.

Two exciting questions come to mind:

• How do I model the connection between raw sensor data and the state of the environment?

• How can I effectively use the actuators of the sensors to improve state estimation?

It is clear that the coupling of state estimation and state control plays an important role in answering both of these questions. We are therefore interested in modeling the interaction between state estimation and state control.
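This coupling can be illustrated with a minimal one-dimensional sketch (a hypothetical example, not the group's actual models): a robot estimates its position with a Kalman filter while a simple proportional controller steers it toward a target. The controller acts on the *estimate*, and the chosen control input feeds back into the filter's prediction step.

```python
import random

def kalman_control_loop(target=10.0, steps=50, q=0.01, r=0.25):
    """Couple a 1-D Kalman filter with a proportional controller."""
    x_true = 0.0          # true position (unknown to the robot)
    x_est, p = 0.0, 1.0   # state estimate and its variance
    for _ in range(steps):
        u = 0.5 * (target - x_est)                 # control based on the estimate
        x_true += u + random.gauss(0, q ** 0.5)    # true dynamics with process noise
        # predict: apply the same control input to the estimate
        x_est += u
        p += q
        # update with a noisy position measurement
        z = x_true + random.gauss(0, r ** 0.5)
        k = p / (p + r)                            # Kalman gain
        x_est += k * (z - x_est)
        p *= (1 - k)
    return x_est, x_true

random.seed(0)
est, true = kalman_control_loop()
```

After a few dozen steps both the true position and its estimate settle near the target, even though the controller never observes the true state directly.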

Research area: Dynamic Computer Vision

This area of research comprises several internal and external projects. The common goal is the robust extraction of visual states from video sequences recorded by static and active camera systems, in order to describe visual scenes whose content varies over time. The core can be reduced to two questions:

  • How can one influence the visual data using active cameras or virtual control variables so that visual states can be estimated more effectively?
  • How does one connect the individual visual state variables over time and across the image space so that the visual state as a whole can be estimated with as few errors as possible?

For this, different methods from the fields of dynamic state estimation, controller design, photogrammetry, and machine learning are used and further developed.


  • Virtual control variables in dynamic visual state estimation.
  • Stochastic dynamic systems in image processing.
  • Dynamically adaptive systems for estimating the optical flow.
  • Analysis of movement patterns (in cooperation with the Honda Research Institute Europe GmbH).
  • Dynamic movement segmentation (in cooperation with the Honda Research Institute Europe GmbH).
  • Mobile ToF cameras (supported by PMD Technologies GmbH).
  • Visual indoor positioning with mobile phone cameras (in cooperation with the Geodetic Institute at TUD).
  • IMU-supported visual odometry (in cooperation with Thorsten Graber from the Institute of Flight Systems and Automatic Control at TUD, as part of the Mixed Mode Environments graduate program).
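As a flavor of the optical-flow work listed above, here is a hypothetical sketch of the classic Lucas-Kanade step (the actual projects use far more elaborate, dynamically adaptive variants): within a small image window, the flow vector (u, v) solves a least-squares system built from the spatial and temporal image gradients.

```python
import numpy as np

def lucas_kanade_window(frame0, frame1):
    """Estimate a single (u, v) flow vector for one image window."""
    ix = np.gradient(frame0, axis=1)   # spatial gradient in x
    iy = np.gradient(frame0, axis=0)   # spatial gradient in y
    it = frame1 - frame0               # temporal gradient
    # brightness constancy: ix*u + iy*v + it = 0 at every pixel
    a = np.stack([ix.ravel(), iy.ravel()], axis=1)
    b = -it.ravel()
    flow, *_ = np.linalg.lstsq(a, b, rcond=None)
    return flow  # (u, v)

# synthetic check: an intensity ramp shifted one pixel to the right
x = np.arange(16, dtype=float)
frame0 = np.tile(x, (16, 1))
frame1 = np.tile(x - 1.0, (16, 1))   # same ramp, moved right by 1 px
u, v = lucas_kanade_window(frame0, frame1)   # u ≈ 1, v ≈ 0
```

On this synthetic pair the recovered flow is one pixel to the right, matching the imposed motion.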

Research area: Cooperative Mobile Robotics

This research area deals with control processes and state estimators that allow distributed mobile systems to solve tasks collaboratively. The research area is funded by the DFG as part of the Mixed Mode Environments graduate program.


• DisCoverage: A new paradigm for multi-robot exploration.

• Asynchronous distributed state estimation.

• Synchronization of dynamic systems with an observer approach.
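A basic building block behind distributed state estimation is average consensus: each agent repeatedly averages its value with those of its neighbors, and all agents converge to the network-wide mean. The toy example below is hypothetical and synchronous; the projects above address asynchrony and richer dynamics.

```python
def consensus_step(values, neighbors, eps=0.2):
    """One synchronous consensus update on a fixed undirected graph."""
    return [
        v + eps * sum(values[j] - v for j in neighbors[i])
        for i, v in enumerate(values)
    ]

# a ring of four agents holding different initial measurements
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
values = [1.0, 5.0, 3.0, 7.0]
for _ in range(100):
    values = consensus_step(values, neighbors)
# all agents approach the mean of the initial measurements (4.0)
```

The step size eps must stay below the reciprocal of the maximum node degree for the iteration to converge; here eps = 0.2 < 1/2.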

Research area: Sensor processing in driver assistance systems

This area of research focuses on calibration routines, sensor fusion methods, filter algorithms, and mapping processes to achieve stable environmental monitoring for driver assistance systems. The research area is funded by Continental AG as part of the Proreta 3 project.
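The simplest special case of the sensor fusion such a system needs is static inverse-variance weighting (a hypothetical illustration, not Proreta's actual algorithms): two sensors measure the same quantity with different noise levels, and the fused estimate weights each reading by its reliability.

```python
def fuse(z1, var1, z2, var2):
    """Fuse two measurements of the same quantity with known variances."""
    w1 = var2 / (var1 + var2)   # weight grows as the *other* variance grows
    w2 = var1 / (var1 + var2)
    z = w1 * z1 + w2 * z2
    var = (var1 * var2) / (var1 + var2)  # fused variance is always smaller
    return z, var

# e.g. radar (accurate) and camera (noisier) range to the same obstacle
z, var = fuse(20.2, 0.04, 21.0, 0.36)   # fused estimate leans toward the radar
```

Note that the fused variance is below both input variances, which is why combining even a poor sensor with a good one still improves the estimate.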
