Benjamin Risse and Radim Tylecek
Thu 09 Mar 2017, 12:45 - 13:45
IF 4.31/4.33

If you have a question about this talk, please contact: Magda Nowak (mnowak)

Snacks will be served, this week sponsored by a kind donation from IPAB visiting scholar Zhengang Nie.

To understand the means by which animals navigate their complex environments, an accurate account of their natural foraging behaviour is required, providing both inspiration and validation for hypotheses. Classic studies relied upon manual methods, which are imprecise and labour-intensive. In a recent review, Dell et al. (2014) called for the development of automated vision-based tracking systems that are easy to use and reliable enough to bridge this gap. The difficulty of tracking in complex habitats stems mainly from clutter, occlusions, low target contrast and resolution, dynamic illumination and shadows, the unpredictable area of interest, and other uncontrollable factors.
In my talk I will present the first unconstrained monocular tracking approach, called 'HabiTracks', which allows detection of even tiny animals (~5 pixels) and extraction of their motion over time. Off-the-shelf hand-held cameras, smartphones or drones can be used to freely follow and record the animals in their natural habitats. Camera motion compensation is then applied to extract positional probabilities based on the animal's motion alone. These positions are refined using a global-optimisation scheme, in conjunction with the camera motion information, to determine the trajectory of the animal across the video. Since no additional cues are involved, this approach is robust against occlusions, background clutter and shadows, and is applicable to a variety of animal species. I will demonstrate the accuracy and flexibility of 'HabiTracks' on a variety of examples, including tiny insects in cluttered deserts (using day and night vision), vertebrates recorded with a manually operated drone, and artificial objects in urban environments imaged with a mobile phone.
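The core idea of motion-only detection can be illustrated with a toy sketch: undo the estimated camera motion between two frames, difference them, and look for the residual peak left by the moving animal. This is a simplification under stated assumptions (a pure-translation camera model and a synthetic scene; the `compensate_and_diff` helper and all numbers are hypothetical, not the talk's actual implementation, which estimates full camera motion and refines detections globally):

```python
import numpy as np

def compensate_and_diff(prev, curr, cam_shift):
    """Undo a (hypothetical) global camera translation, then difference
    the frames. Static background cancels; only independently moving
    pixels survive in the residual."""
    dy, dx = cam_shift
    aligned = np.roll(curr, shift=(-dy, -dx), axis=(0, 1))
    return np.abs(aligned.astype(float) - prev.astype(float))

# Synthetic 40x40 scene with a textured background (horizontal gradient),
# a 2x2 "animal", and a camera pan of (1, 2) pixels between frames.
scene = np.tile(np.arange(40, dtype=np.uint8), (40, 1))
prev = scene.copy()
prev[10:12, 10:12] = 255                           # animal at (10, 10)
curr = np.roll(scene, shift=(1, 2), axis=(0, 1))   # camera pan
curr[13:15, 15:17] = 255                           # animal has moved

residual = compensate_and_diff(prev, curr, cam_shift=(1, 2))
peak = np.unravel_index(np.argmax(residual), residual.shape)
# The residual is zero on the (compensated) background and peaks only
# where the animal was and where it went.
```

In this toy case the residual peak lands on the animal's previous position at (10, 10); a real system would turn such residuals into positional probabilities before trajectory optimisation.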

Dell, A. I., Bender, J. A., Branson, K., Couzin, I. D., de Polavieja, G. G., Noldus, L. P. J. J., et al. (2014). Automated image-based tracking and its application in ecology. Trends in Ecology & Evolution, 29(7), 417–428.

Please note: since I'm currently working on the 'HabiTracks' manuscript, any feedback is more than welcome.


Radim Tylecek


Semantic Modelling of Natural Scenes for Outdoor Robot Navigation



Navigation systems for mobile robots usually require a metric map for trajectory planning and for interaction with objects of interest.

Obtaining an accurate metric map can be hard, especially for dynamic natural scenes that undergo seasonal and other changes.

In the course of our garden-trimming robot project (TrimBot), I will introduce the challenges we are facing and how a metric map can help tackle them. I will outline our approach to obtaining a semantic metric map, based on non-rigid registration of a sketched map to 3D data.
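A common first step before non-rigid refinement is a coarse rigid or affine alignment of the sketched map to the measured data. The sketch below illustrates only that warm-start step with a least-squares 2D affine fit (the landmark names and coordinates are hypothetical, and the talk's method goes further, to full non-rigid registration):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine map taking sketch points src onto
    measured map points dst. Returns a (3, 2) parameter matrix X so
    that [x, y, 1] @ X approximates the corresponding dst point."""
    n = src.shape[0]
    H = np.hstack([src, np.ones((n, 1))])        # homogeneous coords
    X, *_ = np.linalg.lstsq(H, dst, rcond=None)  # solve H @ X ~= dst
    return X

# Hypothetical sketch of three garden landmarks (hedge, rose bed, tree).
sketch = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
# The same landmarks as located in the robot's map (ground-plane coords):
# here the sketch is scaled by 2 and shifted by (2, 1).
mapped = np.array([[2.0, 1.0], [4.0, 1.0], [2.0, 3.0]])

X = fit_affine(sketch, mapped)
aligned = np.hstack([sketch, np.ones((3, 1))]) @ X
```

With exact correspondences the fit is exact; in practice a non-rigid model is then needed to absorb the distortions a hand-drawn sketch introduces.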