Benjamin Risse
Thu 26 May 2016, 12:45 - 13:45
4.31/33

If you have a question about this talk, please contact: Steph Smith (ssmith32)

For over 100 years scientists have sought to reveal the underlying mechanisms that allow insects to visually navigate their complex worlds. Crucial to this endeavour is the validation of hypotheses against real animal data in identical experimental scenarios. To date, this has largely been limited to laboratory or sanitised field settings, risking that non-obvious but fundamental cues are overlooked. In this work we describe our efforts to resolve this issue by creating a holistic test-bed, complete with real ant paths and an accurate model of the habitat in which the data were gathered, allowing navigational models to be trained, tested and validated in the ant's world. First, we describe a novel semi-automated visual tracking approach used to record the position and pose of ants both as they forage naturally and as they undergo experimental manipulations in their natural environment. We then describe a novel software pipeline able to convert 56 laser scans into a photorealistic mesh of the 800 m² area surrounding the ant nest, complete with an undulating ground surface and over 1,700 individual plants. Finally, we show how the resultant habitat model can be used to recreate the animal's visual perception, allowing models to be tested and validated in hitherto unprecedented detail.
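The abstract mentions converting laser scans into a mesh with an undulating ground surface. As a purely illustrative sketch (not the speaker's actual pipeline), one early step in such a reconstruction might grid the scan points into a ground height map; the function name, cell size and coordinate conventions below are all assumptions:

```python
# Hypothetical sketch, not the authors' method: bin 3-D laser-scan points
# into a regular XY grid, keeping the lowest z per cell as a rough
# estimate of the ground surface under the vegetation.
import numpy as np

def ground_heightmap(points, cell=0.1):
    """points: array-like of shape (N, 3) with columns x, y, z (metres).
    Returns a 2-D array of per-cell minimum heights (NaN where no point fell)."""
    pts = np.asarray(points, dtype=float)
    x0, y0 = pts[:, 0].min(), pts[:, 1].min()
    x1, y1 = pts[:, 0].max(), pts[:, 1].max()
    nx = int(np.ceil((x1 - x0) / cell)) + 1
    ny = int(np.ceil((y1 - y0) / cell)) + 1
    ix = np.clip(((pts[:, 0] - x0) / cell).astype(int), 0, nx - 1)
    iy = np.clip(((pts[:, 1] - y0) / cell).astype(int), 0, ny - 1)
    hm = np.full((ny, nx), np.nan)
    for x, y, z in zip(ix, iy, pts[:, 2]):
        # Keep the minimum height seen in each cell (likely ground, not foliage).
        if np.isnan(hm[y, x]) or z < hm[y, x]:
            hm[y, x] = z
    return hm
```

A real pipeline would additionally register the 56 scans into a common frame, filter vegetation returns and texture the resulting mesh; this snippet only illustrates the gridding idea.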