John Pisokas, Christian Rauch
Thu 22 Nov 2018, 12:45 - 14:00
Informatics Forum (IF-4.31/4.33)

If you have a question about this talk, please contact: Jodie Cameron (jcamero9)

Title: Spinning and Reeling Flies - The Heading Encoding Circuit in the Insect Brain 

Speaker: John Pisokas

Abstract:

Insects are able to keep track of their heading as they move around their environment, an ability that is crucial for many behavioural tasks. A brain circuit providing the necessary computation has been identified in several insect species. Interestingly, this circuit exhibits a very similar neural connectivity pattern across species, yet with some important differences. This raises the question of how the details of the connectivity affect the circuit's function.

We show that the anatomy of the circuit found in Drosophila allows a much faster response to heading changes than the circuit found in other insects. This finding matches the behavioural capabilities of flies, which can perform rapid turns that other insect species cannot. We suggest that the circuit identified in Drosophila is an evolutionary specialisation, possibly derived from a simpler circuit present in other insects.
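To give a flavour of the kind of computation such a heading circuit performs, the following Python sketch implements a generic ring-attractor-style heading integrator: a bump of activity on a ring of cells is sustained by local excitation and uniform inhibition, and an angular-velocity input pushes the bump around the ring. This is a minimal, untuned toy under assumed parameters, not the circuit model presented in the talk; the cell count, weights, and time constants are all illustrative.

    import numpy as np

    # Minimal, untuned sketch of a generic ring-attractor heading circuit.
    # N cells with evenly spaced preferred directions around the ring.
    N = 16
    preferred = np.linspace(0.0, 2 * np.pi, N, endpoint=False)

    # Pairwise angular differences, wrapped to [-pi, pi].
    delta = np.angle(np.exp(1j * (preferred[:, None] - preferred[None, :])))

    # Local excitation plus uniform inhibition: the classic motif that
    # lets a single bump of activity persist on the ring.
    W = 1.2 * np.exp(-delta**2 / 0.5) - 0.4

    def decode_heading(a):
        """Population-vector estimate of the bump's angular position."""
        return np.angle(np.sum(a * np.exp(1j * preferred)))

    def step(a, angular_velocity, dt=0.01, tau=0.05):
        """One Euler step: leaky recurrent dynamics plus a velocity
        drive that pushes the activity bump around the ring."""
        grad = (np.roll(a, -1) - np.roll(a, 1)) / 2.0  # periodic derivative
        drive = -angular_velocity * grad
        rate = np.tanh(np.maximum(W @ a + drive, 0.0))  # saturating rate
        return a + dt * (-a + rate) / tau

    a = np.exp(-delta[0]**2 / 0.1)   # initial bump at heading 0 rad
    for _ in range(200):             # integrate a constant turn
        a = step(a, angular_velocity=2.0)
    print(f"decoded heading: {decode_heading(a):+.2f} rad")

The decoded heading drifts around the ring as velocity input accumulates, which is the essence of heading integration; how fast the bump can move in response to a turn is exactly the kind of property that depends on the connectivity details discussed in the talk.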

Title: Learning Driven Coarse-to-Fine Articulated Robot Tracking

Speaker: Christian Rauch

Abstract:

In this work we present an articulated tracking approach for robotic manipulators which relies only on visual cues from colour and depth images to estimate the robot's state while it interacts with, or is occluded by, its environment. We hypothesise that articulated model fitting approaches can only achieve accurate tracking if sub-pixel-accurate correspondences between the observed and estimated states can be established. Previous work in this area has relied exclusively on either discriminative depth information or colour edge correspondences as the tracking objective and has required initialisation from joint encoders.

In this work we propose a coarse-to-fine articulated state estimator, which relies only on visual cues from colour edges and learned depth keypoints, and which is initialised from a robot state distribution predicted from a depth image. We evaluate our approach on four RGB-D sequences showing a KUKA LWR arm with a Schunk SDH2 hand interacting with its environment, and demonstrate that this combined keypoint and edge tracking objective can estimate the palm position with an average error of 2.5 cm without using any joint encoder sensing.
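To illustrate the general shape of such a combined keypoint-plus-edge objective, the following Python sketch fits a toy two-link planar arm by nonlinear least squares, starting from a coarse state guess and refining it against keypoint residuals and a stand-in edge term. Everything here (the link lengths, the weights, the synthetic edge cost, and the function names) is an illustrative assumption, not the system described in the abstract.

    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)

    # Toy two-link planar arm; link lengths stand in for the real
    # kinematic model of the manipulator (illustrative values).
    L1, L2 = 0.4, 0.3

    def keypoints(q):
        """Model keypoints (elbow, palm) as a function of joint angles q."""
        elbow = np.array([L1 * np.cos(q[0]), L1 * np.sin(q[0])])
        palm = elbow + L2 * np.array([np.cos(q[0] + q[1]),
                                      np.sin(q[0] + q[1])])
        return np.vstack([elbow, palm])

    def residuals(q, observed, edge_cost, w_kp=1.0, w_edge=0.1):
        """Combined objective: keypoint error plus an edge term, where
        `edge_cost` stands in for a distance transform of detected edges."""
        pts = keypoints(q)
        kp_res = (pts - observed).ravel()
        edge_res = np.array([edge_cost(p) for p in pts])
        return np.concatenate([w_kp * kp_res, w_edge * edge_res])

    # Synthetic "detections": keypoints of a hidden true state plus noise.
    q_true = np.array([0.8, -0.5])
    observed = keypoints(q_true) + 0.01 * rng.standard_normal((2, 2))

    # Stand-in edge cost: distance from a point to the true keypoints.
    edge_cost = lambda p: np.min(np.linalg.norm(keypoints(q_true) - p, axis=1))

    # Coarse initialisation (e.g. the mode of a predicted state
    # distribution), then fine refinement by nonlinear least squares.
    q_coarse = np.array([0.6, -0.3])
    fit = least_squares(residuals, q_coarse, args=(observed, edge_cost))
    palm_err = np.linalg.norm(keypoints(fit.x)[1] - keypoints(q_true)[1])
    print("refined state:", fit.x, " palm error:", palm_err)

The coarse guess plays the role of the state predicted from the depth image, and the weighted residual vector mirrors the idea of balancing learned keypoint correspondences against colour edge evidence in a single fitting objective.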