Dario Differt
Thu 09 Jul 2015, 13:00 - 14:00
Meeting room 1.15, IF

If you have a question about this talk, please contact: Steph Smith (ssmith32)

Many navigational algorithms are limited to movements in the plane and assume that the agent is not tilted (e.g. indoor cleaning robots).
If the agent is equipped with a fish-eye lens (providing a cylindrical projection of the visual field), a visual compass can be applied to two recorded images, I1 and I2, to estimate the agent's change of orientation between the two recording locations.
A visual compass performs an exhaustive search over all possible orientations of the agent for the one that minimises the image distance between I1 and I2.
This allows the images to be realigned, so that navigational algorithms can be applied more effectively.
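For illustration only, the following minimal sketch shows such an exhaustive planar compass, assuming grayscale cylindrical panoramas stored as numpy arrays of equal shape and a sum-of-squared-differences image distance (the actual distance measure used in the talk may differ):

import numpy as np

def visual_compass(I1, I2):
    # Exhaustive planar visual compass: try every horizontal (column) shift of
    # the cylindrical panorama I2 and return the in-plane rotation angle that
    # minimises the image distance to I1. One column corresponds to one azimuth.
    W = I1.shape[1]
    best_shift, best_dist = 0, np.inf
    for shift in range(W):                    # one candidate per yaw angle
        rotated = np.roll(I2, shift, axis=1)  # in-plane rotation = circular column shift
        dist = np.sum((I1.astype(float) - rotated.astype(float)) ** 2)  # SSD distance
        if dist < best_dist:
            best_shift, best_dist = shift, dist
    return best_shift * 2.0 * np.pi / W       # convert column shift to an angle in radians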
However, for many applications (e.g. lawn mowing) this assumption does not hold, and unexpected tilting of the agent may lead to unpredictable results.
Due to the increased complexity of searching over free 3D rotations, the computation time of an exhaustive visual compass becomes prohibitive in this case.
In this talk I will present a visual compass based on spherical harmonics, which can handle arbitrary 3D rotations in real time.
In analogy to the Fourier transform, we will project the visual field onto a sphere and represent it by a vector of spherical-harmonic coefficients.
A rotation of the visual field is then obtained by applying a special kind of rotation matrix (the Wigner-D matrices) to this coefficient vector.
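As a rough sketch of this representation (notation mine, not necessarily that of the talk): truncating the expansion at some order L, the visual field I and a rotation by Euler angles (alpha, beta, gamma) can be written as

    I(\theta, \varphi) \approx \sum_{l=0}^{L} \sum_{m=-l}^{l} c_{l,m} \, Y_{l,m}(\theta, \varphi),
    \qquad
    c'_{l,m} = \sum_{m'=-l}^{l} D^{l}_{m,m'}(\alpha, \beta, \gamma) \, c_{l,m'},

where the c_{l,m} form the coefficient vector, the Y_{l,m} are the spherical harmonics, and the Wigner-D matrices D^{l} mix coefficients only within the same degree l, which makes applying a rotation to the coefficient vector comparatively cheap.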
After a brief introduction to spherical harmonics, I will present sparsity relations for rotations around the X-/Y-/Z-axes.
Furthermore, I will show that symmetries of the spherical harmonics allow us to apply the compass to hemispherical images (e.g. from fish-eye cameras) instead of full-spherical images.
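For context, one standard symmetry of this kind (not necessarily the one exploited in the talk) is the parity of the spherical harmonics under reflection through the equatorial plane,

    Y_{l,m}(\pi - \theta, \varphi) = (-1)^{l+m} \, Y_{l,m}(\theta, \varphi),

which relates values on the lower hemisphere to those on the upper hemisphere and thus constrains the coefficients that can be recovered from a hemispherical image.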