How to update the activity of pose cells in RatSLAM?

This excerpt note is from Milford and Wyeth (2008), which explains the detailed process of updating the activity of pose cells in RatSLAM.

Milford, Michael J., and Gordon F. Wyeth. "Mapping a suburb with a single camera using a biologically inspired SLAM system." IEEE Transactions on Robotics (2008).
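As a concrete illustration of the kind of update the note describes, here is a minimal Python sketch of one pose-cell attractor iteration: activity spreads locally through a 3D Gaussian excitatory weight matrix with wrap-around boundaries, a global inhibition constant is subtracted, negative values are clamped, and total activity is renormalised. The grid size, kernel width, and inhibition value are illustrative assumptions rather than the paper's parameters, and the path-integration and local-view injection steps of RatSLAM are omitted.

```python
# Minimal sketch of one pose-cell attractor update step in the spirit of
# RatSLAM (Milford & Wyeth, 2008). The kernel width, inhibition constant,
# and grid size below are illustrative assumptions, not the paper's values.
import numpy as np
from scipy.ndimage import convolve

def gaussian_kernel_3d(size=7, sigma=1.0):
    """Separable 3D Gaussian used as the local excitatory weight matrix."""
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    k = np.einsum('i,j,k->ijk', g, g, g)
    return k / k.sum()

def update_pose_cells(P, kernel, inhibition=0.002):
    """One attractor iteration: local excitation (wrap-around), global
    inhibition, clamping to non-negative values, and normalisation."""
    # Local excitation spreads each cell's activity to its neighbours;
    # mode='wrap' gives the torus-like boundary of the pose-cell network.
    excited = P + convolve(P, kernel, mode='wrap')
    # Global inhibition plus clamping keeps a single dominant activity packet.
    inhibited = np.clip(excited - inhibition, 0.0, None)
    # Normalise so total activity stays constant.
    return inhibited / inhibited.sum()

# Example: 20 x 20 x 18 grid over (x', y', theta'), activity seeded at one cell.
P = np.zeros((20, 20, 18))
P[10, 10, 9] = 1.0
kernel = gaussian_kernel_3d()
for _ in range(5):
    P = update_pose_cells(P, kernel)
print("peak activity at", np.unravel_index(P.argmax(), P.shape))
```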


How to build robust, visual-inertial state estimation for autonomous navigation?

Robust, Visual-Inertial State Estimation: from Frame-based to Event-based Cameras

This lecture was presented by Professor Davide Scaramuzza (University of Zurich and ETH Zurich) on September 25, 2017, as part of the Microsoft Research Talks series.

Professor Davide Scaramuzza presented the main algorithms to …


Why can drones learn to navigate autonomously by imitating cars and bicycles?

Drones learn to navigate autonomously by imitating cars and bicycles powered by AI

January 23, 2018, by University of Zurich

Developed by UZH researchers, the algorithm DroNet allows drones to fly completely by themselves through the streets of a city
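As a rough sketch of the idea behind DroNet (a convolutional network trained on car and bicycle footage that maps a single camera frame to a steering angle and a collision probability), here is a simplified Python example. The real network is a ResNet-8; the architecture, layer sizes, and 200x200 grayscale input below are illustrative assumptions only.

```python
# Minimal sketch of the DroNet idea: a CNN maps a single grayscale camera
# frame to a steering angle and a collision probability. The real network is
# a ResNet-8; this tiny architecture is an illustrative assumption only.
import torch
import torch.nn as nn

class TinyDroNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Two output heads: regression (steering) and classification (collision).
        self.steer = nn.Linear(64, 1)
        self.collision = nn.Linear(64, 1)

    def forward(self, x):
        h = self.features(x)
        steering_angle = self.steer(h)                      # unbounded regression output
        collision_prob = torch.sigmoid(self.collision(h))   # in [0, 1]
        return steering_angle, collision_prob

# Example: one 200x200 grayscale frame in, steering command and collision risk out.
net = TinyDroNet()
frame = torch.rand(1, 1, 200, 200)
angle, p_coll = net(frame)
print(angle.item(), p_coll.item())
```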


How to enable robot cognitive mapping inspired by Grid Cells, Head Direction Cells and Speed Cells?

Taiping Zeng and Bailu Si. "Cognitive Mapping Based on Conjunctive Representations of Space and Movement." Frontiers in Neurorobotics 11 (2017).

In this work, the researchers developed a cognitive mapping model for mobile robots, taking advantage of the coding …
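As a hedged illustration of the conjunctive space-and-movement idea, the sketch below shifts an activity bump on a 2D sheet of spatially tuned cells according to the self-motion signal, so that movement updates the spatial code. The grid size, cell spacing, and shift mechanics are illustrative assumptions, not the network of Zeng and Si (2017).

```python
# Minimal sketch: a 2D sheet of spatially tuned cells carries an activity
# bump, and the self-motion signal (displacement per time step) shifts that
# bump, so movement drives the update of the spatial representation.
import numpy as np

def shift_bump(activity, velocity, cell_spacing=0.1):
    """Translate the activity bump by the displacement implied by velocity.
    velocity is (vx, vy) in metres per step; cell_spacing is metres per cell."""
    shift_cells = np.round(np.array(velocity) / cell_spacing).astype(int)
    # np.roll gives periodic boundaries, mimicking a torus-like neural sheet.
    return np.roll(activity, shift=tuple(shift_cells), axis=(0, 1))

# Example: 50x50 sheet, bump in the centre, robot moving 0.2 m per step along axis 0.
sheet = np.zeros((50, 50))
sheet[25, 25] = 1.0
for _ in range(5):
    sheet = shift_bump(sheet, velocity=(0.2, 0.0))
print(np.unravel_index(sheet.argmax(), sheet.shape))   # bump has moved 10 cells
```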


How to perform robot place recognition with a multi-scale, multi-sensor system inspired by place cells?

Adam Jacobson, Zetao Chen, and Michael Milford. "Leveraging variable sensor spatial acuity with a homogeneous, multi-scale place recognition framework." Biological Cybernetics, January 20, 2018. https://doi.org/10.1007/s00422-017-0745-7

This paper presented a biologically inspired multi-scale, multi-sensor place recognition system that incorporates the …
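As a hedged illustration of multi-scale matching, the sketch below runs a coarse pass over region-level descriptors and then a fine pass over the individual places inside the winning region. The descriptors, the two scales, and the Euclidean distance metric are illustrative assumptions, not the framework presented in the paper.

```python
# Minimal sketch of coarse-to-fine, multi-scale place matching: a query
# descriptor is matched against region-level (coarse) descriptors first, then
# against the individual place (fine) descriptors inside the best region.
import numpy as np

def best_match(query, candidates):
    """Index of the candidate descriptor closest to the query (Euclidean)."""
    d = np.linalg.norm(candidates - query, axis=1)
    return int(np.argmin(d))

def multiscale_localise(query, fine_map, places_per_region=10):
    """Coarse pass over region averages, fine pass inside the winning region."""
    n_regions = len(fine_map) // places_per_region
    coarse_map = fine_map[: n_regions * places_per_region].reshape(
        n_regions, places_per_region, -1).mean(axis=1)
    region = best_match(query, coarse_map)
    start = region * places_per_region
    local = best_match(query, fine_map[start:start + places_per_region])
    return start + local

# Example: 100 stored places (10 regions of 10) with 16-D descriptors,
# query taken near place 42.
rng = np.random.default_rng(0)
region_centres = rng.normal(scale=5.0, size=(10, 16))
fine_map = np.repeat(region_centres, 10, axis=0) + 0.5 * rng.normal(size=(100, 16))
query = fine_map[42] + 0.05 * rng.normal(size=16)
print(multiscale_localise(query, fine_map))   # expected: 42
```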


How Does Self-Motion Update the Head Direction Cell Attractor?

Laurens et al. (2018) review head direction cells. They propose a quantitative framework whereby the self-motion drive to the head direction attractor represents a multisensory self-motion estimate, computed through an internal model that uses sensory prediction errors of vestibular, visual, and somatosensory cues to improve …
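As a hedged illustration of that idea, the sketch below integrates a heading estimate from an internally predicted angular velocity and corrects the prediction with the error between it and a noisy vestibular measurement. The gains and noise model are illustrative assumptions, not the quantitative framework of Laurens et al. (2018).

```python
# Minimal sketch: the head-direction estimate is driven by an internal
# self-motion prediction, and a sensory prediction error (here, a vestibular
# angular-velocity measurement) corrects that prediction each step.
import numpy as np

def update_heading(heading, omega_est, vestibular_meas, dt=0.01, gain=0.2):
    """One step of heading integration with prediction-error feedback.
    heading: current heading estimate (rad)
    omega_est: internally predicted angular velocity (rad/s)
    vestibular_meas: measured angular velocity from the vestibular cue (rad/s)
    """
    error = vestibular_meas - omega_est         # sensory prediction error
    omega_corrected = omega_est + gain * error  # internal-model update
    return (heading + omega_corrected * dt) % (2 * np.pi), omega_corrected

# Example: true rotation at 1 rad/s for 2 s, internal prediction biased low.
rng = np.random.default_rng(1)
heading, omega_est = 0.0, 0.8
for _ in range(200):
    vestibular = 1.0 + 0.05 * rng.normal()      # noisy vestibular measurement
    heading, omega_est = update_heading(heading, omega_est, vestibular)
print(heading, omega_est)   # heading approaches the true 2 rad turn; omega -> ~1
```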


How to determine the instantaneous head direction by a population vector scheme?

This excerpt note is from Song and Wang (2005). The example code was implemented by Fangwen; the whole example code is here.

Pengcheng Song and Xiao-Jing Wang. "Angular path integration by moving 'hill of activity': a spiking neuron model without recurrent excitation of the head-direction system." Journal of Neuroscience 25 (2005).
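As a hedged illustration of the population vector scheme (separate from Fangwen's example code), the sketch below decodes the instantaneous head direction as the angle of the rate-weighted sum of each cell's preferred-direction unit vector. The cell count and tuning width are illustrative assumptions; see Song and Wang (2005) for the full spiking model.

```python
# Minimal sketch of population-vector readout of instantaneous head direction:
# each cell votes with a unit vector along its preferred direction, weighted by
# its firing rate, and the angle of the summed vector is the decoded heading.
import numpy as np

N = 100                                            # head direction cells
preferred = np.linspace(0.0, 2 * np.pi, N, endpoint=False)

def decode_head_direction(rates):
    """Angle of the rate-weighted sum of preferred-direction unit vectors."""
    x = np.sum(rates * np.cos(preferred))
    y = np.sum(rates * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

# Example: a Gaussian "hill of activity" centred on 120 degrees, plus noise.
true_direction = np.deg2rad(120.0)
diff = np.angle(np.exp(1j * (preferred - true_direction)))   # wrapped angle difference
rates = np.exp(-0.5 * (diff / 0.4) ** 2) + 0.05 * np.random.default_rng(2).random(N)
print(np.rad2deg(decode_head_direction(rates)))    # close to 120
```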
