
© 2017. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

This paper introduces an event-based, luminance-free feature computed from the output of asynchronous event-based neuromorphic retinas. The feature maps the distribution of the optical flow along the contours of moving objects in the visual scene into a matrix. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each asynchronously generating "spiking" events that encode relative changes in pixel illumination at high temporal resolution. The optical flow is computed at each event and is integrated, locally or globally, into a grid indexed by speed and direction, using speed-tuned temporal kernels. The latter ensure that the resulting feature represents the distribution of normal motion along the current moving edges equitably, whatever their respective dynamics. The usefulness and generality of the proposed feature are demonstrated in two pattern recognition applications: local corner detection and global gesture recognition.
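The abstract's description of the feature (per-event normal flow accumulated into a speed/direction grid, weighted by speed-tuned temporal kernels) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the bin counts, the normalization bound `MAX_SPEED`, the decay length `TAU_PIXELS`, and the function name `flow_feature` are all assumptions for the sake of the example.

```python
import math

# Hypothetical sketch of the speed/direction feature grid described in
# the abstract. Parameter names and values are assumptions, not the
# authors' notation.
N_SPEED_BINS = 8
N_DIR_BINS = 16
MAX_SPEED = 1.0    # assumed speed normalization bound (px per time unit)
TAU_PIXELS = 5.0   # assumed decay length in pixels; lifetime = TAU_PIXELS / speed

def flow_feature(events, t_now):
    """Accumulate per-event flow estimates into a speed x direction grid.

    events: iterable of (t, speed, direction) normal-flow estimates,
    with direction in radians.
    """
    feature = [[0.0] * N_DIR_BINS for _ in range(N_SPEED_BINS)]
    for t, speed, direction in events:
        if speed <= 0.0:
            continue
        # Speed-tuned temporal kernel: faster edges decay faster, so
        # slow and fast moving edges contribute over a comparable
        # spatial extent ("equitable" representation in the abstract).
        w = math.exp(-(t_now - t) * speed / TAU_PIXELS)
        si = min(int(speed / MAX_SPEED * N_SPEED_BINS), N_SPEED_BINS - 1)
        di = int((direction % (2 * math.pi)) / (2 * math.pi) * N_DIR_BINS) % N_DIR_BINS
        feature[si][di] += w
    return feature
```

Summed over a local neighborhood this grid would serve the corner-detection use case; summed over the whole scene, the gesture-recognition one.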

Details

Title
A Motion-Based Feature for Event-Based Pattern Recognition
Author
Clady, Xavier; Maro, Jean-Matthieu; Barré, Sébastien; Benosman, Ryad B
Section
Original Research Article
Publication year
2017
Publication date
Jan 4, 2017
Publisher
Frontiers Research Foundation
ISSN
1662-4548
e-ISSN
1662-453X
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2305550628