Abstract

Inspired by bird flight, flapping‐wing robots have gained significant attention due to their high maneuverability and energy efficiency. However, the development of their perception systems faces several challenges, mainly related to payload restrictions and the effects of flapping strokes on sensor data. The limited resources of lightweight onboard processors further constrain the online processing required for autonomous flight. Event cameras exhibit several properties suitable for ornithopter perception, such as low latency, robustness to motion blur, high dynamic range, and low power consumption. This article explores the use of event‐based vision for online processing onboard flapping‐wing robots. First, the suitability of event cameras under flight conditions is assessed through experimental tests. Second, the integration of event‐based vision systems onboard flapping‐wing robots is analyzed. Finally, the performance, accuracy, and computational cost of some widely used event‐based vision algorithms are experimentally evaluated when integrated into flapping‐wing robots flying in indoor and outdoor scenarios under different conditions. The results confirm the benefits and suitability of event‐based vision for online perception onboard ornithopters, paving the way for enhanced autonomy and safety in real‐world flight operations.

Full text

© 2025. This work is published under https://creativecommons.org/licenses/by/4.0/ (the "License").