Inspired by bird flight, flapping‐wing robots have gained significant attention due to their high maneuverability and energy efficiency. However, the development of their perception systems faces several challenges, mainly related to payload restrictions and the effects of flapping strokes on sensor data. The limited resources of lightweight onboard processors further constrain the online processing required for autonomous flight. Event cameras exhibit several properties suitable for ornithopter perception, such as low latency, robustness to motion blur, high dynamic range, and low power consumption. This article explores the use of event‐based vision for online processing onboard flapping‐wing robots. First, the suitability of event cameras under flight conditions is assessed through experimental tests. Second, the integration of event‐based vision systems onboard flapping‐wing robots is analyzed. Finally, the performance, accuracy, and computational cost of several widely used event‐based vision algorithms are experimentally evaluated when integrated into flapping‐wing robots flying in indoor and outdoor scenarios under different conditions. The results confirm the benefits and suitability of event‐based vision for online perception onboard ornithopters, paving the way for enhanced autonomy and safety in real‐world flight operations.
Details
Luna‐Santamaria, Javier 1; Gutierrez Rodriguez, Ivan 1; Rodríguez‐Gómez, Juan Pablo 1; Martínez‐de Dios, José Ramiro 1; Ollero, Anibal 1
1 GRVC Robotics Lab, University of Seville, Seville, Spain