Abstract
This article introduces a lightweight, efficient distance-measuring application intended to enhance an existing UAV monitoring system. The primary objective of this project is to develop a measuring application capable of determining and displaying the distance between the UAV's camera and a facial model. The YOLOv8 framework is employed as the detection model to identify and interpret objects within the region of interest. The algorithm also applies the focal-length principle of lenses to calculate the distance between a human face and the camera. To assess the algorithm's accuracy, facial models were placed at various distances from the camera during testing, and the distances predicted by the algorithm were compared with the actual distances measured with a measuring tape. The results showed a maximum tolerance of ±0.9 cm, indicating the algorithm's reliable performance in predicting distance.
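The focal-length approach mentioned in the abstract is the standard pinhole-camera similar-triangles relation: an object of known real width appears smaller in the image as it moves away, so distance can be recovered from its detected pixel width. A minimal sketch of this idea is shown below; the function names, the 15 cm reference face width, and the calibration values are illustrative assumptions, not taken from the article.

```python
def calibrate_focal_length(known_distance_cm: float,
                           known_width_cm: float,
                           width_in_pixels: float) -> float:
    """Calibrate the focal length (in pixels) from one reference image
    of an object of known width placed at a known distance."""
    return (width_in_pixels * known_distance_cm) / known_width_cm


def estimate_distance(focal_length_px: float,
                      known_width_cm: float,
                      width_in_pixels: float) -> float:
    """Invert the similar-triangles relation: distance = (W * f) / w,
    where W is the real object width and w its width in the image."""
    return (known_width_cm * focal_length_px) / width_in_pixels


# Illustrative calibration: a 15 cm wide face appears 150 px wide at 100 cm.
f_px = calibrate_focal_length(known_distance_cm=100.0,
                              known_width_cm=15.0,
                              width_in_pixels=150.0)

# A new detection (e.g. a YOLOv8 bounding box) 75 px wide implies 200 cm.
d_cm = estimate_distance(f_px, known_width_cm=15.0, width_in_pixels=75.0)
```

In the system described, the pixel width would come from the YOLOv8 bounding box around the detected face; the calibration step needs only one reference image at a tape-measured distance.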
Details
1 Faculty of Electrical Engineering and Technology, Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia; Centre of Excellence for Unmanned Aerial Systems (COE-UAS), Universiti Malaysia Perlis, Block E, Pusat Perniagaan Pengkalan Jaya, Jalan Kangar – Alor Setar, 01000 Kangar, Perlis, Malaysia
2 Centre of Excellence for Unmanned Aerial Systems (COE-UAS), Universiti Malaysia Perlis, Block E, Pusat Perniagaan Pengkalan Jaya, Jalan Kangar – Alor Setar, 01000 Kangar, Perlis, Malaysia
3 Centre of Excellence for Unmanned Aerial Systems (COE-UAS), Universiti Malaysia Perlis, Block E, Pusat Perniagaan Pengkalan Jaya, Jalan Kangar – Alor Setar, 01000 Kangar, Perlis, Malaysia; Department of Medical Equipment Technology Engineering, College of Engineering Technology, Al-Kitab University, Kirkuk, Iraq
4 Centre of Excellence for Unmanned Aerial Systems (COE-UAS), Universiti Malaysia Perlis, Block E, Pusat Perniagaan Pengkalan Jaya, Jalan Kangar – Alor Setar, 01000 Kangar, Perlis, Malaysia; Department of Robotics and Mechatronics Engineering, Kennesaw State University, Marietta, GA 30067, USA