1. Introduction
Unmanned aerial vehicles (UAVs), commonly known as drones, are gaining considerable interest in applications such as surveillance, mapping, and remote sensing [1]. The growing interest in UAVs rests on many factors, including their low acquisition cost, the availability of trained operators, low risk to human life, and ease of use. Owing to these advantages, together with good resolution and tracking capabilities, they are being used in an increasing number of fields.
After they were first used in geomatics applications, providing alternatives to classical photogrammetry [2] and 3D mapping [3] to present data in a suitable format for architects and engineers [4], UAVs became commonly used tools for data acquisition. Through photogrammetry techniques and remote sensing, structure from motion (SfM) applications allow for the creation of 3D models of different objects, buildings, or areas [5].
In the last few years, UAVs have found applications in civil engineering, especially in transportation engineering, where they are used to supervise and monitor traffic [6]. The main benefit of traffic monitoring with UAVs is that they can be deployed to many different places where, for example, a local council may want to gather information on the use of infrastructure such as roads, bridges, and train tracks. The same applies to monitoring people and animals for conservation purposes [7,8] and, more recently, to combating the coronavirus disease (COVID-19) pandemic [9]. Because of their mobility, traffic monitoring UAVs can collect high-resolution data, which can then be analyzed in real time. The results can be displayed or printed and, in some cases, sent to a central server or cloud for further analysis.
Growing traffic volumes and the growth of global travel make traffic monitoring a problem of interest and a major challenge in many countries around the world. In this context, UAVs are expected to be an emerging solution to this challenge [10]. The bird's-eye view provided by UAV cameras improves on the traditional methodologies used in traffic monitoring [11], but the recognition and tracking of moving vehicles remains a challenging problem that depends on the accuracy of image registration methods [12].
The use of UAVs for monitoring purposes is a relatively new field that requires the development and validation of new solutions [13]. UAVs represent a potential solution to support many aspects of existing traffic monitoring systems, such as surveillance and collision avoidance [14]. In a similar way, UAVs have been applied to the monitoring of environmental parameters, e.g., air pollution, land surface temperature, flood risk, forest fires, road surface distress, and land terrain [15,16,17,18,19,20], but also to pedestrian traffic monitoring and disaster evacuation [21,22,23].
Currently, UAVs are very vulnerable to adverse weather conditions, such as wind, fog, and rain [24]. However, there have been significant efforts to improve the robustness of the systems using different types of sensors. For instance, GPS-enabled UAVs have been shown to provide a reasonable degree of robustness and accuracy in challenging environments [25]. Moreover, the use of inertial measurement units (IMUs) for UAV stabilization has received significant attention and has been applied to different types of UAVs including quadcopters [26].
In traffic monitoring, precision is needed to collect and send real-time vehicle data to traffic processing centers for efficient traffic management. This is of special significance to cities where traffic and road conditions are monitored every day. In most cases, wireless sensors are deployed on the road and connected to each other via wireless communication networks to obtain real-time traffic data within the intelligent transportation system (ITS) [27]. In addition, in the case of vehicle monitoring, it is necessary to identify the speed, distance, and current location of the vehicle. In this context, UAVs have significant advantages in traffic information collection because they provide a global perspective of the road and they can obtain traffic parameters that cannot be extracted by conventional monitoring methods [28].
Traffic monitoring represents a challenge not only for police and traffic authority departments, but also for individual drivers. There is great potential in UAVs for assisting drivers in a variety of traffic-related applications, including safety, incident detection, and vehicle tracking [11]. Some problems that could be addressed with UAVs in the future are traffic congestion [29,30], collision avoidance [31], safety analysis [32], and roundabout flow analysis [33]. Driver assistance can be provided via UAV-to-car communication [34].
The aim of this paper is to analyze the main applications regarding the use of UAVs in traffic monitoring. A systematic literature review was conducted for this purpose and the results provide a base for future research and development in this field. The study also highlights the current surveys related to the use of UAVs in the civil engineering field.
2. Related Work
Research on the use of UAVs in civil engineering related to transportation is relatively limited, comprising several literature reviews that summarize a wide range of applications (Table 1). These studies address the following topics:
In [35], a review of optimization approaches for drone operations and drone–truck combined operations in civil applications is provided. The study presents drone operations and applications, previous works, and issues such as mathematical models, solution methods, and synchronization between a drone and a truck, and also suggests possible research directions.
The recent advances of UAVs and their roles in current and future transportation systems are presented in [10]. The paper summarizes the emerging technologies of UAV in transportation, highlighting performance measures, network and communications, software architecture, privacy, and security concerns. The challenges and opportunities of integrating UAVs in ITS are discussed and some potential research directions are identified in the paper.
In [36], a literature review of 111 publications related to the use of civil drones for transportation is provided. The focus is on passenger transportation drones, but applications from the urban and transportation planning fields are also reviewed. Potential problems are identified, and proposed solutions are given for different areas of application.
Emerging issues in civilian UAV usage and case studies for various fields are presented in [37], a review article that tries to analyze the potential implementations of drones in the economic system and how these implementations can be managed.
The state of the art of UAV for geomatics applications is reported in [3]. The survey gives an overview of different UAV platforms, also presenting various applications, approaches, and perspectives for UAV image processing.
Ref. [38] provides an extensive review of optimization approaches for the civil application of UAVs. The study addresses different aspects related to UAV operation, such as area coverage, search operations, routing, data gathering and recharging, communication links, and computing power.
In [11], the applications of UAVs in three domains of transportation (road safety, traffic monitoring, and highway infrastructure management) are reviewed. The paper discusses topics related to vision algorithms and image processing systems used in accident investigation, traffic flow analysis, and road monitoring.
An overview of advances in the vision-based condition assessment of civil infrastructure, civil infrastructure inspection, and monitoring applications is presented in [39]. The study reviews relevant findings in computer vision, machine learning, and structural engineering, highlighting some key challenges and concluding with ongoing work.
Another study [40] presents the research on using UAVs for vehicle detection by means of deep learning techniques. The work is focused on accuracy improvements and computation overhead reduction, showing similarities and differences of various techniques.
A comprehensive study focused on UAV civil applications and their challenges is presented in [12]. Research trends, key challenges related to charging, collision avoidance, networking and security, and future insights are featured in the paper.
In [41], a critical review of UAV remote sensing data processing and its applications is performed, focusing on land-cover classification and change detection and discussing potential improvements and algorithmic aspects.
Table 1. Related review papers on UAV applications in civil engineering.
No. | Ref. | Title | Year | Journal | Application Domain
---|---|---|---|---|---
1 | [35] | Optimization for drone and drone-truck combined operations: A review of the state of the art and future directions | 2020 | Computers and Operations Research | civil applications including construction/infrastructure, agriculture, transportation/logistics, security/disaster management, entertainment/media, etc. |
2 | [10] | Advances of UAVs toward Future Transportation: The State-of-the-Art, Challenges, and Opportunities | 2021 | Future Transportation | transportation sector: surveillance, urban planning, traffic monitoring, emergency response, road maintenance and safety, warehouse inventory management, UAV delivery, disaster management, search and rescue |
3 | [36] | Drones for parcel and passenger transportation: A literature review | 2020 | Transportation Research Interdisciplinary Perspectives | safety and security, environment and sustainability, urban planning and infrastructure |
4 | [37] | Managing the drone revolution: A systematic literature review into the current use of airborne drones and future strategic directions for their effective control | 2020 | Journal of Air Transport Management | monitoring, inspection and data collection, photography/image collection, recreation, logistics |
5 | [3] | UAV for 3D mapping applications: a review | 2014 | Applied Geomatics | archeological site 3D recording and modeling, geological and mining studies, urban areas |
6 | [38] | Optimization approaches for civil applications of unmanned aerial vehicles (UAVs) or aerial drones: A survey | 2018 | Networks | agriculture, environmental protection and disaster management, rescue, transport, infrastructure and construction, air traffic management, manufacturing, traffic surveillance, telecommunications, entertainment and media |
7 | [11] | Applications of unmanned aerial vehicle (UAV) in road safety, traffic and highway infrastructure management: Recent advances and challenges | 2020 | Transportation Research Part A | road safety, traffic monitoring and highway infrastructure management |
8 | [39] | Advances in Computer Vision-Based Civil Infrastructure Inspection and Monitoring | 2019 | Engineering | Inspection, monitoring |
9 | [40] | A survey of deep learning techniques for vehicle detection from UAV images | 2021 | Journal of Systems Architecture | traffic management—vehicle detection |
10 | [12] | Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges | 2019 | IEEE Access | search and rescue, remote sensing, construction and infrastructure inspection, precision agriculture, delivery of goods, real-time monitoring of road traffic, surveillance, providing wireless coverage |
11 | [41] | Unmanned Aerial Vehicle for Remote Sensing Applications—A Review | 2019 | Remote Sensing | precision agriculture and vegetation, urban environment and management, disaster, hazards and rescue |
Despite the diversity of UAV analyses in transportation-related areas, less attention has been paid to advances in traffic monitoring techniques using UAV data, which are only briefly addressed in the presented studies. Thus, to our knowledge, there is no overview of the acquisition and processing of data received from UAVs in traffic monitoring applications in urban areas. Only one slightly older conference paper deals exclusively with this topic, presenting the advantages and disadvantages of various research efforts at universities and research centers [42]. This paper is therefore a first attempt at a review that strictly addresses this topic, and it opens the door to further research on drone monitoring applications that use various detection algorithms. Although many articles have investigated the use of UAVs in various fields (mining [43], architecture and urbanism [44], glacial and periglacial geomorphology [45], agriculture [46], geology [5], forest regeneration [47], water monitoring [48], etc.), no study yet exclusively summarizes the applications of UAVs in urban traffic monitoring and analysis.
The wide adoption of drones in various applications is reflected in the large number of review studies that systematize the work in fields corresponding to the latest technologies, such as: path planning techniques [49], computer vision algorithms [50], application of blockchain [51], swarm communication and routing protocols [52], configurations and flight mechanisms [53], optical remote sensing applications [54], communication and networking [55], regulation policies and technologies [56], mobile edge computing for Internet of Things (IoT) applications [57], photogrammetry and remote sensing [58], deep learning approaches for road extraction [59], and advances toward future transportation. As shown in [49], the share of UAVs in the transportation system is expected to reach 81% by the year 2022.
3. Materials and Methods
A systematic review of the literature covering relevant research over the last 10 years was performed. The papers were selected according to the recommendations of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) (Salameh, 2020).
3.1. Protocol and Registration
The methods and the hypothesis of the review were prepared a priori, but they were not registered on PROSPERO.
3.2. Eligibility Criteria
The papers were selected according to the following inclusion criteria: articles addressing UAVs with a focus on traffic monitoring or traffic analysis; articles published in English; articles published in peer-reviewed journals; articles published from 2010 onwards; research articles.
The exclusion criteria were the following: duplicate articles; articles addressing the use of UAVs in contexts other than car traffic; articles published in languages other than English; articles published before 2010; conference papers, book sections, and editorial letters; reviews and conceptual papers. Articles that focused on simulations instead of real-world data were also excluded.
Although they may provide interesting and valuable works, publications that did not meet these criteria were not included in the study in order to ensure a high-quality standard of investigation.
3.3. Information Sources
The research was carried out on five electronic databases: Scopus, Web of Science, Science Direct, IEEE Xplore, and Springer. The filtering facilities provided by the electronic databases were used to identify the items according to the eligibility criteria presented above. The search was performed on 25 August 2021.
3.4. Search
Search terms included were: UAV, unmanned aerial vehicle, uncrewed aerial vehicle, drone, unmanned aerial system, traffic, transport, flow, road, analysis, monitoring, surveillance, management, observation, vehicle detection, congestion, urban, city, intersection. The keywords were combined with Boolean operators according to the search possibilities provided by each database.
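As an illustration, the listed terms could be grouped and combined with Boolean operators roughly as follows. This is a hypothetical sketch: the grouping, the quoting, and the query syntax shown (Scopus-like) are assumptions, not the exact queries used by the authors, and each database requires its own syntax.

```python
# Hypothetical grouping of the search terms into Boolean OR groups
# joined with AND; the grouping itself is an illustrative assumption.
platform_terms = ["UAV", "unmanned aerial vehicle", "uncrewed aerial vehicle",
                  "drone", "unmanned aerial system"]
topic_terms = ["traffic", "transport", "flow", "road", "vehicle detection",
               "congestion", "urban", "city", "intersection"]
activity_terms = ["analysis", "monitoring", "surveillance", "management",
                  "observation"]

def or_group(terms):
    """Join a list of terms into a parenthesized, quoted OR group."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = " AND ".join(or_group(g) for g in
                     (platform_terms, topic_terms, activity_terms))
print(query)
```

In practice, the generated string would be adapted to each database's field tags (e.g., title/abstract/keyword search) before execution.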
The results obtained were exported to EndNote (Clarivate™, Philadelphia, PA, USA). Using this software, the duplicates were removed and an initial screening of titles and abstracts followed to extract relevant studies.
Figure 1 shows a visualization of the author keywords related to UAV use for traffic monitoring. It was obtained using the VOSviewer software, and several clusters can be observed, formed according to the author keywords: aerial vehicle, drone, traffic, image, communication.
3.5. Study Selection
The authors of this article (R.G.B. and E.V.B.) performed the search and selection of papers to be included in the study. When disagreements arose between the two reviewers, they were resolved by consensus. Papers were included in this review if they were relevant to a UAV system used for traffic surveillance purposes.
3.6. Data Extraction
Data extraction was performed independently by the two reviewers (R.G.B. and E.V.B.), and disagreements were also resolved by consensus. The following data were extracted from each study: author, publication year, country, paper objective, UAV type, camera resolution, flying height, software technique, urban area, outcomes, vehicle type, main findings, and future work. This information was added to Microsoft Excel (Microsoft, Redmond, WA, USA) for further analysis.
4. Results
4.1. Study Selection
A flow-chart diagram showing the selection process according to the PRISMA guidelines is presented in Figure 2. A total of 2557 articles were found, of which 191 were duplicates. After their removal, 2366 studies were screened by title and abstract, and 2278 were excluded because they were not relevant to our study. The remaining 88 papers were selected for full-text screening. Of these, 54 were excluded (no full text available: two; magazine article: one; review article: three; not relevant to the study: 48). Finally, 34 papers were considered eligible for inclusion in the review.
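The selection counts above form a simple arithmetic chain, which can be checked directly (the numbers are those reported in the PRISMA flow):

```python
# Sanity check of the PRISMA selection arithmetic reported above.
records_found = 2557
duplicates = 191
screened = records_found - duplicates               # title/abstract screening
excluded_title_abstract = 2278
full_text = screened - excluded_title_abstract      # full-text screening
excluded_full_text = 2 + 1 + 3 + 48                 # no full text, magazine,
                                                    # review, not relevant
included = full_text - excluded_full_text           # final review set
print(screened, full_text, included)  # → 2366 88 34
```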
4.2. Study Characteristics
The review process identified a total of 34 studies: Ahmed et al., 2021 [60], Apeltauer et al., 2015 [61], Balamuralidhar et al., 2021 [62], Barmpounakis et al., 2018 [63], Barmpounakis et al., 2019 [64], Barmpounakis and Geroliminis 2020 [65], Brkić et al., [66], Chen et al., 2019 [67], Chen et al., 2021 [68], Guido et al., 2016 [69], Javadi et al., 2021 [70], Kang and Mattyus 2015 [71], Kaufmann et al., 2018 [72], Ke et al., 2017 [73], Khan et al., 2017 [74], Khan et al., 2018 [75], Khan et al., 2020 [76], Kujawski and Dudek 2021 [77], Li et al., 2019 [78], Li et al., 2020 [79], Liu and Zhang 2021 [28], Luo et al., 2020 [80], Moranduzzo and Melgani 2014 [81], Shan et al. [82], Wan et al., 2019 [83], Wang et al., 2016a [84], Wang et al., 2016b [85], Wang et al., 2019 [86], Wang et al., 2019 [87], Xing et al., 2020a [88], Xing et al., 2020b [89], Xu et al., 2016 [90], Zhu et al., 2018a [91], Zhu et al., 2018b [92].
The quantitative analysis of the publications is presented in Figure 3, where the distribution of papers in terms of journal, publication year, and country is shown. The figure was generated with an online tool [93]. As can be seen, the analyzed articles were published in top-ranking journals such as Automation in Construction (AC), Transportation Research: Part A (TR_A), and the IEEE Internet of Things (IoT) Journal. Most of the identified articles were published in Remote Sensing (six studies), followed by Accident Analysis & Prevention (AAP) (four studies), IEEE Access (two studies), IEEE Transactions on Intelligent Transportation Systems (two studies), and Transportation Research: Part C (two studies).
Regarding the year of publication, there is a growing trend from 2014 to 2021. This is to be expected given the continuous growth of the UAV market [94]. Further, the country where the experiment was conducted or where the research center is located was considered. The country that overwhelmingly dominates in terms of the number of publications on the proposed topic is China, with 16 studies, followed by Greece with three studies, and Belgium, Germany, and Italy with two studies each. For one of the studies, even though the authors belong to an institution in Switzerland, the experiment described was performed in Athens, so Greece was considered the host country.
4.3. Synthesis of Results
The synthesis of the results is provided in Table 2, where some significant data are extracted, and Table A1, where objective findings and future work for each study are summarized.
4.4. Main Purpose of the Study
The selected works were classified according to their main purpose into two main categories, as can be seen in Table 3: traffic analysis and traffic monitoring. For each of these categories, the basic objective of the study was extracted, and several subcategories could be identified. Most studies in the first category address issues related to vehicle trajectory extraction, traffic parameter estimation, congestion analysis, or conflict evaluation. The traffic monitoring category includes works that mainly address vehicle detection, vehicle tracking, or vehicle collision detection.
The articles were classified into the two categories according to the following criteria: the traffic monitoring category includes studies focused only on the identification and/or tracking of vehicles in traffic, often in real time; the traffic analysis category includes studies that present a more detailed analysis of certain traffic parameters, such as traffic density estimation, recognition of vehicle behavior, or assessment of the risk of collisions. In most cases, traffic analysis studies include elements of traffic monitoring: vehicles are first detected and tracked, and further analyses are then performed. The analysis, visualization, and interpretation of data obtained from UAV cameras require intelligent processing systems [77]. Thus, the trajectory extraction of multiple vehicles requires several steps: preprocessing, stabilization, georegistration, vehicle detection and tracking, and trajectory management [74]. The first four steps are usually common to both traffic monitoring and traffic analysis applications.
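The detection step of the pipeline described above can be illustrated with the simplest motion-based approach, background subtraction: pixels that differ sufficiently from a background model are flagged as moving. The sketch below is a minimal NumPy-only illustration on synthetic grayscale frames, with an illustrative threshold; it is not the method of any specific reviewed study, where stabilized real video and more robust background models are used.

```python
import numpy as np

def detect_moving_pixels(frame, background, threshold=25):
    """Return a boolean mask of pixels that differ from the background
    by more than `threshold` gray levels (illustrative value)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Synthetic 100x100 scene: uniform background plus a bright 10x10 "vehicle".
background = np.full((100, 100), 80, dtype=np.uint8)
frame = background.copy()
frame[40:50, 60:70] = 200  # the moving object

mask = detect_moving_pixels(frame, background)
print(int(mask.sum()))  # → 100 changed pixels (the 10x10 object)
```

In real pipelines, the mask would then be cleaned with morphological operations and grouped into blobs, each blob becoming a vehicle candidate for the tracking step.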
A variety of techniques implemented in the analyzed studies focus on vehicle detection, tracking, and/or extraction of traffic parameters. The vast majority of efforts used UAVs to record a certain area and then extract significant information from the videos. The information can be extracted manually [60], semi-automatically [63], or fully automatically [28]. For vehicle detection, some of the works used conventional computer vision techniques focused on feature extraction, such as interest point detection (Shi–Tomasi features) [73], the scale-invariant feature transform (SIFT) [67,70], histogram of oriented gradients (HOG) features [58,76], local binary patterns (LBP) [49], the Viola–Jones object detection scheme [58,76], and Haar-like features [56], together with classifiers such as the support vector machine (SVM) [81], the AdaBoost classifier [49,58,76], or k-means clustering [70]. Moreover, among fully automatic tracking techniques, traditional motion-based methods can be identified, e.g., optical flow (e.g., the Kanade–Lucas algorithm) [73,74,75,84,86], background subtraction [61,74,75,77,80], particle filters [28,61,83], correlation filters [68], and Kalman filters [65,75,78,82,87,92].
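Of the tracking methods listed, the Kalman filter is the most widely cited. A minimal constant-velocity Kalman filter for one vehicle's (x, y) position can be sketched as follows; the transition and noise matrices are generic textbook assumptions, not values from any of the reviewed studies.

```python
import numpy as np

# Constant-velocity Kalman filter: state [px, py, vx, vy], position-only
# measurements. Noise levels are illustrative assumptions.
dt = 1.0
F = np.array([[1, 0, dt, 0],    # position advances by velocity * dt
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],     # we observe position only
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 0.01            # process noise
R = np.eye(2) * 1.0             # measurement noise

x = np.zeros(4)                 # initial state estimate
P = np.eye(4) * 10.0            # initial uncertainty

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# A vehicle moving 1 px/frame along x; noise-free detections for brevity.
for t in range(1, 20):
    x, P = kalman_step(x, P, np.array([float(t), 0.0]))
print(x[:2], x[2:])  # position near (19, 0), x-velocity approaching 1.0
```

With noisy detections, the same predict/update loop smooths the trajectory and bridges short detection gaps, which is why it pairs naturally with the detectors above.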
In addition to these classic methods, deep learning-based methods have been developed using two-stage detectors such as deep neural networks (DNN) [70], convolutional neural networks (CNN) [28,79], Faster R-CNN [66], and fully connected neural networks (fcNN) [70], or one-stage detectors: You Only Look Once (YOLO) [28,70,78,82,87], RetinaNet [91], and the single shot multibox detector (SSD) [92]. These studies showed that deep learning-based methods are more effective than traditional computer vision techniques in traffic video analysis [92]. As noted above, researchers typically follow several well-defined steps to detect and track moving vehicles and extract their trajectories: pre-processing, stabilization, geo-registration, vehicle detection and tracking, and trajectory management [75].
Regarding the variables taken into account for the evaluation of the proposed systems, some authors used parameters such as speed [60,64,78,85], traffic density [66,73], vehicle counts [77,92], and vehicle trajectories [80,91], as well as parameters related to the performance of the developed method: precision [79], accuracy [81], F1 score [87], and correctness, completeness, and quality [84,90]. In the vast majority of studies, there is no difference between the types of vehicles identified, but in some of them, vehicles are classified into various categories, such as cars, buses, trucks, and motorbikes, and pedestrians are even detected in several studies.
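The detection-quality metrics named above are all simple functions of the true positive (TP), false positive (FP), and false negative (FN) counts. A short sketch, with made-up counts for illustration (note that in the photogrammetric literature, "correctness" and "completeness" commonly correspond to precision and recall, and "quality" to TP/(TP + FP + FN); this mapping is a common convention, assumed here rather than taken from [84,90]):

```python
# Standard detection metrics from TP/FP/FN counts; counts are illustrative.
def detection_metrics(tp, fp, fn):
    precision = tp / (tp + fp)              # a.k.a. correctness
    recall = tp / (tp + fn)                 # a.k.a. completeness
    f1 = 2 * precision * recall / (precision + recall)
    quality = tp / (tp + fp + fn)           # common "quality" definition
    return precision, recall, f1, quality

p, r, f1, q = detection_metrics(tp=90, fp=10, fn=10)
print(p, r, round(f1, 3), round(q, 3))  # → 0.9 0.9 0.9 0.818
```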
The drones used for data acquisition are of various types, the most used being those produced by DJI Technology Co., Ltd., Shenzhen, China, especially the Phantom series (Phantom 2, 3, and 4, used in seven studies), Inspire 1 (three studies), Mavic Pro (two studies), and Matrice 100 (two studies). Thirteen studies did not mention the UAV model. The flying height varies from 50 to 281 m, but this parameter is reported in only a few studies. The resolution of the images varies from 960 × 540 at 24 frames per second (fps) to 5184 × 3456.
The objective of the studies, the main findings, and the future work for each selected study are presented in Table A1 from Appendix A.
5. Discussion
The applications related to traffic monitoring and analysis identified in the literature review include different techniques for vehicle detection and tracking and for the estimation or extraction of different traffic parameters. The variety of approaches can be divided into two categories: conventional machine vision techniques and deep learning machine vision techniques [95]. The conventional motion-based methods use traditional machine learning and computer vision techniques to detect and track vehicles, e.g., background subtraction, optical flow, blob analysis [74], histogram of oriented gradients (HOG), Haar-like features, speeded-up robust features (SURF), and so on. The most recent techniques are based on deep learning, and it has been shown that they outperform the traditional ones, providing better feature representation and processing time [70]. There are many object detection methods based on deep learning, but they are often divided into one-stage and two-stage detectors [87]. While one-stage detectors predict object classes and bounding boxes directly from the image, two-stage detectors first use techniques such as a region proposal network (RPN) to predict the locations of potential objects [66]. YOLO and SSD are one-stage CNN detectors [79], while R-CNN, Fast R-CNN, Faster R-CNN, and Mask R-CNN are two-stage detectors [70]. Even though two-stage detectors are more advanced, they require high hardware performance [62]. Moreover, it was shown that YOLO v3 outperforms R-CNN and runs significantly faster [78].
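Both one-stage and two-stage detectors produce overlapping candidate boxes that are pruned with intersection-over-union (IoU) and non-maximum suppression (NMS). The sketch below shows the standard form of both operations on illustrative boxes; it is a generic illustration of the post-processing, not code from any reviewed study.

```python
import numpy as np

# Boxes are [x1, y1, x2, y2]; values below are illustrative.
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy NMS: keep the highest-scoring box, drop boxes that overlap
    it by more than the threshold, and repeat on the remainder."""
    order = np.argsort(scores)[::-1]
    keep = []
    while len(order) > 0:
        best = order[0]
        keep.append(int(best))
        order = np.array([i for i in order[1:]
                          if iou(boxes[best], boxes[i]) < iou_threshold])
    return keep

boxes = [[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # → [0, 2]: two distinct detections survive
```

The first two boxes overlap heavily (IoU ≈ 0.68), so only the higher-scoring one is kept, while the distant third box survives untouched.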
In the following, some aspects related to the types of UAVs used for traffic monitoring will be discussed. Depending on the construction of the flying mechanism, UAVs can be classified into fixed-wing, rotary-wing, and hybrid UAVs [57]. Fixed-wing UAVs were prevalent for traffic monitoring applications a few years ago [75], but today small rotary-wing UAVs are preferred [11] because they are low-cost and require less experience and training [64]. The first type has some advantages, e.g., increased flight endurance [53], faster travel and the ability to carry heavier payloads [51], and the ability to fly along linear distances [62], but such UAVs are larger in size and depend on an airfield for take-off [96]. Rotary-wing UAVs are lighter in weight, capable of vertical take-off and landing (VTOL), provide significant advantages in enclosed or constrained environments [43], and can hover and get very close to objectives [62], providing very high spatial resolution [64]. On the other hand, they have less mobility and consume more power [51]. A compromise between these types is the hybrid fixed/rotary-wing UAV, which can operate in both high-speed and low-speed flight modes, including hovering [80]. However, all the types presented are limited by climate factors, the presence of physical obstructions, and instrumental or legal factors [97]. More details, classifications, and characteristics of UAVs can be found in [50]. In the analyzed studies, the rotary-wing type predominates, especially quadcopter DJI drones (Phantom 2, Phantom 3, Phantom 4, Mavic Pro, Inspire 1 Pro, Matrice 100), the Argus-One quadcopter, and hexacopters (see Table 3). This is consistent with the findings of a study showing that rotary-wing UAVs are more efficient for use in urban environments [98].
Another important issue when using UAVs for traffic monitoring in urban environments is the safety of UAV operation over ground vehicles. The certification of UAV operation is regulated by the authorities of each state in order to avoid situations such as crashing into pedestrians or buildings, collision with other aircraft, or disturbance [87]. With the spread of drones and their types, the risk of accidents also increases [35], which is why strict rules are needed to control UAV operations and avoid their unsafe and unnecessary use. Cooperation between all authorities is of great importance to ensure the uniformity of regulations [3]. In some countries, UAVs may be operated only if the operator holds a certificate recognized by the Federal Aviation Administration (FAA) [99]. To ensure the safety of UAVs and reduce the risk of collisions, adequate separation from people, buildings, and traffic is sufficient in most cases. Thus, some countries have imposed on UAV operators well-defined minimum distances (e.g., 30 or 50 m) between drones and any person or structure [100]. However, this separation is difficult to achieve in urban environments, and different solutions have been proposed for this problem: an air tunnel designed for the movement of UAVs in areas with transport infrastructure facilities [101], risk maps that define the risk associated with accidents [102], and safe landing systems able to identify obstacles [103] or landing zones [104]. The various techniques of safe landing zone detection are reviewed in [96].
In most cases, a single UAV was used in the analyzed literature for urban traffic monitoring, capturing portions of roads [60,66,72,85], intersections (one intersection [64,74,75], two [67], five [92], or ten intersections [86]), roundabouts [61,69], a toll plaza area [88,89], and so on. A single paper presented a large-scale field experiment using observations taken by a swarm of 10 UAVs over a large congested area covering 1.3 km2 with around 100 busy intersections [65]. It is obvious that a collaborative formation of UAVs can provide faster, more effective, and more flexible monitoring [55]. A performance comparison of single- and multi-UAV systems is provided in [52]. Moreover, different solutions for traffic monitoring and management using multiple cooperative UAVs have been proposed [105,106]. However, multi-UAV cooperative systems also have limitations, such as 'blind' gaps, as can be seen in [76].
Low-altitude traffic management should also be taken into account when developing systems for monitoring traffic in urban areas, since the flight environment in these areas is becoming increasingly complex with the development of UAVs. Progress has been made in this regard through the development of public air route networks for UAVs [107] based on aerial corridor systems [108] and airspace geofencing volumization algorithms that support the management of unmanned aircraft in low-altitude airspace [109]. Another solution is a multilayer network of nodes and airways [110]. Nevertheless, these aspects are not discussed in the papers selected for this analysis, because those papers focus strictly on describing algorithms for traffic monitoring. As stated in [82], the requirements for real-time traffic management and control have generated broad attention in the field of traffic monitoring, and new frameworks for low-altitude UAV systems have been developed in many countries.
In urban traffic monitoring, appropriate spatial and temporal resolutions are required to capture the details of three-dimensional traffic. The spatial resolution determines the quality of an image: it is the smallest ground area that a single pixel can represent [111], and it is a key aspect in the determination of traffic flow parameters [66]. Since UAVs fly at low altitudes, they can achieve high spatial resolutions [64]. For instance, the reported spatial resolution is 10.5 cm in [61], 13 cm in [66], and 2 cm in [81]. Compared to satellite remote sensing, which can achieve resolutions of up to 0.3 m [112], UAVs provide ultra-high spatial resolution at the cm level. Research evaluating the impact of spatial resolution on the classification of vegetation types is provided in [113]. Another study, on post-fire mapping and vegetation recovery, highlights the advantages of UAV-based systems over satellite-collected imagery in terms of spatial and temporal resolution [114]. Temporal resolution is also of great importance for remote-sensing applications in urban environments because of their dynamic nature: traffic monitoring and analysis must be performed promptly and consistently. UAV platforms are efficient in this regard, providing appropriately high temporal resolutions [112,115]. However, the analyzed studies reported these parameters only to a very small extent.
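The cm-level figures quoted above follow directly from the standard ground sampling distance (GSD) relation for a nadir-pointing camera: GSD = (sensor width × flying height) / (focal length × image width). A minimal sketch follows; the camera parameters are illustrative values for a small-UAV camera, not taken from the cited studies:

```python
def ground_sampling_distance(sensor_width_mm: float, focal_length_mm: float,
                             altitude_m: float, image_width_px: int) -> float:
    """Ground sampling distance in cm/pixel for a nadir-pointing camera.

    GSD = (sensor width * flying height) / (focal length * image width),
    converted from metres to centimetres per pixel.
    """
    gsd_m = ((sensor_width_mm / 1000.0) * altitude_m
             / ((focal_length_mm / 1000.0) * image_width_px))
    return gsd_m * 100.0  # m/px -> cm/px

# Illustrative parameters: 13.2 mm sensor width, 8.8 mm focal length,
# 4K frame (3840 px wide), flying at 100 m.
print(round(ground_sampling_distance(13.2, 8.8, 100.0, 3840), 2))  # 3.91 (cm/px)
```

Note that the GSD scales linearly with altitude: halving the flying height to 50 m halves the GSD to about 2 cm/pixel, consistent with the finest resolutions reported above.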
Finally, some issues related to the laws and regulations of airspace management must be addressed. In light of the growing number of UAVs, countries around the world are trying to develop policies and means to control the operation of low-altitude aircraft, in order to ensure the safety of both the aircraft and the environment in which they fly. The main regulations concern the controlled use of airspace, operational limitations, and administrative procedures [10]. The management of low-altitude UAVs falls into three categories: registering flight activity, limiting the maximum flight height for different types of UAVs, and setting the areas for different flight activities [107]. As an example, the Federal Aviation Administration (FAA) of the USA specifies a maximum allowable altitude of 400 ft (about 122 m) above the ground [116]. The same maximum height is stated by the European Aviation Safety Agency (EASA) [117]. A review of the maximum flying height in different countries is provided in [56]. Regarding the zones where UAVs are allowed to fly, each country defines its own legal requirements. For instance, in Belgium and the Netherlands the use of UAVs above crowds of people or urban areas is restricted, and in Canada UAVs must not approach closer than 120 m to people, animals, buildings, or vehicles [118]. In Romania, the Romanian Civil Aviation Authority (AACR) requires a safety distance of 500 m from buildings, people, vehicles, and animals [119]; moreover, taking pictures or videos in this country requires pre-approval from the Ministry of Defense. Other EU regulations and requirements regarding policies and authorizations can be found in [120].
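As a sanity check on the figures above, the FAA/EASA ceiling (400 ft ≈ 121.92 m) and a separation limit such as the Romanian 500 m rule can be expressed as a simple pre-flight compliance test. The sketch below is only an illustration of how such limits combine, not a legal or operational tool; the function name and the use of a single nearest obstacle are assumptions:

```python
import math

MAX_ALTITUDE_M = 400 * 0.3048   # FAA/EASA ceiling: 400 ft = 121.92 m
MIN_SEPARATION_M = 500.0        # AACR (Romania) separation distance

def flight_is_compliant(altitude_m: float,
                        uav_xy: tuple,
                        obstacle_xy: tuple) -> bool:
    """True if the UAV is under the altitude ceiling and at least
    MIN_SEPARATION_M away (horizontally) from the nearest person,
    vehicle, building, or animal."""
    separation = math.dist(uav_xy, obstacle_xy)
    return altitude_m <= MAX_ALTITUDE_M and separation >= MIN_SEPARATION_M

print(flight_is_compliant(100.0, (0.0, 0.0), (600.0, 0.0)))  # compliant
print(flight_is_compliant(150.0, (0.0, 0.0), (600.0, 0.0)))  # above the ceiling
```

In practice the relevant thresholds depend on the jurisdiction and the operation category, so such constants would be configuration, not code.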
Given these regulations, authorities around the world are trying to find solutions to implement a secure UAV operating environment. At the EU level, a report states that there is a need to develop and validate UAV capabilities in certain key areas, such as urban air mobility, air traffic management, and advanced services and technologies [121]. Since the airspace will become very crowded as multi-purpose UAV applications are developed, policies and regulations for the safe, reliable, and efficient use of these flying vehicles must be implemented; this can be achieved by digitally sharing flight details in traffic management systems with minimal human intervention [10]. It has become obvious that a common system is needed to control the flight of UAVs and large aircraft, referred to by specialists as 'low-altitude airspace management (LAAM)' systems [37]. Moreover, future smart cities must provide the necessary infrastructure for UAV-to-vehicle (UAV-2-V) communication [122], which ultimately has to be adopted in the vehicle-to-everything (V2X) space, raising serious issues of data security [85] and of processing large volumes of data. A new approach to these issues, a blockchain-based solution for unmanned traffic management, is proposed in [123]. Beyond the current limitations and barriers of UAVs (such as reduced flight time, legal issues, lack of acceptance, and economic barriers [36]), solutions must be found for optimal route planning, the development of computer vision systems, infrastructure for processing large data sets [124], and UAV positioning algorithms [125]. Some future research directions are provided in [11,12,35,56,57].
6. Conclusions
In this work, we provided an overview of UAV applications for traffic monitoring and analysis. The main conclusions that can be drawn are the following:
The use of drones for traffic monitoring has grown steadily in recent years, with a significant increase in the last three years.
China leads both in the number of applications in this field and as the source of data acquisition equipment (i.e., UAV models).
In terms of flying mechanism construction, rotary-wing UAVs, especially quadcopters, were preferred for data collection.
Various image processing methods have been proposed for vehicle detection and tracking, but deep learning approaches have been preferred in recent years.
Most of the identified studies address vehicle detection and tracking techniques, but also vehicle trajectory extraction and collision evaluation or prediction.
There is a vast literature on the use of drones in various fields, but there is still much to add on traffic monitoring. This article is part of a series aiming to help researchers and practitioners who contribute to this field.
For future work, we plan to expand the investigation, include more studies to complement the current ones, and analyze in more detail every aspect related to the use of drones in the transportation field. Naturally, this article has limitations, which will be addressed in a future paper.
Author Contributions: Conceptualization, R.G.B. and E.V.B.; methodology, R.G.B.; software, E.V.B.; validation, E.V.B.; formal analysis, R.G.B.; data curation, E.V.B.; writing—original draft preparation, R.G.B.; writing—review and editing, R.G.B.; visualization, E.V.B.; supervision, E.V.B. All authors have read and agreed to the published version of the manuscript.
Funding: This research received no external funding.
Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Summary of information related to data acquisition and analysis.
Author | UAV Type | Camera Resolution, fps | Flying Height | Video Dataset Duration | Software Techniques | Vehicle Type | Urban Area | Measures |
---|---|---|---|---|---|---|---|---|
Ahmed et al., 2021 [ | DJI Phantom 3 | 4K, NR | NR | 15 min | manual extraction, speed-density model—least-squares method (LSM) | cars, motorbikes, rickshaws, loading pickups, buses, trucks | University Road in Karachi—100 ft long, four marked lanes | traffic flow, traffic density, average speed, longitudinal and lateral gap |
Apeltauer et al., 2015 [ | NR | 1920 × 980, 29 fps | 100 m | NR | Viola and Jones's AdaBoost algorithm, sequential particle filter | vehicles | roundabout junction of Hamerska road and Lipenska road near Olomouc, Czech Republic | relative number of missed targets, relative number of false tracks, average number of swaps in tracks, temporal average of measure of completeness, spatial precision |
Balamuralidhar et al., 2021 [ | DJI Phantom 3 | 3269 × 720, 30 fps | 50 m | NR | CSPDarkNet53 backbone, ENet segmentation head, YOLO v4, Minimum Output Sum of Squared Error (MOSSE) algorithm, Ground Sampling Distance (GSD) | vehicles | NR | performance of vehicle detection and vehicle tracking algorithms, speed estimation, inference on Jetson Xavier NX |
Barmpounakis et al., 2018 [ | hexacopter | 4K, 30 fps | NR | NR | manual or semi-automatic extraction, frame-by-frame analysis, machine learning—meta-optimized Decision Trees | motorcycles, scooters, cars, and heavy vehicles | National Technical University of Athens campus—arterial with three lanes per direction | the type of each vehicle, the lane each vehicle is moving in, speeds of all vehicles present, accelerations of all vehicles present, spatial distances between vehicles, duration between each state, and general information for the PTW driver |
Barmpounakis et al., 2019 [ | hexacopter | 4K, 30 fps | 70 m | 15 min | positive and negative vehicle 'detectors' on image content, matching the peaks of probability | 140 vehicles, 23 pedestrians | four-legged intersection in the National Technical University of Athens campus | trajectory, speed |
Barmpounakis and Geroliminis 2020 [ | DJI Phantom 4 Advanced | 4K, 25 fps | NR | 59 h | virtual loop detectors (gates) used to calculate several traffic variables and extract valuable information | cars, taxis, motorcycles, buses, heavy vehicles | a congested 1.3 km² area with more than 100 km-lanes of road network and around 100 busy intersections | arterial travel time, congestion propagation, lane changing |
Brkić et al., 2020 [ | NR | 4096 × 2160, 24 fps | 50 m | 13:52 min | deep learning object detection—Faster R-CNN with ResNet50 backbone network | vehicles | 500 m long section of the Zagreb bypass motorway | traffic flow rate, speed estimation, traffic flow density, distance headways and gaps, time headways and gaps |
Chen et al., 2019 [ | NR | 1920 × 1080, NR | 100 m | NR | Viola-Jones (V-J) and linear SVM classifier with HOG features (HOG + SVM), Kanade-Lucas-Tomasi (KLT) feature tracker, image processing system, surrogate safety measures (SSMs) | pedestrians and right-turning vehicles | two urban intersections in Beijing, China | vehicle turning path, turning speed, gap acceptance model and pedestrian behavior model, post-encroachment time (PET) |
Chen et al., 2021 [ | DJI Mavic Pro | 3840 × 2160, 25 fps | 223, 281 m | 22, 43 s | Canny-based ensemble detector, kernelized correlation filter (KCF), wavelet transform | vehicles | two urban expressway sections in Nanjing, China | root-mean-square error (RMSE), mean squared deviation (MSD), Pearson product-moment correlation coefficient (Pearson's r) |
Guido et al., 2016 [ | UAV with eight propellers | 4K, 23 fps | 60 m | 19, 21 min; 11, 5 min | identification of pixels associated with the objects of interest, Haar classifier, video stabilization, conversion to grayscale and Gaussian-blurring filter, vehicle trajectory extraction—Haar classifier, ROI | vehicles | a large urban roundabout at the intersection of the "Asse Viario" with De Gasperi road | normalized root mean square error in positioning, speed profile, root mean square percentage error in speed evaluation |
Javadi et al., 2021 [ | NR | 3840 × 2160, NR | NR | NR | YOLO v3 + DarkNet-53, SqueezeNet, MobileNet-v2, and DenseNet-201 + 3D depth maps, Levenberg-Marquardt algorithm, fcNN | trucks, semi-trailers, and trailers | two industrial harbors | average precision, performance evaluation |
Kang and Mattyus 2015 [ | NR | 5616 × 3744, NR | NR | NR | Integral Channel Features (ICF), HOG features, AdaBoost classifier in Soft Cascade structure | cars and trucks | area of Munich, Germany | orientation estimation, type classification, baseline comparison, computation time |
Kaufmann et al., 2018 [ | DJI Inspire 1—a small-scale quadcopter | 4K, 25 fps | 100 m | 14.5 min | Levenberg-Marquardt optimization, moving linear regression (MLR), supervised tracking method for vehicle trajectories | vehicles | the street "Völklinger Straße" in Düsseldorf—600 m long, starting with two lanes, broadening to three lanes at location 500 m and to four lanes at location 530 m | speed, location, trajectories, lane changes per minute |
Ke et al., 2017 [ | NR | 960 × 540, 24 fps | NR | 1.17 min | vehicle tracking (Shi-Tomasi features, Kanade-Lucas optical flow algorithm), motion-vector clustering (k-means algorithm), connected-graph method to detect clusters | vehicles | six lanes of traffic moving in two directions | speed, density, volume |
Khan et al., 2017 [ | Argus-One (from ArgusVision) | 4K, 25 fps | 80, 60 m | 14 min | optical flow tracking (Lucas-Kanade algorithm), background subtraction threshold, blob analysis, computer-vision vehicle extraction | vehicles | four-leg urban intersection near the city of Sint-Truiden in Belgium | trajectory, speed profile, space-time trajectories |
Khan et al., 2018 [ | Argus-One (from Argus-Vision) | 4K, 25 fps | 80, 60 m | 10–12 min | optical flow tracking, blob analysis, Kalman filter | vehicles | a four-legged suburban signalized intersection in Sint-Truiden, Belgium | trajectory, speed profile |
Khan et al., 2020 [ | NR | NR | NR | NR | real-time detection of speeding and other traffic violations | vehicles | Saudi Arabia | speeding and other traffic safety violations on highways and roads |
Kujawski and Dudek 2021 [ | NR | 720p, 60 fps | NR | 8 h | image processing—blob detection | vehicles in/out | city of Szczecin in Poland—two lanes of traffic each from and to the city | number of cars per hour on a holiday and a workday |
Li et al., 2019 [ | DJI Matrice 100 | 1280 × 960, NR | 80 m | NR | YOLO v3, tracking-by-detection method, Kalman filter, Hungarian algorithm, homography-based motion compensation, optical flow—RANSAC, adaptive vehicle speed estimation | vehicles | an intersection, country road, parking entrance, highways, and crossroads | vehicle speed estimation, velocity measurement |
Li et al., 2020 [ | DJI Matrice 100 | NR | NR | NR | CNN + SSD—scale-specific prediction-based single-shot detector (SSP-SSD), ResNet-101, redundant detection removal—Outlier-Aware Non-Maximum Suppression (OA-NMS), comparison with SSD, Cascade R-CNN, Faster R-CNN, YOLOv3, YOLOv4, YOLOv5(x), FCOS, RetinaNet, and CenterNet | vehicles—small, medium, large | dataset containing 312,071 vehicles | performance evaluation—precision, recall rate, F1-score, average precision |
Liu and Zhang 2021 [ | NR | NR | NR | NR | YOLO v4, DeepSORT (KF prediction), trajectory estimation—eight-dimensional space, high-precision positioning—interacting multiple model particle filter (IMM-PF); IMM-PF, CV-EKF, IMM-EKF comparison | cars, buses, trucks, and vans | dataset containing | position, normalized distance, model probabilities |
Luo et al., 2020 [ | small UAV similar to the SkyProwler | 640 × 360, 570 × 640 | NR | NR | blob detection, classifier, dot-product and radial basis function (RBF) kernels, tracking-by-detection, crash decision | vehicles | environment including city, suburban, and rural areas | vehicle trajectory |
Moranduzzo and Melgani 2014 [ | hexacopter | 5184 × 3456, NR | NR | NR | feature extraction based on the scale-invariant feature transform (SIFT), classification by means of a support vector machine (SVM) classifier, grouping of the key points belonging to the same car | vehicles | NR | accuracy |
Shan et al., 2021 [ | DJI Phantom 4 Pro | 3840 × 2160, 25 fps | 150–350 m | NR | pre-processing, YOLO v3, Deep SORT algorithm | vehicles | 1 km of the Xi'an Ring Expressway—upstream of the ZHANGBA interchange exit | precision of vehicle detection, precision of extracted speed |
Wan et al., 2019 [ | NR | NR | NR | NR | joint dictionary, L2 regularization based on temporal consistency, Markov Random Field (MRF)-based binary support vector, particle filter framework with a dynamic template update scheme; comparison with 9 state-of-the-art visual tracking algorithms, including IV, L1, PCOM, CT, MTT, WMIL, OFDS, STC, and CNT | vehicles, pedestrians | UAV videos | precision and success plots, time complexity, execution time |
Wang et al., 2016a [ | NR | NR | 80–90 m | 4 h | Shi-Tomasi features, optical flow (Kanade-Lucas algorithm), prediction method—bivariate extreme value theory (EVT) | vehicles | ten urban signalized intersections in Fengxian District, Shanghai | time-to-accident (TA), post-encroachment time (PET), minimum time-to-collision (mTTC), and maximum deceleration rate (MaxD) |
Wang et al., 2016b [ | MD3-1000 by the German company Microdrones | NR | 42.6 m | NR | calculation of start-wave velocity at signalized intersections | large, medium, and small vehicles | straight lanes at the intersection of Cao-an Highway and North Jia-song Road in Shanghai | speed, density of traffic flow |
Wang et al., 2019 [ | DJI Phantom 4 Pro | 2720 × 1530, 30 fps | 60–150 m | NR | YOLOv3, Kalman-filter motion estimation integrated with deep appearance features | vehicles | NR | true positive (TP), false positive (FP), true negative (TN), false negative (FN), identification precision (IDP), identification recall (IDR), F1 score, multiple-object tracking accuracy (MOTA), mostly tracked (MT), mostly lost (ML), and identity switching (IDSW) |
Wang et al., 2019 [ | DJI Phantom 2 | 1920 × 1080, 30 fps | 100–150 m | NR | image registration, image feature extraction—edges (Prewitt edge detection), optical flow (Lucas-Kanade operator), local feature points (SIFT), vehicle detection—shape detection, vehicle tracking—optical flow, matched local feature points | vehicles | the north part of the 5th Ring Road in Beijing, China | correctness, completeness, and quality of vehicle detection, number of vehicles, error rate |
Xing et al., 2020a [ | NR | 4K, 30 fps | NR | NR | two time-varying mixed logit models, the time-varying random effects logistic regression (T-RELR) model and the time-varying random parameters logistic regression (T-RPLR) model, developed to examine the time-varying effects of influencing factors on vehicle collision risk | cars, buses, and trucks | a toll plaza area on the G42 freeway in Nanjing, China—12 toll collection lanes in the east-west direction (north side) and 6 toll collection lanes in the west-east direction (south side) | model performance, time-varying logistic regression model, TTC |
Xing et al., 2020b [ | NR | 4K, 30 fps | NR | 50 min | logistic regression model vs. K-Nearest Neighbor (KNN), Artificial Neural Networks (ANN), Support Vector Machines (SVM), Decision Trees (DT), and Random Forest (RF) | vehicles | a toll plaza area on the G42 freeway in Nanjing, China | surrogate safety measure (SSM)—extended TTC, model performance |
Xu et al., 2016 [ | DJI Phantom 2 | 1920 × 1080, 24 fps | NR | 10 min | Viola-Jones, linear support vector machine (SVM) + histogram of oriented gradients (HOG) features, comparison with 9 other methods | vehicles | NR | detection speed (f/s), correctness, completeness, and quality |
Zhu et al., 2018a [ | DJI Inspire 1 Pro | 4K (3840 × 2178), 30 fps | NR | 2 min 47 s | Retina object detector (RetinaNet), detection association, trajectory modeling and extraction, semi-supervised nearest | cars, buses, and trucks | busy road intersection of a modern megacity | trajectory, tracking speed, vehicle behavior recognition |
Zhu et al., 2018b [ | DJI Inspire 1 Pro | 4K, 30 fps | NR | 56 min 39 s | deep learning (enhanced single-shot multibox detector), support vector machine; comparison with SSD, Faster R-CNN (FRC), and YOLO | cars, buses, trucks | five key road intersections in Shenzhen | vehicle counting, counting accuracy |
Note: NR—not reported, fps—frames per second.
Synthesis of the results related to the main purpose of the studies.
Application Field | Main Purpose | Paper |
---|---|---|
Traffic analysis | congestion analysis | [ |
| crash prediction | [ |
| vehicle collision detection | [ |
| driving behavior modeling | [ |
| vehicle trajectories extraction | [ |
| traffic parameters extraction | [ |
| moving synchronized flow patterns observation | [ |
| traffic flow parameter estimation | [ |
| traffic density estimation | [ |
| traffic information collection | [ |
| unconventional overtaking decisions identification | [ |
| vehicle behavior recognition | [ |
| vehicle collision risk evaluation | [ |
| vehicle-pedestrian conflicts evaluation | [ |
Traffic monitoring | smart monitoring system | [ |
| traffic streams recording | [ |
| vehicle detection | [ |
| vehicle detection and tracking | [ |
| vehicle tracking | [ |
Appendix A
Summary of the purpose, results and future work for the analyzed studies.
Author | Paper Aim | Findings | Future Work |
---|---|---|---|
Ahmed et al., 2021 [ | The utilization of a UAV-based geospatial analysis technique for accurate extraction of longitudinal and lateral distances between vehicles to determine the relationship between macroscopic and microscopic parameters of traffic flow. | | |
Apeltauer et al., 2015 [ | A new approach for simultaneous detection and tracking of vehicles moving through an intersection in aerial images acquired by an unmanned aerial vehicle (UAV). | | |
Balamuralidhar et al., 2021 [ | Presentation of a traffic monitoring system that can detect, track, and estimate the velocity of vehicles in a sequence of aerial images. The solution has been optimized to execute these tasks in real time on an embedded computer installed on an unmanned aerial vehicle (UAV). | | |
Barmpounakis et al., 2018 [ | Addressing PTW (Powered Two-Wheeler) overtaking phenomena using a two-step modelling approach based on optimized and meta-optimized decision trees. | | |
Barmpounakis et al., 2019 [ | Examination of the potential of using UAVs as part of the ITS infrastructure as a way of extracting naturalistic trajectory data from aerial video footage of a low-volume four-way intersection and a pedestrian passage. | | |
Barmpounakis and Geroliminis 2020 [ | Recording traffic streams in a multi-modal congested environment over an urban setting using UAS, allowing the deep investigation of critical traffic phenomena. | | |
Brkić et al., 2020 [ | Proposing a new, low-cost framework for the determination of highly accurate traffic flow parameters. | | |
Chen et al., 2019 [ | Assessing how simulation can be utilized for vehicle-pedestrian conflict assessment at crosswalks. Empirical models have been established to represent the stochastic behaviour of right-turning vehicles and pedestrians under different geometric layouts and operational conditions at signalized intersections. | | |
Chen et al., 2021 [ | Proposing a novel methodological framework for automatic and accurate vehicle trajectory extraction from aerial videos. | | |
Guido et al., 2016 [ | Presenting a methodology to extract vehicle trajectories and speeds from unmanned aerial vehicle (UAV) video processing. | | |
Javadi et al., 2021 [ | Investigating the ability of three-dimensional (3D) feature maps to improve the performance of deep neural networks (DNNs) for vehicle detection. First, a DNN based on YOLOv3 with various base networks, including DarkNet-53, SqueezeNet, MobileNet-v2, and DenseNet-201, is proposed. | | |
Kang and Mattyus 2015 [ | Presenting a method which can detect vehicles with orientation and type information in aerial images within a few seconds on large images. The application of Integral Channel Features in a Soft Cascade structure results in both good detection performance and fast speed. | | |
Kaufmann et al., 2018 [ | Showing a suitable way to perform spatiotemporal measurements of city traffic using aerial observations with an unmanned aerial vehicle. | | |
Ke et al., 2017 [ | Proposing a novel framework for real-time traffic flow parameter estimation from aerial videos. The proposed system identifies the directions of traffic streams and extracts the traffic flow parameters of each stream separately. | | |
Khan et al., 2017 [ | Processing and analysis of UAV-acquired traffic footage. A detailed methodological framework for automated UAV video processing is proposed to extract the trajectories of multiple vehicles at a particular road segment. | | |
Khan et al., 2018 [ | Analysis of vehicle trajectories acquired via small rotary-wing UAV footage. The experimental data used to analyse traffic flow conditions at a signalized intersection were obtained in the city of Sint-Truiden, Belgium. | | |
Khan et al., 2020 [ | Proposing a smart traffic surveillance system based on an unmanned aerial vehicle (UAV) using 5G technology. | | |
Kujawski and Dudek 2021 [ | Presenting methods for data acquisition from cameras mounted on unmanned aerial vehicles (UAVs) and their further analysis, which may be used to improve urban transportation systems and their sustainability. The analysed data concern the situation of urban transport at intersection points of national and local roads. | | |
Li et al., 2019 [ | Proposing a novel adaptive framework for multi-vehicle ground speed estimation in airborne videos. | | |
Li et al., 2020 [ | Proposing a robust vehicle detection model for aerial images. First, image pre-processing was performed to deal with the IoU distribution imbalance problem and greatly improve the recall rate. Then, SSP-SSD was proposed to enhance the feature representation of vehicles at different scales and improve the precision. | | |
Liu and Zhang 2021 [ | Fusing the target detection network YOLO v4 with the detection-based multitarget tracking algorithm DeepSORT, a deep learning method for automatic vehicle detection and tracking in urban environments has been designed. | | NR |
Luo et al., 2020 [ | Proposing a traffic collision early warning scheme aided by a small unmanned aerial vehicle (UAV) companion. Essentially, it is a vision-based driver assistance system; the difference from the available schemes is that the camera flies along with the host vehicle. | | |
Moranduzzo and Melgani 2014 [ | Presenting a solution to the car detection and counting problem in images acquired by means of unmanned aerial vehicles (UAVs). | | |
Shan et al., 2021 [ | Proposing a systematic approach to detect and track vehicles based on the YOLO v3 model and the Deep SORT algorithm for further extraction of key traffic parameters. | | |
Wan et al., 2019 [ | Proposing a computer vision-based target tracking algorithm aimed at locating UAV-captured targets, such as pedestrians and vehicles, using sparse representation theory. | | |
Wang et al., 2016a [ | Presenting a crash prediction method based on a bivariate extreme value theory (EVT) framework and UAV trajectory data processing. | | |
Wang et al., 2016b [ | Proposing an improved start-wave velocity model, in which the speed and density of traffic flow are converted into vehicle space headway, vehicle length, and other auxiliary parameters that can be recognized from aerial video or by other means. | | |
Wang et al., 2019 [ | Proposing a deep-learning framework for vehicle detection and tracking from UAV videos for monitoring traffic flow in complex road structures. The approach is designed to be invariant to significant orientation and scale variations in the videos. The detection procedure is performed by fine-tuning a state-of-the-art object detector, You Only Look Once (YOLOv3), using several custom-labelled traffic datasets. | | |
Wang et al., 2019 [ | Introducing a new vehicle detection and tracking system based on image data collected by UAV. The system uses consecutive frames to generate vehicles' dynamic information, such as positions and velocities. | | |
Xing et al., 2020a [ | Investigating the traffic conflict risks at the upstream approach of a toll plaza during the vehicles' diverging period, from the time of arrival at the diverging area to that of entering the tollbooths. Based on the vehicle trajectory data extracted from unmanned aerial vehicle (UAV) videos using an automated video analysis system, vehicles' collision risk is computed by extended time to collision (TTC). | | |
Xing et al., 2020b [ | Developing the logistic regression (LR) model and five typical non-parametric models, including K-Nearest Neighbour (KNN), Artificial Neural Networks (ANN), Support Vector Machines (SVM), Decision Trees (DT), and Random Forest (RF), to examine the relationship between influencing factors and vehicle collision risk. | | |
Xu et al., 2016 [ | Proposing a new hybrid vehicle detection scheme which integrates the Viola-Jones (V-J) and linear SVM classifier with HOG feature (HOG + SVM) methods for vehicle detection from low-altitude unmanned aerial vehicle (UAV) images. | | |
Zhu et al., 2018a [ | Presenting an all-in-one behaviour recognition framework for moving vehicles based on the latest deep learning techniques. | | |
Zhu et al., 2018b [ | Presenting an advanced urban traffic density estimation solution using the latest deep learning techniques to intelligently process ultrahigh-resolution traffic videos taken from an unmanned aerial vehicle (UAV). | | |
References
1. Azar, A.; Koubaa, A.; Mohamed, N.A.; Ibrahim, H.; Ibrahim, Z.; Kazim, M.; Ammar, A.; Benjdira, B.; Khamis, A.; Hameed, I. et al. Drone Deep Reinforcement Learning: A Review. Electronics; 2021; 10, 999. [DOI: https://dx.doi.org/10.3390/electronics10090999]
2. Elkhrachy, I. Accuracy Assessment of Low-Cost Unmanned Aerial Vehicle (UAV) Photogrammetry. Alex. Eng. J.; 2021; 60, pp. 5579-5590. [DOI: https://dx.doi.org/10.1016/j.aej.2021.04.011]
3. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat.; 2014; 6, pp. 1-15. [DOI: https://dx.doi.org/10.1007/s12518-013-0120-x]
4. de Silva, I. Geomatics Applied to Civil Engineering State of the Art. Applications of Geomatics in Civil Engineering; Ghosh, J.; de Silva, I. Springer: Singapore, 2020; pp. 31-46.
5. Giordan, D.; Adams, M.S.; Aicardi, I.; Alicandro, M.; Allasia, P.; Baldo, M.; De Berardinis, P.; Dominici, D.; Godone, D.; Hobbs, P. et al. The use of unmanned aerial vehicles (UAVs) for engineering geology applications. Bull. Eng. Geol. Environ.; 2020; 79, pp. 3437-3481. [DOI: https://dx.doi.org/10.1007/s10064-020-01766-2]
6. Alioua, A.; Djeghri, H.-e.; Cherif, M.E.T.; Senouci, S.-M.; Sedjelmaci, H. UAVs for traffic monitoring: A sequential game-based computation offloading/sharing approach. Comput. Netw.; 2020; 177, 107273. [DOI: https://dx.doi.org/10.1016/j.comnet.2020.107273]
7. Cummings, A.R.; Cummings, G.R.; Hamer, E.; Moses, P.; Norman, Z.; Captain, V.; Bento, R.; Butler, K. Developing a UAV-based monitoring program with indigenous peoples. J. Unmanned Veh. Syst.; 2017; 5, pp. 115-125. [DOI: https://dx.doi.org/10.1139/juvs-2016-0022]
8. Chamoso, P.; Raveane, W.; Parra, V.; González, A. UAVs Applied to the Counting and Monitoring of Animals. Ambient Intelligence-Software and Applications; Ramos, C.; Novais, P.; Nihan, C.; Corchado Rodríguez, J. Springer: Cham, Switzerland, 2013; pp. 71-80.
9. Kumar, A.; Sharma, K.; Singh, H.; Naugriya, S.G.; Gill, S.S.; Buyya, R. A drone-based networked system and methods for combating coronavirus disease (COVID-19) pandemic. Futur. Gener. Comput. Syst.; 2021; 115, pp. 1-19. [DOI: https://dx.doi.org/10.1016/j.future.2020.08.046]
10. Gupta, A.; Afrin, T.; Scully, E.; Yodo, N. Advances of UAVs toward Future Transportation: The State-of-the-Art, Challenges, and Opportunities. Futur. Transp.; 2021; 1, 19. [DOI: https://dx.doi.org/10.3390/futuretransp1020019]
11. Outay, F.; Mengash, H.A.; Adnan, M. Applications of unmanned aerial vehicle (UAV) in road safety, traffic and highway infrastructure management: Recent advances and challenges. Transp. Res. Part A Policy Pract.; 2020; 141, pp. 116-129. [DOI: https://dx.doi.org/10.1016/j.tra.2020.09.018]
12. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access; 2019; 7, pp. 48572-48634. [DOI: https://dx.doi.org/10.1109/ACCESS.2019.2909530]
13. Elloumi, M.; Dhaou, R.; Escrig, B.; Idoudi, H.; Saidane, L.A. Monitoring road traffic with a UAV-based system. Proceedings of the 2018 IEEE Wireless Communications and Networking Conference (WCNC); Barcelona, Spain, 15–18 April 2018; pp. 1-6.
14. Degas, A.; Kaddoum, E.; Gleizes, M.-P.; Adreit, F.; Rantrua, A. Cooperative multi-agent model for collision avoidance applied to air traffic management. Eng. Appl. Artif. Intell.; 2021; 102, 104286. [DOI: https://dx.doi.org/10.1016/j.engappai.2021.104286]
15. Villa, T.F.; Jayaratne, E.R.; Gonzalez, L.F.; Morawska, L. Determination of the vertical profile of particle number concentration adjacent to a motorway using an unmanned aerial vehicle. Environ. Pollut.; 2017; 230, pp. 134-142. [DOI: https://dx.doi.org/10.1016/j.envpol.2017.06.033]
16. Naughton, J.; McDonald, W. Evaluating the Variability of Urban Land Surface Temperatures Using Drone Observations. Remote Sens.; 2019; 11, 1722. [DOI: https://dx.doi.org/10.3390/rs11141722]
17. Yalcin, E. Two-dimensional hydrodynamic modelling for urban flood risk assessment using unmanned aerial vehicle imagery: A case study of Kirsehir, Turkey. J. Flood Risk Manag.; 2018; 12, e12499. [DOI: https://dx.doi.org/10.1111/jfr3.12499]
18. De Vivo, F.; Battipede, M.; Johnson, E. Infra-red line camera data-driven edge detector in UAV forest fire monitoring. Aerosp. Sci. Technol.; 2021; 111, 106574. [DOI: https://dx.doi.org/10.1016/j.ast.2021.106574]
19. Biçici, S.; Zeybek, M. An approach for the automated extraction of road surface distress from a UAV-derived point cloud. Autom. Constr.; 2021; 122, 103475. [DOI: https://dx.doi.org/10.1016/j.autcon.2020.103475]
20. Agarwal, A.; Kumar, S.; Singh, D. Development of Neural Network Based Adaptive Change Detection Technique for Land Terrain Monitoring with Satellite and Drone Images. Def. Sci. J.; 2019; 69, pp. 474-480. [DOI: https://dx.doi.org/10.14429/dsj.69.14954]
21. Sutheerakul, C.; Kronprasert, N.; Kaewmoracharoen, M.; Pichayapan, P. Application of Unmanned Aerial Vehicles to Pedestrian Traffic Monitoring and Management for Shopping Streets. Transp. Res. Procedia; 2017; 25, pp. 1717-1734. [DOI: https://dx.doi.org/10.1016/j.trpro.2017.05.131]
22. Zhu, J.; Chen, S.; Tu, W.; Sun, K. Tracking and Simulating Pedestrian Movements at Intersections Using Unmanned Aerial Vehicles. Remote Sens.; 2019; 11, 925. [DOI: https://dx.doi.org/10.3390/rs11080925]
23. Sahil; Sood, S.K. Fog-Cloud centric IoT-based cyber physical framework for panic oriented disaster evacuation in smart cities. Earth Sci. Inf.; 2020; pp. 1-22. [DOI: https://dx.doi.org/10.1007/s12145-020-00481-6]
24. Ranquist, E.; Steiner, M.; Argrow, B. Exploring the Range of Weather Impacts on UAS Operations. Proceedings of the 18th Conference on Aviation, Range and Aerospace Meteorology; Seattle, WA, USA, 22–26 January 2017.
25. Vanegas Alvarez, F.; Gonzalez, L. Enabling UAV Navigation with Sensor and Environmental Uncertainty in Cluttered and GPS-Denied Environments. Sensors; 2016; 16, 666. [DOI: https://dx.doi.org/10.3390/s16050666] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27171096]
26. Pascua, D.A.; Abellanosa, C.; Lugpatan, R. Position Estimation using Inertial Measurement Unit (IMU) on a Quadcopter in an Enclosed Environment. Int. J. Comput. Commun. Instrum. Eng.; 2016; 2, 332. [DOI: https://dx.doi.org/10.15242/IJCCIE.AE0516306]
27. Saboor, A.; Coene, S.; Vinogradov, E.; Tanghe, E.; Joseph, W.; Pollin, S. Elevating the future of mobility: UAV-enabled Intelligent Transportation Systems. arXiv; 2021; arXiv: 2110.09934 [DOI: https://dx.doi.org/10.36227/techrxiv.16826743.v1]
28. Liu, X.; Zhang, Z. A Vision-Based Target Detection, Tracking, and Positioning Algorithm for Unmanned Aerial Vehicle. Wirel. Commun. Mob. Comput.; 2021; 2021, 5565589. [DOI: https://dx.doi.org/10.1155/2021/5565589]
29. Utomo, W.; Bhaskara, P.W.; Kurniawan, A.; Juniastuti, S.; Yuniarno, E.M. Traffic Congestion Detection Using Fixed-Wing Unmanned Aerial Vehicle (UAV) Video Streaming Based on Deep Learning. Proceedings of the 2020 International Conference on Computer Engineering, Network and Intelligent Multimedia (CENIM 2020); Surabaya, Indonesia, 13–18 November 2020.
30. Zhang, H.; Liptrott, M.; Bessis, N.; Cheng, J. Real-Time Traffic Analysis using Deep Learning Techniques and UAV based Video. Proceedings of the 2019 16th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS); Taipei, Taiwan, 18–21 September 2019; pp. 1-5.
31. Kumar, A.; Krishnamurthi, R.; Nayyar, A.; Luhach, A.K.; Khan, M.S.; Singh, A. A novel Software-Defined Drone Network (SDDN)-based collision avoidance strategies for on-road traffic monitoring and management. Veh. Commun.; 2021; 28, 100313. [DOI: https://dx.doi.org/10.1016/j.vehcom.2020.100313]
32. Chen, P.; Zeng, W.; Yu, G.; Wang, Y. Surrogate Safety Analysis of Pedestrian-Vehicle Conflict at Intersections Using Unmanned Aerial Vehicle Videos. J. Adv. Transp.; 2017; 2017, pp. 1-12. [DOI: https://dx.doi.org/10.1155/2017/5202150]
33. Khan, M.A.; Ectors, W.; Bellemans, T.; Ruichek, Y.; Yasar, A.-U.-H.; Janssens, D.; Wets, G. Unmanned Aerial Vehicle-based Traffic Analysis: A Case Study to Analyze Traffic Streams at Urban Roundabouts. Procedia Comput. Sci.; 2018; 130, pp. 636-643. [DOI: https://dx.doi.org/10.1016/j.procs.2018.04.114]
34. Hadiwardoyo, S.A.; Hernández-Orallo, E.; Calafate, C.T.; Cano, J.C.; Manzoni, P. Experimental characterization of UAV-to-car communications. Comput. Netw.; 2018; 136, pp. 105-118. [DOI: https://dx.doi.org/10.1016/j.comnet.2018.03.002]
35. Chung, S.H.; Sah, B.; Lee, J. Optimization for drone and drone-truck combined operations: A review of the state of the art and future directions. Comput. Oper. Res.; 2020; 123, 105004. [DOI: https://dx.doi.org/10.1016/j.cor.2020.105004]
36. Kellermann, R.; Biehle, T.; Fischer, L. Drones for parcel and passenger transportation: A literature review. Transp. Res. Interdiscip. Perspect.; 2020; 4, 100088. [DOI: https://dx.doi.org/10.1016/j.trip.2019.100088]
37. Merkert, R.; Bushell, J. Managing the drone revolution: A systematic literature review into the current use of airborne drones and future strategic directions for their effective control. J. Air Transp. Manag.; 2020; 89, 101929. [DOI: https://dx.doi.org/10.1016/j.jairtraman.2020.101929]
38. Otto, A.; Agatz, N.; Campbell, J.; Golden, B.; Pesch, E. Optimization approaches for civil applications of unmanned aerial vehicles (UAVs) or aerial drones: A survey. Networks; 2018; 72, pp. 1-48. [DOI: https://dx.doi.org/10.1002/net.21818]
39. Spencer, B.F.; Hoskere, V.; Narazaki, Y. Advances in Computer Vision-Based Civil Infrastructure Inspection and Monitoring. Engineering; 2019; 5, pp. 199-222. [DOI: https://dx.doi.org/10.1016/j.eng.2018.11.030]
40. Srivastava, S.; Narayan, S.; Mittal, S. A survey of deep learning techniques for vehicle detection from UAV images. J. Syst. Arch.; 2021; 117, 102152. [DOI: https://dx.doi.org/10.1016/j.sysarc.2021.102152]
41. Yao, H.; Qin, R.; Chen, X. Unmanned Aerial Vehicle for Remote Sensing Applications—A Review. Remote Sens.; 2019; 11, 1443. [DOI: https://dx.doi.org/10.3390/rs11121443]
42. Kanistras, K.; Martins, G.; Rutherford, M.J.; Valavanis, K.P. A Survey of Unmanned Aerial Vehicles (UAVs) for Traffic Monitoring. Proceedings of the 2013 International Conference on Unmanned Aircraft Systems (ICUAS); Atlanta, GA, USA, 28–31 May 2013; pp. 221-234.
43. Ren, H.; Zhao, Y.; Xiao, W.; Hu, Z. A review of UAV monitoring in mining areas: Current status and future perspectives. Int. J. Coal Sci. Technol.; 2019; 6, pp. 320-333. [DOI: https://dx.doi.org/10.1007/s40789-019-00264-5]
44. Videras Rodríguez, M.; Melgar, S.G.; Cordero, A.S.; Márquez, J.M.A. A Critical Review of Unmanned Aerial Vehicles (UAVs) Use in Architecture and Urbanism: Scientometric and Bibliometric Analysis. Appl. Sci.; 2021; 11, 9966. [DOI: https://dx.doi.org/10.3390/app11219966]
45. Śledź, S.; Ewertowski, M.W.; Piekarczyk, J. Applications of unmanned aerial vehicle (UAV) surveys and Structure from Motion photogrammetry in glacial and periglacial geomorphology. Geomorphology; 2021; 378, 107620. [DOI: https://dx.doi.org/10.1016/j.geomorph.2021.107620]
46. Eskandari, R.; Mahdianpari, M.; Mohammadimanesh, F.; Salehi, B.; Brisco, B.; Homayouni, S. Meta-analysis of Unmanned Aerial Vehicle (UAV) Imagery for Agro-environmental Monitoring Using Machine Learning and Statistical Models. Remote Sens.; 2020; 12, 3511. [DOI: https://dx.doi.org/10.3390/rs12213511]
47. Mohan, M.; Richardson, G.; Gopan, G.; Aghai, M.M.; Bajaj, S.; Galgamuwa, G.A.P.; Vastaranta, M.; Arachchige, P.S.P.; Amorós, L.; Corte, A.P. et al. UAV-Supported Forest Regeneration: Current Trends, Challenges and Implications. Remote Sens.; 2021; 13, 2596. [DOI: https://dx.doi.org/10.3390/rs13132596]
48. Sibanda, M.; Mutanga, O.; Chimonyo, V.G.P.; Clulow, A.D.; Shoko, C.; Mazvimavi, D.; Dube, T.; Mabhaudhi, T. Application of Drone Technologies in Surface Water Resources Monitoring and Assessment: A Systematic Review of Progress, Challenges, and Opportunities in the Global South. Drones; 2021; 5, 84. [DOI: https://dx.doi.org/10.3390/drones5030084]
49. Aggarwal, S.; Kumar, N. Path planning techniques for unmanned aerial vehicles: A review, solutions, and challenges. Comput. Commun.; 2020; 149, pp. 270-299. [DOI: https://dx.doi.org/10.1016/j.comcom.2019.10.014]
50. Al-Kaff, A.; Martín, D.; García, F.; Escalera, A.D.L.; María Armingol, J. Survey of computer vision algorithms and applications for unmanned aerial vehicles. Expert Syst. Appl.; 2018; 92, pp. 447-463. [DOI: https://dx.doi.org/10.1016/j.eswa.2017.09.033]
51. Alladi, T.; Chamola, V.; Sahu, N.; Guizani, M. Applications of blockchain in unmanned aerial vehicles: A review. Veh. Commun.; 2020; 23, 100249. [DOI: https://dx.doi.org/10.1016/j.vehcom.2020.100249]
52. Chen, X.; Tang, J.; Lao, S. Review of Unmanned Aerial Vehicle Swarm Communication Architectures and Routing Protocols. Appl. Sci.; 2020; 10, 3661. [DOI: https://dx.doi.org/10.3390/app10103661]
53. Darvishpoor, S.; Roshanian, J.; Raissi, A.; Hassanalian, M. Configurations, flight mechanisms, and applications of unmanned aerial systems: A review. Prog. Aerosp. Sci.; 2020; 121, 100694. [DOI: https://dx.doi.org/10.1016/j.paerosci.2020.100694]
54. Emilien, A.-V.; Thomas, C.; Thomas, H. UAV & satellite synergies for optical remote sensing applications: A literature review. Sci. Remote Sens.; 2021; 3, 100019. [DOI: https://dx.doi.org/10.1016/j.srs.2021.100019]
55. Jawhar, I.; Mohamed, N.; Al-Jaroodi, J.; Agrawal, D.P.; Zhang, S. Communication and networking of UAV-based systems: Classification and associated architectures. J. Netw. Comput. Appl.; 2017; 84, pp. 93-108. [DOI: https://dx.doi.org/10.1016/j.jnca.2017.02.008]
56. Xu, C.; Liao, X.; Tan, J.; Ye, H.; Lu, H. Recent Research Progress of Unmanned Aerial Vehicle Regulation Policies and Technologies in Urban Low Altitude. IEEE Access; 2020; 8, pp. 74175-74194. [DOI: https://dx.doi.org/10.1109/ACCESS.2020.2987622]
57. Yazid, Y.; Ez-Zazi, I.; Guerrero-González, A.; El Oualkadi, A.; Arioua, M. UAV-Enabled Mobile Edge-Computing for IoT Based on AI: A Comprehensive Review. Drones; 2021; 5, 148. [DOI: https://dx.doi.org/10.3390/drones5040148]
58. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens.; 2014; 92, pp. 79-97. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2014.02.013]
59. Abdollahi, A.; Pradhan, B.; Shukla, N.; Chakraborty, S.; Alamri, A. Deep Learning Approaches Applied to Remote Sensing Datasets for Road Extraction: A State-Of-The-Art Review. Remote Sens.; 2020; 12, 1444. [DOI: https://dx.doi.org/10.3390/rs12091444]
60. Ahmed, A.; Ngoduy, D.; Adnan, M.; Baig, M.A.U. On the fundamental diagram and driving behavior modeling of heterogeneous traffic flow using UAV-based data. Transp. Res. Part A Policy Pract.; 2021; 148, pp. 100-115. [DOI: https://dx.doi.org/10.1016/j.tra.2021.03.001]
61. Apeltauer, J.; Babinec, A.; Herman, D.; Apeltauer, T. Automatic vehicle trajectory extraction for traffic analysis from aerial video data. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.; 2015; 40, pp. 9-15. [DOI: https://dx.doi.org/10.5194/isprsarchives-XL-3-W2-9-2015]
62. Balamuralidhar, N.; Tilon, S.; Nex, F. MultEYE: Monitoring System for Real-Time Vehicle Detection, Tracking and Speed Estimation from UAV Imagery on Edge-Computing Platforms. Remote Sens.; 2021; 13, 573. [DOI: https://dx.doi.org/10.3390/rs13040573]
63. Barmpounakis, E.N.; Vlahogianni, E.I.; Golias, J.C. Identifying Predictable Patterns in the Unconventional Overtaking Decisions of PTW for Cooperative ITS. IEEE Trans. Intell. Veh.; 2018; 3, pp. 102-111. [DOI: https://dx.doi.org/10.1109/TIV.2017.2788195]
64. Barmpounakis, E.N.; Vlahogianni, E.I.; Golias, J.C.; Babinec, A. How accurate are small drones for measuring microscopic traffic parameters?. Transp. Lett.; 2019; 11, pp. 332-340. [DOI: https://dx.doi.org/10.1080/19427867.2017.1354433]
65. Barmpounakis, E.; Geroliminis, N. On the new era of urban traffic monitoring with massive drone data: The pNEUMA large-scale field experiment. Transp. Res. Part C Emerg. Technol.; 2020; 111, pp. 50-71. [DOI: https://dx.doi.org/10.1016/j.trc.2019.11.023]
66. Brkić, I.; Miler, M.; Ševrović, M.; Medak, D. An Analytical Framework for Accurate Traffic Flow Parameter Calculation from UAV Aerial Videos. Remote Sens.; 2020; 12, 3844. [DOI: https://dx.doi.org/10.3390/rs12223844]
67. Chen, P.; Zeng, W.; Yu, G. Assessing right-turning vehicle-pedestrian conflicts at intersections using an integrated microscopic simulation model. Accid. Anal. Prev.; 2019; 129, pp. 211-224. [DOI: https://dx.doi.org/10.1016/j.aap.2019.05.018] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31170560]
68. Chen, X.Q.; Li, Z.B.; Yang, Y.S.; Qi, L.; Ke, R.M. High-Resolution Vehicle Trajectory Extraction and Denoising from Aerial Videos. IEEE Trans. Intell. Transp. Syst.; 2021; 22, pp. 3190-3202. [DOI: https://dx.doi.org/10.1109/TITS.2020.3003782]
69. Guido, G.; Gallelli, V.; Rogano, D.; Vitale, A. Evaluating the accuracy of vehicle tracking data obtained from Unmanned Aerial Vehicles. Int. J. Transp. Sci. Technol.; 2016; 5, pp. 136-151. [DOI: https://dx.doi.org/10.1016/j.ijtst.2016.12.001]
70. Javadi, S.; Dahl, M.; Pettersson, M.I. Vehicle Detection in Aerial Images Based on 3D Depth Maps and Deep Neural Networks. IEEE Access; 2021; 9, pp. 8381-8391. [DOI: https://dx.doi.org/10.1109/ACCESS.2021.3049741]
71. Kang, L.; Mattyus, G. Fast Multiclass Vehicle Detection on Aerial Images. IEEE Geosci. Remote Sens. Lett.; 2015; 12, pp. 1938-1942. [DOI: https://dx.doi.org/10.1109/LGRS.2015.2439517]
72. Kaufmann, S.; Kerner, B.S.; Rehborn, H.; Koller, M.; Klenov, S.L. Aerial observations of moving synchronized flow patterns in over-saturated city traffic. Transp. Res. Part C Emerg. Technol.; 2018; 86, pp. 393-406. [DOI: https://dx.doi.org/10.1016/j.trc.2017.11.024]
73. Ke, R.; Li, Z.; Kim, S.; Ash, J.; Cui, Z.; Wang, Y. Real-Time Bidirectional Traffic Flow Parameter Estimation from Aerial Videos. IEEE Trans. Intell. Transp. Syst.; 2017; 18, pp. 890-901. [DOI: https://dx.doi.org/10.1109/TITS.2016.2595526]
74. Khan, M.A.; Ectors, W.; Bellemans, T.; Janssens, D.; Wets, G. Unmanned aerial vehicle-based traffic analysis: Methodological framework for automated multivehicle trajectory extraction. Transp. Res. Rec.; 2017; 2626, pp. 25-33. [DOI: https://dx.doi.org/10.3141/2626-04]
75. Khan, M.A.; Ectors, W.; Bellemans, T.; Janssens, D.; Wets, G. Unmanned aerial vehicle-based traffic analysis: A case study for shockwave identification and flow parameters estimation at signalized intersections. Remote Sens.; 2018; 10, 458. [DOI: https://dx.doi.org/10.3390/rs10030458]
76. Khan, N.A.; Jhanjhi, N.Z.; Brohi, S.N.; Usmani, R.S.A.; Nayyar, A. Smart traffic monitoring system using Unmanned Aerial Vehicles (UAVs). Comput. Commun.; 2020; 157, pp. 434-443. [DOI: https://dx.doi.org/10.1016/j.comcom.2020.04.049]
77. Kujawski, A.; Dudek, T. Analysis and visualization of data obtained from camera mounted on unmanned aerial vehicle used in areas of urban transport. Sustain. Cities Soc.; 2021; 72, 103004. [DOI: https://dx.doi.org/10.1016/j.scs.2021.103004]
78. Li, J.; Chen, S.; Zhang, F.; Li, E.; Yang, T.; Lu, Z. An Adaptive Framework for Multi-Vehicle Ground Speed Estimation in Airborne Videos. Remote Sens.; 2019; 11, 1241. [DOI: https://dx.doi.org/10.3390/rs11101241]
79. Li, X.; Li, X.; Pan, H. Multi-Scale Vehicle Detection in High-Resolution Aerial Images with Context Information. IEEE Access; 2020; 8, pp. 208643-208657. [DOI: https://dx.doi.org/10.1109/ACCESS.2020.3036075]
80. Luo, H.; Chu, S.C.; Wu, X.; Wang, Z.; Xu, F. Traffic collisions early warning aided by small unmanned aerial vehicle companion. Telecommun. Syst.; 2020; 75, pp. 169-180. [DOI: https://dx.doi.org/10.1007/s11235-015-0131-5]
81. Moranduzzo, T.; Melgani, F. Automatic Car Counting Method for Unmanned Aerial Vehicle Images. IEEE Trans. Geosci. Remote. Sens.; 2014; 52, pp. 1635-1647. [DOI: https://dx.doi.org/10.1109/TGRS.2013.2253108]
82. Shan, D.; Lei, T.; Yin, X.; Luo, Q.; Gong, L. Extracting Key Traffic Parameters from UAV Video with On-Board Vehicle Data Validation. Sensors; 2021; 21, 5620. [DOI: https://dx.doi.org/10.3390/s21165620] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34451061]
83. Wan, M.; Gu, G.; Qian, W.; Ren, K.; Maldague, X.; Chen, Q. Unmanned Aerial Vehicle Video-Based Target Tracking Algorithm Using Sparse Representation. IEEE Internet Things J.; 2019; 6, pp. 9689-9706. [DOI: https://dx.doi.org/10.1109/JIOT.2019.2930656]
84. Wang, L.; Chen, F.; Yin, H. Detecting and tracking vehicles in traffic by unmanned aerial vehicles. Autom. Constr.; 2016; 72, pp. 294-308. [DOI: https://dx.doi.org/10.1016/j.autcon.2016.05.008]
85. Wang, H.W.; Cheng, K.; Lu, Q.C.; Peng, Z.R. Improved model of start-wave velocity at intersections based on unmanned aerial vehicle data. J. Donghua Univ. Eng. Ed.; 2016; 33, pp. 13-19.
86. Wang, C.; Xu, C.; Dai, Y. A crash prediction method based on bivariate extreme value theory and video-based vehicle trajectory data. Accid. Anal. Prev.; 2019; 123, pp. 365-373. [DOI: https://dx.doi.org/10.1016/j.aap.2018.12.013]
87. Wang, J.; Simeonova, S.; Shahbazi, M. Orientation- and Scale-Invariant Multi-Vehicle Detection and Tracking from Unmanned Aerial Videos. Remote Sens.; 2019; 11, 2155. [DOI: https://dx.doi.org/10.3390/rs11182155]
88. Xing, L.; He, J.; Li, Y.; Wu, Y.; Yuan, J.; Gu, X. Comparison of different models for evaluating vehicle collision risks at upstream diverging area of toll plaza. Accid. Anal. Prev.; 2020; 135, 105343. [DOI: https://dx.doi.org/10.1016/j.aap.2019.105343]
89. Xing, L.; He, J.; Abdel-Aty, M.; Wu, Y.; Yuan, J. Time-varying Analysis of Traffic Conflicts at the Upstream Approach of Toll Plaza. Accid. Anal. Prev.; 2020; 141, 105539. [DOI: https://dx.doi.org/10.1016/j.aap.2020.105539] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32289572]
90. Xu, Y.; Yu, G.; Wang, Y.; Wu, X.; Ma, Y. A hybrid vehicle detection method based on viola-jones and HOG + SVM from UAV images. Sensors; 2016; 16, 1325. [DOI: https://dx.doi.org/10.3390/s16081325] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27548179]
91. Zhu, J.; Sun, K.; Jia, S.; Lin, W.; Hou, X.; Liu, B.; Qiu, G. Bidirectional long short-term memory network for vehicle behavior recognition. Remote Sens.; 2018; 10, 887. [DOI: https://dx.doi.org/10.3390/rs10060887]
92. Zhu, J.S.; Sun, K.; Jia, S.; Li, Q.Q.; Hou, X.X.; Lin, W.D.; Liu, B.Z.; Qiu, G.P. Urban Traffic Density Estimation Based on Ultrahigh-Resolution UAV Video and Deep Neural Network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.; 2018; 11, pp. 4968-4981. [DOI: https://dx.doi.org/10.1109/JSTARS.2018.2879368]
93. SankeyMATIC. Available online: https://sankeymatic.com/ (accessed on 22 December 2021).
94. Kovalev, I.; Voroshilova, A.; Karaseva, M. Analysis of the current situation and development trend of the international cargo UAVs market. J. Phys. Conf. Ser.; 2019; 1399, 055095. [DOI: https://dx.doi.org/10.1088/1742-6596/1399/5/055095]
95. Smith, M.L.; Smith, L.N.; Hansen, M.F. The quiet revolution in machine vision-a state-of-the-art survey paper, including historical review, perspectives, and future directions. Comput. Ind.; 2021; 130, 103472. [DOI: https://dx.doi.org/10.1016/j.compind.2021.103472]
96. Alam, S.; Oluoch, J. A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs). Expert Syst. Appl.; 2021; 179, 115091. [DOI: https://dx.doi.org/10.1016/j.eswa.2021.115091]
97. Salvo, G.; Caruso, L.; Scordo, A.; Guido, G.; Vitale, A. Traffic data acquirement by unmanned aerial vehicle. Eur. J. Remote Sens.; 2017; 50, pp. 343-351. [DOI: https://dx.doi.org/10.1080/22797254.2017.1328978]
98. Milić, A.; Randjelovic, A.; Radovanović, M. Use of Drones in Operations in the Urban Environment. Proceedings of the 5th International Scientific Conference Safety and Crisis Management-Theory and Practise Safety for the Future–SecMan 2019; Belgrade, Serbia, 3–4 October 2019.
99. Watkins, S.; Burry, J.; Mohamed, A.; Marino, M.; Prudden, S.; Fisher, A.; Kloet, N.; Jakobi, T.; Clothier, R. Ten questions concerning the use of drones in urban environments. Build. Environ.; 2020; 167, 106458. [DOI: https://dx.doi.org/10.1016/j.buildenv.2019.106458]
100. Schmidt, T.; Hauer, F.; Pretschner, A. Understanding Safety for Unmanned Aerial Vehicles in Urban Environments. Proceedings of the 2021 IEEE Intelligent Vehicles Symposium (IV); Nagoya, Japan, 11–17 July 2021; pp. 638-643.
101. Shvetsova, S.; Shvetsov, A. Safety when Flying Unmanned Aerial Vehicles at Transport Infrastructure Facilities. Transp. Res. Procedia; 2021; 54, pp. 397-403. [DOI: https://dx.doi.org/10.1016/j.trpro.2021.02.086]
102. Primatesta, S.; Rizzo, A.; la Cour-Harbo, A. Ground Risk Map for Unmanned Aircraft in Urban Environments. J. Intell. Robot. Syst.; 2020; 97, pp. 489-509. [DOI: https://dx.doi.org/10.1007/s10846-019-01015-z]
103. Lee, J.Y.; Chung, A.Y.; Shim, H.; Joe, C.; Park, S.; Kim, H. UAV Flight and Landing Guidance System for Emergency Situations. Sensors; 2019; 19, 4468. [DOI: https://dx.doi.org/10.3390/s19204468]
104. Guerin, J.; Delmas, K.; Guiochet, J. Certifying Emergency Landing for Safe Urban UAV. Proceedings of the 2021 51st Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W); Taipei, Taiwan, 21–24 June 2021.
105. Elloumi, M.; Dhaou, R.; Escrig, B.; Idoudi, H.; Saidane, L.; Fer, A. Traffic Monitoring on City Roads using UAVs. Proceedings of the International Conference on Ad-Hoc Networks and Wireless; Luxembourg, 1–3 October 2019.
106. Sharma, V.; Chen, H.-C.; Kumar, R. Driver behaviour detection and vehicle rating using multi-UAV coordinated vehicular networks. J. Comput. Syst. Sci.; 2017; 86, pp. 3-32. [DOI: https://dx.doi.org/10.1016/j.jcss.2016.10.003]
107. Zhai, W.; Han, B.; Li, D.; Duan, J.; Cheng, C. A low-altitude public air route network for UAV management constructed by global subdivision grids. PLoS ONE; 2021; 16, e0249680. [DOI: https://dx.doi.org/10.1371/journal.pone.0249680]
108. Feng, D.; Du, P.; Shen, H.; Liu, Z. UAS Traffic Management in Low-Altitude Airspace Based on Three Dimensional Digital Aerial Corridor System. Urban Intelligence and Applications. Studies in Distributed Intelligence; Yuan, X.; Elhoseny, M. Springer: Cham, Switzerland, 2020; pp. 179-188.
109. Kim, J.; Atkins, E. Airspace Geofencing and Flight Planning for Low-Altitude, Urban, Small Unmanned Aircraft Systems. Appl. Sci.; 2022; 12, 576. [DOI: https://dx.doi.org/10.3390/app12020576]
110. Samir Labib, N.; Danoy, G.; Musial, J.; Brust, M.R.; Bouvry, P. Internet of Unmanned Aerial Vehicles—A Multilayer Low-Altitude Airspace Model for Distributed UAV Traffic Management. Sensors; 2019; 19, 4779. [DOI: https://dx.doi.org/10.3390/s19214779]
111. Messina, G.; Peña, J.M.; Vizzari, M.; Modica, G. A Comparison of UAV and Satellites Multispectral Imagery in Monitoring Onion Crop. An Application in the ‘Cipolla Rossa di Tropea’ (Italy). Remote Sens.; 2020; 12, 3424. [DOI: https://dx.doi.org/10.3390/rs12203424]
112. Xiang, T.; Xia, G.; Zhang, L. Mini-Unmanned Aerial Vehicle-Based Remote Sensing: Techniques, applications, and prospects. IEEE Geosci. Remote Sens. Mag.; 2019; 7, pp. 29-63. [DOI: https://dx.doi.org/10.1109/MGRS.2019.2918840]
113. Liu, M.; Yu, T.; Gu, X.; Sun, Z.; Yang, J.; Zhang, Z.; Mi, X.; Cao, W.; Li, J. The Impact of Spatial Resolution on the Classification of Vegetation Types in Highly Fragmented Planting Areas Based on Unmanned Aerial Vehicle Hyperspectral Images. Remote Sens.; 2020; 12, 146. [DOI: https://dx.doi.org/10.3390/rs12010146]
114. Samiappan, S.; Hathcock, L.; Turnage, G.; McCraine, C.; Pitchford, J.; Moorhead, R. Remote Sensing of Wildfire Using a Small Unmanned Aerial System: Post-Fire Mapping, Vegetation Recovery and Damage Analysis in Grand Bay, Mississippi/Alabama, USA. Drones; 2019; 3, 43. [DOI: https://dx.doi.org/10.3390/drones3020043]
115. Zhang, X.; Jin, J.; Lan, Z.; Li, C.; Fan, M.; Wang, Y.; Yu, X.; Zhang, Y. ICENET: A Semantic Segmentation Deep Network for River Ice by Fusing Positional and Channel-Wise Attentive Features. Remote Sens.; 2020; 12, 221. [DOI: https://dx.doi.org/10.3390/rs12020221]
116. FAA. Small Unmanned Aircraft Systems (UAS) Regulations (Part 107). Available online: https://www.faa.gov/newsroom/small-unmanned-aircraft-systems-uas-regulations-part-107 (accessed on 15 December 2021).
117. EASA. Easy Access Rules for Unmanned Aircraft Systems. 2021; Available online: https://www.easa.europa.eu/document-library/easy-access-rules/online-publications/easy-access-rules-unmanned-aircraft-systems?page=5 (accessed on 23 December 2021).
118. Lewandowski, K. Sustainable Usage of Freight Drones in City Centers, Proposition of Regulations for Safe Usage of Drones. Sustainability; 2021; 13, 8634. [DOI: https://dx.doi.org/10.3390/su13158634]
119. Drone-Laws. Drone Laws in Romania. Available online: https://drone-laws.com/drone-laws-in-romania/ (accessed on 23 December 2021).
120. Alamouri, A.; Lampert, A.; Gerke, M. An Exploratory Investigation of UAS Regulations in Europe and the Impact on Effective Use and Economic Potential. Drones; 2021; 5, 63. [DOI: https://dx.doi.org/10.3390/drones5030063]
121. U-space. Supporting Safe and Secure Drone Operations in Europe; Publications Office of the European Union: Luxembourg, 2020.
122. Li, T.; Ye, J.; Dai, J.; Lei, H.; Yang, W.; Pan, G.; Chen, Y. Secure UAV-to-Vehicle Communications. IEEE Trans. Commun.; 2021; 69, pp. 5381-5393. [DOI: https://dx.doi.org/10.1109/TCOMM.2021.3074969]
123. Allouch, A.; Cheikhrouhou, O.; Koubâa, A.; Toumi, K.; Khalgui, M.; Nguyen Gia, T. UTM-Chain: Blockchain-Based Secure Unmanned Traffic Management for Internet of Drones. Sensors; 2021; 21, 3049. [DOI: https://dx.doi.org/10.3390/s21093049]
124. Mukhamediev, R.I.; Symagulov, A.; Kuchin, Y.; Zaitseva, E.; Bekbotayeva, A.; Yakunin, K.; Assanov, I.; Levashenko, V.; Popova, Y.; Akzhalova, A. et al. Review of Some Applications of Unmanned Aerial Vehicles Technology in the Resource-Rich Country. Appl. Sci.; 2021; 11, 10171. [DOI: https://dx.doi.org/10.3390/app112110171]
125. Oliveira, F.; Luís, M.; Sargento, S. Machine Learning for the Dynamic Positioning of UAVs for Extended Connectivity. Sensors; 2021; 21, 4618. [DOI: https://dx.doi.org/10.3390/s21134618] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34283165]
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
Unmanned aerial vehicles (UAVs) are attracting considerable interest in transportation engineering as tools for monitoring and analyzing traffic. This systematic review surveys the scientific contributions on the application of UAVs in civil engineering, with a focus on traffic monitoring. Following the PRISMA framework, 34 papers were identified across five scientific databases. The paper first introduces previous work in this field; the selected papers are then analyzed, and conclusions are drawn to complement their findings. The field is still in its infancy, and progress in advanced image-processing techniques and in the technologies used to build UAVs can be expected to greatly expand the range of applications, bringing increased benefits to society by reducing congestion and collisions in major urban centers around the world.