Abstract

The ubiquity and rich feature sets of smartphones have made them enormously popular, and in recent years this popularity has encouraged developers and researchers to design a variety of navigation applications. Most of these applications use the Global Positioning System (GPS) to determine the user's location. However, GPS-enabled smartphones are typically accurate only to within 4.9 m even in the best of cases, and they tend to be less accurate in practice, especially in urban areas. This research therefore proposes an algorithm for computing a user's location, particularly at night. The user photographs a nearby building, the algorithm matches the photograph against a corresponding image stored in a database, and the user's location is then calculated from the transformation between the two images.

Due to the lack of nighttime building images with ground-truth measurements of the distance and angle from the camera to the building, a dataset of nighttime images was collected. The dataset was designed to capture the difficulties typical of user-taken images, including wide viewpoint changes, repetitive structures, and occlusion. Using three datasets, including the Nighttime dataset, state-of-the-art feature point detectors/descriptors were evaluated to measure their performance under the deformations present in nighttime images. The experiments showed that matching daytime images against nighttime images significantly degrades their performance. Among the state-of-the-art detectors and descriptors, AGAST-TEBLID and FAST-TEBLID demonstrated the best performance.
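Detector/descriptor pairs such as AGAST-TEBLID produce binary descriptors, which are conventionally matched by Hamming distance with a ratio test to discard ambiguous correspondences. The sketch below illustrates that standard matching scheme on toy bitstrings; it is not the evaluation code from the dissertation, and the descriptors and the 0.8 ratio threshold are illustrative assumptions.

```python
# Illustrative matching of binary descriptors (as produced by e.g.
# TEBLID): Hamming distance plus a nearest/second-nearest ratio test.
# The 8-bit descriptors below are toy values, not real TEBLID output.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def match_descriptors(query, train, ratio=0.8):
    """Return (query_idx, train_idx) pairs that pass the ratio test."""
    matches = []
    for qi, q in enumerate(query):
        # Sort candidate matches by Hamming distance.
        dists = sorted((hamming(q, t), ti) for ti, t in enumerate(train))
        best, second = dists[0], dists[1]
        # Accept only if the best match is clearly better than the runner-up.
        if best[0] < ratio * second[0]:
            matches.append((qi, best[1]))
    return matches

query = [0b10110010, 0b01001101]
train = [0b01001110, 0b10110011, 0b01001011]
# query[0] is one bit away from train[1] and is accepted; query[1] is
# equally close to train[0] and train[2], so the ratio test rejects it.
print(match_descriptors(query, train))  # → [(0, 1)]
```

The ratio test is what makes such matchers robust to the repetitive facade structures mentioned above: a descriptor that matches two windows equally well is discarded rather than matched arbitrarily.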

In this study, a new line-segment-based image matching algorithm, called graph-based image matching (GBIM), was introduced. In GBIM, an edge map is produced using the edge drawing (ED) algorithm, and line segments are then extracted using EDLines. A relational graph is constructed for each image from the relationships between its line segments. An association graph is then built by finding compatible relationships between the two relational graphs, and the depth-first search (DFS) algorithm searches this graph for chains of compatible relationships; such chains indicate where the two images match. To discard false matches, homography estimation with random sample consensus (RANSAC) was applied. The proposed algorithm succeeded in matching images with moderate viewpoint changes, and in some test scenarios its performance exceeded that of the state-of-the-art line segment matching algorithm, LBD.
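The association-graph step above can be sketched in a few lines. The toy below is an illustration of the general idea, not the dissertation's implementation: pairwise relations between line segments are reduced to made-up symbolic labels ("perp", "par", "cross"), a node pairs one segment from each image, an edge joins two nodes whose pairwise relations agree in both images, and DFS grows the longest chain of mutually compatible pairings.

```python
from itertools import product

def rel(triples):
    """Build a symmetric relation table from (seg, seg, label) triples."""
    return {frozenset((i, k)): label for i, k, label in triples}

def association_graph(rel_a, rel_b):
    """Node (i, j) pairs segment i of image A with segment j of image B;
    an edge joins nodes whose pairwise relations carry the same label."""
    segs_a = {s for pair in rel_a for s in pair}
    segs_b = {s for pair in rel_b for s in pair}
    nodes = list(product(segs_a, segs_b))
    edges = {n: [] for n in nodes}
    for (i, j), (k, l) in product(nodes, nodes):
        if i != k and j != l:
            ra = rel_a.get(frozenset((i, k)))
            rb = rel_b.get(frozenset((j, l)))
            if ra is not None and ra == rb:
                edges[(i, j)].append((k, l))
    return edges

def longest_chain(edges):
    """DFS for the longest chain of mutually compatible correspondences."""
    best = []
    def dfs(node, path):
        nonlocal best
        if len(path) > len(best):
            best = path[:]
        for nxt in edges[node]:
            if nxt not in path and all(nxt in edges[p] for p in path):
                dfs(nxt, path + [nxt])
    for n in edges:
        dfs(n, [n])
    return best

# Image B's segments are image A's under the relabelling 0->1, 1->2, 2->0.
rel_a = rel([(0, 1, "perp"), (0, 2, "par"), (1, 2, "cross")])
rel_b = rel([(1, 2, "perp"), (1, 0, "par"), (2, 0, "cross")])
chain = longest_chain(association_graph(rel_a, rel_b))
print(sorted(chain))  # → [(0, 1), (1, 2), (2, 0)]
```

In the real algorithm the relations are geometric (derived from EDLines segments) rather than symbolic, and the surviving chain is then verified with RANSAC-based homography estimation, but the graph search has the same shape.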

For this study, a novel algorithm was developed to determine a user's location. The algorithm estimates the user's location by leveraging knowledge of the physical distance between the database image and the building. The user's orientation with respect to the building is estimated by decomposing a mathematical model representing the transformation between the database image and the user's image. Two mathematical models, namely homography and the essential matrix, were decomposed, and their results were compared. The user's location estimation algorithm exhibited promising and accurate results across the majority of test scenarios when compared to the estimated locations provided by Google Maps.
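A two-view decomposition (of either a homography or an essential matrix) yields the relative rotation and a translation direction that is only defined up to scale; the known distance between the database camera and the building is what supplies the missing metric scale. The sketch below illustrates only that final positioning step, with the decomposition assumed already done; `scale_ratio`, the ratio of the camera-to-camera baseline to the database-to-building distance, is a hypothetical parameter introduced here for illustration.

```python
import numpy as np

def user_position(R, t_unit, db_to_building_m, scale_ratio):
    """Place the database camera at the origin of its own frame and
    return the user's camera centre in that frame, in metres.

    R, t_unit      : relative pose from a two-view decomposition
                     (t_unit is a direction, defined only up to scale)
    db_to_building_m : known metric distance, database camera to building
    scale_ratio    : assumed baseline / db_to_building_m ratio
    """
    t = np.asarray(t_unit, dtype=float)
    t = t / np.linalg.norm(t)                  # enforce unit length
    baseline = db_to_building_m * scale_ratio  # metric baseline length
    # Camera centre of the second (user) view: C = -R^T t, scaled.
    return -baseline * (np.asarray(R).T @ t)

# Toy example: user stands 3 m to the right of the database camera with
# the same orientation (R = I), so the recovered direction is along -X.
R = np.eye(3)
t_unit = np.array([-1.0, 0.0, 0.0])
pos = user_position(R, t_unit, db_to_building_m=10.0, scale_ratio=0.3)
print(pos)  # → [3. 0. 0.]
```

With a real decomposition, homography and essential-matrix routines each return several candidate (R, t) pairs, and the physically valid one (points in front of both cameras) must be selected before this step; the dissertation compares both models for exactly this purpose.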

Details

Business indexing term
1010268
Title
A Lightweight System for Improved Urban Location From Night-Time Images
Number of pages
203
Publication year
2025
Degree date
2025
School code
1543
Source
DAI-B 87/4(E), Dissertation Abstracts International
ISBN
9798297682405
Advisor
University/institution
The University of Manchester (United Kingdom)
University location
England
Degree
Ph.D.
Source type
Dissertation or Thesis
Language
English
Document type
Dissertation/Thesis
Dissertation/thesis number
32293315
ProQuest document ID
3266756271
Document URL
https://www.proquest.com/dissertations-theses/lightweight-system-improved-urban-location-night/docview/3266756271/se-2?accountid=208611
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Database
ProQuest One Academic