Abstract

Geospatial databases generally consist of measurements related to points (or pixels, in the case of raster data), lines, and polygons. In recent years, the size and complexity of these databases have increased significantly, and they often contain duplicate records, i.e., two or more nearby records representing the same measurement result. In this thesis, we address the problem of detecting duplicates in a database consisting of point measurements. As a test case, we use a database that we have compiled of measurements of anomalies in the Earth's gravity field. We describe a natural duplicate-deletion algorithm and show that it requires quadratic time in the worst case; we also propose a new, asymptotically optimal O(n log n) algorithm. Both algorithms have been successfully applied to gravity databases. We believe they will also prove useful for many other types of spatial data.*

*This dissertation is a compound document (contains both a paper copy and a CD as part of the dissertation).
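The contrast the abstract draws can be illustrated with a minimal sketch. The code below is not the thesis's algorithm (the actual method handles interval and fuzzy uncertainty, which is not detailed in the abstract); it only shows the naive quadratic pairwise comparison alongside a common sort-based refinement, where sorting by one coordinate costs O(n log n) and the subsequent scan compares each point only against neighbors within a tolerance `eps` in that coordinate. The function names, point representation, and `eps` threshold are all illustrative assumptions.

```python
import math

def naive_duplicates(points, eps):
    """Naive O(n^2) approach: compare every pair of points and flag the
    later-indexed point of each close pair as a duplicate."""
    dup = set()
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= eps:
                dup.add(j)
    return dup

def sorted_duplicates(points, eps):
    """Sort-based refinement: sort indices by x-coordinate, then compare
    each point only against later points whose x lies within eps (a
    sliding window). Sorting is O(n log n); the scan is near-linear
    when close pairs are sparse (worst case remains quadratic)."""
    order = sorted(range(len(points)), key=lambda k: points[k][0])
    dup = set()
    for a in range(len(order)):
        i = order[a]
        for b in range(a + 1, len(order)):
            j = order[b]
            if points[j][0] - points[i][0] > eps:
                break  # all remaining points are too far away in x
            if math.dist(points[i], points[j]) <= eps:
                dup.add(max(i, j))  # flag the later-indexed point
    return dup
```

Both functions flag the same set of duplicate indices; only the amount of work differs, which is the distinction the abstract makes between the natural algorithm and the asymptotically faster one.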

Details

Title
Eliminating duplicates under interval and fuzzy uncertainty: An asymptotically optimal algorithm and its geospatial applications
Author
Torres, Roberto
Year
2003
Publisher
ProQuest Dissertations & Theses
ISBN
978-0-496-03223-5
Source type
Dissertation or Thesis
Language of publication
English
ProQuest document ID
305264156
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.