
Abstract

What are the main findings?

A novel visual tracking method for video satellites is proposed that tracks space targets while accounting for uncertainties in both camera parameters and target position.

By adaptively estimating the uncertain parameters, the method avoids the severe degradation in tracking accuracy that simultaneous uncertainties cause in traditional approaches.

What are the implications of the main findings?

This work provides a novel framework for video satellite-based visual tracking that is robust to both camera calibration errors and target orbital uncertainties, achieving improvements in both accuracy and observation performance over conventional techniques.

It serves as a complement to ground-based remote sensing technologies, with the goal of improving space situational awareness capabilities for the near-Earth orbital environment.

Video satellites feature agile attitude maneuverability and the capability for continuous target imaging, making them an effective complement to ground-based remote sensing technologies. Existing research on video satellite tracking methods generally assumes either accurately calibrated camera parameters or precisely known target positions. However, deviations in camera parameters and errors in target localization can significantly degrade the performance of current tracking approaches. This paper proposes a novel adaptive visual tracking method for video satellites to track near-circular space targets in the presence of simultaneous uncertainties in both camera parameters and target position. First, the parameters representing these two types of uncertainties are separated through linearization. Then, based on the real-time image tracking error and the current parameter estimates, an update law for the uncertain parameters and a visual tracking law are designed. The stability of the closed-loop system and the convergence of the tracking error are rigorously proven. Finally, quantitative comparisons are conducted using a defined image stability index against two conventional tracking methods. Simulation results demonstrate that under coexisting uncertainties, traditional control methods either fail to track the target or exhibit significant tracking precision degradation. In contrast, the average image error during the steady-state phase exhibits a reduction of approximately one order of magnitude with the proposed method compared to the traditional image-based approach, demonstrating its superior tracking precision under complex uncertainty conditions.
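The abstract outlines a classic adaptive visual-servoing structure: the uncertainty is separated so that it enters the error dynamics linearly in an unknown parameter vector, a gradient-type update law driven by the real-time image error estimates that vector, and the tracking law cancels the estimated term while feeding back the image error. A minimal sketch of that structure follows; the error dynamics, regressor Φ, gains, and parameter values here are illustrative assumptions, not the paper's actual camera or orbit model.

```python
import numpy as np

def simulate(T=20.0, dt=1e-3, lam=2.0, gamma=5.0):
    # True uncertain parameters (standing in for camera/orbit uncertainty),
    # unknown to the controller.
    theta = np.array([0.8, -0.5])
    theta_hat = np.zeros(2)       # adaptive estimate, initialized at zero
    e = np.array([1.0, -0.6])     # initial image-plane tracking error

    for k in range(int(T / dt)):
        t = k * dt
        # Regressor: after linearization the uncertainty enters the error
        # dynamics linearly, e_dot = u + Phi(t) @ theta  (assumed form).
        Phi = np.array([[np.sin(0.3 * t), 1.0],
                        [1.0, np.cos(0.2 * t)]])
        # Visual tracking law: proportional image-error feedback plus
        # cancellation of the estimated uncertain term.
        u = -lam * e - Phi @ theta_hat
        # Update law for the uncertain parameters, driven by the image error.
        theta_hat = theta_hat + dt * (gamma * Phi.T @ e)
        # Euler step of the (assumed) closed-loop error dynamics.
        e = e + dt * (u + Phi @ theta)
    return e, theta_hat

e_final, theta_hat = simulate()
print(np.linalg.norm(e_final))
```

With the Lyapunov function V = ½|e|² + (1/2γ)|θ − θ̂|², these choices give V̇ = −λ|e|², so the image error converges even though θ̂ need not reach θ; this mirrors the stability-and-convergence argument the abstract claims for the proposed method.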

Details

Title
Video Satellite Visual Tracking of Space Targets with Uncertainties in Camera Parameters and Target Position
Author
Zhong, Zikai; Fan, Caizhi; Song, Haibo
Publication title
Remote Sensing
Volume
17
Issue
24
First page
3978
Number of pages
29
Publication year
2025
Publication date
2025
Publisher
MDPI AG
Place of publication
Basel
Country of publication
Switzerland
Publication subject
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Online publication date
2025-12-09
Milestone dates
2025-10-20 (Received); 2025-12-08 (Accepted)
First posting date
2025-12-09
ProQuest document ID
3286351840
Document URL
https://www.proquest.com/scholarly-journals/video-satellite-visual-tracking-space-targets/docview/3286351840/se-2?accountid=208611
Copyright
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-12-24
Database
ProQuest One Academic