A novel video satellite visual tracking method for space targets is proposed that accounts for uncertainties in both camera parameters and target position. By adaptively estimating the uncertain parameters, the method overcomes the severe degradation in tracking accuracy that traditional approaches suffer when both uncertainties are present simultaneously.
This work provides a novel framework for video satellite-based visual tracking that is robust to both camera calibration errors and target orbital uncertainties, achieving improvements in accuracy and observation performance over conventional techniques, with the goal of improving space situational awareness capabilities for the near-Earth orbital environment. Video satellites feature agile attitude maneuverability and the capability for continuous target imaging, making them an effective complement to ground-based remote sensing technologies. Existing research on video satellite tracking methods generally assumes either accurately calibrated camera parameters or precisely known target positions; however, deviations in camera parameters and errors in target localization can significantly degrade the performance of current tracking approaches. This paper proposes a novel adaptive visual tracking method for video satellites to track near-circular space targets in the presence of simultaneous uncertainties in both camera parameters and target position. First, the parameters representing these two types of uncertainties are separated through linearization. Then, based on the real-time image tracking error and the current parameter estimates, an update law for the uncertain parameters and a visual tracking law are designed. The stability of the closed-loop system and the convergence of the tracking error are rigorously proven. Finally, quantitative comparisons with two conventional tracking methods are conducted using a defined image stability index. Simulation results demonstrate that under coexisting uncertainties, traditional control methods either fail to track the target or exhibit significant degradation in tracking precision. In contrast, the average image error during the steady-state phase is reduced by approximately one order of magnitude with the proposed method compared to the traditional image-based approach, demonstrating its superior tracking precision under complex uncertainty conditions.
1. Introduction
In recent years, advancements in launch vehicle technology have facilitated the deployment of large satellite constellations, such as Starlink, Telesat, and OneWeb. This has led to a rapid increase in the number of on-orbit satellites and a progressively congested orbital environment. Meanwhile, the accumulation of defunct satellites, space debris, and launch vehicle fragments—which persist in medium-low Earth near-circular orbits—poses a growing threat to the safety of operational spacecraft. Against this backdrop, tracking and monitoring near-Earth space objects is crucial for early risk warning and plays a vital role in ensuring spacecraft safety. Video satellites, which offer advantages including real-time monitoring, continuous imaging, agile attitude maneuverability, and cost-effectiveness, have emerged as an effective supplement to ground-based space remote sensing technologies [1,2]. They are widely used in stare-mode observation, space situational awareness, and disaster prevention, thereby significantly improving comprehensive situational awareness [3,4,5]. Consequently, several countries have successfully developed and launched various optical video imaging satellites in recent years [6,7,8,9,10].
Video satellites utilize onboard visible-light cameras to perform visual tracking and continuous imaging of ground or space targets. This requires the satellite's control system to continuously adjust the boresight direction of the onboard camera, ensuring it remains aligned with the target throughout the observation process [11]. When this alignment is maintained, the target appears at the center of the camera's imaging plane, ensuring optimal observation results. Moreover, since the target's projection then lies far from the image border, the risk of the target moving out of the camera's field of view due to relative motion is minimized, thereby guaranteeing uninterrupted continuous visual tracking. Early research on video satellite visual tracking mainly focused on position-based methods. These methods calculate real-time desired control commands based on the target's position and the current camera boresight direction [12,13,14,15,16], and therefore depend on comprehensive prior positional information. Consequently, they require both the target position and the camera parameters to be precisely known; even slight deviations in the real-time target position or the camera's optical parameters can degrade the tracking and observation performance.
To overcome the limitation of position-based tracking methods requiring complete prior information, scholars have shifted their focus to image-based visual tracking methods for video satellites [5,17,18,19,20,21]. Unlike position-based methods, once the camera completes initial target acquisition and the onboard image processing system extracts the coordinates of the target's projection point, the generation of tracking instructions no longer relies on precise target position information. Instead, the visual tracking instructions are calculated in real-time from the target's projection point on the image plane. This process gradually guides the adjustment of the camera's boresight direction so that the target's projection moves toward the center of the image plane, achieving the desired observation. Consequently, such methods can achieve stable and effective visual tracking without relying on precise target position information, offering enhanced autonomy and robustness. Our team has previously investigated image-based visual tracking of video satellites and designed visual tracking methods [4,22,23,24,25]. In these studies, an error quaternion is defined based on the deviation between the target's real-time projection coordinates and the desired projection coordinates on the image plane, and the tracking instructions are designed based on this error quaternion. The effectiveness of these approaches relies heavily on a precise onboard camera imaging model, which encompasses both the internal optical parameters of the camera and its installation parameters relative to the satellite body. However, the in-orbit operational environment of video satellites, which is subject to thermal cycling, particle radiation, and micro-vibrations, can cause deviations in the camera parameters. Since in-orbit calibration of camera parameters is often cumbersome and time-consuming, such deviations significantly compromise mission flexibility.
To address the challenge of visual tracking by video satellites with an uncalibrated camera, our team investigated staring observation of targets under uncertainties in the camera parameters [3,26,27]. The designed tracking methods were verified through simulation to effectively overcome the limitations of position-based tracking methods. In these studies, it was assumed that system uncertainties originated solely from the onboard camera. By establishing a linear relationship between the estimated projection error and the camera parameters, appropriate adaptive update laws were designed to achieve real-time estimation of the camera parameter uncertainties. Consequently, this approach is applicable only to scenarios where the target position is precisely known (e.g., observation of ground targets with known longitude and latitude). Because they rely on real-time acquisition of precise target positions, the above methods are not directly applicable to the visual tracking of space targets when the camera parameters are uncertain.
Commonly used passive orbit determination methods for space targets include optical orbit determination, radar orbit determination, and multi-source fusion orbit determination. These methods are inevitably subject to certain deviations in the orbit determination results due to factors such as weather, sensor noise, and time synchronization [28,29,30,31]. Related studies have shown that the vast majority of spacecraft operate in near-circular orbits [32], and that under short-term, purely optical measurements, the measurement deviations of the semi-major axis, eccentricity, and true anomaly of space targets are greater than the deviations and uncertainties of the orbital elements characterizing the spatial orientation of the orbital plane (the orbital inclination and the right ascension of the ascending node are strongly constrained by observations of the orbital plane) [33,34,35,36,37]. If radar ranging information is integrated, the measurement accuracy of the semi-major axis can be greatly improved, but the measurement deviation of the in-plane phase angle (including the argument of perigee and the true anomaly) remains relatively significant. This positional uncertainty within the orbital plane may further affect the tracking performance of traditional methods, leading to tracking failures or decreased accuracy. To the best of our knowledge, however, there is currently no research on visual tracking of space targets by video satellites under simultaneous uncertainties in the camera parameters and the target position.
This paper addresses the performance degradation in video satellites' visual tracking of near-circular orbit targets under simultaneous uncertainties in the camera parameters and the target position. Building on our team's previous research on visual tracking of ground targets with known positions (Refs. [3,26,27]), which considered only camera parameter uncertainties, we extend the applicability of the method to space targets in near-circular orbits with phase angle measurement deviations. An adaptive visual tracking method for space targets that simultaneously accounts for uncertainties in the camera parameters and the target position is proposed. The main work of this manuscript is as follows. First, the equation of motion of a space target operating in a near-circular orbit around the Earth is linearized, and the orbit phase parameters containing uncertainties are separated. Then, the parameters representing the uncertainties in the camera parameters and the target position are extracted from the camera's observation equation and the visual velocity equation as the variables to be estimated and are linearized, laying the foundation for adaptive parameter estimation. Afterwards, an adaptive visual tracking law and a parameter update law based on image feedback are designed, and the stability of the closed-loop system is rigorously proved using Barbalat's lemma. Finally, simulation verification is conducted. In the simulation section, the proposed method is compared with the traditional position-based and image-based tracking methods, and the performance of each method is quantitatively analyzed using a defined image stability index. It is verified that when uncertainties exist in both the camera parameters and the target's orbital position, traditional methods suffer from tracking failure or significant degradation in tracking accuracy, whereas the method proposed in this paper overcomes the effects of both uncertainties and achieves higher-precision tracking of the target.
The main contributions of this article can be summarized as follows:
(1). To address the challenge of video satellite visual tracking under simultaneous uncertainties in camera parameters and target positions, this paper proposes a novel adaptive method. Unlike traditional approaches that suffer from performance degradation under such uncertainties, or existing adaptive methods that require precisely known target positions, the proposed technique estimates both types of uncertainties concurrently. It leverages real-time image feedback to update the unknown parameters and compute the visual tracking instructions, with the closed-loop system stability rigorously guaranteed.
(2). An image stability index is defined to quantitatively evaluate the tracking accuracy. Simulations comparing the proposed method with traditional position-based and image-based methods reveal that under concurrent uncertainties, the position-based method fails to track the target, while the proposed method reduces the steady-state image error by approximately an order of magnitude compared to the traditional image-based approach. This improvement enables significantly higher tracking precision despite the presence of dual uncertainties.
The remainder of this paper is structured as follows. Section 2 presents the visual tracking modeling, including the tracking observation model and motion model of the video satellite. Section 3 details the design of the adaptive staring method, which involves formulating appropriate parameters to be estimated for real-time updates of both target position parameters and camera parameters, with strict stability analysis of the closed-loop system provided. Section 4 presents the simulation analysis, and Section 5 concludes the study.
2. Problem Formulation
The configuration of the video satellite studied in this paper is shown in Figure 1. It consists of a rigid satellite platform and a camera installed on the end effector of a 2-degree-of-freedom (DOF) pan-tilt. The pan-tilt can rotate around its yaw axis and pitch axis to adjust the pointing direction of the camera's optical axis and realize staring observation of targets. The pan-tilt is simplified to a 2-DOF linkage system, in which link 1 rotates around the yaw axis and link 2 rotates around the pitch axis, as shown by the area enclosed by the dashed line in Figure 2.
2.1. Definition of Coordinate Frames
The following five coordinate systems were established in this paper to describe the motion of the video satellite equipped with the pan-tilt camera, as shown in Figure 2:
(1). Earth Centered Inertial Frame . The origin is located at the center of the Earth. aligns with the vernal equinox, points to the pole of the J2000.0 mean equator, and is determined by the right-hand rule [38].
(2). Satellite Body Frame . The origin is located at the center of mass of the satellite. The three unit axes align with the principal axes of inertia of the satellite body [39].
(3). Pan-tilt Frame . This is fixedly connected to the end of the pan-tilt. When each rotation angle of the pan-tilt is 0, the three-axis directions are parallel to those of the satellite body frame [40].
(4). Camera Frame . is located at the optical center of the camera. points along the direction of the camera’s optical axis and is perpendicular to the camera’s imaging plane. and are parallel to the horizontal and vertical directions of the camera’s field of view [3].
(5). Image Plane Frame . is located at the top-left corner of the imaging plane, with the axis and axis parallel to and respectively. The center point of the image is [3].
2.2. Observation Model
2.2.1. Observation Equation in Earth Inertial Frame
Let denote the position of the target relative to the satellite body expressed in frame. Then, according to the coordinate transformation relationship, we have:
(1)
where denotes the homogeneous transformation matrix from Frame to Frame . , , and represent the rotation matrix, the position of the satellite, and the position of the target in the inertial frame, respectively. In this paper, it is assumed that the satellite body remains Earth-pointing.
2.2.2. Camera External Observation Model
Let denote the homogeneous transformation matrix from Frame to Frame , which represents the relative motion of the pan-tilt relative to the satellite body and can be calculated according to the robot kinematics [40]. Then, it holds:
(2)
Let denote the homogeneous transformation matrix from Frame to Frame , which describes the installation position of the camera relative to the pan-tilt. Then, the position of the target in the frame , denoted by , can be expressed as:
(3)
2.2.3. Camera Internal Observation Model
The relationship between the position of the target in the camera coordinate system and its pixel coordinates in the image is as follows:
(4)
where is the depth information of the target in Frame , and is the camera intrinsic parameter matrix, which is defined as:
(5)
where is the focal length, and denote the sizes of unit pixels in the and directions. represents the angle between the two directions, which is usually 90°.
2.2.4. Observation Equation in Camera Frame
By combining Equations (1)–(4), we obtain:
(6)
where denotes the projection coordinates of the target on the image plane. represents the camera projection matrix, which is a comprehensive description of the camera's internal optical parameters and installation position. When the camera is not calibrated, the values of its constituent elements are not accurately known. The projection matrix can be expressed in a block form as follows:
(7)
Then, Equation (6) can be rewritten as:
(8)
By differentiating Equation (8) and exploiting the property of vector cross product, we obtain:
(9)
Similarly,
(10)
where is the joint angular velocity of the pan-tilt and denotes the skew-symmetric operator, with the expression of
(11)
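To make the observation chain of Equations (1)–(6) concrete, the following is a minimal numerical sketch that maps an inertial target position to pixel coordinates, assuming a yaw-then-pitch pan-tilt with negligible link offsets and a standard pinhole intrinsic matrix. All function and variable names here are illustrative and not the paper's notation.

```python
import numpy as np

def rot_yaw(a):
    """Rotation about the pan-tilt yaw axis (assumed to be the z axis)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_pitch(a):
    """Rotation about the pan-tilt pitch axis (assumed to be the y axis)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def homogeneous(R, t=np.zeros(3)):
    """Assemble a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def project_target(p_target_I, p_sat_I, R_inertial_to_body, yaw, pitch, T_pan_cam, K):
    """Map an inertial target position to pixel coordinates (in the spirit of Eqs. (1)-(6))."""
    # Eq. (1): target position relative to the satellite, expressed in the body frame.
    p_body = R_inertial_to_body @ (np.asarray(p_target_I) - np.asarray(p_sat_I))

    # Eq. (2): pose of the pan-tilt frame in the body frame (yaw rotation, then pitch).
    T_body_pan = homogeneous(rot_yaw(yaw) @ rot_pitch(pitch))

    # Eq. (3): target position in the camera frame; T_pan_cam is the camera
    # installation pose expressed in the pan-tilt frame.
    p_h = np.append(p_body, 1.0)
    p_cam = (np.linalg.inv(T_pan_cam) @ np.linalg.inv(T_body_pan) @ p_h)[:3]

    # Eq. (4): pinhole projection; z_cam is the target depth in the camera frame.
    z_cam = p_cam[2]
    uv = (K @ p_cam) / z_cam
    return uv[:2], z_cam
```

With the theoretical values in Table 3, a standard pinhole intrinsic matrix would be built as K = [[f/du, 0, u0], [0, f/dv, v0], [0, 0, 1]], where (u0, v0) is the principal point; this construction is an assumption about the form of Equation (5) rather than a quotation of it.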
2.3. Dynamic Model of the Pan-Tilt Camera System
By simplifying the structure of the pan-tilt into a two-link system, the dynamic model of the 2 DOF system can be described using the classical Lagrangian dynamic equation as [41]:
(12)
where and represent the control torque and disturbance torque applied to the joints, respectively. , , and are the inertia matrix, Coriolis matrix, and gravity matrix of the pan-tilt system, respectively. is the joint angle vector with respect to the two joints. The dynamic coupling effect between the pan-tilt system and the satellite body is calculated according to the method described in Ref. [42].
2.4. Linearization of the Motion of Near-Circular Targets
The space target operates in a near-circular orbit, with its motion governed by two-body orbital dynamics. Its geocentric distance and the spatial orientation of its orbital plane have been accurately determined through preliminary orbit determination, while certain measurement errors remain in the target's initial position and velocity within the orbital plane. This is formalized in the following assumption.
Assumption 1. The space target is in a near-circular orbit, with its motion described by two-body dynamics. Although the orbit's geocentric distance and the orientation of its orbital plane (e.g., inclination i, right ascension Ω) are accurately measured from preliminary orbit determination, uncertainties exist in the target's initial in-plane position and velocity.
Based on Assumption 1, the motion of the target in the Earth inertial frame can be simplified as a planar circular motion. The position of the target in the orbital plane at any time can be described by the current phase. Assuming the initial measured phase of the target in the orbital plane is (which is also called the argument of latitude), the position of the target in the orbital plane at time t according to the measured phase can be represented as , where is the orbital angular velocity of the target and R is the orbit radius. Then, the position of the target in the Earth Centered Inertial Frame can be expressed as
(13)
Differentiating Equation (13), we obtain the expression of the target’s velocity as
(14)
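As a concrete illustration of the simplified target model in Equations (13) and (14), the sketch below computes the in-plane circular motion from the measured phase and maps it into the inertial frame through the (accurately known) inclination and right ascension of the ascending node. The variable names and the use of the standard Earth gravitational parameter are illustrative assumptions, not the paper's notation.

```python
import numpy as np

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def target_state_inertial(R, phi0, incl, raan, t):
    """Circular in-plane motion of the target (cf. Eqs. (13)-(14)).

    R    : orbit radius (m), assumed accurately known
    phi0 : measured initial phase (argument of latitude, rad) -- the uncertain quantity
    incl : inclination (rad); raan : right ascension of the ascending node (rad)
    """
    n = np.sqrt(MU_EARTH / R**3)        # orbital angular velocity of the target
    phi = phi0 + n * t                  # current phase in the orbital plane

    # Position and velocity in the orbital-plane frame (third axis normal to the plane).
    r_plane = R * np.array([np.cos(phi), np.sin(phi), 0.0])
    v_plane = R * n * np.array([-np.sin(phi), np.cos(phi), 0.0])

    # Rotation from the orbital-plane frame to the inertial frame (3-1 rotation sequence).
    cO, sO = np.cos(raan), np.sin(raan)
    ci, si = np.cos(incl), np.sin(incl)
    R_IO = np.array([[cO, -sO * ci,  sO * si],
                     [sO,  cO * ci, -cO * si],
                     [0.0,      si,       ci]])
    return R_IO @ r_plane, R_IO @ v_plane
```

In this formulation, only phi0 carries the in-plane positioning uncertainty discussed above; the radius and the two plane-orientation angles are treated as known, consistent with Assumption 1.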
Through the aforementioned transformation, the variable containing uncertain position parameters has been decoupled from other accurately known orbital elements. The following section will further decouple the parameters representing camera uncertainties and target position uncertainties from the expression of the visual velocity (Equations (9) and (10)) via linearization, so as to facilitate the design of the parameters’ update law.
Theorem 1. For , , and , there exist the regression matrices , , , and , respectively, such that the following equations hold:
(15)
(16)
where is the parameter denoting the uncertainties in the camera parameters and the target's position.
Theorem 2. For and , there exist the regression matrices and , respectively, such that the following equation holds:
(17)
Theorem 3. For the depth information of the target in the camera frame , there exists a regression matrix , such that
(18)
where is the parameter denoting the uncertainties in the camera parameters and the target's position. The proofs of Theorems 1–3 are provided in Appendix A.
By summarizing the above results, the following linearization relationship can be obtained:
(19)
(20)
3. Design of Adaptive Visual Tracking Method
Note: This paper assumes that the onboard image processing platform has extracted features from the video stream captured by the camera and obtained the target’s real-time coordinates in the image plane. These coordinates are transmitted to the onboard control processing unit, which generates visual tracking control commands based on the received coordinate information. The commands are then sent to the actuator of the pan-tilt system. Following these instructions, the pan-tilt actuator executes corresponding attitude maneuvers to adjust the camera’s optical axis orientation, thereby changing the coordinates of the target’s projection point and achieving visual tracking observation of the target, as shown in Figure 3.
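The processing chain described above (and shown in Figure 3) can be summarized as one control cycle per image frame. The following sketch is only a schematic outline of that loop under our own naming; the two callables stand in for the parameter update law and visual tracking law designed later in this section (Equations (38) and (39)) and are not their explicit forms.

```python
import numpy as np

def tracking_cycle(pixel_coords, desired_pixel, theta_hat, q, dq, dt,
                   parameter_update, tracking_law):
    """One control cycle of the pipeline sketched in Figure 3 (schematic).

    pixel_coords     : target projection extracted by the onboard image processor
    desired_pixel    : desired projection point (e.g., the image center)
    theta_hat        : current estimate of the uncertain camera/position parameters
    q, dq            : pan-tilt joint angles and angular velocities
    parameter_update : callable playing the role of an update law like Eq. (38)
    tracking_law     : callable playing the role of a tracking law like Eq. (39)
    """
    # 1. Image tracking error from the extracted projection coordinates.
    e = np.asarray(pixel_coords, float) - np.asarray(desired_pixel, float)

    # 2. Adaptive update of the uncertain parameters using image feedback.
    theta_hat = theta_hat + dt * parameter_update(e, theta_hat, q, dq)

    # 3. Visual tracking law computes the joint torques sent to the pan-tilt actuator,
    #    which adjusts the boresight and hence the projection at the next frame.
    tau = tracking_law(e, theta_hat, q, dq)
    return tau, theta_hat
```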
3.1. Estimation of Visual Velocity
The rate of change of a target's pixel coordinates on the image plane is called the visual velocity [26], and it is typically obtained by differentiating the projection coordinates. However, direct differentiation can be inaccurate due to the limited frame rate of video satellite cameras and inherent image sensor noise, which differentiation tends to amplify. To circumvent this issue, this paper avoids the differential calculation by introducing an estimated visual velocity, denoted by . Its calculation expression is given as
(21)
where is a constant. Let denote the estimation error of the visual velocity at time instant t; then it can be deduced that
(22)
By substituting Equation (22) into Equation (21), we obtain
(23)
where is the image tracking error. Multiplying both sides by yields
(24)
Adding to both sides of the equation yields
(25)
By combining Equation (20), we obtain
(26)
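One standard construction that realizes this idea of estimating the visual velocity without explicitly differencing noisy pixel measurements is a first-order filtered derivative. The sketch below is such a construction under our own naming; it is not claimed to be identical to Equation (21), whose exact form involves the constant listed in Table 4.

```python
import numpy as np

class VisualVelocityEstimator:
    """First-order 'dirty derivative' estimate of the visual velocity.

    A generic construction consistent with the idea of avoiding direct
    differentiation of noisy pixel coordinates; names are illustrative.
    """
    def __init__(self, alpha, y0):
        self.alpha = alpha                 # filter time constant (s)
        self.z = np.asarray(y0, float)     # internal filter state, initialized at y(0)

    def update(self, y, dt):
        """Return the estimated visual velocity for a new pixel measurement y."""
        v_hat = (np.asarray(y, float) - self.z) / self.alpha
        self.z = self.z + dt * v_hat       # integrate the filter state (no differencing of y)
        return v_hat
```

For instance, with the 0.05 s image sampling interval given in Section 4.1 and a filter constant of the same order as the value listed in Table 4, the estimator returns a smoothed velocity from the raw projection coordinates.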
3.2. Reference Trajectory Design
Inspired by Ref. [26], which introduced a reference projection trajectory to dynamically adjust the convergence speed, this paper defines a new reference trajectory based on the estimated projection coordinates as:
(27)
Introducing the matrix with the expression of
(28)
The reference angular velocity of the pan-tilt system is defined as
(29)
The tracking error of the angular velocity is defined as
(30)
Combining Equations (28) and (29), we obtain
(31)
For the first term , combining Equation (10) yields
(32)
For the second term , combining Equation (9) yields
(33)
Substituting Equations (32) and (33) into Equation (31) yields
(34)
According to Equation (28), we have
(35)
Combining Equations (34) and (35) yields
(36)
3.3. Visual Tracking Method Design
According to Equation (29), to ensure the existence of the reference angular velocity, it is necessary to ensure that the inverse of exists. By constructing a potential function and incorporating its gradient into the parameter update law, the parameters are adaptively updated in a direction that avoids making singular. The potential function is defined with the determinant of as
(37)
It can be seen from Equation (37) that when , takes the maximum value. As gradually increases, the potential function gradually decreases. The parameters’ update laws are designed as
(38)
where , , and are all positive definite diagonal matrices. The visual tracking law is designed as
(39)
where and are positive definite diagonal matrices.
Theorem 4. Under the control law (39) with the parameter update law (38), the closed-loop dynamic system (12) is asymptotically stable, and the image tracking error of the target's projection converges to 0.
Proof. The Lyapunov function is designed as follows:
(40)
Taking the derivative of yields
(41)
Taking the derivative of yields
(42)
Combining Equation (26) yields
(43)
Taking the derivative of yields
(44)
Combining Equations (38) and (39) and (41)–(44), we obtain
(45)
Further simplification and transformation of the above equation yields
(46)
When the following condition holds,
(47)
That is,
(48)
We have , where and denote the minimum and maximum eigenvalues of matrix , respectively. Therefore, is upper bounded. According to Equation (40), it can be inferred that the components of the Lyapunov function, including , , , and , are all bounded. Due to the boundedness of the pan-tilt's angular velocity and of the true values of the parameters, , and are also bounded.
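For reference, the standard form of Barbalat's lemma invoked in the next step can be stated as follows (this statement is standard and not specific to this paper):

```latex
\textbf{Barbalat's Lemma.} Let $f:[0,\infty)\to\mathbb{R}$ be uniformly continuous and suppose that
$\lim_{t\to\infty}\int_{0}^{t} f(\tau)\,\mathrm{d}\tau$ exists and is finite. Then $f(t)\to 0$ as $t\to\infty$.
% Typical Lyapunov-style corollary used here: if $V(t)$ is lower bounded, $\dot V(t)\le 0$,
% and $\dot V(t)$ is uniformly continuous (e.g., $\ddot V$ is bounded), then $\dot V(t)\to 0$.
```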
It can be inferred that the second-order derivative of the Lyapunov function depends on , , , and . Since , , , , and are all bounded, is also bounded. Therefore, is uniformly continuous [43]. According to Barbalat's lemma, we have
(49)
Therefore, we have
(50)
The Proof of Theorem 4 is completed. □
4. Results and Discussion
In this section, the designed adaptive visual tracking method (39) is compared with the traditional position-based tracking method and the traditional image-based tracking method.
The position-based tracking method was designed according to Ref. [11], with the expression of
(51)
where and are positive definite diagonal matrices. and are the desired angle and angular velocity of the pan-tilt system, respectively. Their values are computed based on the attitude error defined in Ref. [11], the kinematics of the pan-tilt, and the measured target position.
The image-based tracking method was designed according to Ref. [44], with the expression of
(52)
Please refer to Appendix B for the specific controller design process and parameter meanings.
4.1. Simulation Parameters Setting
The orbit elements of the video satellite at the initial moment are shown in Table 1, and the orbit elements of the target operating in a near-circular orbit, together with the initial estimation error, are shown in Table 2. The theoretical and real parameters of the onboard camera are shown in Table 3. The parameters of the proposed adaptive tracking method are shown in Table 4. The parameters of the position-based tracking method are and , and the parameters of the traditional image-based method (52) are shown in Table 5. The initial attitude quaternion and angular velocity of the video satellite are and . The physical parameters and the initial motion state of the pan-tilt system are shown in Table 6. To better simulate the sensor noise present in practical imaging, a deviation is introduced to the actual projection points of the target on the image plane, with the standard deviation in both horizontal and vertical directions set to . The image sampling time interval of the onboard camera is 0.05 s. The external disturbances of the pan-tilt system are set to and the external environment disturbance torque exerted on the satellite body is set to .
In order to quantitatively analyze the image stability of the target tracking performance of each method, an image stability index composed of pixel deviation between the target projection point and the expected point is defined as follows:
(53)
It can be seen that this index can measure the average image error during the time period from to . In the simulation, = 10 s is set as the end of the simulation, and is set as 6 s.
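As a concrete reading of this index, the sketch below computes the mean pixel deviation between the target projection and the expected point over the window from 6 s to 10 s stated above. The averaging form (mean Euclidean pixel error) is our assumption about Equation (53), and the function and variable names are illustrative.

```python
import numpy as np

def image_stability_index(t, uv, uv_desired, t1=6.0, t2=10.0):
    """Average image error over [t1, t2] (one plausible reading of Eq. (53)).

    t          : array of sample times (s)
    uv         : Nx2 array of target projection coordinates (pixel)
    uv_desired : desired projection point, e.g., the image center (pixel)
    """
    t = np.asarray(t)
    mask = (t >= t1) & (t <= t2)                             # steady-state window
    err = np.linalg.norm(np.asarray(uv)[mask] - np.asarray(uv_desired), axis=1)
    return err.mean()                                        # mean pixel deviation
```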
4.2. Simulation Results and Analysis
4.2.1. Simulation Results of the Two Traditional Methods
The simulation results of the position-based tracking method (51), including the variation curves of the angular velocity and the control torque of the pan-tilt system, as well as the target's projection coordinates and trajectories on the image plane, are shown in Figure 4, Figure 5, Figure 6 and Figure 7, and the results of the traditional image-based tracking method (52) are shown in Figure 8, Figure 9, Figure 10 and Figure 11, respectively.
It can be seen that, due to the identical initial conditions, the coordinates of the target's projection on the image plane are the same at the initial moment (slight differences may exist due to image measurement noise). Because of the deviation between the target projection point and the image center point, the image-based tracking method (52) generates real-time instructions from the image deviation between the current image coordinates and the center point. By adjusting the pan-tilt's two axes, the target projection gradually approaches the image center point, and ultimately the target projection stabilizes in the area near the center point (the green center point in Figure 10). For the position-based tracking method (51), from Figure 4, Figure 5, Figure 6 and Figure 7 it can be seen that the onboard camera's pointing also reaches a stable state, but the target projection gradually moves towards the edge of the image plane and eventually disappears from the camera's field of view, resulting in target tracking failure.
The reasons for the difference in tracking performance between the two methods are analyzed as follows. For the position-based tracking method (51), the expected angle and angular velocity of the pan-tilt are directly calculated from the target's position measurement and the camera parameters. Due to deviations in these two critical factors, the control system generates biased control instructions. In this state, there is a significant directional deviation between the camera's optical axis and the target direction, resulting in the loss of the target image. For the image-based tracking method (52), in contrast, the expected state does not directly depend on the target position information but is calculated in real-time from the deviation between the target's projection coordinates in the image plane and the expected projection point. Its tracking performance is affected only by the uncertainty in the camera parameters, whose influence is smaller than when both uncertainties exist simultaneously. Therefore, the target ultimately remains near the center of the camera's field of view, achieving more effective visual tracking of the target.
4.2.2. Simulation Results of the Adaptive Method
The results of the adaptive tracking method (39) proposed in this paper are shown in Figure 12, Figure 13, Figure 14 and Figure 15. It can be seen that under the action of the adaptive tracking method, the target eventually converges to the center position of the imaging plane, achieving high precision visual tracking of the target. By comparing Figure 10 and Figure 14, it can be seen that under the adaptive tracking method, the distance between the target’s final projection point and the expected projection point is smaller than that of the traditional image-based tracking method, indicating that the adaptive method has a higher tracking accuracy.
Table 7 lists the calculation results of the image stability index (53) for the different tracking methods under various simulation conditions. In the table, Condition 1 involves the presence of uncertainties in both camera parameters and target position; Condition 2 involves only camera parameter uncertainty; Condition 3 involves only target position uncertainty; and Condition 4 is free from any uncertainty (i.e., camera parameters are accurately calibrated, and the target position is precisely measured). It can be seen that when the camera is calibrated and the target's position is precisely measured, the stability indices of the three methods are relatively small, indicating that in ideal situations without uncertainties, all three methods can achieve effective visual tracking of the target with high accuracy. When there is uncertainty in the target's position, the stability index of the position-based controller rapidly increases, indicating that the tracking error rapidly grows, while that of the image-based controller remains approximately unchanged (the slight numerical differences are caused by image noise). This is because the input of the image-based controller comes entirely from the projection coordinates of the target, so the measurement deviation of the target's position within the orbital plane does not affect the tracking accuracy. When uncertainties exist in the camera parameters, however, both the position-based method (51) and the image-based method (52) show a significant increase in the image stability index, with the image-based method showing the smaller increase, indicating that the image-based method is more robust to camera parameter uncertainties. Notably, the method (39) proposed in this paper yields the smallest image stability index across all four conditions, with minimal variation under the different uncertainty conditions. This indicates that the proposed method possesses strong robustness when confronted with both types of uncertainties.
4.2.3. Robustness Test and Discussion of the Adaptive Method
To further investigate the robustness of the proposed tracking method against the two types of uncertainties, this subsection first examines the impact of varying target initial position errors on tracking accuracy by systematically adjusting the magnitude of these deviations. The final image stability indices are recorded in Table 8. The camera parameter settings remain consistent with the previous section.
A comparison of the data in the table reveals that, for the proposed visual tracking method, the image stability index increases as the initial target positioning error grows under camera parameter uncertainty conditions. This indicates that the tracking accuracy of the proposed method degrades to some extent when the initial target positioning error becomes large. However, even when the positioning error angle increases tenfold, the image tracking accuracy remains higher than that of the traditional image-based method, demonstrating the robustness of the proposed method against initial target positioning deviations. This can be attributed to the fact that the initial positioning bias affects only the initial parameter estimates. As the tracking process proceeds, the proposed method adaptively updates the parameters using the target projection information, gradually compensating for the influence of the initial positioning deviation and ultimately achieving high-precision visual tracking of the target.
Next, an extended study was conducted to investigate the adaptability of the proposed method under uncertainties in the orientation of the orbital plane. Assuming initial positioning errors of 0.2° in both the orbital inclination and the right ascension of the ascending node while keeping the other simulation conditions consistent with those in Section 4.2.2, the obtained image stability index was 8.6235. Compared with the results in Section 4.2.2, the tracking accuracy showed a certain degree of degradation, yet remained higher than that of traditional image-based tracking method, whose value is 35.1385. The primary reason for the decline in accuracy lies in the fact that deviations in the orientation of the orbital plane introduce a form of model error, which affects the direction of parameter updates and the final estimated values. However, since the proposed method is essentially an image-feedback-based control strategy, its control law explicitly incorporates the image tracking error . Consequently, even in the presence of model errors, the image feedback term can drive the camera’s optical axis in real-time toward reducing the tracking error, thereby helping to maintain the system’s tracking performance to some extent.
It should be noted that factors such as actual orbital perturbations and non-circular motion may further affect the tracking accuracy of the proposed method. How to maintain high-precision tracking under more complex dynamic scenarios is a key issue to be addressed in future research.
5. Conclusions
This paper proposes an adaptive visual tracking method to address the visual tracking problem for video satellites subject to uncertainties in both the camera parameters and the target's position. Comparisons with the traditional position-based and image-based tracking methods demonstrate that the position-based method completely fails under these dual uncertainties. In contrast, the proposed adaptive tracking method effectively mitigates their negative impact, reducing the steady-state image deviation by approximately an order of magnitude compared to the traditional image-based approach. These results demonstrate the superior robustness and precision of the proposed method in handling simultaneous uncertainties in the camera parameters and the target's position.
For future work, it would be valuable to relax the constraints on the target’s motion characteristics, such as investigating adaptive tracking for space targets in arbitrary orbits to further broaden the method’s applicability. Furthermore, additional considerations can be given to conditions such as image measurement noise suppression [45] and satellite-ground communication constraints [46]. The design of corresponding observers and control methods can be developed to better meet the requirements of different missions.
Author Contributions: Conceptualization, Z.Z., C.F. and H.S.; Methodology, Z.Z. and C.F.; Software, Z.Z.; Validation, Z.Z.; Formal analysis, Z.Z.; Investigation, Z.Z.; Resources, C.F. and H.S.; Data curation, Z.Z., C.F. and H.S.; Writing—original draft, Z.Z. and C.F.; Writing—review & editing, Z.Z. and C.F.; Visualization, C.F. and H.S.; Supervision, C.F. and H.S.; Project administration, C.F. and H.S.; Funding acquisition, C.F. and H.S. All authors have read and agreed to the published version of the manuscript.
Data Availability Statement: The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.
Conflicts of Interest: The authors declare no conflicts of interest.
Figure 1 Schematic Diagram of the Video Satellite Equipped with a Pan-tilt Camera.
Figure 2 Diagram of the Frames.
Figure 3 Flowchart of the control method.
Figure 4 Angular velocity variation curve of the pan-tilt with position-based tracking method (51).
Figure 5 Control torque variation curve of position-based tracking method (51).
Figure 6 Target projection coordinates on the image plane with the position-based tracking method (51).
Figure 7 Target projection trajectories on the image plane with the position-based tracking method (51).
Figure 8 Angular velocity variation curve of the pan-tilt with image-based tracking method (52).
Figure 9 Control torque variation curve of the image-based tracking method (52).
Figure 10 Target projection coordinates on the image plane with the image-based tracking method (52).
Figure 11 Target projection trajectories on the image plane with the image-based tracking method (52).
Figure 12 Angular velocity variation curve of the pan-tilt with the adaptive tracking method (39).
Figure 13 Control torque variation curve of the adaptive tracking method (39).
Figure 14 Target projection coordinates on the image plane with the adaptive tracking method.
Figure 15 Target projection trajectories on the image plane with the adaptive tracking method (39).
Orbit elements of the video satellite at the initial moment.
| Parameter (Unit) | a (km) | e | i (deg) | Ω (deg) | ω (deg) | f (deg) |
|---|---|---|---|---|---|---|
| Value | 6878.14 | 0 | 45.0487 | 156.6496 | 270.0358 | 213.3153 |
Orbit elements of the near-circular orbit target at the initial moment.
| Parameter (Unit) | a (km) | e | i (deg) | Ω (deg) | ω + f (deg) | Δ(ω + f) (deg) |
|---|---|---|---|---|---|---|
| Value | 6993.14 | 0 | 45.0672 | 146.6371 | 133.2061 | 0.2 |
Theoretical and real values of the onboard camera.
| Camera Parameters | Theoretical Values (Unit) | Real Values (Unit) |
|---|---|---|
| f | 1 (m) | 1.1 (m) |
| | (640, 480) (pixel) | (660, 465) (pixel) |
| | 90 (deg) | 89.8 (deg) |
| | 5 (μm) | 5.1 (μm) |
| | 5 (μm) | 5.05 (μm) |
| | | |
Parameters of the adaptive tracking method.
| Parameters | Value |
|---|---|
| | |
| | |
| | |
| | |
| λ | 1 |
| α | 0.5 |
| | |
| | |
Parameters of the image-based tracking method (52).
| Parameters | Value |
|---|---|
| | |
| | |
| | |
| | |
Physical parameters and initial motion state of the pan-tilt system.
| Parameters | Value (Unit) |
|---|---|
| | 0.08 (kg·m2) |
| | 0.08 (kg·m2) |
| | 0.5 (kg) |
| | |
| | |
Image stability index (53) of different tracking methods under different conditions.
| Tracking Methods | Condition 1 | Condition 2 | Condition 3 | Condition 4 |
|---|---|---|---|---|
| Position-based method (51) | 6.6179 × 103 | 795.4040 | 6.1927 × 103 | 8.5417 |
| Image-based method (52) | 35.1385 | 35.1367 | 9.8137 | 9.8144 |
| Adaptive method (39) | 3.3927 | 3.3437 | 3.3679 | 3.3120 |
Image stability index (53) under different initial position uncertainties.
| Δ(ω + f) (deg) | Image Stability Index (53) |
|---|---|
| 0 | 3.3120 |
| 0.1 | 3.3814 |
| 0.2 | 3.3927 |
| 1 | 3.6696 |
| 2 | 6.9395 |
Appendix A
It can be seen that Equations (15) and (16) share an analogous structure. The proof for Theorem 1 will be detailed step-by-step below using Equation (15) as an example.
According to Equations (9) and (10), by denoting
Let
We obtain
Similarly, for
Let
We have
The Proof of Theorem 1 is completed. □
According to Equations (9) and (10), we have
Combining with Equation (14), we obtain
Let
It can be derived that
The Proof of Theorem 2 is completed. □
Combining Equations (8) and (13), we obtain
Let
The Proof of Theorem 3 is completed. □
Appendix B
Let
By taking the derivative of
By introducing the generalized velocity
Let
The derivative of
An observer to estimate the unknown target’s velocity
In visual tracking of the space target, the desired image feature is set to
Taking the derivative of
The desired angular velocity of the pan-tilt is defined as
1. Yao, J.; Xu, B.; Li, X.; Yang, S. A clustering scheduling strategy for space debris tracking. Aerosp. Sci. Technol.; 2025; 157, 109805. [DOI: https://dx.doi.org/10.1016/j.ast.2024.109805]
2. Li, G.; Liu, J.; Jiang, H.; Liu, C. Research on the Efficient Space Debris Observation Method Based on Optical Satellite Constellations. Appl. Sci.; 2023; 13, 4127. [DOI: https://dx.doi.org/10.3390/app13074127]
3. Song, C.; Fan, C.; Wang, M. Image-Based Adaptive Staring Attitude Control for Multiple Ground Targets Using a Miniaturized Video Satellite. Remote Sens.; 2022; 14, 3974. [DOI: https://dx.doi.org/10.3390/rs14163974]
4. Fan, C.; Wang, M.; Song, C.; Zhong, Z.; Yang, Y. Anti-off-Target Control Method for Video Satellite Based on Potential Function. J. Syst. Eng. Electron.; 2024; 35, pp. 1583-1593. [DOI: https://dx.doi.org/10.23919/JSEE.2024.000098]
5. Zhang, X.; Xiang, J.; Zhang, Y. Space Object Detection in Video Satellite Images Using Motion Information. Int. J. Aerosp. Eng.; 2017; 2017, 1024529. [DOI: https://dx.doi.org/10.1155/2017/1024529]
6. Xiao, A.; Wang, Z.; Wang, L.; Ren, Y. Super-Resolution for “Jilin-1” Satellite Video Imagery via a Convolutional Network. Sensors; 2018; 18, 1194. [DOI: https://dx.doi.org/10.3390/s18041194] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/29652838]
7. Zhang, S.; Yuan, Q.; Li, J. Video Satellite Imagery Super Resolution for ‘Jilin-1’ via a Single-and-Multi Frame Ensembled Framework. Proceedings of the IGARSS 2020—2020 IEEE International Geoscience and Remote Sensing Symposium; Virtual, 26 September–2 October 2020; pp. 2731-2734.
8. Julzarika, A. Utilization of LAPAN Satellite (TUBSAT, A2, and A3) in supporting Indonesia’s potential as maritime center of the world. IOP Conf. Ser. Earth Environ. Sci.; 2017; 54, 012097. [DOI: https://dx.doi.org/10.1088/1755-1315/54/1/012097]
9. Bhushan, S.; Shean, D.; Alexandrov, O.; Henderson, S. Automated digital elevation model (DEM) generation from very-high-resolution Planet SkySat triplet stereo and video imagery. ISPRS J. Photogramm. Remote Sens.; 2021; 173, pp. 151-165. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2020.12.012]
10. Rahayu, D.A.; Nugroho, M.; Ferdiansyah, N.; Amiludin, M.F.; Hakim, P.R.; Harsono, S.D. Development of Ground Station Performance Information System for LAPAN Satellite Operations. Proceedings of the 2021 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology (ICARES); Virtual, 3–4 November 2021; pp. 1-7.
11. Lian, Y.; Gao, Y.; Zeng, G. Staring Imaging Attitude Control of Small Satellites. J. Guid. Control. Dyn.; 2017; 40, pp. 1278-1285. [DOI: https://dx.doi.org/10.2514/1.G002197]
12. Han, S.; Ahn, J.; Tahk, M.-J. Analytical Staring Attitude Control Command Generation Method for Earth Observation Satellites. J. Guid. Control. Dyn.; 2022; 45, pp. 1347-1356. [DOI: https://dx.doi.org/10.2514/1.G006041]
13. Li, H.; Zhao, Y.; Li, B.; Li, G. Attitude Control of Staring-Imaging Satellite Using Permanent Magnet Momentum Exchange Sphere. Proceedings of the 2019 22nd International Conference on Electrical Machines and Systems (ICEMS); Harbin, China, 11–14 August 2019; pp. 1-6.
14. Li, C.; Geng, Y.; Guo, Y.; Han, P. Suboptimal Repointing Maneuver of a staring-mode spacecraft with one DOF for final attitude. Acta Astronaut.; 2020; 175, pp. 349-361. [DOI: https://dx.doi.org/10.1016/j.actaastro.2020.04.040]
15. Geng, Y.; Li, C.; Guo, Y.; Biggs, J.D. Hybrid robust and optimal control for pointing a staring-mode spacecraft. Aerosp. Sci. Technol.; 2020; 105, 105959. [DOI: https://dx.doi.org/10.1016/j.ast.2020.105959]
16. Niu, X.; Lu, B.; Feng, B.; Li, Q. Linear parameter-varying gain-scheduled preview-based robust attitude control design for a staring-mode satellite. Aerosp. Sci. Technol.; 2022; 129, 107816. [DOI: https://dx.doi.org/10.1016/j.ast.2022.107816]
17. Zdešar, A.; Klančar, G.; Mušič, G.; Matko, D.; Škrjanc, I. Design of the image-based satellite attitude control algorithm. Proceedings of the 2013 XXIV International Conference on Information, Communication and Automation Technologies (ICAT); Sarajevo, Bosnia and Herzegovina, 30 October–1 November 2013; pp. 1-8.
18. Zhang, X.; Xiang, J.; Zhang, Y. Tracking imaging attitude control of video satellite for cooperative space object. Proceedings of the 2016 IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC); Xi’an, China, 3–5 October 2016; pp. 429-434.
19. Felicetti, L.; Emami, M.R. Image-based attitude maneuvers for space debris tracking. Aerosp. Sci. Technol.; 2018; 76, pp. 58-71. [DOI: https://dx.doi.org/10.1016/j.ast.2018.02.002]
20. Pei, W. Staring Imaging Attitude Tracking Control Laws for Video Satellites Based on Image Information by Hyperbolic Tangent Fuzzy Sliding Mode Control. Comput. Intell. Neurosci.; 2022; 2022, 8289934. [DOI: https://dx.doi.org/10.1155/2022/8289934] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36110911]
21. Li, P.; Dong, Y.; Li, H. Staring Imaging Real-Time Optimal Control Based on Neural Network. Int. J. Aerosp. Eng.; 2020; 2020, 8822223. [DOI: https://dx.doi.org/10.1155/2020/8822223]
22. Wang, M.; Fan, C.; Song, C. Image-Based Visual Tracking Attitude Control Research on Small Video Satellites for Space Targets. Proceedings of the 2022 IEEE International Conference on Real-Time Computing and Robotics (RCAR); Guiyang, China, 17–22 July 2022; pp. 174-179.
23. Fan, C.; Zhong, Z.; Wang, M.; Yang, Y. Anti-off-target control of target tracking for small video satellite based on field of view zoning. J. Natl. Univ. Def. Technol.; 2025; 47, pp. 98-108.
24. Wang, M.; Song, C.; Fan, C.; Zhang, Y. Image-Based Sliding Mode Attitude Control Research on Small Video Satellites. Proceedings of the 8th China High Resolution Earth Observation Conference (CHREOC 2022); Beijing, China, 5–8 November 2022; Springer: Singapore, 2023; pp. 135-149.
25. Wang, C.; Fan, C.; Song, H.; Zhong, Z.; Zhang, Y. Non-cooperative Target Tracking Control for Video Satellites Based on Genetic Algorithm. Proceedings of the Advances in Guidance, Navigation and Control; Changsha, China, 9–11 August 2024; Springer: Singapore, 2025; pp. 610-619.
26. Fan, C.; Song, C.; Zhong, Z. Video Satellite Staring Control of Ground Targets Based on Visual Velocity Estimation and Uncalibrated Cameras. Remote Sens.; 2025; 17, 1116. [DOI: https://dx.doi.org/10.3390/rs17071116]
27. Song, C.; Fan, C.; Song, H.; Wang, M. Spacecraft Staring Attitude Control for Ground Targets Using an Uncalibrated Camera. Aerospace; 2022; 9, 283. [DOI: https://dx.doi.org/10.3390/aerospace9060283]
28. Zhang, Z.; Zhang, G.; Cao, J.; Li, C.; Chen, W.; Ning, X.; Wang, Z. Overview on Space-Based Optical Orbit Determination Method Employed for Space Situational Awareness: From Theory to Application. Photonics; 2024; 11, 610. [DOI: https://dx.doi.org/10.3390/photonics11070610]
29. Qu, J.; Fu, T.; Chen, D.; Cao, H.; Zhang, S. An analytical initial orbit determination method using two observations from a bistatic radar. Adv. Space Res.; 2022; 70, pp. 1949-1964. [DOI: https://dx.doi.org/10.1016/j.asr.2022.06.070]
30. Zhang, S.; Fu, T.; Chen, D.; Ding, S.; Gao, M. An Initial Orbit Determination Method Using Single-Site Very Short Arc Radar Observations. IEEE Trans. Aerosp. Electron. Syst.; 2020; 56, pp. 1856-1872. [DOI: https://dx.doi.org/10.1109/TAES.2019.2937661]
31. Armellin, R.; Di Lizia, P. Probabilistic Optical and Radar Initial Orbit Determination. J. Guid. Control. Dyn.; 2017; 41, pp. 101-118. [DOI: https://dx.doi.org/10.2514/1.G002217]
32. ESA Space Debris Office. ESA's Annual Space Environment Report; European Space Agency (ESA): Paris, France, 2025.
33. Agostinelli, I.; Goracci, G.; Curti, F. Initial orbit determination via artificial intelligence for too-short arcs. Acta Astronaut.; 2024; 222, pp. 609-624. [DOI: https://dx.doi.org/10.1016/j.actaastro.2024.06.006]
34. Hwang, H.; Park, S.-Y.; Lee, E. Angles-Only Initial Orbit Determination of Low Earth Orbit (LEO) Satellites Using Real Observational Data. J. Astron. Space Sci.; 2019; 36, pp. 187-197. [DOI: https://dx.doi.org/10.5140/JASS.2019.36.3.187]
35. Montenbruck, O.; Gill, E. Satellite Orbits: Models, Methods and Applications; Springer: Berlin/Heidelberg, Germany, 2013.
36. Vallado, D.A. Fundamentals of Astrodynamics and Applications; McGraw-Hill: Columbus, OH, USA, 1997.
37. Zhao, K. Problem and Methods for Initial Orbit Determination of Space-Based Optical Space Surveillance. Acta Aeronaut. Astronaut. Sin.; 2023; 44, 326465.
38. Zhang, H. Theories and Methods of Spacecraft Orbital Mechanics; National Defense Industry Press: Beijing, China, 2015.
39. Hughes, P.C. Spacecraft Attitude Dynamics; Dover Publications, Inc.: Mineola, NY, USA, 1986.
40. Niku, S.B. Introduction to Robotics: Analysis, Control, Applications; 3rd ed. John Wiley and Sons: Hoboken, NJ, USA, 2020; 507.
41. Liu, Y.H.; Wang, H.; Wang, C.; Lam, K.K. Uncalibrated visual servoing of robots using a depth-independent interaction matrix. IEEE Trans. Robot.; 2006; 22, pp. 804-817. [DOI: https://dx.doi.org/10.1109/tro.2006.878788]
42. Zhang, H.; Wu, Y.; Zhong, S.; Guo, H. Space target compound pointing control method based on backstepping. Syst. Eng. Electron.; 2023; 45, pp. 2884-2893. [DOI: https://dx.doi.org/10.12305/j.issn.1001-506X.2023.09.28]
43. Boyd, S.; El Ghaoui, L.; Feron, E.; Balakrishnan, V. Linear Matrix Inequalities in System and Control Theory; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 1994.
44. Corke, P. Robotics, Vision and Control: Fundamental Algorithms in MATLAB®, 2nd ed.; Springer: Cham, Switzerland, 2017.
45. Hao, L.; Zhang, Y.; Li, H. A Dynamic Event-Triggered Saturation Method for Nonlinear Estimation with Application to Drag-Free Control Systems. IEEE Trans. Aerosp. Electron. Syst.; 2025; 61, pp. 915-931. [DOI: https://dx.doi.org/10.1109/TAES.2024.3449245]
46. Zhang, H.; Liu, W.; Zhang, L.; Meng, Y.; Han, W.; Song, T.; Yang, R. An allocation strategy integrated power, bandwidth, and subchannel in a RCC network. Def. Technol. 2025, ahead of print [DOI: https://dx.doi.org/10.1016/j.dt.2025.08.010]
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).