
Abstract

What are the main findings?

A novel video satellite visual tracking method for space targets that accounts for uncertainties in both camera parameters and target position is proposed.

By adaptively estimating the uncertain parameters, the method overcomes the severe degradation in tracking accuracy that traditional methods suffer when both uncertainties are present simultaneously.

What are the implications of the main findings?

This work provides a novel framework for video satellite-based visual tracking that is robust to both camera calibration errors and target orbital uncertainties, achieving improvements in both accuracy and observation performance over conventional techniques.

It serves as a complement to ground-based remote sensing technologies, with the goal of improving space situational awareness capabilities for the near-Earth orbital environment.

Video satellites feature agile attitude maneuverability and the capability for continuous target imaging, making them an effective complement to ground-based remote sensing technologies. Existing research on video satellite tracking methods generally assumes either accurately calibrated camera parameters or precisely known target positions. However, deviations in camera parameters and errors in target localization can significantly degrade the performance of current tracking approaches. This paper proposes a novel adaptive visual tracking method for video satellites to track near-circular space targets in the presence of simultaneous uncertainties in both camera parameters and target position. First, the parameters representing these two types of uncertainties are separated through linearization. Then, based on the real-time image tracking error and the current parameter estimates, an update law for the uncertain parameters and a visual tracking law are designed. The stability of the closed-loop system and the convergence of the tracking error are rigorously proven. Finally, quantitative comparisons are conducted using a defined image stability index against two conventional tracking methods. Simulation results demonstrate that under coexisting uncertainties, traditional control methods either fail to track the target or exhibit significant tracking precision degradation. In contrast, the average image error during the steady-state phase exhibits a reduction of approximately one order of magnitude with the proposed method compared to the traditional image-based approach, demonstrating its superior tracking precision under complex uncertainty conditions.



1. Introduction

In recent years, advancements in launch vehicle technology have facilitated the deployment of large satellite constellations, such as Starlink, Telesat, and OneWeb. This has led to a rapid increase in the number of on-orbit satellites and a progressively congested orbital environment. Meanwhile, the accumulation of defunct satellites, space debris, and launch vehicle fragments—which persist in medium-low Earth near-circular orbits—poses a growing threat to the safety of operational spacecraft. Against this backdrop, tracking and monitoring near-Earth space objects is crucial for early risk warning and plays a vital role in ensuring spacecraft safety. Video satellites, which offer advantages including real-time monitoring, continuous imaging, agile attitude maneuverability, and cost-effectiveness, have emerged as an effective supplement to ground-based space remote sensing technologies [1,2]. They are widely used in stare-mode observation, space situational awareness, and disaster prevention, thereby significantly improving comprehensive situational awareness [3,4,5]. Consequently, several countries have successfully developed and launched various optical video imaging satellites in recent years [6,7,8,9,10].

Video satellites utilize onboard visible-light cameras to perform visual tracking and continuous imaging of ground or space targets. This requires the satellite’s control system to continuously adjust the boresight direction of the onboard camera, ensuring it remains aligned with the target throughout the observation process [11]. The target then appears at the center of the camera’s imaging plane, ensuring optimal observation results. At the same time, because the target’s projection lies far from the image border, the risk of the target moving out of the camera’s field of view due to relative motion is minimized, thereby guaranteeing uninterrupted continuous visual tracking. Early research on video satellite visual tracking mainly adopted position-based methods. These methods calculate real-time desired control commands from the target’s position and the current camera boresight direction [12,13,14,15,16], and they depend on comprehensive prior positional information. Consequently, they require both the target position and the camera parameters to be precisely known; even slight deviations in the real-time target position or the camera’s optical parameters can degrade tracking and observation performance.

To overcome the limitation of position-based tracking methods requiring complete prior information, scholars have shifted their focus to image-based visual tracking methods for video satellites [5,17,18,19,20,21]. Unlike position-based tracking methods, once the camera completes initial target acquisition and the onboard image processing system extracts the coordinates of the target’s projection point, the generation of tracking instructions no longer relies on precise target position information. Instead, the visual tracking instructions are calculated in real time from the target’s projection point on the image plane. This process gradually guides the adjustment of the camera’s boresight direction, moving the target’s projection toward the center of the image plane to achieve observation. Consequently, such methods can achieve stable and effective visual tracking of the target without relying on precise target position information, offering enhanced autonomy and robustness. Our team has previously investigated image-based visual tracking for video satellites and designed corresponding tracking methods [4,22,23,24,25]. In these studies, an error quaternion is defined based on the deviation between the target’s real-time projection coordinates and the desired projection coordinates on the image plane, and the tracking instructions are designed from this error quaternion. The effectiveness of these approaches relies heavily on a precise onboard camera imaging model, which encompasses both the internal optical parameters of the camera and its installation parameters relative to the satellite body. However, the in-orbit operational environment of video satellites—subject to thermal cycling, particle radiation, and micro-vibrations—can cause deviations in the camera parameters, and since in-orbit calibration of camera parameters is often cumbersome and time-consuming, such deviations significantly compromise mission flexibility.

To address the challenge of visual tracking by video satellites with an uncalibrated camera, our team investigated target staring observation under camera parameter uncertainties [3,26,27]. The designed tracking methods were verified through simulation to effectively overcome the limitations of position-based tracking methods. In those studies, it was assumed that system uncertainties originated solely from the onboard camera. By establishing a linear relationship between the estimated projection error and the camera parameters, appropriate adaptive update laws were designed to achieve real-time estimation of the camera parameter uncertainties. Consequently, this approach is applicable only to scenarios in which the target position is precisely known (e.g., observation of ground targets with known longitude and latitude). It is therefore not directly applicable to the visual tracking of space targets, whose precise positions cannot be acquired in real time.

Commonly used orbit determination methods for space targets include optical orbit determination, radar orbit determination, and multi-source fusion orbit determination. These methods are inevitably subject to certain deviations in their results due to factors such as weather, sensor noise, and time synchronization [28,29,30,31]. Related studies have shown that the vast majority of spacecraft operate in near-circular orbits [32], and that under short-term purely optical measurements, the measurement deviations of the semi-major axis, eccentricity, and true anomaly of a space target are greater than those of the orbital elements characterizing the spatial orientation of the orbital plane (the orbital inclination and the right ascension of the ascending node are strongly constrained by observations of the orbital-plane orientation) [33,34,35,36,37]. If radar ranging information is integrated, the measurement accuracy of the semi-major axis can be greatly improved, but the measurement deviation of the spacecraft’s phase angle within the orbital plane (comprising the argument of perigee and the true anomaly) remains relatively significant. This positional uncertainty within the orbital plane may further degrade the tracking performance of traditional methods, leading to tracking failures or reduced accuracy. To the best of our knowledge, however, there is currently no research on visual tracking of space targets by video satellites under simultaneous uncertainties in camera parameters and target position.

This paper addresses the performance degradation of video satellites’ visual tracking of near-circular-orbit targets under simultaneous uncertainties in camera parameters and target position. Building on our team’s previous research on the visual tracking of ground targets with known positions (Refs. [3,26,27]), which considered only camera parameter uncertainties, we extend the method’s applicability to space targets in near-circular orbits with phase-angle measurement deviations. An adaptive visual tracking method for space targets that simultaneously accounts for uncertainties in camera parameters and target position is proposed. The main work of this manuscript is as follows. First, the motion equation of a space target operating in a near-circular orbit around the Earth is linearized, and the orbit phase parameters containing uncertainties are separated. Then, the parameters representing the uncertainties in the camera parameters and the target position are extracted from the camera’s observation equation and the visual velocity equation as the variables to be estimated and are linearized, laying the foundation for adaptive parameter estimation. Next, an adaptive visual tracking law and a parameter update law based on image feedback information are designed, and the stability of the closed-loop system is rigorously proved using Barbalat’s lemma. Finally, simulation verification is conducted. In the simulation section, the proposed method is compared with the traditional position-based and image-based tracking methods, and the performance of each tracking method is quantitatively analyzed by defining an image stability index. It is verified that when uncertainties exist in both the camera parameters and the target’s orbital position, traditional methods suffer from tracking failure or significant degradation in tracking accuracy, whereas the method proposed in this paper overcomes the effects of both uncertainties and achieves higher-precision tracking of the target.

The main contributions of this article can be summarized as follows:

(1). To address the challenge of video satellite visual tracking under simultaneous uncertainties in camera parameters and target positions, this paper proposes a novel adaptive method. Unlike traditional approaches that suffer from performance degradation under such uncertainties, or existing adaptive methods that require precisely known target positions, the proposed technique estimates both types of uncertainties concurrently. It leverages real-time image feedback to update the unknown parameters and compute the visual tracking instructions, with the closed-loop system stability rigorously guaranteed.

(2). An image stability index is defined to quantitatively evaluate the tracking accuracy. Simulations comparing the proposed method with traditional position-based and image-based methods reveal that under concurrent uncertainties, the position-based method fails to track the target, while the proposed method reduces the steady-state image error by approximately an order of magnitude compared to the traditional image-based approach. This improvement enables significantly higher tracking precision despite the presence of dual uncertainties.

The remainder of this paper is structured as follows. Section 2 presents the visual tracking modeling, including the tracking observation model and motion model of the video satellite. Section 3 details the design of the adaptive staring method, which involves formulating appropriate parameters to be estimated for real-time updates of both target position parameters and camera parameters, with strict stability analysis of the closed-loop system provided. Section 4 presents the simulation analysis, and Section 5 concludes the study.

2. Problem Formulation

The configuration of the video satellite studied in this paper is shown in Figure 1. It consists of a rigid satellite platform and a camera installed on the end effector of a 2-degree-of-freedom (DOF) pan-tilt. The pan-tilt can rotate around the yaw axis and the pitch axis to adjust the pointing direction of the camera’s optical axis and realize target staring. The pan-tilt system is simplified as a 2-DOF linkage system, in which link 1 rotates around the yaw axis and link 2 rotates around the pitch axis, as shown in the area enclosed by the dashed line in Figure 2.

2.1. Definition of Coordinate Frames

The following five coordinate systems were established in this paper to describe the motion of the video satellite equipped with the pan-tilt camera, as shown in Figure 2:

(1). Earth-Centered Inertial Frame $F_i: O_iX_iY_iZ_i$. The origin $O_i$ is located at the center of the Earth. $O_iX_i$ aligns with the vernal equinox, $O_iY_i$ points to the pole of the J2000.0 mean equator, and $O_iZ_i$ is determined by the right-hand rule [38].

(2). Satellite Body Frame $F_b: O_bX_bY_bZ_b$. The origin $O_b$ is located at the center of mass of the satellite. The three unit axes align with the principal axes of inertia of the satellite body [39].

(3). Pan-Tilt Frame $F_e: O_eX_eY_eZ_e$. This frame is fixed to the end of the pan-tilt. When each rotation angle of the pan-tilt is 0, its three axes are parallel to those of the satellite body frame [40].

(4). Camera Frame $F_c: O_cX_cY_cZ_c$. $O_c$ is located at the optical center of the camera. $O_cZ_c$ points along the direction of the camera’s optical axis and is perpendicular to the camera’s imaging plane. $O_cX_c$ and $O_cY_c$ are parallel to the horizontal and vertical directions of the camera’s field of view [3].

(5). Image Plane Frame $F_p: O_puv$. $O_p$ is located at the top-left corner of the imaging plane, with the $O_pu$ and $O_pv$ axes parallel to $O_cX_c$ and $O_cY_c$, respectively. The center point of the image is $(u_0, v_0)$ [3].

2.2. Observation Model

2.2.1. Observation Equation in Earth Inertial Frame

Let $r_T^b$ denote the position of the target relative to the satellite body, expressed in frame $F_b$. Then, according to the coordinate transformation relationship, we have:

(1) $\begin{bmatrix}r_T^b\\ 1\end{bmatrix} = \begin{bmatrix}M_{ib} & -M_{ib}r_b^i\\ 0_{1\times3} & 1\end{bmatrix}\begin{bmatrix}r_T^i\\ 1\end{bmatrix} \triangleq T_{ib}\begin{bmatrix}r_T^i\\ 1\end{bmatrix}$

where $T_{ib}$ denotes the homogeneous transformation matrix from frame $F_i$ to frame $F_b$, $M_{ib}$ is the corresponding rotation matrix, and $r_b^i$ and $r_T^i$ are the positions of the satellite and the target in the inertial frame, respectively.

In this paper, it is assumed that the satellite body remains Earth-oriented pointing.

2.2.2. Camera External Observation Model

Let $T_{be} = \begin{bmatrix}M_{be} & -M_{be}r_{be}^{b}\\ 0_{1\times3} & 1\end{bmatrix}$ denote the homogeneous transformation matrix from frame $F_b$ to frame $F_e$, which represents the relative motion of the pan-tilt with respect to the satellite body and can be calculated according to the robot kinematics [40]. Then, it holds that:

(2) $\begin{bmatrix}r_T^e\\ 1\end{bmatrix} = T_{be}\begin{bmatrix}r_T^b\\ 1\end{bmatrix}$

Let $T_{ec} = \begin{bmatrix}M_{ec} & -M_{ec}r_{ec}^{e}\\ 0_{1\times3} & 1\end{bmatrix}$ denote the homogeneous transformation matrix from frame $F_e$ to frame $F_c$, which describes the installation position of the camera relative to the pan-tilt. Then, the position of the target in frame $F_c$, denoted by $r_T^c$, can be expressed as:

(3) $\begin{bmatrix}r_T^c\\ 1\end{bmatrix} = T_{ec}T_{be}\begin{bmatrix}r_T^b\\ 1\end{bmatrix}$

2.2.3. Camera Internal Observation Model

The relationship between the position of the target in the camera coordinate system and its pixel coordinates in the image is as follows:

(4) $z_c\begin{bmatrix}y\\ 1\end{bmatrix} = \Pi\begin{bmatrix}r_T^c\\ 1\end{bmatrix}$

where $z_c$ is the depth of the target in frame $F_c$, and $\Pi$ is the camera intrinsic parameter matrix, defined as:

(5) $\Pi = \begin{bmatrix}\frac{f}{d_x} & -\frac{f}{d_x}\cot\alpha & u_0 & 0\\ 0 & \frac{f}{d_y\sin\alpha} & v_0 & 0\\ 0 & 0 & 1 & 0\end{bmatrix}$

where $f$ is the focal length, $d_x$ and $d_y$ denote the sizes of a unit pixel along the $O_pu$ and $O_pv$ directions, and $\alpha$ represents the angle between the two directions, which is usually 90°.
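As a quick numerical illustration of Equations (4) and (5), the following sketch builds the intrinsic matrix from the theoretical values listed later in Table 3 and projects an arbitrary camera-frame point onto the image plane; the point coordinates are example values only.

```python
import numpy as np

# Assumed theoretical camera parameters (Table 3): f = 1 m, dx = dy = 5 um,
# alpha = 90 deg, principal point (u0, v0) = (640, 480) pixels.
f, dx, dy, alpha = 1.0, 5e-6, 5e-6, np.deg2rad(90.0)
u0, v0 = 640.0, 480.0

# Intrinsic parameter matrix Pi (3x4), Equation (5).
Pi = np.array([
    [f / dx, -f / dx / np.tan(alpha), u0, 0.0],
    [0.0,     f / (dy * np.sin(alpha)), v0, 0.0],
    [0.0,     0.0,                      1.0, 0.0],
])

# Example target position in the camera frame (meters); arbitrary illustrative values.
r_T_c = np.array([50.0, -30.0, 8.0e4])

# Equation (4): z_c * [u, v, 1]^T = Pi * [r_T_c, 1]^T.
p = Pi @ np.append(r_T_c, 1.0)
z_c = p[2]                 # depth of the target in the camera frame
u, v = p[:2] / z_c         # pixel coordinates of the projection
print(z_c, u, v)
```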

2.2.4. Observation Equation in Camera Frame

By combining Equations (1)–(4), we obtain:

(6) $\begin{bmatrix}y\\ 1\end{bmatrix} = \frac{1}{z_c}\underbrace{\Pi T_{ec}}_{N}T_{be}T_{ib}\begin{bmatrix}r_T^i\\ 1\end{bmatrix} = \frac{1}{z_c}N\,T_{be}T_{ib}\begin{bmatrix}r_T^i\\ 1\end{bmatrix}$

where $y=\left[u, v\right]^T$ denotes the projection coordinates of the target on the image plane. $N$ represents the camera projection matrix, which is a comprehensive description of the camera’s internal optical parameters and installation position. When the camera is not calibrated, the values of its constituent elements are not accurately known.

The projection matrix $N$ can be expressed in block form as follows:

(7) $N = \begin{bmatrix}P\\ n_3^T\end{bmatrix}, \quad P = \begin{bmatrix}n_{11} & n_{12} & n_{13} & n_{14}\\ n_{21} & n_{22} & n_{23} & n_{24}\end{bmatrix}, \quad P^{(3)} = \begin{bmatrix}n_{11} & n_{12} & n_{13}\\ n_{21} & n_{22} & n_{23}\end{bmatrix}, \quad n_3^T = \begin{bmatrix}n_{31} & n_{32} & n_{33} & n_{34}\end{bmatrix}, \quad n_3^{(3)T} = \begin{bmatrix}n_{31} & n_{32} & n_{33}\end{bmatrix}$

Then, Equation (6) can be rewritten as:

(8) $z_c(t) = n_3^T\,T_{be}(t)T_{ib}(t)\begin{bmatrix}r_T^i(t)\\ 1\end{bmatrix}, \qquad y(t) = \frac{1}{z_c(t)}P\,T_{be}(t)T_{ib}(t)\begin{bmatrix}r_T^i(t)\\ 1\end{bmatrix}$

By differentiating Equation (8) and exploiting the property of vector cross product, we obtain:

(9) $\dot z_c(t) = n_3^{(3)T}\frac{d}{dt}\left[M_{be}M_{ib}\left(r_T^i-r_b^i\right)+r_{eb}^{e}\right] = n_3^{(3)T}M_{be}M_{ib}\left(v_T^i-v_b^i\right) - n_3^{(3)T}M_{be}\,\mathrm{sk}\!\left(\omega_b\right)M_{ib}\left(r_T^i-r_b^i\right) - n_3^{(3)T}\mathrm{sk}\!\left(\omega_e^e\right)M_{be}M_{ib}\left(r_T^i-r_b^i\right) + n_3^{(3)T}\dot r_{eb}^{e} = \underbrace{n_3^{(3)T}M_{be}M_{ib}\left(v_T^i-v_b^i\right)}_{a_v(t)} + \underbrace{n_3^{(3)T}M_{be}\,\mathrm{sk}\!\left(M_{ib}\left(r_T^i-r_b^i\right)\right)}_{a_\omega(t)}\omega_b + \underbrace{n_3^{(3)T}\left[\mathrm{sk}\!\left(M_{be}M_{ib}\left(r_T^i-r_b^i\right)\right)\frac{\partial\omega_e^e}{\partial\dot\theta} + \frac{\partial r_{eb}^{e}(\theta)}{\partial\theta}\right]}_{a_{\dot\theta}(t)}\dot\theta(t) = a_v(t) + a_\omega(t)\,\omega_b(t) + a_{\dot\theta}(t)\,\dot\theta(t)$

Similarly,

(10) $\dot y(t) = -\frac{\dot z_c(t)}{z_c(t)}y(t) + \frac{1}{z_c(t)}P^{(3)}\frac{d}{dt}\left[M_{be}M_{ib}\left(r_T^i-r_b^i\right)+r_{eb}^{e}\right] = \frac{1}{z_c(t)}\left(P^{(3)}-y\,n_3^{(3)T}\right)\frac{d}{dt}\left[M_{be}M_{ib}\left(r_T^i-r_b^i\right)+r_{eb}^{e}\right] \triangleq \frac{1}{z_c(t)}\left[A_v(t)+A_\omega(t)\,\omega_b(t)+A_{\dot\theta}(t)\,\dot\theta(t)\right]$

in which $A_v(t)$, $A_\omega(t)$, and $A_{\dot\theta}(t)$ are obtained from $a_v(t)$, $a_\omega(t)$, and $a_{\dot\theta}(t)$ by replacing $n_3^{(3)T}$ with $\left(P^{(3)}-y\,n_3^{(3)T}\right)$.

where $\dot\theta(t)$ is the joint angular velocity of the pan-tilt and $\mathrm{sk}(\cdot)$ denotes the skew-symmetric operator, with the expression

(11) $\mathrm{sk}(x) = \begin{bmatrix}0 & -x_3 & x_2\\ x_3 & 0 & -x_1\\ -x_2 & x_1 & 0\end{bmatrix}$
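For reference, a minimal Python helper implementing the operator in Equation (11), together with a check of the cross-product property $\mathrm{sk}(x)\,b = x\times b$ used in the derivation above:

```python
import numpy as np

def sk(x):
    """Skew-symmetric matrix of a 3-vector x, Equation (11): sk(x) @ b == np.cross(x, b)."""
    return np.array([
        [0.0,  -x[2],  x[1]],
        [x[2],  0.0,  -x[0]],
        [-x[1], x[0],  0.0],
    ])

a = np.array([1.0, 2.0, 3.0])
b = np.array([-0.5, 0.4, 2.0])
assert np.allclose(sk(a) @ b, np.cross(a, b))
```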

2.3. Dynamic Model of the Pan-Tilt Camera System

By simplifying the structure of the pan-tilt into a two-link system, the dynamic model of the 2 DOF system can be described using the classical Lagrangian dynamic equation as [41]:

(12) $H(\theta)\ddot\theta + \left[\frac{1}{2}\dot H(\theta) + C(\theta,\dot\theta)\right]\dot\theta + G(\theta) = \tau_c + \tau_d$

where $\tau_c$ and $\tau_d$ represent the control torque and the disturbance torque applied to the joints, respectively. $H$, $C$, and $G$ are the inertia matrix, Coriolis matrix, and gravity term of the pan-tilt system, respectively. $\theta = \left[\theta_1, \theta_2\right]^T$ is the joint angle vector of the two joints. The dynamic coupling effect between the pan-tilt system and the satellite body is calculated according to the method described in Ref. [42].
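To make the structure of Equation (12) concrete, the sketch below integrates a generic planar two-link model under a constant joint torque. This is only an assumed example: the inertia and Coriolis terms are those of a standard gravity-free two-link arm with made-up link parameters (not the paper’s pan-tilt matrices or satellite coupling terms), and the common $H\ddot\theta + c(\theta,\dot\theta) = \tau$ form is used rather than the specific $\frac{1}{2}\dot H + C$ split written above.

```python
import numpy as np

# Assumed example parameters of a generic planar two-link arm (not the paper's pan-tilt values).
m1, m2 = 1.0, 0.5      # point masses at the link tips [kg]
l1, l2 = 0.3, 0.2      # link lengths [m]

def dynamics(q, qd, tau):
    """Joint accelerations of a gravity-free two-link arm: H(q) qdd + c(q, qd) = tau."""
    c2, s2 = np.cos(q[1]), np.sin(q[1])
    H = np.array([
        [(m1 + m2) * l1**2 + m2 * l2**2 + 2 * m2 * l1 * l2 * c2, m2 * l2**2 + m2 * l1 * l2 * c2],
        [m2 * l2**2 + m2 * l1 * l2 * c2,                         m2 * l2**2],
    ])
    # Coriolis/centrifugal vector (the role played by the velocity-dependent term in Eq. (12)).
    c = np.array([
        -m2 * l1 * l2 * s2 * (2 * qd[0] * qd[1] + qd[1]**2),
         m2 * l1 * l2 * s2 * qd[0]**2,
    ])
    return np.linalg.solve(H, tau - c)

# Integrate with a simple semi-implicit Euler scheme under a constant torque.
q, qd = np.zeros(2), np.zeros(2)
tau = np.array([0.02, -0.01])       # arbitrary constant joint torques [N*m]
dt = 1e-3
for _ in range(5000):                # 5 s of simulated motion
    qdd = dynamics(q, qd, tau)
    qd += dt * qdd
    q += dt * qd
print(np.rad2deg(q))                 # joint angles after 5 s [deg]
```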

2.4. Linearization of the Motion of Near-Circular Targets

Assumption 1.

The space target operates in a near-circular orbit, with its motion governed by two-body orbital dynamics. The geocentric distance and the spatial orientation of the orbital plane (i.e., the orbital inclination $i$ and the right ascension of the ascending node $\Omega$) have been accurately determined through preliminary orbit determination, while certain measurement errors exist in the target’s initial position and velocity within the orbital plane.


Based on Assumption 1, the motion of the target in the Earth inertial frame can be simplified as planar circular motion, and the position of the target in the orbital plane at any time can be described by its current phase. Assuming the initial measured phase of the target in the orbital plane is $f_0$ (also called the argument of latitude), the position of the target in the orbital plane at time $t$ according to the measured phase can be represented as $r_t^o(t) = R\left[\cos(\omega_t t + f_0),\ \sin(\omega_t t + f_0),\ 0\right]^T$, where $\omega_t = \sqrt{\mu/R^3}$ is the orbital angular velocity of the target and $R$ is the orbit radius. Then, the position of the target in the Earth-centered inertial frame can be expressed as

(13) $r_T^i(t) = R\,M_z(\Omega)M_x(i)\begin{bmatrix}\cos(\omega_tt+f_0)\\ \sin(\omega_tt+f_0)\\ 0\end{bmatrix} = R\begin{bmatrix}\cos\Omega\cos\omega_tt-\sin\Omega\cos i\sin\omega_tt & -\cos\Omega\sin\omega_tt-\sin\Omega\cos i\cos\omega_tt\\ \sin\Omega\cos\omega_tt+\cos\Omega\cos i\sin\omega_tt & -\sin\Omega\sin\omega_tt+\cos\Omega\cos i\cos\omega_tt\\ \sin i\sin\omega_tt & \sin i\cos\omega_tt\end{bmatrix}\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix} \triangleq R\,T_r\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix}$

Differentiating Equation (13), we obtain the expression of the target’s velocity as

(14) $v_T^i(t) = \frac{dr_T^i(t)}{dt} = R\omega_t\begin{bmatrix}-\cos\Omega\sin\omega_tt-\sin\Omega\cos i\cos\omega_tt & -\cos\Omega\cos\omega_tt+\sin\Omega\cos i\sin\omega_tt\\ -\sin\Omega\sin\omega_tt+\cos\Omega\cos i\cos\omega_tt & -\sin\Omega\cos\omega_tt-\cos\Omega\cos i\sin\omega_tt\\ \sin i\cos\omega_tt & -\sin i\sin\omega_tt\end{bmatrix}\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix} \triangleq R\omega_t\,T_v\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix}$

Through the aforementioned transformation, the variable f0 containing uncertain position parameters has been decoupled from other accurately known orbital elements. The following section will further decouple the parameters representing camera uncertainties and target position uncertainties from the expression of the visual velocity (Equations (9) and (10)) via linearization, so as to facilitate the design of the parameters’ update law.
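Equations (13) and (14) can be checked numerically: the sketch below builds the known matrices $T_r$ and $T_v$ from the orbital-plane elements and verifies that $r_T^i$ and $v_T^i$ are linear in $[\cos f_0, \sin f_0]^T$, which is what allows $f_0$ to be pulled into the estimated parameter vector. The numbers used are rounded versions of the target elements in Table 2 and serve purely as an illustration.

```python
import numpy as np

mu = 3.986004418e14          # Earth's gravitational parameter [m^3/s^2]
R = 6.99314e6                # assumed near-circular orbit radius [m]
inc, Omega = np.deg2rad(45.0), np.deg2rad(146.6)   # known orbital-plane orientation
w_t = np.sqrt(mu / R**3)     # orbital angular velocity

def Tr_Tv(t):
    """Known 3x2 matrices of Eqs. (13)-(14): r = R*Tr*[cos f0, sin f0]^T, v = R*w_t*Tv*[...]^T."""
    cw, sw = np.cos(w_t * t), np.sin(w_t * t)
    cO, sO, ci, si = np.cos(Omega), np.sin(Omega), np.cos(inc), np.sin(inc)
    Tr = np.array([
        [cO * cw - sO * ci * sw, -cO * sw - sO * ci * cw],
        [sO * cw + cO * ci * sw, -sO * sw + cO * ci * cw],
        [si * sw,                 si * cw],
    ])
    Tv = np.array([
        [-cO * sw - sO * ci * cw, -cO * cw + sO * ci * sw],
        [-sO * sw + cO * ci * cw, -sO * cw - cO * ci * sw],
        [si * cw,                 -si * sw],
    ])
    return Tr, Tv

# Check against the direct parameterization with the true phase f0.
f0, t = np.deg2rad(133.2), 120.0
Tr, Tv = Tr_Tv(t)
u = w_t * t + f0
Rx = np.array([[1, 0, 0], [0, np.cos(inc), -np.sin(inc)], [0, np.sin(inc), np.cos(inc)]])
Rz = np.array([[np.cos(Omega), -np.sin(Omega), 0], [np.sin(Omega), np.cos(Omega), 0], [0, 0, 1]])
r_inertial = Rz @ Rx @ (R * np.array([np.cos(u), np.sin(u), 0.0]))
v_inertial = Rz @ Rx @ (R * w_t * np.array([-np.sin(u), np.cos(u), 0.0]))
assert np.allclose(R * Tr @ [np.cos(f0), np.sin(f0)], r_inertial)
assert np.allclose(R * w_t * Tv @ [np.cos(f0), np.sin(f0)], v_inertial)
```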

Theorem 1.

For $a_{\dot\theta}(t)$, $A_{\dot\theta}(t)$, $a_\omega(t)$, and $A_\omega(t)$, there exist regression matrices $Y_{z,\theta}\in\mathbb{R}^{1\times27}$, $Y_{y,\theta}\in\mathbb{R}^{2\times27}$, $Y_{z,\omega}\in\mathbb{R}^{1\times27}$, and $Y_{y,\omega}\in\mathbb{R}^{2\times27}$, respectively, such that the following equations hold:

(15) $a_{\dot\theta}(t)\dot\theta(t) = Y_{z,\theta}(t,\Omega,i,R)\,\sigma_k, \qquad A_{\dot\theta}(t)\dot\theta(t) = Y_{y,\theta}(t,\Omega,i,R,y)\,\sigma_k$

(16) $a_\omega(t)\omega(t) = Y_{z,\omega}(t,\Omega,i,R)\,\sigma_k, \qquad A_\omega(t)\omega(t) = Y_{y,\omega}(t,\Omega,i,R,y)\,\sigma_k$

where $\sigma_k\in\mathbb{R}^{27\times1}$ is the parameter vector denoting the uncertainties in the camera parameters and the target’s position.

Theorem 2.

For $a_v(t)$ and $A_v(t)$, there exist regression matrices $Y_{z,v}\in\mathbb{R}^{1\times27}$ and $Y_{y,v}\in\mathbb{R}^{2\times27}$, respectively, such that the following equation holds:

(17) $a_v(t) = Y_{z,v}(t,\Omega,i,R)\,\sigma_k, \qquad A_v(t) = Y_{y,v}(t,\Omega,i,R,y)\,\sigma_k$

Theorem 3.

For the depth information of the target in the camera frame, $z_c(t)$, there exists a regression matrix $Y_z(t,\Omega,i,R)\in\mathbb{R}^{1\times10}$ such that

(18) $z_c(t) = Y_z(t,\Omega,i,R)\,\sigma_z$

where $\sigma_z\in\mathbb{R}^{10\times1}$ is the parameter vector denoting the uncertainties in the camera parameters and the target’s position.

The Proofs of Theorems 1–3 are provided in Appendix A.

By summarizing the above results, the following linearization relationship can be obtained:

(19) $z_c(t) = Y_z(t,\Omega,i,R)\,\sigma_z$

(20) $\dot z_c(t) = \left[Y_{z,\omega}(t,\Omega,i,R)+Y_{z,\theta}(t,\Omega,i,R)+Y_{z,v}(t,\Omega,i,R)\right]\sigma_k \triangleq Y_{z,k}\,\sigma_k, \qquad z_c(t)\,\dot y(t) = \left[Y_{y,\omega}(t,\Omega,i,R,y)+Y_{y,\theta}(t,\Omega,i,R,y)+Y_{y,v}(t,\Omega,i,R,y)\right]\sigma_k \triangleq Y_{y,k}\,\sigma_k$

3. Design of Adaptive Visual Tracking Method

Note: This paper assumes that the onboard image processing platform has extracted features from the video stream captured by the camera and obtained the target’s real-time coordinates in the image plane. These coordinates are transmitted to the onboard control processing unit, which generates visual tracking control commands based on the received coordinate information. The commands are then sent to the actuator of the pan-tilt system. Following these instructions, the pan-tilt actuator executes corresponding attitude maneuvers to adjust the camera’s optical axis orientation, thereby changing the coordinates of the target’s projection point and achieving visual tracking observation of the target, as shown in Figure 3.

3.1. Estimation of Visual Velocity

The rate of change of a target’s pixel coordinates on the image plane is called the visual velocity [26], and it is typically obtained by differentiating the projection coordinates. However, direct differentiation can be inaccurate due to the limited frame rate of video satellite cameras and the inherent image sensor noise, which this process tends to amplify. To circumvent this issue, this paper avoids differential calculation by introducing the estimated visual velocity, denoted by $\dot y_{ob}(t)$, whose calculation expression is given as

(21) $\dot y_{ob}(t) = \frac{1}{\hat z_c(t)}\left[\hat A_v(t)+\hat A_\omega(t)\omega(t)+\hat A_{\dot\theta}(t)\dot\theta(t)\right] - \frac{\dot{\hat z}_c(t)}{2\hat z_c(t)}\left[y_{ob}(t)-2y(t)+y_d\right] - \alpha\left(y_{ob}(t)-y(t)\right)$

where $\alpha>0$ is a constant. Let $\Delta y_{ob}(t)$ denote the estimation error of the visual velocity at time instant $t$; then it can be deduced that

(22) $\Delta\dot y_{ob}(t) = \dot y_{ob}(t) - \dot y(t)$

By substituting Equation (22) into Equation (21), we obtain

(23) $\Delta\dot y_{ob}(t) = \frac{1}{\hat z_c(t)}\left[\hat A_v(t)+\hat A_\omega(t)\omega(t)+\hat A_{\dot\theta}(t)\dot\theta(t)\right] - \frac{1}{z_c(t)}\left[A_v(t)+A_\omega(t)\omega(t)+A_{\dot\theta}(t)\dot\theta(t)\right] - \frac{\dot{\hat z}_c(t)}{2\hat z_c(t)}\left[\Delta y_{ob}(t)-\Delta y(t)\right] - \alpha\left(y_{ob}(t)-y(t)\right)$

where $\Delta y(t)=y(t)-y_d$ is the image tracking error. Multiplying both sides by $z_c(t)$ yields

(24) $z_c(t)\Delta\dot y_{ob}(t) = -\frac{\hat z_c(t)-z_c(t)}{\hat z_c(t)}\left[\hat A_v(t)+\hat A_\omega(t)\omega(t)+\hat A_{\dot\theta}(t)\dot\theta(t)\right] + \left[\hat A_v(t)-A_v(t)\right] + \left[\hat A_\omega(t)-A_\omega(t)\right]\omega(t) + \left[\hat A_{\dot\theta}(t)-A_{\dot\theta}(t)\right]\dot\theta(t) - \frac{z_c(t)}{2\hat z_c(t)}\dot{\hat z}_c(t)\left[\Delta y_{ob}(t)-\Delta y(t)\right] - \alpha z_c(t)\Delta y_{ob}(t)$

Adding $\frac{1}{2}\dot z_c(t)\left[\Delta y_{ob}(t)-\Delta y(t)\right]$ to both sides yields

(25) $z_c(t)\Delta\dot y_{ob}(t) + \frac{1}{2}\dot z_c(t)\left[\Delta y_{ob}(t)-\Delta y(t)\right] = -\frac{\hat z_c(t)-z_c(t)}{\hat z_c(t)}\left[\hat A_v(t)+\hat A_\omega(t)\omega(t)+\hat A_{\dot\theta}(t)\dot\theta(t)\right] + \left[\hat A_v(t)-A_v(t)\right] + \left[\hat A_\omega(t)-A_\omega(t)\right]\omega(t) + \left[\hat A_{\dot\theta}(t)-A_{\dot\theta}(t)\right]\dot\theta(t) + \frac{\dot{\hat z}_c(t)}{2\hat z_c(t)}\left[\hat z_c(t)-z_c(t)\right]\left[\Delta y_{ob}(t)-\Delta y(t)\right] - \frac{1}{2}\left[\dot{\hat z}_c(t)-\dot z_c(t)\right]\left[\Delta y_{ob}(t)-\Delta y(t)\right] - \alpha z_c(t)\Delta y_{ob}(t)$

By combining Equation (20), we obtain

(26) $z_c(t)\Delta\dot y_{ob}(t) + \frac{1}{2}\dot z_c(t)\left[\Delta y_{ob}(t)-\Delta y(t)\right] = \underbrace{\left[-\frac{1}{\hat z_c(t)}\left(\hat A_v(t)+\hat A_\omega(t)\omega(t)+\hat A_{\dot\theta}(t)\dot\theta(t)\right) + \frac{\dot{\hat z}_c(t)}{2\hat z_c(t)}\left(\Delta y_{ob}(t)-\Delta y(t)\right)\right]Y_z(t,\Omega,i,R)}_{W_{z1}}\Delta\sigma_z + \underbrace{\left[Y_{y,k} - \frac{1}{2}\left(\Delta y_{ob}(t)-\Delta y(t)\right)Y_{z,k}\right]}_{W_{k1}}\Delta\sigma_k - \alpha z_c(t)\Delta y_{ob}(t) \triangleq W_{z1}\Delta\sigma_z + W_{k1}\Delta\sigma_k - \alpha z_c(t)\Delta y_{ob}(t)$
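As a side note on the motivation stated at the beginning of this subsection, the following short check (a sketch with an assumed slow projection drift) quantifies how strongly naive finite differencing amplifies pixel noise at the camera frame rate, compared with even a crude first-order low-pass pre-filter. Neither of these is the observer designed above; they only illustrate the problem it addresses.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.05                                # image sampling interval of the camera [s]
t = np.arange(0.0, 10.0, dt)
u_true = 640.0 + 20.0 * np.sin(0.3 * t)  # assumed slow true motion of the projection [pixel]
u_meas = u_true + rng.normal(0.0, 7.0, t.size)   # sensor noise, sigma = 7 pixels

# Naive finite difference amplifies the noise (std ~ sqrt(2)*sigma/dt, roughly 200 pixel/s here).
v_fd = np.diff(u_meas) / dt

# Simple first-order low-pass of the measurements before differencing (illustration only).
tau_f = 0.5                              # assumed filter time constant [s]
u_filt = np.empty_like(u_meas)
u_filt[0] = u_meas[0]
for k in range(1, u_meas.size):
    u_filt[k] = u_filt[k - 1] + dt / tau_f * (u_meas[k] - u_filt[k - 1])
v_lp = np.diff(u_filt) / dt

v_true = 20.0 * 0.3 * np.cos(0.3 * t[1:])
print("RMS error, finite difference:", np.sqrt(np.mean((v_fd - v_true) ** 2)))
print("RMS error, low-pass + difference:", np.sqrt(np.mean((v_lp - v_true) ** 2)))
```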

3.2. Reference Trajectory Design

Inspired by Ref. [26], which introduced a reference projection trajectory to dynamically adjust the convergence speed, this paper defines a new reference trajectory based on the estimated projection coordinates as:

(27) $\dot y_r(t) = \dot y_d - \lambda\left(y_{ob}(t)-y_d\right) = -\lambda\left(y_{ob}(t)-y_d\right)$

Introduce the matrix $H(t)$ with the expression

(28) $H(t) = A_{\dot\theta}(t) + \frac{1}{2}\left(y_{ob}(t)-y_d\right)a_{\dot\theta}(t) \in \mathbb{R}^{2\times2}$

The reference angular velocity of the pan-tilt system is defined as

(29) $\dot\theta_r(t) = -\hat H^{-1}(t)\hat A_\omega(t)\omega(t) - \hat H^{-1}(t)\hat A_v(t) + \hat H^{-1}(t)\hat z_c(t)\dot y_r(t) - \frac{1}{2}\hat H^{-1}(t)\left(y_{ob}-y_d\right)\hat a_v(t) - \frac{1}{2}\hat H^{-1}(t)\left(y_{ob}-y_d\right)\hat a_\omega(t)\omega(t)$

The tracking error of the angular velocity is defined as

(30) $\delta\dot\theta(t) = \dot\theta(t) - \dot\theta_r(t)$

Combining Equations (28) and (29), we obtain

(31) $\hat H(t)\delta\dot\theta(t) = \hat H(t)\left[\dot\theta(t)-\dot\theta_r(t)\right] = \hat A_{\dot\theta}(t)\dot\theta(t) + \frac{1}{2}\left(y_{ob}(t)-y_d\right)\hat a_{\dot\theta}(t)\dot\theta(t) - \hat H(t)\dot\theta_r(t)$

For the first term $\hat A_{\dot\theta}(t)\dot\theta(t)$, combining Equation (10) gives

(32) $\hat A_{\dot\theta}(t)\dot\theta(t) = \hat A_{\dot\theta}(t)\dot\theta(t) + z_c(t)\dot y(t) - z_c(t)\dot y(t) = \hat A_{\dot\theta}(t)\dot\theta(t) + z_c(t)\dot y(t) - A_{\dot\theta}(t)\dot\theta(t) - A_\omega(t)\omega(t) - A_v(t) = \left[\hat A_{\dot\theta}(t)-A_{\dot\theta}(t)\right]\dot\theta(t) + z_c(t)\dot y(t) - A_\omega(t)\omega(t) - A_v(t)$

For the second term $\frac{1}{2}\left(y_{ob}(t)-y_d\right)\hat a_{\dot\theta}(t)\dot\theta(t)$, combining Equation (9) yields

(33) $\frac{1}{2}\left(y_{ob}(t)-y_d\right)\hat a_{\dot\theta}(t)\dot\theta(t) = \frac{1}{2}\left(y_{ob}(t)-y_d\right)\left[\hat a_{\dot\theta}(t)\dot\theta(t)+\dot z_c-\dot z_c\right] = \frac{1}{2}\left(y_{ob}(t)-y_d\right)\left[\left(\hat a_{\dot\theta}(t)-a_{\dot\theta}(t)\right)\dot\theta(t)+\dot z_c-a_\omega(t)\omega(t)-a_v(t)\right]$

Substituting Equations (32) and (33) into Equation (31) yields

(34) $\hat H(t)\delta\dot\theta(t) = \left[\hat A_{\dot\theta}(t)-A_{\dot\theta}(t)\right]\dot\theta(t) + z_c(t)\dot y(t) - A_\omega(t)\omega(t) - A_v(t) - \hat H(t)\dot\theta_r(t) + \frac{1}{2}\left(y_{ob}(t)-y_d\right)\left[\left(\hat a_{\dot\theta}(t)-a_{\dot\theta}(t)\right)\dot\theta(t)+\dot z_c-a_\omega(t)\omega(t)-a_v(t)\right] = z_c(t)\Delta\dot y(t) + \left[\hat H(t)-H(t)\right]\dot\theta(t) + \lambda z_c(t)\left(y_{ob}(t)-y_d\right) - Y_z\Delta\sigma_z\,\dot y_r(t) + Y_{y,\omega}\Delta\sigma_k + Y_{y,v}(t,\Omega,i,R,y)\Delta\sigma_k + \frac{1}{2}\left(y_{ob}(t)-y_d\right)\dot z_c(t) + \frac{1}{2}\left(y_{ob}-y_d\right)Y_{z,v}\Delta\sigma_k + \frac{1}{2}\left(y_{ob}-y_d\right)Y_{z,\omega}(t,\Omega,i,R)\Delta\sigma_k$

According to Equation (28), we have

(35) $\left[\hat H(t)-H(t)\right]\dot\theta(t) = \left[\hat A_{\dot\theta}(t)-A_{\dot\theta}(t)\right]\dot\theta(t) + \frac{1}{2}\left(y_{ob}(t)-y_d\right)\left[\hat a_{\dot\theta}(t)-a_{\dot\theta}(t)\right]\dot\theta(t) = Y_{y,\theta}(t,\Omega,i,R,y)\Delta\sigma_k + \frac{1}{2}\left(y_{ob}(t)-y_d\right)Y_{z,\theta}(t,\Omega,i,R)\Delta\sigma_k$

Combining Equations (34) and (35) yields

(36) $\hat H(t)\delta\dot\theta(t) = z_c(t)\Delta\dot y(t) + \lambda z_c(t)\left(y_{ob}(t)-y_d\right) + \frac{1}{2}\left(y_{ob}(t)-y_d\right)\dot z_c(t) + \underbrace{\left[Y_{y,\theta} + \frac{1}{2}\left(y_{ob}(t)-y_d\right)Y_{z,\theta} + Y_{y,\omega} + Y_{y,v} + \frac{1}{2}\left(y_{ob}-y_d\right)Y_{z,v} + \frac{1}{2}\left(y_{ob}-y_d\right)Y_{z,\omega}\right]}_{W_{k2}}\Delta\sigma_k + \underbrace{\left(-\dot y_r(t)\,Y_z\right)}_{W_{z2}}\Delta\sigma_z \triangleq z_c(t)\Delta\dot y(t) + \lambda z_c(t)\left(y_{ob}(t)-y_d\right) + \frac{1}{2}\left(y_{ob}(t)-y_d\right)\dot z_c(t) + W_{k2}\Delta\sigma_k + W_{z2}\Delta\sigma_z$

3.3. Visual Tracking Method Design

According to Equation (29), to ensure the existence of the reference angular velocity, the matrix $\hat H(t)$ must remain invertible. By constructing a potential function and incorporating its gradient into the parameter update law, the parameters are adaptively updated in a direction that keeps $\hat H(t)$ away from singularity. The potential function is defined with the determinant of $\hat H(t)$ as

(37) $U\!\left(\left|\hat H(t)\right|\right) = \frac{1}{e^{a\left|\hat H(t)\right|^2} - 1 + b}$

where $a>0$ and $b>0$ are constants and $\left|\cdot\right|$ denotes the determinant.
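A quick numerical look at the shape of this potential (a minimal sketch; the constants a and b are arbitrary positive values chosen only for illustration):

```python
import numpy as np

def potential(H_hat, a=1.0, b=0.5):
    """Potential of Equation (37): largest when det(H_hat) is zero, decaying as |det| grows."""
    d = np.linalg.det(H_hat)
    return 1.0 / (np.exp(a * d**2) - 1.0 + b)

for det_target in [0.0, 0.5, 1.0, 2.0]:
    H_hat = np.diag([det_target, 1.0])   # simple 2x2 matrix with the desired determinant
    print(det_target, potential(H_hat))
```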

It can be seen from Equation (37) that when $\left|\hat H(t)\right|=0$, the potential takes its maximum value $U=1/b$. As $\left|\hat H(t)\right|$ gradually increases, the potential function gradually decreases. The parameter update laws are designed as

(38) $\Delta\dot\sigma_z = \Gamma_z^{-1}\left[W_{z2}^TK_1\Delta y - W_{z1}^TK_1\Delta y_{ob}\right], \qquad \Delta\dot\sigma_k = \Gamma_k^{-1}\left[W_{k2}^TK_1\Delta y - W_{k1}^TK_1\Delta y_{ob} - Q\,\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\left\|\delta\dot\theta\right\|^2\right]$

where $\Gamma_z$, $\Gamma_k$, $K_1$, and $Q$ are all positive definite diagonal matrices.

The visual tracking law is designed as

(39) $\tau = \left[\frac{1}{2}\dot M + C\right]\dot\theta_r(t) + G + M\ddot\theta_r - \hat H^T(t)K_1\Delta y - K_2\,\delta\dot\theta - K_3\left\|\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\right\|\delta\dot\theta(t)$

where K2 and K3 are positive definite diagonal matrices.

Theorem 4.

Under the control law (39), with the parameter update law (38) adopted, the dynamic system (12) is asymptotically stable, and the target’s projection tracking error $\Delta y(t)$ converges to 0.

Proof. 

The Lyapunov function is designed as follows:

(40) $V = \underbrace{\frac{1}{2}\delta\dot\theta^T(t)M\,\delta\dot\theta(t)}_{V_1} + \underbrace{\frac{1}{2}z_c(t)\Delta y^T(t)K_1\Delta y + \frac{1}{2}z_c(t)\Delta y_{ob}^T(t)K_1\Delta y_{ob}}_{V_2} + \underbrace{\frac{1}{2}\Delta\sigma_k^T\Gamma_k\Delta\sigma_k + \frac{1}{2}\Delta\sigma_z^T\Gamma_z\Delta\sigma_z}_{V_3}$

Taking the derivative of $V_1$ gives

(41) $\dot V_1 = \delta\dot\theta^T(t)M\,\delta\ddot\theta(t) + \frac{1}{2}\delta\dot\theta^T(t)\dot M\,\delta\dot\theta(t) = \delta\dot\theta^T(t)\left[-\hat H^T(t)K_1\Delta y - K_2\,\delta\dot\theta(t) - K_3\left\|\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\right\|\delta\dot\theta(t)\right] = -\delta\dot\theta^T(t)\hat H^T(t)K_1\Delta y - \delta\dot\theta^T(t)K_2\,\delta\dot\theta(t) - \delta\dot\theta^T(t)K_3\left\|\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\right\|\delta\dot\theta(t)$

Taking the derivative of $V_2$ gives

(42) $\dot V_2 = \frac{d}{dt}\left[\frac{1}{2}z_c(t)\Delta y^T(t)K_1\Delta y(t)\right] + \frac{d}{dt}\left[\frac{1}{2}z_c(t)\Delta y_{ob}^T(t)K_1\Delta y_{ob}(t)\right] = \frac{1}{2}\dot z_c(t)\Delta y^T(t)K_1\Delta y(t) + z_c(t)\Delta y^T(t)K_1\Delta\dot y(t) + \frac{1}{2}\dot z_c(t)\Delta y_{ob}^T(t)K_1\Delta y_{ob}(t) + z_c(t)\Delta y_{ob}^T(t)K_1\Delta\dot y_{ob}(t) = \Delta y^T(t)K_1\left[\frac{1}{2}\dot z_c(t)\left(\Delta y(t)+\Delta y_{ob}(t)\right) + z_c(t)\Delta\dot y(t)\right] + \Delta y_{ob}^T(t)K_1\left[\frac{1}{2}\dot z_c(t)\left(\Delta y_{ob}(t)-\Delta y(t)\right) + z_c(t)\Delta\dot y_{ob}(t)\right]$

Combining Equation (26) yields

(43) $\dot V_2 = \Delta y^T(t)K_1\left[\hat H(t)\delta\dot\theta(t) - \lambda z_c(t)\left(y_{ob}(t)-y_d\right) - W_{k2}\Delta\sigma_k - W_{z2}\Delta\sigma_z\right] + \Delta y_{ob}^T(t)K_1\left[W_{z1}\Delta\sigma_z + W_{k1}\Delta\sigma_k - \alpha z_c(t)\Delta y_{ob}(t)\right] = \Delta y^T(t)K_1\hat H(t)\delta\dot\theta(t) - \lambda z_c(t)\Delta y^T(t)K_1\left[\Delta y_{ob}(t)+\Delta y(t)\right] - \alpha z_c(t)\Delta y_{ob}^T(t)K_1\Delta y_{ob}(t) + \left[\Delta y_{ob}^T(t)K_1W_{k1} - \Delta y^T(t)K_1W_{k2}\right]\Delta\sigma_k + \left[\Delta y_{ob}^T(t)K_1W_{z1} - \Delta y^T(t)K_1W_{z2}\right]\Delta\sigma_z$

Taking the derivative of $V_3$ gives

(44) $\dot V_3 = \Delta\sigma_k^T\Gamma_k\Delta\dot\sigma_k + \Delta\sigma_z^T\Gamma_z\Delta\dot\sigma_z$

Combining Equations (38), (39), and (41)–(44), we obtain

(45) $\dot V = \dot V_1+\dot V_2+\dot V_3 = -\delta\dot\theta^T(t)\hat H^T(t)K_1\Delta y - \delta\dot\theta^T(t)K_2\delta\dot\theta(t) - \delta\dot\theta^T(t)K_3\left\|\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\right\|\delta\dot\theta(t) + \Delta y^T(t)K_1\hat H(t)\delta\dot\theta(t) - \lambda z_c(t)\Delta y^T(t)K_1\Delta y_{ob}(t) - \lambda z_c(t)\Delta y^T(t)K_1\Delta y(t) - \alpha z_c(t)\Delta y_{ob}^T(t)K_1\Delta y_{ob}(t) + \left[\Delta y_{ob}^T(t)K_1W_{k1}-\Delta y^T(t)K_1W_{k2}\right]\Delta\sigma_k + \left[\Delta y_{ob}^T(t)K_1W_{z1}-\Delta y^T(t)K_1W_{z2}\right]\Delta\sigma_z + \dot V_3 = -\lambda z_c(t)\Delta y^T(t)K_1\Delta y_{ob}(t) - \lambda z_c(t)\Delta y^T(t)K_1\Delta y(t) - \alpha z_c(t)\Delta y_{ob}^T(t)K_1\Delta y_{ob}(t) - \delta\dot\theta^T(t)K_2\delta\dot\theta(t) - \delta\dot\theta^T(t)K_3\left\|\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\right\|\delta\dot\theta(t) - \Delta\sigma_k^TQ\,\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\left\|\delta\dot\theta\right\|^2$

Further simplification and transformation of the above equation yields

(46) $\dot V = -\begin{bmatrix}\Delta y\\ \Delta y_{ob}\end{bmatrix}^T\begin{bmatrix}\lambda z_c(t)K_1 & \frac{\lambda z_c(t)}{2}K_1\\ \frac{\lambda z_c(t)}{2}K_1 & \alpha z_cK_1\end{bmatrix}\begin{bmatrix}\Delta y\\ \Delta y_{ob}\end{bmatrix} - \delta\dot\theta^T(t)K_2\delta\dot\theta(t) - \delta\dot\theta^T(t)K_3\left\|\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\right\|\delta\dot\theta(t) - \Delta\sigma_k^TQ\,\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\left\|\delta\dot\theta\right\|^2 \le -\begin{bmatrix}\Delta y\\ \Delta y_{ob}\end{bmatrix}^T\begin{bmatrix}\lambda z_c(t)K_1 & \frac{\lambda z_c(t)}{2}K_1\\ \frac{\lambda z_c(t)}{2}K_1 & \alpha z_cK_1\end{bmatrix}\begin{bmatrix}\Delta y\\ \Delta y_{ob}\end{bmatrix} - \delta\dot\theta^T(t)K_2\delta\dot\theta(t) - \left[\lambda_{\min}\!\left(K_3\right)-\left\|\Delta\sigma_k^TQ\right\|\right]\left\|\frac{\partial U\left(\left|\hat H(t)\right|\right)}{\partial\hat\sigma_k(t)}\right\|\left\|\delta\dot\theta\right\|^2$

When the following condition holds,

(47) $\begin{bmatrix}\lambda z_c(t) & \frac{\lambda z_c(t)}{2}\\ \frac{\lambda z_c(t)}{2} & \alpha z_c\end{bmatrix}\ge0, \qquad \lambda_{\min}\!\left(K_3\right)-\left\|\Delta\sigma_k^TQ\right\|\ge0$

That is,

(48) $\alpha\ge\frac{\lambda}{4}, \qquad \lambda_{\min}\!\left(K_3\right)\ge\lambda_{\max}\!\left(Q\right)\sqrt{\frac{2V(0)}{\lambda_{\min}\!\left(\Gamma_k\right)}}$

we have $\dot V\le0$, where $\lambda_{\min}(A)$ and $\lambda_{\max}(A)$ denote the minimum and maximum eigenvalues of a matrix $A$, respectively. Therefore, $V(t)$ is upper bounded. According to Equation (40), it can be inferred that the components of the Lyapunov function, including $\Delta\sigma_k(t)$, $\Delta\sigma_z(t)$, $\Delta y$, $\Delta y_{ob}$, and $\delta\dot\theta(t)$, are all bounded. Due to the boundedness of the pan-tilt’s angular velocity and of the true parameter values, $\hat\sigma_k(t)$, $\hat\sigma_z(t)$, and $\dot\theta_r(t)$ are also bounded.

It can be inferred that the second-order derivative of the Lyapunov function depends on $\Delta\dot y(t)$, $\Delta\dot y_{ob}(t)$, $\delta\ddot\theta(t)$, and $\dot{\hat\sigma}_k(t)$. Since $y(t)$, $y_{ob}(t)$, $\dot\theta(t)$, $\dot\theta_r(t)$, and $\hat\sigma_k(t)$ are all bounded, $\ddot V(t)$ is also bounded. Therefore, $\dot V(t)$ is uniformly continuous [43]. According to Barbalat’s Lemma, we have

(49) $\lim_{t\to\infty}\dot V(t) = 0$

Therefore, we have

(50) $\lim_{t\to\infty}\delta\dot\theta(t) = 0, \qquad \lim_{t\to\infty}\Delta y(t) = 0, \qquad \lim_{t\to\infty}\Delta y_{ob}(t) = 0$

The Proof of Theorem 4 is completed. □

4. Results and Discussion

In the simulation section, the designed adaptive visual tracking method (39) is compared and analyzed with the traditional position-based tracking method and image-based tracking method.

The position-based tracking method was designed according to Ref. [11], with the expression of

(51) $\tau_c = G - k_p\left(\theta-\theta_d\right) - k_d\left(\dot\theta-\dot\theta_d\right)$

where $k_p$ and $k_d$ are positive definite diagonal matrices, and $\theta_d$ and $\dot\theta_d$ are the desired joint angle and angular velocity of the pan-tilt system, respectively. Their values are computed based on the attitude error defined in Ref. [11], the kinematics of the pan-tilt, and the measured target position.
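For completeness, a minimal sketch of how a command of the form (51) is evaluated at each control step. The gains match those stated later ($k_p=1$, $k_d=4$), while the gravity term and the desired states are placeholders standing in for the quantities that would come from the pan-tilt kinematics and the measured target position.

```python
import numpy as np

kp, kd = 1.0, 4.0     # gains of the position-based method used in the simulations

def position_based_torque(theta, theta_dot, theta_d, theta_d_dot, G):
    """Equation (51): PD joint-space law driving the pan-tilt to the desired angles/rates."""
    return G - kp * (theta - theta_d) - kd * (theta_dot - theta_d_dot)

# Placeholder example values (illustrative only).
tau_c = position_based_torque(
    theta=np.deg2rad([60.0, 40.0]), theta_dot=np.deg2rad([0.2, 0.1]),
    theta_d=np.deg2rad([62.0, 42.0]), theta_d_dot=np.zeros(2), G=np.zeros(2))
print(tau_c)
```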

The image-based tracking method was designed according to Ref. [44], with the expression of

(52) $\tau = M\ddot\theta_d(t) + C\dot\theta_d(t) + G - k_p\left(\theta(t)-\theta_d(t)\right) - k_d\left(\dot\theta(t)-\dot\theta_d(t)\right)$

Please refer to Appendix B for the specific controller design process and parameter meanings.

4.1. Simulation Parameters Setting

The orbit elements of the video satellite at the initial moment are shown in Table 1, and the orbit elements of the target running on a near-circular orbit, together with the initial estimation error, are shown in Table 2. The theoretical and real parameters of the onboard camera are shown in Table 3. The parameters of the proposed adaptive tracking method are shown in Table 4. The parameters of the position-based tracking method are $k_p=1$ and $k_d=4$, and the parameters of the traditional image-based method (52) are shown in Table 5. The initial attitude quaternion and angular velocity of the video satellite are $[0.9975, 0.0197, 0.0515, 0.0445]^T$ and $[0.2, 0.3, 0.8]^T$ deg/s. The physical parameters and the initial motion state of the pan-tilt system are shown in Table 6. To better simulate the sensor noise present in practical imaging, a deviation is introduced into the actual projection points of the target on the image plane, with the standard deviation in both the horizontal and vertical directions set to $\sigma_u=\sigma_v=7$ pixels. The image sampling interval of the onboard camera is 0.05 s. The external disturbance of the pan-tilt system is set to $\tau_d = \left[0.3\left(0.6\dot\theta_1+0.08\sin 5\dot\theta_1\right),\ 0.3\dot\theta_2+0.06\sin 2\dot\theta_2\right]^T$ N·m, and the external environmental disturbance torque exerted on the satellite body is set to $T_d = 0.003\left[\sin(10t\pi),\ \sin(10t\pi),\ \sin(10t\pi)\right]^T$ N·m.

In order to quantitatively analyze the image stability of the target tracking performance of each method, an image stability index based on the pixel deviation between the target projection point and the expected point is defined as follows:

(53) $I_{\text{pixel}} = \frac{\int_{t_0}^{t_1}\left\|y(t)-y_d\right\|dt}{t_1-t_0}$

This index measures the average image error over the time period from $t_0$ to $t_1$. In the simulation, $t_1 = 10$ s is set as the end of the simulation, and $t_0$ is set as 6 s.
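Given the sampled projection-error history of a simulation run, the index of Equation (53) reduces to a time average; a minimal sketch follows (the error samples here are synthetic and only illustrate the computation).

```python
import numpy as np

def image_stability_index(t, y, y_d, t0=6.0, t1=10.0):
    """Discrete approximation of Equation (53): average pixel distance to y_d over [t0, t1]."""
    mask = (t >= t0) & (t <= t1)
    err = np.linalg.norm(y[mask] - y_d, axis=1)
    return float(np.mean(err))

# Synthetic example: 0.05 s sampling, projection hovering a few pixels from the image center.
t = np.arange(0.0, 10.0, 0.05)
y_d = np.array([640.0, 480.0])
y = y_d + np.random.default_rng(1).normal(0.0, 3.0, size=(t.size, 2))
print(image_stability_index(t, y, y_d))
```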

4.2. Simulation Results and Analysis

4.2.1. Simulation Results of the Two Traditional Methods

The simulation results of the position-based tracking method (51), including the variation curves of the angular velocity and the control torque of the pan-tilt system, as well as the target’s projection coordinate and trajectories on the image plane are shown in Figure 4, Figure 5, Figure 6 and Figure 7, and the results of the traditional image-based tracking method (52) are shown in Figure 8, Figure 9, Figure 10 and Figure 11, respectively.

It can be seen that, due to the identical initial conditions, the coordinates of the target’s projection on the image plane are the same at the initial moment (apart from slight differences caused by image measurement noise). Because of the deviation between the target projection point and the image center point, the image-based tracking method (52) generates real-time instructions from the deviation between the current image coordinates and the center point. By adjusting the two axes of the pan-tilt, the target projection gradually approaches the image center, and it ultimately stabilizes in the area near the center point (the green center point in Figure 10). For the position-based tracking method (51), Figures 4–7 show that the onboard camera’s pointing also reaches a steady state, but the target projection gradually moves toward the edge of the image plane and eventually disappears from the camera’s field of view, resulting in tracking failure.

The reasons for the difference in tracking performance between the two methods are analyzed as follows. For the position-based tracking method (51), the expected angle and angular velocity of the pan-tilt are computed directly from the target’s measured position and the camera parameters. Because both of these critical inputs contain deviations, the control system generates biased control commands. In this state, there is a significant directional deviation between the camera optical axis and the target direction, resulting in the loss of the target image. For the image-based tracking method (52), by contrast, the expected state does not depend directly on the target position information but is calculated in real time from the deviation between the target’s projection coordinates on the image plane and the expected projection point. Its tracking performance is therefore affected only by the uncertainty of the camera parameters, whose influence is smaller than that of both uncertainties acting simultaneously. Consequently, the target ultimately remains near the center of the camera’s field of view, achieving more effective visual tracking.

4.2.2. Simulation Results of the Adaptive Method

The results of the adaptive tracking method (39) proposed in this paper are shown in Figure 12, Figure 13, Figure 14 and Figure 15. It can be seen that under the action of the adaptive tracking method, the target eventually converges to the center position of the imaging plane, achieving high precision visual tracking of the target. By comparing Figure 10 and Figure 14, it can be seen that under the adaptive tracking method, the distance between the target’s final projection point and the expected projection point is smaller than that of the traditional image-based tracking method, indicating that the adaptive method has a higher tracking accuracy.

Table 7 lists the image stability index (53) of the different tracking methods under various simulation conditions. In the table, Condition 1 involves uncertainties in both the camera parameters and the target position; Condition 2 involves only camera parameter uncertainty; Condition 3 involves only target position uncertainty; and Condition 4 is free from any uncertainty (i.e., the camera parameters are accurately calibrated and the target position is precisely measured). It can be seen that when the camera is calibrated and the target’s position is precisely measured, the stability indices of the three methods are all relatively small, indicating that in the ideal situation without uncertainties, all three methods achieve effective, high-accuracy visual tracking of the target. When there is uncertainty in the target’s position, the stability index of the position-based controller increases rapidly, indicating a rapidly growing tracking error, while that of the image-based controller remains approximately unchanged (the small numerical differences are caused by image noise). This is because the input of the image-based controller comes entirely from the projection coordinates of the target, so the measurement deviation of the target’s position within the orbital plane does not affect its tracking accuracy. When uncertainties exist in the camera parameters, however, both the position-based method (51) and the image-based method (52) show a significant increase in the image stability index, with the image-based method showing the smaller increase, indicating that it is more robust to camera parameter uncertainties. The method (39) proposed in this paper yields the smallest image stability index across all four conditions, with minimal variation among them, indicating strong robustness against both types of uncertainties.

4.2.3. Robustness Test and Discussion of the Adaptive Method

To further investigate the robustness of the proposed tracking method against the two types of uncertainties, this subsection first examines the impact of varying target initial position errors on tracking accuracy by systematically adjusting the magnitude of these deviations. The final image stability indices are recorded in Table 8. The camera parameter settings remain consistent with the previous section.

A comparison of the data in the table revealed that for the proposed visual tracking method for video satellites, the image stability index increased as the initial target positioning error grew under camera parameter uncertainty conditions. This indicates that the positioning accuracy of the proposed method degrades to some extent when the initial target positioning error becomes large. However, even when the positioning error angle increases tenfold, the image tracking accuracy remains higher than that of traditional image-based methods, demonstrating the robustness of the proposed method against initial target positioning deviations. This can be attributed to the fact that the initial positioning bias affects the initial parameter estimates. As the tracking process proceeds, the proposed method adaptively updates the parameters using the target projection information, gradually compensating for the influence of the initial positioning deviation and ultimately achieving high-precision visual tracking of the target.

Next, an extended study was conducted to investigate the adaptability of the proposed method under uncertainties in the orientation of the orbital plane. Assuming initial positioning errors of 0.2° in both the orbital inclination and the right ascension of the ascending node while keeping the other simulation conditions consistent with those in Section 4.2.2, the obtained image stability index was 8.6235. Compared with the results in Section 4.2.2, the tracking accuracy showed a certain degree of degradation, yet remained higher than that of traditional image-based tracking method, whose value is 35.1385. The primary reason for the decline in accuracy lies in the fact that deviations in the orientation of the orbital plane introduce a form of model error, which affects the direction of parameter updates and the final estimated values. However, since the proposed method is essentially an image-feedback-based control strategy, its control law explicitly incorporates the image tracking error Δy. Consequently, even in the presence of model errors, the image feedback term Δy can drive the camera’s optical axis in real-time toward reducing the tracking error, thereby helping to maintain the system’s tracking performance to some extent.

It should be noted that factors such as actual orbital perturbations and non-circular motion may further affect the tracking accuracy of the proposed method. How to maintain high-precision tracking under more complex dynamic scenarios is a key issue to be addressed in future research.

5. Conclusions

This paper proposes an adaptive visual tracking method to address the visual tracking problem for video satellites subject to uncertainties in both camera parameters and the target’s position. By comparing with the traditional position-based tracking method and image-based tracking method, the results demonstrate that the position-based method completely fails under these dual uncertainties. In contrast, the proposed adaptive tracking method effectively mitigates their negative impact, reducing the steady-state image deviation by approximately an order of magnitude compared to the traditional image-based approach. These results demonstrate the superior robustness and precision of the proposed method in handling simultaneous uncertainties of camera parameters and the target’s position.

For future work, it would be valuable to relax the constraints on the target’s motion characteristics, such as investigating adaptive tracking for space targets in arbitrary orbits to further broaden the method’s applicability. Furthermore, additional considerations can be given to conditions such as image measurement noise suppression [45] and satellite-ground communication constraints [46]. The design of corresponding observers and control methods can be developed to better meet the requirements of different missions.

Author Contributions

Conceptualization, Z.Z., C.F. and H.S.; Methodology, Z.Z. and C.F.; Software, Z.Z.; Validation, Z.Z.; Formal analysis, Z.Z.; Investigation, Z.Z.; Resources, C.F. and H.S.; Data curation, Z.Z., C.F. and H.S.; Writing—original draft, Z.Z. and C.F.; Writing—review & editing, Z.Z. and C.F.; Visualization, C.F. and H.S.; Supervision, C.F. and H.S.; Project administration, C.F. and H.S.; Funding acquisition, C.F. and H.S. All authors have read and agreed to the published version of the manuscript.

Data Availability Statement

The original contributions presented in the study are included in the article, further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Footnotes

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Figures and Tables

Figure 1 Schematic Diagram of the Video Satellite Equipped with a Pan-tilt Camera.

Figure 2 Diagram of the Frames.

Figure 3 Flowchart of the control method.

Figure 4 Angular velocity variation curve of the pan-tilt with position-based tracking method (51).

Figure 5 Control torque variation curve of position-based tracking method (51).

Figure 6 Target projection coordinates on the image plane with the position-based tracking method (51).

Figure 7 Target projection trajectories on the image plane with the position-based tracking method (51).

Figure 8 Angular velocity variation curve of the pan-tilt with image-based tracking method (52).

Figure 9 Control torque variation curve of the image-based tracking method (52).

Figure 10 Target projection coordinates on the image plane with the image-based tracking method (52).

Figure 11 Target projection trajectories on the image plane with the image-based tracking method (52).

Figure 12 Angular velocity variation curve of the pan-tilt with the adaptive tracking method (39).

Figure 13 Control torque variation curve of the adaptive tracking method (39).

Figure 14 Target projection coordinates on the image plane with the adaptive tracking method.

Figure 15 Target projection trajectories on the image plane with the adaptive tracking method (39).

Table 1. Orbit elements of the video satellite at the initial moment.

Parameter (Unit) a (km) e i (deg) Ω (deg) ω (deg) f (deg)
Value 6878.14 0 45.0487 156.6496 270.0358 213.3153

Table 2. Orbit elements of the near-circular orbit target at the initial moment.

Parameter (Unit) a (km) e i (deg) Ω (deg) ω + f (deg) Δ(ω + f) (deg)
Value 6993.14 0 45.0672 146.6371 133.2061 0.2

Table 3. Theoretical and real values of the onboard camera.

Camera Parameters Theoretical Values (Unit) Real Values (Unit)
f 1 (m) 1.1 (m)
u 0 ,   v 0 (640, 480) (pixel) (660, 465) (pixel)
α 90 (deg) 89.8 (deg)
d x 5 (μm) 5.1 (μm)
d y 5 (μm) 5.05 (μm)
T e c $\begin{bmatrix}0 & 0 & 1 & 0\\ 0 & 1 & 0 & 0\\ 1 & 0 & 0 & 0\\ 0 & 0 & 0 & 1\end{bmatrix}$ $\begin{bmatrix}0.0035 & 0 & 0.9999 & 0.1\\ 0 & 1 & 0 & 0.05\\ 0.9999 & 0 & 0.0035 & 0.04\\ 0 & 0 & 0 & 1\end{bmatrix}$

Table 4. Parameters of the adaptive tracking method.

Parameters Value
K 1 12 × 10 15   I 2
Κ 2 4   I 2
Κ 3 50   I 2
Q 1 × 10 5   I 2
λ 1
α 0.5
Γ k 1 × 10 3   I 27
Γ z 1 × 10 3   I 10

Table 5. Parameters of the image-based tracking method (52).

Parameters Value
k o 15   I 2 × 2
k s 30   I 2 × 2
k p 15   I 2 × 2
k d 12   I 2 × 2

Table 6. Physical parameters and initial motion state of the pan-tilt system.

Parameters Value (Unit)
I 1 0.08 (kg·m2)
I 2 0.08 (kg·m2)
m 2 0.5 (kg)
θ 0 67.9387 ,   46.0683 T   ( deg )
θ ˙ 0 0.2 ,   0.1 T   ( deg / s )

Table 7. Image stability index (53) of different tracking methods under different conditions.

Tracking Methods Condition 1 Condition 2 Condition 3 Condition 4
Position-based method (51) 6.6179 × 103 795.4040 6.1927 × 103 8.5417
Image-based method (52) 35.1385 35.1367 9.8137 9.8144
Adaptive method (39) 3.3927 3.3437 3.3679 3.3120

Table 8. Image stability index (53) under different initial position uncertainties.

Δ(ω + f) (deg) Image Stability Index (53)
0 3.3120
0.1 3.3814
0.2 3.3927
1 3.6696
2 6.9395

Appendix A

It can be seen that Equations (15) and (16) share an analogous structure. The proof for Theorem 1 will be detailed step-by-step below using Equation (15) as an example.

According to Equations (9) and (10), by denoting $M_{be}M_{ib}\triangleq\left[m_{ij}\right]_{3\times3}$ and combining with Equation (13), we have

$a_{\dot\theta}(t)\dot\theta(t) = n_3^{(3)T}\left[\dot M_{be}M_{ib}\left(r_T^i-r_b^i\right)+\dot r_{eb}^{e}\right] = n_3^{(3)T}\frac{\partial}{\partial\theta}\left[M_{be}M_{ib}\left(R\,T_r\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix}-r_b^i\right)+r_{eb}^{e}\right]\dot\theta = \frac{\partial}{\partial\theta}\left\{\sum_{j=1}^{3}n_{3j}\left[R\sum_{k=1}^{3}m_{jk}\left(t_{r,k1}\cos f_0+t_{r,k2}\sin f_0\right)+r_{eb,j}^{e}-\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}\right]\right\}\dot\theta$

and

$A_{\dot\theta}(t)\dot\theta(t) = \frac{\partial}{\partial\theta}\begin{bmatrix}\sum_{j=1}^{3}\left(n_{1j}-u\,n_{3j}\right)\left[R\sum_{k=1}^{3}m_{jk}\left(t_{r,k1}\cos f_0+t_{r,k2}\sin f_0\right)+r_{eb,j}^{e}-\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}\right]\\ \sum_{j=1}^{3}\left(n_{2j}-v\,n_{3j}\right)\left[R\sum_{k=1}^{3}m_{jk}\left(t_{r,k1}\cos f_0+t_{r,k2}\sin f_0\right)+r_{eb,j}^{e}-\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}\right]\end{bmatrix}\dot\theta$

Let $\sigma_h=\left[n_{h1}\cos f_0,\,n_{h1}\sin f_0,\,n_{h1},\,n_{h2}\cos f_0,\,n_{h2}\sin f_0,\,n_{h2},\,n_{h3}\cos f_0,\,n_{h3}\sin f_0,\,n_{h3}\right]^T\in\mathbb{R}^{9\times1}$ and $\eta_{hl}=\left[R\frac{\partial}{\partial\theta}\left(\sum_{k=1}^{3}m_{lk}t_{r,k1}\right)\dot\theta,\;R\frac{\partial}{\partial\theta}\left(\sum_{k=1}^{3}m_{lk}t_{r,k2}\right)\dot\theta,\;\frac{\partial}{\partial\theta}\left(r_{eb,l}^{e}-\sum_{k=1}^{3}m_{lk}r_{b,k}^{i}\right)\dot\theta\right]\in\mathbb{R}^{1\times3}$. Then, if the regression matrices $Y_{z,\theta}$ and $Y_{y,\theta}$ have the expressions

$Y_{z,\theta}=\left[0_{1\times9},\,0_{1\times9},\,\eta_{31},\,\eta_{32},\,\eta_{33}\right],\qquad Y_{y,\theta}=\begin{bmatrix}\eta_{11} & \eta_{12} & \eta_{13} & 0_{1\times9} & -u\eta_{31} & -u\eta_{32} & -u\eta_{33}\\ 0_{1\times9} & \eta_{21} & \eta_{22} & \eta_{23} & -v\eta_{31} & -v\eta_{32} & -v\eta_{33}\end{bmatrix}$

we obtain

$a_{\dot\theta}(t)\dot\theta(t)=Y_{z,\theta}(t,\Omega,i,R)\,\sigma_k,\qquad A_{\dot\theta}(t)\dot\theta(t)=Y_{y,\theta}(t,\Omega,i,R,y)\,\sigma_k$

Similarly, for $a_\omega(t)$ and $A_\omega(t)$, we have

$a_\omega(t)\omega(t)=n_3^{(3)T}\frac{\partial}{\partial\psi}\left[M_{be}M_{ib}\left(R\,T_r\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix}-r_b^i\right)\right]\dot\psi,\qquad A_\omega(t)\omega(t)=\left(P^{(3)}-y\,n_3^{(3)T}\right)\frac{\partial}{\partial\psi}\left[M_{be}M_{ib}\left(R\,T_r\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix}-r_b^i\right)\right]\dot\psi$

where $\psi$ is the Euler attitude angle of the video satellite. These can be rewritten as

$a_\omega(t)\omega(t)=\frac{\partial}{\partial\psi}\left\{\sum_{j=1}^{3}n_{3j}\left[R\sum_{k=1}^{3}m_{jk}\left(t_{r,k1}\cos f_0+t_{r,k2}\sin f_0\right)-\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}\right]\right\}\dot\psi$

and

$A_\omega(t)\omega(t)=\frac{\partial}{\partial\psi}\begin{bmatrix}\sum_{j=1}^{3}\left(n_{1j}-u\,n_{3j}\right)\left[R\sum_{k=1}^{3}m_{jk}\left(t_{r,k1}\cos f_0+t_{r,k2}\sin f_0\right)-\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}\right]\\ \sum_{j=1}^{3}\left(n_{2j}-v\,n_{3j}\right)\left[R\sum_{k=1}^{3}m_{jk}\left(t_{r,k1}\cos f_0+t_{r,k2}\sin f_0\right)-\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}\right]\end{bmatrix}\dot\psi$

Let $\eta_{hl}=\left[R\frac{\partial}{\partial\psi}\left(\sum_{k=1}^{3}m_{lk}t_{r,k1}\right)\dot\psi,\;R\frac{\partial}{\partial\psi}\left(\sum_{k=1}^{3}m_{lk}t_{r,k2}\right)\dot\psi,\;-\frac{\partial}{\partial\psi}\left(\sum_{k=1}^{3}m_{lk}r_{b,k}^{i}\right)\dot\psi\right]\in\mathbb{R}^{1\times3}$. Then, if

$Y_{z,\omega}(t,\Omega,i,R)=\left[0_{1\times9},\,0_{1\times9},\,\eta_{31},\,\eta_{32},\,\eta_{33}\right],\qquad Y_{y,\omega}(t,\Omega,i,R,y)=\begin{bmatrix}\eta_{11} & \eta_{12} & \eta_{13} & 0_{1\times9} & -u\eta_{31} & -u\eta_{32} & -u\eta_{33}\\ 0_{1\times9} & \eta_{21} & \eta_{22} & \eta_{23} & -v\eta_{31} & -v\eta_{32} & -v\eta_{33}\end{bmatrix}$

we have

$a_\omega(t)\omega(t)=Y_{z,\omega}(t,\Omega,i,R)\,\sigma_k,\qquad A_\omega(t)\omega(t)=Y_{y,\omega}(t,\Omega,i,R,y)\,\sigma_k$

where $\sigma_k=\left[\sigma_1^T,\,\sigma_2^T,\,\sigma_3^T\right]^T$.

The Proof of Theorem 1 is completed. □

According to Equations (9) and (10), we have

$a_v(t)=n_3^{(3)T}M_{be}M_{ib}\left(v_T^i-v_b^i\right),\qquad A_v(t)=\left(P^{(3)}-y\,n_3^{(3)T}\right)M_{be}M_{ib}\left(v_T^i-v_b^i\right)$

Combining with Equation (14), we obtain

$a_v(t)=R\omega_t\sum_{j=1}^{3}n_{3j}\sum_{k=1}^{3}m_{jk}\left(t_{v,k1}\cos f_0+t_{v,k2}\sin f_0\right)-\sum_{j=1}^{3}n_{3j}\sum_{k=1}^{3}m_{jk}v_{b,k}^{i}$

and

$A_v(t)=\begin{bmatrix}R\omega_t\sum_{j=1}^{3}\left(n_{1j}-u\,n_{3j}\right)\sum_{k=1}^{3}m_{jk}\left(t_{v,k1}\cos f_0+t_{v,k2}\sin f_0\right)-\sum_{j=1}^{3}\left(n_{1j}-u\,n_{3j}\right)\sum_{k=1}^{3}m_{jk}v_{b,k}^{i}\\ R\omega_t\sum_{j=1}^{3}\left(n_{2j}-v\,n_{3j}\right)\sum_{k=1}^{3}m_{jk}\left(t_{v,k1}\cos f_0+t_{v,k2}\sin f_0\right)-\sum_{j=1}^{3}\left(n_{2j}-v\,n_{3j}\right)\sum_{k=1}^{3}m_{jk}v_{b,k}^{i}\end{bmatrix}$

Let $\eta_{hl}=\left[R\omega_t\sum_{k=1}^{3}m_{lk}t_{v,k1},\;R\omega_t\sum_{k=1}^{3}m_{lk}t_{v,k2},\;-\sum_{k=1}^{3}m_{lk}v_{b,k}^{i}\right]\in\mathbb{R}^{1\times3}$. Then, when the regression matrices $Y_{z,v}$ and $Y_{y,v}$ take the form

$Y_{z,v}=\left[0_{1\times9},\,0_{1\times9},\,\eta_{31},\,\eta_{32},\,\eta_{33}\right],\qquad Y_{y,v}=\begin{bmatrix}\eta_{11} & \eta_{12} & \eta_{13} & 0_{1\times9} & -u\eta_{31} & -u\eta_{32} & -u\eta_{33}\\ 0_{1\times9} & \eta_{21} & \eta_{22} & \eta_{23} & -v\eta_{31} & -v\eta_{32} & -v\eta_{33}\end{bmatrix}$

it can be derived that

$a_v(t)=Y_{z,v}(t,\Omega,i,R)\,\sigma_k,\qquad A_v(t)=Y_{y,v}(t,\Omega,i,R,y)\,\sigma_k$

where $\sigma_k=\left[\sigma_1^T,\,\sigma_2^T,\,\sigma_3^T\right]^T$.

The Proof of Theorem 2 is completed. □

Combining Equations (8) and (13), we obtain

$z_c(t)=n_3^{(3)T}M_{be}M_{ib}\left(r_T^i-r_b^i\right)+n_{34}=n_3^{(3)T}M_{be}M_{ib}\,R\,T_r\begin{bmatrix}\cos f_0\\ \sin f_0\end{bmatrix}-n_3^{(3)T}M_{be}M_{ib}\,r_b^i+n_{34}=R\sum_{j=1}^{3}n_{3j}\sum_{k=1}^{3}m_{jk}\left(t_{r,k1}\cos f_0+t_{r,k2}\sin f_0\right)-\sum_{j=1}^{3}n_{3j}\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}+n_{34}$

Let $\sigma_z=\left[n_{31}\cos f_0,\,n_{31}\sin f_0,\,n_{31},\,n_{32}\cos f_0,\,n_{32}\sin f_0,\,n_{32},\,n_{33}\cos f_0,\,n_{33}\sin f_0,\,n_{33},\,n_{34}\right]^T$ and $\lambda_j=\left[R\sum_{k=1}^{3}m_{jk}t_{r,k1},\;R\sum_{k=1}^{3}m_{jk}t_{r,k2},\;-\sum_{k=1}^{3}m_{jk}r_{b,k}^{i}\right]$. Then, if the regression matrix is $Y_z(t,\Omega,i,R)=\left[\lambda_1,\,\lambda_2,\,\lambda_3,\,1\right]$, it holds that $z_c(t)=Y_z(t,\Omega,i,R)\,\sigma_z$.

The Proof of Theorem 3 is completed. □

Appendix B

Let $r_c^i(t)$ denote the position of the origin of the camera frame in the Earth-centered inertial frame. According to the geometric relationship, we have

$r_c^i(t) = r_b^i(t) + r_{be}^i(t) + r_{ec}^i(t) = r_b^i(t) + M_{ib}^T r_{be}^b(t) + M_{ib}^T M_{be}^T M_{ec}^T r_{ec}^c(t)$

By taking the derivative of $r_c^i(t)$, the velocity of the camera is obtained as

$\dot r_c^i(t) = v_b^i(t) + \dot M_{ib}^T r_{be}^b(t) + M_{ib}^T\dot r_{be}^b(t) + \dot M_{ib}^T M_{be}^T M_{ec}^T r_{ec}^c(t) + M_{ib}^T\dot M_{be}^T M_{ec}^T r_{ec}^c(t) = v_b^i(t) + \left[-M_{ib}^T\mathrm{sk}\!\left(r_{be}^b(t)\right) - M_{ib}^T\mathrm{sk}\!\left(M_{be}^T M_{ec}^T r_{ec}^c(t)\right)\right]\omega_b + \left[M_{ib}^T A_3 - M_{ib}^T M_{be}^T\mathrm{sk}\!\left(M_{ec}^T r_{ec}^c(t)\right)A_1\right]\dot\theta(t)$

where $A_1$ and $A_3$ map the joint rate $\dot\theta(t)$ to the pan-tilt's relative angular velocity $\omega_e^e$ and to $\dot r_{be}^b(t)$, respectively.

By introducing the generalized velocity $V_c^i(t)$ (comprising the camera's linear and angular velocities), it can be obtained that

$V_c^i(t) = \begin{bmatrix}v_c^i(t)\\ \omega_c^i(t)\end{bmatrix} = \begin{bmatrix}I_{3\times3} & -M_{ib}^T\mathrm{sk}\!\left(r_{be}^b(t)\right)-M_{ib}^T\mathrm{sk}\!\left(M_{be}^TM_{ec}^Tr_{ec}^c(t)\right) & M_{ib}^TA_3-M_{ib}^TM_{be}^T\mathrm{sk}\!\left(M_{ec}^Tr_{ec}^c(t)\right)A_1\\ 0_{3\times3} & I_{3\times3} & M_{ib}^TA_4\end{bmatrix}\begin{bmatrix}v_b^i(t)\\ \omega_b(t)\\ \dot\theta(t)\end{bmatrix}$

Let $r_T^c = \left[x_{cT}, y_{cT}, z_{cT}\right]^T$ represent the target's position in the camera frame and $s=\left[x_0, y_0\right]^T$ be its projected coordinate on the normalized image plane, with components

$x_0 = \frac{x_{cT}}{z_{cT}}, \qquad y_0 = \frac{y_{cT}}{z_{cT}}$

The derivative of $s$ can be represented as

$\dot s = J_c\left(V_c^c - V_T^c\right) = J_c\begin{bmatrix}M_{ec}M_{be}M_{ib} & 0\\ 0 & M_{ec}M_{be}M_{ib}\end{bmatrix}V_c^i(t) - J_cV_T^c \triangleq J_{1a}V_b^i(t) + J_{2a}\dot\theta(t) - J_cV_T^c$

where $J_{1a}$ and $J_{2a}$ collect the blocks of the above transformation acting on $\left[v_b^{iT}(t),\,\omega_b^T(t)\right]^T$ and $\dot\theta(t)$, respectively, and $J_c$ is the image Jacobian matrix with the expression

$J_c = \begin{bmatrix}-\frac{1}{z_{cT}} & 0 & \frac{x_0}{z_{cT}} & x_0y_0 & -\left(1+x_0^2\right) & y_0\\ 0 & -\frac{1}{z_{cT}} & \frac{y_0}{z_{cT}} & 1+y_0^2 & -x_0y_0 & -x_0\end{bmatrix}$
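The image Jacobian above is the standard point-feature interaction matrix used in image-based visual servoing [44]; a small helper for it is sketched below under the normalized-coordinate convention stated above (the numerical arguments are arbitrary).

```python
import numpy as np

def interaction_matrix(x0, y0, z):
    """Standard 2x6 interaction matrix of a normalized image point (x0, y0) at depth z,
    mapping the camera's [linear; angular] velocity to the feature velocity s_dot."""
    return np.array([
        [-1.0 / z, 0.0, x0 / z, x0 * y0, -(1.0 + x0**2), y0],
        [0.0, -1.0 / z, y0 / z, 1.0 + y0**2, -x0 * y0, -x0],
    ])

print(interaction_matrix(0.02, -0.01, 8.0e4))
```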

An observer to estimate the unknown target's velocity $V_T^c$ is designed as

$\hat d(t) = k_o\left(s - \hat s\right)$

where $\dot{\hat s} = J_{1a}V_b^i(t) + J_{2a}\dot\theta(t) + k_o\left(s-\hat s\right)$.

In visual tracking of the space target, the desired image feature is set to $s_d=\left[0,0\right]^T$. Then, the image tracking error on the normalized plane is

$e_s(t) = s(t) - s_d = s(t)$

Taking the derivative of $e_s(t)$ yields

$\dot e_s(t) = J_{1a}V_b^i(t) + J_{2a}\dot\theta(t) + d(t)$

where $d(t) = -J_cV_T^c$ denotes the unknown term induced by the target's velocity. The desired angular velocity of the pan-tilt is defined as

$\dot\theta_d(t) = J_{2a}^{+}\left[-k_s e_s(t) - J_{1a}V_b^i(t) - \hat d(t)\right]$

where $J_{2a}^{+}$ denotes the pseudo-inverse of $J_{2a}$ and $k_s$ is a positive definite diagonal matrix. The expected pan-tilt joint angle $\theta_d(t)$ is obtained by integrating the desired angular velocity. The image-based tracking method is then designed as

$\tau = M\ddot\theta_d(t) + C\dot\theta_d(t) + G - k_p\left(\theta(t)-\theta_d(t)\right) - k_d\left(\dot\theta(t)-\dot\theta_d(t)\right)$

where $k_p$ and $k_d$ are positive definite diagonal matrices.

References

1. Yao, J.; Xu, B.; Li, X.; Yang, S. A clustering scheduling strategy for space debris tracking. Aerosp. Sci. Technol.; 2025; 157, 109805. [DOI: https://dx.doi.org/10.1016/j.ast.2024.109805]

2. Li, G.; Liu, J.; Jiang, H.; Liu, C. Research on the Efficient Space Debris Observation Method Based on Optical Satellite Constellations. Appl. Sci.; 2023; 13, 4127. [DOI: https://dx.doi.org/10.3390/app13074127]

3. Song, C.; Fan, C.; Wang, M. Image-Based Adaptive Staring Attitude Control for Multiple Ground Targets Using a Miniaturized Video Satellite. Remote Sens.; 2022; 14, 3974. [DOI: https://dx.doi.org/10.3390/rs14163974]

4. Fan, C.; Wang, M.; Song, C.; Zhong, Z.; Yang, Y. Anti-off-Target Control Method for Video Satellite Based on Potential Function. J. Syst. Eng. Electron.; 2024; 35, pp. 1583-1593. [DOI: https://dx.doi.org/10.23919/JSEE.2024.000098]

5. Zhang, X.; Xiang, J.; Zhang, Y. Space Object Detection in Video Satellite Images Using Motion Information. Int. J. Aerosp. Eng.; 2017; 2017, 1024529. [DOI: https://dx.doi.org/10.1155/2017/1024529]

6. Xiao, A.; Wang, Z.; Wang, L.; Ren, Y. Super-Resolution for “Jilin-1” Satellite Video Imagery via a Convolutional Network. Sensors; 2018; 18, 1194. [DOI: https://dx.doi.org/10.3390/s18041194] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/29652838]

7. Zhang, S.; Yuan, Q.; Li, J. Video Satellite Imagery Super Resolution for ‘Jilin-1’ via a Single-and-Multi Frame Ensembled Framework. Proceedings of the IGARSS 2020—2020 IEEE International Geoscience and Remote Sensing Symposium; Virtual, 26 September–2 October 2020; pp. 2731-2734.

8. Julzarika, A. Utilization of LAPAN Satellite (TUBSAT, A2, and A3) in supporting Indonesia’s potential as maritime center of the world. IOP Conf. Ser. Earth Environ. Sci.; 2017; 54, 012097. [DOI: https://dx.doi.org/10.1088/1755-1315/54/1/012097]

9. Bhushan, S.; Shean, D.; Alexandrov, O.; Henderson, S. Automated digital elevation model (DEM) generation from very-high-resolution Planet SkySat triplet stereo and video imagery. ISPRS J. Photogramm. Remote Sens.; 2021; 173, pp. 151-165. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2020.12.012]

10. Rahayu, D.A.; Nugroho, M.; Ferdiansyah, N.; Amiludin, M.F.; Hakim, P.R.; Harsono, S.D. Development of Ground Station Performance Information System for LAPAN Satellite Operations. Proceedings of the 2021 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology (ICARES); Virtual, 3–4 November 2021; pp. 1-7.

11. Lian, Y.; Gao, Y.; Zeng, G. Staring Imaging Attitude Control of Small Satellites. J. Guid. Control. Dyn.; 2017; 40, pp. 1278-1285. [DOI: https://dx.doi.org/10.2514/1.G002197]

12. Han, S.; Ahn, J.; Tahk, M.-J. Analytical Staring Attitude Control Command Generation Method for Earth Observation Satellites. J. Guid. Control. Dyn.; 2022; 45, pp. 1347-1356. [DOI: https://dx.doi.org/10.2514/1.G006041]

13. Li, H.; Zhao, Y.; Li, B.; Li, G. Attitude Control of Staring-Imaging Satellite Using Permanent Magnet Momentum Exchange Sphere. Proceedings of the 2019 22nd International Conference on Electrical Machines and Systems (ICEMS); Harbin, China, 11–14 August 2019; pp. 1-6.

14. Li, C.; Geng, Y.; Guo, Y.; Han, P. Suboptimal Repointing Maneuver of a staring-mode spacecraft with one DOF for final attitude. Acta Astronaut.; 2020; 175, pp. 349-361. [DOI: https://dx.doi.org/10.1016/j.actaastro.2020.04.040]

15. Geng, Y.; Li, C.; Guo, Y.; Biggs, J.D. Hybrid robust and optimal control for pointing a staring-mode spacecraft. Aerosp. Sci. Technol.; 2020; 105, 105959. [DOI: https://dx.doi.org/10.1016/j.ast.2020.105959]

16. Niu, X.; Lu, B.; Feng, B.; Li, Q. Linear parameter-varying gain-scheduled preview-based robust attitude control design for a staring-mode satellite. Aerosp. Sci. Technol.; 2022; 129, 107816. [DOI: https://dx.doi.org/10.1016/j.ast.2022.107816]

17. Zdešar, A.; Klančar, G.; Mušič, G.; Matko, D.; Škrjanc, I. Design of the image-based satellite attitude control algorithm. Proceedings of the 2013 XXIV International Conference on Information, Communication and Automation Technologies (ICAT); Sarajevo, Bosnia and Herzegovina, 30 October–1 November 2013; pp. 1-8.

18. Zhang, X.; Xiang, J.; Zhang, Y. Tracking imaging attitude control of video satellite for cooperative space object. Proceedings of the 2016 IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC); Xi’an, China, 3–5 October 2016; pp. 429-434.

19. Felicetti, L.; Emami, M.R. Image-based attitude maneuvers for space debris tracking. Aerosp. Sci. Technol.; 2018; 76, pp. 58-71. [DOI: https://dx.doi.org/10.1016/j.ast.2018.02.002]

20. Pei, W. Staring Imaging Attitude Tracking Control Laws for Video Satellites Based on Image Information by Hyperbolic Tangent Fuzzy Sliding Mode Control. Comput. Intell. Neurosci.; 2022; 2022, 8289934. [DOI: https://dx.doi.org/10.1155/2022/8289934] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36110911]

21. Li, P.; Dong, Y.; Li, H. Staring Imaging Real-Time Optimal Control Based on Neural Network. Int. J. Aerosp. Eng.; 2020; 2020, 8822223. [DOI: https://dx.doi.org/10.1155/2020/8822223]

22. Wang, M.; Fan, C.; Song, C. Image-Based Visual Tracking Attitude Control Research on Small Video Satellites for Space Targets. Proceedings of the 2022 IEEE International Conference on Real-Time Computing and Robotics (RCAR); Guiyang, China, 17–22 July 2022; pp. 174-179.

23. Fan, C.; Zhong, Z.; Wang, M.; Yang, Y. Anti-off-target control of target tracking for small video satellite based on field of view zoning. J. Natl. Univ. Def. Technol.; 2025; 47, pp. 98-108.

24. Wang, M.; Song, C.; Fan, C.; Zhang, Y. Image-Based Sliding Mode Attitude Control Research on Small Video Satellites. Proceedings of the 8th China High Resolution Earth Observation Conference (CHREOC 2022); Beijing, China, 5–8 November 2022; Springer: Singapore, 2023; pp. 135-149.

25. Wang, C.; Fan, C.; Song, H.; Zhong, Z.; Zhang, Y. Non-cooperative Target Tracking Control for Video Satellites Based on Genetic Algorithm. Proceedings of the Advances in Guidance, Navigation and Control; Changsha, China, 9–11 August 2024; Springer: Singapore, 2025; pp. 610-619.

26. Fan, C.; Song, C.; Zhong, Z. Video Satellite Staring Control of Ground Targets Based on Visual Velocity Estimation and Uncalibrated Cameras. Remote Sens.; 2025; 17, 1116. [DOI: https://dx.doi.org/10.3390/rs17071116]

27. Song, C.; Fan, C.; Song, H.; Wang, M. Spacecraft Staring Attitude Control for Ground Targets Using an Uncalibrated Camera. Aerospace; 2022; 9, 283. [DOI: https://dx.doi.org/10.3390/aerospace9060283]

28. Zhang, Z.; Zhang, G.; Cao, J.; Li, C.; Chen, W.; Ning, X.; Wang, Z. Overview on Space-Based Optical Orbit Determination Method Employed for Space Situational Awareness: From Theory to Application. Photonics; 2024; 11, 610. [DOI: https://dx.doi.org/10.3390/photonics11070610]

29. Qu, J.; Fu, T.; Chen, D.; Cao, H.; Zhang, S. An analytical initial orbit determination method using two observations from a bistatic radar. Adv. Space Res.; 2022; 70, pp. 1949-1964. [DOI: https://dx.doi.org/10.1016/j.asr.2022.06.070]

30. Zhang, S.; Fu, T.; Chen, D.; Ding, S.; Gao, M. An Initial Orbit Determination Method Using Single-Site Very Short Arc Radar Observations. IEEE Trans. Aerosp. Electron. Syst.; 2020; 56, pp. 1856-1872. [DOI: https://dx.doi.org/10.1109/TAES.2019.2937661]

31. Armellin, R.; Di Lizia, P. Probabilistic Optical and Radar Initial Orbit Determination. J. Guid. Control. Dyn.; 2017; 41, pp. 101-118. [DOI: https://dx.doi.org/10.2514/1.G002217]

32. ESA Space Debris Office. ESA's Annual Space Environment Report; The European Space Agency (ESA): Paris, France, 2025.

33. Agostinelli, I.; Goracci, G.; Curti, F. Initial orbit determination via artificial intelligence for too-short arcs. Acta Astronaut.; 2024; 222, pp. 609-624. [DOI: https://dx.doi.org/10.1016/j.actaastro.2024.06.006]

34. Hwang, H.; Park, S.-Y.; Lee, E. Angles-Only Initial Orbit Determination of Low Earth Orbit (LEO) Satellites Using Real Observational Data. J. Astron. Space Sci.; 2019; 36, pp. 187-197. [DOI: https://dx.doi.org/10.5140/JASS.2019.36.3.187]

35. Montenbruck, O.; Gill, E. Satellite Orbits: Models, Methods and Applications; Springer: Berlin/Heidelberg, Germany, 2013.

36. Vallado, D.A. Fundamentals of Astrodynamics and Applications; McGraw-Hill: Columbus, OH, USA, 1997.

37. Zhao, K. Problem and Methods for Initial Orbit Determination of Space-Based Optical Space Surveillance. Acta Aeronaut. Astronaut. Sin.; 2023; 44, 326465.

38. Zhang, H. Theories and Methods of Spacecraft Orbital Mechanics; National Defense Industry Press: Beijing, China, 2015.

39. Hughes, P.C. Spacecraft Attitude Dynamics; Dover Publications, Inc.: Mineola, NY, USA, 1986.

40. Niku, S.B. Introduction to Robotics: Analysis, Control, Applications; 3rd ed. John Wiley and Sons: Hoboken, NJ, USA, 2020; 507.

41. Liu, Y.H.; Wang, H.; Wang, C.; Lam, K.K. Uncalibrated visual servoing of robots using a depth-independent interaction matrix. IEEE Trans. Robot.; 2006; 22, pp. 804-817. [DOI: https://dx.doi.org/10.1109/tro.2006.878788]

42. Zhang, H.; Wu, Y.; Zhong, S.; Guo, H. Space target compound pointing control method based on backstepping. Syst. Eng. Electron.; 2023; 45, pp. 2884-2893. [DOI: https://dx.doi.org/10.12305/j.issn.1001-506X.2023.09.28]

43. Boyd, S.; Ghaoui, L.E.; Feron, E.; Balakrishnan, V. Linear Matrix Inequality in Systems and Control Theory; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 1991.

44. Corke, P. Robotics, Vision and Control-Fundamental Algorithms. MATLAB® Second, Completely Revised, Extended and Updated Edition; Springer: Cham, Switzerland, 2017.

45. Hao, L.; Zhang, Y.; Li, H. A Dynamic Event-Triggered Saturation Method for Nonlinear Estimation with Application to Drag-Free Control Systems. IEEE Trans. Aerosp. Electron. Syst.; 2025; 61, pp. 915-931. [DOI: https://dx.doi.org/10.1109/TAES.2024.3449245]

46. Zhang, H.; Liu, W.; Zhang, L.; Meng, Y.; Han, W.; Song, T.; Yang, R. An allocation strategy integrated power, bandwidth, and subchannel in a RCC network. Def. Technol. 2025, ahead of print [DOI: https://dx.doi.org/10.1016/j.dt.2025.08.010]

© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).