
Abstract

Trajectory representation learning transforms raw trajectory data (sequences of spatiotemporal points) into low-dimensional representation vectors that improve downstream tasks such as trajectory similarity computation, prediction, and classification. Existing models primarily adopt self-supervised learning frameworks, often employing Recurrent Neural Networks (RNNs) as encoders to capture local dependencies in trajectory sequences. However, individual mobility within urban areas exhibits regular and periodic patterns, suggesting the need for a more comprehensive representation from both local and global perspectives. To address this, we propose TrajRL-TFF, a trajectory representation learning method based on time-domain and frequency-domain feature fusion. First, given the heterogeneous spatial distribution of trajectory data, a quadtree is employed for spatial partitioning and coding. Then, each trajectory is converted into a quadtree-code-based time series (i.e., a time-domain signal), and its corresponding frequency-domain signal is derived via the Discrete Fourier Transform (DFT). Finally, a trajectory encoder combining an RNN-based time-domain encoder and a Transformer-based frequency-domain encoder is constructed to capture the trajectory's local and global features, respectively, and is trained within a self-supervised sequence encoding-decoding framework using a trajectory perturbation-reconstruction task. Experiments demonstrate that TrajRL-TFF outperforms baselines on downstream tasks including trajectory querying and prediction, confirming that integrating time- and frequency-domain signals enables a more comprehensive representation of human mobility regularities and patterns, and providing valuable guidance for trajectory representation learning and trajectory modeling in future studies.
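
To make the preprocessing stage concrete, the following minimal sketch (not the paper's implementation) maps each point to a fixed-depth quadtree cell code, turns a trajectory into the resulting code sequence (the time-domain signal), and applies the DFT via NumPy to obtain the frequency-domain signal. The bounding box, depth, and all names are illustrative assumptions; the actual method adapts the quadtree partition to the data's spatial density.

```python
# Illustrative sketch only: fixed-depth quadtree coding plus DFT.
# The paper's quadtree adapts to data density; the depth, bounding
# box, and names below are assumptions for demonstration.
import numpy as np

def quadtree_code(lon, lat, bbox=(-180.0, 180.0, -90.0, 90.0), depth=8):
    """Map a (lon, lat) point to an integer quadtree cell code
    by recursive subdivision, 2 bits per level."""
    min_lon, max_lon, min_lat, max_lat = bbox
    code = 0
    for _ in range(depth):
        mid_lon = (min_lon + max_lon) / 2.0
        mid_lat = (min_lat + max_lat) / 2.0
        quadrant = int(lon >= mid_lon) | (int(lat >= mid_lat) << 1)
        code = (code << 2) | quadrant
        if lon >= mid_lon: min_lon = mid_lon
        else: max_lon = mid_lon
        if lat >= mid_lat: min_lat = mid_lat
        else: max_lat = mid_lat
    return code

# A toy trajectory: (lon, lat) points sampled over time.
trajectory = [(116.40, 39.90), (116.41, 39.91), (116.43, 39.92), (116.45, 39.94)]

# Time-domain signal: the trajectory as a quadtree-code sequence.
time_signal = np.array([quadtree_code(lon, lat) for lon, lat in trajectory],
                       dtype=float)

# Frequency-domain signal via the Discrete Fourier Transform.
freq_signal = np.fft.fft(time_signal)
print(time_signal)
print(np.abs(freq_signal))  # magnitude spectrum of the code sequence
```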
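Similarly, a minimal sketch of the dual-branch encoder idea, assuming PyTorch: a GRU reads the time-domain code sequence to capture local dependencies, a Transformer encoder reads the DFT of the same sequence to capture global periodic structure, and the two representations are concatenated. The layer sizes, fusion scheme, and scalar projection of codes (a real system would likely embed discrete codes) are assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class DualDomainEncoder(nn.Module):
    """Hypothetical dual-branch encoder: an RNN over the time-domain
    signal (local features) and a Transformer over the frequency-domain
    signal (global features). Dimensions and fusion are assumptions."""
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Scalar projection stands in for an embedding of discrete codes.
        self.time_proj = nn.Linear(1, d_model)
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)
        self.freq_proj = nn.Linear(2, d_model)  # real/imag parts per DFT bin
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, time_signal):
        # time_signal: (batch, seq_len) of normalized quadtree codes
        x = self.time_proj(time_signal.unsqueeze(-1))
        _, h_n = self.rnn(x)                      # local dependencies
        freq = torch.fft.fft(time_signal, dim=-1) # frequency-domain signal
        f = torch.stack([freq.real, freq.imag], dim=-1)
        f = self.transformer(self.freq_proj(f))   # global periodic patterns
        # Fuse last RNN state with pooled Transformer output.
        return torch.cat([h_n[-1], f.mean(dim=1)], dim=-1)

enc = DualDomainEncoder()
codes = torch.rand(8, 32)   # batch of 8 toy trajectories, 32 codes each
print(enc(codes).shape)     # torch.Size([8, 128])
```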

Full text


© The Author(s) 2025. This work is published under the Creative Commons Attribution 4.0 License (http://creativecommons.org/licenses/by/4.0/).