1. Introduction
A chaotic system is a deterministic system that exhibits irregular, seemingly random motion and whose behavior is uncertain, unrepeatable, and hard to predict. High sensitivity to initial conditions and inherent unpredictability are the main characteristics of chaotic systems. Chaotic phenomena are ubiquitous in many scientific fields, such as atmospheric motion [1], population dynamics [2,3,4], epidemiology [5], and economics, where they have been a hot topic and have attracted wide attention. It is worth noting that a chaotic system is not completely random, as its name might suggest, but has certain structures and patterns. However, owing to the limited understanding of the dynamic mechanisms of chaotic systems, the prediction of chaotic time series remains an important but challenging problem.
With the development of big data and advanced machine learning algorithms, solving prediction problems of chaotic systems in a data-driven way has become a new research direction. Several empirical models for predicting chaotic time series based on machine learning have been proposed. Many well-known artificial neural network (ANN) models, such as the Radial Basis Function (RBF) neural network [6], the neuro-fuzzy model with the Locally Linear Model Tree (LoLiMoT) algorithm [7], the feedforward neural network [8], the multi-layer perceptron (MLP) [9], recurrent neural networks (RNN) [10], the finite impulse response (FIR) neural network [11], deep belief nets (DBN) [12], the Elman neural network [11,13,14], and the wavelet neural network (WNN) [15,16], have been introduced in the literature.
However, the settings of the neural network model parameters greatly affect the performance of these models. Consequently, a substantial amount of work has also been devoted to optimization algorithms and parameter settings. Min Gan et al. presented a state-dependent autoregressive (SD-AR) model, which uses a set of locally linear radial basis function networks (LLRBFNs) to approximate its functional coefficients [17]. In addition, Pauline Ong and Zarita Zainuddin presented a modified cuckoo search algorithm (MCSA) to initialize WNN models [16]. A hybrid learning algorithm called HGAGD, which combines a genetic algorithm (GA) with gradient descent (GD), was proposed to optimize the parameters of a quantum-inspired neural network (QNN) [18]. In [13], the embedding method is used together with an Elman neural network (ENN) to predict the residual time series. A hidden Markov model (HMM) combined with fuzzy inference systems was introduced for time series prediction [19]. Furthermore, many hybrid methods have been developed to improve the performance of these prediction models [20,21].
As mentioned above, model structure and parameter tuning are important factors in chaotic time series prediction with machine learning, and much research has focused on them. To simplify the learning model, a hybrid method combining Hankel Alternative View Of Koopman (HAVOK) analysis and machine learning (HAVOK-ML) is developed in this research to predict chaotic time series. HAVOK analysis was proposed by Brunton [22]. It combines the delay embedding method [23] and Koopman theory [24] to decompose chaotic dynamics into a linear model with intermittent forcing. HAVOK-ML decomposes chaotic dynamics into intermittently forced linear systems with HAVOK and then estimates the forcing term using machine learning. Essentially, predicting a chaotic time series with HAVOK-ML amounts to solving linear ordinary differential equations, which can be done efficiently. The framework can incorporate different types of regression methods, such as Linear Regression or Random Forest Regression (RFR) [25], and combines the advantages of HAVOK theory and machine learning. Therefore, it can obtain better prediction results than directly applying those machine learning models.
This paper is organized as follows. Section 2 briefly describes the theory of the HAVOK analysis combined with the machine learning method for time series prediction. Section 3 applies the proposed combined method to perform multi-step ahead prediction for some well-known chaotic time series and also compares the obtained prediction performance with that of existing prediction models. Finally, conclusions are given in Section 4.
2. HAVOK-ML Method
Consider a nonlinear system of the form:

$$\frac{d}{dt}\mathbf{x}(t) = \mathbf{f}(\mathbf{x}(t)), \quad (1)$$

where $\mathbf{x}(t)$ is the state of the system at time t and $\mathbf{f}$ denotes the dynamics of the system. For a given state $\mathbf{x}(t_k)$ at time $t_k$, $\mathbf{x}(t_{k+1})$ can be given discretely by:

$$\mathbf{x}(t_{k+1}) = \mathbf{F}(\mathbf{x}(t_k)) = \mathbf{x}(t_k) + \int_{t_k}^{t_{k+1}} \mathbf{f}(\mathbf{x}(\tau))\,d\tau. \quad (2)$$
Generally speaking, for an observed chaotic time series x(t), the governing equation f is highly nonlinear and unknown. HAVOK analysis [22] provides linear representations for those unknown nonlinear systems. For a single measurement x(t), a Hankel matrix H is formed and factored by the singular value decomposition (SVD):

$$H = \begin{bmatrix} x(t_1) & x(t_2) & \cdots & x(t_p) \\ x(t_2) & x(t_3) & \cdots & x(t_{p+1}) \\ \vdots & \vdots & \ddots & \vdots \\ x(t_q) & x(t_{q+1}) & \cdots & x(t_{p+q-1}) \end{bmatrix} = U \Sigma V^{T}, \quad (3)$$

where $U \in \mathbb{R}^{q \times q}$, $\Sigma \in \mathbb{R}^{q \times p}$, and $V \in \mathbb{R}^{p \times p}$ come from the SVD, and p and q are two parameters that determine the dimension of H. The columns of V are defined by:

$$V = [\,\mathbf{v}_1\ \mathbf{v}_2\ \cdots\,], \qquad \mathbf{v}_k = [v_k(t_1), v_k(t_2), \ldots, v_k(t_p)]^{T}, \quad (4)$$

then

$$H \approx U_r \Sigma_r V_r^{T} = \sum_{k=1}^{r} \sigma_k \mathbf{u}_k \mathbf{v}_k^{T}. \quad (5)$$
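As a concrete illustration, a minimal numpy sketch of Equations (3)–(5) is given below; the function name and the numpy-based implementation are ours, not the paper's.

```python
# Build the q x p Hankel matrix of a 1-D series and truncate its SVD at
# rank r, returning the eigen-time-delay coordinates (columns of V_r).
import numpy as np

def havok_coordinates(x, q, r):
    """Eigen-time-delay coordinates of a 1-D numpy array x (Eqs. (3)-(5))."""
    p = len(x) - q + 1                               # number of columns of H
    H = np.stack([x[i:i + p] for i in range(q)])     # q x p Hankel matrix
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Rank-r truncation: H ~= U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
    return U[:, :r], s[:r], Vt[:r, :].T              # columns of V_r: v_1..v_r

# e.g., with the Lorenz settings of Table 1: havok_coordinates(x, q=40, r=11)
```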
Usually, H can be well approximated by the first r columns of U, $\Sigma$, and V. According to the HAVOK analysis [22], the first r − 1 variables in V can be built into a linear model with the last variable $v_r$ as a forcing term:

$$\frac{d}{dt}\mathbf{v}(t) = A\,\mathbf{v}(t) + B\,v_r(t), \quad (6)$$
where $\mathbf{v}(t) = [v_1(t), v_2(t), \ldots, v_{r-1}(t)]^T$ is the vector of the first r − 1 eigen-time-delay coordinates. Note that Equation (6) is not a closed model because $v_r(t)$ is an external input forcing. In the linear HAVOK model, the matrix $A \in \mathbb{R}^{(r-1)\times(r-1)}$ and the vector $B \in \mathbb{R}^{r-1}$ may be obtained by the Sparse Identification of Nonlinear Dynamics (SINDy) algorithm [26] or by a straightforward linear regression procedure. $v_r$ is given by the rth column of V. A machine learning method is used to predict $v_r(t_{k+1})$ from previously observed values $v_r(t_k), v_r(t_{k-1}), \ldots, v_r(t_{k-m+1})$, as shown in Figure 1. Suppose $v_r$ varies evenly within the interval $[t_k, t_{k+1}]$. The evolution of $v_r$ can then be approximated by:

$$v_r(t_{k+1}) = g\big(v_r(t_k), v_r(t_{k-1}), \ldots, v_r(t_{k-m+1})\big), \quad (7)$$

where g denotes the learned regression function and m the number of lagged values.
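A possible realization of g is sketched below; the lag count m and the forest size are illustrative assumptions, since the paper does not list them at this point.

```python
# Regress the next forcing value v_r(t_{k+1}) on its m most recent values,
# as in Equation (7), using a random forest (the RFR option of the paper).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_forcing_model(vr, m=10):
    """Learn g(.) of Equation (7) from the observed forcing series vr."""
    X = np.stack([vr[k - m:k] for k in range(m, len(vr))])   # lag windows
    y = vr[m:]                                               # next-step targets
    return RandomForestRegressor(n_estimators=100).fit(X, y)

# One-step use: model.predict(vr[-m:].reshape(1, -1)) estimates v_r(t_{k+1}).
```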
Then, the first r − 1 variables are obtained by solving the linear model of Equation (6):

$$\mathbf{v}(t_{k+1}) = e^{A(t_{k+1}-t_k)}\,\mathbf{v}(t_k) + \int_{t_k}^{t_{k+1}} e^{A(t_{k+1}-\tau)}\,B\,v_r(\tau)\,d\tau. \quad (8)$$
Assume that the integration starts at time $t_k$ with input $\mathbf{v}(t_k)$. Then, the next step of x can be written as:

$$x(t_{k+1}) = \big(U_r \Sigma_r \tilde{\mathbf{v}}(t_{k+1})\big)_1, \quad (9)$$
where $\tilde{\mathbf{v}}(t_{k+1})$ is a vector containing the first r variables and $(\cdot)_1$ denotes the first component. In order to evaluate the efficiency of HAVOK-ML, the RMSE, the NMSE, and the $R^2$ score defined below are used as performance indices:

$$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^2}, \quad (10)$$

$$\mathrm{NMSE} = \frac{\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{N}\left(y_i - \bar{y}\right)^2}, \quad (11)$$

$$R^2 = 1 - \frac{\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{N}\left(y_i - \bar{y}\right)^2}, \quad (12)$$

where $y_i$, $\hat{y}_i$, and $\bar{y}$ represent the observed data, the predicted data, and the mean of the observed data, respectively. The Root Mean Squared Error (RMSE) and the Normalized Mean Squared Error (NMSE) are used to assess the accuracy of the prediction and to compare the results with those of the literature. The $R^2$ score is used to evaluate the machine-learning-based prediction of $v_r$.
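These three indices translate directly into code; under the definitions above, the $R^2$ score is simply 1 − NMSE. A minimal numpy sketch (function names ours):

```python
# Performance indices of Equations (10)-(12); y and y_hat are 1-D arrays of
# observed and predicted values.
import numpy as np

def rmse(y, y_hat):
    """Equation (10): root mean squared error."""
    return np.sqrt(np.mean((y - y_hat) ** 2))

def nmse(y, y_hat):
    """Equation (11): squared error normalized by the spread of y."""
    return np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)

def r2(y, y_hat):
    """Equation (12): coefficient of determination, i.e., 1 - NMSE."""
    return 1.0 - nmse(y, y_hat)
```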
3. Numerical Experiments
In this section, three different types of time series (Lorenz [1], Mackey–Glass [26], and Sunspot) are used to validate the proposed HAVOK-ML method. The parameters adopted in the HAVOK analysis of these series are listed in Table 1.
3.1. Lorenz Time Series
The Lorenz system [1] is among the most famous chaotic systems; it is described by:

$$\dot{x} = \sigma(y - x), \qquad \dot{y} = x(\rho - z) - y, \qquad \dot{z} = xy - \beta z. \quad (13)$$

The chaotic time series is obtained with parameters $\sigma = 10$, $\rho = 28$, and $\beta = 8/3$, sampled with a time step of 0.01 s (Table 1). In this study, only the time series of the variable x, shown in Figure 2, is considered.

In this research, HAVOK-ML decomposes the chaotic dynamics into an intermittently forced linear system by HAVOK analysis; the settings of the HAVOK analysis are given in Table 1, and the sampling time step for each system is consistent with the references listed in Table 2. However, following the advice in [14], the samples of the Lorenz system are interpolated at a 0.001 s resolution for the HAVOK analysis.
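For orientation, the training series could be generated along the following lines; this is a scipy sketch assuming the canonical Lorenz parameters and the initial condition of Figure 2, not the authors' exact script (the 200 s span is inferred from 20,000 samples at 0.01 s).

```python
# Integrate the Lorenz system (Equation (13)), sample x at 0.01 s, then
# refine to 0.001 s by interpolation as described in the text.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t = np.arange(0.0, 200.0, 0.01)          # 20,000 samples at dt = 0.01 s
sol = solve_ivp(lorenz, (t[0], t[-1]), [-8.0, 8.0, 27.0], t_eval=t, rtol=1e-9)
x = sol.y[0]                             # only the x(t) component is used
t_fine = np.arange(t[0], t[-1], 0.001)   # 0.001 s resolution for HAVOK
x_fine = interp1d(t, x, kind="cubic")(t_fine)
```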
By applying HAVOK analysis to the training data, a linear HAVOK model (Equation (6)) is developed; a least-squares sketch of this identification step follows this paragraph. As shown in Figure 3, the matrix A and the vector B are sparse, and the reconstructions of the eigen-time-delay coordinates agree with the actual values over the full time range. Since $v_r$ (Figure 4) is not smooth, repeated experiments showed that the RFR method [25] yields the best predictions among the regressors tested. Hence, an RFR model is adopted to estimate the next value of $v_r$ from previously observed values. The samples from the 3rd to the 100th second are split into a training set (first 80%) and a test set (remaining 20%). The $R^2$ score of the RFR method on the test set is 0.87. The estimated results are mostly consistent with the actual values, as shown in Figure 4.
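The derivatives of the first r − 1 coordinates can be regressed onto all r coordinates, mirroring the "straightforward linear regression" option of Section 2; central differences and the helper name below are our choices, not the paper's.

```python
# Identify A and B of Equation (6) by least squares: regress d/dt of
# v_1..v_{r-1} (central differences) onto [v_1 .. v_r].
import numpy as np

def fit_linear_havok(V_r, dt):
    """V_r: (p, r) eigen-time-delay coordinates; returns A, B of Eq. (6)."""
    dV = (V_r[2:, :-1] - V_r[:-2, :-1]) / (2 * dt)   # d/dt of first r-1 coords
    X = V_r[1:-1, :]                                 # regressors v_1..v_r
    coef, *_ = np.linalg.lstsq(X, dV, rcond=None)    # (r, r-1) coefficients
    A = coef[:-1, :].T                               # (r-1, r-1) state matrix
    B = coef[-1, :]                                  # (r-1,) forcing vector
    return A, B
```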
In the next experiment, the HAVOK-ML method is used for N-step recursive prediction of the Lorenz time series; the recursion is sketched below. In recursive prediction, the current predicted values are fed back as inputs for subsequent predictions without any correction toward the actual values. A comparison between the multi-step predicted values and the original time series, with 1000 testing samples, is shown in Figure 5. In the first prediction steps (fewer than 10), the predictions agree closely with the actual values. The error grows with the number of prediction steps, especially near the extreme points of the curve. The RMSE of the prediction as a function of time is presented in Figure 6. The error increases quickly with the prediction horizon, which means that long-term prediction is essentially impossible. At step 10, the obtained RMSE is 9.003 × 10⁻³ (Figure 6), which is significantly better than the result reported in the literature (0.014) [27].
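A sketch of this recursive loop is given below; the zero-order-hold discretization of Equation (8) and the assumption that A is invertible are our choices, since the paper does not specify the integrator.

```python
# N-step recursive prediction: propagate the linear model one step at a
# time with a matrix exponential while feeding in the forcing predicted
# from its own past values (no correction to the actual data).
import numpy as np
from scipy.linalg import expm

def recursive_predict(v0, vr_lags, A, B, forcing_model, dt, n_steps):
    """Return n_steps of the first r-1 HAVOK variables."""
    Phi = expm(A * dt)                                        # e^{A dt}
    Gam = np.linalg.solve(A, (Phi - np.eye(A.shape[0])) @ B)  # zero-order hold
    v, lags, out = v0.copy(), list(vr_lags), []
    for _ in range(n_steps):
        vr = float(forcing_model.predict(np.array(lags)[None, :])[0])
        v = Phi @ v + Gam * vr            # Equation (8) under constant forcing
        lags = lags[1:] + [vr]            # recurse on the prediction itself
        out.append(v.copy())
    return np.array(out)
```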
Table 2 presents the one-step-ahead prediction errors (RMSE and NMSE) of the proposed method together with results of existing methods extracted from the literature. The RMSE of the proposed method is the best, while in terms of NMSE the proposed method is second only to the functional weights WNN state-dependent AR (FWWNN-AR) model [15].
3.2. Mackey–Glass Time Series
The Mackey–Glass chaotic time series was introduced as a model of white blood cell production [26]. It is described by:

$$\frac{dx(t)}{dt} = \frac{a\,x(t-\tau)}{1 + x^{n}(t-\tau)} - b\,x(t), \quad (14)$$

where a = 0.2, b = 0.1, n = 10, and τ = 17, in line with the other published papers presented in Table 3. The Mackey–Glass equation is solved using the delay differential equation solver dde23 of MATLAB (a rough Python stand-in is sketched at the end of this subsection). A chaotic time series of 25,000 samples, with a time step of 0.1 s, is generated. The samples from the 300th to the 2000th second, shown in Figure 7, are chosen as the training set, while the rest are used as the test set.

The HAVOK analysis settings for the Mackey–Glass time series are given in Table 1. The Hankel matrix H is built with q = 5 delay rows, and the rank of the SVD truncation is r = 5. More details on the HAVOK analysis and the multi-step-ahead prediction for the Mackey–Glass time series are presented in Figure A1, Figure A2, Figure A3 and Figure A4 in Appendix A. Considering the properties of the curve, an LLN model with the LoLiMoT optimization method [7] was selected through experiments as the regressor to predict $v_r$. A comparison between the prediction accuracies of the proposed method and other models from the literature is summarized in Table 3. As shown in Table 3, in terms of both RMSE and NMSE, the proposed method outperforms the existing models.
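The paper relies on MATLAB's dde23; as a rough, assumption-laden Python stand-in, Equation (14) can be integrated with a fixed-step Euler scheme and an explicit delay buffer (the internal step h and the constant history are our choices).

```python
# Fixed-step Euler integration of the Mackey-Glass delay equation (14),
# storing one sample every dt seconds. Coefficients are the standard
# benchmark values assumed in the reconstruction of Equation (14).
import numpy as np

def mackey_glass(n_samples, dt=0.1, tau=17.0, a=0.2, b=0.1, n=10, x0=0.8):
    h = 0.01                               # internal Euler step
    sub = int(round(dt / h))               # sub-steps per stored sample
    lag = int(round(tau / h))              # delay expressed in Euler steps
    buf = [x0] * (lag + 1)                 # constant history x(t <= 0) = x0
    out = [x0]
    for _ in range(n_samples - 1):
        for _ in range(sub):
            x, x_tau = buf[-1], buf[-lag - 1]
            buf.append(x + h * (a * x_tau / (1.0 + x_tau ** n) - b * x))
        out.append(buf[-1])
        buf = buf[-(lag + 1):]             # keep only the needed history
    return np.array(out)
```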
3.3. Sunspot Time Series
The number of sunspots observed on the solar surface varies with a period of approximately 11 years. The variation of the number of sunspots has a large impact on Earth and on its climate. The monthly smoothed sunspot number time series observed by the Solar Influences Data Analysis Center (SIDC) is used here; the normalized series and the training period, which ranges between November 1834 and March 1918, are shown in Figure 8. The corresponding HAVOK analysis settings are listed in Table 1, the HAVOK decomposition and the multi-step-ahead predictions are shown in Figure A5, Figure A6, Figure A7 and Figure A8 in Appendix A, and the one-step-ahead prediction errors are compared with existing models in Table 4.
4. Discussion and Conclusions
In this paper, a HAVOK-ML method combining HAVOK analysis with machine learning is proposed to predict chaotic time series. Based on the HAVOK analysis, the observed chaotic dynamical system can be reconstructed as a linear model with an external intermittent forcing. A machine learning method is applied to predict the external forcing term from previously observed values. The combination of the HAVOK analysis with machine learning then yields a closed model for prediction. It is worth noting that the machine learning method used in HAVOK-ML varies depending on the properties of the external forcing term. The developed method has been validated on multi-step-ahead prediction of several classic chaotic time series (the Lorenz, Mackey–Glass, and Sunspot time series). The experimental results show that our method can produce accurate forecasts even with simple machine learning algorithms. The prediction performance of the proposed method has been compared with other forecasting models from the literature, and the comparison shows that the proposed method offers superior forecasting ability. Although HAVOK-ML can be combined with different machine learning methods, it does not yet offer guidance on how to choose the machine learning method for a given time series forecasting problem. This is worth studying in the future.
Author Contributions: Conceptualization, J.Y.; Data curation, J.Z. and C.Z.; Formal analysis, J.Z., J.S. and H.L.; Funding acquisition, H.L.; Investigation, J.Y., J.Z. and C.Z.; Methodology, J.Y. and J.Z.; Project administration, J.S. and J.W.; Supervision, J.S. and J.W.; Validation, J.Y. and H.L.; Writing—original draft, J.Y.; Writing—review and editing, J.Y. and J.Z. All authors have read and agreed to the published version of the manuscript.
Funding: This research is partially supported by the National Natural Science Foundation of China (Grant Nos. 41605070, 61802424).
Data Availability Statement: The code and the dataset supporting the results of this article are available in the
Conflicts of Interest: The authors declare no conflict of interest.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Figure 1. The architecture of the HAVOK-ML method to perform one-step prediction. The SVD of the Hankel matrix H yields the eigen time series V. On the one hand, the HAVOK analysis gives a linear system for the first r − 1 variables with $v_r$ as an external input. On the other hand, by using the machine learning method, the evolution of $v_r$ can be established. Hence, a closed linear model for the first r variables is available. The symbols with superscript + stand for values at the next step $t_{k+1}$.
Figure 2. The time series x(t) in the Lorenz system. The initial condition is (−8, 8, 27). The training data are chosen from the 3rd to the 100th second.
Figure 3. HAVOK analysis for the Lorenz chaotic series x(t). From upper-left to bottom-right: the matrix A, the vector B, reconstructions of the eigen-time-delay coordinates using the linear HAVOK model with forcing $v_r$, and the external forcing input $v_r$.
Figure 4. The random forest regressor for $v_r$: previously observed values up to $t_k$ are used to predict the next value at $t_{k+1}$.
Figure 5. Comparison of the original time series samples and the multi-step predicted values, with a one-step length of 0.01 s, for the Lorenz time series.
Figure 6. Lorenz time series RMSE of multi-step-ahead prediction as a function of the number of steps (N), with a one-step length of 0.01 s.
Figure 7. Time series of Mackey–Glass system. The initial condition is 0.8, and the training data are chosen from the 300th to 2000th seconds.
Figure 8. Time series of sunspot numbers normalized to [−1, 1]. The training period ranges between November 1834 and March 1918.
HAVOK analysis parameters for each system.

| System | Samples | dt | Interpolated dt | q | Rank (r) | Regressor for $v_r$ |
|---|---|---|---|---|---|---|
| Lorenz | 20,000 | 0.01 s | 0.001 s | 40 | 11 | RandomForest |
| Mackey–Glass | 50,000 | 0.1 s | / | 5 | 5 | LoLiMoT |
| Sunspot | 2000 | 1 month | 0.02 month | 140 | 7 | LoLiMoT |
Comparison of the models in one-step-ahead prediction of the Lorenz chaotic series.

| Model | RMSE | NMSE | Reference |
|---|---|---|---|
| Deep Belief Network | 1.02 × 10 | / | [ ] |
| Elman–NARX neural networks | 1.08 × 10 | 1.98 × 10 | [ ] |
| WNN | / | 9.84 × 10 | [ ] |
| Fuzzy Inference System | 3.1 × 10 | / | [ ] |
| Local Linear Neural Fuzzy | / | 9.80 × 10 | [ ] |
| Local Linear Radial Basis Function Networks | / | 4.53 × 10 | [ ] |
| WNNs with MCSA | 8.20 × 10 | 1.22 × 10 | [ ] |
| HAVOK-ML (RFR) | 1.43 × 10 | 3.23 × 10 | |
Comparison of the models in six-step-ahead prediction of the Mackey–Glass time series, with 4000 testing samples. The last row shows the proposed HAVOK-ML method with the LLN model as the regressor. The values in bold are the highest prediction accuracies achieved by the models.

| Model | RMSE | NMSE | Reference |
|---|---|---|---|
| ARMA with Maximal Overlap Discrete Wavelet Transform | / | 5.3373 × 10 | [ ] |
| Ensembles of Recurrent Neural Network | 7.533 × 10 | 8.29 × 10 | [ ] |
| Quantum-Inspired Neural Network | 9.70 × 10 | / | [ ] |
| Recurrent Neural Network | 6.25 × 10 | / | [ ] |
| Type-1 Fuzzy System | 4.8 × 10 | / | [ ] |
| Fuzzy Inference System | 7.1 × 10 | / | [ ] |
| WNNs with MCSA | 5.60 × 10 | 6.25 × 10 | [ ] |
| HAVOK-ML (LLN) | 9.92 × 10 | 1.86 × 10 | |
Comparison of the models in one-step-ahead prediction of the sunspot time series, with 1000 testing samples. The last row shows the proposed HAVOK-ML method with the LLN model as the regressor. The values in bold are the highest prediction accuracies achieved by the models.

| Model | RMSE | NMSE | Reference |
|---|---|---|---|
| Elman–NARX Neural Networks | 1.19 × 10 | 5.90 × 10 | [ ] |
| Elman Recurrent Neural Networks | 5.58 × 10 | 1.92 × 10 | [ ] |
| Ensembles of Recurrent Neural Network | 1.52 × 10 | 9.64 × 10 | [ ] |
| Fuzzy Inference System | 1.18 × 10 | 5.32 × 10 | [ ] |
| Functional Weights WNNs State Dependent Autoregressive Model | 1.12 × 10 | 5.24 × 10 | [ ] |
| WNNs with MCSA | 1.13 × 10 | 5.30 × 10 | [ ] |
| HAVOK-ML (LLN) | 4.25 × 10 | 7.40 × 10 | |
Appendix A. Figures of Mackey–Glass Time Series and Sunspot Time Series
Figure A1. Decomposition of the Mackey–Glass chaotic series with HAVOK analysis (similar to Figure 3).
Figure A2. The LLN model with the LoLiMoT optimization method, which is used to predict $v_r$ of the Mackey–Glass chaotic series. The previously observed values up to $t_k$ are used to predict the next value at $t_{k+1}$, with a time step of 0.1 s.
Figure A3. Comparison of the original time series samples of Mackey–Glass and the multi-step predicted values, with one-step length of 0.1 s.
Figure A4. Error growth of the multi-step prediction of the Mackey–Glass chaotic series for 4000 samples, with a one-step length of 0.1 s.
Figure A5. Decomposition of the sunspot series with HAVOK analysis (similar to Figure 3).
Figure A6. The LLN model with the LoLiMoT optimization method, which is used to predict $v_r$ of the sunspot series. The previously observed values up to $t_k$ are used to predict.
Figure A7. Multi-step ahead prediction of sunspot series with one-step length of 1 month.
Figure A8. Error growth of the multi-step prediction of the sunspot series, with a one-step length of 1 month.
References
1. Lorenz, E.N. Deterministic nonperiodic flow. J. Atmos. Sci.; 1963; 20, pp. 130-141. [DOI: https://dx.doi.org/10.1175/1520-0469(1963)020<0130:DNF>2.0.CO;2]
2. Bjørnstad, O.N.; Grenfell, B.T. Noisy Clockwork: Time Series Analysis of Population Fluctuations in Animals. Science; 2001; 293, pp. 638-643. [DOI: https://dx.doi.org/10.1126/science.1062226]
3. Sugihara, G.; May, R.; Ye, H.; Hsieh, C.H.; Deyle, E.; Fogarty, M.; Munch, S. Detecting Causality in Complex Ecosystems. Science; 2012; 338, pp. 496-500. [DOI: https://dx.doi.org/10.1126/science.1227079]
4. Ye, H.; Beamish, R.J.; Glaser, S.M.; Grant, S.; Hsieh, C.H.; Richards, L.J.; Schnute, J.T.; Sugihara, G. Equation-free mechanistic ecosystem forecasting using empirical dynamic modeling. Proc. Natl. Acad. Sci. USA; 2015; 112, E1569. [DOI: https://dx.doi.org/10.1073/pnas.1417063112]
5. Sugihara, G.; May, R.M. Nonlinear forecasting as a way of distinguishing chaos from measurement error in time series. Nature; 1990; 344, pp. 734-741. [DOI: https://dx.doi.org/10.1038/344734a0]
6. Chen, S.; Cowan, C.; Grant, P. Orthogonal least squares learning algorithm for radial basis function networks. IEEE Trans. Neural Netw.; 1991; 2, pp. 302-309. [DOI: https://dx.doi.org/10.1109/72.80341]
7. Gholipour, A.; Araabi, B.N.; Lucas, C. Predicting chaotic time series using neural and neurofuzzy models: A comparative study. Neural Process. Lett.; 2006; 24, pp. 217-239. [DOI: https://dx.doi.org/10.1007/s11063-006-9021-x]
8. Chen, Y.; Yang, B.; Dong, J.; Abraham, A. Time-series forecasting using flexible neural tree model. Inf. Sci.; 2005; 174, pp. 219-235. [DOI: https://dx.doi.org/10.1016/j.ins.2004.10.005]
9. Chandra, R.; Zhang, M. Cooperative coevolution of Elman recurrent neural networks for chaotic time series prediction. Neurocomputing; 2012; 86, pp. 116-123. [DOI: https://dx.doi.org/10.1016/j.neucom.2012.01.014]
10. Ma, Q.L.; Zheng, Q.L.; Peng, H.; Zhong, T.W.; Xu, L.Q. Chaotic Time Series Prediction Based on Evolving Recurrent Neural Networks. Proceedings of the 2007 International Conference on Machine Learning and Cybernetics; Hong Kong, China, 19–22 August 2007; Volume 6, pp. 3496-3500. [DOI: https://dx.doi.org/10.1109/ICMLC.2007.4370752]
11. Koskela, T.; Lehtokangas, M.; Saarinen, J.; Kaski, K. Time Series Prediction with Multilayer Perceptron, FIR and Elman Neural Networks. Proceedings of the World Congress on Neural Networks; INNS Press: San Diego, CA, USA, 1996; pp. 491-496.
12. Kuremoto, T.; Kimura, S.; Kobayashi, K.; Obayashi, M. Time series forecasting using a deep belief network with restricted Boltzmann machines. Neurocomputing; 2014; 137, pp. 47-56. [DOI: https://dx.doi.org/10.1016/j.neucom.2013.03.047]
13. Ardalani-Farsa, M.; Zolfaghari, S. Chaotic time series prediction with residual analysis method using hybrid Elman–NARX neural networks. Neurocomputing; 2010; 73, pp. 2540-2553. [DOI: https://dx.doi.org/10.1016/j.neucom.2010.06.004]
14. Brunton, S.L.; Brunton, B.W.; Proctor, J.L.; Kaiser, E.; Kutz, J.N. Chaos as an Intermittently Forced Linear System. Nat. Commun.; 2017; 8, 19. [DOI: https://dx.doi.org/10.1038/s41467-017-00030-8]
15. Inoussa, G.; Peng, H.; Wu, J. Nonlinear time series modeling and prediction using functional weights wavelet neural network-based state-dependent AR model. Neurocomputing; 2012; 86, pp. 59-74. [DOI: https://dx.doi.org/10.1016/j.neucom.2012.01.010]
16. Zhu, L.; Wang, Y.; Fan, Q. MODWT-ARMA model for time series prediction. Appl. Math. Model.; 2014; 38, pp. 1859-1865. [DOI: https://dx.doi.org/10.1016/j.apm.2013.10.002]
17. Ong, P.; Zainuddin, Z. Optimizing wavelet neural networks using modified cuckoo search for multi-step ahead chaotic time series prediction. Appl. Soft Comput. J.; 2019; 80, pp. 374-386. [DOI: https://dx.doi.org/10.1016/j.asoc.2019.04.016]
18. Wang, X.; Ma, L.; Wang, B.; Wang, T. A hybrid optimization-based recurrent neural network for real-time data prediction. Neurocomputing; 2013; 120, pp. 547-559. [DOI: https://dx.doi.org/10.1016/j.neucom.2013.04.016]
19. Bhardwaj, S.; Srivastava, S.; Gupta, J.R.P. Pattern-Similarity-Based Model for Time Series Prediction. Comput. Intell.; 2015; 31, pp. 106-131. [DOI: https://dx.doi.org/10.1111/coin.12015]
20. Smith, C.; Jin, Y. Evolutionary multi-objective generation of recurrent neural network ensembles for time series prediction. Neurocomputing; 2014; 143, pp. 302-311. [DOI: https://dx.doi.org/10.1016/j.neucom.2014.05.062]
21. Ho, D.T.; Garibaldi, J.M. Context-Dependent Fuzzy Systems With Application to Time-Series Prediction. IEEE Trans. Fuzzy Syst. Publ. IEEE Neural Netw. Counc.; 2014; 22, pp. 778-790. [DOI: https://dx.doi.org/10.1109/TFUZZ.2013.2272645]
22. Takens, F. Detecting strange attractors in turbulence. Dynamical Systems and Turbulence, Warwick 1980; Rand, D.; Young, L.S. Springer: Berlin/Heidelberg, Germany, 1981; pp. 366-381.
23. Tu, J.H.; Rowley, C.W.; Luchtenburg, D.M.; Brunton, S.L.; Kutz, J.N. On dynamic mode decomposition: Theory and applications. J. Comput. Dyn.; 2014; 1, pp. 391-421. [DOI: https://dx.doi.org/10.3934/jcd.2014.1.391]
24. Brunton, S.L.; Proctor, J.L.; Kutz, J.N. Discovering governing equations from data: Sparse identification of nonlinear dynamical systems. Proc. Natl. Acad. Sci. USA; 2015; 113, 3932. [DOI: https://dx.doi.org/10.1073/pnas.1517384113]
25. Ao, Y.; Li, H.; Zhu, L.; Ali, S.; Yang, Z. The linear random forest algorithm and its advantages in machine learning assisted logging regression modeling. J. Pet. Sci. Eng.; 2019; 174, pp. 776-789. [DOI: https://dx.doi.org/10.1016/j.petrol.2018.11.067]
26. Mackey, M.C.; Glass, L. Oscillation and Chaos in Physiological Control Systems. Science; 1977; 197, pp. 287-289. [DOI: https://dx.doi.org/10.1126/science.267326]
27. Gan, M.; Peng, H.; Peng, X.; Chen, X.; Inoussa, G. A locally linear RBF network-based state-dependent AR model for nonlinear time series modeling. Inf. Sci.; 2010; 180, pp. 4370-4383. [DOI: https://dx.doi.org/10.1016/j.ins.2010.07.012]
28. Ganjefar, S.; Tofighi, M. Optimization of quantum-inspired neural network using memetic algorithm for function approximation and chaotic time series prediction. Neurocomputing; 2018; 291, pp. 175-186. [DOI: https://dx.doi.org/10.1016/j.neucom.2018.02.074]
29. Woolley, J.W.; Agarwal, P.K.; Baker, J. Modeling and prediction of chaotic systems with artificial neural networks. Int. J. Numer. Methods Fluids; 2010; 63, pp. 989-1004. [DOI: https://dx.doi.org/10.1002/fld.2117]
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
The prediction of chaotic time series systems has remained a challenging problem in recent decades. A hybrid method using Hankel Alternative View Of Koopman (HAVOK) analysis and machine learning (HAVOK-ML) is developed to predict chaotic time series. HAVOK-ML simulates the time series by reconstructing a closed linear model so as to achieve the purpose of prediction. It decomposes chaotic dynamics into intermittently forced linear systems by HAVOK analysis and estimates the external intermittently forcing term using machine learning. The prediction performance evaluations confirm that the proposed method has superior forecasting skills compared with existing prediction methods.