Mingzhu Song 1 and Quanxin Zhu 2,3 and Hongwei Zhou 4
Academic Editor: Xiaohua Ding
1, Department of Mathematics and Computer Science, Tongling University, Tongling 244000, China
2, School of Mathematical Sciences and Institute of Finance and Statistics, Nanjing Normal University, Nanjing 210023, China
3, Department of Mathematics, University of Bielefeld, 33615 Bielefeld, Germany
4, School of Mathematics and Information Technology, Nanjing Xiaozhuang University, Nanjing, Jiangsu 211171, China
Received 24 March 2016; Accepted 26 April 2016
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1. Introduction
Over the past decades, a great deal of attention has been paid to the dynamical behaviors of neural networks, such as stability, periodic and almost periodic oscillatory behavior, chaos, and bifurcation. In particular, the stability of neural networks is one of the most important topics, since many applications depend heavily on the stability of the equilibrium point. Consequently, a large number of works have appeared on the stability of the equilibrium point of various neural networks, such as Hopfield neural networks, cellular neural networks, recurrent neural networks, Cohen-Grossberg neural networks, and bidirectional associative memory (BAM) neural networks [1-9].
As is well known, time delay is one of the most significant phenomena occurring in many fields, such as biology, chemistry, economics, and communication networks. Moreover, it is inevitably encountered in both neural processing and signal transmission due to the limited bandwidth of neurons and amplifiers. The existence of time delays may cause oscillation, divergence, chaos, instability, or other poor performance in neural networks, which is usually harmful to their applications. Therefore, the stability analysis of neural networks with time delays has attracted considerable attention in the literature. The existing works on the stability of neural networks with time delays can be roughly classified into four categories: constant delays, time-varying delays, distributed delays, and mixed time delays.
It should be mentioned that a new class of delays, called leakage delays (also named time delays in the "forgetting" or leakage terms), was initially introduced by Gopalsamy [10] in the study of neural networks. In [10], Gopalsamy pointed out that leakage delays tend to destabilize neural networks and are very difficult to handle. Hence, investigating the stability of neural networks with leakage delays has become an interesting and challenging topic, and many interesting results have been reported in the literature [11-17]. For example, Liu [11] investigated the existence of a unique equilibrium and global exponential stability for a class of BAM neural networks with time-varying delays in the leakage terms by using the fixed point theorem and Lyapunov functional theory. By using a Lyapunov-Krasovskii functional with triple integral terms and a model transformation technique, Zhu et al. [12] obtained some novel delay-dependent sufficient conditions ensuring global exponential stability in the mean square of impulsive BAM neural networks with both Markovian jump parameters and leakage delays. In [13], Wang et al. discussed the stability of recurrent neural networks with time delays in the leakage terms under impulsive perturbations. By applying a new stability lemma, Itô's formula, a Lyapunov-Krasovskii functional, stochastic analysis theory, and matrix inequality techniques, Xie et al. [15] studied exponential stability in the mean square for a class of stochastic neural networks with leakage delays and expectations in the coefficients.
On the other hand, noise disturbance is a major source of instability and poor performance in neural networks. Many real nervous systems are affected by external perturbations, which in many cases are highly uncertain and hence may be treated as random. As Haykin pointed out, synaptic transmission can be regarded as a noisy process introduced by random fluctuations from the release of neurotransmitters and other probabilistic causes. Therefore, the effect of noise disturbances should be considered when studying the stability of neural networks. Generally speaking, neural networks with noise disturbances are called stochastic neural networks. Recently, a large number of results on the stability of stochastic neural networks have appeared (see, e.g., [3-5, 7, 9, 13, 14]). Unfortunately, the criteria presented in [3-5, 7, 9, 13, 14] require the strict condition that the derivative of the considered Lyapunov-Krasovskii functional be negative; that is, LV(t, x(t)) < 0 for any x(t) ≠ 0, where L is a weak infinitesimal operator and V(t, x(t)) is a positive Lyapunov-Krasovskii functional. However, LV(t, x(t)) may not be negative in many real cases, so the criteria obtained in [3-5, 7, 9, 13, 14] fail in such cases.
Motivated by the above discussion, in this paper we study the stability problem for a class of stochastic neural networks with time delays in the leakage terms. Different from the previous literature, we aim to remove the restriction of L V ( t , x ( t ) ) < 0 for any x ( t ) ≠ 0 . By using the LaSalle invariant principle of stochastic delay differential equations, Itô's formula, and stochastic analysis theory, some novel sufficient conditions are derived to guarantee the almost sure stability of the equilibrium point. Moreover, two numerical examples and their simulations are provided to show the effectiveness of the theoretical results and demonstrate that time delays in the leakage terms do contribute to the stability of stochastic neural networks.
The remainder of this paper is organized as follows. In Section 2, we introduce the model of a class of stochastic neural networks with time delays in the leakage terms and present the definition of almost sure stability as well as some necessary assumptions. By means of the LaSalle invariant principle of stochastic delay differential equations, Itô's formula, and stochastic analysis theory, our main results are established in Section 3. In Section 4, two numerical examples are given to show the effectiveness of the obtained results. Finally, in Section 5, the paper is concluded with some general remarks.
Notation 1.
The notation used in this paper is quite standard. R^n and R^{n×n} denote the n-dimensional Euclidean space and the set of all n × n real matrices, respectively. The superscript "T" denotes the transpose of a matrix or vector, and the symbol "*" denotes the symmetric term of a matrix. trace(·) denotes the trace of the corresponding matrix, and I denotes the identity matrix of compatible dimensions. For any matrix A, λ_max(A) (resp., λ_min(A)) denotes the largest (resp., smallest) eigenvalue of A. For square matrices M_1 and M_2, the notation M_1 > (≥, <, ≤) M_2 means that M_1 - M_2 is a positive definite (positive semidefinite, negative definite, negative semidefinite) matrix. Let w(t) = (w_1(t), ..., w_n(t))^T be an n-dimensional Brownian motion defined on a complete probability space (Ω, F, P) with a natural filtration (F_t)_{t≥0}. Also, let τ > 0 and let C([-τ, 0]; R^n) denote the family of continuous functions φ from [-τ, 0] to R^n with the uniform norm ‖φ‖ = sup_{-τ≤θ≤0} |φ(θ)|. Denote by L²_{F_t}([-τ, 0]; R^n) the family of all F_t-measurable, C([-τ, 0]; R^n)-valued random variables ξ = {ξ(θ) : -τ ≤ θ ≤ 0} such that ∫_{-τ}^{0} E|ξ(s)|² ds < ∞, where E(·) stands for the expectation operator with respect to the given probability measure P.
2. Model Description and Problem Formulation
In this paper, we consider a class of neural networks with mixed time delays, described by the following integrodifferential equations: [figure omitted; refer to PDF] where x(t) = [x_1(t), x_2(t), ..., x_n(t)]^T is the state vector associated with the n neurons and the diagonal matrix D = diag(d_1, d_2, ..., d_n) has positive entries d_i > 0 (i = 1, 2, ..., n). The matrices A = (a_{ij})_{n×n}, B = (b_{ij})_{n×n}, C = (c_{ij})_{n×n}, and D = (d_{ij})_{n×n} are the connection weight matrix, the constant delay connection weight matrix, the time-varying delay connection weight matrix, and the distributed delay connection weight matrix, respectively. f(x(t)) = (f_1(x_1(t)), f_2(x_2(t)), ..., f_n(x_n(t)))^T, g(x(t)) = (g_1(x_1(t)), g_2(x_2(t)), ..., g_n(x_n(t)))^T, and h(x(t)) = (h_1(x_1(t)), h_2(x_2(t)), ..., h_n(x_n(t)))^T are the neuron activation functions. The noise perturbation σ : R^n × R^n × R^n × R^n → R^{n×n} is a Borel measurable function, β > 0 denotes the leakage delay, and τ_1 and τ_2 are constant delays.
Throughout this paper, the following assumptions are assumed to hold.
Assumption H1. There exist diagonal matrices U_i^- = diag(u_{i1}^-, u_{i2}^-, ..., u_{in}^-) and U_i^+ = diag(u_{i1}^+, u_{i2}^+, ..., u_{in}^+), i = 1, 2, 3, satisfying [figure omitted; refer to PDF] for all α, β ∈ R, α ≠ β, and j = 1, 2, ..., n.
Assumption H2. There exist positive definite matrices R_1, R_2, R_3, and R_4 such that [figure omitted; refer to PDF] for all x_1, x_2, x_3, x_4 ∈ R^n.
Assumption H3. Consider [figure omitted; refer to PDF]
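Assumption H1 is a sector condition: it bounds every difference quotient of each activation function between the corresponding entries of U_i^- and U_i^+. The paper's activation functions are omitted from this extraction, so, purely as a hypothetical illustration, the following sketch checks numerically that the commonly used tanh activation satisfies the condition with the sector [0, 1]:

```python
import numpy as np

# Sector condition of Assumption H1 for a single activation f_j:
#   u_j^- <= (f_j(a) - f_j(b)) / (a - b) <= u_j^+   for all a != b.
# tanh is a hypothetical example (not necessarily the paper's choice).

def sector_quotients(f, samples):
    """All difference quotients (f(a) - f(b)) / (a - b) over sample pairs."""
    a = samples[:, None]
    b = samples[None, :]
    mask = a != b                      # exclude a == b (quotient undefined)
    return (f(a) - f(b))[mask] / (a - b)[mask]

q = sector_quotients(np.tanh, np.linspace(-5.0, 5.0, 201))
print(q.min() >= 0.0 and q.max() <= 1.0)  # tanh lies in the sector [0, 1]
```

The same numerical probe can be used to estimate tight sector bounds U^-, U^+ for any given activation before assembling the LMIs.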
Let x(t; ξ) denote the state trajectory from the initial data x(θ) = ξ(θ) on -τ ≤ θ ≤ 0 in L²_{F_0}([-τ, 0]; R^n). Clearly, under Assumptions H1-H3, system (1) admits the trivial solution x(t; 0) ≡ 0 corresponding to the initial data ξ = 0. For simplicity, we write x(t; ξ) = x(t).
Now we give the concept of almost sure stability for system (1).
Definition 1.
The equilibrium point of (1) is said to be almost surely stable if, for every ξ ∈ L²_{F_0}([-τ, 0]; R^n), [figure omitted; refer to PDF] where "a.s." denotes "almost surely."
The following lemma is needed to prove our main results.
Lemma 2 (see [18]).
For any positive definite matrix G > 0, a scalar τ > 0, and a function Φ : [0, τ] → R^n such that the integrations concerned are well defined, the following inequality holds: [figure omitted; refer to PDF]
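Lemma 2 is Jensen's integral inequality, which in this setting reads τ ∫₀^τ Φ^T(s)GΦ(s) ds ≥ (∫₀^τ Φ(s) ds)^T G (∫₀^τ Φ(s) ds). As a sanity check, the sketch below verifies it numerically via Riemann sums, for an arbitrarily generated positive definite G and a hypothetical Φ (both chosen here for illustration only):

```python
import numpy as np

# Numerical check of Jensen's integral inequality (Lemma 2, from Gu 2000):
#   tau * int_0^tau Phi^T G Phi ds  >=  (int_0^tau Phi ds)^T G (int_0^tau Phi ds).
# G and Phi below are illustrative choices; ds is the Riemann-sum step.

rng = np.random.default_rng(0)
tau, n, m = 2.0, 3, 2000
s = np.linspace(0.0, tau, m)
ds = s[1] - s[0]

M = rng.standard_normal((n, n))
G = M @ M.T + n * np.eye(n)                  # positive definite weight
Phi = np.stack([np.sin(s), np.cos(2 * s), s])  # sample Phi: [0, tau] -> R^3

lhs = tau * sum(Phi[:, k] @ G @ Phi[:, k] for k in range(m)) * ds
v = Phi.sum(axis=1) * ds                     # Riemann sum for int Phi(s) ds
rhs = v @ G @ v
print(lhs >= rhs)
```

For a non-constant Φ the inequality is strict, which is exactly the slack the proof of Theorem 3 exploits when bounding the integral terms of the Lyapunov-Krasovskii functional.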
3. Main Results and Proofs
In this section, the almost sure stability of the equilibrium point for system (1) is investigated under Assumptions H1-H3.
Theorem 3.
Under Assumptions H1-H3, the equilibrium point of (1) is almost surely stable if there exist a positive scalar λ, positive diagonal matrices Q_1, Q_2, and Q_3, and positive definite matrices P, E, F, G, and H such that the following linear matrix inequalities (LMIs) hold: [figure omitted; refer to PDF] where [figure omitted; refer to PDF]
Proof.
Fix ξ ∈ L²_{F_0}([-τ, 0]; R^n) arbitrarily and write x(t; ξ) = x(t). We first define the infinitesimal generator L of the Markov process acting on V(t, x(t)) as follows: [figure omitted; refer to PDF] Let C^{1,2}(R_+ × R^n; R_+) denote the family of all nonnegative functions V(t, x) on R_+ × R^n which are continuously twice differentiable in x and differentiable in t. If V ∈ C^{1,2}(R_+ × R^n; R_+), then along the trajectory of system (1) we define an operator LV from R_+ × R^n to R by [figure omitted; refer to PDF] where [figure omitted; refer to PDF] Now, let us consider the following Lyapunov-Krasovskii functional: [figure omitted; refer to PDF] Then, it follows from (1) and (11) that [figure omitted; refer to PDF] where σ(t) := σ(x(t), x(t - β), x(t - τ_1), x(t - τ_2)). On the other hand, by Assumption H2 and condition (7), we obtain [figure omitted; refer to PDF] which together with (14) gives [figure omitted; refer to PDF] By employing Lemma 2, we have [figure omitted; refer to PDF] On the other hand, it follows from Assumptions H1 and H3 that [figure omitted; refer to PDF] Hence, by (14), (16), (17), (18), (19), and (20), we get [figure omitted; refer to PDF] where [figure omitted; refer to PDF] By conditions (7) and (8), we see that Π < 0. Let γ = λ_min(-Π); then γ > 0. This fact together with (20) yields [figure omitted; refer to PDF] Let w_1(x(t)) = x^T(t)(γI + R_4)x(t) and w_2(x(t)) = x^T(t)(λR_4 - G)x(t). It is obvious that w_1(x(t)) > w_2(x(t)) for any x(t) ≠ 0. Therefore, by Definition 1 and the LaSalle invariant principle of stochastic differential delay equations (e.g., see Corollary 1 in [19]), we conclude that the zero solution of system (1) is almost surely stable. This completes the proof of Theorem 3.
Remark 4.
If we ignore the effect of time delays in the leakage terms, then system (1) is reduced to the following: [figure omitted; refer to PDF]
Correspondingly, we revise Assumptions H2 and H3 as follows.
Assumption H′2. There exist positive definite matrices R_1, R_2, and R_3 such that [figure omitted; refer to PDF] for all x_1, x_2, x_3 ∈ R^n.
Assumption H′3. Consider f(0) = g(0) = h(0) = 0 and σ(0, 0, 0) ≡ 0.
Under Assumptions H1, H′2, and H′3, we have the following result.
Theorem 5.
Under Assumptions H1, H′2, and H′3, the equilibrium point of (24) is almost surely stable if there exist a positive scalar λ, positive diagonal matrices Q_1, Q_2, and Q_3, and positive definite matrices P, E, F, G, and H such that the following linear matrix inequalities (LMIs) hold: [figure omitted; refer to PDF] where [figure omitted; refer to PDF]
Proof.
Consider the following Lyapunov-Krasovskii functional: [figure omitted; refer to PDF] Similar to the proof of Theorem 3, we can obtain the desired result by a direct computation. The proof of Theorem 5 is completed.
Remark 6.
Theorems 3 and 5 present some novel sufficient conditions for the almost sure stability of the equilibrium point of a class of stochastic neural networks with or without time delays in the leakage terms, obtained by constructing suitable Lyapunov-Krasovskii functionals. These conditions are easy to verify and can be applied in practice, since they can be checked by recently developed LMI-solving algorithms. It is worth pointing out that Theorem 3 depends on all the delay constants β, τ_1, and τ_2, whereas Theorem 5 depends only on the delay constants τ_1 and τ_2. Therefore, Theorem 3 is less conservative than Theorem 5.
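In practice, the workflow suggested above is: feed the LMIs to a semidefinite solver (the paper uses the Matlab LMI toolbox), and, once a candidate solution is returned, re-verify feasibility independently by an eigenvalue test. The sketch below illustrates this verification step for the simple Lyapunov LMI A^T P + PA < 0 with hypothetical A and P; the paper's actual LMIs (7), (8), and (26) have the same "check definiteness of a symmetric matrix" structure, just with larger blocks:

```python
import numpy as np

# Re-verify an LMI solution by an eigenvalue check. A and P below are
# hypothetical stand-ins: A is a stable test matrix, P a candidate
# returned by a solver. A symmetric matrix M > 0 iff its smallest
# eigenvalue is positive.

def is_positive_definite(M, tol=1e-9):
    """Eigenvalue test on the symmetric part of M."""
    S = (M + M.T) / 2.0
    return np.linalg.eigvalsh(S).min() > tol

A = np.array([[-2.0, 1.0], [0.0, -3.0]])   # stable test matrix (hypothetical)
P = np.array([[1.0, 0.2], [0.2, 0.5]])     # candidate solution (hypothetical)

# Lyapunov LMI: P > 0 and A^T P + P A < 0.
ok = is_positive_definite(P) and is_positive_definite(-(A.T @ P + P @ A))
print(ok)
```

This kind of independent check guards against solver tolerances reporting a marginally infeasible point as feasible.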
Remark 7.
It is worth pointing out that LV(t, x(t)) in Theorems 3 and 5 may not be negative, whereas the stability criteria obtained in the earlier literature [3-5, 7, 9, 13, 14] require LV(t, x(t)) < 0 for any x(t) ≠ 0. Hence, the LMI criteria in the previous literature (e.g., [3-5, 7, 9, 13, 14]) are not applicable in our setting.
4. Illustrative Examples
In this section, two numerical examples are given to illustrate the effectiveness of the obtained results.
Example 1.
Consider a two-dimensional stochastic neural network with time delays in the leakage terms: [figure omitted; refer to PDF] where x(t) = (x_1(t), x_2(t))^T and w(t) is a two-dimensional Brownian motion. Let [figure omitted; refer to PDF] and take τ_1 = 0.8 and τ_2 = 0.9. Then system (31) satisfies Assumption H1 with U_1^- = U_2^- = U_3^- = U_4^- = -0.3I and U_1^+ = U_2^+ = U_3^+ = U_4^+ = 0.4I. Take [figure omitted; refer to PDF] and then system (31) satisfies Assumptions H2 and H3 with R_1 = 0.27I, R_2 = 0.19I, R_3 = 0.18I, and R_4 = 0.09I.
Other parameters of network (31) are given as follows: [figure omitted; refer to PDF] By using the Matlab LMI toolbox, we can obtain the following feasible solution for LMIs (7) and (8): [figure omitted; refer to PDF] Therefore, it follows from Theorem 3 that network (31) is almost surely stable.
The simulation is carried out with the Euler-Maruyama scheme, with T = 160 and step size δt = 0.02. Figure 1 shows the state response of network (31) with the initial condition [-0.6, 0.8]^T for -1.2 ≤ t ≤ 0.
Figure 1: The state response of network (31) with β = 0.6 .
[figure omitted; refer to PDF]
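A minimal sketch of the Euler-Maruyama scheme behind Figure 1 follows, for a hypothetical two-dimensional system with a leakage delay of the form dx(t) = [-Dx(t - β) + A tanh(x(t))]dt + σ(x(t))dw(t). The paper's actual coefficient matrices are omitted from this extraction, so D, A, and the noise intensity below are illustrative placeholders, not the values of network (31); only T, δt, β, and the initial condition mirror the stated setup.

```python
import numpy as np

# Euler-Maruyama sketch for a 2-D stochastic system with leakage delay
# beta, constant history on the initial interval, T = 160, dt = 0.02,
# x(0) = [-0.6, 0.8]^T. D, A, sigma are hypothetical placeholders.

rng = np.random.default_rng(1)
dt, T, beta = 0.02, 160.0, 0.6
lag = int(round(beta / dt))              # leakage delay measured in steps
N = int(round(T / dt))

D = np.diag([1.2, 1.0])                  # leakage (self-feedback) rates
A = np.array([[0.2, -0.1], [0.1, 0.15]]) # connection weights (hypothetical)
sigma = 0.05                             # noise intensity (hypothetical)

x = np.empty((N + 1, 2))
x[: lag + 1] = np.array([-0.6, 0.8])     # constant history segment
for k in range(lag, N):
    drift = -D @ x[k - lag] + A @ np.tanh(x[k])
    dW = rng.standard_normal(2) * np.sqrt(dt)          # Brownian increment
    x[k + 1] = x[k] + drift * dt + sigma * np.tanh(x[k]) * dW
print(np.abs(x[-1]).max() < 0.1)         # trajectory settles near zero
```

Because the noise here is multiplicative (it vanishes at x = 0), the origin is an equilibrium of the discretized system as well, and the sample path decays toward it, in line with the almost sure stability asserted by Theorem 3.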
Example 2.
Consider a two-dimensional stochastic neural network without time delays in the leakage terms: [figure omitted; refer to PDF] where [figure omitted; refer to PDF]
All other parameters of network (36) are the same as in Example 1. It is easy to check that system (36) satisfies Assumptions H1, H[variant prime]2, and H[variant prime]3.
By using the Matlab LMI toolbox, we can obtain the following feasible solution for LMIs (26): [figure omitted; refer to PDF] Therefore, it follows from Theorem 5 that network (36) is almost surely stable.
The simulation again uses the Euler-Maruyama scheme with T = 160 and step size δt = 0.02. Figure 2 shows the state response of network (36) with the initial condition [-0.6, 0.8]^T for -1.2 ≤ t ≤ 0.
Figure 2: The state response of network (36) with β = 0 .
[figure omitted; refer to PDF]
Remark 8.
Examples 1 and 2 show that the two-dimensional stochastic neural networks with and without time delays in the leakage terms are both almost surely stable. However, Figures 1 and 2 show that the stochastic neural network with time delays in the leakage terms converges noticeably faster than the one without. This reveals that time delays in the leakage terms do contribute to the stability of stochastic neural networks.
5. Concluding Remarks
In this paper, we have investigated the almost sure stability problem for a class of stochastic neural networks with time delays in the leakage terms. Some novel delay-dependent conditions are obtained to ensure that the considered system is almost surely stable, which is quite different from moment stability. Our method is mainly based on the LaSalle invariant principle of stochastic delay differential equations, Itô's formula, and stochastic analysis theory. Moreover, the stability criteria given in this paper are expressed in terms of LMIs, which can be solved easily by recently developed algorithms. In addition, we use two examples to show that time delays in the leakage terms do contribute to the stability of stochastic neural networks. Finally, we point out that it is possible to generalize our results to more complex stochastic neural networks with time delays in the leakage terms (e.g., by considering fractional-order effects [20]). Research on this topic is in progress.
Acknowledgments
This work was jointly supported by the Alexander von Humboldt Foundation of Germany (Fellowship CHN/1163390), the National Natural Science Foundation of China (61374080), Qing Lan Project of Jiangsu, the Priority Academic Program Development of Jiangsu Higher Education Institutions, the Key University Science Research Project of Anhui Province (KJ2016A705), the Key Projects of Anhui Province University Outstanding Youth Talent Support Program (gxyqZD2016317), the Natural Science Foundation of Jiangsu Province (BK20140089), and the Play of Nature Science Fundamental Research in Nanjing Xiao Zhuang University (2012NXY12).
[1] A. Arbi, C. Aouiti, F. Cherif, A. Touati, A. M. Alimi, "Stability analysis for delayed high-order type of Hopfield neural networks with impulses," Neurocomputing , vol. 165, pp. 312-329, 2015.
[2] L. N. Liu, Q. X. Zhu, "Almost sure exponential stability of numerical solutions to stochastic delay Hopfield neural networks," Applied Mathematics and Computation , vol. 266, pp. 698-712, 2015.
[3] W. X. Xie, Q. X. Zhu, "Mean square exponential stability of stochastic fuzzy delayed Cohen-Grossberg neural networks with expectations in the coefficients," Neurocomputing , vol. 166, pp. 133-139, 2015.
[4] Q. X. Zhu, J. D. Cao, "Stability analysis of Markovian jump stochastic BAM neural networks with impulse control and mixed time delays," IEEE Transactions on Neural Networks and Learning Systems , vol. 23, no. 3, pp. 467-479, 2012.
[5] Y. G. Kao, L. Shi, J. Xie, H. R. Karimi, "Global exponential stability of delayed Markovian jump fuzzy cellular neural networks with generally incomplete transition probability," Neural Networks , vol. 63, pp. 18-30, 2015.
[6] T. Lv, Q. Gan, Q. Zhu, "Stability and bifurcation analysis for a class of generalized reaction-diffusion neural networks with time delay," Discrete Dynamics in Nature and Society , vol. 2016, 2016.
[7] Q. X. Zhu, J. D. Cao, "Stochastic stability of neural networks with both Markovian jump parameters and continuously distributed delays," Discrete Dynamics in Nature and Society , vol. 2009, 2009.
[8] X. H. Mei, L. W. Zhang, H. J. Jiang, Z. Y. Yu, "Dynamics of uncertain discrete-time neural network with delay and impulses," Discrete Dynamics in Nature and Society , vol. 2015, 2015.
[9] C. Wu, J. Hu, Y. Li, "Robustness analysis of hybrid stochastic neural networks with neutral terms and time-varying delays," Discrete Dynamics in Nature and Society , vol. 2015, 2015.
[10] K. Gopalsamy, "Leakage delays in BAM," Journal of Mathematical Analysis and Applications , vol. 325, no. 2, pp. 1117-1132, 2007.
[11] B. W. Liu, "Global exponential stability for BAM neural networks with time-varying delays in the leakage terms," Nonlinear Analysis: Real World Applications , vol. 14, no. 1, pp. 559-566, 2013.
[12] Q. X. Zhu, R. Rakkiyappan, A. Chandrasekar, "Stochastic stability of Markovian jump BAM neural networks with leakage delays and impulse control," Neurocomputing , vol. 136, pp. 136-151, 2014.
[13] Y. Wang, C.-D. Zheng, E. M. Feng, "Stability analysis of mixed recurrent neural networks with time delay in the leakage term under impulsive perturbations," Neurocomputing , vol. 119, pp. 454-461, 2013.
[14] M. J. Park, O. M. Kwon, J. H. Park, S. M. Lee, E. J. Cha, "Synchronization criteria for coupled stochastic neural networks with time-varying delays and leakage delay," Journal of the Franklin Institute , vol. 349, no. 5, pp. 1699-1720, 2012.
[15] W. X. Xie, Q. X. Zhu, F. Jiang, "Exponential stability of stochastic neural networks with leakage delays and expectations in the coefficients," Neurocomputing , vol. 173, pp. 1268-1275, 2016.
[16] P. Balasubramaniam, M. Kalpana, R. Rakkiyappan, "Global asymptotic stability of BAM fuzzy cellular neural networks with time delay in the leakage term, discrete and unbounded distributed delays," Mathematical and Computer Modelling , vol. 53, no. 5-6, pp. 839-853, 2011.
[17] P. Balasubramaniam, M. Kalpana, R. Rakkiyappan, "Existence and global asymptotic stability of fuzzy cellular neural networks with time delay in the leakage term and unbounded distributed delays," Circuits, Systems, and Signal Processing , vol. 30, no. 6, pp. 1595-1616, 2011.
[18] K. Gu, "An integral inequality in the stability problem of time-delay systems," in Proceedings of 39th IEEE Conference on Decision and Control, pp. 2805-2810, Sydney, Australia, December 2000.
[19] X. R. Mao, "A note on the LaSalle-type theorems for stochastic differential delay equations," Journal of Mathematical Analysis and Applications , vol. 268, no. 1, pp. 125-142, 2002.
[20] C. Yin, Y. H. Cheng, Y. Q. Chen, B. Stark, S. M. Zhong, "Adaptive fractional-order switching-type control method design for 3D fractional-order nonlinear systems," Nonlinear Dynamics , vol. 82, no. 1-2, pp. 39-52, 2015.
Copyright © 2016 Mingzhu Song et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract
The stability issue is investigated for a class of stochastic neural networks with time delays in the leakage terms. Different from the previous literature, we are concerned with the almost sure stability. By using the LaSalle invariant principle of stochastic delay differential equations, Itô's formula, and stochastic analysis theory, some novel sufficient conditions are derived to guarantee the almost sure stability of the equilibrium point. In particular, the weak infinitesimal operator of Lyapunov functions in this paper is not required to be negative, which is necessary in the study of the traditional moment stability. Finally, two numerical examples and their simulations are provided to show the effectiveness of the theoretical results and demonstrate that time delays in the leakage terms do contribute to the stability of stochastic neural networks.