Global Exponential Stability and Global Convergence in Finite Time of Neural Networks with Discontinuous Activations


Abstract

In this paper, we consider a general class of neural networks, which have arbitrary constant delays in the neuron interconnections, and neuron activations belonging to the set of discontinuous, monotone increasing, and (possibly) unbounded functions. Based on topological degree theory and the Lyapunov functional method, we provide some new sufficient conditions for the global exponential stability and global convergence in finite time of these delayed neural networks. Under these conditions the uniqueness of the solution to the initial value problem (IVP) is proved. The exponential convergence rate can be quantitatively estimated on the basis of the parameters defining the neural network. These conditions are easily testable and independent of the delay. Finally, some remarks and examples are discussed to compare the present results with existing ones.
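Although the abstract does not state the model explicitly, works on delayed neural networks with discontinuous activations typically study a system of the following form; the notation below ($D$, $A$, $B$, $\tau$, $I$, $g$) is the standard convention in this literature and is an assumption here, not taken from the abstract:

```latex
\dot{x}(t) = -D x(t) + A\, g(x(t)) + B\, g(x(t-\tau)) + I,
```

where $x(t) \in \mathbb{R}^n$ is the state vector, $D = \mathrm{diag}(d_1,\dots,d_n)$ with $d_i > 0$ are the neuron self-inhibitions, $A$ and $B$ are the interconnection and delayed interconnection matrices, $\tau \ge 0$ is the constant delay, $I$ is a constant external input, and $g$ is the vector of discontinuous, monotone increasing, and possibly unbounded activations. Since $g$ may be discontinuous, solutions of the IVP are usually understood in the Filippov sense, via the differential inclusion obtained by convexifying $g$ at its discontinuity points.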
