Discrete-Time Zhang Neural Networks for Time-Varying Nonlinear Optimization
Author: Min Sun, Maoying Tian, Yiju Wang
Language: English
Year of publication: 2019
Source: Discrete Dynamics in Nature and Society, Vol 2019 (2019)
Document type: article
ISSN: 1026-0226; 1607-887X
DOI: 10.1155/2019/4745759
Description: As a special kind of recurrent neural network, the Zhang neural network (ZNN) has been successfully applied to solving various time-variant problems. In this paper, we present three Zhang et al. discretization (ZeaD) formulas, including a special two-step ZeaD formula, a general two-step ZeaD formula, and a general five-step ZeaD formula, and prove that the special and general two-step ZeaD formulas are convergent, while the general five-step ZeaD formula is not zero-stable and is therefore divergent. Then, to solve the time-varying nonlinear optimization (TVNO) problem in real time, based on the Taylor series expansion and the above two convergent two-step ZeaD formulas, we discretize the continuous-time ZNN (CTZNN) model of TVNO and thus obtain a special two-step discrete-time ZNN (DTZNN) model and a general two-step DTZNN model. Theoretical analyses indicate that the sequence generated by the first DTZNN model is divergent, while the sequence generated by the second DTZNN model is convergent. Furthermore, a tight upper bound on the step-size of the second DTZNN model and the optimal step-size are also discussed. Finally, some numerical results and comparisons are provided and analyzed to substantiate the efficacy of the proposed DTZNN models.
Database: Directory of Open Access Journals
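The abstract describes solving TVNO in real time by discretizing a continuous-time ZNN model with ZeaD formulas. As a rough illustration of that general idea, and not of the paper's specific two-step DTZNN models, the following Python sketch applies the simplest Euler-type discretization of a CTZNN model to a hypothetical time-varying objective; the objective f(x, t), the sampling gap tau, and the step-size h are all assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical time-varying objective (not from the paper):
#   f(x, t) = (x1 - sin t)^2 + (x1 - sin t)(x2 - cos t) + (x2 - cos t)^2,
# whose time-varying minimizer is x*(t) = (sin t, cos t).

def grad_x(x, t):
    """Gradient of f with respect to x."""
    d1, d2 = x[0] - np.sin(t), x[1] - np.cos(t)
    return np.array([2.0 * d1 + d2, d1 + 2.0 * d2])

def grad_xt(x, t):
    """Partial derivative of grad_x with respect to t."""
    return np.array([-2.0 * np.cos(t) + np.sin(t),
                     -np.cos(t) + 2.0 * np.sin(t)])

def hessian(x, t):
    """Hessian of f with respect to x (constant for this example)."""
    return np.array([[2.0, 1.0], [1.0, 2.0]])

def dtznn_step(x, t, tau, h):
    """One Euler-type discrete step of the CTZNN model
    H(x,t) x_dot = -gamma * grad_x - grad_xt, with h = gamma * tau:
    x_{k+1} = x_k - H_k^{-1} (h * grad_x(x_k, t_k) + tau * grad_xt(x_k, t_k))."""
    H = hessian(x, t)
    rhs = -(h * grad_x(x, t) + tau * grad_xt(x, t))
    return x + np.linalg.solve(H, rhs)

tau, h = 0.01, 0.1          # sampling gap and step-size (illustrative values)
x = np.array([0.5, 0.5])    # initial point
for k in range(1000):
    t = k * tau
    x = dtznn_step(x, t, tau, h)

t_end = 1000 * tau
print("x_k       :", x)
print("x*(t_end) :", np.array([np.sin(t_end), np.cos(t_end)]))
```

The paper's special and general two-step DTZNN models would replace the one-step update above with two-step ZeaD combinations of x_{k+1}, x_k, and x_{k-1}; their coefficients, the tight upper bound on the step-size, and the convergence/divergence results are established in the article itself.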