Normalized stochastic gradient descent learning of general complex-valued models

Author: T. Paireder, C. Motz, M. Huemer
Language: English
Year of publication: 2021
Source: Electronics Letters, Vol 57, Iss 12, Pp 493-495 (2021)
Document type: article
ISSN: 1350-911X, 0013-5194
DOI: 10.1049/ell2.12170
Abstract: The stochastic gradient descent (SGD) method is one of the most prominent first-order iterative optimisation algorithms, enabling linear adaptive filters as well as general nonlinear learning schemes. It is applicable to a wide range of objective functions, while featuring low computational costs for online operation. However, without a suitable step-size normalisation, the convergence and tracking behaviour of the SGD method may degrade in practical applications. In this letter, a novel general normalisation approach is provided for the learning of (non-)holomorphic models with multiple independent parameter sets. The advantages of the proposed method are demonstrated by means of a specific widely-linear estimation example.
Database: Directory of Open Access Journals
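The letter's specific normalisation scheme is not reproduced in this record. As a point of reference, the sketch below shows a standard widely-linear complex NLMS baseline in NumPy: the model has two independent parameter sets (a filter `h` applied to the input `x` and a filter `g` applied to `conj(x)`), and each SGD update is normalised by the instantaneous input energy. All variable names, the step size, and the regularisation constant are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch only: a conventional widely-linear complex NLMS
# filter, NOT the letter's proposed normalisation scheme.
N = 4        # filter length (assumed)
mu = 0.5     # base step size (assumed)
eps = 1e-8   # regularisation to avoid division by zero

# Ground-truth widely-linear system: d = h_t^H x + g_t^H conj(x) + noise
h_t = rng.standard_normal(N) + 1j * rng.standard_normal(N)
g_t = rng.standard_normal(N) + 1j * rng.standard_normal(N)

h = np.zeros(N, dtype=complex)
g = np.zeros(N, dtype=complex)

for _ in range(5000):
    x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
    d = (np.vdot(h_t, x) + np.vdot(g_t, np.conj(x))
         + 0.01 * rng.standard_normal())
    e = d - (np.vdot(h, x) + np.vdot(g, np.conj(x)))
    # Normalise the step by the instantaneous input energy ||x||^2
    norm = np.vdot(x, x).real + eps
    # Wirtinger-gradient updates w.r.t. conj(h) and conj(g)
    h += (mu / norm) * np.conj(e) * x
    g += (mu / norm) * np.conj(e) * np.conj(x)
```

After the loop, `h` and `g` should approach `h_t` and `g_t`; without the division by `norm`, the same recursion can diverge for high-energy inputs, which is the practical degradation the abstract alludes to.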