Inverse Result of Approximation for the Max-Product Neural Network Operators of the Kantorovich Type and Their Saturation Order

Authors: Marco Cantarini, Lucian Coroianu, Danilo Costarelli, Sorin G. Gal, Gianluca Vinti
Language: English
Year of publication: 2021
Source: Mathematics, Vol 10, Iss 1, p 63 (2021)
Document type: article
ISSN: 2227-7390
DOI: 10.3390/math10010063
Description: In this paper, we consider the max-product neural network operators of the Kantorovich type based on certain linear combinations of sigmoidal and ReLU activation functions. In general, it is well known that max-product type operators have applications in problems related to probability and fuzzy theory, involving both real and interval/set-valued functions. In particular, here we address inverse approximation problems for the above family of sub-linear operators. We first establish their saturation order for a certain class of functions; i.e., we show that if a continuous and non-decreasing function f can be approximated with a rate of convergence faster than 1/n, as n goes to +∞, then f must be constant. Furthermore, we prove a local inverse theorem of approximation; i.e., assuming that f can be approximated with a rate of convergence of 1/n, we show that f is a Lipschitz continuous function.
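Stated symbolically, the two results summarized in the description can be sketched as follows. Here $K_n$ denotes the max-product Kantorovich-type operator and $\|\cdot\|_\infty$ the sup norm; this notation is an assumption introduced for illustration, not taken from the record itself.

```latex
% Saturation order: for f continuous and non-decreasing,
% convergence faster than 1/n forces f to be constant.
\|K_n(f) - f\|_{\infty} = o\!\left(\tfrac{1}{n}\right), \ n \to +\infty
\quad \Longrightarrow \quad f \ \text{is constant}.

% Local inverse theorem: a rate of order 1/n forces Lipschitz continuity.
\|K_n(f) - f\|_{\infty} = \mathcal{O}\!\left(\tfrac{1}{n}\right), \ n \to +\infty
\quad \Longrightarrow \quad f \ \text{is Lipschitz continuous}.
```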
Database: Directory of Open Access Journals