In this article, we study a nonlinear version of the sampling Kantorovich type operators in a multivariate setting, and we show applications to image processing. By means of the above operators, we are able to reconstruct continuous and uniformly continuous signals/images (functions). Moreover, we study the modular convergence of these operators in the setting of Orlicz spaces $L^\varphi(\mathbb{R}^n)$, which allows us to deal with the case of not necessarily continuous signals/images. The convergence theorems in $L^p(\mathbb{R}^n)$-spaces, $L\log L(\mathbb{R}^n)$-spaces and exponential spaces follow as particular cases. Several graphical representations, for the various examples and image processing applications, are included.
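For orientation, the basic linear, univariate sampling Kantorovich operator underlying this family replaces each sample value $f(k/w)$ by the mean of $f$ over the cell $[k/w,(k+1)/w]$, which is what makes the operators suitable for not necessarily continuous signals. Below is a minimal Python sketch of that univariate linear case only (the paper treats nonlinear, multivariate versions); the triangular kernel and the midpoint quadrature are illustrative choices, assuming a kernel satisfying the usual partition-of-unity condition:

```python
import numpy as np

def hat(t):
    # Triangular kernel (central B-spline of order 2), support [-1, 1];
    # it satisfies the partition of unity: sum_k hat(t - k) = 1 for all t.
    return np.maximum(0.0, 1.0 - np.abs(t))

def cell_mean(f, a, b, m=64):
    # Midpoint-rule approximation of the mean value of f on [a, b].
    u = a + (np.arange(m) + 0.5) * (b - a) / m
    return f(u).mean()

def sampling_kantorovich(f, x, w):
    # (S_w f)(x) = sum_k hat(w*x - k) * [mean of f over [k/w, (k+1)/w]].
    # Only k with |w*x - k| < 1 contribute, since hat has compact support.
    k0 = int(np.floor(w * x))
    total = 0.0
    for k in (k0 - 1, k0, k0 + 1):
        weight = hat(w * x - k)
        if weight > 0.0:
            total += weight * cell_mean(f, k / w, (k + 1) / w)
    return total

# usage: reconstruct f(u) = u at x = 0.5 with sampling rate w = 100
approx = sampling_kantorovich(lambda u: u, 0.5, 100)  # ~0.505 = 0.5 + 1/(2w)
```

For the linear test function the operator reproduces $f$ up to a shift of $1/(2w)$, so the reconstruction error vanishes as the sampling rate $w$ grows, in line with the convergence theorems stated above.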
A family of neural network operators of the Kantorovich type is introduced and their convergence is studied. Such operators are multivariate and based on certain special density functions, constructed through sigmoidal functions. Pointwise as well as uniform approximation theorems are established when such operators are applied to continuous functions. Moreover, $L^p$ approximations are also considered, with $1 \le p < +\infty$, since the $L^p$ setting is the most natural for the neural network operators of the Kantorovich type. Constructive multivariate approximation algorithms based on neural networks are important, since typical applications to neurocomputing processes do exist for high-dimensional data; the relation with the usual neural network approximations is then discussed. Several examples of sigmoidal functions to which the present theory can be applied are presented.
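As a concrete illustration of the construction, a standard density function generated by the logistic sigmoid $\sigma(x)=1/(1+e^{-x})$ is $\phi(x)=\tfrac{1}{2}\left(\sigma(x+1)-\sigma(x-1)\right)$, which satisfies $\sum_{k}\phi(x-k)=1$ by a telescoping argument. A minimal univariate Python sketch of a Kantorovich-type operator built from it follows; the truncation window and sample counts are illustrative choices, not taken from the paper:

```python
import numpy as np

def sigmoid(x):
    # logistic sigmoidal function
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    # Density generated by the logistic sigmoid:
    # phi(x) = (sigmoid(x+1) - sigmoid(x-1)) / 2,
    # which satisfies sum_k phi(x - k) = 1 (telescoping sum).
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def nn_kantorovich(f, x, n, window=30):
    # K_n f(x) = sum_k phi(n*x - k) * [mean of f over [k/n, (k+1)/n]],
    # truncated to |k - n*x| <= window, since phi decays exponentially.
    k0 = int(np.round(n * x))
    total = 0.0
    for k in range(k0 - window, k0 + window + 1):
        # midpoint-rule average of f over the cell [k/n, (k+1)/n]
        u = (k + (np.arange(32) + 0.5) / 32) / n
        total += phi(n * x - k) * f(u).mean()
    return total

# usage: approximate cos at x = 0.3 with n = 50 neurons per unit length
val = nn_kantorovich(np.cos, 0.3, 50)  # close to cos(0.3)
```

The coefficients are averages of $f$ over small cells rather than point evaluations, which is exactly the Kantorovich modification that makes the $L^p$ setting natural for these operators.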