We construct generalized translation networks to uniformly approximate a class of nonlinear, continuous functionals defined on L^p([-1, 1]^s) for integer s ≥ 1, 1 ≤ p < ∞, or on C([-1, 1]^s). We obtain lower bounds on the possible order of approximation of such functionals by any approximation process that depends continuously on a given number of parameters. Our networks almost achieve this order of approximation in terms of the number of parameters (neurons) involved. The training is simple and noniterative; in particular, we avoid any optimization such as that involved in the usual backpropagation.
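The "simple and noniterative" training idea can be illustrated with a toy sketch (this is an assumed setup, not the paper's construction): fix the hidden-layer weights and biases of a one-hidden-layer network, so that fitting the output layer reduces to a single linear least-squares solve, with no backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Hypothetical target function on [-1, 1], chosen only for illustration.
    return np.sin(np.pi * x)

n_neurons = 50
w = rng.uniform(-5.0, 5.0, n_neurons)   # fixed hidden weights (not trained)
b = rng.uniform(-5.0, 5.0, n_neurons)   # fixed hidden biases (not trained)

x = np.linspace(-1.0, 1.0, 200)
H = np.tanh(np.outer(x, w) + b)          # hidden activations, shape (200, 50)

# Output weights from one noniterative least-squares solve.
c, *_ = np.linalg.lstsq(H, target(x), rcond=None)

err = np.max(np.abs(H @ c - target(x)))
print(f"uniform error on grid: {err:.2e}")
```

Because the only trained parameters enter linearly, the fit is a convex problem solved in closed form, which is the sense in which such training avoids iterative optimization.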
In this paper, we study a rumor spreading model in which several types of ignorants exist, with trust rates λ_i, 1 ≤ i ≤ N. We rigorously show the existence of a threshold on a momentum-type initial quantity for the occurrence of a rumor outbreak, regardless of the total initial population. We employ a steady-state analysis to obtain the final size of the rumor. Using numerical simulations, we demonstrate the threshold phenomenon for the rumor size and discuss the interaction between ignorants with different trust rates.
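A minimal numerical sketch of this kind of dynamics (with hypothetical parameters, not the paper's exact model) uses a Maki-Thompson-style system with two ignorant classes, each converting to spreaders at its own trust rate λ_i, integrated by forward Euler:

```python
import numpy as np

lam = np.array([0.8, 0.3])        # trust rates for the two ignorant classes (assumed)
sigma = 0.5                       # stifling rate (assumed)
I = np.array([0.6, 0.35])         # initial ignorant fractions per class
S, R = 0.05, 0.0                  # initial spreader and stifler fractions

dt, T = 0.01, 60.0
for _ in range(int(T / dt)):
    new_spreaders = lam * I * S   # class-i ignorants meeting spreaders
    stifled = sigma * S * (S + R) # spreaders silenced by spreader/stifler contact
    I = I - dt * new_spreaders
    S = S + dt * (new_spreaders.sum() - stifled)
    R = R + dt * stifled

final_size = 1.0 - I.sum()        # fraction that ever heard the rumor
print(f"final rumor size: {final_size:.3f}")
```

Sweeping the initial spreader fraction (or the λ_i) in such a toy model shows the qualitative threshold behavior: below a critical value the final size stays near its initial level, above it a macroscopic outbreak occurs.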
Abstract. In this paper, we investigate a localized approximation of a continuously differentiable function by neural networks. To do this, we first approximate a continuously differentiable function by B-spline functions and then approximate the B-spline functions by neural networks. Our proofs are constructive, and we give numerical results to support our theory.
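The first step of this two-stage scheme can be sketched as follows (an illustrative setup, not the paper's construction): fit a clamped cubic B-spline expansion to a smooth function on [-1, 1] by least squares, evaluating the basis with the Cox-de Boor recursion. The paper's second step would then realize the spline by a network.

```python
import numpy as np

def bspline_basis(t, k, knots, x):
    # Cox-de Boor recursion for the t-th B-spline basis function of degree k.
    if k == 0:
        return ((knots[t] <= x) & (x < knots[t + 1])).astype(float)
    left = np.zeros_like(x)
    denom = knots[t + k] - knots[t]
    if denom > 0:
        left = (x - knots[t]) / denom * bspline_basis(t, k - 1, knots, x)
    right = np.zeros_like(x)
    denom = knots[t + k + 1] - knots[t + 1]
    if denom > 0:
        right = (knots[t + k + 1] - x) / denom * bspline_basis(t + 1, k - 1, knots, x)
    return left + right

def f(x):
    # Hypothetical C^1 target, chosen only for illustration.
    return np.exp(x) * np.cos(2.0 * x)

k = 3                                             # cubic splines
interior = np.linspace(-1.0, 1.0, 15)
knots = np.concatenate([[-1.0] * k, interior, [1.0] * k])  # clamped knot vector
n_basis = len(knots) - k - 1

x = np.linspace(-1.0, 0.999, 400)                 # stay inside the half-open support
B = np.column_stack([bspline_basis(i, k, knots, x) for i in range(n_basis)])
coef, *_ = np.linalg.lstsq(B, f(x), rcond=None)

err = np.max(np.abs(B @ coef - f(x)))
print(f"max spline approximation error: {err:.2e}")
```

For a smooth target, refining the knot spacing h drives the error down at the usual O(h^4) rate for cubic splines, which is what makes the spline an effective intermediate between the target function and the network.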