Abstract: A sufficiently smooth function of d variables that decays fast enough at infinity can be represented pointwise by an integral combination of Heaviside plane waves (i.e., characteristic functions of closed half-spaces). The weight function in such a representation depends on the derivatives of the represented function. The representation is proved here by elementary techniques, with separate arguments for even and odd d, and it unifies and extends various results in the literature. An outline of the paper follows. Section 2 reviews neural networks and establishes notation, Section 3 discusses Green's functions and Green's second identity, and Section 4 describes functions of controlled decay and states our main theorem. We consider the 1-dimensional case in Section 5, provide the necessary lemmas in Section 6, and prove the main theorem in Section 7. Section 8 presents extensions and refinements of our representation and relates it to known results. The paper ends with a brief discussion.
Keywords: Feedforward neural networks

Feedforward neural networks compute functions determined by the type of their units and by their interconnections. Each computational unit depends on two vector variables (an input and a parameter) and is given by a function φ : R^p × R^d → R, where p and d are the dimensions of the parameter and input spaces, respectively, and R denotes the set of real numbers.

One-hidden-layer networks, with hidden units based on a fixed function φ and a single linear output unit, yield functions f : R^d → R of the form

f(x) = Σ_{i=1}^{n} w_i φ(a_i, x),

where n is the number of hidden units, w_i ∈ R are the output weights, and a_i ∈ R^p are the input parameters of the i-th unit for i = 1, ..., n.

A perceptron is a computational unit based on a function of the form φ((v, b), x) = ψ(v · x + b), where ψ : R → R is an activation function, v ∈ R^d is a weight vector, and b ∈ R is a bias; here the parameter dimension is p = d + 1.
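To make the notation concrete, the following is a minimal sketch of a one-hidden-layer network of Heaviside perceptrons, assuming the closed-half-space convention θ(t) = 1 for t ≥ 0 suggested by the abstract; the function name heaviside_network and its parameter layout are illustrative choices, not taken from the paper.

```python
import numpy as np

def heaviside_network(x, V, b, w):
    """Evaluate f(x) = sum_i w_i * theta(v_i . x + b_i), where theta is the
    Heaviside step function (characteristic function of a closed half-space).

    x : input vector, shape (d,)
    V : weight vectors v_i of the n hidden units, shape (n, d)
    b : biases b_i of the hidden units, shape (n,)
    w : output weights w_i, shape (n,)
    """
    # theta(t) = 1 for t >= 0, 0 otherwise (closed half-space convention)
    hidden = (V @ x + b >= 0).astype(float)
    return w @ hidden

# Example: a network with n = 3 hidden units on d = 2 inputs
rng = np.random.default_rng(0)
V = rng.normal(size=(3, 2))
b = rng.normal(size=3)
w = rng.normal(size=3)
print(heaviside_network(np.array([0.5, -1.0]), V, b, w))
```

Each hidden unit here is a perceptron φ((v_i, b_i), x) = θ(v_i · x + b_i), so the network output is exactly the finite combination Σ_{i=1}^{n} w_i φ(a_i, x) with a_i = (v_i, b_i) ∈ R^{d+1}; the integral representation of the main theorem can be viewed as the continuum analogue of this sum.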