Direct electrical stimulation of the auditory nerve can be used to restore some degree of hearing to the profoundly deaf. Percepts due to electrical stimulation have characteristics corresponding approximately to the acoustic percepts of loudness, pitch, and timbre. To encode speech as a pattern of electrical stimulation, it is necessary to determine the effects of the stimulus parameters on these percepts. The effects of the three basic stimulus parameters of level, repetition rate, and stimulation location on subjects' percepts were examined. Pitch difference limens arising from changes in rate of stimulation increase as the stimulating rate increases, up to a saturation point between 200 and 1000 pulses per second. Changes in pitch due to electrode selection depend upon the subject, but generally agree with a tonotopic organization of the human cochlea. Further, the discriminability of such place-pitch percepts appears to depend on the degree of current spread in the cochlea. The effect of stimulus level on perceived pitch is significant but highly dependent on the individual tested. The results of these experiments are discussed in terms of their impact on speech-processing strategies and their relevance to acoustic pitch perception.
The Stone-Weierstrass theorem and its terminology are reviewed, and neural network architectures based on this theorem are presented. Specifically, exponential functions, polynomials, partial fractions, and Boolean functions are used to create networks capable of approximating arbitrary bounded measurable functions. A modified logistic network satisfying the theorem is proposed as an alternative to commonly used networks based on logistic squashing functions.
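As an illustrative sketch (not an implementation from the paper), the polynomial case of such a network can be realized as a single hidden layer of monomial units with a linear output layer; by the Stone-Weierstrass theorem, widening this layer lets the network approximate any continuous function on a closed interval arbitrarily well. The fitting method (least squares) and the target function below are assumptions for the demonstration:

```python
import numpy as np

def polynomial_network(xs, ys, degree):
    """A minimal 'polynomial network': hidden units compute the monomials
    x**0 .. x**degree, and the output layer is a linear combination of
    those activations, fit here by least squares."""
    # Hidden-layer activations: one column per monomial unit
    H = np.vander(xs, degree + 1, increasing=True)
    # Output-layer weights chosen to minimize squared error
    w, *_ = np.linalg.lstsq(H, ys, rcond=None)

    def net(x):
        return np.vander(np.atleast_1d(x), degree + 1, increasing=True) @ w

    return net

# Approximate a bounded continuous target on [0, 1]
xs = np.linspace(0.0, 1.0, 200)
ys = np.sin(2 * np.pi * xs)
net = polynomial_network(xs, ys, degree=9)
max_err = np.max(np.abs(net(xs) - ys))
```

Increasing `degree` drives `max_err` toward zero, which is the constructive content of the theorem for the polynomial subalgebra; the exponential, partial-fraction, and Boolean families in the paper play the same role with different hidden-unit activations.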
Conventional artificial neural networks perform functional mappings from their input space to their output space. The synaptic weights encode information about the mapping in a manner analogous to long-term memory in biological systems. This paper presents a method of designing neural networks in which recurrent signal loops store this knowledge in a manner analogous to short-term memory. The synaptic weights of these networks instead encode a learning algorithm, giving the networks the ability to dynamically learn any functional mapping from a (possibly very large) set without changing any synaptic weights. These networks are adaptive dynamic systems: learning is online, taking place continually as part of the network's overall behavior rather than as a separate, externally driven process. We present four higher-order fixed-weight learning networks. Two of these networks have standard backpropagation embedded in their synaptic weights; the other two use a more efficient gradient-descent-based learning rule, discovered by examining variations in fixed-weight topology. Empirical tests show that all of these networks successfully learned functions from both discrete (Boolean) and continuous function sets. The networks were largely robust to perturbations in the synaptic weights; the exception was the recurrent connections used to store information, which required a tight tolerance of 0.5%. We found that the cost of these networks scales approximately in proportion to the total number of synapses. Finally, we consider evolving fixed-weight networks tailored to a specific problem class by analyzing the meta-learning cost surface of the networks presented.
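The core idea, learning carried by recurrent state rather than by weight changes, can be sketched in a toy scalar form (this is an illustrative analogy, not one of the paper's four architectures): the recurrent state holds the current estimate of a linear mapping, and a delta-rule update is applied as part of the forward dynamics while the one fixed parameter, the learning rate, never changes. The target mapping and stream below are assumptions for the demonstration:

```python
def fixed_weight_learner(stream, lr=0.5):
    """Toy fixed-weight learner: the recurrent state w stores the learned
    scalar mapping, and each time step applies a delta-rule (gradient
    descent) update as part of the network's dynamics. The only fixed
    'synaptic weight' is the learning rate lr, which is never modified."""
    w = 0.0          # recurrent state: short-term memory of the mapping
    preds = []
    for x, y in stream:
        pred = w * x                   # forward pass from current state
        preds.append(pred)
        w = w + lr * (y - pred) * x    # learning realized as state update
    return w, preds

# Learn y = 2x online from a short input/target stream
stream = [(x, 2.0 * x) for x in [1.0, 0.5, -1.0, 0.8, 1.0]]
w, _ = fixed_weight_learner(stream)
```

After the stream is consumed, `w` has converged near the true slope of 2, yet no fixed parameter was ever altered; the sensitivity of the stored value to noise in the recurrent loop mirrors the tight tolerance the paper reports for its information-storing recurrent connections.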