Topologies of Learning and Development

Mark H. Bickhard
Robert L. Campbell

How systems can represent and how systems can learn are two central problems in the study of cognition. Conventional contemporary approaches to these problems are vitiated by a shared error in their presuppositions about representation. Consequently, such approaches share further errors about the sorts of architectures that are required to support either representation or learning. We argue that the architectural requirements for genuine representing systems lead to architectural characteristics that are necessary (though not sufficient) for heuristic learning and development. These architectural constraints, in turn, explain properties of the functioning of the central nervous system that remain inexplicable for standard approaches.

We will first outline some architectural requirements for models of learning and development, then present a model of representation that poses its own architectural requirements. The requirements for representation, it turns out, will also satisfy the requirements for learning and development.

A major architectural requirement imposes itself on any system that can profit from past successes and failures; in other words, any system that can learn and get better at learning. That requirement is a functional topology in the space of possible constructive products of learning and development. The basic intuition is that, if previous problem-solving successes are to be functionally useful in attempts at solving new problems, there must be some way to "locate" and make use of the old problem representations and solutions that are "closest" to the new problem representations, or at least "nearby." Such closeness information constitutes a topology.
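The intuition of "locating" nearby prior solutions can be given a minimal computational sketch. The code below is purely illustrative, not the authors' formalism: it assumes, for the sake of the example, that problem representations can be encoded as feature vectors and that Euclidean distance supplies the closeness information; any function inducing a topology on the constructive space would serve the same role.

```python
from math import sqrt

def distance(p, q):
    """Euclidean distance between two feature vectors -- one simple way
    to supply 'closeness' information over problem representations.
    The choice of metric is an assumption made for illustration."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def nearest_prior_solutions(new_problem, memory, k=2):
    """Return the k stored (problem, solution) pairs whose problem
    representations are closest to the new problem representation."""
    return sorted(memory, key=lambda pair: distance(new_problem, pair[0]))[:k]

# Hypothetical memory of previously solved problems (names are invented).
memory = [((0.0, 0.0), "solution A"),
          ((1.0, 1.0), "solution B"),
          ((5.0, 5.0), "solution C")]

# A new problem near (1, 1) retrieves "solution B" first.
print(nearest_prior_solutions((0.9, 1.1), memory, k=2))
```

The point of the sketch is only that heuristic reuse presupposes some such distance-like structure: without it, the system has no principled way to select which prior constructions are relevant to a new problem.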
Exploration of these topological requirements yields strong constraints on the nature and locus of learning and development.

Such topological considerations are not well addressed within standard architectural and mathematical approaches to computation. In finite state machines, Turing machines, or programming languages, topologies do not naturally arise. When Artificial Intelligence researchers encounter the constraints that yield the necessity for such topologies, the standard fix, as we shall discuss below, is to introduce feature-ba...