Evolution-in-Materio is a computational paradigm in which an algorithm reconfigures a material's properties to achieve a specific computational function. This paper addresses how successful, well-performing Evolution-in-Materio processors can be designed through the selection of nanomaterials and of an evolutionary algorithm for a target application. A physical model of a nanomaterial network is developed which captures both the randomness and the possibility of Ohmic and non-Ohmic conduction that are characteristic of such materials. These differing networks are then exploited by differential evolution, which optimises several configuration parameters (e.g., configuration voltages and weights) to solve different classification problems. We show that the ideal nanomaterial choice depends on problem complexity: more complex problems are favoured by a complex voltage dependence of conductivity, and vice versa. Furthermore, we highlight how intrinsic nanomaterial electrical properties can be exploited by differing configuration parameters, clarifying the role and limitations of these techniques. These findings provide guidance for the rational design of nanomaterials and algorithms for future Evolution-in-Materio processors.
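To make the optimisation loop concrete, the following Python sketch shows differential evolution (via SciPy's differential_evolution) tuning configuration voltages and output weights of a toy surrogate network. The simulate_network function, the toy data, and all parameter values are illustrative assumptions standing in for the paper's physical model, not the authors' actual code.

# Hedged sketch: differential evolution configuring a toy surrogate
# for a nanomaterial network classifier. Everything below is assumed
# for illustration; it is not the paper's physical model.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Toy 2D binary classification data (assumption).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

def simulate_network(x, config_voltages):
    # Hypothetical stand-in for the material network: a fixed random
    # coupling whose nonlinear (non-Ohmic-like) response is shifted by
    # the applied configuration voltages.
    W = np.array([[0.8, -0.5, 1.2], [-1.1, 0.9, 0.4]])
    return np.tanh(x @ W + config_voltages)

def fitness(params):
    # EiM-style parameter split (assumed): 3 configuration voltages
    # followed by 3 output weights. Returns classification error.
    v, w = params[:3], params[3:]
    scores = simulate_network(X, v) @ w
    preds = (scores > 0).astype(int)
    return np.mean(preds != y)

bounds = [(-5, 5)] * 3 + [(-1, 1)] * 3
result = differential_evolution(fitness, bounds, seed=0, maxiter=50)
print("best classification error:", result.fun)

In this sketch, the material itself is never trained; differential evolution only searches over the external configuration parameters, mirroring the division of labour described above.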
Nanomaterial networks have been proposed as a building block for unconventional in-Materio processors. Evolution-in-Materio (EiM) offers a way to configure and exploit physical materials for computation, but its ability to scale as datasets grow larger and more complex remains unclear. Extreme Learning Machines (ELMs) exploit a randomly initialised single-layer feed-forward neural network by training the output layer only. A physical analogue of an ELM is produced by exploiting nanomaterial networks as material neurons within the hidden layer. Circuit simulations are used to efficiently investigate the diode-resistor networks that act as our material neurons. These in-Materio ELMs (iM-ELMs) outperform common classification methods and traditional artificial ELMs of similar hidden-layer size. For iM-ELMs with the same number of hidden-layer neurons, leveraging larger, more complex material-neuron topologies (with more nodes/electrodes) leads to better performance, showing that these larger materials have a greater capability to process data. Finally, iM-ELMs using virtual material neurons, where a single material is re-used as several virtual neurons, achieve results comparable to iM-ELMs that exploit several different materials. However, while these virtual iM-ELMs provide significant flexibility, they sacrifice the highly parallelised nature of physically implemented iM-ELMs.
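As a reference point for the training scheme described above, here is a minimal Python sketch of a conventional ELM: a fixed random hidden layer (playing the role the material neurons play in an iM-ELM) and an output layer fitted in closed form by least squares. The data, layer size, and activation are illustrative assumptions, not details from the paper.

# Hedged sketch of a conventional ELM under assumed toy settings.
# In an iM-ELM, the tanh hidden layer below would be replaced by the
# measured response of a physical diode-resistor network.
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 2D inputs, binary targets (assumption).
X = rng.normal(size=(300, 2))
y = (np.sin(X[:, 0]) + X[:, 1] > 0).astype(float)

n_hidden = 50
W = rng.normal(size=(2, n_hidden))   # random, never-trained input weights
b = rng.normal(size=n_hidden)        # random biases

H = np.tanh(X @ W + b)               # hidden-layer responses ("neurons")
beta = np.linalg.pinv(H) @ y         # train output layer only (least squares)

preds = (H @ beta > 0.5).astype(float)
print("training accuracy:", np.mean(preds == y))

The key property this illustrates is that only beta is learned; the hidden layer stays fixed, which is exactly what makes a physical material usable as the hidden layer, and why re-using one material as several virtual neurons trades parallelism for flexibility.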