Problem statement: ART1 artificial neural networks offer good tools for test clustering, where no expert is needed once the system is well trained. However, the absence of an output reference for the input patterns makes it hard to judge the quality of the training. Moreover, performance depends to a great extent on a set of training parameters. Designers follow general recommendations or rely on their own expertise to find good parameter sets, with no performance guarantees. Many methods have been proposed, ranging from greedy methods that offer quick, acceptable solutions to evolutionary algorithms that offer suboptimal parameter sets. While evolutionary algorithms are a good choice for quality, their computational cost is large even for an offline process; after all, computing resources are not free.

Approach: We introduce a method for selecting a parameter set that yields comparable performance and robust operation at a relatively low cost compared with the evolutionary methods. The method locates a suitable set through repetitive partitioning of the parameter range, keeping the best sub-range for the next iteration.

Results: Tests showed that performance comparable to that of the computationally intensive evolutionary methods can be achieved in much less time.

Conclusion: The repetitive partitioning method for finding a good set of training parameters is very cost effective and yields good performance.
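The repetitive-partitioning idea described in the approach can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a single training parameter (e.g. the ART1 vigilance) searched over a numeric range, and the `evaluate` scoring function is a hypothetical stand-in for whatever clustering-quality measure the training uses.

```python
def partition_search(evaluate, lo, hi, parts=4, iterations=6):
    """Repeatedly partition [lo, hi] into equal sub-ranges, score a
    representative point from each, and recurse into the best-scoring
    sub-range. Returns the centre of the final, narrow sub-range."""
    for _ in range(iterations):
        width = (hi - lo) / parts
        # Score the midpoint of each sub-range and keep the best one.
        best = max(range(parts),
                   key=lambda i: evaluate(lo + (i + 0.5) * width))
        lo, hi = lo + best * width, lo + (best + 1) * width
    return (lo + hi) / 2

# Toy stand-in for a clustering-quality score, peaking at vigilance = 0.7.
score = lambda rho: -(rho - 0.7) ** 2
best_rho = partition_search(score, 0.0, 1.0)
```

Each pass shrinks the search range by a factor of `parts`, so the cost grows linearly with the number of iterations rather than with the resolution of an exhaustive grid, which is the source of the cost advantage over population-based searches.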