Automated benchmarking environments aim to support researchers in understanding how different algorithms perform on different types of optimization problems. Such comparisons provide insights into the strengths and weaknesses of different approaches, which can be leveraged both in designing new algorithms and in automating algorithm selection and configuration. With the ultimate goal of creating a meaningful benchmark set for iterative optimization heuristics, we have recently released IOHprofiler, a software environment built to create detailed performance comparisons between iterative optimization heuristics. In the present work we demonstrate that IOHprofiler provides a suitable environment for automated benchmarking. We compile and assess a selection of 23 discrete optimization problems that subscribe to different types of fitness landscapes. For each selected problem we compare the performance of twelve different heuristics, which are now available as baseline algorithms in IOHprofiler. We also provide a new module for IOHprofiler which extends the fixed-target and fixed-budget results for the individual problems with ECDF results, allowing one to derive aggregated performance statistics for groups of problems.
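The ECDF aggregation described above can be illustrated with a minimal sketch. This is not IOHprofiler's actual API; the function and variable names, and the example runtimes, are hypothetical. The idea is to pool, over all benchmark functions and runs, the runtimes at which a target value was first hit, and report for each budget the fraction of (run, target) pairs that succeeded within that budget:

```python
# Hypothetical sketch of fixed-target ECDF aggregation across functions.
# Data and names are illustrative only, not IOHprofiler's real interface.
from bisect import bisect_right

def ecdf(hitting_times, budgets):
    """Fraction of (run, target) pairs hit within each budget.

    hitting_times: runtimes at which the target was first reached
                   (None = target never reached in that run).
    budgets:       sorted list of evaluation budgets to report on.
    """
    finite = sorted(t for t in hitting_times if t is not None)
    n = len(hitting_times)
    return [bisect_right(finite, b) / n for b in budgets]

# Aggregation across problems: pool the hitting times of all
# (function, run) pairs into one sample, then take a single ECDF.
runs_per_function = {
    "OneMax":      [12, 20, 35, None],   # illustrative runtimes
    "LeadingOnes": [80, 95, None, None],
}
pooled = [t for times in runs_per_function.values() for t in times]
budgets = [10, 50, 100]
print(ecdf(pooled, budgets))  # → [0.0, 0.375, 0.625]
```

With multiple targets per function, each (function, target, run) triple would contribute one hitting time to the pooled sample, which is what makes the aggregated curve comparable across problems of different difficulty.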