Agent-based routing in wireless ad hoc networks defines a set of rules that all participating nodes follow, turning routing into a collaboration between nodes and reducing computational and resource costs. Swarm Intelligence uses agent-like entities from insect societies as a metaphor for solving the routing problem: certain insects exchange information about their activities and their environment in order to complete their tasks in an adaptive, efficient and scalable manner. This paper examines Swarm Intelligence-based routing protocols, along with a newly proposed bee-inspired protocol that provides multi-path routing in wireless ad hoc networks of mobile nodes. Simulation results indicate that applying Swarm Intelligence offers a significant degree of adaptability and efficiency that, under several network conditions, allows the protocol to outperform traditional approaches.
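To make the metaphor concrete, the sketch below shows a generic ant-colony-style routing table of the kind commonly used in Swarm Intelligence routing: each neighbor carries a pheromone value, next hops are chosen stochastically in proportion to pheromone (which keeps multiple paths alive), and successful deliveries reinforce the link used while all trails slowly evaporate. All names here (`SwarmRouter`, `reinforce`, the evaporation rate) are illustrative assumptions; this is not the paper's bee-inspired protocol, only a minimal example of the general mechanism.

```python
import random


class SwarmRouter:
    """Toy pheromone-based routing table for a single destination.

    Illustrative only: a generic ant-colony-style scheme, not the
    bee-inspired protocol proposed in the paper.
    """

    def __init__(self, neighbors, evaporation=0.1):
        # Start with a uniform pheromone trail toward every neighbor.
        self.pheromone = {n: 1.0 for n in neighbors}
        self.evaporation = evaporation  # fraction of trail lost per update

    def probabilities(self):
        # Normalize pheromone values into a next-hop probability distribution.
        total = sum(self.pheromone.values())
        return {n: p / total for n, p in self.pheromone.items()}

    def choose_next_hop(self, rng=random):
        # Roulette-wheel selection: stochastic choice preserves
        # secondary paths, which is what enables multi-path routing.
        r = rng.random()
        acc = 0.0
        for n, p in self.probabilities().items():
            acc += p
            if r <= acc:
                return n
        return n  # guard against floating-point rounding

    def reinforce(self, neighbor, quality):
        # Evaporate all trails, then deposit pheromone on the link
        # that carried a successful (high-quality) delivery.
        for n in self.pheromone:
            self.pheromone[n] *= (1.0 - self.evaporation)
        self.pheromone[neighbor] += quality
```

Repeated calls to `reinforce` shift probability mass toward links that keep delivering, while evaporation lets stale routes fade — the adaptivity the abstract attributes to Swarm Intelligence.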
During early infancy, humans experience many physical and cognitive changes that shape their learning and refine their understanding of objects in the world. With the extended arm being one of the very first objects they familiarise themselves with, infants pass through a series of developmental stages that progressively facilitate physical interaction, enrich sensory information and build the skills needed to learn and recognise. Drawing inspiration from infancy, this study models an open-ended learning mechanism for embodied agents that accounts for the cumulative and increasingly complex physical interactions with the world. The proposed system achieves object perception and recognition as the agent (i.e., a humanoid robot) matures, experiences changes to its visual capabilities, develops sensorimotor control, and interacts with objects within its reach. The reported findings demonstrate the critical role of developing vision in effective object learning and recognition, and the importance of reaching and grasping in resolving visually elicited ambiguities. Impediments caused by the interdependency of the parallel components responsible for the agent's physical and cognitive functionalities are exposed, revealing an interesting phase transition in the use of object perceptions for recognition.
The understanding of concepts related to objects is developed over a long period in infancy. This paper investigates how physical constraints and changes in visual perception impact both sensorimotor development for gaze control and the perception of features of interesting regions in the scene. Through a progressive series of developmental stages simulating ten months of infant development, the paper examines feature perception toward the recognition of localized regions in the environment. Results of two experiments, conducted using the iCub humanoid robot, indicate that by following the proposed approach a cognitive agent can scaffold sensorimotor experiences to allow gradual exploration of its surroundings and local region recognition in terms of low-level feature similarities. In addition, this paper reports the emergence of vision-related phenomena that match human behaviors found in the developmental psychology literature.