Soft continuum robots are highly flexible and adaptable, making them ideal for unstructured environments such as the human body and agriculture. However, their high compliance and manoeuvrability make them difficult to model, sense, and control. Current control strategies focus on Cartesian-space control of the end-effector, but few works have explored full-body control. This study presents a novel image-based deep learning approach for closed-loop kinematic shape control of soft continuum robots. The method combines a local inverse kinematics formulation in the image space with deep convolutional neural networks for accurate shape control that is robust to feedback noise and mechanical changes in the continuum arm. The shape controller is fast and straightforward to implement: it takes only a few hours to generate training data, train the network, and deploy, requiring only a web camera for feedback. This method offers an intuitive and user-friendly way to control the robot's 3D shape and configuration through teleoperation using only 2D hand-drawn images of the desired target state, without the need for further user instruction or consideration of the robot's kinematics.
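The core idea of a local inverse kinematics formulation in the image space can be illustrated with a minimal sketch. The toy model below is an assumption for illustration only: a planar multi-segment arm whose keypoints are servoed toward target image positions via a damped pseudoinverse of a numerically estimated Jacobian (the paper's method uses convolutional networks and camera feedback instead; the function names here are hypothetical).

```python
import numpy as np

def forward_kinematics(q, seg_len=1.0):
    """Image-space keypoints (segment tips) of a toy planar arm with joint angles q."""
    pts, angle, pos = [], 0.0, np.zeros(2)
    for qi in q:
        angle += qi
        pos = pos + seg_len * np.array([np.cos(angle), np.sin(angle)])
        pts.append(pos.copy())
    return np.concatenate(pts)

def numerical_jacobian(q, eps=1e-6):
    """Finite-difference Jacobian of the keypoints with respect to q."""
    f0 = forward_kinematics(q)
    J = np.zeros((f0.size, q.size))
    for i in range(q.size):
        dq = q.copy()
        dq[i] += eps
        J[:, i] = (forward_kinematics(dq) - f0) / eps
    return J

def shape_servo(q, target, iters=200, gain=0.5):
    """Iteratively reduce the image-space shape error with a local IK update."""
    for _ in range(iters):
        err = target - forward_kinematics(q)
        q = q + gain * np.linalg.pinv(numerical_jacobian(q)) @ err
    return q

# Servo from an initial configuration toward a reachable target shape.
q0 = np.array([0.1, 0.1])
target = forward_kinematics(np.array([0.6, -0.3]))
q_sol = shape_servo(q0, target)
print(np.allclose(forward_kinematics(q_sol), target, atol=1e-3))
```

In the actual system the forward map and Jacobian are not known analytically; the learned networks play that role, and the error signal comes from comparing the camera image against the hand-drawn target shape.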
Soft robotic grippers are becoming increasingly popular for agricultural and logistics automation. Their passive conformability enables them to adapt to varying product shapes and sizes, providing stable large-area grasps. This work presents a novel methodology for combining soft robotic grippers with electrical impedance tomography-based sensors to infer intrinsic properties of grasped fruits. We use a Fin Ray soft robotic finger with embedded microspines to grasp the object and obtain rich multi-directional electrical measurements. Learning-based techniques are then used to infer the desired fruit properties. The framework is extensively tested and validated on multiple fruit groups. Our results show that ripeness parameters and even the weight of the grasped fruit can be estimated with reasonable accuracy autonomously using the proposed system.
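The inference step of such a pipeline can be sketched as a regression from per-grasp electrical features to a fruit property. The sketch below uses synthetic data and closed-form ridge regression purely as an assumed stand-in: the feature layout, the linear relationship, and all variable names are hypothetical, not the paper's actual sensor model or learning method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each grasp yields impedance features measured across
# electrode pairs at several excitation frequencies; the target property
# (e.g. a ripeness score) is assumed linear in these features plus noise.
n_grasps, n_features = 200, 8
X = rng.normal(size=(n_grasps, n_features))        # per-grasp impedance features
w_true = rng.normal(size=n_features)
y = X @ w_true + 0.05 * rng.normal(size=n_grasps)  # synthetic ripeness labels

# Closed-form ridge regression: w = (X^T X + lam I)^(-1) X^T y
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

rmse = np.sqrt(np.mean((X @ w - y) ** 2))
print(rmse < 0.1)
```

In practice a nonlinear learned model would replace the ridge step, but the interface is the same: multi-directional electrical measurements in, a scalar fruit property out.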
Road infrastructure is one of the most vital assets of any country. Keeping the road infrastructure clean and unpolluted is important for ensuring road safety and reducing environmental risk. However, roadside litter picking is an extremely laborious, expensive, monotonous and hazardous task. Automating the process would save taxpayers' money and reduce the risk for road users and the maintenance crew. This work presents LitterBot, an autonomous robotic system capable of detecting, localizing and classifying common roadside litter. We use a learning-based object detection and segmentation algorithm trained on the TACO dataset for identifying and classifying garbage. We develop a robust modular manipulation framework by using soft robotic grippers and a real-time visual-servoing strategy. This enables the manipulator to pick up objects of variable sizes and shapes even in dynamic environments. The robot achieves greater than 80% classified picking and binning success rates across all experiments, validated on a wide variety of test litter objects in static single and cluttered configurations and with dynamically moving test objects. Our results showcase how a deep model trained on an online dataset can be deployed in real-world applications with high accuracy through the appropriate design of a control framework around it.
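The real-time visual-servoing idea that lets the manipulator track moving objects can be sketched as a simple closed loop: re-detect the target each step and apply a proportional correction in image space. The sketch below is an assumed minimal version with hypothetical function names; the actual system uses a learned detector/segmenter on camera frames rather than a fixed binary mask.

```python
import numpy as np

def detect_centroid(mask):
    """Centroid (row, col) of a binary segmentation mask of the target object."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def visual_servo(gripper_px, get_mask, gain=0.4, tol=1.0, max_steps=100):
    """Drive the gripper's image position toward the (possibly moving) target.

    Re-detecting the mask every iteration is what makes the loop robust to
    objects that shift between frames.
    """
    for _ in range(max_steps):
        err = detect_centroid(get_mask()) - gripper_px
        if np.linalg.norm(err) < tol:
            break
        gripper_px = gripper_px + gain * err
    return gripper_px

# Static test object: a square blob in a synthetic 100x100 segmentation image.
mask = np.zeros((100, 100), dtype=bool)
mask[40:50, 60:70] = True
final = visual_servo(np.array([10.0, 10.0]), lambda: mask)
print(np.linalg.norm(final - detect_centroid(mask)) < 1.0)
```

A dynamic object is handled by the same loop: the `get_mask` callable simply returns a fresh segmentation each iteration, so the error term always points at the object's current position.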