Vision-based interaction force estimation for robot grip motion without tactile/force sensor
2023 | DOI: 10.1016/j.eswa.2022.118441

Cited by 15 publications (4 citation statements)
References 27 publications
“…The first step of encoder preprocessing is to calculate the body-speed observation value, which determines the encoder error term in the joint initialization and optimization of the vision/inertial-measurement-unit/encoder system, and to determine the degree of wheel slip by computing the robot slip factor. The linear and angular velocities of the robot body in the encoder frame, v and ω respectively, are calculated as shown in Equation (15) [18]; the distance between the two wheels is represented by D.…”
Section: B. Improved Algorithm in View of Multi-Sensor Fusion (citation type: mentioning)
confidence: 99%
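The quoted Equation (15) is not reproduced in the excerpt, but the quantities it names (body linear velocity v, angular velocity ω, wheel separation D) match standard differential-drive kinematics. Below is a minimal Python sketch under that assumption; the function names and the slip-factor definition are illustrative, not taken from the cited paper.

```python
# Minimal sketch of differential-drive body-velocity computation from wheel
# encoders, assuming standard kinematics. Names and the slip-factor
# definition are illustrative assumptions, not from the cited paper.

def body_velocity(v_left: float, v_right: float, wheel_base: float):
    """Return (v, omega) of the robot body.

    v_left, v_right: wheel rim speeds from the encoders [m/s]
    wheel_base:      distance D between the two wheels [m]
    """
    v = 0.5 * (v_right + v_left)             # linear velocity of the body
    omega = (v_right - v_left) / wheel_base  # angular velocity about the center
    return v, omega


def slip_factor(v_encoder: float, v_reference: float, eps: float = 1e-6) -> float:
    """Illustrative slip factor: relative mismatch between the encoder-derived
    speed and a reference speed (e.g., from the vision/IMU estimate)."""
    return abs(v_encoder - v_reference) / max(abs(v_reference), eps)
```

A large slip factor flags wheel slip, so the encoder term can be down-weighted in the joint vision/IMU/encoder optimization the excerpt describes.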
“…Ko et al developed a vision-based system to estimate the interaction forces between the robot grip and objects by combining RGB-D images, robot positions, and motor currents. By incorporating proprioceptive feedback with visual data, the proposed model achieves high accuracy in force estimation [17]. In the context of smart manufacturing, Chen et al develop a real-time milling force monitoring system, using sensory data to accurately estimate the forces involved in the process, thus enabling real-time adjustment to optimize the cutting operation [18].…”
Section: Introductionmentioning
confidence: 99%
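The excerpt describes a multimodal regression: visual features fused with proprioceptive signals (positions, motor currents) to predict force. The PyTorch sketch below shows that fusion pattern in minimal form; all layer sizes, input dimensions, and names are assumptions, not the architecture of Ko et al.

```python
# Hypothetical sketch of RGB-D + proprioception fusion for force regression.
# Layer sizes, input shapes, and names are illustrative assumptions.
import torch
import torch.nn as nn

class ForceEstimator(nn.Module):
    def __init__(self, proprio_dim: int = 14, force_dim: int = 3):
        super().__init__()
        # Small CNN encoder for a 4-channel RGB-D image (assumed 64x64 input).
        self.visual = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (N, 32)
        )
        # MLP encoder for joint positions + motor currents.
        self.proprio = nn.Sequential(nn.Linear(proprio_dim, 32), nn.ReLU())
        # Fusion head regressing a force vector.
        self.head = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(),
            nn.Linear(64, force_dim),
        )

    def forward(self, rgbd: torch.Tensor, proprio: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.visual(rgbd), self.proprio(proprio)], dim=1)
        return self.head(z)

# Example forward pass with dummy data.
model = ForceEstimator()
force = model(torch.randn(2, 4, 64, 64), torch.randn(2, 14))
print(force.shape)  # torch.Size([2, 3])
```

The design point the excerpt makes is that concatenating proprioceptive features with visual ones, rather than using images alone, is what lifts force-estimation accuracy.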
“…Lee et al. [39] conducted a study in which they used data collected from the accelerometer and gyroscope sensors built into a smartphone as input to a neural network to recognize the touched area on the smartphone, instead of using the tactile signal from the fingertip. Lee et al. [40] and Ko et al. [41] proposed methods that estimate the interaction force between a robot and an object during grasping from visual images, robot position, and electrical current, without relying on tactile signals. Proprioception is necessary for tactile perception in humans.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
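The first idea in the excerpt, recognizing the touched screen region from built-in inertial sensors alone, amounts to classifying a short window of 6-axis IMU samples. A minimal, hypothetical sketch follows; the window length, number of regions, and layer sizes are assumptions, not the setup of Lee et al. [39].

```python
# Hypothetical sketch: classify the touched screen region from a window of
# accelerometer + gyroscope samples, with no fingertip tactile signal.
# Window length, region count, and layer sizes are assumptions.
import torch
import torch.nn as nn

class TouchAreaClassifier(nn.Module):
    def __init__(self, window: int = 50, channels: int = 6, n_regions: int = 9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                                # (N, 6, 50) -> (N, 300)
            nn.Linear(channels * window, 128), nn.ReLU(),
            nn.Linear(128, n_regions),                   # logits over screen regions
        )

    def forward(self, imu_window: torch.Tensor) -> torch.Tensor:
        return self.net(imu_window)

# Dummy batch: 8 windows of 50 samples x (3 accel + 3 gyro) channels.
logits = TouchAreaClassifier()(torch.randn(8, 6, 50))
print(logits.argmax(dim=1))  # predicted region index per window
```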