SG-Grasp: semantic segmentation guided robotic grasp oriented to weakly textured objects based on visual perception sensors
journal contribution
posted on 2024-02-07, 10:30 authored by Ling Tong, Kechen Song, Hongkun Tian, Yi Man, Yunhui Yan, Qinggang Meng

Weakly textured objects are frequently manipulated by industrial and domestic robots, and the two most common types are transparent and reflective objects; however, their unique visual properties present challenges even for advanced grasp detection algorithms. Many existing algorithms rely heavily on depth information, which ordinary red-green-blue and depth (RGB-D) sensors cannot provide accurately for transparent and reflective objects. To overcome this limitation, we propose an innovative solution that uses semantic segmentation to effectively segment weakly textured objects and guide grasp detection. Using only red-green-blue (RGB) images from RGB-D sensors, our segmentation algorithm (RTSegNet) achieves state-of-the-art performance on the newly proposed TROSD dataset. Importantly, our method enables robots to grasp transparent and reflective objects without requiring retraining of the grasp detection network (which is trained solely on the Cornell dataset). Real-world robot experiments demonstrate the robustness of our approach in grasping commonly encountered weakly textured objects; furthermore, results obtained from various datasets validate the effectiveness and robustness of our segmentation algorithm. Code and video are available at: https://github.com/meiguiz/SG-Grasp.
Funding
Research on 3D Dynamic Detection Theory and Identification Method for Surface Defects of Large High-temperature Structural Parts
National Natural Science Foundation of China
Chunhui Plan Cooperative Project of Ministry of Education under Grant HZKY20220433
111 Project under Grant B16009
History
School
- Science
Department
- Computer Science
Published in
IEEE Sensors Journal
Volume
23
Issue
22
Pages
28430 - 28441
Publisher
IEEE
Version
- AM (Accepted Manuscript)
Rights holder
© IEEE
Publisher statement
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Acceptance date
2023-09-30
Publication date
2023-10-09
Copyright date
2023
ISSN
1530-437X
eISSN
1558-1748
Publisher version
Language
- en