
Visual-Tactile Robot Grasping Based on Human Skill Learning From Demonstrations Using a Wearable Parallel Hand Exoskeleton

Result type:
Journal article
Authors:
Lu, Zhenyu; Chen, Lu; Dai, Hengtai; Li, Haoran; Zhao, Zhou; ...
Corresponding author:
Yang, CG
Author affiliations:
[Zheng, Bofang; Lu, Zhenyu; Yang, Chenguang; Dai, Hengtai] Univ West England, Bristol Robot Lab, Bristol BS16 1QY, England.
[Chen, Lu] Shanxi Univ, Inst Big Data Sci & Ind, Taiyuan 030006, Peoples R China.
[Lepora, Nathan F.; Li, Haoran] Univ West England, Bristol Robot Lab, Bristol BS8 1TW, England.
[Zhao, Zhou] Cent China Normal Univ, Sch Comp Sci, Wuhan 430079, Peoples R China.
[Yang, Chenguang] Univ West England, Bristol Robot Lab, Bristol BS16 1QY, England.
Corresponding author's affiliation:
[Yang, CG]
Univ West England, Bristol Robot Lab, Bristol BS16 1QY, England.
Language:
English
Keywords:
Force and tactile sensing; learning from demonstration; exoskeleton; data-driven human modeling; robot grasping
Journal:
IEEE ROBOTICS AND AUTOMATION LETTERS
ISSN:
2377-3766
Year:
2023
Volume:
8
Issue:
9
Pages:
5384-5391
Funding:
This work was supported in part by the H2020 Marie Sklodowska-Curie Actions Individual Fellowship under Grant 101030691 and in part by the National Natural Science Foundation of China under Grant 62003200.
Institutional attribution:
This university is listed as a non-primary (other) institution.
Department:
School of Computer Science
Abstract:
Soft fingers and strategic grasping skills enable human hands to grasp objects stably. This letter models human grasping skills and transfers the learned skills to robots to improve grasping quality and success rate. First, we designed a wearable tool-like parallel hand exoskeleton equipped with optical tactile sensors to acquire multimodal information, including hand positions and postures, the relative distance of the exoskeleton claws, and tactile images. Using the demonstration data, we summarized three characteristi...
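For readers skimming the record, the following is a minimal Python sketch of what one time step of the multimodal demonstration data described in the abstract might look like. The field names, array shapes, and the choice of a quaternion for posture are illustrative assumptions, not details taken from the paper.

from dataclasses import dataclass

import numpy as np


@dataclass
class DemoSample:
    # One time step of multimodal demonstration data (hypothetical schema).
    hand_position: np.ndarray   # (3,) Cartesian position of the hand
    hand_posture: np.ndarray    # (4,) hand orientation as a unit quaternion
    claw_distance: float        # relative distance between the exoskeleton claws, in meters
    tactile_image: np.ndarray   # (H, W) grayscale image from an optical tactile sensor
    timestamp: float            # seconds since the start of the demonstration


# Example: an empty sample at the start of a demonstration.
sample = DemoSample(
    hand_position=np.zeros(3),
    hand_posture=np.array([0.0, 0.0, 0.0, 1.0]),  # identity rotation
    claw_distance=0.05,
    tactile_image=np.zeros((240, 320)),
    timestamp=0.0,
)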
