A Reconfigurable Data Glove for Reconstructing Physical and Virtual Grasps

Hangxin Liu, Zeyu Zhang, Ziyuan Jiao, Zhenliang Zhang, Minchen Li, Chenfanfu Jiang, Yixin Zhu, Song-Chun Zhu

Engineering, 2024, 32(1): 217-232. DOI: 10.1016/j.eng.2023.01.009

Research Article

Abstract

In this work, we present a reconfigurable data glove design to capture different modes of human hand-object interactions, which are critical in training embodied artificial intelligence (AI) agents for fine manipulation tasks. To achieve various downstream tasks with distinct features, our reconfigurable data glove operates in three modes sharing a unified backbone design that reconstructs hand gestures in real time. In the tactile-sensing mode, the glove system aggregates manipulation force via customized force sensors made from a soft and thin piezoresistive material; this design minimizes interference during complex hand movements. The virtual reality (VR) mode enables real-time interaction in a physically plausible fashion: A caging-based approach is devised to determine stable grasps by detecting collision events. Leveraging a state-of-the-art finite element method, the simulation mode collects data on fine-grained four-dimensional manipulation events comprising hand and object motions in three-dimensional space and how the object’s physical properties (e.g., stress and energy) change in accordance with manipulation over time. Notably, the glove system presented here is the first to use high-fidelity simulation to investigate the unobservable physical and causal factors behind manipulation actions. In a series of experiments, we characterize our data glove in terms of individual sensors and the overall system. More specifically, we evaluate the system’s three modes by ① recording hand gestures and associated forces, ② improving manipulation fluency in VR, and ③ producing realistic simulation effects of various tool uses, respectively. Based on these three modes, our reconfigurable data glove collects and reconstructs fine-grained human grasp data in both physical and virtual environments, thereby opening up new avenues for the learning of manipulation skills for embodied AI agents.
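
To make the modes summarized above more concrete, the sketch below illustrates two of the ideas in miniature: converting raw piezoresistive sensor readings into an aggregate manipulation force, and flagging a grasp as stable once enough opposing collision contacts are detected (a loose stand-in for the caging-based check). This is an illustrative sketch only, not the authors' implementation; the names (PiezoSensor, aggregate_grasp_force, is_caging_grasp) and all calibration constants are hypothetical placeholders.

```python
# Illustrative sketch only; not the paper's implementation.
# All class/function names and numeric constants are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PiezoSensor:
    """One thin-film piezoresistive pad (e.g., fingertip or palm region)."""
    label: str
    r_fixed: float = 10_000.0   # fixed divider resistor (ohm), placeholder
    v_supply: float = 3.3       # supply voltage (V)

    def force_from_adc(self, adc_value: int, adc_max: int = 1023) -> float:
        """Convert a raw ADC reading to an approximate force (N).

        Assumes a voltage-divider readout and a simple linear map from
        sensor conductance to force; a real system would use per-sensor
        calibration curves.
        """
        v_out = self.v_supply * adc_value / adc_max
        if v_out <= 0.0:
            return 0.0
        # Clamp to avoid division blow-up when the pad saturates.
        r_sensor = max(self.r_fixed * (self.v_supply - v_out) / v_out, 1.0)
        k = 50_000.0  # placeholder conductance-to-force gain
        return k / r_sensor


def aggregate_grasp_force(sensors: List[PiezoSensor],
                          raw_readings: List[int]) -> float:
    """Sum per-pad force estimates into one manipulation-force value."""
    return sum(s.force_from_adc(r) for s, r in zip(sensors, raw_readings))


def is_caging_grasp(contact_normals: List[Tuple[float, float, float]],
                    min_contacts: int = 3) -> bool:
    """Very rough caging-style stability check.

    Treats a grasp as stable when enough collision contacts are detected
    and their normals largely oppose one another (i.e., the hand surrounds
    the object). A true caging test is geometric and far more involved;
    this only mirrors the "detect collision events" idea.
    """
    if len(contact_normals) < min_contacts:
        return False
    sx = sum(n[0] for n in contact_normals)
    sy = sum(n[1] for n in contact_normals)
    sz = sum(n[2] for n in contact_normals)
    resultant = (sx * sx + sy * sy + sz * sz) ** 0.5
    return resultant < 0.5 * len(contact_normals)  # opposing normals cancel


if __name__ == "__main__":
    pads = [PiezoSensor("thumb"), PiezoSensor("index"), PiezoSensor("palm")]
    print("total force ~", round(aggregate_grasp_force(pads, [512, 640, 300]), 2), "N")
    print("caged?", is_caging_grasp([(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0)]))
```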

Keywords

Data glove / Tactile sensing / Virtual reality / Physics-based simulation

Cite this article

Hangxin Liu, Zeyu Zhang, Ziyuan Jiao, Zhenliang Zhang, Minchen Li, Chenfanfu Jiang, Yixin Zhu, Song-Chun Zhu. A Reconfigurable Data Glove for Reconstructing Physical and Virtual Grasps. Engineering, 2024, 32(1): 217-232. DOI: 10.1016/j.eng.2023.01.009
