Abstract: In recent years, multi-modal haptic rendering with multi-contact interaction has become a research focus in human-computer interaction, owing to its ability to further enhance the sense of reality. This paper presents a multi-modal haptic rendering system for multi-contact interaction, composed of a Leap Motion sensor based multi-finger position detection module, a vibration-based tactile rendering module, an NRF51822 Bluetooth communication module, a visual rendering module, and a CHAI3D-based virtual environment. To verify the efficacy and effectiveness of the proposed system, virtual object contour recognition experiments and object pick-and-place experiments were carried out. Experimental results show that the average success rate for distinguishing geometric virtual objects is 87.9%, and that combining haptic and visual rendering in the pick-and-place experiment yields an average time saving of 47.7% compared with the visual-rendering-only condition. The proposed multi-contact haptic system offers low cost, small size, and a simple control strategy, laying a solid foundation for the further development and application of multi-modal human-computer interaction.