Remote situational intelligent sensing system for human-machine integration

2021, 43(6): 85-94

NIU Wenlong
Key Laboratory of Electronics and Information Technology for Space Systems, National Space Science Center, Chinese Academy of Sciences, Beijing 100190, China;
University of Chinese Academy of Sciences, Beijing 100049, China, niuwenlong@nssc.ac.cn
FAN Mingrui
Key Laboratory of Electronics and Information Technology for Space Systems, National Space Science Center, Chinese Academy of Sciences, Beijing 100190, China;
University of Chinese Academy of Sciences, Beijing 100049, China
LI Yun
Key Laboratory of Electronics and Information Technology for Space Systems, National Space Science Center, Chinese Academy of Sciences, Beijing 100190, China
PENG Xiaodong
Key Laboratory of Electronics and Information Technology for Space Systems, National Space Science Center, Chinese Academy of Sciences, Beijing 100190, China, pxd@nssc.ac.cn
XIE Wenming
Key Laboratory of Electronics and Information Technology for Space Systems, National Space Science Center, Chinese Academy of Sciences, Beijing 100190, China
REN Jingyi
Key Laboratory of Electronics and Information Technology for Space Systems, National Space Science Center, Chinese Academy of Sciences, Beijing 100190, China
YANG Zhen
Key Laboratory of Electronics and Information Technology for Space Systems, National Space Science Center, Chinese Academy of Sciences, Beijing 100190, China
Abstract:
To meet the urgent need for precise cooperation between people and remote unmanned devices, a human-machine integrated remote situational awareness system was built on the Robot Operating System (ROS), and experiments and analysis were carried out. Taking visual positioning as its foundation and the fusion of human and machine perception as its entry point, the system reconstructs the environment and target information detected by the unmanned device in three dimensions through real-time 3D scene reconstruction and a scene-consistency fusion method, displays the result on an augmented reality device, and fuses it consistently with the wearer's own visual information, thereby achieving cooperative localization between the remote unmanned device and the augmented reality device worn by the person under GPS-denied conditions. The experimental results show that the system achieves good human-machine cooperative localization accuracy at close range, and that the accuracy gradually decreases as the distance increases. The proposed system turns the unmanned device into an extension of the human eye, providing through-obstacle, beyond-line-of-sight perception without interfering with the normal movement of personnel, and can play an important role in future information-based operations.
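To make the coordinate chain behind the GPS-denied co-localization concrete, the following is a minimal illustrative sketch, not the authors' implementation: it assumes both the unmanned device and the AR headset are localized against a shared map frame by visual positioning, so a target detected in the device's frame can be re-expressed in the headset's frame by chaining homogeneous transforms. All frame names, poses, and the target coordinates below are hypothetical.

```python
# Illustrative sketch only (not the paper's implementation): re-express a target
# detected in the unmanned device's frame in the AR headset's frame, assuming
# both devices are localized against a shared map frame by visual positioning.
import numpy as np

def pose_to_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def transform_point(T: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical poses estimated by visual SLAM, both w.r.t. the shared map frame.
T_map_vehicle = pose_to_matrix(np.eye(3), np.array([12.0, 3.0, 0.0]))  # map <- vehicle
T_map_headset = pose_to_matrix(np.eye(3), np.array([1.0, 0.5, 1.6]))   # map <- headset

# Hypothetical target detected by the vehicle, expressed in the vehicle frame.
p_target_vehicle = np.array([2.5, -0.4, 0.2])

# Chain the transforms: headset <- map <- vehicle.
p_target_map = transform_point(T_map_vehicle, p_target_vehicle)
p_target_headset = transform_point(np.linalg.inv(T_map_headset), p_target_map)

print("target in map frame:    ", p_target_map)
print("target in headset frame:", p_target_headset)  # point to render as an AR overlay
```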
Foundation item:
Supported by the Basic Frontier Science Research Program of the Chinese Academy of Sciences (22E0223301)
Received:
2020-05-08