VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization

Peng-Yuan Kao1, Hsui-Jui Chang1, Kuan-Wei Tseng1,2, Timothy Chen1, He-Lin Luo3, Yi-Ping Hung1
1National Taiwan University   2Tokyo Institute of Technology   3Tainan National University of the Arts
IEEE ACCESS 2023

We propose a learning-based localization method that fuses Vision, Inertial Measurement Unit (IMU), and Ultra-Wideband (UWB) sensors. It consists of a VI branch, which takes images and IMU measurements as input and predicts the relative pose between consecutive image frames, and a UWB branch (blue background), which regresses the absolute position from the UWB measurements. The relative poses and absolute positions are combined for UAV indoor localization.
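The combination step above can be sketched as a simple complementary-style fusion: dead-reckon the VI branch's relative translations from the last fused position, then blend each prediction with the UWB branch's absolute estimate. This is a minimal illustrative sketch, not the paper's actual fusion network; the function name, the `alpha` weight, and the translation-only simplification are all assumptions for illustration.

```python
import numpy as np

def fuse_trajectory(rel_translations, uwb_positions, alpha=0.8):
    """Hypothetical fusion sketch (not the paper's method).

    rel_translations: (N, 3) per-frame relative translations (VI branch);
                      entry 0 is unused.
    uwb_positions:    (N, 3) absolute position estimates (UWB branch).
    alpha:            weight on the VI dead-reckoned prediction (0..1).
    Returns the (N, 3) fused trajectory.
    """
    fused = np.zeros_like(uwb_positions, dtype=float)
    fused[0] = uwb_positions[0]  # anchor the trajectory at the first UWB fix
    for i in range(1, len(uwb_positions)):
        # Dead-reckon one step with the VI relative translation,
        # then blend with the absolute UWB position.
        predicted = fused[i - 1] + rel_translations[i]
        fused[i] = alpha * predicted + (1.0 - alpha) * uwb_positions[i]
    return fused
```

The idea is that the VI branch is smooth but drifts over time, while the UWB branch is drift-free but noisy, so anchoring the dead-reckoned chain to the absolute fixes bounds the accumulated error.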


We have collected a dataset that provides synchronized RGB images (RealSense D435i), 6DoF IMU measurements (RealSense D435i), and UWB measurements (Nooploop LinkTrack S), together with ground-truth poses from a motion capture system (Vicon).

Citation

If you find this research helpful, please cite the following (volume, number, and pages coming soon):

@ARTICLE{VIUNet,
      author={Kao, Peng-Yuan and Chang, Hsui-Jui and Tseng, Kuan-Wei and Chen, Timothy and Luo, He-Lin and Hung, Yi-Ping},
      journal={IEEE Access}, 
      title={VIUNet: Deep Visual–Inertial–UWB Fusion for Indoor UAV Localization}, 
      year={2023},
      volume={},
      number={},
      pages={},
      doi={10.1109/ACCESS.2023.3279292}}