
PUBLICATIONS

Sep 9, 2021

Accepted by IEEE RA-L (Robotics and Automation Letters, a robotics journal with an impact factor of 3.74) and ICRA (a top conference in the field of robotics). The submission ID is 21-2224.


Link to the paper: https://ieeexplore.ieee.org/document/9676458


arXiv: https://arxiv.org/abs/2201.01760


Multi-robot systems such as swarms of aerial robots are naturally suited to offer additional flexibility, resilience, and robustness in several tasks compared to a single robot by enabling cooperation among the agents. To enhance the autonomous robot decision-making process and situational awareness, multi-robot systems have to coordinate their perception capabilities to collect, share, and fuse environment information among the agents in an efficient and meaningful way, so as to accurately obtain context-appropriate information or gain resilience to sensor noise or failures. In this paper, we propose a general-purpose Graph Neural Network (GNN) with the main goal to increase, in multi-robot perception tasks, single robots' inference perception accuracy as well as resilience to sensor failures and disturbances. We show two instances of the proposed formulation to address two multi-view visual perception problems. The first one deals with depth estimation, whereas the second one focuses on semantic classification. Several experiments using both photo-realistic and real data gathered from multiple aerial robots' viewpoints show the effectiveness of the proposed approach in challenging inference conditions, including images corrupted by heavy noise and camera occlusions or failures.
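
To make the idea concrete, here is a minimal, hypothetical sketch of the kind of GNN fusion step the abstract describes: each robot's pooled image features are exchanged with its neighbours, aggregated, and used to update that robot's own representation before a task head (depth or semantics). The module name, feature dimensions, message and update functions, and the fully connected communication graph are illustrative assumptions, not the architecture from the paper.

```python
# Illustrative sketch only, not the paper's implementation: one GNN
# message-passing round that fuses per-view features across robots.
import torch
import torch.nn as nn


class MultiViewGNNFusion(nn.Module):
    """Fuses pooled feature vectors from N robot viewpoints.

    Assumes a fully connected communication graph and hypothetical
    feature dimensions; the real message/update functions and task
    heads (depth, semantics) are defined in the paper.
    """

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        self.message_fn = nn.Sequential(nn.Linear(2 * feat_dim, feat_dim), nn.ReLU())
        self.update_fn = nn.GRUCell(feat_dim, feat_dim)

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_robots, feat_dim) pooled per-view features.
        num_robots = node_feats.shape[0]
        updated = []
        for i in range(num_robots):
            # Build messages from every other robot j to robot i.
            msgs = [
                self.message_fn(torch.cat([node_feats[j], node_feats[i]]))
                for j in range(num_robots) if j != i
            ]
            # Mean aggregation is permutation-invariant and degrades
            # gracefully if a neighbour drops out (fewer messages).
            agg = (torch.stack(msgs).mean(dim=0)
                   if msgs else torch.zeros_like(node_feats[i]))
            new_feat = self.update_fn(agg.unsqueeze(0), node_feats[i].unsqueeze(0))
            updated.append(new_feat.squeeze(0))
        return torch.stack(updated)


if __name__ == "__main__":
    fusion = MultiViewGNNFusion(feat_dim=256)
    views = torch.randn(4, 256)   # e.g. pooled CNN features from 4 aerial robots
    fused = fusion(views)         # fused per-robot features for a task head
    print(fused.shape)            # torch.Size([4, 256])
```

A per-robot decoder (for depth or semantic labels) would then consume the fused features; sharing information this way is what lets a robot with a noisy or occluded camera benefit from its neighbours' viewpoints.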


©2021 by Yue Zhou.
