Autonomous micro helicopters are starting to play a major role in tasks like search and rescue, environment monitoring, security surveillance, transportation, and inspection. However, such operations raise two main challenges. First, GPS-based navigation is not sufficient: fully autonomous operation in cities or other dense indoor and outdoor environments requires micro helicopters to fly at low altitudes, where GPS signals are often shadowed or absent. Second, agile motions are still not possible during the aforementioned tasks, compromising the execution of critical missions, which typically must be accomplished in a fast and agile manner within a limited amount of time. Thus, several perception and control challenges have yet to be addressed and solved. Unmanned Aerial Vehicles (UAVs) should be able to fly autonomously and with agility in extreme navigation conditions, guaranteeing robust high-rate state estimation for closed-loop control. At the same time, multiple MAVs have been endowed with manipulation and transportation capabilities. Although the complexity of such systems increases with the number of agents, MAVs can perform tasks collaboratively and exchange information with each other to make better decisions and optimize task execution.


This workshop will focus on the next research challenges in the area of vision-based navigation for single and multiple collaborative drones. In these areas, there are still several open research and scientific challenges related to efficient environment representations for navigation, and toward unified solutions for manipulation, transportation, locomotion, human-robot interaction, and heterogeneity in unstructured environments. How can drone autonomy change human mobility? How can these machines interact with humans during a task, predicting their future behavior and providing situational awareness while relaxing communication constraints? How do we co-design perception and action loops for fast navigation of small-scale aerial platforms to obtain racing-capable vehicles? What role should machine learning play in autonomy? What are the perception challenges in aerial swarms, and how do we solve them?

Topics of interest to this workshop include, but are not necessarily limited to:

  • Visionary ideas for autonomy of vision-based UAVs
  • Agile autonomous navigation, transportation and manipulation with UAVs
  • High-speed visual control and state estimation of aerial vehicles
  • Long-term and long-range perception for UAVs without GPS
  • Sensor fusion for autonomous navigation in unstructured environments
  • System software and hardware architectures
  • Mapping and obstacle avoidance
  • Perception in challenging and dynamic environments
  • Modeling and performance benchmarking for three-dimensional navigation
  • Dynamic visual servo control of aerial vehicles
  • Cooperative estimation and control with multiple aerial vehicles
  • Resource constrained navigation
  • Field robotics
  • Search and rescue robotics


Organizers

  • Giuseppe Loianno, New York University
  • Davide Scaramuzza, University of Zurich



Program Committee

  • Dr. Shaojie Shen, HKUST
  • Dr. Gary McGrath, Qualcomm Research
  • Dr. Nikolai Smolyanskiy, NVIDIA
  • Dr. Tarek Taha, Algorythma/Krypto Labs
  • Dr. Debadeepta Dey, Microsoft
  • Dr. Juan Nieto, ETH Zurich
  • Dr. Nathan Michael, CMU
  • Dr. Martin Saska, CTU

This workshop is endorsed by the IEEE RAS TC on Aerial Robotics and Unmanned Aerial Vehicles and supported by the DCIST Distributed and Collaborative Intelligent Systems and Technology Collaborative Research Alliance (CRA).