
Dept. of Computer Science PhD thesis defence - Joshua Springer

Computational methods for autonomous multirotor drone landing
12 June, 13:00-14:30
Reykjavík University - room M209

Join us for the PhD thesis defence of Joshua Springer on his thesis: Computational methods for autonomous multirotor drone landing.

Defence committee:

  • Main Supervisor: Gylfi Þór Guðmundsson, Assistant Professor, RU
  • Co-Supervisor: Marcel Kyas, Assistant Professor, RU
  • Examiner: Stefanos Vrochidis, Researcher B, Information Technologies Institute - Greece

Committee members: 

  • Joe Foley, Assistant Professor, RU
  • Sebastian Scherer, Associate Research Professor, Carnegie Mellon University (CMU) - USA
  • Ingibjörg Jónsdóttir, Associate Professor, HÍ

Master of ceremonies: Björn Þór Jónsson

Abstract:

This dissertation presents findings on the topic of autonomous multirotor drone landing – one basic area of multirotor drone flight that is not yet fully automated – with emphasis on real-world proofs of concept. We conduct two phases of research, focusing first on landing sites that are structured with fiducial markers, and next on unstructured landing sites, where the drone cannot expect to detect existing infrastructure.

The first phase is a continuation of the author's master's thesis, which proposes an autonomous landing method based on fiducial markers and a gimbal-mounted camera, tested in simulation. We migrate the method from simulation to the real world and then expand on it. The initial migration revealed problems in recognizing the orientation of the fiducial markers in the real world, which had been obscured by idealized graphics in simulation. We quantify this orientation ambiguity in several fiducial systems and carry out a real-world landing experiment without mitigating the orientation ambiguity to gauge its effects. Finally, we develop a method for avoiding this issue entirely by directing the drone based solely on the angle from the drone to the landing pad. We demonstrate this method in the real world with both visual and infrared fiducial markers. The infrared markers can serve as unpowered landing site identification infrastructure both day and night.
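The angle-only idea described above can be illustrated with a minimal sketch: rather than using the marker's (ambiguous) estimated orientation, the controller combines the gimbal's known angles with the marker's pixel offset to obtain a bearing toward the pad, then steers from that bearing alone. All function names, the pinhole/FOV model, sign conventions, and gains below are illustrative assumptions, not the thesis's actual implementation.

```python
import math

def angle_to_pad(gimbal_pitch_deg, gimbal_yaw_deg, marker_px, marker_py,
                 img_w, img_h, hfov_deg, vfov_deg):
    """Bearing from drone to pad: gimbal orientation plus the marker's
    angular offset from the image centre. Note that the marker's own
    orientation estimate is never used, sidestepping its ambiguity."""
    # Pixel offset from image centre, mapped linearly to an angular offset
    # (a simplification of the true camera projection).
    dx = (marker_px - img_w / 2) / (img_w / 2) * (hfov_deg / 2)
    dy = (marker_py - img_h / 2) / (img_h / 2) * (vfov_deg / 2)
    return gimbal_pitch_deg + dy, gimbal_yaw_deg + dx

def velocity_setpoint(pitch_deg, yaw_deg, descend_rate=0.3, gain=0.5):
    """Toy proportional controller: correct horizontally toward the pad
    while descending at a fixed rate. Gains are arbitrary placeholders."""
    lateral = gain * math.radians(yaw_deg)
    # Zero forward correction when the pad is straight down (pitch = -90 deg).
    forward = gain * (math.radians(pitch_deg) + math.pi / 2)
    return forward, lateral, -descend_rate
```

With the gimbal pointing straight down and the marker centred in the frame, the bearing equals the gimbal angles and the setpoint reduces to a pure descent.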

The second phase involves analyzing the terrain beneath the drone to determine whether it is safe for landing. We focus on suburban environments as an easier test case, and on lava fields as a more challenging test case that is plentiful and relevant in Iceland. We develop a geometric method that uses a stereo depth camera to evaluate terrain topographically. We also develop an appearance-based image segmentation method for performing an initial search for viable landing sites with a visual camera, which is trained on synthetic data extracted from LiDAR scans. The resulting classifiers are lightweight and quick to train. We conduct a validation step to show that the synthetic data can be used to train such classifiers for successful use in the real world. The classifiers can correctly generalize to previously unseen analog environments, but may not generalize to completely different environment types.
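A common way to evaluate terrain topographically from a depth camera, and one plausible reading of the geometric method above, is to fit a plane to a patch of the point cloud and threshold its slope and roughness. The sketch below is a generic least-squares version of that idea; the thresholds and the function name are illustrative assumptions, not the thesis's actual criteria.

```python
import numpy as np

def patch_is_landable(points, max_slope_deg=10.0, max_roughness_m=0.05):
    """Decide whether an Nx3 point-cloud patch is flat enough to land on.
    Fits a plane by SVD, then thresholds tilt and peak deviation."""
    centered = points - points.mean(axis=0)
    # The right-singular vector with the smallest singular value is the
    # normal of the best-fit (least-squares) plane through the patch.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    if normal[2] < 0:                      # orient the normal upward
        normal = -normal
    slope_deg = np.degrees(np.arccos(np.clip(normal[2], -1.0, 1.0)))
    roughness = np.abs(centered @ normal).max()  # worst out-of-plane point
    return slope_deg <= max_slope_deg and roughness <= max_roughness_m
```

A flat patch passes; a 45-degree incline or a patch with a protruding rock fails on slope or roughness respectively.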

Real-world testing requires significant engineering overhead, which is itself useful to other researchers. Therefore, we describe our systems and payloads in detail so they can be reproduced by others. We carry out autonomous landing experiments using a DJI Spark and the DJI Mobile SDK for autonomous control. Finally, we develop three payloads for the DJI Matrice 350 that add onboard computers with varying degrees of integration and autonomous control, and we provide guidelines for reproducing them.

Please note that photographs and videos are taken at events hosted at Reykjavík University (RU) and may be used for RU marketing purposes. Further information is available on ru.is or by e-mail: personuvernd@ru.is.
