Toward Minimalistic and Learning-Enabled Autonomous Navigation
Thursday, January 25, 2018
About the Seminar
This presentation described efforts to develop an autonomous navigation framework whose components work together to minimize the size, computation, and power required for a robust sensing and navigation package. The framework includes a lightweight lidar odometry module that extracts and relies on a small number of descriptive features to localize a ground vehicle, a real-time scan-by-scan terrain traversability mapping algorithm, and a hierarchical multiobjective motion-planning framework capable of safe and efficient decision making over these maps. These concepts were demonstrated on the presenter's Clearpath Jackal electric unmanned ground vehicle (UGV).
The presentation also discussed the potential for machine learning techniques to outperform the costly optimization-based approaches that are repeatedly invoked in many UGV perception and decision-making processes. Although expensive to train, approaches such as deep learning with convolutional neural networks offer the promise of fast and scalable query times, and preliminary results in pursuit of this goal were presented.
Applications to problems such as autonomously clearing snow from a campus driveway (a GPS-unfriendly environment) and detecting and avoiding pedestrians were also discussed.
About the Speaker

Brendan Englot is an assistant professor of mechanical engineering at Stevens Institute of Technology, where he directs the Robust Field Autonomy Lab. The lab focuses on robust autonomous navigation solutions for robots operating in harsh and unstructured environments. He previously worked at United Technologies Research Center, where he was a research scientist and principal investigator in the Autonomous and Intelligent Robotics Laboratory and a technical contributor to the Sikorsky Autonomous Research Aircraft.