LCSR Seminar: Christoffer Heckman “Robust, real-time perception strategies for robotic platforms in challenging environments”
Perception precedes action, both in the biological world and in the technologies maturing today that will bring us autonomous cars, aerial vehicles, robotic arms, and mobile platforms. The problem of probabilistic state estimation via sensor measurements takes on a variety of forms, yielding information about our own motion as well as the structure of the world around us. In this talk, I will discuss approaches my research group has been developing to estimate these quantities online and in real time in extreme environments, where dust, fog, and other visually obscuring phenomena are widespread and where sensor calibration is altered or degraded over time. These approaches include new techniques in computer vision, visual-inertial SLAM, geometric reconstruction, nonlinear optimization, and even some sensor development. The methods I discuss have an application-specific focus on ground vehicles in subterranean environments, but are also currently deployed in agriculture, search and rescue, and industrial human-robot collaboration contexts.
Chris Heckman is an Assistant Professor and the Jacques Pankove Faculty Fellow in the Department of Computer Science at the University of Colorado Boulder, where he also holds appointments in the Aerospace Engineering Sciences and Electrical and Computer Engineering departments. Professor Heckman earned his B.S. in Mechanical Engineering from UC Berkeley in 2008 and his Ph.D. in Theoretical and Applied Mechanics from Cornell University in 2012, where he was an NSF Graduate Research Fellow. He held postdoctoral appointments at the Naval Research Laboratory in Washington, D.C. as an NRC Research Associate, and in the Autonomous Robotics and Perception Group at CU Boulder as a Research Scientist, before joining the faculty there in 2016. He is currently leading one of the funded competition teams in the DARPA Subterranean Challenge; his past work has been funded by NSF, DARPA, and multiple industry partners. His research focuses on developing mathematical and systems-level frameworks for autonomous control and perception, particularly vision and sensor fusion. His work applies concepts from nonlinear dynamical systems to the design of control systems for autonomous agents, in particular ground and aquatic vehicles, enabling them to navigate uncertain and rapidly changing environments. A hallmark of his research is the implementation of these systems on experimental platforms.