LCSR Seminar: Brian Bittner “Data-driven geometric mechanics: top-down tools for in situ robotic modeling and adaptation to injury”

When:
April 13, 2022 @ 12:00 pm – 1:00 pm
Where:
https://wse.zoom.us/s/94623801186
Contact:
Ashley Moriarty


Abstract:

Many successful approaches to robotic locomotion and manipulation rely on high-quality simulation tools. Such approaches are often “bottom-up” in a modeling sense, accounting explicitly for all internal forces and environmental interactions. These “bottom-up” models are used either offline (as in reinforcement learning) or in real time. However, many robots are getting smaller, softer, and more complex (e.g., bio-hybrid actuators). Some lean on low-precision manufacturing and fabrication techniques, and many are now being asked to operate in hard-to-characterize natural interfaces such as the human body. These attributes can render “bottom-up” simulators impractical on various research frontiers, such as micro-biomedical robots and soft robots deployed in uncharacterized environments.

In this talk I will revisit the reconstruction equation, a result from the geometric mechanics literature that offers a “top-down” view of Lagrangian systems, permitting insight into generalizable system behaviors along a spectrum of friction-momentum dominance. I will show how these tools permit rapid modeling of high-complexity robots in their operating environment, without requiring CAD models or explicit specification of forces. I will also discuss a strength, and a related weakness, of the approach resulting from its use of symmetries. Surprisingly, results in simulation and on hardware indicate that even for eight-jointed systems, useful behavioral models can be computed from tens of cycles of data. This suggests that high-degree-of-freedom robots can adapt and excel in situations where explicit force models are poorly understood. I will also briefly discuss a framework for robot recovery that leans on these tools, as well as a metric for a robot’s ability to cover the local space of motions, computed on the Lie algebra of the position space.
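For readers unfamiliar with it, the reconstruction equation commonly appears in the geometric mechanics literature in the general form below (notation varies by author; this is the conventional presentation, not necessarily the exact form used in the talk):

```latex
% Body velocity reconstructed from shape velocity and generalized momentum:
%   \xi : body velocity (a Lie-algebra element), g : position in the symmetry group,
%   r : shape variables, A(r) : local connection, \Gamma(r), p : momentum terms
\xi = g^{-1}\dot{g} = -A(r)\,\dot{r} + \Gamma(r)\,p
% In the friction-dominated (kinematic) limit, momentum decays immediately, leaving
\xi = -A(r)\,\dot{r}
```

The two limits bracket the friction-momentum spectrum mentioned above: friction-dominated systems are governed by the connection term alone, while momentum-dominated systems retain the momentum contribution.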
The metric allows motion primitives to be valued for their contribution to the space of composed motions rather than only their individual qualities. Results include a Dubins car that learns how to turn left (with its steering restricted to turning right) in under a second, and a robot made of tree branches that learns to walk around the laboratory with under twelve minutes of experimental data. I hope to motivate the general use of structural reductions as we pursue modeling and control of the next generation of high-complexity robots.
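The left-turn-from-right-turns result can be previewed with a toy kinematic sketch (my own illustration, not the talk’s method or data): a Dubins car whose steering is restricted to right turns can still realize a net left heading change by composing right turns, e.g. holding a 270° right turn.

```python
import math

def simulate_dubins(turn_rate, speed, dt, steps, x=0.0, y=0.0, theta=0.0):
    """Integrate simple Dubins-car kinematics: x' = v cos(theta), y' = v sin(theta), theta' = u."""
    for _ in range(steps):
        x += speed * math.cos(theta) * dt
        y += speed * math.sin(theta) * dt
        theta += turn_rate * dt
    return x, y, theta

# Steering restricted to right turns only (u < 0): a sustained 270-degree right
# turn leaves the heading rotated 90 degrees to the LEFT of the start, mod 2*pi.
u = -1.0                      # right turn only, unit turn rate (assumed)
T = 1.5 * math.pi             # duration of a 270-degree right turn at |u| = 1
dt = 1e-3
x, y, theta = simulate_dubins(u, 1.0, dt, int(T / dt))
net_heading = theta % (2 * math.pi)
print(round(net_heading, 3))  # approximately pi/2: a net left turn
```

This captures only the trivial composition; the talk’s contribution is a metric that scores primitives by how well their compositions cover the local motion space, which is what lets such maneuvers be discovered quickly.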


Biography:

Dr. Brian Bittner received a B.S. from Carnegie Mellon and a Ph.D. from Michigan, where he researched the theory, simulation, and application of physics-informed machine learning for in situ behavior modeling and optimization. He has sought out cross-disciplinary research environments, collaborating with physicists, biologists, and mathematicians to carry insights from these fields into robotic systems. Bittner is currently a research scientist at the Applied Physics Lab, working on approaches to modeling and control for soft robots and underwater manipulation.


Johns Hopkins University, Whiting School of Engineering

3400 North Charles Street, Baltimore, MD 21218-2608

Laboratory for Computational Sensing + Robotics