Bats have a complex skeletal morphology, with both ball-and-socket and revolute joints that interconnect the bones and muscles to create a musculoskeletal system with over 40 degrees of freedom, some of which are passive. Replicating this biological system in a small, lightweight, low-power air vehicle is not only infeasible, but also undesirable; trajectory planning and control for such a system would be intractable, precluding both the synthesis of complex agile maneuvers and real-time control. Thus, our goal is to design a robot whose kinematic structure is topologically much simpler than a bat’s, while still providing the ability to mimic the bat-wing morphology during flapping flight, and to find optimal trajectories that exploit the natural system dynamics, enabling effective controller design.
The kinematic design of our robot is driven by motion capture experiments using live bats. In particular, we use principal component analysis to capture the essential bat-wing shape information, and solve a nonlinear optimization problem to determine the optimal kinematic parameters for a simplified parallel kinematic wing structure. We then derive the Lagrangian dynamic equations for this system, along with a model for the aerodynamic forces. We use a shooting-based optimizer to locate physically feasible, periodic solutions to this system, and then derive an event-based control scheme to track the desired trajectory. We demonstrate our results with flight experiments on our robotic bat.
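The first step of this pipeline, extracting dominant wing-shape modes from motion-capture data via principal component analysis, can be sketched as follows. This is a minimal illustration, not the authors' code; the function name, the synthetic marker data, and the mode count are all hypothetical.

```python
import numpy as np

def wing_shape_modes(marker_data, n_modes=3):
    """Reduce motion-capture wing poses to a few principal shape modes.

    marker_data: (n_frames, n_coords) array, each row a flattened set of
    3-D marker positions recorded during a wingbeat. Returns the mean
    pose, the top n_modes principal directions, and the fraction of pose
    variance each mode explains.
    """
    mean_pose = marker_data.mean(axis=0)
    centered = marker_data - mean_pose
    # SVD of the centered data yields the principal components directly.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variance = s**2 / np.sum(s**2)
    return mean_pose, vt[:n_modes], variance[:n_modes]

# Synthetic example: 200 frames of 10 markers (30 coordinates), dominated
# by a single sinusoidal flapping mode plus small measurement noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
mode = rng.standard_normal(30)
frames = np.outer(np.sin(t), mode) + 0.01 * rng.standard_normal((200, 30))
mean_pose, modes, var = wing_shape_modes(frames)
print(var[0])  # the first mode captures nearly all of the variance
```

A low-dimensional basis like this is what makes the subsequent kinematic-parameter optimization tractable: the robot's simplified linkage only needs to reproduce a handful of dominant modes rather than the full 40-plus degrees of freedom.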
Seth Hutchinson is Professor and KUKA Chair for Robotics in the School of Interactive Computing at the Georgia Institute of Technology, where he also serves as Associate Director of the Institute for Robotics and Intelligent Machines. His research in robotics spans the areas of planning, sensing, and control. He has published more than 200 papers on these topics, and is coauthor of the books “Principles of Robot Motion: Theory, Algorithms, and Implementations,” published by MIT Press, and “Robot Modeling and Control,” published by Wiley.
Hutchinson currently serves on the editorial board of the International Journal of Robotics Research and chairs the steering committee of the IEEE Robotics and Automation Letters. He was Founding Editor-in-Chief of the IEEE Robotics and Automation Society’s Conference Editorial Board (2006-2008) and Editor-in-Chief of the IEEE Transactions on Robotics (2008-2013).
Hutchinson is an Emeritus Professor of Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign, where he was Professor of ECE until 2018, serving as Associate Head for Undergraduate Affairs from 2001 to 2007. He received his Ph.D. from Purdue University in 1988. Hutchinson is a Fellow of the IEEE.
The functionality of artificial manipulators could be enhanced by artificial “haptic intelligence” that enables the identification of object features via touch for semi-autonomous decision-making and/or display to a human operator. This could be especially useful when complementary sensory modalities, such as vision, are unavailable. I will highlight past and present work to enhance the functionality of artificial hands in human-machine systems. I will describe efforts to develop multimodal tactile sensor skins, and to teach robots how to haptically perceive salient geometric features such as edges and fingertip-sized bumps and pits using machine learning techniques. I will describe the use of reinforcement learning to teach robots goal-based policies for a functional contour-following task: the closure of a ziplock bag. Our Contextual Multi-Armed Bandits approach tightly couples robot actions to the tactile and proprioceptive consequences of the actions, and selects future actions based on prior experiences, the current context, and a functional task goal. Finally, I will describe current efforts to develop real-time capabilities for the perception of tactile directionality, and to develop models for haptically locating objects buried in granular media. Real-time haptic perception and decision-making capabilities could be used to advance semi-autonomous robot systems and reduce the cognitive burden on human teleoperators of devices ranging from wheelchair-mounted robots to explosive ordnance disposal robots.
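The contextual-bandit formulation described above couples a sensed context to action selection and reward-driven updates. A toy epsilon-greedy sketch of the idea follows; it is illustrative only, and the contexts, actions, and reward values are hypothetical stand-ins for tactile states, re-grasp motions, and task progress (the actual work uses a Contextual Multi-Armed Bandits approach, not necessarily this update rule).

```python
import random
from collections import defaultdict

class ContextualBandit:
    """Toy contextual multi-armed bandit with epsilon-greedy selection.

    Contexts stand in for discretized tactile/proprioceptive states;
    actions stand in for candidate manipulation primitives.
    """
    def __init__(self, actions, epsilon=0.1):
        self.actions = actions
        self.epsilon = epsilon
        self.counts = defaultdict(int)    # (context, action) -> pulls
        self.values = defaultdict(float)  # (context, action) -> mean reward

    def select(self, context):
        if random.random() < self.epsilon:
            return random.choice(self.actions)  # explore
        # Exploit: best estimated reward for this context so far.
        return max(self.actions, key=lambda a: self.values[(context, a)])

    def update(self, context, action, reward):
        key = (context, action)
        self.counts[key] += 1
        # Incremental running-mean update.
        self.values[key] += (reward - self.values[key]) / self.counts[key]

random.seed(1)
bandit = ContextualBandit(actions=["pinch", "slide", "regrasp"])
# Simulated task: "slide" works best when the tactile context is "on_seam".
for _ in range(500):
    ctx = random.choice(["on_seam", "off_seam"])
    a = bandit.select(ctx)
    reward = 1.0 if (ctx == "on_seam" and a == "slide") else 0.1
    bandit.update(ctx, a, reward)

bandit.epsilon = 0.0  # switch off exploration to read out the policy
print(bandit.select("on_seam"))  # prints the learned best action
```

The key property mirrored here is the tight coupling the abstract describes: each action's value estimate is conditioned on the sensory context in which it was taken, so the same action can be preferred in one tactile state and avoided in another.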
Veronica J. Santos is an Associate Professor in the Mechanical and Aerospace Engineering Department at the University of California, Los Angeles, and Director of the UCLA Biomechatronics Lab (http://BiomechatronicsLab.ucla.edu). Dr. Santos received her B.S. in mechanical engineering with a music minor from the University of California at Berkeley (1999), was a Quality and R&D Engineer at Guidant Corporation, and received her M.S. and Ph.D. in mechanical engineering with a biometry minor from Cornell University (2007). As a postdoc at the University of Southern California, she contributed to the development of a biomimetic tactile sensor for prosthetic hands. From 2008 to 2014, Dr. Santos was an Assistant Professor of Mechanical and Aerospace Engineering at Arizona State University. Her research interests include human hand biomechanics, human-machine systems, haptics, tactile sensors, machine perception, prosthetics, and robotics for grasp and manipulation. Dr. Santos was selected for an NSF CAREER Award (2010), three engineering teaching awards (2012, 2013, 2017), an ASU Young Investigator Award (2014), and as a National Academy of Engineering Frontiers of Engineering Education Symposium participant (2010). She currently serves as an Editor for the IEEE International Conference on Robotics and Automation (2017-2019), an Associate Editor for the ASME Journal of Mechanisms and Robotics (2016-2019), and an Associate Editor for the ACM Transactions on Human-Robot Interaction (2018-2021).
Perception precedes action, both in the biological world and in the technologies maturing today that will bring us autonomous cars, aerial vehicles, robotic arms, and mobile platforms. The problem of probabilistic state estimation via sensor measurements takes on a variety of forms, yielding information about our own motion as well as the structure of the world around us. In this talk, I will discuss some approaches that my research group has been developing that focus on estimating these quantities online and in real-time in extreme environments where dust, fog, and other visually obscuring phenomena are widely present, and where sensor calibration is altered or degraded over time. These approaches include new techniques in computer vision, visual-inertial SLAM, geometric reconstruction, nonlinear optimization, and even some sensor development. The methods I discuss have an application-specific focus on ground vehicles in the subterranean environment, but are also currently deployed in agriculture, search and rescue, and industrial human-robot collaboration contexts.
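The recursive fusion at the heart of probabilistic state estimation can be illustrated with the simplest possible case, a one-dimensional Kalman filter. This is a toy stand-in for the kind of estimation that underlies visual-inertial SLAM, not any method from the talk; the noise variances `q` and `r` and the data are assumed for illustration.

```python
import random

def kalman_1d(z_measurements, q=0.01, r=1.0):
    """Fuse noisy scalar measurements into a probabilistic state estimate.

    q: assumed process-noise variance; r: assumed measurement-noise
    variance. Returns the sequence of posterior mean estimates.
    """
    x, p = 0.0, 10.0       # initial state estimate and its variance
    estimates = []
    for z in z_measurements:
        p += q             # predict: uncertainty grows between updates
        k = p / (p + r)    # Kalman gain: how much to trust the measurement
        x += k * (z - x)   # update: move the estimate toward the measurement
        p *= (1 - k)       # posterior variance shrinks after the update
        estimates.append(x)
    return estimates

# Noisy observations of a true value of 5.0.
random.seed(0)
zs = [5.0 + random.gauss(0, 1.0) for _ in range(100)]
est = kalman_1d(zs)
print(est[-1])  # converges near 5.0 despite unit-variance noise
```

The same predict/update structure scales to the multi-dimensional, nonlinear estimators used in practice; the challenge in dust- and fog-degraded environments is that the measurement model itself becomes unreliable, which motivates the robust techniques the talk describes.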
Chris Heckman is an Assistant Professor and the Jacques Pankove Faculty Fellow in the Department of Computer Science at the University of Colorado at Boulder, where he also holds appointments in the Aerospace Engineering Sciences and Electrical and Computer Engineering departments. Professor Heckman earned his B.S. in Mechanical Engineering from UC Berkeley in 2008 and his Ph.D. in Theoretical and Applied Mechanics from Cornell University in 2012, where he was an NSF Graduate Research Fellow. He held postdoctoral appointments at the Naval Research Laboratory in Washington, D.C. as an NRC Research Associate, and in the Autonomous Robotics and Perception Group at CU Boulder as a Research Scientist, before joining the faculty there in 2016. He currently leads one of the funded competition teams in the DARPA Subterranean Challenge; his past work has been funded by NSF, DARPA, and multiple industry partners. His research focuses on developing mathematical and systems-level frameworks for autonomous control and perception, particularly vision and sensor fusion. His work applies concepts of nonlinear dynamical systems to the design of control systems for autonomous agents, in particular ground and aquatic vehicles, enabling them to navigate uncertain and rapidly changing environments. A hallmark of his research is the implementation of these systems on experimental platforms.