Calendar

LCSR Seminar: Seth Hutchinson, “Model-Based Methods in Today’s Data-Driven Robotics Landscape” @ Hackerman B17
Apr 3 @ 12:00 pm – 1:00 pm

Model-Based Methods in Today’s Data-Driven Robotics Landscape
Seth Hutchinson, Georgia Tech

Abstract:
Data-driven machine learning methods are making advances in many long-standing problems in robotics, including grasping, legged locomotion, perception, and more. There are, however, robotics applications for which data-driven methods are less effective. Data acquisition can be expensive, time-consuming, or dangerous to the surrounding workspace, to humans in the workspace, or to the robot itself. In such cases, generating data via simulation might seem a natural recourse, but simulation methods come with their own limitations, particularly when nondeterministic effects are significant or when complex dynamics are at play, requiring heavy computation and exposing the so-called sim2real gap. Another alternative is to rely on a set of demonstrations, limiting the amount of required data by careful curation of the training examples; however, these methods fail when confronted with problems that were not represented in the training examples (so-called out-of-distribution problems), which precludes the possibility of providing provable performance guarantees.

In this talk, I will describe recent work on robotics problems that do not readily admit data-driven solutions, including flapping flight by a bat-like robot, vision-based control of soft continuum robots, a cable-driven graffiti-painting robot, and ensuring safe operation of mobile manipulators in HRI scenarios. I will describe some specific difficulties that confront data-driven methods for these problems and show how model-based approaches can provide workable solutions. Along the way, I will also discuss how judicious incorporation of data-driven machine learning tools can enhance the performance of these methods.

BIO:

Seth Hutchinson is the Executive Director of the Institute for Robotics and Intelligent Machines at the Georgia Institute of Technology, where he is also Professor and KUKA Chair for Robotics in the School of Interactive Computing. Hutchinson received his Ph.D. from Purdue University in 1988, and in 1990 joined the University of Illinois at Urbana-Champaign (UIUC), where he was a Professor of Electrical and Computer Engineering (ECE) until 2017, serving as Associate Department Head of ECE from 2001 to 2007.

Hutchinson served as President of the IEEE Robotics and Automation Society (RAS) in 2020–21. He has previously served as a member of the RAS Administrative Committee, as the Editor-in-Chief of the “IEEE Transactions on Robotics,” and as the founding Editor-in-Chief of the RAS Conference Editorial Board. He has served on the organizing committees for more than 100 conferences, has more than 300 publications on the topics of robotics and computer vision, and is a coauthor of the books “Robot Modeling and Control,” published by Wiley, “Principles of Robot Motion: Theory, Algorithms, and Implementations,” published by MIT Press, and the forthcoming “Introduction to Robotics and Perception,” to be published by Cambridge University Press. He is a Fellow of the IEEE.

 

LCSR Seminar: Glen Chou, “Toward End-to-end Reliable Robot Learning for Autonomy and Interaction” @ Hackerman B17
Apr 10 @ 12:00 pm – 1:00 pm

Abstract:

Robots must behave safely and reliably if we are to confidently deploy them in the real world around humans. To complete tasks, robots must manage a complex, interconnected autonomy stack of perception, planning, and control software. While machine learning has unlocked the potential for full-stack end-to-end control in the real world, these methods can be catastrophically unreliable. In contrast, model-based safety-critical control provides rigorous guarantees, but struggles to scale to real systems, where common assumptions, e.g., perfect task specification and perception, break down.

However, we need not choose between real-world utility and safety. By taking an end-to-end approach to safety-critical control that builds and leverages knowledge of where learned components can be trusted, we can build practical yet rigorous algorithms that can make real robots more reliable. I will first discuss how to make task specification easier and safer by learning hard constraints from human task demonstrations, and how we can plan safely with these learned specifications despite uncertainty. Then, given a task specification, I will discuss how we can reliably leverage learned dynamics and perception for planning and control by estimating where these learned models are accurate, enabling probabilistic guarantees for end-to-end vision-based control. Finally, I will provide perspectives on open challenges and future opportunities in assuring algorithms for space autonomy, including robust perception-based hybrid control algorithms for reliable data-driven robotic manipulation and human-robot collaboration.

Bio:

Glen Chou is a postdoctoral associate at MIT CSAIL, advised by Prof. Russ Tedrake. His research focuses on end-to-end safety and reliability guarantees for learning-enabled robots that operate around humans. Previously, Glen received his PhD in Electrical and Computer Engineering from the University of Michigan in 2022, where he was advised by Profs. Dmitry Berenson and Necmiye Ozay. Prior to that, he received dual B.S. degrees in Electrical Engineering and Computer Science and Mechanical Engineering from UC Berkeley in 2017. He is a recipient of the National Defense Science and Engineering Graduate (NDSEG) fellowship, the NSF Graduate Research fellowship, and is a Robotics: Science and Systems Pioneer.

Website: https://glenchou.github.io/

Zoom: Meeting ID 955 8366 7779; Passcode 530803
https://wse.zoom.us/j/95583667779

 

LCSR Seminar: David Porfirio, “Robot Application Development: A Shifting Paradigm” @ Hackerman B17
Apr 17 @ 12:00 pm – 1:00 pm

Title: Robot Application Development: A Shifting Paradigm

Abstract:
Interfaces for Robot Application Development (RAD) have proven effective at empowering non-roboticist developers (i.e., robot end users and non-robotics domain experts) to specify tasks for robots to perform. Historically, RAD has adopted development paradigms that have strong ties to traditional computer programming. With recent advancements in robot artificial intelligence, however, there is a pressing need for RAD interfaces to serve instead as communication intermediaries between the developer and the robot. As communication intermediaries, these interfaces should be designed to harness any relevant developer knowledge that is unknown to the robot, while at the same time appropriately leveraging the robot’s intelligent capabilities and communicating this information back to the developer. This talk describes two separate research threads to facilitate developer-robot communication through RAD interfaces. The first thread investigates how interfaces should be designed to appropriately leverage the robot’s knowledge and capabilities. The second thread investigates how interfaces should be designed to elicit relevant tacit knowledge from developers.

Bio:
David Porfirio is an NRC Postdoctoral Research Associate at the U.S. Naval Research Laboratory. His interests lie in designing and evaluating user interfaces that facilitate robot end-user development with the goal of making robot programming more accessible and reliable for non-roboticists. His work has been published in top-tier conferences in both human-robot interaction and human-computer interaction. Prior to his postdoctoral appointment, David received his Ph.D. in 2022 from the University of Wisconsin–Madison (UW–Madison), where he was advised by Drs. Bilge Mutlu and Aws Albarghouthi. During his Ph.D., he was supported by the NSF GRFP, Microsoft Dissertation Grant, and Cisco Wisconsin Distinguished Graduate Fellowship.

 

LCSR Seminar: Marin Kobilarov and Louis Whitcomb, “Interviewing for Jobs in Academia and Industry” @ Hackerman B17
Apr 24 @ 12:00 pm – 1:00 pm

Abstract:
This LCSR Professional Development Seminar focuses on essential skills for interviewing for technical jobs in industry and in academia.

BIO:

Marin Kobilarov is an Associate Professor at the Johns Hopkins University and a Principal Engineer at Zoox/Amazon. At JHU he leads the Autonomous Systems, Control and Optimization (ASCO) Lab, which develops algorithms and software for planning, learning, and control of autonomous robotic systems. The lab focuses on computational theory at the intersection of planning and learning, and on the system integration and deployment of robots that can operate safely and efficiently in challenging environments.

 


Louis Whitcomb is Professor of Mechanical Engineering at the Johns Hopkins University School of Engineering, and Adjunct Scientist at the Woods Hole Oceanographic Institution. His research focuses on the navigation, dynamics, and control of robot systems in extreme environments. He is the founding Director (2007–2013) of the JHU Laboratory for Computational Sensing and Robotics and former Chair (2013–2017) of the JHU Department of Mechanical Engineering. He has received numerous best-paper and teaching awards. He is a Fellow of the IEEE.

 

 

Bonus LCSR Seminar: Kaushik Jayaram, “Towards robust and autonomous locomotion in cluttered terrain using insect-scale robots” @ Latrobe 107
May 1 @ 12:00 pm – 1:00 pm

Talk Title: Towards robust and autonomous locomotion in cluttered terrain using insect-scale robots

 

Talk Abstract: Animals such as mice, cockroaches, and spiders have the remarkable ability to maneuver through challenging, cluttered natural terrain and have been the inspiration for adaptable legged robotic systems. Recent biological research further indicates that body reorientation along pathways of minimal energy is a key factor influencing such locomotion. We propose to extend this idea by hypothesizing that the body compliance of soft-bodied animals and robots might be an alternative yet effective strategy for squeezing through cluttered obstacles. We present some early results using the Compliant Legged Autonomous Robotic Insect (CLARI), our novel, insect-scale, origami-based quadrupedal robot. While the distributed compliance of such soft-legged robots enables them to explore complex environments, their gait design, control, and motion planning are often challenging due to a large number of unactuated and underactuated degrees of freedom. Towards addressing this issue, we present a geometric motion planning framework for autonomous, closed-kinematic-chain articulated systems that is computationally efficient and shows promising potential for onboard, real-time gait generation.

Biography: Dr. Kaushik Jayaram is an Assistant Professor in Robotics in the Paul M. Rady Department of Mechanical Engineering at the University of Colorado Boulder. Previously, he was a postdoctoral scholar in Prof. Rob Wood’s Microrobotics Lab at Harvard University. He obtained his doctoral degree in Integrative Biology from the University of California, Berkeley, in 2015, mentored by Prof. Bob Full, and his undergraduate degree in Mechanical Engineering from the Indian Institute of Technology Bombay in 2009, with interdisciplinary research experiences at the University of Bielefeld, Germany, and the École Polytechnique Fédérale de Lausanne, Switzerland. Dr. Jayaram’s research combines biology and robotics to uncover the principles of robustness that make animals successful at locomotion in natural environments and, in turn, to inspire the design of the next generation of novel robots for effective real-world operation. His work has been published in a number of prestigious journals and has gained significant popular media attention. Besides academic research, Dr. Jayaram’s group is actively involved in several outreach activities that strive toward achieving diversity, equity, and inclusivity in STEM.

Computer-Integrated Surgery II Poster & Demo Session @ Hackerman Hall
May 1 @ 1:00 pm – 5:00 pm
May the 4th Be With You @ Chinese Pavilion
May 4 @ 12:00 pm – 2:00 pm
