Calendar

Nov
6
Wed
LCSR Seminar: Xinyan Deng “Learn to Fly Like a Hummingbird by an At-scale Bio-inspired Robot: The Highly Robust, Resilient, and Maneuverable Flapping Flight” @ Hackerman B-17
Nov 6 @ 12:00 pm – 1:00 pm

Abstract

Flying insects and hummingbirds demonstrate remarkable aerial maneuverability, robustness, and resilience to their environment and to morphological changes. Upon a looming threat, a hummingbird can perform a rapid 180-degree escape turn in just six wingbeats; a hawk moth can adjust to real-time wing area loss during hovering; migrating butterflies can tolerate wind gust disturbances while flying thousands of miles. The interest in micro air vehicles capable of hovering and fast maneuvers has led to several efforts to develop bio-inspired insect and hummingbird robots. However, only through an understanding of their underlying flight mechanisms can we create novel robots with the key bio-inspired principles that allow them to approach the performance of their natural counterparts. To this end, we use a combined approach of dynamics and control theory, fluid experiments, and robotic platforms. In this talk I will highlight our recent findings, including: 1) learning extreme maneuvers, such as rapid escape and tight body flips, on an at-scale hummingbird robot equipped with just two actuators; 2) sensing through flapping wings and their resilience to cluttered environments and dynamic morphological damage; 3) flapping wings in turbulence and their gust-mitigating potential.

 

Biography

Xinyan Deng is an Associate Professor in the School of Mechanical Engineering at Purdue University. She received her B.S. degree from the School of Electrical Engineering and Automation at Tianjin University, and her Ph.D. degree from the Department of Mechanical Engineering at the University of California, Berkeley. Her background is in controls and robotics, and her research interests include the principles of aerial and aquatic locomotion in animals, bio-inspired robots, and cyber-physical security of autonomous systems. She received the NSF CAREER Award in 2006 for flying insect and robot research, and the B.F.S. Schaefer Outstanding Faculty Scholar Award from Purdue University in 2015. Her work is highly interdisciplinary and has appeared in top robotics, biology, fluids, and computer science journals and conferences. She served as Co-Chair of the Technical Committee on Bio-robotics of the IEEE Robotics and Automation Society from 2009 to 2013, and has chaired and co-chaired various IEEE and ASME conference workshops, NSF workshops, and conference symposia and sessions on bio-inspired robotics. Her research has been funded by federal agencies including NSF, AFOSR, AFRL, and ONR.

 

 LCSR Seminar Video Link

Nov
13
Wed
LCSR Seminar: Juan Wachs “The Cyber Touch – Empowering Medical Robots Through Gestures” @ Hackerman B-17
Nov 13 @ 12:00 pm – 1:00 pm

Abstract

At present, the only robots in the operating room are those that extend surgeons' capabilities through tele-operation, such as the da Vinci robot. A new type of robot is now emerging that understands natural language, and especially non-verbal language such as gestures, which is the main form of interaction in the operating room. It also turns out that medics and first responders use a combination of communication modalities to collaborate with robots outside the OR, in austere settings such as the battlefield or rural areas. Endowing robots with the ability to recognize intention through body language and to predict and inform the surgical team about future surgical tasks is a key challenge in trauma care. In this talk, I will highlight three applications that showcase robots working with doctors in a semi-autonomous manner. This work has applications to the DoD and was made possible through collaborations with hospitals: the Indiana University School of Medicine, the Naval Medical Center Portsmouth in Virginia, and Womack Army Medical Center at Fort Bragg, North Carolina. Significant breakthroughs in this research led to major publications, including in Annals of Surgery and Surgery. News coverage of this work includes NPR's “Surgical Technology Aims to Mimic ‘Teleporting’” and Inside Indiana Business (September 28, 2015); more recently it was featured in WIRED magazine, “How Technology is Helping Surgeons Collaborate from Across the World” (July 2018), and on Inside Indiana Business with Gerry Dick (TV show, October 4, 2018). The research showcased is supported through the kind generosity of the DoD and NSF.

 

Biosketch

Dr. Juan Wachs is the James A. and Sharon M. Tompkins Rising Star Associate Professor in the School of Industrial Engineering at Purdue University, Professor of Biomedical Engineering (by courtesy), and an Adjunct Associate Professor of Surgery at the IU School of Medicine. He is the director of the Intelligent Systems and Assistive Technologies (ISAT) Lab at Purdue and is affiliated with the Regenstrief Center for Healthcare Engineering. He completed postdoctoral training at the Naval Postgraduate School’s MOVES Institute under a National Research Council Fellowship from the National Academies of Sciences. Dr. Wachs received his B.Ed.Tech. in Electrical Education from the ORT Academic College, at the Hebrew University of Jerusalem campus, and his M.Sc. and Ph.D. in Industrial Engineering and Management from Ben-Gurion University of the Negev, Israel. He is the recipient of the 2013 Air Force Young Investigator Award, a 2015 Helmsley Senior Scientist Fellowship, a 2016 Fulbright U.S. Scholar award, the James A. and Sharon M. Tompkins Rising Star Associate Professorship (2017), and an ACM Distinguished Speaker appointment (2018). He is also an Associate Editor of IEEE Transactions on Human-Machine Systems and of Frontiers in Robotics and AI.

 

 

 LCSR Seminar Video Link

Nov
20
Wed
LCSR Seminar: Guoquan Huang “Visual-Inertial State Estimation” @ Hackerman B-17
Nov 20 @ 12:00 pm – 1:00 pm

Abstract:

As autonomous vehicles emerge in many different application domains, from self-driving cars and drone delivery to underwater survey, state estimation, one of the most important enabling technologies for autonomous systems, becomes more important than ever before. While tremendous progress in autonomous navigation has been made in the past decades, many challenges still remain. For example, many current state estimation algorithms for robot localization tend to become inconsistent (i.e., the state estimates are biased and the error covariance estimates differ from the true ones), causing mission failure within a short period of time. When the resources available to vehicles are limited, designing consistent, efficient estimators becomes even more challenging. In this talk, I will present some of our recent work on taking up these challenges. I will discuss our observability-based methodology for improving estimation consistency, as well as deep learning for loop closure, in the context of simultaneous localization and mapping (SLAM) and visual-inertial navigation systems (VINS). In particular, I will highlight our recent results on visual-inertial state estimation and its extensions.

 

Bio:

Guoquan (Paul) Huang is currently an Assistant Professor of Mechanical Engineering (ME), Electrical and Computer Engineering (ECE), and Computer and Information Sciences (CIS) at the University of Delaware (UD), where he leads the Robot Perception and Navigation Group (RPNG). He also holds an Adjunct Professor position at Zhejiang University, China. He was a Senior Consultant (2016-2018) at the Huawei 2012 Laboratories and a Postdoctoral Associate (2012-2014) at MIT CSAIL (Marine Robotics). He received the B.Eng. (2002) in Automation (Electrical Engineering) from the University of Science and Technology Beijing, China, and the M.Sc. (2009) and Ph.D. (2013) in Computer Science from the University of Minnesota. From 2003 to 2005, he was a Research Assistant in the Department of Electrical Engineering at Hong Kong Polytechnic University. His research interests include sensing, localization, mapping, perception, and navigation of autonomous ground, aerial, and underwater vehicles. Dr. Huang received the 2006 Academic Excellence Fellowship from the University of Minnesota, the 2011 Chinese Government Award for Outstanding Self-Financed Students Abroad, the 2015 UD Research Award (UDRF), the 2016 NSF CRII Award, a 2017 UD Makerspace Faculty Fellowship, selection for the 2018 SATEC Robotics Delegation (one of ten US experts invited by ASME), the 2018 Google Daydream Faculty Research Award, and the 2019 Google AR/VR Faculty Research Award, and was a finalist for the 2009 Best Paper Award at the Robotics: Science and Systems (RSS) conference.

 

 LCSR Seminar Video Link

Dec
4
Wed
LCSR Seminar: Career Services @ Hackerman B-17
Dec 4 @ 12:00 pm – 1:00 pm

Abstract:

TBA

 

Bio:

TBA

 

 LCSR Seminar Video Link

Jan
29
Wed
LCSR Seminar: Robert Pless “Supporting Sex Trafficking Investigations with Deep Metric Learning” @ Hackerman B-17
Jan 29 @ 12:00 pm – 1:00 pm

Abstract:

This talk shares work to develop TraffickCam, a system that supports sex trafficking investigations by recognizing the hotel rooms in pictures of trafficking victims. I’ll share context for this project and the ways this system is currently being used at the National Center for Missing and Exploited Children, as well as special challenges that come from this problem domain, such as dramatic differences among rooms within a hotel and the similarity of rooms across chains. Attacking these problems led us to specific improvements in large-scale classification with deep metric learning, including novel training algorithms, visual explainability, and new visualization approaches to compare and understand the representations these models learn.

Bio:

Robert Pless is the Patrick and Donna Martin Professor and Chair of Computer Science at George Washington University. Dr. Pless was born at Johns Hopkins Hospital in 1972, received a bachelor’s degree in Computer Science from Cornell University in 1994, and earned a Ph.D. from the University of Maryland, College Park in 2000. He was on the Computer Science faculty at Washington University in St. Louis from 2000 to 2017. His research focuses on geometric and statistical computer vision.

 

 This talk will be recorded. Click Here for all of the recorded seminars for the 2019-2020 academic year.

Feb
5
Wed
LCSR Seminar: Debra Mathews “Implementing ethics in autonomous systems” @ Hackerman B-17
Feb 5 @ 12:00 pm – 1:00 pm

Abstract:
AI is moving both faster and slower than the general public realizes — it is being deployed extensively across many domains, but we are also still very, very far away from the sorts of autonomous robots imagined by Asimov, or by anyone who watched the Jetsons as a child or watches Westworld today. Overlapping sets of ethical issues are raised both by the current stages and domains of AI development and use, and by the autonomy and uses we imagine and hope AI will have in the future. What is crucial is that we appreciate and attend to these issues today, so that we can maximize the benefits of this technology for humanity while minimizing the harms.

Bio:
Debra JH Mathews, PhD, MA, is the Assistant Director for Science Programs at the Johns Hopkins Berman Institute of Bioethics and an Associate Professor in the Department of Pediatrics, Johns Hopkins School of Medicine. Dr. Mathews earned her PhD in genetics from Case Western Reserve University; concurrent with her PhD, she earned a Master’s degree in bioethics, also from Case. She completed a postdoctoral fellowship in genetics at Johns Hopkins, and the Greenwall Fellowship in Bioethics and Health Policy at Johns Hopkins and Georgetown Universities. Dr. Mathews has also spent time at the Genetics and Public Policy Center, the US Department of Health and Human Services, and the Presidential Commission for the Study of Bioethical Issues, working in various capacities on science policy. Dr. Mathews’s academic work focuses on ethics and policy issues raised by emerging biotechnologies, with particular focus on genetics, stem cell science, neuroscience, and synthetic biology.

 

 This talk will be recorded. Click Here for all of the recorded seminars for the 2019-2020 academic year.

Feb
12
Wed
LCSR Seminar: Rebecca Kramer-Bottiglio “From Particles to Parts–Building Multifunctional Robots with Programmable Robotic Skins” @ Hackerman B-17
Feb 12 @ 12:00 pm – 1:00 pm

Abstract:

Robots generally excel at specific tasks in structured environments, but lack the versatility and adaptability required to interact with and locomote within the natural world. To increase versatility in robot design, my research group is developing soft robotic skins that can wrap around arbitrary deformable objects to induce the desired motions and deformations. The robotic skins integrate programmable composites to embed actuation and sensing into planar substrates that may be applied to, removed from, and transferred between different objects to create a multitude of controllable robots with different functions. During this talk, I will demonstrate the versatility of this soft robot design approach by showing robotic skins in a wide range of applications – including manipulation tasks, locomotion, and wearables – using the same 2D robotic skins reconfigured on the surfaces of various soft, inanimate 3D objects. Further, I will present recent work on programmable composites derived from novel functional particulates that address the emerging need for variable stiffness properties, variable-trajectory motions, and embedded computation within the soft robotic skins.

 

Bio:

Rebecca Kramer-Bottiglio is the John J. Lee Assistant Professor of Mechanical Engineering and Materials Science at Yale University. She completed her B.S. at the Johns Hopkins University, M.S. at U.C. Berkeley, and Ph.D. at Harvard University. Prior to joining the faculty at Yale, she was an Assistant Professor of Mechanical Engineering at Purdue University. She currently serves as an Associate Editor of Soft Robotics, Frontiers in Robotics and AI, IEEE Robotics and Automation Letters, and Multifunctional Materials, and is an IEEE Distinguished Lecturer. She is the recipient of the NSF CAREER Award, the NASA Early Career Faculty Award, the AFOSR Young Investigator Award, the ONR Young Investigator Award, and the Presidential Early Career Award for Scientists and Engineers (PECASE), and was named to Forbes’ 30 under 30 list in 2015.

 

 This talk will be recorded. Click Here for all of the recorded seminars for the 2019-2020 academic year.

Feb
19
Wed
LCSR Seminar: Cornelia Fermüller “Action perception at multiple time scales” @ Hackerman B-17
Feb 19 @ 12:00 pm – 1:00 pm

Abstract:

Understanding human activity is a very challenging task, but a prerequisite for the autonomy of robots interacting with humans. Solutions that generalize must involve not only perception but also cognition and a grounding in the motor system. Our approach is to describe complex actions as events at multiple time scales. At the lowest level, signals are chunked into primitive symbolic events, and these are then combined into increasingly complex events spanning longer and longer time periods. The approach will be demonstrated through our work on creating visually learning robots, and the talk will describe some of its novel components: an architecture in which cognitive and linguistic processes communicate with the vision and motor systems in a dialog fashion; vision processes that parse objects and movements based on their attributes, spatial relations, and 3D geometry; the combination of tactile sensing with vision for better recognition; and approaches to cover long-term relations in observed activities.

 

Bio:

Cornelia Fermüller is a research scientist at the Institute for Advanced Computer Studies (UMIACS) at the University of Maryland at College Park.  She holds a Ph.D. from the Technical University of Vienna, Austria and an M.S. from the University of Technology, Graz, Austria, both in Applied Mathematics.  She co-founded the Autonomy Cognition and Robotics (ARC) Lab and co-leads the Perception and Robotics Group at UMD. Her research is in the areas of Computer Vision, Human Vision, and Robotics. She studies and develops biologically inspired Computer Vision solutions for systems that interact with their environment. In recent years, her work has focused on the interpretation of human activities, and on motion processing for fast active robots (such as drones) using as input bio-inspired event-based sensors.

http://users.umiacs.umd.edu/users/fer

 

 This talk will be recorded. Click Here for all of the recorded seminars for the 2019-2020 academic year.

Feb
26
Wed
LCSR Seminar: Henry Astley “Using Robotic Models To Explore The Evolution Of Functional Morphology” @ Hackerman B-17
Feb 26 @ 12:00 pm – 1:00 pm

Abstract:

Living organisms face a wide range of physical challenges in their environments, yet frequently display exceptional performance. This performance is often correlated with morphological features, physiological differences, or particular behaviors, leading to the hypothesis that these traits are adaptations to improve performance. However, rigorously testing for adaptation is extremely difficult, as a particular trait may be suboptimal due to a lack of selective pressure, subject to tradeoffs and evolutionary constraints, or even entirely non-adaptive. Furthermore, it can be difficult to determine the function of some traits at all, as they may not be amenable to experimental manipulation or comparative analysis. Techniques and tools from engineering, however, are allowing biologists to test the functional consequences of previously untestable physical and behavioral traits, and even to explore the performance consequences of alternative versions of traits. This can lead to a broader understanding of the trait itself and of the evolutionary pressures acting upon it, past and present. This talk will present several examples of how 3D printing and robotics have been used to establish the functional consequences of enigmatic morphologies and behaviors in snakes, early tetrapods, and fish, and will demonstrate the power of these techniques for providing biological insight.

 

Bio:

Henry Astley is currently an Assistant Professor at the University of Akron’s Biomimicry Research & Innovation Center (BRIC), working on animal locomotion and biomimetic robotics. Dr. Astley initially completed a B.S. in Aerospace Engineering at the Florida Institute of Technology before switching fields and completing a second B.S. and an M.S. in biology at the University of Cincinnati, focusing on arboreal snake locomotion. He did his Ph.D. on frog jumping at Brown University, followed by a postdoc at the Georgia Institute of Technology focusing on locomotion in granular media.

 

 This talk will be recorded. Click Here for all of the recorded seminars for the 2019-2020 academic year.

Johns Hopkins University

Johns Hopkins University, Whiting School of Engineering

3400 North Charles Street, Baltimore, MD 21218-2608

Laboratory for Computational Sensing + Robotics