The open ocean is a massive 3D ecosystem responsible for absorbing much of Earth’s excess heat and the CO2 emissions produced by humans. A portion of the ocean’s carbon pump sequesters atmospheric carbon into the sediments of the deep sea. Quantifying the amount of this carbon exported to the deep and identifying the variables driving that export is vital to understanding how we might better mitigate the deleterious effects of climate change. The Monterey Bay Aquarium Research Institute (MBARI) has developed high-endurance mobile robots to investigate ocean carbon transport. One of its vehicles, the Benthic Rover, has been working continuously on the seafloor at 4,000 m for six years, measuring the spatial and temporal variability of carbon export from the surface. This long-term dataset has revealed that carbon enters the deep sea in large pulses of sinking detritus. MBARI is now focused on connecting these carbon pulses to processes in the upper layers of the ocean. Exploring, mapping, and sampling the upper water column to uncover ocean productivity hotspots (HS) is a key goal requiring the collaboration of MBARI’s Long Range Autonomous Underwater Vehicles (LRAUVs) as well as other complementary vehicles able to measure the full ecology of the hotspots, from the microbes to the whales.
Brett W. Hobson received a BS in Mechanical Engineering from San Francisco State University in 1989. He began his ocean engineering career at Deep Ocean Engineering in San Leandro, California, developing remotely operated vehicles (ROVs) and manned submarines. In 1992, he helped start and run Deep Sea Discoveries, where he helped develop and operate deep-towed sonar and camera systems offshore of the US, Venezuela, Spain, and the Philippines. In 1998, he joined Nekton Research in North Carolina to develop bio-inspired underwater vehicles for Navy applications. After the sale of Nekton Research to iRobot in 2005, Hobson joined the Monterey Bay Aquarium Research Institute (MBARI), where he leads the Long Range Autonomous Underwater Vehicle (AUV) program, overseeing the development and science operations of a fleet of AUVs. He also helped develop MBARI’s long-endurance seafloor-crawling Benthic Rover. Hobson holds a patent on the design of a biomimetic underwater vehicle and has been the Co-PI on large projects funded by NSF, NASA, and DHS aimed at developing novel underwater vehicles for ocean science.
Benjamin D. Killeen “An Autonomous X-ray Image Acquisition and Interpretation System for Assisting Percutaneous Pelvic Fracture Fixation”
Abstract: Percutaneous fracture fixation involves multiple X-ray acquisitions to determine adequate tool trajectories in bony anatomy. In order to reduce time spent adjusting the X-ray imager’s gantry, avoid excess acquisitions, and anticipate inadequate trajectories before penetrating bone, we propose an autonomous system for intra-operative feedback that combines robotic X-ray imaging and machine learning for automated image acquisition and interpretation, respectively. Our approach reconstructs an appropriate trajectory in a two-image sequence, where the optimal second viewpoint is determined based on analysis of the first image. The reconstructed corridor and K-wire pose are compared to determine the likelihood of cortical breach, and both are visualized for the clinician in a mixed reality environment that is spatially registered to the patient and delivered by an optical see-through head-mounted display. We assess the upper bounds on system performance through in silico evaluation across 11 CTs with fractures present, in which the corridor and K-wire are adequately reconstructed. In post-hoc analysis of radiographs across 3 cadaveric specimens, our system determines the appropriate trajectory to within 2.8 ± 1.3 mm and 2.7 ± 1.8°. An expert user study with an anthropomorphic phantom demonstrates how our autonomous, integrated system requires fewer images and less movement to guide and confirm adequate placement compared to current clinical practice.
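The reported accuracy (millimeters and degrees) suggests that comparing the reconstructed corridor to the K-wire pose reduces to measuring the deviation between two 3D lines. As an illustrative sketch only (not the authors’ implementation; the function name and line parameterization are assumptions), that deviation might be computed as:

```python
import numpy as np

def trajectory_deviation(corridor_pt, corridor_dir, wire_pt, wire_dir):
    """Angle (deg) and closest-point distance between two 3D lines,
    each given as a point on the line and a direction vector."""
    u = np.asarray(corridor_dir, float)
    v = np.asarray(wire_dir, float)
    u /= np.linalg.norm(u)
    v /= np.linalg.norm(v)
    # Angle between axes, ignoring direction sign.
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(u, v)), 0.0, 1.0)))
    # Distance between the (possibly skew) lines.
    w = np.asarray(wire_pt, float) - np.asarray(corridor_pt, float)
    n = np.cross(u, v)
    if np.linalg.norm(n) < 1e-9:   # near-parallel: point-to-line distance
        dist = np.linalg.norm(w - np.dot(w, u) * u)
    else:
        dist = abs(np.dot(w, n)) / np.linalg.norm(n)
    return angle, dist
```

For example, a wire parallel to the corridor but offset by 3 mm yields a 0° angular and 3 mm translational deviation, matching the units of the reported metrics.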
Bio: A 4th year Ph.D. candidate at Johns Hopkins University, Benjamin D. Killeen is interested in intelligent surgical systems that improve patient outcomes. His recent work involves realistic simulation of interventional X-ray imaging for the purpose of developing AI-integrated surgical systems. Benjamin is a member of the Advanced Robotics and Computationally Augmented Environments (ARCADE) research group, led by Mathias Unberath, as well as the President of the LCSR Graduate Student Association (GSA) and Sports Officer for the MICCAI Student Board. In 2019, he earned a B.A. in Computer Science with Honors from the University of Chicago, with a minor in Physics, and he has completed internships at IBM Research – Almaden, Epic Systems, and Intuitive Surgical. In his spare time, he enjoys bouldering and creative writing.
Divya Ramesh “Studying terrestrial fish locomotion on wet deformable substrates”
Abstract: Many amphibious fishes can make forays onto land. The water-land interface often has wet deformable substrates like mud and sand, whose strength changes as they get drier or wetter, challenging locomotion. Most previous terrestrial locomotion studies of fishes focused on quantifying kinematics, muscle control, and functional morphology. Yet, without quantifying how the complex mechanics of wet deformable substrates affect ground reaction forces during locomotion, we cannot fully understand how these locomotor features interact with the environment to permit performance. Here, we used controlled mud as a model wet deformable substrate, and developed methods to prepare mud into spatially uniform and temporally stable states and tools to characterize its strength. As a first step to understand how mud strength impacts locomotion, we studied the Atlantic mudskipper (Periophthalmus barbarus) moving on a thicker and a thinner mud, which differ in strength by a factor of two. The animal performed similar “crutching” walks on mud of both strengths, with only a slight reduction in speed on the thinner mud (from 0.39 ± 0.12 to 0.32 ± 0.14 body length/s, P < 0.05, ANOVA). However, it jumped more frequently on the thinner mud (from 1.2 ± 0.7 to 3.2 ± 1.6 times per minute, P < 0.05, ANOVA), likely because the thinner mud stuck more to its belly and fins and hindered walking.
Bio: Divya Ramesh is a fourth-year PhD student in Dr. Chen Li’s lab (Terradynamics Lab). Her current work focuses on studying and understanding amphibious fish locomotion on wet deformable substrates. Her previous work focused on using contact sensing to study and understand limbless locomotion of snakes and snake robots on 3-D terrains. She received a BTech in Electronics and Communication Engineering from VIT University (Vellore, India) and an MSE in Electrical Engineering from the University of Pennsylvania. She has published in IEEE RA-L (presented at ICRA 2020) and presented at ICRA 2022. This work was presented at SICB 2023, where she was a finalist for Best Student Presentation in the Division of Comparative Biomechanics.
Gargi Sadalgekar “Template-level robophysical models for studying sustained terrestrial locomotion of amphibious fish”
Abstract: Studying terrestrial locomotion of amphibious fishes informs how early tetrapods may have invaded land. The water-land interface often has wet, deformable substrates like mud and sand that challenge locomotion. Recent progress has been made on understanding limbed and limbless tetrapod locomotion by studying robots as active physical models of model organisms. Robophysical models complement animals with their high controllability and repeatability for systematic experiments. They also complement theoretical and computational models because they enact physical laws in the real world, which is especially useful for studying locomotion in complex terrain. Here, we created the first robophysical models for studying sustained terrestrial locomotion of amphibious fishes on controlled mud as a model wet deformable substrate. Our three robots are at the template level (lowest degree-of-freedom to generate a target locomotor behavior) and represent mudskippers, ropefish, and bichirs, which use appendicular, axial, and axial-appendicular strategies, respectively. The mudskipper robot rotated two fins in phase to raise the body and “crutch” forward on mud. The ropefish robot used body lateral undulation to “surface-swim” on mud. The bichir robot combined body undulation and out-of-phase fin rotations to “army-crawl” forward on mud. Each robot generated qualitatively similar locomotion on mud as its model organism. We are currently refining the robots and performing systematic experiments on mud of a wide range of strengths.
Bio: Gargi Sadalgekar is a 2nd-year master’s student in the Terradynamics Lab at Johns Hopkins University and is interested in developing bio-inspired robots to investigate locomotion in extreme environments. Her current work focuses on developing low-order robophysical models of amphibious fish to uncover general principles of locomotion over wet deformable substrates. This work was presented at SICB 2023, where she was a finalist for Best Student Presentation in the Division of Comparative Biomechanics. Gargi received a BSE in Mechanical and Aerospace Engineering from Princeton University with a minor in Robotics and Information Systems.
Yaqing Wang “Force sensing can help robots reconstruct potential energy landscape and guide locomotor transitions to traverse large obstacles”
Abstract: Legged robots already excel at maintaining stability during upright walking and running and at stepping over small obstacles. However, they must further traverse large obstacles comparable to body size to enable a broader range of applications, like search and rescue in rubble and sample collection in rocky Martian hills. Our lab’s recent research demonstrated that legged robots can traverse large obstacles if they can be destabilized to transition across various locomotor modes. When viewed on a potential energy landscape of the system, which results from physical interaction with obstacles, these locomotor transitions are strenuous barrier-crossing transitions between landscape basins. Because potential energy landscape gradients are closely related to terrain reaction forces and torques, we hypothesize that sensing obstacle interaction forces allows landscape reconstruction, which can guide robots to cross barriers at the saddle to make transitions more easily (analogous to crossing a mountain ridge at its saddle). Here, we created a robophysical model with custom 3-axis force sensors and surface contact sensors to measure forces and contacts during interaction with large obstacles. We found that the measured forces indeed captured potential energy landscape gradients well, and we could use the locally measured gradients to roughly reconstruct the potential energy landscape. Our future work will investigate how to enable robots to make locomotor transitions at the landscape saddle based on local landscape reconstruction.
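Since the landscape gradient is tied to the measured reaction forces, the reconstruction step can be illustrated in one dimension: integrate locally sensed gradient samples to recover the potential (known only up to a constant) and locate the barrier between basins. This is a minimal hypothetical sketch, not the lab’s actual method; the 1D reduction and function names are assumptions for illustration.

```python
import numpy as np

def reconstruct_landscape(x, grad, u0=0.0):
    """Reconstruct potential U(x) from sampled gradients dU/dx by
    cumulative trapezoidal integration; U is defined only up to u0."""
    x = np.asarray(x, float)
    grad = np.asarray(grad, float)
    du = 0.5 * (grad[1:] + grad[:-1]) * np.diff(x)   # trapezoid increments
    return np.concatenate([[u0], u0 + np.cumsum(du)])

def barrier_index(U):
    """Index of the highest point along the sampled path between basins:
    the barrier to cross (the 1D analogue of a landscape saddle)."""
    return int(np.argmax(U))
```

For a double-well potential U(x) = (x² − 1)², sampling only its gradient 4x(x² − 1) and integrating recovers the barrier of height 1 at x = 0 between the two basins, mirroring how locally measured force gradients could reveal where a transition is cheapest.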
Bio: Yaqing Wang is a fourth-year PhD student in Dr. Chen Li’s lab (Terradynamics Lab). His work focuses on understanding locomotor transitions in biological and bio-inspired terrestrial locomotion. He received a B.S. in Mechanical Engineering from Tsinghua University in China. He recently presented this work at the annual APS March Meeting.
Emotional intelligence for artificial systems is not a luxury but a necessity. It is paramount for many applications that require both short- and long-term engaging human–technology interactions, including entertainment, hospitality, education, and healthcare. However, creating artificially intelligent systems and interfaces with social and emotional skills is a challenging task. Progress in industry and developments in academia provide us with a positive outlook; however, the artificial emotional intelligence of current technology is still quite limited. Creating technology with artificial emotional intelligence requires the development of perception, learning, action and adaptation capabilities, and the ability to execute these pipelines in real-time in human-AI interactions. Truly addressing these challenges relies on cross-fertilization of multiple research fields, including psychology, nonverbal behaviour understanding, psychiatry, vision, social signal processing, affective computing, and human-computer and human-robot interaction. My lab’s research has been pushing the state of the art in a wide spectrum of research topics in this area, including the design and creation of new datasets; novel feature representations and learning algorithms for sensing and understanding human nonverbal behaviours in solo, dyadic and group settings; designing short/long-term human-robot adaptive interactions for wellbeing; and creating algorithmic solutions to mitigate the bias that creeps into these systems.
In this talk, I will present the recent explorations of the Cambridge Affective Intelligence and Robotics Lab in these areas with insights for human embodied-AI interaction research.
Hatice Gunes is a Professor of Affective Intelligence and Robotics (AFAR) and leads the AFAR Lab at the University of Cambridge’s Department of Computer Science and Technology. Her expertise is in the areas of affective computing and social signal processing, cross-fertilising research in multimodal interaction, computer vision, signal processing, machine learning and social robotics. She has published over 155 papers in these areas (H-index=36, citations > 7,300), with most recent works on lifelong learning for facial expression recognition, fairness, and affective robotics; and longitudinal HRI for wellbeing. She has served as an Associate Editor for IEEE Transactions on Affective Computing, IEEE Transactions on Multimedia, and Image and Vision Computing Journal, and has guest edited many Special Issues, the latest ones being the 2022 Int’l Journal of Social Robotics Special Issue on Embodied Agents for Wellbeing, the 2022 Frontiers in Robotics and AI Special Issue on Lifelong Learning and Long-Term Human-Robot Interaction, and the 2021 IEEE Transactions on Affective Computing Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data. Other research highlights include the Outstanding PC Award at ACM/IEEE HRI’23, RSJ/KROS Distinguished Interdisciplinary Research Award Finalist at IEEE RO-MAN’21, Distinguished PC Award at IJCAI’21, Best Paper Award Finalist at IEEE RO-MAN’20, Finalist for the 2018 Frontiers Spotlight Award, Outstanding Paper Award at IEEE FG’11, and Best Demo Award at IEEE ACII’09. Prof Gunes is a former President of the Association for the Advancement of Affective Computing (2017-2019), is/was the General Co-Chair of ACM ICMI’24 and ACII’19, and the Program Co-Chair of ACM/IEEE HRI’20 and IEEE FG’17. She was the Chair of the Steering Board of IEEE Transactions on Affective Computing (2017-2019) and was a member of the Human-Robot Interaction Steering Committee (2018-2021).
Her research has been supported by various competitive grants, with funding from Google, the Engineering and Physical Sciences Research Council UK (EPSRC), Innovate UK, British Council, Alan Turing Institute and EU Horizon 2020. In 2019 she was awarded a prestigious EPSRC Fellowship to investigate adaptive robotic emotional intelligence for wellbeing (2019-2024) and has been named a Faculty Fellow of the Alan Turing Institute – UK’s national centre for data science and artificial intelligence (2019-2021). Prof Gunes is currently a Staff Fellow of Trinity Hall, a Senior Member of the IEEE, and a member of the AAAC.
Old monkeys may have stories. Some could be lessons learned that help overcome obstacles. The first part of this seminar discusses classical robotic applications in industry and critical factors in their development and application. The second part discusses intelligent manufacturing using data and easy-to-use analytics, which are necessary in modern-day manufacturing. Moving forward, some opportunities for robotics in intelligent manufacturing are discussed.
Dr. Day was previously a Senior VP of Foxconn Automation Technology. He began his career in 1970 as a co-op equipment development engineer at IBM in Burlington, VT, and later continued working in the manufacturing automation field with General Motors, Fanuc, Rockwell Automation, Stoneridge, and Foxconn. Dr. Day was the founder of Foxbot, with 80,000 units deployed in various applications. In June 2016, Dr. Day received the Joseph F. Engelberger Award from the Robotic Industries Association for a lifetime of career contributions in the automotive and electronics industries.
“Games Without Frontiers: Beating Super Mario Bros. 1-1 with a 3D-Printed Soft Robotic Hand”
Ryan D. Sochol, Ph.D.
Associate Professor, Department of Mechanical Engineering
Affiliate Faculty, Fischell Department of Bioengineering
Executive Committee Member, Maryland Robotics Center
Fischell Institute Fellow, Robert E. Fischell Institute for Biomedical Devices
Affiliate Faculty, Institute for Systems Research
James Clark School of Engineering
University of Maryland, College Park
Over the past decade, the field of “soft robotics” has established itself as uniquely suited for applications that would be difficult or impossible to realize using traditional, rigid-bodied robots. The reliance on compliant materials that are often actuated by fluidic (e.g., hydraulic or pneumatic) means presents a number of inherent benefits for soft robots, particularly in terms of safety for human-robot interactions and adaptability for manipulating complex and/or delicate objects. Unfortunately, progress has been impeded by broad challenges associated with controlling the underlying fluidics of such systems. In this seminar, Prof. Ryan D. Sochol will discuss how his Bioinspired Advanced Manufacturing (BAM) Laboratory is leveraging the capabilities of two alternative types of additive manufacturing (or “three-dimensional (3D) printing”) technologies to address these critical barriers. Specifically, Prof. Sochol will describe his lab’s recent strategies for using the 3D nanoprinting approach, “Two-Photon Direct Laser Writing”, and the inkjet 3D printing technique, “PolyJet 3D Printing”, to engineer soft robotic systems that comprise integrated fluidic circuitry, including a soft robotic “hand” that plays Nintendo.
Prof. Ryan D. Sochol is an Associate Professor of Mechanical Engineering within the A. James Clark School of Engineering at the University of Maryland, College Park. Prof. Sochol received his B.S. in Mechanical Engineering from Northwestern University in 2006, and both his M.S. and Ph.D. in Mechanical Engineering from the University of California, Berkeley, in 2009 and 2011, respectively, with Doctoral Minors in Bioengineering and Public Health. Prior to joining the faculty at UMD, Prof. Sochol served two primary academic roles: (i) as an NIH Postdoctoral Trainee within the Harvard-MIT Division of Health Sciences & Technology, Harvard Medical School, and Brigham & Women’s Hospital, and (ii) as the Director of the Micro Mechanical Methods for Biology (M3B) Laboratory Program within the Berkeley Sensor & Actuator Center at UC Berkeley. Prof. Sochol also served as a Visiting Postdoctoral Fellow at the University of Tokyo. In 2019, Prof. Sochol was elected Co-President of the Mid-Atlantic Micro/Nano Alliance. His group received IEEE MEMS Outstanding Student Paper Awards in both 2019 and 2021 and the Springer Nature Best Paper Award (Runner-Up) in 2022. Prof. Sochol received the NSF CAREER Award in 2020 and the Early Career Award from the IOP Journal of Micromechanics and Microengineering in 2021, and was recently honored as an inaugural Rising Star by the journal, Advanced Materials Technologies, in 2023.
Job title and affiliation: Senior Scientist, Philips
JHU degrees and year(s) of degree(s): Ph.D. Computer Science 2018, MSE Computer Science 2014
Short bio: Ayushi Sinha is a Senior Scientist at Philips working on image-guided therapy systems, including C-arm X-ray imaging systems. Ayushi received a BS in Computer Science and a BA in Mathematics from Providence College, RI, and an MSE and Ph.D. in Computer Science from Johns Hopkins University, MD. She remained at Hopkins as a Provost’s Postdoctoral Fellow and then as a Research Scientist before joining Philips in late 2019. Her primary research interest is in image analysis to enable automation of medical imaging systems and integration of multiple systems.
Job title and affiliation: Computer Vision Research Scientist at PediaMetrix
JHU degrees and year(s) of degree(s): BS Mechanical Engineering 2019, MSE Robotics 2020
Short bio: Originally from Istanbul, Turkey, Can came to JHU for his undergraduate degree and early on explored an interest in robotics through coursework and the robotics minor. He completed his master’s research and thesis under Prof. Taylor in 2020 on an autonomous endoscope safety system. Since graduation, Can has been working as a Computer Vision Research Scientist at PediaMetrix, a medical imaging startup focused on infant healthcare. There he has worked on developing, deploying, and validating image processing and vision algorithms and machine and deep learning models, as well as acquiring 510(k) clearance. Since September 2022, he has taken a leadership role in their R&D department. Can is interested in making healthcare more robust and accessible through innovation and technology and is the co-inventor of two US patents.
Job title and affiliation: Associate Professor, United States Naval Academy Department of Weapons, Robotics, and Control Engineering/Instructor, JHU-EP Mechanical Engineering Program
JHU degrees and year(s) of degree(s): M.S.E. Mechanical Engineering 2007, Ph.D. Mechanical Engineering 2012
Short bio: Mike Kutzer received his Ph.D. in mechanical engineering from the Johns Hopkins University, Baltimore, MD, USA in 2012. He is currently an Associate Professor in the Weapons, Robotics, and Control Engineering Department (WRCE) at the United States Naval Academy (USNA). Prior to joining USNA, he worked as a senior researcher in the Research and Exploratory Development Department of the Johns Hopkins University Applied Physics Laboratory (JHU/APL). His research interests include robotic manipulation, computer vision and motion capture, applications of and extensions to additive manufacturing, mechanism design and characterization, continuum manipulators, redundant mechanisms, and modular systems.
Dr. Juan Wachs is a Professor and Faculty Scholar in the Industrial Engineering School at Purdue University, Professor of Biomedical Engineering (by courtesy), and an Adjunct Associate Professor of Surgery at the IU School of Medicine. He is currently serving at NSF as a Program Director for robotics and AI programs at CISE. He is also the director of the Intelligent Systems and Assistive Technologies (ISAT) Lab at Purdue, and he is affiliated with the Regenstrief Center for Healthcare Engineering. He completed postdoctoral training at the Naval Postgraduate School’s MOVES Institute under a National Research Council Fellowship from the National Academies of Sciences. Dr. Wachs received his B.Ed.Tech in Electrical Education from ORT Academic College, at the Hebrew University of Jerusalem campus, and his M.Sc. and Ph.D. in Industrial Engineering and Management from Ben-Gurion University of the Negev, Israel. He is the recipient of the 2013 Air Force Young Investigator Award, the 2015 Helmsley Senior Scientist Fellowship, a 2016 Fulbright U.S. Scholar award, the James A. and Sharon M. Tompkins Rising Star Associate Professorship (2017), and an ACM Distinguished Speaker appointment (2018). He is also an Associate Editor of IEEE Transactions on Human-Machine Systems and Frontiers in Robotics and AI.
WE ARE BACK WITH OUR MONDAY BAGELS TRADITION!!
Please join us this coming Monday 09/11 at 10:30 am at the students’ office space in Hackerman 136/137 for some fresh morning bagels!! We will provide various cream cheese spreads, and there will be a coffee machine, water boiler and K-cups for you to enjoy as well (bring your own mugs though).
Looking forward to seeing you all there!
Lydia & Benjamin
The LCSR Graduate Student Association (LCSR-GSA)
Laboratory for Computational Sensing and Robotics
Johns Hopkins University