Calendar

LCSR Seminar: Mark Savage “Resumes” @ Hackerman B17
Feb 8 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Mark Savage is the Johns Hopkins Life Design Educator for Engineering Master’s Students, advising on all aspects of career development and the internship/job search, with the Handshake career management system as an essential tool. Look for weekly newsletters, which will soon be emailed to Homewood WSE Master’s Students on Sunday nights.

LCSR Seminar: Brent Gillespie “Predicting Human Behavior in Predictable Environments Using the Internal Model Principle” @ Hackerman B17
Feb 15 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Abstract:

All models are wrong, and too many are directed inward. The Internal Model Principle of control engineering directs our attention (and modeling proficiency) to what makes the world around us patterned and predictable. It says that driving a model of that patterned or predictable behavior in a feedback loop is the only way to achieve perfect tracking or disturbance rejection. In the spirit of “some models are useful”, I will present a control system model of humans tracking moving targets on a screen using a mouse and cursor. Simple analyses reveal this controller’s robustness to visual blanking, and experiments (even experiments conducted remotely during the pandemic) provide ample support. Extensions that combine feedforward and feedback control complete the picture and complement existing literature in human motor behavior, most of which is focused on modeling the system under control rather than the environment.
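
To make the principle concrete, here is a minimal sketch in Python (illustrative only; the first-order plant, gains, and constant reference are assumptions, not the speaker’s model). The simplest instance of the Internal Model Principle is integral action: tracking a constant reference with zero steady-state error requires an integrator, a model of a constant signal, inside the loop, while pure proportional feedback provably leaves a residual error.

# Minimal sketch of the Internal Model Principle (illustrative assumptions only).
def simulate(ki, kp=0.5, steps=4000, dt=0.01):
    """First-order plant x' = -x + u tracking the constant reference r = 1."""
    x, integ, r = 0.0, 0.0, 1.0
    for _ in range(steps):
        e = r - x
        integ += e * dt          # integrator = internal model of a constant signal
        u = kp * e + ki * integ
        x += (-x + u) * dt       # Euler step of the plant
    return r - x                 # steady-state tracking error

print(f"P only (no internal model): error = {simulate(ki=0.0):+.4f}")  # ~ +0.667
print(f"PI (integrator in loop):    error = {simulate(ki=1.0):+.4f}")  # ~  0.000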

Bio:

Brent Gillespie is a Professor of Mechanical Engineering and Robotics at the University of Michigan. He received a Bachelor of Science in Mechanical Engineering from the University of California, Davis in 1986, a Master of Music from the San Francisco Conservatory of Music in 1989, and a Ph.D. in Mechanical Engineering from Stanford University in 1996. His research interests include haptic interfaces, human motor behavior, haptic shared control, and robot-assisted rehabilitation after neurological injury. Prof. Gillespie’s awards include the Popular Science Invention Award (2016), the University of Michigan Provost’s Teaching Innovation Prize (2012), and the Presidential Early Career Award for Scientists and Engineers (2001).

LCSR Seminar: Joshua Mangelson “Steps Towards Intelligent Autonomous Underwater Inspection and Data Collection” @ Hackerman B17
Feb 22 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Abstract:

Over 70% of our world is underwater, but less than 1% of the world’s oceans have been mapped at a resolution better than 100 m per pixel. Regular inspection, mapping, and data collection in marine environments are essential for a whole host of reasons, including gaining a scientific understanding of our planet, maintaining civil infrastructure, and ensuring safe navigation. However, manual inspection and data collection using divers is expensive, dangerous, time-consuming, and tedious.

In this talk, I will discuss the use of autonomous underwater vehicles (AUVs) and autonomous surface vessels (ASVs) to automatically and intelligently map, inspect, and collect information in unstructured marine environments. In particular, we will discuss the problems present in this space as well as the contributions my lab is making towards addressing these problems, including i) the development of a general-purpose marine robotics testbed at BYU, ii) the development of a marine robotics simulator called HoloOcean (https://holoocean.readthedocs.io/en/stable/), iii) advancements in marine robotic localization using Lie groups, and iv) preliminary results towards expert-guided topic modeling and intelligent data collection.
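
As a rough illustration of what “localization using Lie groups” can mean in practice, the sketch below (a hypothetical example for exposition, not the lab’s formulation) dead-reckons a planar vehicle pose on the Lie group SE(2) by composing body-frame twists through the exponential map.

# Illustrative sketch only: pose propagation on the Lie group SE(2).
import numpy as np

def se2_exp(vx, vy, w, dt):
    """Exponential map sending the twist (vx, vy, w) * dt from se(2) to SE(2)."""
    th = w * dt
    if abs(th) < 1e-9:
        V = np.eye(2)
    else:
        V = np.array([[np.sin(th), -(1.0 - np.cos(th))],
                      [1.0 - np.cos(th), np.sin(th)]]) / th
    T = np.eye(3)
    T[:2, :2] = [[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]]
    T[:2, 2] = V @ np.array([vx * dt, vy * dt])
    return T

pose = np.eye(3)                       # start at the origin, heading along x
for _ in range(100):                   # constant twist traces a circular arc
    pose = pose @ se2_exp(1.0, 0.0, 0.5, 0.01)
print(pose[:2, 2])                     # integrated position, ~[0.959, 0.245]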

Bio:

Dr. Joshua Mangelson holds Ph.D. and Master’s degrees in Robotics from the University of Michigan. After completing his degree, he served as a post-doctoral fellow at Carnegie Mellon University before joining the Electrical and Computer Engineering faculty at Brigham Young University in 2020. His expertise spans robotic perception, mapping, and localization, with a particular focus on marine robotics. He has extensive experience leading marine robotic field trials in various locations around the world, including San Diego, Hawaii, Boston, northern Michigan, and Utah. In 2018, his work on multi-robot mapping received the Best Multi-Robot Paper Award at the IEEE ICRA conference and 1st place in the IEEE OCEANS Student Poster Competition. He is currently serving as an associate editor for The International Journal of Robotics Research (IJRR) and the IEEE/RSJ IROS Conference.

LCSR Seminar: Student Seminars @ Hackerman B17
Mar 1 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Ulas Berk Karli and Shiye (Sally) Cao “What if it is wrong: effects of power dynamics and trust repair strategy on trust and compliance in HRI.”

Abstract: Robotic systems designed to work alongside people are susceptible to technical and unexpected errors. Prior work has investigated a variety of strategies aimed at repairing people’s trust in the robot after its erroneous operations. In this work, we explore the effect of post-error trust repair strategies (promise and explanation) on people’s trust in the robot under varying power dynamics (supervisor and subordinate robot). Our results show that, regardless of the power dynamics, promise is more effective at repairing user trust than explanation. Moreover, people found a supervisor robot with verbal trust repair to be more trustworthy than a subordinate robot with verbal trust repair. Our results further reveal that people are prone to complying with the supervisor robot even if it is wrong. We discuss the ethical concerns in the use of a supervisor robot and potential interventions to prevent improper compliance in users for more productive human-robot collaboration.

Bio: Ulas Berk Karli is an MSE student in Robotics at LCSR, Johns Hopkins University. He received a Bachelor of Science in Mechanical Engineering, with a double major in Computer Engineering, from Koç University, Istanbul, in 2021. His research interests are human-robot collaboration and robot learning for HRI.

Shiye Cao is a first-year Ph.D. student in the Department of Computer Science, co-advised by Dr. Chien-Ming Huang and Dr. Anqi Liu. She received a Bachelor of Science in Computer Science with a second major in Applied Mathematics and Statistics from Johns Hopkins University in 2021, and a Master of Science in Engineering in Computer Science from Johns Hopkins University in 2022. Her work focuses on user trust and reliance in human-machine collaborative tasks.

Eugene Lin “Robophysical modeling of spider vibration sensing of prey on orb webs”

Abstract: Orb-weaving spiders are functionally blind and detect prey-generated web vibrations through vibration sensors at their leg joints to locate and identify prey caught in their (near) planar webs. Previous studies focused on how spiders use web geometry, silk properties, and web pre-tension to modulate vibration sensing. Spiders can also dynamically adjust their posture while sensing prey, which may be a form of active sensing (Hung, Corver, Gordus, 2022, APS March Meeting). However, whether this is true and how it works is poorly understood, due to the difficulty of measuring the dynamics of the entire prey-web-spider interaction system all at once. Here, we developed a robophysical model of the system to test this hypothesis of active sensing and discover its principles. Our model consists of a vibrating prey robot and a spider robot that can adjust its posture, with torsional springs at its leg joints and accelerometers to measure joint vibration. Both robots are attached to a physical web made of cords with qualitatively similar properties to real spider web threads. Load cells measure web pre-tension, and a high-speed camera system measures web vibrations and robot movement. Preliminary results showed vibration attenuation through the web from the prey robot. We are currently studying the complex effects of the spider robot’s dynamic posture change on vibration propagation across the web and leg joints by systematically varying the parameters of prey robot vibration, spider robot leg posture, and web pre-tension.

Bio: Eugene Lin is a third-year PhD student in Dr. Chen Li’s lab (Terradynamics lab). His work focuses on understanding environmental sensing on suspended, sparse terrain. He received a B.S. in Mechanical Engineering at the University of California, San Diego. He recently presented this work at the annual SICB conference and will present it again at the annual APS March Meeting.

Aishwarya Pantula “Pick a Side: Untethered Gel Crawlers That Can Break Symmetry”

Abstract: The development of untethered soft crawling robots that are programmed to respond to environmental stimuli and precisely maneuverable across size scales has been paramount to the fields of soft robotics, drug delivery, and autonomous smart devices. Of particular relevance are reversible thermoresponsive hydrogels, which swell and shrink in the temperature range of 30-60 °C, for operating such untethered soft robots in human physiological and ambient conditions. While crawling has been demonstrated by thermoresponsive hydrogels, they need surface modifications in the form of ratchets, asymmetric patterning, or constraints to achieve unidirectional motion.

Here we demonstrate and validate a new mechanism for untethered, unidirectional crawling for multisegmented gel crawlers built from an active thermoresponsive poly(N-isopropylacrylamide) (pNIPAM) and a passive polyacrylamide (pAAM) on flat unpatterned surfaces. By connecting bilayers of different geometries and thicknesses using a centrally suspended gel linker, we create a morphological gradient along the fore-aft axis, which leads to an asymmetry in the contact forces during the swelling and deswelling of our crawler. We thoroughly explain our mechanism using experiments and finite element simulations and, using experiments, demonstrate that we can tune the generated asymmetry and, in turn, increase the displacement of the crawler by varying linker stiffness, morphology, and the number of bilayer segments. We believe this mechanism can be widely applied across fields of study to create the next generation of autonomous shape-changing and smart locomotors.

Bio: Aishwarya is a 4th year Ph.D. candidate in the lab of Dr. David Gracias at Johns Hopkins University, USA. Her research focuses on exploring smart materials like stimuli-responsive hydrogels, combining them with novel patterning methods like 3D/4D printing, imprint molding, lithography, etc., and using different mechanical design strategies to create untethered biomimetic actuators and locomotors across size scales for soft robotics and biomedical devices.

Maia Stiber “On using social signals to enable flexible error-aware HRI.”

Abstract: Prior error management techniques often do not possess the versatility to appropriately address robot errors across tasks and scenarios. Their fundamental framework involves explicit, manual error management and implicit error management driven by domain-specific information, tailoring responses to specific interaction contexts. We present a framework for approaching error-aware systems by adding implicit social signals as another information channel to create more flexibility in application. To support this notion, we introduce a novel dataset (composed of three data collections) focused on understanding natural facial action unit (AU) responses to robot errors during physical human-robot interactions, varying across task, error, people, and scenario. Analysis of the dataset reveals that, through the lens of error detection, using AUs as input to error management affords flexibility to the system and has the potential to improve the error detection response rate. In addition, we provide an example of a real-time interactive robot error management system using the error-aware framework.

Bio: Maia Stiber is a 4th year Ph.D. candidate in the Department of Computer Science, co-advised by Dr. Chien-Ming Huang and Dr. Russell Taylor. She received a B.S. in Computer Science from Caltech in 2019 and an M.S.E. in Computer Science from Johns Hopkins University in 2021. Her work focuses on leveraging natural human responses to robot errors in an effort to develop flexible error management techniques in support of effective human-robot interaction.

Victor Antony “Co-designing with older adults, for older adults: robots to promote physical activity.”

Abstract: Lack of physical activity has severe negative health consequences for older adults and limits their ability to live independently. Robots have been proposed to help engage older adults in physical activity (PA), albeit with limited success. There is a lack of robust understanding of older adults’ needs and wants from robots designed to engage them in PA. In this paper, we report on the findings of a co-design process where older adults, physical therapy experts, and engineers designed robots to promote PA in older adults. We found a variety of motivators for and barriers against PA in older adults; we then conceptualized a broad spectrum of possible robotic support and found that robots can play various roles to help older adults engage in PA. This exploratory study elucidated several overarching themes and emphasized the need for personalization and adaptability. This work highlights key design features that researchers and engineers should consider when developing robots to engage older adults in PA, and underscores the importance of involving various stakeholders in the design and development of assistive robots.

Bio: Victor Antony is a second-year Ph.D. student in the Department of Computer Science, advised by Dr. Chien-Ming Huang. He received the Bachelor of Science degree in Computer Science from the University of Rochester in 2021. His work focuses on social robots for well-being.

LCSR Seminar: Allison Okamura “Wearable Haptic Devices for Ubiquitous Communication” @ Hackerman B17
Mar 8 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Abstract:

Haptic devices allow touch-based information transfer between humans and intelligent systems, enabling communication in a salient but private manner that frees other sensory channels. For such devices to become ubiquitous, their physical and computational aspects must be intuitive and unobtrusive. The amount of information that can be transmitted through touch is limited in large part by the location, distribution, and sensitivity of human mechanoreceptors. Not surprisingly, many haptic devices are designed to be held or worn at the highly sensitive fingertips, yet stimulation using a device attached to the fingertips precludes natural use of the hands. Thus, we explore the design of a wide array of haptic feedback mechanisms, ranging from devices that can be actively touched by the fingertips to multi-modal haptic actuation mounted on the arm. We demonstrate how these devices are effective in virtual reality, human-machine communication, and human-human communication.

Bio:

Allison Okamura received the BS degree from the University of California at Berkeley, and the MS and PhD degrees from Stanford University. She is the Richard W. Weiland Professor of Engineering at Stanford University in the mechanical engineering department, with a courtesy appointment in computer science. She is an IEEE Fellow and is the co-general chair of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems and a deputy director of the Wu Tsai Stanford Neurosciences Institute. Her awards include the IEEE Engineering in Medicine and Biology Society Technical Achievement Award, IEEE Robotics and Automation Society Distinguished Service Award, and Duca Family University Fellow in Undergraduate Education. Her academic interests include haptics, teleoperation, virtual reality, medical robotics, soft robotics, rehabilitation, and education. For more information, please see the CHARM Lab website.

LCSR Seminar: Debra Mathews “Ethics and Governance of Emerging Technologies” @ Hackerman B17
Mar 15 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Abstract: From genetic engineering to direct-to-consumer neurotechnology to ChatGPT, it is a standard refrain that science outpaces the development of ethical norms and governance. Further, technologies increasingly cross boundaries from medicine to the consumer market to law enforcement and beyond, in ways that our existing governance structures are not equipped to address. Finally, our standard governance approaches to addressing ethical issues related to new technologies fail to address population- and societal-level impacts. This talk will demonstrate the above through a series of examples and describe ongoing work by the US National Academies and others to address these challenges.

Bio: Debra JH Mathews, PhD, MA, is the Associate Director for Research and Programs at the Johns Hopkins Berman Institute of Bioethics and an Associate Professor in the Department of Genetic Medicine, Johns Hopkins University School of Medicine. Within the JHU Institute for Assured Autonomy, Dr. Mathews serves as the Ethics & Governance Lead. Her academic work focuses on ethics and policy issues raised by emerging technologies, with particular focus on genetics, stem cell science, neuroscience, synthetic biology, and artificial intelligence. Dr. Mathews helped found and lead The Hinxton Group, an international collective of scientists, ethicists, policymakers, and others interested in ethical and well-regulated science, whose work focuses primarily on stem cell research. She has been a member of the Board of Directors of the International Neuroethics Society since 2015 and is currently President-Elect. In addition to her academic work, Dr. Mathews has spent time at the Genetics and Public Policy Center, the US Department of Health and Human Services, the Presidential Commission for the Study of Bioethical Issues, and the National Academy of Medicine, working in various capacities on science policy.

Dr. Mathews earned her PhD in genetics from Case Western Reserve University, as well as a concurrent Master’s in bioethics. She completed a Post-Doctoral Fellowship in genetics at Johns Hopkins, and the Greenwall Fellowship in Bioethics and Health Policy at Johns Hopkins and Georgetown Universities.

2023 JHU Robotics Industry Day @ Levering Hall - Glass Pavilion
Mar 24 @ 9:00 am – 4:00 pm

2023 Industry Day Agenda/Program

Zoom Link for Morning Session

Friday 3/24 Location: Glass Pavilion – Levering Hall
8:30 AM Registration Open and Breakfast
9:00 AM Welcome
9:05 AM Introduction to LCSR – Russell H. Taylor, Director
9:20 AM LCSR Education – Louis Whitcomb, Deputy Director
9:25 AM IAA – James Bellingham and Anton Dahbura
9:30 AM Student Research Talk – Max Li
9:42 AM Student Research Talk – Divya Ramesh
9:55 AM Student Research Talk – Michael Kam
10:07 AM Student Research Talk – Di Cao
10:20 AM Coffee Break
10:40 AM Johns Hopkins Tech Ventures – Seth Zonies
10:55 AM Industry Talk – Ankur Kapoor, Siemens
11:15 AM Industry Talk – William Tan, GE
11:35 AM New LCSR Faculty – Alejandro Martin-Gomez
11:55 AM Closing – Russell H. Taylor, Director
12:00 PM Lunch – Resume Roundtables
1:30-4:00 PM Poster and Demo Session (Hackerman Hall)
1:45-3:45 PM Guided Krieger Hall Tours (meet outside Hackerman 134)
4:00-5:00 PM Alumni Reception (Shriver Hall – Clipper Room)

The Laboratory for Computational Sensing and Robotics will highlight its elite robotics students and showcase cutting-edge research projects in areas that include Medical Robotics, Extreme Environments Robotics, Human-Machine Systems, BioRobotics, and more.

Robotics Industry Day will provide top companies and organizations in the private and public sectors with access to the LCSR’s forward-thinking, solution-driven students. The event will also serve as an informal opportunity to explore university-industry partnerships.

You will experience dynamic presentations and discussions, observe live demonstrations, and participate in speed networking sessions that afford you the opportunity to meet Johns Hopkins’ most talented robotics students before they graduate.

Please contact Ashley Moriarty if you have any questions.

LCSR Seminar: Brett Hobson “The development of robots for open ocean ecology” @ Hackerman B17
Mar 29 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Abstract:

The open ocean is a massive 3D ecosystem responsible for absorbing much of Earth’s excess heat and the CO2 emissions produced by humans. A portion of the ocean’s carbon pump sequesters atmospheric carbon into the sediments of the deep sea. Quantifying the amount of this carbon exported to the deep and identifying the variables driving that export is vital to understanding how we might better mitigate the deleterious effects of climate change. The Monterey Bay Aquarium Research Institute (MBARI) has developed high-endurance mobile robots to investigate ocean carbon transport. One of its vehicles, the Benthic Rover, has been working continuously on the seafloor at 4000 m for six years, measuring the spatial and temporal variability of carbon export from the surface. This long-term dataset has revealed that carbon enters the deep sea in large pulses of sinking detritus. MBARI is now focused on connecting these carbon pulses to processes in the upper layers of the ocean. Exploring, mapping, and sampling the upper water column to uncover ocean productivity hotspots is a central goal requiring the collaboration of MBARI’s Long Range Autonomous Underwater Vehicles (LRAUVs) as well as other complementary vehicles that can measure the full ecology of the hotspots, from the microbes to the whales.

Bio:

Brett W. Hobson received a BS in Mechanical Engineering from San Francisco State University in 1989. He began his ocean engineering career at Deep Ocean Engineering in San Leandro, California, developing remotely operated vehicles (ROVs) and manned submarines. In 1992, he helped start and run Deep Sea Discoveries, where he helped develop and operate deep-towed sonar and camera systems offshore the US, Venezuela, Spain, and the Philippines. In 1998, he joined Nekton Research in North Carolina to develop bio-inspired underwater vehicles for Navy applications. After the sale of Nekton Research to iRobot in 2005, Hobson joined the Monterey Bay Aquarium Research Institute (MBARI), where he leads the Long Range Autonomous Underwater Vehicle (AUV) program, overseeing the development and science operations of a fleet of AUVs. He also helped develop MBARI’s long-endurance seafloor-crawling Benthic Rover. Hobson holds a patent on the design of a biomimetic underwater vehicle and has been the Co-PI on large projects funded by NSF, NASA, and DHS aimed at developing novel underwater vehicles for ocean science.

LCSR Seminar: Student Seminar @ Hackerman B17
Apr 5 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Benjamin D. Killeen “An Autonomous X-ray Image Acquisition and Interpretation System for Assisting Percutaneous Pelvic Fracture Fixation”

Abstract: Percutaneous fracture fixation involves multiple X-ray acquisitions to determine adequate tool trajectories in bony anatomy. In order to reduce time spent adjusting the X-ray imager’s gantry, avoid excess acquisitions, and anticipate inadequate trajectories before penetrating bone, we propose an autonomous system for intra-operative feedback that combines robotic X-ray imaging and machine learning for automated image acquisition and interpretation, respectively. Our approach reconstructs an appropriate trajectory in a two-image sequence, where the optimal second viewpoint is determined based on analysis of the first image. The reconstructed corridor and K-wire pose are compared to determine the likelihood of cortical breach, and both are visualized for the clinician in a mixed reality environment that is spatially registered to the patient and delivered by an optical see-through head-mounted display. We assess the upper bounds on system performance through in silico evaluation across 11 CTs with fractures present, in which the corridor and K-wire are adequately reconstructed. In post-hoc analysis of radiographs across 3 cadaveric specimens, our system determines the appropriate trajectory to within 2.8 ± 1.3 mm and 2.7 ± 1.8°. An expert user study with an anthropomorphic phantom demonstrates how our autonomous, integrated system requires fewer images and less movement to guide and confirm adequate placement compared to current clinical practice.
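
For intuition about the two-view geometry (a hedged sketch under assumed calibrated views, not the authors’ pipeline), reconstructing a 3D landmark from two X-ray images reduces to intersecting the two back-projected rays in a least-squares sense:

# Hedged sketch: least-squares intersection of back-projected rays from two
# views (source positions and landmark below are hypothetical, not from the paper).
import numpy as np

def triangulate(origins, directions):
    """Point minimizing summed squared distance to rays (o_i, d_i):
    solves sum_i (I - d_i d_i^T)(p - o_i) = 0 for p."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two hypothetical C-arm source positions observing one landmark:
sources = [np.array([0.0, 0.0, 1000.0]), np.array([700.0, 0.0, 700.0])]
landmark = np.array([10.0, 20.0, 30.0])
rays = [landmark - s for s in sources]
print(triangulate(sources, rays))        # recovers ~[10. 20. 30.]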

Bio: A 4th year Ph.D. candidate at Johns Hopkins University, Benjamin D. Killeen is interested in intelligent surgical systems that improve patient outcomes. His recent work involves realistic simulation of interventional X-ray imaging for the purpose of developing AI-integrated surgical systems. Benjamin is a member of the Advanced Robotics and Computationally Augmented Environments (ARCADE) research group, led by Mathias Unberath, as well as the President of the LCSR Graduate Student Association (GSA) and Sports Officer for the MICCAI Student Board. In 2019, he earned a B.A. in Computer Science with Honors from the University of Chicago, with a minor in Physics, and he has completed internships at IBM Research – Almaden, Epic Systems, and Intuitive Surgical. In his spare time, he enjoys bouldering and creative writing.

Divya Ramesh “Studying terrestrial fish locomotion on wet deformable substrates”

Abstract: Many amphibious fishes can make forays onto land. The water-land interface often has wet deformable substrates like mud and sand, whose strength changes as they get drier or wetter, challenging locomotion. Most previous terrestrial locomotion studies of fishes focused on quantifying kinematics, muscle control, and functional morphology. Yet, without quantifying how the complex mechanics of wet deformable substrates affect ground reaction forces during locomotion, we cannot fully understand how these locomotor features interact with the environment to permit performance. Here, we used controlled mud as a model wet deformable substrate and developed methods to prepare mud into spatially uniform and temporally stable states, as well as tools to characterize its strength. As a first step toward understanding how mud strength impacts locomotion, we studied the Atlantic mudskipper (Periophthalmus barbarus) moving on a thicker and a thinner mud, which differ in strength by a factor of two. The animal performed similar “crutching” walks on mud of both strengths, with only a slight reduction in speed on the thinner mud (from 0.39 ± 0.12 to 0.32 ± 0.14 body length/s, P < 0.05, ANOVA). However, it jumped more frequently on the thinner mud (from 1.2 ± 0.7 to 3.2 ± 1.6 times per minute, P < 0.05, ANOVA), likely because the thinner mud stuck more to its belly and fins, hindering walking.

Bio: Divya Ramesh is a fourth-year PhD student in Dr. Chen Li’s lab (Terradynamics lab). Her current work focuses on studying and understanding amphibious fish locomotion on wet deformable substrates. Her previous work focused on using contact sensing to study and understand limbless locomotion of snakes and snake robots on 3-D terrains. She received a BTech in Electronics and Communication Engineering from VIT University (Vellore, India) and an MSE in Electrical Engineering from the University of Pennsylvania. She has published in IEEE RA-L (presented at ICRA 2020) and presented at ICRA 2022. This work was presented at SICB 2023, where she was a finalist for Best Student Presentation in the Division of Comparative Biomechanics.

Gargi Sadalgekar “Template-level robophysical models for studying sustained terrestrial locomotion of amphibious fish”

Abstract: Studying terrestrial locomotion of amphibious fishes informs how early tetrapods may have invaded land. The water-land interface often has wet, deformable substrates like mud and sand that challenge locomotion. Recent progress has been made on understanding limbed and limbless tetrapod locomotion by studying robots as active physical models of model organisms. Robophysical models complement animals with their high controllability and repeatability for systematic experiments. They also complement theoretical and computational models because they enact physical laws in the real world, which is especially useful for studying locomotion in complex terrain. Here, we created the first robophysical models for studying sustained terrestrial locomotion of amphibious fishes on controlled mud as a model wet deformable substrate. Our three robots are at the template level (the lowest degree of freedom needed to generate a target locomotor behavior) and represent mudskippers, ropefish, and bichirs, which use appendicular, axial, and axial-appendicular strategies, respectively. The mudskipper robot rotated two fins in phase to raise the body and “crutch” forward on mud. The ropefish robot used body lateral undulation to “surface-swim” on mud. The bichir robot combined body undulation and out-of-phase fin rotations to “army-crawl” forward on mud. Each robot generated qualitatively similar locomotion on mud as its model organism. We are currently refining the robots and performing systematic experiments on mud of a wide range of strengths.

Bio: Gargi Sadalgekar is a 2nd-year master’s student in the Terradynamics Lab at Johns Hopkins University and is interested in developing bio-inspired robots to investigate locomotion in extreme environments. Her current work focuses on developing low-order robophysical models of amphibious fish to uncover general principles of locomotion over wet deformable substrates; this work was presented at SICB 2023, where she was a finalist for Best Student Presentation in the Division of Comparative Biomechanics. Gargi received a BSE in Mechanical and Aerospace Engineering from Princeton University with a minor in Robotics and Information Systems.

Yaqing Wang “Force sensing can help robots reconstruct potential energy landscape and guide locomotor transitions to traverse large obstacles”

Abstract: Legged robots already excel at maintaining stability during upright walking and running and at stepping over small obstacles. However, they must further traverse large obstacles comparable to body size to enable a broader range of applications, like search and rescue in rubble and sample collection in rocky Martian hills. Our lab’s recent research demonstrated that legged robots can traverse large obstacles if they can be destabilized to transition across various locomotor modes. When viewed on a potential energy landscape of the system, which results from physical interaction with obstacles, these locomotor transitions are strenuous barrier-crossing transitions between landscape basins. Because potential energy landscape gradients are closely related to terrain reaction forces and torques, we hypothesize that sensing obstacle interaction forces allows landscape reconstruction, which can guide robots to cross barriers at the saddle to make transitions more easily (analogous to crossing a mountain ridge at its saddle). Here, we created a robophysical model with custom 3-axis force sensors and surface contact sensors to measure forces and contacts during interaction with large obstacles. We found that the measured forces indeed captured potential energy landscape gradients well, and we could use the locally measured gradients to roughly reconstruct the potential energy landscape. Our future work will investigate how to enable robots to make locomotor transitions at the landscape saddle based on local landscape reconstruction.
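
As a toy illustration of that reconstruction idea (a 1D sketch under an assumed conservative interaction, not the lab’s method): force is the negative gradient of potential energy, F = -dU/dx, so integrating locally measured forces recovers the landscape up to an additive constant.

# Toy sketch: reconstructing a 1D potential energy landscape from sampled
# "force sensor" readings via F = -dU/dx (the landscape below is hypothetical).
import numpy as np

x = np.linspace(0.0, 1.0, 201)
U_true = np.sin(3 * np.pi * x) + 2.0 * x        # hypothetical landscape
F_meas = -np.gradient(U_true, x)                # simulated force measurements

# Integrate -F with the trapezoidal rule to recover U up to a constant:
dU = -0.5 * (F_meas[1:] + F_meas[:-1]) * np.diff(x)
U_rec = np.concatenate(([0.0], np.cumsum(dU)))
U_rec += U_true[0] - U_rec[0]                   # pin down the arbitrary constant

print(f"max reconstruction error: {np.abs(U_rec - U_true).max():.2e}")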

Bio: Yaqing Wang is a fourth-year PhD student in Dr. Chen Li’s lab (Terradynamics lab). His work focuses on understanding locomotor transitions in biological and bio-inspired terrestrial locomotion. He received a B.S. in Mechanical Engineering at Tsinghua University in China. He recently presented this work at the annual APS March Meeting.

LCSR Seminar: Hatice Gunes “Emotional Intelligence for Human-Embodied AI Interaction” @ Hackerman B17
Apr 12 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Abstract:

Emotional intelligence for artificial systems is not a luxury but a necessity. It is paramount for many applications that require both short- and long-term engaging human-technology interactions, including entertainment, hospitality, education, and healthcare. However, creating artificially intelligent systems and interfaces with social and emotional skills is a challenging task. Progress in industry and developments in academia give us a positive outlook; however, the artificial emotional intelligence of current technology is still quite limited. Creating technology with artificial emotional intelligence requires the development of perception, learning, action, and adaptation capabilities, and the ability to execute these pipelines in real time in human-AI interactions. Truly addressing these challenges relies on cross-fertilization of multiple research fields, including psychology, nonverbal behaviour understanding, psychiatry, vision, social signal processing, affective computing, and human-computer and human-robot interaction. My lab’s research has been pushing the state of the art in a wide spectrum of research topics in this area, including the design and creation of new datasets; novel feature representations and learning algorithms for sensing and understanding human nonverbal behaviours in solo, dyadic, and group settings; designing short- and long-term human-robot adaptive interactions for wellbeing; and creating algorithmic solutions to mitigate the bias that creeps into these systems.

In this talk, I will present the recent explorations of the Cambridge Affective Intelligence and Robotics Lab in these areas with insights for human embodied-AI interaction research.

Bio:

Hatice Gunes is a Professor of Affective Intelligence and Robotics (AFAR) and leads the AFAR Lab at the University of Cambridge’s Department of Computer Science and Technology. Her expertise is in the areas of affective computing and social signal processing, cross-fertilising research in multimodal interaction, computer vision, signal processing, machine learning, and social robotics. She has published over 155 papers in these areas (H-index = 36, citations > 7,300), with most recent works on lifelong learning for facial expression recognition, fairness, and affective robotics, and longitudinal HRI for wellbeing. She has served as an Associate Editor for IEEE Transactions on Affective Computing, IEEE Transactions on Multimedia, and Image and Vision Computing Journal, and has guest edited many Special Issues, the latest ones being the 2022 Int’l Journal of Social Robotics Special Issue on Embodied Agents for Wellbeing, the 2022 Frontiers in Robotics and AI Special Issue on Lifelong Learning and Long-Term Human-Robot Interaction, and the 2021 IEEE Transactions on Affective Computing Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data. Other research highlights include the Outstanding PC Award at ACM/IEEE HRI’23, RSJ/KROS Distinguished Interdisciplinary Research Award Finalist at IEEE RO-MAN’21, Distinguished PC Award at IJCAI’21, Best Paper Award Finalist at IEEE RO-MAN’20, Finalist for the 2018 Frontiers Spotlight Award, Outstanding Paper Award at IEEE FG’11, and Best Demo Award at IEEE ACII’09. Prof Gunes is a former President of the Association for the Advancement of Affective Computing (2017-2019), is the General Co-Chair of ACM ICMI’24, and was the General Co-Chair of ACII’19 and the Program Co-Chair of ACM/IEEE HRI’20 and IEEE FG’17. She was the Chair of the Steering Board of IEEE Transactions on Affective Computing (2017-2019) and was a member of the Human-Robot Interaction Steering Committee (2018-2021). Her research has been supported by various competitive grants, with funding from Google, the Engineering and Physical Sciences Research Council UK (EPSRC), Innovate UK, the British Council, the Alan Turing Institute, and EU Horizon 2020. In 2019 she was awarded a prestigious EPSRC Fellowship to investigate adaptive robotic emotional intelligence for wellbeing (2019-2024) and was named a Faculty Fellow of the Alan Turing Institute, the UK’s national centre for data science and artificial intelligence (2019-2021). Prof Gunes is currently a Staff Fellow of Trinity Hall, a Senior Member of the IEEE, and a member of the AAAC.