Calendar

Nov 30, Wed
LCSR Seminar: Careers in Robotics: A Panel Discussion With Experts From Industry and Academia @ Hackerman B17
Nov 30 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Panel Speaker 1: Erin Sutton, PhD

Guidance and Control Engineer at the JHU Applied Physics Laboratory

Ph.D. Mechanical Engineering 2017, M.S. Mechanical Engineering 2016

Erin Sutton is a mechanical engineer at the Johns Hopkins Applied Physics Laboratory. She received a BS in mechanical engineering from the University of Dayton and an MS and PhD in mechanical engineering from Johns Hopkins University. She spent a year at the Naval Air Systems Command designing flight simulators before joining APL in 2019. Her primary research interest is in enhancing existing guidance and control systems with autonomy, and her recent projects range from hypersonic missile defense to civil space exploration.

 

Panel Speaker 2: Star Kim, PhD

Management Consultant at McKinsey & Company

Ph.D. Mechanical Engineering 2021

Star is an Associate at McKinsey & Company, a global business management consulting firm. At JHU, she worked on personalizing cardiac surgery by creating patient-specific vascular conduits in Dr. Axel Krieger’s IMERSE lab. She developed virtual reality software for doctors to design and evaluate conduits for each patient. Her team filed a patent and founded a startup together, which received funding from the State of Maryland. Before joining JHU, she was at the University of Maryland, College Park, and the U.S. Food and Drug Administration. There, she developed and tested patient-specific medical devices and systems such as virtual reality mental therapy and orthopedic surgical cutting guides.

 

Panel Speaker 3: Nicole Ortega, MSE

Senior Robotics and Controls Engineer at Johnson and Johnson, Robotics and Digital Solutions

JHU MSE Robotics 2018, JHU BS in Biomedical Engineering 2016

At Johnson and Johnson, Nicole works on the Robotics and Controls team to improve the accuracy of their laparoscopic surgery platform. Before joining J&J, Nicole worked as a contractor for NASA supporting Gateway and at Think Surgical supporting their next-generation knee arthroplasty robot.

 

Panel Speaker 4: Ryan Keating, MSE

Software Engineer at Nuro

BS Mechanical Engineering 2013, MSE Robotics 2014

Bio: After finishing my degrees at JHU, I spent two and a half years working at Carnegie Robotics, where I was primarily involved in the development of a land-mine sweeping robot and an inertial navigation system. Following a brief stint working at SRI International to prototype a sandwich-making robot system (yes, really), I have been working on the perception team at Nuro for the past four and a half years. I’ve had the opportunity to work on various parts of the perception stack over that time period, but my largest contributions have been to our backup autonomy system, our object tracking system, and the evaluation framework we use to validate changes to the perception system.

Dec 9, Fri
LCSR Winter Potluck / Ugly Sweater Bash @ Levering Hall - Glass Pavilion
Dec 9 @ 5:00 pm – 7:00 pm

 

 

All LCSR members, their families, and significant others are invited to the:

 

Ugly (or normal) Sweater Bash
Friday, December 9th
5:00 PM – 7:00 PM
Glass Pavilion

 

You can help by contributing your favorite holiday dish (regional specialties strongly encouraged!) to this potluck get-together (you don’t have to bring anything to participate). Main dishes will be provided, as will plates, napkins, utensils, etc. Click here to sign up.

 

There will be a gingerbread decorating contest and prizes for best/ugliest sweater!

Jan 25, Wed
LCSR Seminar: Ugur Tumerdem “Recovering the Sense of Touch for Robotic Surgery and Surgical Training”
Jan 25 @ 12:00 pm – 1:00 pm

Recovering the Sense of Touch for Robotic Surgery and Surgical Training

 

By Ugur Tumerdem

Assistant Professor of Mechanical Engineering at Marmara University

Visiting Assistant Professor of Mechanical Engineering at Johns Hopkins University

Fulbright Visiting Research Scholar 2022/23

 

Abstract

 

While robotic surgery systems have revolutionized the field of minimally invasive surgery over the past 25 years, their biggest disadvantage since their inception has been the lack of haptic feedback to the surgeon. While teleoperating robotic instruments, surgeons operate without their sense of touch and rely only on visual feedback, which can result in unwanted complications.

 

In this seminar, I will talk about our recent and ongoing work to recover the lost sense of touch in robotic surgery through new motion control laws, haptic teleoperation and machine learning algorithms, and novel mechanism design. The major hurdles to providing reliable haptic feedback in robotic surgery systems are the difficulty of obtaining reliable force measurements or estimates from robotic laparoscopic instruments and the lack of transparent teleoperation architectures that can guarantee stability under environment uncertainty or communication delays. I will discuss our approaches to solving these issues and our ongoing project to achieve haptic feedback on the da Vinci Research Kit. As an extension of the technology we are developing, I will also discuss how the proposed haptic control approaches can be used to connect multiple surgeons through haptic interfaces, enabling a new haptic training approach in surgical robotics.

 

Bio

Ugur Tumerdem is an Assistant Professor of Mechanical Engineering at Marmara University, Istanbul, Turkey, and a Visiting Professor of Mechanical Engineering at Johns Hopkins University as the recipient of a Fulbright Visiting Research Fellowship for the 2022/2023 academic year. Prof. Tumerdem received his B.Sc. in Mechatronics Engineering from Sabanci University, Istanbul, Turkey, in 2005, and his M.Sc. and Ph.D. degrees in Integrated Design Engineering from Keio University, Tokyo, Japan, in 2007 and 2010, respectively. He worked as a postdoctoral researcher at IBM Research – Tokyo in 2011 before joining Marmara University.

 

 

 

Jan 26, Thu
GSA Ice Skating Social
Jan 26 @ 6:00 pm – 7:30 pm

We are super excited for you to join us at our first LCSR social event of this spring term! We are planning to get together for an ice-skating session on Thursday, January 26th @ 6:00 pm at the JHU ice rink, followed by an informal happy hour at the Charles Village Pub (we are not providing food or drinks this time). The ice rink on the night of the 26th is dedicated to JHU grad students, so it’s a good opportunity to mingle with peeps from other departments as well! If you are interested in joining us, please sign up on this Google form – we will be taking people on a first come, first served basis.

 

We currently have 27 available tickets, open only to LCSR students. However, you are free to bring extra guests by signing them up yourselves at this link (please read through JHU’s policy on bringing non-JHU-affiliated guests on their website). We will meet at the Hackerman breezeway at 5:40 pm to head over together as a group, because all tickets are registered under 3 of our committee members’ names. The skating session will last 1.5 hours.

 

FAQs:

  1. Do I need to know how to skate? Nope. You are all welcome to join, no matter how much or how little you know about ice-skating!
  2. Do I need to bring anything? No. Come as you are! JHU will provide skates; that’s all you’re going to need. Just wear thicker socks for added comfort/padding. But you’re welcome to bring your own skates or protective equipment (knee/butt pads) if you wish.

 

Lastly, we wanted to emphasize that the aforementioned date is TENTATIVE and weather-dependent. Should the clouds bless us with rain on that Thursday, we will need to postpone the event. We will send an email on Monday, January 23rd, to confirm the final date, but it will most likely be a Thursday or Friday during the week of January 23 or the week of January 30.

 

Looking forward to cruising with you soon ⛸️⛸️!

Feb 1, Wed
LCSR Seminar: Dana Yoerger “Recent Results and Future Challenges for Autonomous Underwater Vehicles in Ocean Exploration” @ Hackerman B17
Feb 1 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract:

Recent Results and Future Challenges for Autonomous Underwater Vehicles in Ocean Exploration

Dr. Dana R. Yoerger

Senior Scientist

Dept of Applied Ocean Physics and Engineering

Woods Hole Oceanographic Institution

 

In the past two decades, engineers and scientists have used robots to study basic processes in the deep ocean, including the Mid-Ocean Ridge, coral habitats, volcanoes, and the deepest trenches. We have also used such vehicles to investigate the environmental impact of the Deepwater Horizon oil spill and to survey ancient and modern shipwrecks. More recently, we are expanding our efforts to include the mesopelagic, or “twilight zone,” which extends from about 200 to 1000 m depth, where sunlight ceases to penetrate. This regime is particularly under-explored and poorly understood, due in large part to the logistical and technological challenges of accessing it. However, knowledge of this vast region is critical for many reasons, including understanding the global carbon cycle – and Earth’s climate – and managing biological resources. This talk will show results from our past expeditions and look to future challenges.

 

Bio:

Dr. Dana Yoerger is a Senior Scientist at the Woods Hole Oceanographic Institution and a researcher in robotics and autonomous vehicles. He supervises the research and academic program of graduate students studying oceanographic engineering through the MIT/WHOI Joint Program in the areas of control, robotics, and design. Dr. Yoerger has been a key contributor to the remotely operated vehicle Jason; the Autonomous Benthic Explorer, ABE; the autonomous underwater vehicle Sentry; the hybrid remotely operated vehicle Nereus, which reached the bottom of the Mariana Trench in 2009; and, most recently, Mesobot, a hybrid robot for midwater exploration. Dr. Yoerger has gone to sea on over 90 oceanographic expeditions, exploring the Mid-Ocean Ridge, mapping underwater seamounts and volcanoes, surveying ancient and modern shipwrecks, studying the environmental effects of the Deepwater Horizon oil spill, and participating in the recent effort that located the Voyage Data Recorder from the merchant vessel El Faro. His current research focuses on robots for exploring the midwater regions of the world’s ocean. Dr. Yoerger has served on several National Academies committees and is a member of the Research Board of the Gulf of Mexico Research Initiative. He has a PhD in mechanical engineering from the Massachusetts Institute of Technology and is a Fellow of the IEEE.

 

 

 

Feb 6, Mon
GSA: Bagel Day
Feb 6 @ 10:30 am – 1:00 pm

Bagels!

Feb 8, Wed
LCSR Seminar: Mark Savage “Resumes” @ Hackerman B17
Feb 8 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Mark Savage is the Johns Hopkins Life Design Educator for Engineering Master’s Students, advising on all aspects of career development and the internship/job search, with the Handshake Career Management System as a necessary tool. Look for weekly newsletters, soon to be emailed to Homewood WSE Master’s students on Sunday nights.

 

 

 

Feb 15, Wed
LCSR Seminar: Brent Gillespie “Predicting Human Behavior in Predictable Environments Using the Internal Model Principle” @ Hackerman B17
Feb 15 @ 12:00 pm – 1:00 pm

 

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract:

All models are wrong, and too many are directed inward. The Internal Model Principle of control engineering directs our attention (and modeling proficiency) to what makes the world around us patterned and predictable. It says that driving a model of that patterned or predictable behavior in a feedback loop is the only way to achieve perfect tracking or disturbance rejection. In the spirit of “some models are useful,” I will present a control system model of humans tracking moving targets on a screen using a mouse and cursor. Simple analyses reveal this controller’s robustness to visual blanking, and experiments (even experiments conducted remotely during the pandemic) provide ample support. Extensions that combine feedforward and feedback control complete the picture and complement the existing literature in human motor behavior, most of which focuses on modeling the system under control rather than the environment.
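As a minimal illustrative sketch of the Internal Model Principle (not taken from the talk; the plant, gains, and signals below are assumed): a proportional controller alone leaves a steady-state error against a constant disturbance, while adding an integrator, i.e., an internal model of a constant signal, drives the error to zero.

```python
# Illustrative sketch of the Internal Model Principle (assumed plant and gains,
# not from the seminar): rejecting a constant disturbance requires the controller
# to contain a model of that disturbance -- here, an integrator.

def simulate(use_integrator: bool, steps: int = 5000, dt: float = 0.01) -> float:
    """Simulate the first-order plant x' = -x + u + d under feedback control."""
    x = 0.0            # plant state
    integ = 0.0        # integrator state (internal model of a constant signal)
    ref, d = 1.0, 0.5  # constant reference and constant disturbance
    kp, ki = 4.0, 3.0  # proportional and integral gains
    for _ in range(steps):
        e = ref - x
        integ += e * dt
        u = kp * e + (ki * integ if use_integrator else 0.0)
        x += (-x + u + d) * dt   # forward-Euler step of the plant
    return ref - x               # remaining tracking error

print(f"P only : steady-state error = {simulate(False):+.4f}")  # about +0.10
print(f"P + I  : steady-state error = {simulate(True):+.4f}")   # about  0.00
```

In the setting described in the abstract, the internal model would instead capture the target’s predictable motion pattern (e.g., a sinusoid), but the same feedback-loop structure applies.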

Bio:

Brent Gillespie is a Professor of Mechanical Engineering and Robotics at the University of Michigan. He received a Bachelor of Science in Mechanical Engineering from the University of California Davis in 1986, a Master of Music from the San Francisco Conservatory of Music in 1989, and a Ph.D. in Mechanical Engineering from Stanford University in 1996. His research interests include haptic interface, human motor behavior, haptic shared control, and robot-assisted rehabilitation after neurological injury. Prof. Gillespie’s awards include the Popular Science Invention Award (2016), the University of Michigan Provost’s Teaching Innovation Prize (2012), and the Presidential Early Career Award for Scientists and Engineers (2001).

 

Feb 22, Wed
LCSR Seminar: Joshua Mangelson “Steps Towards Intelligent Autonomous Underwater Inspection and Data Collection” @ Hackerman B17
Feb 22 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

 

Abstract:

Over 70% of our world is underwater, but less than 1% of the world’s oceans have been mapped at resolutions finer than 100 m per pixel. Regular inspection, mapping, and data collection in marine environments are essential for a whole host of reasons, including gaining a scientific understanding of our planet, maintaining civil infrastructure, and navigating safely. However, manual inspection and data collection using divers is expensive, dangerous, time-consuming, and tedious work.

 

In this talk, I will discuss the use of autonomous underwater vehicles (AUVs) and autonomous surface vessels (ASVs) to automatically and intelligently map, inspect, and collect information in unstructured marine environments. In particular, we will discuss the problems present in this space as well as the contributions my lab is making towards addressing these problems, including i) the development of a general-purpose marine robotics testbed at BYU, ii) the development of a marine robotics simulator called HoloOcean (https://holoocean.readthedocs.io/en/stable/), iii) advancements in marine robotic localization using Lie groups, and iv) preliminary results towards expert-guided topic modeling and intelligent data collection.

 

Bio:

Dr. Joshua Mangelson holds PhD and Master’s degrees in Robotics from the University of Michigan. After completing his degrees, he served as a post-doctoral fellow at Carnegie Mellon University before joining the Electrical and Computer Engineering faculty at Brigham Young University in 2020. He has demonstrated expertise in robotic perception, mapping, and localization, with a particular focus on marine robotics. He has extensive experience leading marine robotic field trials in various locations around the world, including San Diego, Hawaii, Boston, northern Michigan, and Utah. In 2018, his work on multi-robot mapping received the Best Multi-Robot Paper Award at the IEEE ICRA conference and first place in the IEEE OCEANS Student Poster Competition. He is currently serving as an associate editor for The International Journal of Robotics Research (IJRR) and the IEEE/RSJ IROS Conference.

 

Mar 1, Wed
LCSR Seminar: Student Seminars @ Hackerman B17
Mar 1 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Ulas Berk Karli and Shiye (Sally) Cao “What if it is wrong: effects of power dynamics and trust repair strategy on trust and compliance in HRI.”

Abstract: Robotic systems designed to work alongside people are susceptible to technical and unexpected errors. Prior work has investigated a variety of strategies aimed at repairing people’s trust in the robot after its erroneous operations. In this work, we explore the effect of post-error trust repair strategies (promise and explanation) on people’s trust in the robot under varying power dynamics (supervisor and subordinate robot). Our results show that, regardless of the power dynamics, promise is more effective at repairing user trust than explanation. Moreover, people found a supervisor robot with verbal trust repair to be more trustworthy than a subordinate robot with verbal trust repair. Our results further reveal that people are prone to complying with the supervisor robot even if it is wrong. We discuss the ethical concerns in the use of a supervisor robot and potential interventions to prevent improper compliance in users, toward more productive human-robot collaboration.

 

Bio: Ulas Berk Karli is an MSE student in Robotics at LCSR, Johns Hopkins University. He received his Bachelor of Science degree in Mechanical Engineering, with a double major in Computer Engineering, from Koc University, Istanbul, in 2021. His research interests are human-robot collaboration and robot learning for HRI.

Shiye Cao is a first-year Ph.D. student in the Department of Computer Science, co-advised by Dr. Chien-Ming Huang and Dr. Anqi Liu. She received her Bachelor of Science degree in Computer Science, with a second major in Applied Mathematics and Statistics, from Johns Hopkins University in 2021, and her Master of Science in Engineering in Computer Science from Johns Hopkins University in 2022. Her work focuses on user trust and reliance in human-machine collaborative tasks.

 

 

Eugene Lin “Robophysical modeling of spider vibration sensing of prey on orb webs”

Abstract: Orb-weaving spiders are functionally blind and detect prey-generated web vibrations through vibration sensors at their leg joints to locate and identify prey caught in their (near) planar webs. Previous studies focused on how spiders use web geometry, silk properties, and web pre-tension to modulate vibration sensing. Spiders can also dynamically adjust their posture while sensing prey, which may be a form of active sensing (Hung, Corver, Gordus, 2022, APS March Meeting). However, whether this is true and how it works is poorly understood, due to the difficulty of measuring the dynamics of the entire prey-web-spider interaction system all at once. Here, we developed a robophysical model of the system to test this hypothesis of active sensing and discover its principles. Our model consists of a vibrating prey robot and a spider robot that can adjust its posture, with torsional springs at its leg joints and accelerometers to measure joint vibration. Both robots are attached to a physical web made of cords with qualitatively similar properties to real spider web threads. Load cells measure web pre-tension, and a high-speed camera system measures web vibrations and robot movement. Preliminary results showed vibration attenuation through the web from the prey robot. We are currently studying the complex effects of the spider robot’s dynamic posture change on vibration propagation across the web and leg joints by systematically varying the parameters of prey robot vibration, spider robot leg posture, and web pre-tension.

 

Bio: Eugene Lin is a third-year PhD student in Dr. Chen Li’s lab (Terradynamics Lab). His work focuses on understanding environmental sensing on suspended, sparse terrain. He received a B.S. in Mechanical Engineering from the University of California, San Diego. He recently presented this work at the annual SICB conference and will present it again at the APS March Meeting.

 

 

Aishwarya Pantula “Pick a Side: Untethered Gel Crawlers That Can Break Symmetry”

Abstract: The development of untethered soft crawling robots that are programmed to respond to environmental stimuli and precisely maneuverable across size scales has been paramount to the fields of soft robotics, drug delivery, and autonomous smart devices. Of particular relevance are reversible thermoresponsive hydrogels, which swell and shrink in the 30–60 °C temperature range, for operating such untethered soft robots under human physiological and ambient conditions. While crawling has been demonstrated with thermoresponsive hydrogels, they need surface modifications in the form of ratchets, asymmetric patterning, or constraints to achieve unidirectional motion.

Here we demonstrate and validate a new mechanism for untethered, unidirectional crawling in multisegmented gel crawlers built from an active thermoresponsive poly(N-isopropylacrylamide) (pNIPAM) and a passive polyacrylamide (pAAM) on flat, unpatterned surfaces. By connecting bilayers of different geometries and thicknesses using a centrally suspended gel linker, we create a morphological gradient along the fore-aft axis, which leads to an asymmetry in the contact forces during the swelling and deswelling of our crawler. We explain our mechanism using experiments and finite element simulations, and demonstrate experimentally that we can tune the generated asymmetry and, in turn, increase the displacement of the crawler by varying the linker stiffness, morphology, and number of bilayer segments. We believe this mechanism can be widely applied across fields of study to create the next generation of autonomous shape-changing and smart locomotors.

Bio: Aishwarya is a fourth-year Ph.D. candidate in the lab of Dr. David Gracias at Johns Hopkins University, USA. Her research focuses on exploring smart materials like stimuli-responsive hydrogels, combining them with novel patterning methods like 3D/4D printing, imprint molding, lithography, etc., and using different mechanical design strategies to create untethered biomimetic actuators and locomotors across size scales for soft robotics and biomedical devices.

 

 

Maia Stiber “On using social signals to enable flexible error-aware HRI.”

Abstract: Prior error management techniques often do not possess the versatility to appropriately address robot errors across tasks and scenarios. Their fundamental framework involves explicit, manual error management and implicit error management driven by domain-specific information, tailoring their response to specific interaction contexts. We present a framework for approaching error-aware systems by adding implicit social signals as another information channel to create more flexibility in application. To support this notion, we introduce a novel dataset (composed of three data collections) focused on understanding natural facial action unit (AU) responses to robot errors during physical human-robot interactions, varying across task, error, people, and scenario. Analysis of the dataset reveals that, through the lens of error detection, using AUs as input to error management affords flexibility to the system and has the potential to improve the error detection response rate. In addition, we provide an example real-time interactive robot error management system using the error-aware framework.

 

Bio: Maia Stiber is a fourth-year Ph.D. candidate in the Department of Computer Science, co-advised by Dr. Chien-Ming Huang and Dr. Russell Taylor. She received a B.S. in Computer Science from Caltech in 2019 and an M.S.E. in Computer Science from Johns Hopkins University in 2021. Her work focuses on leveraging natural human responses to robot errors in an effort to develop flexible error management techniques in support of effective human-robot interaction.

 

Victor Antony “Co-designing with older adults, for older adults: robots to promote physical activity.”

Abstract: Lack of physical activity has severe negative health consequences for older adults and limits their ability to live independently. Robots have been proposed to help engage older adults in physical activity (PA), albeit with limited success. There is a lack of robust understanding of older adults’ needs and wants from robots designed to engage them in PA. In this paper, we report on the findings of a co-design process where older adults, physical therapy experts, and engineers designed robots to promote PA in older adults. We found a variety of motivators for and barriers against PA in older adults; we then conceptualized a broad spectrum of possible robotic support and found that robots can play various roles to help older adults engage in PA. This exploratory study elucidated several overarching themes and emphasized the need for personalization and adaptability. This work highlights key design features that researchers and engineers should consider when developing robots to engage older adults in PA, and underscores the importance of involving various stakeholders in the design and development of assistive robots.

 

Bio: Victor Antony is a second-year Ph.D. student in the Department of Computer Science, advised by Dr. Chien-Ming Huang. He received the Bachelor of Science degree in Computer Science from the University of Rochester in 2021. His work focuses on Social Robots for well-being.