Calendar

Nov
2
Wed
LCSR Seminar: Kapil Katyal “Robot Manipulation and Navigation Research at JHU/APL” @ Hackerman B17
Nov 2 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract:

This talk will describe the robotics and AI activities and projects within JHU/APL’s Research and Exploratory Development Department. I will present motivating challenge problems faced by various defense, military, and medical sponsors across a number of government agencies. Further, I will highlight several research projects we are currently executing in the areas of robot manipulation, navigation, and human-robot interaction. Specifically, the projects will highlight areas including underwater manipulation, learned policies for off-road and complex-terrain navigation, human-robot interaction, heterogeneous robot teaming, and fixed-wing aerial navigation. Finally, I will present areas of future research and exploration and possible intersections with LCSR.

 

Bio:

Kapil Katyal is a principal researcher and robotics program manager in the Research and Exploratory Development Department at JHU/APL. He completed his PhD at JHU, advised by Greg Hager, on prediction and perception capabilities for robot navigation. He has worked at JHU/APL since 2007 on projects spanning robot manipulation, brain-machine interfaces, vision algorithms for retinal prosthetics, and robot navigation in complex terrain. He holds 5 patents and has co-authored over 30 publications in robotics and AI.

 

Nov
9
Wed
LCSR Seminar: Alessandro Roncone “Robots working with and around people” @ Hackerman B17
Nov 9 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract: Robots have begun to transition from assembly lines, where they are physically separated from humans, to human-populated environments and human-enhancing applications, where interaction with people is inevitable. With this shift, research in human-robot interaction (HRI) has grown to allow robots to work with and around humans on complex tasks, augment and enhance people, and provide the best possible support to them. In this talk, I will provide an overview of the work performed in the HIRO Group and our efforts toward intuitive, human-centered technologies for the next generation of robot workers, assistants, and collaborators. More specifically, I will present our research on: a) robots that are safe around people, b) robots that are capable of operating in complex environments, and c) robots that are good teammates. In all, this research will enable capabilities that were not previously possible, and will impact work domains such as manufacturing, construction, logistics, the home, and health care.

 

Bio: Alessandro Roncone is an Assistant Professor in the Computer Science Department at the University of Colorado Boulder. He received his B.Sc. summa cum laude in Biomedical Engineering in 2008 and his M.Sc. summa cum laude in NeuroEngineering in 2011 from the University of Genoa, Italy. In 2015 he completed his Ph.D. in Robotics, Cognition and Interaction Technologies at the Italian Institute of Technology (IIT), working on the iCub humanoid in the Robotics, Brain and Cognitive Sciences department and the iCub Facility. From 2015 to 2018, he was a Postdoctoral Associate at the Social Robotics Lab at Yale University, performing research in human-robot collaboration for advanced manufacturing. He joined the faculty at CU Boulder in August 2018, where he is director of the Human Interaction and Robotics Group (HIRO, https://hiro-group.ronc.one/) and co-director of the Interdisciplinary Research Theme in Engineering Education Research and AI-augmented Learning (EER-AIL IRT, https://www.colorado.edu/irt/engineering-education-ai/).

 

Nov
14
Mon
Special LCSR Seminar: Desire Pantalone “Robotic Surgery in Space” @ Malone G33/35
Nov 14 @ 11:00 am – 12:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract: The target of human spaceflight is now missions beyond low Earth orbit and the Lunar Gateway, for deep space exploration and missions to Mars. Several conditions have to be considered, such as the effects of weightlessness and radiation on the human body, behavioral health decrements, and communication latency. Telemedicine and telerobotic applications, as well as robot-assisted surgery, also have to be considered, with some hints on experimental surgical procedures carried out in previous missions. The need for greater crew autonomy in dealing with health issues is related to the increasing severity of the medical and surgical events that could occur on these missions, and the presence of a highly trained surgeon on board would be recommended. A surgical robot could be a valuable aid, but only insofar as it is provided with multiple functions, including the capability to perform certain procedures autonomously. Providing a multi-functional surgical robot is the new frontier. Research in this field will pave the way for new structured plans for human health in space, as well as provide new suggestions for clinical applications on Earth.

 

Bio: Dr. Desire Pantalone, MD, is a general surgeon with a particular interest in trauma and emergency surgery. She is a staff surgeon in the Unit of Emergency Surgery and part of the Trauma Team at the University Hospital Careggi in Florence, and a specialist in General Surgery and Vascular Surgery. She was previously a Research Associate at the University of Chicago (IL) for Oncological Surgery (Prof. M. Michelassi) and for Liver Transplantation and Hepatobiliary Surgery (Dr. J. Emond). She is an instructor for Advanced Trauma Operative Management (American College of Surgeons Committee on Trauma) and a Fellow of the American College of Surgeons. She is also a Core Board member responsible for “Studies on traumatic events and surgery” in the ESA Topical Team on “Tissue Healing in Space: Techniques for promoting and monitoring tissue repair and regeneration” for Life Science Activities.

 

Nov
16
Wed
LCSR Seminar: Student Seminar @ Hackerman B17
Nov 16 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Student 1: Maia Stiber “Supporting Effective HRI via Flexible Robot Error Management Using Natural Human Responses”

Abstract: Unexpected robot errors during human-robot interaction are inescapable; they can occur during any task and do not necessarily fit human expectations of possible errors. When left unmanaged, robot errors harm task performance and user trust, resulting in user unwillingness to work with a robot. Prior error management techniques often lack the versatility to appropriately address robot errors across tasks and error types, as they frequently rely on task- or error-specific information. In this presentation, I describe my work on techniques for creating flexible error management by leveraging natural human responses (social signals) to robot errors as input for error detection and classification across tasks, scenarios, and error types in physical human-robot interaction. I present an error detection method that uses facial reactions for real-time detection and temporal localization of robot errors during HRI, a flexible error-aware framework using traditional and social signal inputs that allows for improved error detection, and an exploration of the effects of robot error severity on natural human responses. I will end my talk by discussing how my current and future work further investigates the use of social signals in the context of HRI for flexible error detection and classification.

Bio: Maia Stiber is a Ph.D. candidate in the Department of Computer Science, co-advised by Dr. Chien-Ming Huang and Dr. Russell Taylor. Her work focuses on leveraging natural human responses to robot errors in an effort to develop flexible error management techniques in support of effective human-robot interaction.

 

 

Student 2: Akwasi Akwaboah “Neuromorphic Cognition and Neural Interfaces”

Abstract: I present research at the Computational Sensory-Motor Systems Lab at Johns Hopkins University, led by Ralph Etienne-Cummings, on two fronts: (1) Neuromorphic Cognition (NC), focused on the emulation of neural physiology at the algorithmic and hardware levels, and (2) Neural Interfaces (NI), with an emphasis on electronics for neural MicroElectrode Array (MEA) characterization. The motivation for the NC front is as follows. The human brain expends a mere 20 watts on learning and inference, orders of magnitude less than state-of-the-art large language models (GPT-3 and LaMDA). There is a need for sustainable AI hardware, as AI compute demand, which has been doubling roughly every 3.4 months, has drastically outpaced Moore’s law, i.e., a roughly two-year transistor doubling. Efforts here are geared toward realizing biologically plausible learning rules, such as Hebbian Spike-Timing-Dependent Plasticity (STDP), both algorithmically and in correspondingly low-power mixed analog-digital VLSI implementations. On the same front of achieving parsimonious artificial intelligence, we are investigating the outcomes of using our models of primate visual attention to selectively sparsify computation in deep neural networks. On the NI front, we are developing an open-source multichannel potentiostat with parallel data acquisition capability. This work holds implications for rapid characterization and monitoring of neural MEAs, which are often adopted in neural rehabilitation and in neuroscientific experiments. A standard characterization technique is Electrochemical Impedance (EI) Spectroscopy. However, the increasing channel counts of state-of-the-art MEAs (100x and 1000x) impose prolonged acquisition times when high spectral resolution is needed. Thus, a truly parallel EI spectrometer made available to the scientific community will reduce research time and cost.

Bio: Akwasi Akwaboah joined the Computational Sensory-Motor Systems (CSMS) Lab in Fall 2020 and is working toward his PhD. He received the MSE in Electrical Engineering from Johns Hopkins University, Baltimore, MD, in Summer 2022 en route to the PhD. He received the B.Sc. degree in Biomedical Engineering (First Class Honors) from the Kwame Nkrumah University of Science and Technology, Ghana, in 2017, and the M.S. degree in Electronics Engineering from Norfolk State University, Norfolk, VA, USA, in 2020. His master’s thesis focused on the formulation of a heuristically optimized computational model of a stem cell-derived cardiomyocyte, with implications for cardiac safety pharmacology. He subsequently worked in Dr. James Weiland’s BioElectronic Vision Lab at the University of Michigan, Ann Arbor, MI, USA, in 2020, where he collaborated on research in retinal prostheses, calcium imaging, and neural electrode characterization. His current interests include neuromorphic circuits and systems, bio-inspired algorithms, computational biology, and neural interfaces. On the lighter side, Akwasi loves to cook and listen to classical and Afrobeats music. He lives by Marie Curie’s quote: “Nothing in life is to be feared, it is only to be understood …”

 

Nov
30
Wed
LCSR Seminar: Careers in Robotics: A Panel Discussion With Experts From Industry and Academia @ Hackerman B17
Nov 30 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Panel Speaker 1: Erin Sutton, PhD

Guidance and Control Engineer at the JHU Applied Physics Laboratory

Ph.D. Mechanical Engineering 2017, M.S. Mechanical Engineering 2016

Erin Sutton is a mechanical engineer at Johns Hopkins Applied Physics Laboratory. She received a BS in mechanical engineering from the University of Dayton and an MS and a PhD in mechanical engineering from Johns Hopkins University. She spent a year at the Naval Air Systems Command designing flight simulators before joining APL in 2019. Her primary research interest is in enhancing existing guidance and control systems with autonomy, and her recent projects range from hypersonic missile defense to civil space exploration.

 

Panel Speaker 2: Star Kim, PhD

Job title and affiliation: Management Consultant at McKinsey & Company

Ph.D. Mechanical Engineering 2021

Star is an Associate at McKinsey & Company, a global business management consulting firm. At JHU, she worked on personalizing cardiac surgery by creating patient-specific vascular conduits in Dr. Axel Krieger’s IMERSE lab. She developed virtual reality software for doctors to design and evaluate conduits for each patient. Her team filed a patent and founded a startup together, which received funding from the State of Maryland. Before joining JHU, she was at the University of Maryland, College Park and the U.S. Food and Drug Administration, where she developed and tested patient-specific medical devices and systems such as virtual reality mental therapy and orthopedic surgical cutting guides.

 

Panel Speaker 3: Nicole Ortega, MSE

Senior Robotics and Controls Engineer at Johnson and Johnson, Robotics and Digital Solutions

JHU MSE Robotics 2018, JHU BS in Biomedical Engineering 2016

At Johnson and Johnson, Nicole works on the Robotics and Controls team to improve the accuracy of their laparoscopic surgery platform. Before joining J&J, Nicole worked as a contractor for NASA supporting Gateway and at Think Surgical supporting their next-generation knee arthroplasty robot.

 

Panel Speaker 4: Ryan Keating, MSE

Software Engineer at Nuro

BS Mechanical Engineering 2013, MSE Robotics 2014

Bio: After finishing my degrees at JHU, I spent two and a half years working at Carnegie Robotics, where I was primarily involved in the development of a land-mine sweeping robot and an inertial navigation system. Following a brief stint working at SRI International to prototype a sandwich-making robot system (yes, really), I have been working on the perception team at Nuro for the past four and a half years. I’ve had the opportunity to work on various parts of the perception stack over that time period, but my largest contributions have been to our backup autonomy system, our object tracking system, and the evaluation framework we use to validate changes to the perception system.

Dec
9
Fri
LCSR Winter Potluck/ Ugly Sweater Bash @ Levering Hall - Glass Pavilion
Dec 9 @ 5:00 pm – 7:00 pm

 

 

All LCSR members, their families, and significant others are invited to the:

 

Ugly (or normal) Sweater Bash
Friday, December 9th
5:00PM-7:00PM
Glass Pavilion

 

You can help by contributing your favorite holiday dish (regional specialties strongly encouraged!) to this potluck get-together (you don’t have to bring anything to participate). Main dishes will be provided, as will plates, napkins, utensils, etc. Click here to sign up.

 

There will be a gingerbread decorating contest and prizes for the best/ugliest sweater!

 

Jan
25
Wed
LCSR Seminar: Ugur Tumerdem “Recovering the Sense of Touch for Robotic Surgery and Surgical Training”
Jan 25 @ 12:00 pm – 1:00 pm

Recovering the Sense of Touch for Robotic Surgery and Surgical Training

 

By Ugur Tumerdem

Assistant Professor of Mechanical Engineering at Marmara University

Visiting Assistant Professor of Mechanical Engineering at Johns Hopkins University

Fulbright Visiting Research Scholar 2022/23

 

Abstract

 

While robotic surgery systems have revolutionized the field of minimally invasive surgery over the past 25 years, their biggest disadvantage since their inception has been the lack of haptic feedback to the surgeon. While controlling robotic instruments through teleoperation, surgeons operate without their sense of touch and rely only on visual feedback, which can result in unwanted complications.

 

In this seminar, I am going to talk about our recent and ongoing work to recover the lost sense of touch in robotic surgery through new motion control laws, haptic teleoperation and machine learning algorithms, as well as novel mechanism design. Major hurdles to providing reliable haptic feedback in robotic surgery systems are the difficulty of obtaining reliable force measurements/estimations from robotic laparoscopic instruments and the lack of transparent teleoperation architectures that can guarantee stability under environment uncertainty or communication delays. I will talk about our approaches to solving these issues and about our ongoing project to achieve haptic feedback on the da Vinci Research Kit. As an extension of the technology we are developing, I will also discuss how the proposed haptic control approaches can be used to connect multiple surgeons through haptic interfaces to enable a new haptic training approach in surgical robotics.

 

Bio

Ugur Tumerdem is an Assistant Professor of Mechanical Engineering at Marmara University, Istanbul, Turkey, and a Visiting Professor of Mechanical Engineering at Johns Hopkins University as the recipient of a Fulbright Visiting Research Fellowship for the 2022/2023 academic year. Prof. Tumerdem received his B.Sc. in Mechatronics Engineering from Sabanci University, Istanbul, Turkey in 2005, and his M.Sc. and Ph.D. degrees in Integrated Design Engineering from Keio University, Tokyo, Japan in 2007 and 2010, respectively. He worked as a postdoctoral researcher at IBM Research – Tokyo in 2011 before joining Marmara University.

 

Jan
26
Thu
GSA Ice Skating Social
Jan 26 @ 6:00 pm – 7:30 pm

We are super excited for you to join us at our first LCSR social event of this spring term! We are planning to get together for an ice-skating session on Thursday, January 26th at 6:00 pm at the JHU ice rink, followed by an informal happy hour at the Charles Village Pub (we are not providing food or drinks this time). The ice rink on the night of the 26th is dedicated to JHU grad students, so it’s a good opportunity to mingle with peeps from other departments as well! If you are interested in joining us, please sign up on this Google form – we will be taking people on a first-come, first-served basis.

 

We currently have 27 available tickets, open only to LCSR students. However, you are free to bring extra guests by signing them up yourself at this link (please read through JHU’s policy on bringing non-JHU-affiliated guests, available on their website). We will meet at the Hackerman breezeway at 5:40 pm to head over together as a group, because all tickets are registered under three of our committee members’ names. The skating session will last 1.5 hours.

 

FAQs:

  1. Do I need to know how to skate? Nope. You are all welcome to join, no matter how much or how little you know about ice skating!
  2. Do I need to bring anything? No. Come as you are! JHU will provide skates, and that’s all you’re going to need. Just wear thicker socks for added comfort/padding. But you’re welcome to bring your own skates or protective equipment (knee/butt pads) if you wish.

 

Lastly, we wanted to emphasize that the aforementioned date is TENTATIVE and weather dependent. Should the clouds bless us with rain on that Thursday, we will need to postpone the event. We will send an email on Monday January 23rd to confirm the final date, but it will most likely be a Thursday or Friday either the week of January 23 or 30.

 

Looking forward to cruising with you soon ⛸️⛸️!

Feb
1
Wed
LCSR Seminar: Dana Yoerger “Recent Results and Future Challenges for Autonomous Underwater Vehicles in Ocean Exploration” @ Hackerman B17
Feb 1 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract:

Recent Results and Future Challenges for Autonomous Underwater Vehicles in Ocean Exploration

Dr. Dana R. Yoerger

Senior Scientist

Dept of Applied Ocean Physics and Engineering

Woods Hole Oceanographic Institution

 

In the past two decades, engineers and scientists have used robots to study basic processes in the deep ocean, including at the Mid-Ocean Ridge, coral habitats, volcanoes, and the deepest trenches. We have also used such vehicles to investigate the environmental impact of the Deepwater Horizon oil spill and to investigate ancient and modern shipwrecks. More recently, we are expanding our efforts to include the mesopelagic or “twilight zone,” which extends from about 200 to 1000 m depth, where sunlight ceases to penetrate. This regime is particularly under-explored and poorly understood, due in large part to the logistical and technological challenges of accessing it. However, knowledge of this vast region is critical for many reasons, including understanding the global carbon cycle – and Earth’s climate – and managing biological resources. This talk will show results from our past expeditions and look to future challenges.

 

Bio:

Dr. Dana Yoerger is a Senior Scientist at the Woods Hole Oceanographic Institution and a researcher in robotics and autonomous vehicles. He supervises the research and academic program of graduate students studying oceanographic engineering through the MIT/WHOI Joint Program in the areas of control, robotics, and design. Dr. Yoerger has been a key contributor to the remotely operated vehicle Jason; the Autonomous Benthic Explorer, known as ABE; the autonomous underwater vehicle Sentry; the hybrid remotely operated vehicle Nereus, which reached the bottom of the Mariana Trench in 2009; and, most recently, Mesobot, a hybrid robot for midwater exploration. Dr. Yoerger has gone to sea on over 90 oceanographic expeditions, exploring the Mid-Ocean Ridge, mapping underwater seamounts and volcanoes, surveying ancient and modern shipwrecks, studying the environmental effects of the Deepwater Horizon oil spill, and participating in the recent effort that located the Voyage Data Recorder from the merchant vessel El Faro. His current research focuses on robots for exploring the midwater regions of the world’s ocean. Dr. Yoerger has served on several National Academies committees and is a member of the Research Board of the Gulf of Mexico Research Initiative. He has a PhD in mechanical engineering from the Massachusetts Institute of Technology and is a Fellow of the IEEE.

 

Feb
6
Mon
GSA: Bagel Day
Feb 6 @ 10:30 am – 1:00 pm

Bagels!