Calendar

Mar
8
Wed
LCSR Seminar: Allison Okamura “Wearable Haptic Devices for Ubiquitous Communication” @ Hackerman B17
Mar 8 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

Allison Okamura: “Wearable Haptic Devices for Ubiquitous Communication”

Abstract:

Haptic devices allow touch-based information transfer between humans and intelligent systems, enabling communication in a salient but private manner that frees other sensory channels. For such devices to become ubiquitous, their physical and computational aspects must be intuitive and unobtrusive. The amount of information that can be transmitted through touch is limited in large part by the location, distribution, and sensitivity of human mechanoreceptors. Not surprisingly, many haptic devices are designed to be held or worn at the highly sensitive fingertips, yet stimulation using a device attached to the fingertips precludes natural use of the hands. Thus, we explore the design of a wide array of haptic feedback mechanisms, ranging from devices that can be actively touched by the fingertips to multi-modal haptic actuation mounted on the arm. We demonstrate how these devices are effective in virtual reality, human-machine communication, and human-human communication.

 

Bio:

Allison Okamura received the BS degree from the University of California at Berkeley, and the MS and PhD degrees from Stanford University. She is the Richard W. Weiland Professor of Engineering at Stanford University in the mechanical engineering department, with a courtesy appointment in computer science. She is an IEEE Fellow, was the co-general chair of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems, and is a deputy director of the Wu Tsai Stanford Neurosciences Institute. Her awards include the IEEE Engineering in Medicine and Biology Society Technical Achievement Award, the IEEE Robotics and Automation Society Distinguished Service Award, and the Duca Family University Fellow in Undergraduate Education. Her academic interests include haptics, teleoperation, virtual reality, medical robotics, soft robotics, rehabilitation, and education. For more information, please see the CHARM Lab website.

Mar
15
Wed
LCSR Seminar: Debra Mathews “Ethics and Governance of Emerging Technologies” @ Hackerman B17
Mar 15 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract: From genetic engineering to direct-to-consumer neurotechnology to ChatGPT, it is a standard refrain that science outpaces the development of ethical norms and governance. Further, technologies increasingly cross boundaries from medicine to the consumer market to law enforcement and beyond, in ways that our existing governance structures are not equipped to address. Finally, our standard governance approaches to addressing ethical issues related to new technologies fail to address population- and societal-level impacts. This talk will demonstrate the above through a series of examples and describe ongoing work by the US National Academies and others to address these challenges.

 

Bio: Debra JH Mathews, PhD, MA, is the Associate Director for Research and Programs for the Johns Hopkins Berman Institute of Bioethics, and an Associate Professor in the Department of Genetic Medicine, Johns Hopkins University School of Medicine. Within the JHU Institute for Assured Autonomy, Dr. Mathews serves as the Ethics & Governance Lead. Her academic work focuses on ethics and policy issues raised by emerging technologies, with a particular focus on genetics, stem cell science, neuroscience, synthetic biology, and artificial intelligence. Dr. Mathews helped found and lead The Hinxton Group, an international collective of scientists, ethicists, policymakers, and others interested in ethical and well-regulated science, whose work focuses primarily on stem cell research. She has been a member of the Board of Directors of the International Neuroethics Society since 2015 and is currently President-Elect. In addition to her academic work, Dr. Mathews has spent time at the Genetics and Public Policy Center, the US Department of Health and Human Services, the Presidential Commission for the Study of Bioethical Issues, and the National Academy of Medicine, working in various capacities on science policy.

Dr. Mathews earned her PhD in genetics from Case Western Reserve University, as well as a concurrent Master’s in bioethics. She completed a Post-Doctoral Fellowship in genetics at Johns Hopkins, and the Greenwall Fellowship in Bioethics and Health Policy at Johns Hopkins and Georgetown Universities.

 

 

Mar
24
Fri
2023 JHU Robotics Industry Day @ Levering Hall - Glass Pavilion
Mar 24 @ 9:00 am – 4:00 pm

2023 Industry Day Agenda/Program

Zoom Link for Morning Session

Friday 3/24 Location: Glass Pavilion – Levering Hall
8:30 AM Registration Open and Breakfast
9:00 AM Welcome
9:05 AM Introduction to LCSR – Russell H. Taylor, Director
9:20 AM LCSR Education – Louis Whitcomb, Deputy Director
9:25 AM IAA – James Bellingham and Anton Dahbura
9:30 AM Student Research Talk – Max Li
9:42 AM Student Research Talk – Divya Ramesh
9:55 AM Student Research Talk – Michael Kam
10:07 AM Student Research Talk – Di Cao
10:20 AM Coffee Break
10:40 AM Johns Hopkins Tech Ventures – Seth Zonies
10:55 AM Industry Talk – Ankur Kapoor, Siemens
11:15 AM Industry Talk – William Tan, GE
11:35 AM New LCSR Faculty – Alejandro Martin-Gomez
11:55 AM Closing – Russell H. Taylor, Director
12:00 PM Lunch – Resume Roundtables
1:30-4:00 PM Poster and Demo Session (Hackerman Hall)
1:45-3:45 PM Guided Krieger Hall Tours (meet outside Hackerman 134)
4:00-5:00 PM Alumni Reception (Shriver Hall – Clipper Room)

 

The Laboratory for Computational Sensing and Robotics will highlight its elite robotics students and showcase cutting-edge research projects in areas that include Medical Robotics, Extreme Environments Robotics, Human-Machine Systems, BioRobotics and more.

Robotics Industry Day will provide top companies and organizations in the private and public sectors with access to the LCSR’s forward-thinking, solution-driven students. The event will also serve as an informal opportunity to explore university-industry partnerships.

You will experience dynamic presentations and discussions, observe live demonstrations, and participate in speed networking sessions that afford you the opportunity to meet Johns Hopkins’ most talented robotics students before they graduate.

Please contact Ashley Moriarty if you have any questions.




 

Mar
29
Wed
LCSR Seminar: Brett Hobson “The development of robots for open ocean ecology” @ Hackerman B17
Mar 29 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract:

The open ocean is a massive 3D ecosystem responsible for absorbing much of Earth’s excess heat and the CO2 emissions produced by humans. A portion of the ocean’s carbon pump sequesters atmospheric carbon into the sediments of the deep sea. Quantifying the amount of this carbon exported to the deep and identifying the variables driving that export is vital to understanding how we might better mitigate the deleterious effects of climate change. The Monterey Bay Aquarium Research Institute (MBARI) has developed high-endurance mobile robots to investigate ocean carbon transport. One of its vehicles, the Benthic Rover, has been working continuously on the seafloor at 4000 m for six years, measuring the spatial and temporal variability of carbon export from the surface. This long-term dataset has revealed that carbon enters the deep sea in large pulses of sinking detritus. MBARI is now focused on connecting these carbon pulses to processes in the upper layers of the ocean. Exploring, mapping, and sampling the upper water column to uncover ocean productivity hotspots is a central goal requiring the collaboration of MBARI’s Long Range Autonomous Underwater Vehicles (LRAUVs) as well as other complementary vehicles able to measure the full ecology of the hotspots, from the microbes to the whales.

Bio:

Brett W. Hobson received a BS in Mechanical Engineering from San Francisco State University in 1989. He began his ocean engineering career at Deep Ocean Engineering in San Leandro, California, developing remotely operated vehicles (ROVs) and manned submarines. In 1992, he helped start and run Deep Sea Discoveries, where he developed and operated deep-towed sonar and camera systems offshore of the US, Venezuela, Spain, and the Philippines. In 1998, he joined Nekton Research in North Carolina to develop bio-inspired underwater vehicles for Navy applications. After the sale of Nekton Research to iRobot in 2005, Hobson joined the Monterey Bay Aquarium Research Institute (MBARI), where he leads the Long Range Autonomous Underwater Vehicle (AUV) program, overseeing the development and science operations of a fleet of AUVs. He also helped develop MBARI’s long-endurance, seafloor-crawling Benthic Rover. Hobson holds a patent on the design of a biomimetic underwater vehicle and has been the Co-PI on large projects funded by NSF, NASA, and DHS aimed at developing novel underwater vehicles for ocean science.

Apr
5
Wed
LCSR Seminar: Student Seminar @ Hackerman B17
Apr 5 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Benjamin D. Killeen “An Autonomous X-ray Image Acquisition and Interpretation System for Assisting Percutaneous Pelvic Fracture Fixation”

Abstract: Percutaneous fracture fixation involves multiple X-ray acquisitions to determine adequate tool trajectories in bony anatomy. In order to reduce time spent adjusting the X-ray imager’s gantry, avoid excess acquisitions, and anticipate inadequate trajectories before penetrating bone, we propose an autonomous system for intra-operative feedback that combines robotic X-ray imaging and machine learning for automated image acquisition and interpretation, respectively. Our approach reconstructs an appropriate trajectory in a two-image sequence, where the optimal second viewpoint is determined based on analysis of the first image. The reconstructed corridor and K-wire pose are compared to determine the likelihood of cortical breach, and both are visualized for the clinician in a mixed reality environment that is spatially registered to the patient and delivered by an optical see-through head-mounted display. We assess the upper bounds on system performance through in silico evaluation across 11 CTs with fractures present, in which the corridor and K-wire are adequately reconstructed. In post-hoc analysis of radiographs across 3 cadaveric specimens, our system determines the appropriate trajectory to within 2.8 ± 1.3 mm and 2.7 ± 1.8°. An expert user study with an anthropomorphic phantom demonstrates how our autonomous, integrated system requires fewer images and less movement to guide and confirm adequate placement compared to current clinical practice.

Bio: A 4th year Ph.D. candidate at Johns Hopkins University, Benjamin D. Killeen is interested in intelligent surgical systems that improve patient outcomes. His recent work involves realistic simulation of interventional X-ray imaging for the purpose of developing AI-integrated surgical systems. Benjamin is a member of the Advanced Robotics and Computationally Augmented Environments (ARCADE) research group, led by Mathias Unberath, as well as the President of the LCSR Graduate Student Association (GSA) and Sports Officer for the MICCAI Student Board. In 2019, he earned a B.A. in Computer Science with Honors from the University of Chicago, with a minor in Physics, and he has completed internships at IBM Research – Almaden, Epic Systems, and Intuitive Surgical. In his spare time, he enjoys bouldering and creative writing.

 

Divya Ramesh “Studying terrestrial fish locomotion on wet deformable substrates”

Abstract: Many amphibious fishes can make forays onto land. The water-land interface often has wet deformable substrates like mud and sand, whose strength changes as they get drier or wetter, challenging locomotion. Most previous terrestrial locomotion studies of fishes focused on quantifying kinematics, muscle control, and functional morphology. Yet, without quantifying how the complex mechanics of wet deformable substrates affect ground reaction forces during locomotion, we cannot fully understand how these locomotor features interact with the environment to permit performance. Here, we used controlled mud as a model wet deformable substrate and developed methods to prepare mud into spatially uniform and temporally stable states, as well as tools to characterize its strength. As a first step to understand how mud strength impacts locomotion, we studied the Atlantic mudskipper (Periophthalmus barbarus) moving on a thicker and a thinner mud, which differ in strength by a factor of two. The animal performed similar “crutching” walks on mud of both strengths, with only a slight reduction in speed on the thinner mud (from 0.39 ± 0.12 to 0.32 ± 0.14 body lengths/s, P < 0.05, ANOVA). However, it jumped more frequently on the thinner mud (from 1.2 ± 0.7 to 3.2 ± 1.6 times per minute, P < 0.05, ANOVA), likely because the thinner mud stuck more to the belly and fins and hindered walking.

Bio: Divya Ramesh is a fourth-year PhD student in Dr. Chen Li’s lab (Terradynamics Lab). Her current work focuses on studying and understanding amphibious fish locomotion on wet deformable substrates. Her previous work focused on using contact sensing to study and understand limbless locomotion of snakes and snake robots on 3-D terrain. She received a BTech in Electronics and Communication Engineering from VIT University (Vellore, India) and an MSE in Electrical Engineering from the University of Pennsylvania. She has published in IEEE RA-L (presented at ICRA 2020) and presented at ICRA 2022. This work was presented at SICB 2023, where she was a finalist for Best Student Presentation in the Division of Comparative Biomechanics.

 

Gargi Sadalgekar “Template-level robophysical models for studying sustained terrestrial locomotion of amphibious fish”

Abstract: Studying terrestrial locomotion of amphibious fishes informs how early tetrapods may have invaded land. The water-land interface often has wet, deformable substrates like mud and sand that challenge locomotion. Recent progress has been made on understanding limbed and limbless tetrapod locomotion by studying robots as active physical models of model organisms. Robophysical models complement animals with their high controllability and repeatability for systematic experiments. They also complement theoretical and computational models because they enact physical laws in the real world, which is especially useful for studying locomotion in complex terrain. Here, we created the first robophysical models for studying sustained terrestrial locomotion of amphibious fishes on controlled mud as a model wet deformable substrate. Our three robots are at the template level (the fewest degrees of freedom needed to generate a target locomotor behavior) and represent mudskippers, ropefish, and bichirs, which use appendicular, axial, and axial-appendicular strategies, respectively. The mudskipper robot rotated two fins in phase to raise the body and “crutch” forward on mud. The ropefish robot used body lateral undulation to “surface-swim” on mud. The bichir robot combined body undulation and out-of-phase fin rotations to “army-crawl” forward on mud. Each robot generated qualitatively similar locomotion on mud as its model organism. We are currently refining the robots and performing systematic experiments on mud of a wide range of strengths.

Bio: Gargi Sadalgekar is a 2nd-year master’s student in the Terradynamics Lab at Johns Hopkins University and is interested in developing bio-inspired robots to investigate locomotion in extreme environments. Her current work focuses on developing low-order robophysical models of amphibious fish to uncover general principles of locomotion over wet deformable substrates. This work was presented at SICB 2023, where she was a finalist for Best Student Presentation in the Division of Comparative Biomechanics. Gargi received a BSE in Mechanical and Aerospace Engineering from Princeton University with a minor in Robotics and Information Systems.

 

Yaqing Wang “Force sensing can help robots reconstruct potential energy landscape and guide locomotor transitions to traverse large obstacles”

Abstract: Legged robots already excel at maintaining stability during upright walking and running and at stepping over small obstacles. However, they must further traverse large obstacles comparable to body size to enable a broader range of applications, such as search and rescue in rubble and sample collection in rocky Martian hills. Our lab’s recent research demonstrated that legged robots can traverse large obstacles if they can be destabilized to transition across various locomotor modes. When viewed on a potential energy landscape of the system, which results from physical interaction with obstacles, these locomotor transitions are strenuous barrier-crossing transitions between landscape basins. Because potential energy landscape gradients are closely related to terrain reaction forces and torques, we hypothesize that sensing obstacle interaction forces allows landscape reconstruction, which can guide robots to cross barriers at the saddle to make transitions more easily (analogous to crossing a mountain ridge at its saddle). Here, we created a robophysical model with custom 3-axis force sensors and surface contact sensors to measure forces and contacts during interaction with large obstacles. We found that the measured forces indeed captured potential energy landscape gradients well, and we could use the locally measured gradients to roughly reconstruct the potential energy landscape. Our future work will focus on enabling robots to make locomotor transitions at the landscape saddle based on local landscape reconstruction.

Bio: Yaqing Wang is a fourth-year PhD student in Dr. Chen Li’s lab (Terradynamics Lab). His work focuses on understanding locomotor transitions in biological and bio-inspired terrestrial locomotion. He received a B.S. in Mechanical Engineering from Tsinghua University in China. He recently presented this work at the annual APS March Meeting.

Apr
12
Wed
LCSR Seminar: Hatice Gunes “Emotional Intelligence for Human-Embodied AI Interaction” @ Hackerman B17
Apr 12 @ 12:00 pm – 1:00 pm

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Abstract:

Emotional intelligence for artificial systems is not a luxury but a necessity. It is paramount for many applications that require both short- and long-term engaging human–technology interactions, including entertainment, hospitality, education, and healthcare. However, creating artificially intelligent systems and interfaces with social and emotional skills is a challenging task. Progress in industry and developments in academia provide a positive outlook; however, the artificial emotional intelligence of current technology is still quite limited. Creating technology with artificial emotional intelligence requires the development of perception, learning, action, and adaptation capabilities, and the ability to execute these pipelines in real time in human-AI interactions. Truly addressing these challenges relies on cross-fertilization of multiple research fields, including psychology, nonverbal behaviour understanding, psychiatry, vision, social signal processing, affective computing, and human-computer and human-robot interaction. My lab’s research has been pushing the state of the art in a wide spectrum of research topics in this area, including the design and creation of new datasets; novel feature representations and learning algorithms for sensing and understanding human nonverbal behaviours in solo, dyadic, and group settings; designing short- and long-term adaptive human-robot interactions for wellbeing; and creating algorithmic solutions to mitigate the bias that creeps into these systems.

In this talk, I will present the recent explorations of the Cambridge Affective Intelligence and Robotics Lab in these areas with insights for human embodied-AI interaction research.

Bio:

Hatice Gunes is a Professor of Affective Intelligence and Robotics (AFAR) and leads the AFAR Lab at the University of Cambridge’s Department of Computer Science and Technology. Her expertise is in the areas of affective computing and social signal processing, cross-fertilising research in multimodal interaction, computer vision, signal processing, machine learning, and social robotics. She has published over 155 papers in these areas (H-index=36, citations > 7,300), with most recent works on lifelong learning for facial expression recognition, fairness, and affective robotics, and longitudinal HRI for wellbeing. She has served as an Associate Editor for IEEE Transactions on Affective Computing, IEEE Transactions on Multimedia, and Image and Vision Computing Journal, and has guest edited many Special Issues, the latest ones being the 2022 Int’l Journal of Social Robotics Special Issue on Embodied Agents for Wellbeing, the 2022 Frontiers in Robotics and AI Special Issue on Lifelong Learning and Long-Term Human-Robot Interaction, and the 2021 IEEE Transactions on Affective Computing Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data. Other research highlights include the Outstanding PC Award at ACM/IEEE HRI’23, RSJ/KROS Distinguished Interdisciplinary Research Award Finalist at IEEE RO-MAN’21, Distinguished PC Award at IJCAI’21, Best Paper Award Finalist at IEEE RO-MAN’20, Finalist for the 2018 Frontiers Spotlight Award, Outstanding Paper Award at IEEE FG’11, and Best Demo Award at IEEE ACII’09. Prof Gunes is a former President of the Association for the Advancement of Affective Computing (2017-2019), is the General Co-Chair of ACM ICMI’24, was the General Co-Chair of ACII’19, and was the Program Co-Chair of ACM/IEEE HRI’20 and IEEE FG’17. She was the Chair of the Steering Board of IEEE Transactions on Affective Computing (2017-2019) and was a member of the Human-Robot Interaction Steering Committee (2018-2021). Her research has been supported by various competitive grants, with funding from Google, the Engineering and Physical Sciences Research Council UK (EPSRC), Innovate UK, British Council, Alan Turing Institute, and EU Horizon 2020. In 2019 she was awarded a prestigious EPSRC Fellowship to investigate adaptive robotic emotional intelligence for wellbeing (2019-2024) and was named a Faculty Fellow of the Alan Turing Institute – UK’s national centre for data science and artificial intelligence (2019-2021). Prof Gunes is currently a Staff Fellow of Trinity Hall, a Senior Member of the IEEE, and a member of the AAAC.

Apr
18
Tue
LCSR Seminar: Chia Day “Robotics in Intelligent Manufacturing” @ 106 Latrobe Hall
Apr 18 @ 11:00 am – 12:00 pm

Zoom Link for Seminar

Recorded seminars for the 2022/2023 school year

 

Abstract:

Old monkeys may have stories. Some could be lessons learned to help overcome obstacles. The first part of this seminar discusses classical robotic applications in industry and the critical factors in their development and application. The second part discusses intelligent manufacturing with the use of data and easy-to-use analytics, which are necessary in modern-day manufacturing. Moving forward, some opportunities for robotics in intelligent manufacturing are discussed.

 

 

Bio:

Dr. Day was previously a Senior VP of Foxconn Automation Technology. Dr. Day began his career in 1970 as a co-op equipment development engineer at IBM in Burlington, VT, and later continued to work in the manufacturing automation field with General Motors, Fanuc, Rockwell Automation, Stoneridge, and Foxconn. Dr. Day was the founder of Foxbot, with 80,000 units deployed in various applications. In June 2016, Dr. Day received the Joseph F. Engelberger Award from the Robotic Industries Association for lifetime career contributions in the automotive and electronics industries.

 

Apr
19
Wed
LCSR Seminar: Ryan Sochol “Games without Frontiers: Beating Super Mario Bros. 1-1 with a 3D printed Soft Robotic Hand” @ Hackerman B17
Apr 19 @ 12:00 pm – 1:00 pm

 

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

“Games Without Frontiers: Beating Super Mario Bros. 1-1 with a 3D-Printed Soft Robotic Hand”

 Ryan D. Sochol, Ph.D.

 

Associate Professor, Department of Mechanical Engineering
Affiliate Faculty, Fischell Department of Bioengineering
Executive Committee Member, Maryland Robotics Center
Fischell Institute Fellow, Robert E. Fischell Institute for Biomedical Devices
Affiliate Faculty, Institute for Systems Research
James Clark School of Engineering
University of Maryland, College Park

 

Abstract:

Over the past decade, the field of “soft robotics” has established itself as uniquely suited for applications that would be difficult or impossible to realize using traditional, rigid-bodied robots.  The reliance on compliant materials that are often actuated by fluidic (e.g., hydraulic or pneumatic) means presents a number of inherent benefits for soft robots, particularly in terms of safety for human-robot interactions and adaptability for manipulating complex and/or delicate objects.  Unfortunately, progress has been impeded by broad challenges associated with controlling the underlying fluidics of such systems.  In this seminar, Prof. Ryan D. Sochol will discuss how his Bioinspired Advanced Manufacturing (BAM) Laboratory is leveraging the capabilities of two alternative types of additive manufacturing (or “three-dimensional (3D) printing”) technologies to address these critical barriers.  Specifically, Prof. Sochol will describe his lab’s recent strategies for using the 3D nanoprinting approach, “Two-Photon Direct Laser Writing”, and the inkjet 3D printing technique, “PolyJet 3D Printing”, to engineer soft robotic systems that comprise integrated fluidic circuitry… including a soft robotic “hand” that plays Nintendo.

 

Biography:

Prof. Ryan D. Sochol is an Associate Professor of Mechanical Engineering within the A. James Clark School of Engineering at the University of Maryland, College Park.  Prof. Sochol received his B.S. in Mechanical Engineering from Northwestern University in 2006, and both his M.S. and Ph.D. in Mechanical Engineering from the University of California, Berkeley, in 2009 and 2011, respectively, with Doctoral Minors in Bioengineering and Public Health.  Prior to joining the faculty at UMD, Prof. Sochol served two primary academic roles: (i) as an NIH Postdoctoral Trainee within the Harvard-MIT Division of Health Sciences & Technology, Harvard Medical School, and Brigham & Women’s Hospital, and (ii) as the Director of the Micro Mechanical Methods for Biology (M3B) Laboratory Program within the Berkeley Sensor & Actuator Center at UC Berkeley.  Prof. Sochol also served as a Visiting Postdoctoral Fellow at the University of Tokyo.  In 2019, Prof. Sochol was elected Co-President of the Mid-Atlantic Micro/Nano Alliance.  His group received IEEE MEMS Outstanding Student Paper Awards in both 2019 and 2021 and the Springer Nature Best Paper Award (Runner-Up) in 2022.   Prof. Sochol received the NSF CAREER Award in 2020 and the Early Career Award from the IOP Journal of Micromechanics and Microengineering in 2021, and was recently honored as an inaugural Rising Star by the journal, Advanced Materials Technologies, in 2023.

 

Apr
26
Wed
LCSR Seminar: Careers in Robotics: A Panel Discussion With Experts From Industry and Academia @ Hackerman B17
Apr 26 @ 12:00 pm – 1:00 pm

 

Link for Live Seminar

Link for Recorded seminars – 2022/2023 school year

 

Ayushi Sinha Ph.D.

Job title and affiliation: Senior Scientist, Philips

JHU degrees and year(s) of degree(s): Ph.D. Computer Science 2018, MSE Computer Science 2014

Short bio: Ayushi Sinha is a Senior Scientist at Philips working on image-guided therapy systems, including C-arm X-ray imaging systems. Ayushi received a BS in Computer Science and a BA in Mathematics from Providence College, RI, and an MSE and Ph.D. in Computer Science from Johns Hopkins University, MD. She remained at Hopkins as a Provost’s Postdoctoral Fellow and then a Research Scientist before joining Philips in late 2019. Her primary research interest is in image analysis to enable automation of medical imaging systems and integration of multiple systems.

 

 

Can Kocabalkanli M.S.E.

Job title and affiliation: Computer Vision Research Scientist at PediaMetrix

JHU degrees and year(s) of degree(s): BS Mechanical Engineering 2019, MSE Robotics 2020

Short bio: Originally from Istanbul, Turkey, Can came to JHU for his undergraduate degree and early on explored an interest in robotics through coursework and the robotics minor. He completed his master’s research and thesis under Prof. Taylor in 2020 on an autonomous endoscope safety system. Since graduation, Can has been working as a Computer Vision Research Scientist at PediaMetrix, a medical imaging startup focused on infant healthcare. There he has worked on developing, deploying, and validating image processing and vision algorithms and machine and deep learning models, as well as on acquiring 510(k) clearance. Since September 2022, he has taken a leadership role in their R&D department. Can is interested in making healthcare more robust and accessible through innovation and technology and is the co-inventor of 2 US patents.

 

 

Michael Kutzer Ph.D.

Job title and affiliation: Associate Professor, United States Naval Academy Department of Weapons, Robotics, and Control Engineering/Instructor, JHU-EP Mechanical Engineering Program

JHU degrees and year(s) of degree(s): M.S.E. Mechanical Engineering 2007, Ph.D. Mechanical Engineering 2012

Short bio: Mike Kutzer received his Ph.D. in mechanical engineering from the Johns Hopkins University, Baltimore, MD, USA in 2012. He is currently an Associate Professor in the Weapons, Robotics, and Control Engineering Department (WRCE) at the United States Naval Academy (USNA). Prior to joining USNA, he worked as a senior researcher in the Research and Exploratory Development Department of the Johns Hopkins University Applied Physics Laboratory (JHU/APL). His research interests include robotic manipulation, computer vision and motion capture, applications of and extensions to additive manufacturing, mechanism design and characterization, continuum manipulators, redundant mechanisms, and modular systems.

 

 

 

Aug
30
Wed
LCSR Seminar: Welcome Townhall “Review of LCSR” @ Hackerman B17
Aug 30 @ 12:00 pm – 1:00 pm