Calendar

LCSR Seminar: Ali Kamen: Imaging for Personalized Healthcare @ B17 Hackerman Hall
Apr 6 @ 12:00 pm – 1:00 pm

Abstract

In this talk, I will give a perspective on past and current trends in medical imaging, particularly the role of imaging in personalized medicine. I will then outline the core technologies enabling these advances, with a specific focus on empirical and mechanistic modeling. In addition, I will demonstrate example clinical applications in which mechanistic and empirical models derived from imaging are used for treatment planning and therapy outcome analysis. I will conclude with an outlook on the future use of imaging across the healthcare continuum.

Christie Lecture: Vijay Kumar “Flying Robots: Beyond UAVs” @ B17 Hackerman Hall
Apr 13 @ 12:00 pm – 1:00 pm

Abstract

Flying robots can operate in three-dimensional, indoor and outdoor environments. However, many challenges arise as we scale down the size of the robot, which is necessary for operating in cluttered environments. I will describe recent work in developing small, autonomous robots, and the design and algorithmic challenges in the areas of (a) control and planning, (b) state estimation and mapping, and (c) coordinating large teams of robots. I will also discuss applications to search and rescue, first response and precision farming. Publications and videos are available at kumarrobotics.org.


Bio

Dr. Vijay Kumar is the Nemirovsky Family Dean of Penn Engineering, with appointments in the Departments of Mechanical Engineering and Applied Mechanics, Computer and Information Science, and Electrical and Systems Engineering at the University of Pennsylvania. Dr. Kumar received his Bachelor of Technology degree from the Indian Institute of Technology, Kanpur, and his Ph.D. from The Ohio State University in 1987. He has been on the faculty of the Department of Mechanical Engineering and Applied Mechanics, with a secondary appointment in the Department of Computer and Information Science, at the University of Pennsylvania since 1987.

Dr. Kumar served as the Deputy Dean for Research in the School of Engineering and Applied Science from 2000 to 2004. He directed the GRASP Laboratory, a multidisciplinary robotics and perception laboratory, from 1998 to 2004. He was the Chairman of the Department of Mechanical Engineering and Applied Mechanics from 2005 to 2008, and served as the Deputy Dean for Education in the School of Engineering and Applied Science from 2008 to 2012. He then served as the Assistant Director for Robotics and Cyber-Physical Systems at the White House Office of Science and Technology Policy (2012–2013).

Dr. Kumar is a Fellow of the American Society of Mechanical Engineers (2003), a Fellow of the Institute of Electrical and Electronics Engineers (2005), and a member of the National Academy of Engineering (2013). Dr. Kumar’s research interests are in robotics, specifically multi-robot systems and micro aerial vehicles. He has served on the editorial boards of the IEEE Transactions on Robotics and Automation, IEEE Transactions on Automation Science and Engineering, ASME Journal of Mechanical Design, ASME Journal of Mechanisms and Robotics, and Springer Tracts in Advanced Robotics (STAR).

For more information or to RSVP, please contact Deana Santoni at dsantoni@jhu.edu.

Sponsored by the Department of Mechanical Engineering and by the JHU Student Section and Baltimore Section of the American Society of Mechanical Engineers.

LCSR Seminar: Timothy Kowalewski “Measuring Surgical Skill: Crowds, Robots, and Beyond” @ B17 Hackerman Hall
Apr 20 @ 12:00 pm – 1:00 pm

Abstract

For over a decade, surgical educators have called for objective, quantitative methods to measure surgical skill. To date, no satisfactory method exists that is simultaneously accurate, scalable, and generalizable: that is, a method whose scores correlate with patient outcomes, that can scale to cope with the 51 million annual surgeries in the United States, and that generalizes across the diversity of surgical procedures and specialties. This talk will review promising results from exploiting crowdsourcing techniques to meet this need. The talk will also survey the limitations of this approach, fundamental problems in establishing ground truth for surgical skill evaluation, and steps toward exploiting surgical robotics data. The talk will conclude by proposing some future robotic approaches that may obviate the need for surgeons to master complex technical skills in the first place.

Bio

Dr. Kowalewski completed his PhD in electrical engineering on quantitative surgical skill evaluation at the University of Washington’s Biorobotics Lab. This work was recognized with a best doctoral candidate award at the American College of Surgeons AEI Consortium on Surgical Robotics and Simulation. He was also a research scientist on DARPA’s “Traumapod: Operating Room of the Future” project. He has helped commercialize his PhD work as quantitative skill evaluation hardware (Simulab Corp., Seattle, WA), pioneered the use of crowdsourcing for high-volume assessment of surgical skills, and co-founded CSATS Inc. (Seattle, WA) to make these methods available to modern healthcare. This work has been published in JAMA Surgery and formally adopted by the American Urological Association for educational and certification needs. In 2012 he started the Medical Robotics and Devices Lab in the Department of Mechanical Engineering at the University of Minnesota, where he is currently an Assistant Professor.

LCSR Seminar Cancelled @ 320 Hackerman Hall
May 11 @ 12:00 pm – 1:00 pm

LCSR Seminar: Welcome/Welcome Back Town Hall @ B17 Hackerman Hall
Sep 7 @ 12:00 pm – 1:00 pm

Abstract


This is the Fall 2016 Kick-Off Seminar, presenting an overview of LCSR, useful information, and an introduction to the faculty and labs.

Rebecca Schulman: Toward Robotic Materials: Self-Assembling Programmable Adaptive Structures with Molecules @ B17 Hackerman Hall
Sep 14 @ 12:00 pm – 1:00 pm

Abstract

While robots at the human scale are generally composed of structures moved by a small set of actuators that shift materials or components with well-defined shapes, other principles for designing moving structures can control movement at the micron scale. For example, cells can move by disassembling parts of their rigid skeleton, or cytoskeleton, and reassembling new components in a different location. The structures that are disassembled and reassembled are often filaments that grow, shrink, and form junctions with one another. Networks of rigid filaments serve as a cheap, reusable, movable scaffold that shapes and reshapes the cell.

Could we design synthetic materials to perform tasks of engineering interest at the micron scale? I’ll describe how we are using ideas from DNA nanotechnology to build synthetic filaments, and how we can program where and when filaments assemble and disassemble and how they organize. We are able to use quantitative control over microscopic parameters, modeling, and automated analysis to build increasingly sophisticated structures that can find, connect, and move to locations in the environment, form architectures, and heal when damaged.


Bernhard Fuerst: Augmented Reality for Orthopedic Interventions: Translational research, funding, and recent developments @ B17 Hackerman Hall
Sep 21 @ 12:00 pm – 1:00 pm

Abstract

Among the most difficult procedures in orthopedic and trauma surgery is the placement of screws to repair complex fractures. Using a large number of X-ray images (we have observed surgeries with up to 246 images), the surgeon needs to drill a guide wire through the bone fragments. The difficulty is further increased by muscle and other tissue covering the bones (e.g., the pelvis).
Our system comprises a traditional X-ray machine (C-arm), a 3D camera mounted on this X-ray machine, and generally available 3D Computed Tomography (CT) images to guide the surgeon. Rather than seeing simple 2D X-ray images, the surgeon sees a 3D view of the bones, the drill, the patient surface, and even the surgeon’s own hands in real time. This “Superman” view, referred to as Interventional 3D Augmented Reality, was shown to reduce duration, radiation dose, number of X-ray images, and complications in our preclinical experiments. In summary, our system increases patient safety and represents the future of interventional X-ray imaging.

Bio

Bernhard Fuerst is a research engineer at the Engineering Research Center at Johns Hopkins University. He received his Bachelor’s degree in Biomedical Computer Science from the University for Medical Technology in Austria in 2009 and his Master’s degree in Biomedical Computing from the Technical University of Munich, Germany, in 2011. During his studies he joined Siemens Corporate Research in Princeton to research biomechanical simulations for compensation of respiratory motion under Dr. Ali Kamen’s supervision, and Georgetown University to investigate techniques for meta-optimization using particle swarm optimizers under Dr. Kevin Cleary’s supervision. Since joining Johns Hopkins University, he has worked on establishing Dr. Nassir Navab’s research group, focusing on robotic ultrasound, minimally invasive nuclear imaging, and bioelectric localization and navigation.

LCSR Seminar: Dave Akin “Robotics for Extreme Environments: Space, Undersea, and Medical Rehabilitation” @ B17 Hackerman Hall
Sep 28 @ 12:00 pm – 1:00 pm

Dr. David L. Akin

Director, Space Systems Laboratory

Associate Professor of Aerospace Engineering

University of Maryland


Abstract

For decades, the Space Systems Laboratory at the University of Maryland has been involved with advancing the capabilities of dexterous robotic systems to facilitate operations in challenging environments such as space and deep ocean. This work has been focused on developing integrated systems for these “extreme” environments, usually involving both mobility and dexterous manipulation. The talk will focus on the design, development, and operation of robotic systems developed in the SSL, including the Ranger Dexterous Servicing System (originally intended as a Space Shuttle flight experiment), SAMURAI (a 6000-meter deep ocean autonomous sampling system), an exoskeleton system for shoulder rehabilitation, and various rovers and robot arms.


Bio

David L. Akin is an Associate Professor in the Department of Aerospace Engineering and Director of the Space Systems Laboratory at the University of Maryland. He earned SB (1974), SM (1975), and ScD (1981) degrees from M.I.T. His current research focuses on space operations, including dexterous robotics, pressure suit design, and human-robot interactions. He is also active in the areas of spacecraft design, space simulation, and space systems analysis. He has been principal investigator for several space flight systems, and for multiple experimental space suit and robotic systems. He has over 100 professional publications in journals and conference proceedings.

Roy E. Ritzmann: Insect Brain Systems and Their Role in Context Dependent Behavior @ B17 Hackerman Hall
Oct 5 @ 12:00 pm – 1:00 pm

Abstract

Contrary to popular notions, insects have sophisticated brains that allow them to adjust control so that behaviors are consistent with current internal and external conditions. The Central Complex (CX) is a set of midline neuropils in the brains of all arthropods. It is made up of columnar structures, including the protocerebral bridge, the fan-shaped body, and the ellipsoid body. Neurons in these structures project to the paired nodules and lateral accessory lobes, where they have access to descending interneurons that alter movements.

Over the past couple of decades, the CX has received a remarkable amount of attention from insect neurobiologists. We now know that several types of sensory information project to the CX, including mechanical information from the antennae and various visual cues such as polarized light, which is used by several migratory insects to guide their long-distance flights. We also know that activity in the CX precedes changes in movement, and that stimulation of the same regions can evoke turning behavior. Recently, navigation cues such as head-direction compass cells have been identified in several insects.

Cockroaches are scavengers that forage through darkened environments. Like many foraging insects, they must keep track of targets while negotiating barriers. Thus, they need to simultaneously integrate sensory information and produce appropriate motor commands. As cockroaches move toward a darkened shelter, they continually assess their situation and decide either to continue or to turn, based on whether they still see the shelter (Daltorio et al., 2013). This foraging behavior requires that the insect know its orientation and the direction of recent turns. It must then use that information to influence descending commands that result in turning behaviors. By performing tetrode recordings in a restrained preparation, we found CX neurons that encode the animal’s orientation using external and internal sensory cues, similarly to mammalian head-direction cells, as well as the direction of recent rotations (Varga and Ritzmann, 2016). How can this information influence movement in the arena? We recorded from tethered and freely walking cockroaches and found CX neurons whose activity increased just prior to changes in direction or speed (Martin et al., 2015). The patterns of movement coded in the CX neurons represent a population code that covers the entire range of horizontal movements that cockroaches make in the arena. Moreover, stimulation through the same tetrodes evoked movements consistent with the recorded activity. For individuals in which stimulation consistently evoked turning in a particular direction, we further examined leg reflexes associated with the femoral chordotonal organ (FCo), which evokes reflex changes in the motor neurons that control the femur-tibia joint as well as the adjacent coxa-trochanter joint. Lesion of all descending activity causes a reversal in the FCo reflex to the slow depressor neuron (Ds) of the coxa-trochanter joint, which is consistent with changes associated with turning. Together, these studies demonstrate that the cockroach CX relies upon a variety of sensory modalities to encode the animal’s orientation, which is then used to generate directionally specific motor commands and, therefore, to direct locomotion.

More recently, we have turned to an insect predator to expand our understanding of how brain systems alter behavior. Predators must track down and accurately strike prey. Many change their strategy for obtaining food as they become satiated. We have been able to tap into CX activity during this process and have begun to examine how neuromodulators associated with satiety alter CX activity and related stalking behavior.

Bio

Roy E. Ritzmann is a Professor in the Department of Biology at Case Western Reserve University in Cleveland, Ohio. He received his B.A. in Zoology from the University of Iowa and his Ph.D. in Biology from the University of Virginia, then moved to a postdoctoral position at Cornell University, where he began working with insects on the neural circuitry underlying escape systems. His laboratory focuses on the behavioral and neural properties involved in insect movement around barriers in complex terrain, most recently focusing on context- and state-dependent control in an insect brain region called the central complex (a group of neuropils that reside on the midline of virtually all arthropod brains). To that end, the lab employs both extracellular (multi-channel) and intracellular recording techniques in the brain and thoracic ganglia of cockroaches and praying mantises. Using these techniques, the Ritzmann laboratory has made progress in understanding how the central complex integrates massive amounts of information about the insect’s surroundings and internal state into descending commands that adjust movements toward goals and away from threats in a context-dependent fashion. The Ritzmann laboratory has also collaborated on many biologically inspired robotics projects.

LCSR Seminar: Rene Vidal “Automatic Methods for the Interpretation of Biomedical Data” @ B17 Hackerman Hall
Oct 12 @ 12:00 pm – 1:00 pm

Abstract

In this talk, I will give an overview of our recent work on the development of automatic methods for the interpretation of biomedical data across multiple modalities and scales. At the cellular scale, I will present a structured matrix factorization method for segmenting neurons and finding their spiking patterns in calcium imaging videos, and a shape analysis method for classifying embryonic cardiomyocytes in optical imaging videos. At the organ scale, I will present a Riemannian framework for processing diffusion magnetic resonance images of the brain, and a stochastic tracking method for detecting Purkinje fibers in cardiac MRI. At the patient scale, I will present dynamical system and machine learning methods for recognizing surgical gestures and assessing surgeon skill in medical robotic motion and video data.


Bio

Professor Vidal received his B.S. degree in Electrical Engineering (highest honors) from the Pontificia Universidad Catolica de Chile in 1997 and his M.S. and Ph.D. degrees in Electrical Engineering and Computer Sciences from the University of California at Berkeley in 2000 and 2003, respectively. He was a research fellow at National ICT Australia in 2003 and has been a faculty member in the Department of Biomedical Engineering and the Center for Imaging Science of The Johns Hopkins University since 2004. He has held several visiting faculty positions at Stanford, INRIA/ENS Paris, the Catholic University of Chile, Universite Henri Poincare, and the Australian National University.

Dr. Vidal was co-editor (with Anders Heyden and Yi Ma) of the book “Dynamical Vision” and has co-authored more than 180 articles in biomedical image analysis, computer vision, machine learning, hybrid systems, robotics, and signal processing. He is or has been Associate Editor of Medical Image Analysis, the IEEE Transactions on Pattern Analysis and Machine Intelligence, the SIAM Journal on Imaging Sciences, and the Journal of Mathematical Imaging and Vision, and guest editor of Signal Processing Magazine. He is or has been program chair for ICCV 2015, CVPR 2014, WMVC 2009, and PSIVT 2007, and was area chair for ICCV 2013, CVPR 2013, ICCV 2011, ICCV 2007, and CVPR 2005.

Dr. Vidal is the recipient of numerous awards for his work, including the 2012 J.K. Aggarwal Prize for “outstanding contributions to generalized principal component analysis (GPCA) and subspace clustering in computer vision and pattern recognition”, the 2012 Best Paper Award in Medical Robotics and Computer Assisted Interventions (with Benjamin Bejar and Luca Zappella), the 2011 Best Paper Award Finalist at the Conference on Decision and Control (with Roberto Tron and Bijan Afsari), the 2009 ONR Young Investigator Award, the 2009 Sloan Research Fellowship, the 2005 NSF CAREER Award, and the 2004 Best Paper Award Honorable Mention (with Prof. Yi Ma) at the European Conference on Computer Vision. He also received the 2004 Sakrison Memorial Prize for “completing an exceptionally documented piece of research”, the 2003 Eli Jury Award for “outstanding achievement in the area of Systems, Communications, Control, or Signal Processing”, the 2002 Student Continuation Award from NASA Ames, the 1998 Marcos Orrego Puelma Award from the Institute of Engineers of Chile, and the 1997 award of the School of Engineering of the Pontificia Universidad Catolica de Chile to the best graduating student of the school. He is a Fellow of the IEEE and a member of the ACM.
