Calendar

Yaoliang Yu: The Computational, Statistical, and Practical Aspects of Machine Learning @ B17 Hackerman Hall
Mar 23 @ 12:00 pm – 1:00 pm

Abstract

The big data revolution has profoundly changed, among many other things, how we perceive business, research, and applications. However, in order to fully realize the potential of big data, certain computational and statistical challenges need to be addressed. In this talk, I will present my research on facilitating the deployment of machine learning methodologies and algorithms in big data applications. I will first present robust methods that are capable of accounting for uncertain or abnormal observations. Then I will present a generic regularization scheme that automatically extracts compact and informative representations from heterogeneous, multi-modal, multi-array, time-series, and structured data. Next, I will discuss two gradient algorithms that are computationally very efficient for our regularization scheme, and I will mention their theoretical convergence properties and computational requirements. Finally, I will present a distributed machine learning framework that allows us to process extremely large-scale datasets and models. I will conclude my talk by sharing some of the future directions I am pursuing and plan to pursue.


Bio

Yaoliang Yu is currently a research scientist affiliated with the Center for Machine Learning and Health and the Machine Learning Department at Carnegie Mellon University. He obtained his PhD in computing science (under Dale Schuurmans and Csaba Szepesvari) from the University of Alberta (Canada, 2013), and he received the PhD Dissertation Award from the Canadian Artificial Intelligence Association in 2015.

Berk Gonenc: Force Sensing for Robotic Assistance in Retinal Microsurgery @ B17 Hackerman Hall
Mar 30 @ 12:00 pm – 1:00 pm

Abstract

Microsurgery ranks among the most challenging areas of surgical practice, requiring the manipulation of extremely delicate tissues through various micron-scale maneuvers and the application of very small forces. Vitreoretinal surgery, the most technically demanding field of ophthalmic surgery, treats disorders of the retina, vitreous body, and macula, such as retinal detachment, diabetic retinopathy, macular hole, and epiretinal membrane. Recent advancements in medical robotics have significant potential to address most of the challenges in vitreoretinal practice, and therefore to prevent trauma, lessen complications, minimize intra-operative surgeon effort, maximize surgeon comfort, and promote patient safety. In this talk, I will present the development of novel force-sensing tools and robot control methods to produce integrated assistive surgical systems that work in partnership with surgeons against the current limitations of microsurgery, focusing specifically on membrane peeling and vein cannulation tasks in retinal microsurgery. Integrating high-sensitivity force sensing into ophthalmic instruments enables precise quantitative monitoring of the applied forces. Auditory feedback based upon the measured forces can inform (and warn) the surgeon quickly during the surgery and help prevent injury due to excessive forces. Using these tools on a robotic platform can attenuate the surgeon’s hand tremor, which effectively improves tool manipulation accuracy. In addition, based upon certain force signatures, the robotic system can actively guide the tool toward clinical targets, compensate for any involuntary motion of the surgeon, or generate additional motion that makes the surgical task easier. I will present our latest experimental results for two distinct robotic platforms, the Steady Hand Robot and Micron, equipped with the force-sensing ophthalmic instruments, which show significant performance improvement in artificial dry phantoms and ex vivo biological tissues.


Bio

Berk Gonenc is a Ph.D. candidate in Mechanical Engineering at Johns Hopkins University. He received his M.S. degree in Mechanical Engineering from Washington State University Vancouver in 2011 and joined the Advanced Medical Instrumentation and Robotics Research Laboratory at Johns Hopkins University. He received his M.S.E. degree in Mechanical Engineering from Johns Hopkins University in 2014. His research is focused on developing smart instruments and robot systems for microsurgery.

LCSR Seminar: Ali Kamen: Imaging for Personalized Healthcare @ B17 Hackerman Hall
Apr 6 @ 12:00 pm – 1:00 pm

Abstract

In this talk, I will give a perspective on past and current trends in medical imaging, particularly regarding the role of imaging in personalized medicine. I will then outline the core technologies enabling these advancements, with a specific focus on empirical and mechanistic modeling. In addition, I will demonstrate example clinical applications in which mechanistic and empirical models derived from imaging are used for treatment planning and therapy outcome analysis. I will conclude by providing a future outlook on the utilization of imaging across the healthcare continuum.

Christie Lecture: Vijay Kumar “Flying Robots: Beyond UAVs” @ B17 Hackerman Hall
Apr 13 @ 12:00 pm – 1:00 pm

Abstract

Flying robots can operate in three-dimensional, indoor and outdoor environments. However, many challenges arise as we scale down the size of the robot, which is necessary for operating in cluttered environments. I will describe recent work in developing small, autonomous robots, and the design and algorithmic challenges in the areas of (a) control and planning, (b) state estimation and mapping, and (c) coordinating large teams of robots. I will also discuss applications to search and rescue, first response and precision farming. Publications and videos are available at kumarrobotics.org.


Bio

Dr. Vijay Kumar is the Nemirovsky Family Dean of Penn Engineering, with appointments in the Departments of Mechanical Engineering and Applied Mechanics, Computer and Information Science, and Electrical and Systems Engineering at the University of Pennsylvania. Dr. Kumar received his Bachelor of Technology degree from the Indian Institute of Technology, Kanpur, and his Ph.D. from The Ohio State University in 1987. He has been on the faculty in the Department of Mechanical Engineering and Applied Mechanics, with a secondary appointment in the Department of Computer and Information Science, at the University of Pennsylvania since 1987.

Dr. Kumar served as the Deputy Dean for Research in the School of Engineering and Applied Science from 2000 to 2004. He directed the GRASP Laboratory, a multidisciplinary robotics and perception laboratory, from 1998 to 2004. He was the Chairman of the Department of Mechanical Engineering and Applied Mechanics from 2005 to 2008. He served as the Deputy Dean for Education in the School of Engineering and Applied Science from 2008 to 2012. He then served as the assistant director of robotics and cyber-physical systems at the White House Office of Science and Technology Policy (2012-2013).

Dr. Kumar is a Fellow of the American Society of Mechanical Engineers (2003), a Fellow of the Institute of Electrical and Electronics Engineers (2005), and a member of the National Academy of Engineering (2013). Dr. Kumar’s research interests are in robotics, specifically multi-robot systems and micro aerial vehicles. He has served on the editorial boards of the IEEE Transactions on Robotics and Automation, IEEE Transactions on Automation Science and Engineering, ASME Journal of Mechanical Design, ASME Journal of Mechanisms and Robotics, and Springer Tracts in Advanced Robotics (STAR).


For more information or to RSVP, please contact Deana Santoni at dsantoni@jhu.edu.

Sponsored by the Department of Mechanical Engineering and by the JHU Student Section and the Baltimore Section of the American Society of Mechanical Engineers.

LCSR Seminar: Timothy Kowalewski “Measuring Surgical Skill: Crowds, Robots, and Beyond” @ B17 Hackerman Hall
Apr 20 @ 12:00 pm – 1:00 pm

Abstract

For over a decade, surgical educators have called for objective, quantitative methods to measure surgical skill. To date, no satisfactory method exists that is simultaneously accurate, scalable, and generalizable: that is, a method whose scores correlate with patient outcomes, can scale to cope with the 51 million annual surgeries in the United States, and can generalize across the diversity of surgical procedures and specialties. This talk will review the promising results of exploiting crowdsourcing techniques to meet this need. The talk will also survey the limitations of this approach, fundamental problems in establishing ground truth for surgical skill evaluation, and steps to exploit surgical robotics data. The talk will conclude by proposing some future robotic approaches that may obviate the need for surgeons to master complex technical skills in the first place.


Bio

Dr. Kowalewski completed his PhD in electrical engineering on “quantitative surgical skill evaluation” at the University of Washington’s Biorobotics Lab. This work was recognized with a best doctoral candidate award at the American College of Surgeons AEI Consortium on Surgical Robotics and Simulation. He was also a research scientist on DARPA’s “Traumapod: Operating Room of the Future” project. He has helped commercialize his PhD work as quantitative skill evaluation hardware (Simulab Corp., Seattle, WA), pioneered the use of crowdsourcing for high-volume assessment of surgical skills, and co-founded CSATS Inc. (Seattle, WA) to make these methods available to modern healthcare. This work has been published in JAMA Surgery and formally adopted by the American Urological Association for educational and certification needs. In 2012 he started the Medical Robotics and Devices Lab in the Department of Mechanical Engineering at the University of Minnesota, where he is currently an Assistant Professor.

LCSR Seminar Cancelled @ 320 Hackerman Hall
May 11 @ 12:00 pm – 1:00 pm
LCSR Seminar: Welcome/Welcome Back Town Hall @ B17 Hackerman Hall
Sep 7 @ 12:00 pm – 1:00 pm

Abstract


This is the Fall 2016 Kick-Off Seminar, presenting an overview of LCSR, useful information, and an introduction to the faculty and labs.

Rebecca Schulman: Toward Robotic Materials: Self-Assembling Programmable Adaptive Structures with Molecules @ B17 Hackerman Hall
Sep 14 @ 12:00 pm – 1:00 pm

Abstract

While robots at the human scale are generally composed of structures moved by a small set of actuators that shift materials or components with well-defined shapes, other principles for designing moving structures can control movement at the micron scale. For example, cells can move by disassembling parts of their rigid skeleton, or cytoskeleton, and reassembling new components in a different location. The structures that are disassembled and reassembled are often filaments that grow, shrink, and form junctions with one another. Networks of rigid filaments serve as a cheap, reusable, movable scaffold that shapes and reshapes the cell.

Could we design synthetic materials to perform tasks of engineering interest at the micron scale? I’ll describe how we are using ideas from DNA nanotechnology to build synthetic filaments, and how we can program where and when filaments assemble and disassemble and how they organize. We are able to use quantitative control over microscopic parameters, modeling, and automated analysis to build increasingly sophisticated structures that can find, connect, and move locations in the environment, form architectures, and heal when damaged.


Bernhard Fuerst: Augmented Reality for Orthopedic Interventions: Translational research, funding, and recent developments @ B17 Hackerman Hall
Sep 21 @ 12:00 pm – 1:00 pm

Abstract

Among the most difficult procedures in orthopedic and trauma surgery is the placement of screws to repair complex fractures. Using a vast number of X-ray images (we have observed surgeries with up to 246 images), the surgeon needs to drill a guide wire through the bone fragments. The difficulty is further increased by muscle and other tissue covering the bones (e.g., in the pelvis).

Our system comprises a traditional X-ray machine (C-arm), a 3D camera mounted on this X-ray machine, and generally available 3D Computed Tomography (CT) images to guide the surgeon. Rather than seeing simple 2D X-ray images, our system shows the surgeon a 3D view of the bones, the drill, the patient surface, and even the surgeon’s hands in real time. This “Superman” view, referred to as Interventional 3D Augmented Reality, was shown to reduce duration, radiation dose, number of X-ray images, and complications in our preclinical experiments. In summary, our system increases patient safety and represents the future of interventional X-ray imaging.

Bio

Bernhard Fuerst is a research engineer at the Engineering Research Center at Johns Hopkins University. He received his Bachelor’s degree in Biomedical Computer Science from the University for Medical Technology in Austria in 2009 and his Master’s degree in Biomedical Computing from the Technical University of Munich, Germany, in 2011. During his studies he joined Siemens Corporate Research in Princeton to research biomechanical simulations for the compensation of respiratory motion under Dr. Ali Kamen’s supervision, and Georgetown University to investigate techniques for meta-optimization using particle swarm optimizers under Dr. Kevin Cleary’s supervision. Since joining Johns Hopkins University, he has worked on establishing Dr. Nassir Navab’s research group, with a focus on robotic ultrasound, minimally invasive nuclear imaging, and bioelectric localization and navigation.

LCSR Seminar: Dave Akin “Robotics for Extreme Environments: Space, Undersea, and Medical Rehabilitation” @ B17 Hackerman Hall
Sep 28 @ 12:00 pm – 1:00 pm

Dr. David L. Akin

Director, Space Systems Laboratory

Associate Professor of Aerospace Engineering

University of Maryland


Abstract

For decades, the Space Systems Laboratory at the University of Maryland has been involved with advancing the capabilities of dexterous robotic systems to facilitate operations in challenging environments such as space and deep ocean. This work has been focused on developing integrated systems for these “extreme” environments, usually involving both mobility and dexterous manipulation. The talk will focus on the design, development, and operation of robotic systems developed in the SSL, including the Ranger Dexterous Servicing System (originally intended as a Space Shuttle flight experiment), SAMURAI (a 6000-meter deep ocean autonomous sampling system), an exoskeleton system for shoulder rehabilitation, and various rovers and robot arms.


Bio

David L. Akin is an Associate Professor in the Department of Aerospace Engineering and Director of the Space Systems Laboratory at the University of Maryland. He earned SB (1974), SM (1975), and ScD (1981) degrees from M.I.T. His current research focuses on space operations, including dexterous robotics, pressure suit design, and human-robot interactions. He is also active in the areas of spacecraft design, space simulation, and space systems analysis. He has been principal investigator for several space flight systems, and for multiple experimental space suit and robotic systems. He has over 100 professional publications in journals and conference proceedings.
