Calendar

Fijs W.B. van Leeuwen: Image guidance technologies as an add-on to robotic surgery @ 320 Hackerman Hall
Jun 11 @ 12:00 pm – 1:00 pm

Abstract:

Image-guided interventions are gaining increasing interest from the surgical and radiological disciplines. In addition to radiological imaging technologies, radionuclear and optical imaging have the potential to visualize molecular features of disease. To fully exploit these technologies, it is essential to understand their advantages and shortcomings. With this knowledge, the clinically applied guidance approaches and the well-known surgical planning modalities such as US, CT, MRI, SPECT, and PET can be placed in perspective. The approaches used for the radionuclear and optical modalities are often complementary, a feature that can be exploited further through hybrid tracers, whose molecular parameters can be detected both at depth (using radionuclear imaging modalities) and superficially (using optical imaging modalities).

The (da Vinci) robot provides an ideal platform for integrating image-guidance technologies into clinical routine, and a solid basis for the international dissemination of successful technologies.

In his talk, Dr. van Leeuwen will illustrate the clinical implementation of a range of radionuclear and optical guidance technologies during robot-assisted laparoscopic prostatectomy (RALP) combined with sentinel lymph node dissection.

Bio:

Fijs did his master's in Chemistry in the Bioinorganic and Coordination Chemistry group (prof. dr. Jan Reedijk; Leiden Institute of Chemistry), followed by a PhD at the MESA+ Institute for Nanotechnology (University of Twente) in the former Supramolecular Chemistry and Technology group headed by prof. dr. David Reinhoudt. During this period he also performed research at the Irradiation & Development business unit (dr. Ronald Schram) of the Dutch Nuclear Research and Consultancy Group (NRG) in Petten. After obtaining his PhD he shifted to biomedical research, pursuing a postdoctoral fellowship in the Chemical Biology group (dr. Huib Ovaa) at the department of Cellular Biochemistry of the Netherlands Cancer Institute – Antoni van Leeuwenhoek Hospital (NKI-AvL). After being awarded a personal VENI grant from the Dutch Research Council he moved, within the NKI-AvL, to the clinical departments of Radiology and Nuclear Medicine, where he became a senior postdoctoral fellow in the medical image processing group of dr. Kenneth Gilhuijs. Under the guidance of the diagnostic oncology division heads, initially prof. dr. Marc van de Vijver and later prof. dr. Laura van ‘t Veer, he began to set up his own molecular imaging research line. In 2009 he obtained a personal cancer career award from the Dutch Cancer Society (KWF) for the development of multimodal imaging agents and was appointed associate staff scientist at the NKI-AvL. In 2010 he obtained a VIDI grant from the Dutch Research Council for the development of imaging agents for surgical guidance. Soon afterwards he moved to the Leiden University Medical Center (LUMC) to become an associate professor in the department of Radiology (2011). There he received an ERC Starting Grant (2012) for the illumination of peripheral nerve structures. At the LUMC he heads the highly multidisciplinary Interventional Molecular Imaging laboratory, wherein the “from molecule to man” principle is actively pursued.

Nicolas Padoy: Radiation Exposure Monitoring in the Hybrid Operating Room using a Multi-camera RGBD System @ B17 Hackerman Hall
Jun 15 @ 10:00 am – 11:00 am

Abstract:

The growing use of image-guided minimally invasive surgical procedures is confronting clinicians and surgical staff with new radiation exposure risks from X-ray imaging devices. Furthermore, the current surgical practice of wearing a single dosimeter at chest level does not provide a sufficiently accurate estimate of radiation absorption throughout the body. Our aim is therefore to develop a global radiation awareness system that can estimate intra-operative radiation exposure more accurately, thereby increasing staff awareness of radiation exposure risks and enabling the implementation of well-adapted safety measures.
In this talk, I will present our work towards such a system. I will first present a computer vision approach that combines data from wireless dosimeters with a simulation of radiation propagation to compute a global radiation risk map in the area near the X-ray device. A multi-camera RGBD system is used to estimate the layout of the room and to display the estimated risk map using augmented reality. By using real-time wireless dosimeters, we can both calibrate the simulation and validate its accuracy at specific locations in real time.
I will then describe our recent work on human pose estimation and activity recognition using RGBD data recorded during real X-ray guided interventions. Among other applications, estimating the poses of the persons present in the room will allow the radiation exposure per body part to be computed over time, and recognizing surgical activities will permit correlating those activities with the radiation risk they pose to staff and clinicians.
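
As a rough, self-contained illustration of the calibrate-then-evaluate idea in the paragraph above (the inverse-square scatter model, positions, readings, and grid below are invented for the sketch and are far simpler than the propagation simulation used in the actual system):

import numpy as np

# Hypothetical scatter-source position of the X-ray device (metres, room frame)
source = np.array([0.0, 0.0, 1.0])

# Assumed dosimeter positions and their measured dose rates (arbitrary units)
dosimeters = np.array([[1.0, 0.5, 1.5], [2.0, 0.0, 1.2], [1.5, 1.5, 1.0]])
measured = np.array([0.9, 0.25, 0.35])

# Toy propagation model: dose ~ k / r^2. Fit the single free parameter k
# to the dosimeter readings by least squares (the "calibration" step).
r2 = np.sum((dosimeters - source) ** 2, axis=1)
model = 1.0 / r2
k = model @ measured / (model @ model)

# Evaluate the calibrated model on a coarse floor grid to get a risk map
xs, ys = np.meshgrid(np.linspace(-3, 3, 61), np.linspace(-3, 3, 61))
pts = np.stack([xs, ys, np.full_like(xs, 1.5)], axis=-1)  # chest height
risk = k / np.sum((pts - source) ** 2, axis=-1)

print(f"fitted k = {k:.3f}; residuals = {k * model - measured}")

The actual system replaces the single-parameter inverse-square model with a full simulation of scatter, and uses the RGBD-derived room layout rather than a fixed grid.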


Bio:

Nicolas Padoy is an Assistant Professor at the University of Strasbourg, holding a Chair of Excellence in medical robotics within the ICube laboratory. He leads the research group CAMMA on Computational Analysis and Modeling of Medical Activities, which focuses on computer vision, activity recognition and the applications thereof to surgical workflow analysis and human-machine cooperation during surgery. He graduated with a Maîtrise in Computer Science from the Ecole Normale Supérieure de Lyon in 2003 and with a Diploma in Computer Science from the Technische Universität München (TUM), Munich, in 2005. He completed his PhD jointly between the Chair for Computer Aided Medical Procedures at TUM and the INRIA group MAGRIT in Nancy. Subsequently, he was a postdoctoral researcher and later an Assistant Research Professor in the Laboratory for Computational Sensing and Robotics at the Johns Hopkins University, USA.

Special Seminar: Yunhui Liu: Towards Fusion of Vision with Robot Motion @ 320 Hackerman Hall
Jul 31 @ 12:00 pm – 1:00 pm

Abstract

Humans rely heavily on visual feedback from their eyes to control their motion. To develop a robotic vision system that functions like human eyes, one of the crucial and difficult problems is how to effectively incorporate visual information into the motion control of a robot whose dynamics are highly nonlinear. This talk presents our recent efforts and latest results on vision-based control of robotic systems. The controllers developed embed feedback from visual sensors into the low-level loop of robot motion control. It will be demonstrated that, through an innovative and simple design of the visual feedback, we can solve several difficult problems in visual servoing, such as uncalibrated dynamic visual servoing, trajectory tracking by nonholonomic mobile robots without position measurement, visual odometry, and model-free manipulation of deformable objects like soft tissues. Applications of the visual servoing approaches in robotic surgery will also be introduced.
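
For background, here is a minimal kinematic sketch of classical image-based visual servoing, the family of problems the talk addresses. This is textbook material (the v = -λ L⁺ e law for point features), not Professor Liu's controllers, which operate at the dynamic level and without calibration:

import numpy as np

LAMBDA = 0.5  # assumed control gain

def interaction_matrix(x, y, Z):
    """Classic 2x6 interaction (image Jacobian) matrix of a point feature
    (x, y) in normalized image coordinates at depth Z."""
    return np.array([
        [-1 / Z, 0, x / Z, x * y, -(1 + x ** 2), y],
        [0, -1 / Z, y / Z, 1 + y ** 2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths):
    """Camera velocity screw that drives the feature error toward zero."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -LAMBDA * np.linalg.pinv(L) @ error

# Example: four point features slightly offset from their desired positions
feats = [(0.1, 0.1), (-0.1, 0.1), (-0.1, -0.1), (0.1, -0.1)]
goal = [(0.12, 0.1), (-0.08, 0.1), (-0.08, -0.1), (0.12, -0.1)]
print(ibvs_velocity(feats, goal, depths=[1.0] * 4))

Note that this law needs the depths Z and a calibrated camera; removing such requirements is precisely what the uncalibrated approaches in the talk are about.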


Bio

Yunhui Liu received his B.Eng. degree in Applied Dynamics from Beijing Institute of Technology, China, in 1985, his M.Eng. degree in Mechanical Engineering from Osaka University in 1989, and his Ph.D. degree in Mathematical Engineering and Information Physics from the University of Tokyo, Japan, in 1992. He worked at the Electrotechnical Laboratory, MITI, Japan, from 1992 to 1995 as a Research Scientist. He has been with the Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, since 1995, and is currently a Professor, Director of the Networked Sensors and Robotics Laboratory, and Director of the Medical Robotics Laboratory. Professor Liu is interested in vision-based robot control, medical robotics, aerial robotics, multi-fingered grasping, and robot applications. His research has been widely funded by the Research Grants Council, the Innovation and Technology Fund, and the Quality Education Fund in Hong Kong, and by the national 863 and 973 programs in Mainland China. He has published over 200 papers in refereed professional journals and international conference proceedings, and has received a number of best paper awards from international journals and major international conferences in robotics. He is the Editor-in-Chief of Robotics and Biomimetics and an Editor of Advanced Robotics, and was an Associate Editor of IEEE Transactions on Robotics and Automation. He was listed among the Highly Cited Authors (Engineering) by Thomson Reuters in 2013. Professor Liu was the General Chair of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). He is a Fellow of the IEEE.

Special Seminar: Darius Burschka: Vision-Based Interaction in Dynamic Scenes @ B17 Hackerman Hall
Aug 26 @ 12:00 pm – 1:00 pm

Abstract

While perception and modelling of static environments has become a well-researched problem, independent motions in the scene are still difficult to acquire and represent for robotic tasks. The challenges range from the significantly higher sampling rate required for a correct representation of the motion to an appropriate representation of actions and behaviours in a knowledge database. For the observation of human actions, the typical sampling rate of a standard video camera is not sufficient to capture the details of a transport action beyond registering the resulting position change of an object. A high-speed motion tracking system is necessary to analyse the intentions of the agent while an action is performed. At the same time, the dynamic change in the scene is often used not only for task analysis but also for the implementation of reactive behaviours on systems. An interesting aspect in this context is finding a robust representation for the information exchange that is insensitive to calibration errors in both the visual and the control parts of the system. Experiments show that exchanging information in three-dimensional Cartesian space is not optimal, although it is easier for the human operator to understand.

In my talk, I will present the newest research results from my group, which allow fast labelling and estimation of the physical properties of dynamic objects in manipulation scenarios, and which also allow low-level reactive behaviours to be implemented on mobile and flying robots without exact camera calibration. The hybrid stereo system we developed allows motion acquisition at up to 120 Hz, providing a better sampling of human behaviours. I will also present our work on motion representation for trajectory planning and collision avoidance on our robotic car platform RoboMobil.

Speaker Bio

Darius Burschka received his PhD degree in Electrical and Computer Engineering in 1998 from the Technical University Munich (TUM), in the field of vision-based navigation and map generation with binocular stereo systems. In 1999, he was a Postdoctoral Associate at Yale University, New Haven, Connecticut, where he worked on laser-based map generation and landmark selection from video images for vision-based navigation systems. From 1999 to 2003, he was an Associate Research Scientist at the Johns Hopkins University. From 2003 to 2005, he was an Assistant Research Professor in Computer Science at JHU.
Currently, he is an Associate Professor in Computer Science at TUM, Germany, where he heads the computer vision and perception group. He collaborates closely with the German Aerospace Center (DLR). He is Co-Chair of the IEEE RAS Technical Committee for Computer and Robot Vision, Co-Chair of the Computer Vision and Perception Topic Group at euRobotics (EU Horizon 2020), and a Senior Member of the IEEE.

LCSR/ERC Seminar: Welcome/Welcome Back Town Hall
Sep 2 @ 12:00 pm – 1:00 pm

Abstract

This is the Fall 2015 Kick-Off Seminar, presenting an overview of LCSR, useful information, and an introduction to the faculty and labs.


Francois Lacombe: Optical Biopsy: A New Paradigm or a Multiple Scale Shift? @ B17 Hackerman Hall
Sep 9 @ 12:00 pm – 1:00 pm

Abstract

When it was first introduced a decade ago in gastroenterology, optical biopsy was the long-awaited solution for giving physicians the real-time information that could improve their practice, optimize their workflow, and accelerate their decision-making process. Ten years later, it has proven to bring a much bigger benefit: multiple specialists, even in different and remote locations, can now share, within a few milliseconds, histological information collected in real time directly inside their patients. As diagnostic and surgical endoscopy shifted to the microscopic scale, histology data and medical evidence became digital and immaterial, allowing geographic and temporal scales, two other very important limitations in patient management, to collapse. This talk will show an example of this fundamental shift as we experienced it at Mauna Kea Technologies.


Noah Cowan: Control Theory as a Framework for Biology: The “Plant” is the Animal @ B17 Hackerman Hall
Sep 16 @ 12:00 pm – 1:00 pm

Abstract

Control systems engineering commonly relies on the “separation principle,” which allows designers to design state observers and controllers independently. Biological sensorimotor control systems, however, routinely violate the requirements for separability. 1) Sensory modulation: animals often rely on a strategy known as “active sensing,” in which they use their own movements to alter the spatiotemporal patterns of sensory information and thereby improve task-level performance. 2) Motor modulation: in addition, animals routinely coordinate their motor systems to simplify task-level control, i.e. they actuate their effectors in a way that simplifies the task-level “plant.” Here, we integrate “top-down” and “bottom-up” modeling and analysis of sensorimotor control and active sensing in an ideally suited organism, the weakly electric glass knifefish, and provide new clues about how the brain and body work together to control movement.
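
For readers unfamiliar with the separation principle the abstract contrasts against, here is a minimal sketch using a textbook double-integrator example (chosen for illustration; it is not from the talk): the controller and observer are designed independently, yet the closed loop inherits exactly the union of their eigenvalues.

import numpy as np

# Double integrator: x = [position, velocity]
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])  # only position is measured

# Controller gain K places the eigenvalues of A - B K at {-1, -2}
K = np.array([[2.0, 3.0]])
# Observer gain L places the eigenvalues of A - L C at {-4, -5}
L = np.array([[9.0], [20.0]])

# Closed loop of plant + observer-based controller, joint state [x; x_hat]
top = np.hstack([A, -B @ K])
bottom = np.hstack([L @ C, A - B @ K - L @ C])
Acl = np.vstack([top, bottom])

# Separation: closed-loop eigenvalues are the union of the two designs
print(np.sort(np.linalg.eigvals(Acl).real))  # -> [-5, -4, -2, -1]

Active sensing breaks this clean split because the "controller" (the animal's movement) also shapes what the "observer" (its sensory system) gets to measure.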


Speaker Bio

For nearly 20 years my research has been devoted to understanding navigation and control in machines and animals. My students and postdocs conduct original experiments and computational analyses on both biological and robotic systems, with a focus on applying concepts from dynamical systems and control theory to garner new insights into the principles that underlie neural computation. This research program has been recognized by several awards, including a Presidential Early Career Award for Scientists and Engineers (PECASE) and a James S. McDonnell Complex Systems Scholar award.

Kevin Cleary: The Bioengineering Initiative at Children’s National Medical Center; Device Development for the Pediatric Environment @ B17 Hackerman Hall
Sep 23 @ 12:00 pm – 1:00 pm

Abstract

This talk will give an overview of the recently established Bioengineering Initiative in the Sheikh Zayed Institute for Pediatric Surgical Innovation at Children’s National Medical Center, Washington, DC, USA. The mission of the Bioengineering Initiative is to serve as an engineering research resource for the hospital and to work with clinical partners to develop technology for minimally invasive interventions. The technology developments include medical devices, medical robotics, image registration and fusion, and image-guided navigation for pediatric interventions. The clinical applications include laparoscopic abdominal surgery, knee arthroscopy, craniosynostosis, ureteroscopy, and cochlear implant surgery. The institute includes scientists, radiologists, and surgeons who are dedicated to improving the precision and decreasing the invasiveness of pediatric procedures.

Speaker Bio

Kevin Cleary, PhD
Technical Director, Bioengineering Initiative
Sheikh Zayed Institute for Pediatric Surgical Innovation
Children’s National Medical Center, Washington, DC, USA

LCSR/ERC seminar cancelled for IROS
Sep 30 @ 12:00 pm – 1:00 pm

Peter Kazanzides: Remote Teleoperation for Satellite Servicing (“Satellite Surgery”) @ B17 Hackerman Hall
Oct 7 @ 12:00 pm – 1:00 pm

Abstract

We are developing methods for telerobotic on-orbit servicing of spacecraft under ground-based supervisory control by human operators, to perform tasks in the presence of uncertainty and telemetry time delays of several seconds. As an initial application, we consider the case where the remote slave robot is teleoperated to cut the tape that secures a flap of multi-layer insulation (MLI) over a satellite access panel. This talk will present a delay-tolerant control methodology, using virtual fixtures, hybrid position/force control, and environment modeling, that is robust to modeling and registration errors. The task model is represented by graphical primitives and virtual fixtures on the teleoperation master and by a hybrid position/force controller on the slave robot. The virtual fixtures guide the operator through a model-based simulation of the task, and the goal of the slave controller is to reproduce this action (after a few seconds of delay) or, if measurements are not consistent with the models, to stop motion and alert the operator. Experiments, including IRB-approved multi-user studies, are performed with a ground-based test platform where the master console of a da Vinci Research Kit is used to teleoperate a Whole Arm Manipulator (WAM) robot.
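
As a toy illustration of the replay-after-delay-or-halt behavior described above (the command format, rate, and threshold are invented for this sketch and do not reflect the actual controller):

import collections

DELAY_STEPS = 30   # assumed ~3 s one-way delay at a 10 Hz command rate
FORCE_TOL = 5.0    # N; assumed threshold for model/measurement mismatch

def run_slave(master_cmds, measured_forces):
    """Replay the master's model-based commands after a fixed delay,
    halting and alerting if the measured force departs from the model."""
    queue = collections.deque()
    executed = []
    for step, (cmd, force) in enumerate(zip(master_cmds, measured_forces)):
        queue.append(cmd)              # command still "in flight" to the slave
        if step < DELAY_STEPS:
            continue
        pose, expected_force = queue.popleft()
        if abs(force - expected_force) > FORCE_TOL:
            return executed, f"halted at step {step}: model mismatch"
        executed.append(pose)          # measurement consistent; execute it
    return executed, "completed"

Each command here pairs a pose with the force the master-side simulation predicts, so the slave can check consistency before acting, which is the essence of the delay-tolerant scheme.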


Speaker Bio

Peter Kazanzides has been working in the field of surgical robotics since 1989, when he started as a postdoctoral researcher with Russell Taylor at the IBM T.J. Watson Research Center. Dr. Kazanzides co-founded Integrated Surgical Systems (ISS) in November 1990 to commercialize the robotic hip replacement research performed at IBM and the University of California, Davis. As Director of Robotics and Software, he was responsible for the design, implementation, validation and support of the ROBODOC System, which has been used for more than 20,000 hip and knee replacement surgeries. Dr. Kazanzides joined the Engineering Research Center for Computer-Integrated Surgical Systems and Technology (CISST ERC) in December 2002 and currently holds an appointment as a Research Professor of Computer Science at Johns Hopkins University. This talk highlights the extension of his research in computer assisted surgery to encompass “surgery” on satellites.

Laboratory for Computational Sensing + Robotics