Abhilash Pandya, “Robot-Integrated Raman Spectrometer for Cancer Detection and Visualization”

When:
September 24, 2014 @ 12:00 pm – 1:00 pm
Where:
B17 Hackerman Hall
Cost:
Free

Abstract

Robotic systems now allow surgeons to navigate registered tools and sensors in vivo. Image guidance is a technique that often combines augmented reality, virtual reality, and imaging data to provide accurate localization and real-time surgical navigation. Raman spectroscopy is a powerful laser-based analysis technique that allows real-time tissue diagnosis (e.g., cancer vs. normal). It is an optical technique in which monochromatic laser light excites a tissue, inducing characteristic vibrational motion of its chemical bonds. The inelastically scattered light is then collected and analyzed, providing a distinctive molecular “fingerprint” of the tissue that can be deciphered using classification techniques such as Support Vector Machines.
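
As an illustration of this classification step, a minimal sketch in Python with scikit-learn is given below. It assumes each Raman spectrum is a fixed-length intensity vector and uses randomly generated placeholder data; the talk's actual spectra, preprocessing, and model settings are not specified here.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical dataset: 200 spectra, each sampled at 1024 wavenumbers.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1024))   # placeholder spectra (random, for illustration)
y = rng.integers(0, 2, size=200)   # 0 = normal, 1 = cancer

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize each wavenumber channel, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))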

A robot-integrated Raman system combined with an augmented reality presentation of mutually registered diagnostic information could result in faster and more accurate tissue resections. We demonstrate near-real-time diagnosis of the tissue being analyzed (e.g., cancer) with the corresponding localization information displayed within an image-guided framework. In our system, a portable Raman probe was attached to a mechanical arm and used to scan, classify, and visualize objects within a phantom skull. We discuss the implementation of the integrated system, the classification/diagnosis algorithms developed, and the visualization techniques used, and we highlight future steps for its development and eventual application.
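
The scan-classify-visualize loop described above might be organized as in the following sketch. Here move_arm_to, probe_read_spectrum, and overlay_label are hypothetical stand-ins for the robot, probe, and image-guidance interfaces, which the abstract does not specify.

import numpy as np

def move_arm_to(point):
    """Stub: command the mechanical arm to position the Raman probe."""
    pass

def probe_read_spectrum():
    """Stub: acquire one Raman spectrum from the probe."""
    return np.random.default_rng().normal(size=1024)

def overlay_label(point, label):
    """Stub: annotate the registered point in the image-guided display."""
    print(f"point {point}: {'cancer' if label else 'normal'}")

def scan_and_annotate(scan_points, classifier):
    # For each registered scan point: position the probe, classify the
    # measured spectrum, and overlay the diagnosis at that location.
    for point in scan_points:
        move_arm_to(point)
        spectrum = probe_read_spectrum()
        label = classifier.predict([spectrum])[0]
        overlay_label(point, label)

# e.g., using the trained pipeline from the earlier sketch:
# scan_and_annotate([(0.0, 0.0, 0.0), (10.0, 0.0, 5.0)], clf)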


Speaker Bio

From 1986 to 1988, Dr. Pandya worked as an engineer for Virogen Inc., developing solutions for a robotics-based AIDS blood-testing system. From 1988 to 1998, he worked at NASA Johnson Space Center’s Graphics Research and Analysis Facility (GRAF) and Remote Operator Interaction Laboratory (ROIL), under various Lockheed Martin contracts for NASA’s Flight Crew Support Division. His work primarily involved Space Station robotics, astronaut suit simulation, and the development of software for an immersive, human-model-based virtual reality system for station applications. From 1998 to 2002, he worked in the Neurosurgery Department at Harper Hospital (Detroit, MI), where he helped develop image-guided surgery software and hardware for use in the operating room and led a team of engineers in research on robotic (Neuromate) and image-guided neurosurgery. He received a B.S. and an M.S. from the University of Michigan and a Ph.D. in Bioengineering/Scientific Computing from Wayne State University (2004). Since 2004, he has been a faculty member in the ECE Department at Wayne State University, with joint appointments in Bioengineering and Surgery.

Laboratory for Computational Sensing + Robotics