September 21, 2016 @ 12:00 pm – 1:00 pm
B17 Hackerman Hall
Among the most difficult procedures in orthopedic and trauma surgery is the placement of screws to repair complex fractures. Using a large number of X-ray images (we have observed surgeries with up to 246 images), the surgeon must drill a guide wire through the bone fragments. The difficulty is further increased by muscle and other tissue covering the bones (e.g., around the pelvis).
Our system comprises a traditional X-ray machine (C-arm), a 3D camera mounted on this X-ray machine, and widely available 3D Computed Tomography (CT) images to guide the surgeon. Rather than seeing simple 2D X-ray images, the surgeon sees a 3D view of the bones, the drill, the patient surface, and even the surgeon’s hands in real time. This “Superman” view, referred to as Interventional 3D Augmented Reality, was shown in our preclinical experiments to reduce procedure duration, radiation dose, number of X-ray images, and complications. In summary, our system increases patient safety and represents the future of interventional X-ray imaging.
Bernhard Fuerst is a research engineer at the Engineering Research Center at Johns Hopkins University. He received his Bachelor’s degree in Biomedical Computer Science from the University for Medical Technology in Austria in 2009 and his Master’s degree in Biomedical Computing from the Technical University of Munich, Germany, in 2011. During his studies he joined Siemens Corporate Research in Princeton to research biomechanical simulations for compensation of respiratory motion under Dr. Ali Kamen’s supervision, and Georgetown University to investigate techniques for meta-optimization using particle swarm optimizers under Dr. Kevin Cleary’s supervision. Since joining Johns Hopkins University, he has worked on establishing Dr. Nassir Navab’s research group, focusing on robotic ultrasound, minimally invasive nuclear imaging, and bioelectric localization and navigation.