Augmented Reality for Ultrasound-Guided Surgical Interventions by Raj Shekar at the 2019 MAVRIC Conference

The applications of augmented reality (AR) in medicine continue to be explored. Ever-increasing computational power now makes real-time fusion of live images possible, enabling surgical applications. Ultrasound guidance is especially common in surgery, ranging from simple procedures such as vascular access to complex surgical interventions. The focus of this talk is two ultrasound-guided surgical AR systems we have developed.

Laparoscopic surgery is an attractive alternative to conventional open surgery and is known to improve outcomes, cause less scarring, and lead to significantly faster patient recovery. Standard laparoscopy, however, cannot visualize anatomy beneath the surfaces in front of the laparoscope. Conventional laparoscopic ultrasound imaging provides information on subsurface anatomy, but it can only be integrated with the laparoscopic images in the surgeon's mind. Additionally, the surgeon's attention is drawn away from the laparoscopy screen to view the ultrasound images on a separate monitor. We have developed an AR system that fuses real-time ultrasound images with live laparoscopic video, allowing explicit visualization of subsurface structures (blood vessels, bile ducts, tumors, etc.) in the surgical field of view. Potential applications of this system include laparoscopic partial hepatectomy (liver resection) and laparoscopic radiofrequency ablation of liver cancer. A first-generation prototype was tested on 13 human cases at Children's National, the results of which led to the development of a second-generation prototype that is being tested preclinically and will be evaluated clinically at Children's National and University of Pittsburgh Medical Center.
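To make the fusion step concrete, the Python sketch below illustrates one common way such an overlay can be computed per frame: warp the ultrasound image into the laparoscopic camera's pixel coordinates and alpha-blend it into the video frame. This is our illustration, not the talk's actual pipeline; the homography H is hypothetical and would, in a real system, be derived from probe/camera tracking and calibration.

```python
import cv2
import numpy as np

def fuse_frames(lap_frame: np.ndarray,
                us_image: np.ndarray,
                H: np.ndarray,
                alpha: float = 0.4) -> np.ndarray:
    """Warp the ultrasound image into the laparoscopic view and alpha-blend.

    lap_frame : HxWx3 uint8 laparoscopic video frame
    us_image  : ultrasound image, already converted to 3-channel uint8
    H         : hypothetical 3x3 homography mapping ultrasound pixels to
                camera pixels (in practice obtained from tracking and
                calibration, which this sketch does not cover)
    """
    h, w = lap_frame.shape[:2]
    # Map the ultrasound image into the camera's pixel grid.
    warped = cv2.warpPerspective(us_image, H, (w, h))
    # Blend only where the warped ultrasound has content, so the rest
    # of the surgical field of view stays untouched.
    mask = warped.sum(axis=2) > 0
    blended = cv2.addWeighted(lap_frame, 1.0 - alpha, warped, alpha, 0.0)
    fused = lap_frame.copy()
    fused[mask] = blended[mask]
    return fused
```

A real-time system would run this on every video frame, updating H as the tracked probe and laparoscope move.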

Using ultrasound guidance during interventions generally requires the clinician to split their attention between the ultrasound screen and the patient while advancing a needle or other instrument into the body. The hand-eye coordination this demands is a skill that takes years of training to acquire. Overlaying live ultrasound on the clinician's view has numerous advantages, including potentially reducing complications and shortening the learning curve. Using the Microsoft HoloLens as the wearable AR platform, we have developed an app that projects the ultrasound image directly onto the part of the body being imaged, giving the user a real-time view of the underlying anatomical structures. Additional features allow the user to apply various image enhancements through voice commands. Whether used for education or in actual procedures, this system enhances precision, efficiency, and user experience. User studies are underway to quantify these AR-enabled benefits.
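The geometry behind projecting the image onto the body can be sketched as follows: given a tracked probe pose and an image-to-probe calibration, compute the world-space corners of the ultrasound image quad so the headset can render it at the scan plane. This is an illustrative sketch only (HoloLens apps are typically written in C#/Unity, and the app's internals are not described in the talk); all names and the calibration transform are assumptions.

```python
import numpy as np

def ultrasound_quad_corners(T_world_probe: np.ndarray,
                            T_probe_image: np.ndarray,
                            width_m: float,
                            depth_m: float) -> np.ndarray:
    """World-space corners of the ultrasound scan plane.

    T_world_probe : 4x4 pose of the tracked probe in world coordinates
    T_probe_image : hypothetical 4x4 calibration mapping scan-plane
                    coordinates (meters, origin at the transducer face)
                    into the probe's frame
    width_m, depth_m : physical width and imaging depth of the scan plane
    """
    # Scan-plane corners in homogeneous coordinates (the z = 0 plane).
    corners = np.array([
        [-width_m / 2, 0.0,     0.0, 1.0],  # top-left, at transducer face
        [ width_m / 2, 0.0,     0.0, 1.0],  # top-right
        [ width_m / 2, depth_m, 0.0, 1.0],  # bottom-right
        [-width_m / 2, depth_m, 0.0, 1.0],  # bottom-left
    ])
    T = T_world_probe @ T_probe_image       # chain calibration and tracking
    world = (T @ corners.T).T               # transform all four corners
    return world[:, :3]                     # drop the homogeneous component
```

Re-evaluating these corners each frame as the probe moves keeps the rendered image anchored to the patient's anatomy.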
