C-arm X-ray systems are the workhorse modality for the guidance of percutaneous orthopaedic surgical procedures. However, two-dimensional observations of the three-dimensional anatomy suffer from the effects of projective simplification. Consequently, many X-ray images from various orientations need to be acquired for the surgeon to accurately assess the spatial relations between the patient’s anatomy and the surgical tools.
In this paper, we present an on-the-fly surgical support system that provides guidance using augmented reality and can be used in quasi-unprepared operating rooms. The proposed system builds upon a multi-modality marker and a simultaneous localization and mapping technique to co-calibrate an optical see-through head-mounted display to a C-arm fluoroscopy system. Annotations on the 2-D X-ray images can then be rendered as virtual objects in 3-D, providing surgical guidance. In a feasibility study on a semi-anthropomorphic phantom, we found the accuracy of our system to be comparable to the traditional image-guided technique while substantially reducing the number of acquired X-ray images as well as the procedure time. Our promising results encourage further research on the interaction between virtual and real objects, which we believe will directly benefit the proposed method. Further, we would like to explore the capabilities of our on-the-fly augmented reality support system in a larger study directed toward common orthopaedic interventions.
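The key rendering step described above is the back-projection of an annotated 2-D X-ray pixel into a 3-D viewing ray in the co-calibrated world frame, where it can be displayed as a virtual guidance line. As an illustration only (the function, the frame names, and the intrinsic parameters below are our assumptions under a pinhole C-arm model, not the authors' implementation), this step might be sketched as:

```python
import numpy as np

# Hedged sketch: once the HMD and C-arm share a world frame via the
# multi-modality marker co-calibration, an annotated pixel maps to a
# 3-D ray from the X-ray source through the anatomy.

def annotation_to_world_ray(u, v, K, T_world_carm):
    """Back-project pixel (u, v) into a 3-D ray in world coordinates.

    K            : 3x3 intrinsic matrix of the pinhole C-arm model.
    T_world_carm : 4x4 pose of the C-arm source in the world frame.
    Returns (origin, direction) of the ray.
    """
    # Homogeneous pixel -> unit ray direction in the C-arm frame.
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    d_cam /= np.linalg.norm(d_cam)

    R = T_world_carm[:3, :3]
    t = T_world_carm[:3, 3]
    origin = t               # X-ray source position in the world frame
    direction = R @ d_cam    # ray direction rotated into the world frame
    return origin, direction

# Toy example with an identity pose and assumed intrinsics: the
# principal point back-projects along the optical axis.
K = np.array([[1000.0, 0.0, 256.0],
              [0.0, 1000.0, 256.0],
              [0.0, 0.0, 1.0]])
o, d = annotation_to_world_ray(256.0, 256.0, K, np.eye(4))
```

Rendering the returned ray in the optical see-through display then gives the surgeon the spatial cue that the flat 2-D annotation alone cannot convey.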
Proper implant alignment is a critical step in total hip arthroplasty (THA) procedures. In current practice, correct alignment of the acetabular cup is verified in C-arm X-ray images that are acquired in an anterior–posterior (AP) view. Favorable surgical outcome is, therefore, heavily dependent on the surgeon’s experience in understanding the 3D orientation of a hemispheric implant from 2D AP projection images. This work proposes an easy-to-use intra-operative component planning system based on two C-arm X-ray images, combined with 3D augmented reality (AR) visualization that simplifies impactor and cup placement according to the planning by providing a real-time RGBD data overlay. We evaluate the feasibility of our system in a user study comprising four orthopedic surgeons at the Johns Hopkins Hospital, and report errors in translation, anteversion, and abduction as low as 1.98 mm, 1.10°, and 0.53°, respectively. The promising performance of this AR solution shows that deploying this system could eliminate the need for excessive radiation, simplify the intervention, and enable reproducibly accurate placement of acetabular implants.
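The anteversion and abduction errors reported above refer to the standard radiographic cup angles. As a hedged sketch only (we assume Murray's radiographic definitions and a patient frame with x lateral, y anterior, z superior; the function below is illustrative, not the paper's implementation), these angles can be recovered from the cup's axis as follows:

```python
import numpy as np

# Illustrative sketch, assuming Murray's radiographic definitions:
# anteversion is the tilt of the cup axis out of the coronal (x-z)
# plane; abduction is the in-plane angle from the longitudinal (z)
# axis. The frame convention and function name are assumptions.

def cup_angles(axis):
    """Radiographic anteversion and abduction (in degrees) of an
    acetabular cup, given its axis in the assumed patient frame."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    anteversion = np.degrees(np.arcsin(a[1]))       # out-of-plane tilt
    abduction = np.degrees(np.arctan2(a[0], a[2]))  # in-plane tilt
    return anteversion, abduction

# A cup placed at the commonly targeted 15 deg anteversion and
# 40 deg abduction should be recovered exactly:
av_t, ab_t = np.radians(15.0), np.radians(40.0)
axis = [np.cos(av_t) * np.sin(ab_t),
        np.sin(av_t),
        np.cos(av_t) * np.cos(ab_t)]
av, ab = cup_angles(axis)
```

Comparing the angles of the placed cup against the planned ones yields error measures of the kind quoted in the abstract.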
Pre-operative CT data is available for several orthopedic and trauma interventions and is mainly used to identify injuries and plan the surgical procedure. In this work, we propose an intuitive augmented reality environment that allows visualization of pre-operative data during the intervention, with an overlay of the optical information from the surgical site. The pre-operative CT volume is first registered to the patient by acquiring a single C-arm X-ray image and using 3D/2D intensity-based registration. Next, we use an RGBD sensor on the C-arm to fuse the optical information of the surgical site with the patient’s pre-operative medical data and provide an augmented reality environment. The 3D/2D registration of the pre- and intra-operative data allows us to maintain a correct visualization each time the C-arm is repositioned or the patient moves. An overall mean target registration error (mTRE) of 5.24 ± 3.09 mm (mean ± standard deviation) was measured, averaged over 19 C-arm poses. The proposed solution enables the surgeon to visualize pre-operative data overlaid with information from the surgical site (e.g., the surgeon’s hands and surgical tools) for any C-arm pose, and avoids the line-of-sight issues and long setup times present in commercially available systems.
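The mTRE figure quoted above pools target errors over the evaluated C-arm poses. A minimal sketch of the metric itself (the helper below is illustrative; the function name and array layout are our assumptions, not the authors' code):

```python
import numpy as np

# Hedged sketch of the mean target registration error (mTRE): the
# average Euclidean distance between corresponding 3-D target points
# after 3D/2D registration, pooled over targets (and, in the paper's
# evaluation, over multiple C-arm poses).

def mean_tre(targets_true, targets_registered):
    """Both inputs are (N, 3) arrays of corresponding 3-D targets.
    Returns (mean error, standard deviation) in the input units."""
    errors = np.linalg.norm(np.asarray(targets_true, dtype=float)
                            - np.asarray(targets_registered, dtype=float),
                            axis=1)
    return errors.mean(), errors.std()

# Toy example: two targets displaced by 3 mm and 4 mm respectively,
# giving a mean error of 3.5 mm with a standard deviation of 0.5 mm.
m, s = mean_tre([[0, 0, 0], [10, 0, 0]],
                [[0, 0, 3], [10, 4, 0]])
```

Reporting both the mean and the standard deviation, as the abstract does, captures the spread of the error across poses rather than a single aggregate.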