Knee arthroscopic navigation using virtual-vision rendering and self-positioning technology
Published in: International Journal for Computer Assisted Radiology and Surgery, 2020-03, Vol. 15 (3), p. 467-477
Format: Article
Language: English
Summary:

Purpose
Knee arthroscopy suffers from a lack of depth information and easy occlusion of the visual field. To address these limitations, we propose an arthroscopic navigation system based on self-positioning technology, guided by virtual-vision views. The system works without any external tracking devices or added markers, which increases the working range and improves the robustness of rotating operations.
Methods
The fly-through view and the global positioning view for surgical guidance are rendered in real time through virtual-vision rendering. The fly-through view lets surgeons navigate the arthroscope through the internal anatomical structures from a virtual camera perspective. The global positioning view shows the posture of the arthroscope relative to the preoperative model in a transparent manner. The posture of the arthroscope is estimated by fusing visual and inertial data with visual–inertial stereo SLAM. A flexible calibration method is proposed that transforms the posture of the arthroscope in the physical world into the virtual-vision rendering framework, providing the navigation system with self-positioning information.
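The calibration step described above amounts to mapping a rigid pose (rotation plus translation) from the physical-world frame estimated by SLAM into the frame of the virtual renderer. A minimal sketch of such a transform chain, assuming a hypothetical fixed 4×4 calibration matrix `T_calib` (illustrative values, not the paper's calibration):

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration transform mapping the SLAM (physical-world) camera
# frame into the virtual-vision rendering frame -- obtained once, offline.
T_calib = make_pose(np.eye(3), np.array([10.0, -5.0, 2.0]))  # illustrative values

# Arthroscope pose estimated by the visual-inertial SLAM in the world frame.
T_slam = make_pose(np.eye(3), np.array([1.0, 2.0, 3.0]))

# Pose that would drive the virtual camera for the fly-through view.
T_virtual = T_calib @ T_slam
print(T_virtual[:3, 3])  # translation expressed in the rendering frame
```

Composing the calibration matrix with each per-frame SLAM pose keeps the virtual camera rigidly locked to the physical arthroscope.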
Results
Quantitative experiments evaluating self-positioning accuracy were performed. The mean translation error was 0.41 ± 0.28 mm, and the mean rotation error was 0.11° ± 0.07°. For rotating operations, the tracking range of the proposed system was approximately 1.4 times that of a traditional external optical tracking system. Simulated surgical operations were performed on a phantom, and the fly-through and global positioning views were paired with the original arthroscopic images for intuitive surgical guidance.
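Accuracy figures of the form 0.41 ± 0.28 mm are a mean plus a standard deviation over repeated trials. A minimal sketch of that computation, using made-up sample errors rather than the paper's measurements:

```python
import numpy as np

# Hypothetical per-trial translation errors (mm) between estimated and
# ground-truth positions -- illustrative values, not the paper's data.
translation_errors_mm = np.array([0.2, 0.5, 0.7, 0.3, 0.4])

mean_err = translation_errors_mm.mean()
std_err = translation_errors_mm.std(ddof=1)  # sample standard deviation
print(f"translation error: {mean_err:.2f} +/- {std_err:.2f} mm")
# -> translation error: 0.42 +/- 0.19 mm
```

The same statistic, applied to angular differences between estimated and ground-truth orientations, yields the rotation error.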
Conclusion
The proposed system provides surgeons with both fly-through and global positioning views for surgical guidance without depending on traditional external tracking systems. The feasibility and robustness of the system were evaluated, and it shows promise for medical applications.
ISSN: 1861-6410, 1861-6429
DOI: 10.1007/s11548-019-02099-6