Robust hand tracking for surgical telestration

Bibliographic Details
Published in: International journal for computer assisted radiology and surgery, 2022-08, Vol. 17(8), p. 1477-1486
Main Authors: Müller, Lucas-Raphael, Petersen, Jens, Yamlahi, Amine, Wise, Philipp, Adler, Tim J., Seitel, Alexander, Kowalewski, Karl-Friedrich, Müller, Beat, Kenngott, Hannes, Nickel, Felix, Maier-Hein, Lena
Format: Article
Language: English
Summary:

Purpose: Human failure has been shown to be a primary cause of post-operative death, so surgical training is of the utmost socioeconomic importance. In this context, the concept of surgical telestration has been introduced to enable experienced surgeons to mentor trainees efficiently, effectively and intuitively. While previous approaches to telestration have concentrated on overlaying drawings on surgical videos, we explore the augmented reality (AR) visualization of surgical hands to imitate direct interaction with the situs.

Methods: We present a real-time hand tracking pipeline specifically designed for surgical telestration. It comprises three modules, dedicated to (1) coarse localization of the expert's hand, (2) segmentation of the hand for AR visualization in the trainee's field of view, and (3) regression of the keypoints making up the hand's skeleton. This semantic representation also enables structured reporting of the motions performed as part of the teaching.

Results: In a comprehensive validation based on a large data set of more than 14,000 annotated images with varying application-relevant conditions, our algorithm enables real-time hand tracking and is sufficiently accurate for the task of surgical telestration. In a retrospective validation study, a mean detection accuracy of 98%, a mean keypoint regression accuracy of 10.0 px and a mean Dice Similarity Coefficient of 0.95 were achieved. In a prospective validation study, performance was uncompromised when the sensor, operator or gesture varied.

Conclusion: Due to its high accuracy and fast inference time, our neural network-based approach to hand tracking is well suited for an AR approach to surgical telestration. Future work should be directed toward evaluating the clinical value of the approach.
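The Methods paragraph describes a three-module pipeline (coarse localization, segmentation for the AR overlay, keypoint regression), but the abstract gives no implementation details. The following Python sketch only illustrates how such stages might be chained per video frame; the class names (TelestrationPipeline, HandTrack), the 21-keypoint assumption and the toy stand-in modules are hypothetical and are not the authors' networks.

# Minimal sketch of a three-stage hand tracking pipeline as outlined in the
# abstract; all module internals below are placeholders, not the paper's models.
from dataclasses import dataclass
import numpy as np

@dataclass
class HandTrack:
    box: tuple             # (x, y, w, h) coarse localization in pixels
    mask: np.ndarray       # binary hand mask, cropped to the box
    keypoints: np.ndarray  # (K, 2) skeleton keypoints in full-image coordinates

class TelestrationPipeline:
    """Chains the three stages; each stage consumes the previous stage's output."""

    def __init__(self, detector, segmenter, regressor):
        self.detector = detector    # stage 1: coarse hand localization
        self.segmenter = segmenter  # stage 2: mask for the AR overlay
        self.regressor = regressor  # stage 3: skeleton keypoint regression

    def __call__(self, frame):
        box = self.detector(frame)
        if box is None:             # no hand visible in this frame
            return None
        x, y, w, h = box
        crop = frame[y:y + h, x:x + w]
        mask = self.segmenter(crop)                           # per-pixel hand mask
        keypoints = self.regressor(crop) + np.array([x, y])   # crop -> image coords
        return HandTrack(box=box, mask=mask, keypoints=keypoints)

# Toy stand-ins so the sketch runs end to end on a dummy frame; real modules
# would be the trained localization, segmentation and regression networks.
detector = lambda img: (10, 10, 64, 64)
segmenter = lambda crop: (crop.mean(axis=-1) > 0).astype(np.uint8)
regressor = lambda crop: np.zeros((21, 2))  # 21 keypoints, a common hand-skeleton convention

pipeline = TelestrationPipeline(detector, segmenter, regressor)
track = pipeline(np.zeros((480, 640, 3), dtype=np.uint8))
print(track.keypoints.shape)  # -> (21, 2)

The design mirrors the coarse-to-fine structure stated in the abstract: the detector's bounding box defines the crop on which both the segmentation and the keypoint regression operate, and the keypoints are mapped back to full-image coordinates for the AR overlay in the trainee's view.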
ISSN: 1861-6410, 1861-6429
DOI: 10.1007/s11548-022-02637-9