Continuous Feature-Based Tracking of the Inner Ear for Robot-Assisted Microsurgery

Marzi C, Prinzen T, Haag J, Klenzner T, Mathis-Ullrich F (2021)


Publication Type: Journal article

Publication year: 2021

Journal: Frontiers in Surgery

Volume: 8

Article Number: 742160

DOI: 10.3389/fsurg.2021.742160

Abstract

Robotic systems for surgery of the inner ear must enable highly precise movement in relation to the patient. To allow for suitable collaboration between surgeon and robot, these systems should not interrupt the surgical workflow and should integrate well into existing processes. As the surgical microscope is a standard tool present in almost every microsurgical intervention and is located in close proximity to the situs, it is well suited to be extended by assistive robotic systems, for instance a microscope-mounted laser for ablation. As both patient and microscope are subject to movement during surgery, a well-integrated robotic system must be able to comply with these movements. To solve the problem of on-line registration of an assistance system to the situs, the standard of care often utilizes marker-based technologies, which require markers to be rigidly attached to the patient. This not only requires preparation time but also increases the invasiveness of the procedure, and the line of sight of the tracking system must not be obstructed. This work aims to utilize the existing imaging system to detect relative movements between the surgical microscope and the patient. The resulting data allow registration to be maintained. No artificial markers or landmarks are considered; instead, an approach for feature-based tracking with respect to the surgical environment in otology is presented. The images for tracking are obtained from the two-dimensional RGB stream of a surgical microscope. Due to the bony structure of the surgical site, the recorded cochleostomy scene moves nearly rigidly. The goal of the tracking algorithm is to estimate motion solely from the given image stream. After preprocessing, features are detected in two consecutive images, and the affine transformation between them is computed using a random sample consensus (RANSAC) algorithm. The proposed method can provide movement feedback with up to 93.2 μm precision without the need for any additional hardware in the operating room or attachment of fiducials to the situs. In long-term tracking, an accumulative error occurs.
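The abstract outlines a frame-to-frame pipeline: detect features in two consecutive microscope images and fit an affine transformation with RANSAC. The following is a minimal sketch of that idea, not the authors' implementation; the choice of ORB features, the matching strategy, and OpenCV's estimateAffinePartial2D with its parameters are assumptions made for illustration.

```python
# Minimal sketch (assumptions: ORB features, brute-force Hamming matching,
# OpenCV's RANSAC affine estimator) of frame-to-frame motion estimation
# from a 2D microscope image stream.
import cv2
import numpy as np


def estimate_frame_motion(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Return a 2x3 affine transform mapping prev_gray onto curr_gray, or None."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None

    # Brute-force matching with Hamming distance suits binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if len(matches) < 3:
        return None

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects outlier matches and fits a partial affine model
    # (rotation, translation, uniform scale), matching the nearly rigid
    # motion of the bony cochleostomy scene described in the abstract.
    M, _inliers = cv2.estimateAffinePartial2D(
        src, dst, method=cv2.RANSAC, ransacReprojThreshold=3.0
    )
    return M
```

Chaining such per-frame transforms over a long sequence compounds small estimation errors, which is consistent with the accumulative error the abstract reports for long-term tracking.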


How to cite

APA:

Marzi, C., Prinzen, T., Haag, J., Klenzner, T., & Mathis-Ullrich, F. (2021). Continuous Feature-Based Tracking of the Inner Ear for Robot-Assisted Microsurgery. Frontiers in Surgery, 8. https://doi.org/10.3389/fsurg.2021.742160

MLA:

Marzi, Christian, et al. "Continuous Feature-Based Tracking of the Inner Ear for Robot-Assisted Microsurgery." Frontiers in Surgery 8 (2021).
