Goldmann M, Manhart M, Preuhs A, Goldmann F, Kowarschik M, Maier A (2024): OrdinalNet: A Deep Learning-based Relative Image Quality Metric for Motion Compensation in CBCT
Publication Language: English
Publication Type: Conference contribution
Publication year: 2024
Publisher: IEEE
Series: 2024 IEEE Nuclear Science Symposium (NSS), Medical Imaging Conference (MIC) and Room Temperature Semiconductor Detector Conference (RTSD)
City/Town: New York, USA
Book Volume: 2024
Pages Range: 1 - 1
Conference Proceedings Title: IEEE Nuclear Science Symposium (NSS), Medical Imaging Conference (MIC) and Room Temperature Semiconductor Detector Conference (RTSD)
ISBN: 979-8-3503-8815-2
URI: https://ieeexplore.ieee.org/document/10656155
DOI: 10.1109/NSS/MIC/RTSD57108.2024.10656155
Interventional C-arm cone beam CT significantly reduces time-to-therapy for patients suffering from acute stroke. Extended acquisition times compared to modern helical CT systems make rigid patient motion more likely, introducing a mismatch with the geometry alignment presupposed during reconstruction and leading to typical blurring or streaking artifacts. Various autofocus methods aim to find a compensation trajectory and reestablish a motion-free reconstruction by optimizing a quality metric on the reconstructed images. However, depending on the type of implemented metric, these methods show diminished performance for larger motion amplitudes, often converging into local minima. To overcome these drawbacks, we present OrdinalNet, a deep learning model predicting a pseudo-binary score to assess motion artifact deterioration or improvement between two reconstructions resulting from slightly different motion trajectories. Our method eliminates the need for absolute artifact quantification in each image, instead establishing a relative ordering among data points, which is sufficient for effective simplex-based optimization. The model is trained and evaluated on simulated motion applied to motion-free clinical acquisitions. When presented with 1000 motion-affected pairs from previously unseen patients, it shows superior performance on a threshold-dependent binary classification task compared to an entropy-based metric, resulting in areas under the curve of AUC(ONet) = 0.9295 and AUC(Ent) = 0.6366.
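The abstract's key point is that a pairwise ordering, rather than an absolute quality score, suffices to drive a simplex-based search over motion parameters. The following is a minimal sketch of that idea, not the paper's implementation: a comparator function (here a scalar proxy stands in for the trained network's pseudo-binary output) is the only signal used by a simplified Nelder-Mead-style loop. The function names, the toy proxy metric, and the reflect/shrink-only update rule are illustrative assumptions.

```python
import functools

def compare(metric, a, b):
    """Stand-in for the learned pairwise score: -1 if a shows fewer
    artifacts than b, +1 if more, 0 if tied. In the paper's setting,
    the network itself would supply this ordering from two images."""
    fa, fb = metric(a), metric(b)
    return -1 if fa < fb else (1 if fa > fb else 0)

def comparator_simplex(metric, simplex, iters=200):
    """Simplified simplex search driven only by pairwise comparisons:
    reflect the worst vertex through the centroid of the rest; if the
    reflected point beats the worst, keep it, otherwise shrink the
    simplex toward the best vertex."""
    cmp = functools.cmp_to_key(lambda a, b: compare(metric, a, b))
    for _ in range(iters):
        simplex.sort(key=cmp)  # best vertex first
        worst = simplex[-1]
        n = len(worst)
        centroid = [sum(v[i] for v in simplex[:-1]) / (len(simplex) - 1)
                    for i in range(n)]
        reflected = [2 * centroid[i] - worst[i] for i in range(n)]
        if compare(metric, reflected, worst) < 0:
            simplex[-1] = reflected          # reflected point beats worst
        else:
            best = simplex[0]                # shrink all vertices toward best
            simplex = [best] + [[0.5 * (v[i] + best[i]) for i in range(n)]
                                for v in simplex[1:]]
    simplex.sort(key=cmp)
    return simplex[0]

# Toy "artifact severity" proxy: distance from hypothetical true motion
# parameters (1.0, -2.0); the search recovers them from comparisons alone.
proxy = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best = comparator_simplex(proxy, [[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
```

Note that the loop never reads the metric's absolute value directly; only the sign of each comparison matters, which mirrors why a relative ordering is enough for this class of optimizer.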
APA:
Goldmann, M., Manhart, M., Preuhs, A., Goldmann, F., Kowarschik, M., & Maier, A. (2024). OrdinalNet: A Deep Learning-based Relative Image Quality Metric for Motion Compensation in CBCT. In IEEE Nuclear Science Symposium (NSS), Medical Imaging Conference (MIC) and Room Temperature Semiconductor Detector Conference (RTSD) (pp. 1-1). IEEE. https://doi.org/10.1109/NSS/MIC/RTSD57108.2024.10656155
MLA:
Goldmann, Manuela, et al. "OrdinalNet: A Deep Learning-based Relative Image Quality Metric for Motion Compensation in CBCT." Proceedings of the 2024 IEEE Nuclear Science Symposium (NSS), Medical Imaging Conference (MIC) and Room Temperature Semiconductor Detector Conference (RTSD), Tampa, FL, IEEE, 2024, pp. 1-1.