Basic Study
Copyright ©The Author(s) 2023. Published by Baishideng Publishing Group Inc. All rights reserved.
World J Orthop. Jun 18, 2023; 14(6): 387-398
Published online Jun 18, 2023. doi: 10.5312/wjo.v14.i6.387
Automated patellar height assessment on high-resolution radiographs with a novel deep learning-based approach
Kamil Kwolek, Dariusz Grzelecki, Konrad Kwolek, Dariusz Marczak, Jacek Kowalczewski, Marcin Tyrakowski
Kamil Kwolek, Marcin Tyrakowski, Department of Spine Disorders and Orthopaedics, Centre of Postgraduate Medical Education, Gruca Orthopaedic and Trauma Teaching Hospital, Otwock 05-400, Poland
Dariusz Grzelecki, Dariusz Marczak, Jacek Kowalczewski, Department of Orthopaedics and Rheumoorthopedics, Centre of Postgraduate Medical Education, Gruca Orthopaedic and Trauma Teaching Hospital, Otwock 05-400, Poland
Konrad Kwolek, Department of Orthopaedics and Traumatology, University Hospital, Krakow 30-663, Poland
Author contributions: Kwolek Ka, Grzelecki D, and Tyrakowski M designed the research; Kwolek Ka and Kwolek Ko performed the research; Kwolek Ka and Kwolek Ko developed the analytic tools; Kwolek Ka, Tyrakowski M, Kowalczewski J, and Marczak D analyzed the data; Kwolek Ka, Grzelecki D, Kwolek Ko, and Tyrakowski M wrote the paper.
Institutional review board statement: This study protocol was reviewed and approved by the Bioethics Committee of the authors’ institution (No.133/PB/2020).
Institutional animal care and use committee statement: No animals were used in the study.
Conflict-of-interest statement: The authors have no conflict of interest concerning the materials or methods used in this study or the findings specified in this article.
Data sharing statement: No additional data are available.
ARRIVE guidelines statement: The authors have read the ARRIVE guidelines, and the manuscript was prepared and revised according to the ARRIVE guidelines.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: https://creativecommons.org/Licenses/by-nc/4.0/
Corresponding author: Kamil Kwolek, MD, Academic Research, Doctor, Surgeon, Department of Spine Disorders and Orthopaedics, Centre of Postgraduate Medical Education, Gruca Orthopaedic and Trauma Teaching Hospital, Konarskiego 13, Otwock 05-400, Poland. kwolekamil@gmail.com
Received: February 15, 2023
Peer-review started: February 15, 2023
First decision: March 24, 2023
Revised: April 6, 2023
Accepted: May 6, 2023
Article in press: May 6, 2023
Published online: June 18, 2023
Processing time: 123 Days and 15 Hours
Abstract
BACKGROUND

Artificial intelligence and deep learning have shown promising results in medical imaging and radiograph interpretation. Moreover, the medical community shows growing interest in automating routine diagnostic tasks and orthopedic measurements.

AIM

To verify the accuracy of automated patellar height assessment using a deep learning-based bone detection and segmentation approach on high-resolution radiographs.

METHODS

A total of 218 lateral knee radiographs were included in the analysis. Eighty-two radiographs were used for training and 10 further radiographs for validation of a U-Net neural network to achieve the required Dice score. Another 92 radiographs were used for automatic (U-Net) and manual measurements of patellar height, quantified by the Caton-Deschamps (CD) and Blackburne-Peel (BP) indexes. The required bone regions on the high-resolution images were detected with a You Only Look Once (YOLO) neural network. Agreement between the manual and automatic measurements was assessed with the intraclass correlation coefficient (ICC) and the standard error of a single measurement (SEM). To check the U-Net's generalization, segmentation accuracy on the test set was also calculated.
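As an illustrative sketch only (not the authors' code), agreement statistics of this kind can be computed as shown below, assuming the manual and automatic index values are arranged in a long-format table; the pingouin package, the example data, and the formula SEM = SD × √(1 − ICC) are assumptions of this sketch rather than details taken from the study.

# Minimal sketch (assumed workflow, not the authors' implementation):
# ICC and SEM for agreement between manual and automatic CD measurements.
import numpy as np
import pandas as pd
import pingouin as pg

# Hypothetical example data: one row per (knee, rater) pair.
df = pd.DataFrame({
    "knee": np.repeat(np.arange(1, 7), 2),
    "rater": ["surgeon", "algorithm"] * 6,
    "cd": [0.93, 0.92, 0.88, 0.90, 1.05, 1.02,
           0.79, 0.81, 1.12, 1.10, 0.97, 0.95],
})

# Intraclass correlation coefficient (two-way random effects, single rater).
icc_table = pg.intraclass_corr(data=df, targets="knee",
                               raters="rater", ratings="cd")
icc = icc_table.loc[icc_table["Type"] == "ICC2", "ICC"].item()

# Standard error of a single measurement: SEM = SD * sqrt(1 - ICC),
# using the pooled standard deviation of all ratings.
sem = df["cd"].std(ddof=1) * np.sqrt(1.0 - icc)
print(f"ICC = {icc:.3f}, SEM = {sem:.4f}")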

RESULTS

The proximal tibia and patella were segmented with 95.9% accuracy (Dice score) by the U-Net neural network on lateral knee sub-images automatically detected by the YOLO network (mean average precision, mAP, greater than 0.96). The mean values of the CD and BP indexes calculated by the orthopedic surgeons (R#1 and R#2) were 0.93 (± 0.19) and 0.89 (± 0.19) for CD, and 0.80 (± 0.17) and 0.78 (± 0.17) for BP. The automatic measurements produced by our algorithm were 0.92 (± 0.21) for CD and 0.75 (± 0.19) for BP. Excellent agreement between the orthopedic surgeons' measurements and the algorithm's results was achieved (ICC > 0.75, SEM < 0.014).

CONCLUSION

Automatic patellar height assessment can be achieved on high-resolution radiographs with the required accuracy. Determining the patellar end-points and fitting the joint line to the proximal tibial joint surface allows accurate CD and BP index calculation. The obtained results indicate that this approach can be a valuable tool in medical practice.
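For readers unfamiliar with the indexes, the geometry can be sketched as follows. The CD index is the distance from the lower end-point of the patellar articular surface to the anterosuperior angle of the tibia, divided by the length of the patellar articular surface; the BP index is the perpendicular distance from the same patellar end-point to the line of the tibial plateau, divided by the same articular length. The code below is a minimal illustration with hypothetical landmark names and pixel coordinates, not the authors' implementation.

# Geometric sketch (hypothetical landmarks, not the authors' code):
# CD and BP indexes from 2D landmark coordinates on a lateral knee radiograph.
import numpy as np

def cd_index(pat_upper, pat_lower, tibia_anterosuperior):
    # CD = distance from the lower patellar articular end-point to the
    # anterosuperior angle of the tibia, over the patellar articular length.
    articular_len = np.linalg.norm(np.subtract(pat_upper, pat_lower))
    dist = np.linalg.norm(np.subtract(pat_lower, tibia_anterosuperior))
    return dist / articular_len

def bp_index(pat_upper, pat_lower, plateau_p1, plateau_p2):
    # BP = perpendicular distance from the lower patellar articular end-point
    # to the tibial plateau line, over the patellar articular length.
    articular_len = np.linalg.norm(np.subtract(pat_upper, pat_lower))
    p1, p2, q = map(np.asarray, (plateau_p1, plateau_p2, pat_lower))
    # Distance from point q to the line through p1 and p2 (2D cross product).
    perp = abs((p2[0] - p1[0]) * (q[1] - p1[1])
               - (p2[1] - p1[1]) * (q[0] - p1[0])) / np.linalg.norm(p2 - p1)
    return perp / articular_len

# Hypothetical pixel coordinates (x, y) for illustration only.
print(cd_index((120, 200), (118, 260), (140, 315)))
print(bp_index((120, 200), (118, 260), (135, 310), (210, 305)))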

Keywords: Medical imaging; Artificial intelligence in orthopedics; Patellar index; Deep learning; Bone segmentation; Region of interest detection

Core Tip: This study presents an accurate method for automatic assessment of patellar height on high-resolution lateral knee radiographs. First, a You Only Look Once (YOLO) neural network detects the patellar and proximal tibial region. Next, a U-Net neural network segments the bones within the detected region. Then, the Caton-Deschamps and Blackburne-Peel indexes are calculated from the patellar end-points and a joint line fitted to the proximal tibial joint surface. Experimental results show that our approach has the potential to serve as a pre- and postoperative assessment tool.
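As a structural sketch of such a two-stage detection-then-segmentation pipeline (the authors' exact networks, weights, pre-processing, and class layout are not reproduced here), the following outline shows one way it could be organized; the file names, weight files, input size, and class count are assumptions, and the ultralytics and segmentation_models_pytorch packages are used only as convenient stand-ins.

# Structural sketch (hypothetical files and hyper-parameters; not the
# authors' exact implementation) of the two-stage pipeline: a YOLO detector
# proposes the knee region on the high-resolution radiograph, then a U-Net
# segments the patella and proximal tibia within the cropped sub-image.
import numpy as np
import torch
import torch.nn.functional as F
from PIL import Image
from ultralytics import YOLO
import segmentation_models_pytorch as smp

# Stage 1: detect the patella / proximal tibia region (assumes one detection).
detector = YOLO("knee_region_detector.pt")                 # hypothetical weights
result = detector("lateral_knee.png")[0]
x1, y1, x2, y2 = result.boxes.xyxy[0].int().tolist()

# Stage 2: segment bones within the detected crop.
unet = smp.Unet(encoder_name="resnet34", encoder_weights=None,
                in_channels=1, classes=3)                  # 0=bg, 1=patella, 2=tibia
unet.load_state_dict(torch.load("unet_patella_tibia.pt"))  # hypothetical weights
unet.eval()

gray = np.asarray(Image.open("lateral_knee.png").convert("L"), np.float32) / 255.0
crop = torch.from_numpy(gray[y1:y2, x1:x2])[None, None]    # 1 x 1 x H x W
crop = F.interpolate(crop, size=(512, 512), mode="bilinear", align_corners=False)

with torch.no_grad():
    mask = unet(crop).argmax(dim=1).squeeze(0).numpy()     # per-pixel class labels

The resulting label mask is what downstream landmark extraction (patellar end-points and the tibial joint line) would operate on before the CD and BP indexes are computed.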