Basic Study
Copyright ©The Author(s) 2023. Published by Baishideng Publishing Group Inc. All rights reserved.
World J Orthop. Jun 18, 2023; 14(6): 387-398
Published online Jun 18, 2023. doi: 10.5312/wjo.v14.i6.387
Automated patellar height assessment on high-resolution radiographs with a novel deep learning-based approach
Kamil Kwolek, Dariusz Grzelecki, Konrad Kwolek, Dariusz Marczak, Jacek Kowalczewski, Marcin Tyrakowski
Kamil Kwolek, Marcin Tyrakowski, Department of Spine Disorders and Orthopaedics, Centre of Postgraduate Medical Education, Gruca Orthopaedic and Trauma Teaching Hospital, Otwock 05-400, Poland
Dariusz Grzelecki, Dariusz Marczak, Jacek Kowalczewski, Department of Orthopaedics and Rheumoorthopedics, Centre of Postgraduate Medical Education, Gruca Orthopaedic and Trauma Teaching Hospital, Otwock 05-400, Poland
Konrad Kwolek, Department of Orthopaedics and Traumatology, University Hospital, Krakow 30-663, Poland
Author contributions: Kwolek Ka, Grzelecki D, Tyrakowski M designed research; Kwolek Ka, Kwolek Ko performed research; Kwolek Ka, Kwolek Ko elaborated analytic tools, Kwolek Ka, Tyrakowski M, Kowalczewski J, Marczak D analyzed data; Kwolek Ka, Dariusz G, Kwolek Ko, Tyrakowski M wrote the paper.
Institutional review board statement: This study protocol was reviewed and approved by the Bioethics Committee of the authors’ institution (No.133/PB/2020).
Institutional animal care and use committee statement: No animals were used in the study.
Conflict-of-interest statement: The authors have no conflict of interest concerning the materials or methods used in this study or the findings specified in this article.
Data sharing statement: No additional data are available.
ARRIVE guidelines statement: The authors have read the ARRIVE guidelines, and the manuscript was prepared and revised according to the ARRIVE guidelines.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: https://creativecommons.org/Licenses/by-nc/4.0/
Corresponding author: Kamil Kwolek, MD, Academic Research, Doctor, Surgeon, Department of Spine Disorders and Orthopaedics, Centre of Postgraduate Medical Education, Gruca Orthopaedic and Trauma Teaching Hospital, Konarskiego 13, Otwock 05-400, Poland. kwolekamil@gmail.com
Received: February 15, 2023
Peer-review started: February 15, 2023
First decision: March 24, 2023
Revised: April 6, 2023
Accepted: May 6, 2023
Article in press: May 6, 2023
Published online: June 18, 2023
Processing time: 123 Days and 15 Hours
ARTICLE HIGHLIGHTS
Research background

Recent advancements in artificial intelligence and deep learning have contributed to the development of medical imaging techniques, leading to better interpretation of radiographs. Moreover, there is an increasing interest in automating routine diagnostic activities and orthopedic measurements.

Research motivation

The automation of patellar height assessment using deep learning-based bone segmentation and detection on high-resolution radiographs could provide a valuable tool in medical practice.

Research objectives

The aim of this study was to verify the accuracy of automated patellar height assessment using a U-Net neural network and to determine the agreement between manual and automatic measurements.

Research methods

The proximal tibia and patella were segmented by a U-Net neural network on lateral knee subimages automatically detected by the You Only Look Once (YOLO) network. Patellar height was quantified with the Caton-Deschamps and Blackburne-Peel indexes. The intraclass correlation coefficient (ICC) and the standard error of measurement (SEM) for a single measurement were used to assess agreement between manual and automatic measurements.
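
Both indexes are simple ratios once the relevant landmarks are known from the segmentation. The sketch below illustrates how they could be computed from such landmarks; the coordinates and helper names are illustrative assumptions, not the study's actual code.

```python
import numpy as np

def caton_deschamps(patella_inferior, patella_superior, tibia_anterosuperior):
    """Caton-Deschamps index: distance from the inferior edge of the patellar
    articular surface to the anterosuperior corner of the tibial plateau,
    divided by the length of the patellar articular surface."""
    articular_length = np.linalg.norm(patella_superior - patella_inferior)
    distance = np.linalg.norm(tibia_anterosuperior - patella_inferior)
    return distance / articular_length

def blackburne_peel(patella_inferior, patella_superior, plateau_point, plateau_direction):
    """Blackburne-Peel index: perpendicular distance from the inferior edge of
    the patellar articular surface to the tibial plateau line, divided by the
    length of the patellar articular surface."""
    articular_length = np.linalg.norm(patella_superior - patella_inferior)
    # Perpendicular distance from the point to the line plateau_point + t * plateau_direction
    d = plateau_direction / np.linalg.norm(plateau_direction)
    v = patella_inferior - plateau_point
    perpendicular = np.linalg.norm(v - np.dot(v, d) * d)
    return perpendicular / articular_length

# Hypothetical landmark coordinates in image pixels (x, y)
p_inf = np.array([412.0, 655.0])     # inferior edge of patellar articular surface
p_sup = np.array([398.0, 540.0])     # superior edge of patellar articular surface
t_ant = np.array([430.0, 760.0])     # anterosuperior corner of tibial plateau
plateau_pt = np.array([430.0, 760.0])
plateau_dir = np.array([1.0, 0.08])  # direction of the fitted tibial plateau line

print(f"Caton-Deschamps index: {caton_deschamps(p_inf, p_sup, t_ant):.2f}")
print(f"Blackburne-Peel index: {blackburne_peel(p_inf, p_sup, plateau_pt, plateau_dir):.2f}")
```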

Research results

The proximal tibia and patella were segmented with 95.9% accuracy by the U-Net neural network on lateral knee subimages automatically detected by the YOLO network (mean average precision, mAP > 0.96). Excellent agreement was achieved between manual and automatic measurements for both indexes (intraclass correlation coefficient > 0.75, SEM < 0.014).
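
For reference, agreement statistics of this kind can be computed as in the following sketch. The ICC model (two-way random effects, absolute agreement, single measurement) and the SEM formula used here are common choices assumed for illustration, not taken from the paper, and the paired data are synthetic.

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    x is an (n_subjects, k_raters) array, here manual vs automatic measurements."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Synthetic paired measurements: column 0 manual, column 1 automatic
rng = np.random.default_rng(0)
manual = rng.normal(1.0, 0.15, 50)
automatic = manual + rng.normal(0.0, 0.02, 50)
pairs = np.column_stack([manual, automatic])

icc = icc_2_1(pairs)
sem = pairs.std(ddof=1) * np.sqrt(1.0 - icc)  # standard error of measurement
print(f"ICC(2,1) = {icc:.3f}, SEM = {sem:.3f}")
```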

Research conclusions

Automatic patellar height assessment can be achieved with high accuracy on high-resolution radiographs. The proximal tibia and patella can be segmented precisely by the U-Net neural network on lateral knee subimages automatically detected by the YOLO network. Determining the patellar endpoints and fitting a line to the proximal tibial joint surface enables accurate Caton-Deschamps and Blackburne-Peel index calculations, making the method a valuable tool in medical practice.
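
One plausible way to fit such a line from a binary tibia mask is sketched below; the mask format and least-squares fitting procedure are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def fit_plateau_line(tibia_mask):
    """Fit a straight line to the proximal tibial joint surface.
    tibia_mask is a hypothetical binary (H, W) array from the segmentation step;
    the superior-most foreground pixel of each column approximates the plateau."""
    cols = np.where(tibia_mask.any(axis=0))[0]
    rows = np.array([np.argmax(tibia_mask[:, c]) for c in cols])  # first foreground pixel from the top
    slope, intercept = np.polyfit(cols, rows, 1)  # least-squares line: row = slope * col + intercept
    return slope, intercept

# Hypothetical mask: a tilted half-plane standing in for the segmented proximal tibia
h, w = 200, 300
yy, xx = np.mgrid[0:h, 0:w]
mask = yy > (0.05 * xx + 80)

print(fit_plateau_line(mask))
```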

Research perspectives

Future research can focus on the clinical implementation of this automated method, which has the potential to enhance diagnostic accuracy, reduce human error, and improve patient outcomes.