
Deep Learning for Automated Skeletal Bone Age Assessment in X-Ray Images

2017-01-01

Abstract

Skeletal bone age assessment is a common clinical practice for investigating endocrine, genetic and growth disorders in children. It is generally performed by radiological examination of the left hand using either the Greulich and Pyle (G&P) method or the Tanner-Whitehouse (TW) method. However, both clinical procedures show several limitations, from the examination effort required of radiologists to (most importantly) significant intra- and inter-operator variability. To address these problems, several automated approaches (especially ones relying on the TW method) have been proposed; nevertheless, none of them has been shown to generalize across different races, age ranges and genders. In this paper, we propose and test several deep learning approaches for assessing skeletal bone age automatically; the results showed an average discrepancy between manual and automatic evaluation of about 0.8 years, which is state-of-the-art performance. Furthermore, this is the first automated skeletal bone age assessment work tested on a public dataset across all age ranges, races and genders for which the source code is available, thus representing an exhaustive baseline for future research in the field. Besides the specific application scenario, this paper aims at answering more general questions about deep learning on medical images: from the comparison between deep-learned features and manually-crafted ones, to the use of deep-learning models trained on general imagery for medical problems, to how to train a CNN with few images.
Year: 2017
Volume: 36
Pages: 41-51
http://www.elsevier.com/inca/publications/store/6/2/0/9/8/3/index.htt
Keywords: Convolutional neural networks; Deep learning for medical images; Greulich and Pyle; Tanner–Whitehouse; Radiological and Ultrasound Technology; Radiology, Nuclear Medicine and Imaging; 1707; Health Informatics; Computer Graphics and Computer-Aided Design
Authors: Concetto Spampinato; Simone Palazzo; Daniela Giordano; Marco Aldinucci; Rosalia Leonardi
Files in this record:

1-s2.0-S1361841516301840-main.pdf
  Access: restricted
  File type: publisher PDF
  Size: 1.54 MB
  Format: Adobe PDF

main.pdf
  Access: open
  Description: preprint
  File type: preprint (first draft)
  Size: 3.5 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1607122
Citations
  • PubMed Central: 69
  • Scopus: 337
  • Web of Science (ISI): 254