Application of Artificial Intelligence in the diagnosis of complex cases of scaphoid fractures: preliminary results of a "real life" retrospective study

MINONNE, LEONARDO
2023/2024

Abstract

Purpose: The aim of this study is to evaluate the usefulness of artificial intelligence (AI) in the diagnosis of complex cases of scaphoid fractures on conventional wrist radiographs.

Background: Scaphoid fractures are common and frequently poorly detectable on plain radiographs. New commercial AI software has been developed to assist radiologists, but its role in the diagnosis of complex cases of scaphoid fractures has yet to be established.

Methods: We retrospectively included 270 participants (854 radiographs, XR) referred to our emergency department for wrist trauma. Subjects were classified by senior radiologists (SRs) and by the Gleamer BoneView© AI software as “Positive”, “Negative”, or “Doubt” for scaphoid fracture. According to the SRs’ reports, we identified: 1) complex cases, i.e., subjects classified by the SRs as “Doubt” or “Negative” despite a high clinical suspicion of fracture, who required a CT scan to confirm the diagnosis; and 2) simple cases, i.e., subjects classified as “Positive” or “Negative” who did not require a confirmatory CT scan. Cohen's kappa coefficient was used to measure overall concordance between the SR and AI classifications. We evaluated the sensitivity and specificity of the AI software in the diagnosis of simple cases, taking the SRs' classification as the gold standard, and we assessed the sensitivity of the AI and of the SRs in the detection of complex cases, taking the CT scan results as the gold standard.

Results: The SRs classified 96 patients (36%) as “Positive”, 146 (54%) as “Negative”, and 28 (10%) as “Doubt” for scaphoid fracture; the corresponding AI results were 102 (38%), 144 (53%), and 24 (9%), respectively. Concordance between the SR and AI classifications was moderate (Cohen's kappa = 0.459). The sensitivity and specificity of the AI in the detection of simple cases of scaphoid fracture were 79.2% and 86.5%, respectively. Among the 63 complex cases (23%; 35 initially “Negative” plus 28 “Doubt” diagnoses), the sensitivity of the AI was 55.6%, versus 44.4% for the SRs (p = 0.248).

Conclusions: AI may be useful to assist radiologists in the diagnosis of scaphoid fractures, particularly in complex cases.
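As context for the statistical terms used in the abstract, the short sketch below illustrates how overall concordance (Cohen's kappa) and sensitivity/specificity against a reference standard are commonly computed in Python with scikit-learn. This is not the thesis's analysis code: the label vectors are hypothetical placeholders, whereas the actual study compared SR and AI classifications over 270 patients and used the SRs (simple cases) or CT (complex cases) as the gold standard.

    # Minimal illustrative sketch (assumes scikit-learn; labels are hypothetical, not study data)
    from sklearn.metrics import cohen_kappa_score, confusion_matrix

    # Three-class ratings per patient, as in the study: "Positive" / "Negative" / "Doubt"
    sr_labels = ["Positive", "Negative", "Doubt", "Negative", "Positive", "Negative"]
    ai_labels = ["Positive", "Negative", "Negative", "Negative", "Doubt", "Negative"]

    # Overall SR-AI concordance; the abstract reports kappa = 0.459 (moderate agreement)
    kappa = cohen_kappa_score(sr_labels, ai_labels)

    # Binary fracture / no-fracture comparison against a reference standard
    # (SR classification for simple cases, CT findings for complex cases)
    reference = [1, 0, 1, 0, 1, 0]  # 1 = fracture present per the gold standard
    predicted = [1, 0, 0, 0, 1, 0]  # 1 = fracture called by the AI (or by the SRs)
    tn, fp, fn, tp = confusion_matrix(reference, predicted).ravel()
    sensitivity = tp / (tp + fn)  # true-positive rate
    specificity = tn / (tn + fp)  # true-negative rate
    print(f"kappa={kappa:.3f}  sensitivity={sensitivity:.1%}  specificity={specificity:.1%}")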

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14239/17549