• Media type: E-article
  • Title: METhodological RadiomICs Score (METRICS): a quality scoring tool for radiomics research endorsed by EuSoMII
  • Contributors: Kocak, Burak; Akinci D’Antonoli, Tugba; Mercaldo, Nathaniel; Alberich-Bayarri, Angel; Baessler, Bettina; Ambrosini, Ilaria; Andreychenko, Anna E.; Bakas, Spyridon; Beets-Tan, Regina G. H.; Bressem, Keno; Buvat, Irene; Cannella, Roberto; Cappellini, Luca Alessandro; Cavallo, Armando Ugo; Chepelev, Leonid L.; Chu, Linda Chi Hang; Demircioglu, Aydin; deSouza, Nandita M.; Dietzel, Matthias; Fanni, Salvatore Claudio; Fedorov, Andrey; Fournier, Laure S.; Giannini, Valentina; Girometti, Rossano; [...]
  • Published: Springer Science and Business Media LLC, 2024
  • Published in: Insights into Imaging, 15 (2024) 1
  • Language: English
  • DOI: 10.1186/s13244-023-01572-w
  • ISSN: 1869-4101
  • Description: Abstract
    Purpose: To propose a new quality scoring tool, the METhodological RadiomICs Score (METRICS), to assess and improve the research quality of radiomics studies.
    Methods: We conducted an online modified Delphi study with a group of international experts. It was performed in three consecutive stages: Stage#1, item preparation; Stage#2, panel discussion among EuSoMII Auditing Group members to identify the items to be voted on; and Stage#3, four rounds of the modified Delphi exercise by panelists to determine the items eligible for METRICS and their weights. The consensus threshold was 75%. The category and item weights were calculated from the median ranks derived from expert panel opinion and their rank-sum based conversion to importance scores.
    Results: In total, 59 panelists from 19 countries participated in the selection and ranking of the items and categories. The final METRICS tool included 30 items within 9 categories. According to their weights, the categories were, in descending order of importance: study design, imaging data, image processing and feature extraction, metrics and comparison, testing, feature processing, preparation for modeling, segmentation, and open science. A web application and a repository were developed to streamline the calculation of the METRICS score and to collect feedback from the radiomics community.
    Conclusion: In this work, we developed a scoring tool for assessing the methodological quality of radiomics research, with a large international panel and a modified Delphi protocol. With its conditional format to cover methodological variations, it provides a well-constructed framework of the key methodological concepts for assessing the quality of radiomics research papers.
    Critical relevance statement: A quality assessment tool, the METhodological RadiomICs Score (METRICS), is made available by a large group of international domain experts, with a transparent methodology, aiming to evaluate and improve research quality in radiomics and machine learning.
    Key points:
    • A methodological scoring tool, METRICS, was developed for assessing the quality of radiomics research, with a large international expert panel and a modified Delphi protocol.
    • The proposed scoring tool presents expert opinion-based importance weights for categories and items, with a transparent methodology, for the first time.
    • METRICS accounts for varying use cases, from handcrafted radiomics to entirely deep learning-based pipelines.
    • A web application has been developed to help with the calculation of the METRICS score (https://metricsscore.github.io/metrics/METRICS.html), and a repository was created to collect feedback from the radiomics community (https://github.com/metricsscore/metrics).
  • Access status: Open access
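The abstract describes a weighted, conditional scoring scheme: each item carries an expert-derived importance weight, items that do not apply to a given study are excluded, and the final score reflects the achieved share of the applicable weight. Below is a minimal sketch of that idea in Python. The item names, weights, and normalization shown here are illustrative assumptions, not the official METRICS items or weights (those are implemented in the web application linked above):

```python
# Sketch of a METRICS-style weighted quality score.
# NOTE: item names and weights below are hypothetical placeholders,
# not the official METRICS weights.

def quality_score(responses, weights):
    """Percentage score over applicable items.

    responses: dict mapping item name -> "yes", "no", or "n/a"
    weights:   dict mapping item name -> importance weight

    Items answered "n/a" are excluded from the denominator,
    mirroring the tool's conditional format.
    """
    applicable = sum(w for item, w in weights.items()
                     if responses.get(item) != "n/a")
    achieved = sum(w for item, w in weights.items()
                   if responses.get(item) == "yes")
    return 100.0 * achieved / applicable if applicable else 0.0

# Hypothetical example: three items, one not applicable.
weights = {
    "adherence_to_guidelines": 2.0,
    "eligibility_criteria": 1.5,
    "external_testing": 3.0,
}
responses = {
    "adherence_to_guidelines": "yes",
    "eligibility_criteria": "no",
    "external_testing": "n/a",
}
print(round(quality_score(responses, weights), 1))  # → 57.1
```

The key design point carried over from the abstract is the conditional denominator: a study is not penalized for items that do not apply to its pipeline (e.g., segmentation items in a fully deep learning-based workflow).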