SynthRAD2025/metrics

Preparing the metrics for evaluation of SynthRAD2025 Grand Challenge
Explore the docs »

View Demo · Report Bug · Request Feature

Table of Contents

  • Goal
  • Getting Started
      • Installation
      • Prerequisites
  • Usage
      • Functions Descriptions
  • Roadmap
  • Contributing
  • License
  • Contact

Goal

Assess the quality of the synthetic computed tomography (sCT) images against CT. A brief description of the metrics can be found at: https://synthrad2025.grand-challenge.org/metrics-ranking/.

Getting Started

To get a local copy up and running, follow these simple steps.

Installation

  1. Clone the repo
git clone https://github.com/SynthRAD2025/metrics.git

or

git clone git@github.com:SynthRAD2025/metrics.git

Prerequisites

  • numpy

Install all required packages with:

pip install -r requirements.txt

Usage

The metrics are computed in two files, image_metrics.py and dose_metrics.py, and fall into three groups:

  • Image similarity metrics between the ground-truth CT and the synthetic CT. These metrics include the mean squared error (MSE), peak signal-to-noise ratio (PSNR), and multiscale structural similarity (MS-SSIM).
  • Segmentation metrics that evaluate the geometric accuracy between the ground-truth CT and the synthetic CT. They compute the Dice coefficient and the 95th percentile Hausdorff distance (HD95) between a ground-truth segmentation set and segmentations estimated from the synthetic CT. The weights and implementation of the TotalSegmentator model are archived and illustrated in the evaluation repository.
  • Dose metrics that compare the dose delivered to the ground-truth CT and to the synthetic CT. These metrics include the mean absolute error (MAE) of the dose, a dose-volume histogram (DVH) metric, and the gamma pass rate.
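For intuition, the image-similarity group can be sketched in numpy as below. This is a minimal sketch, not the challenge's exact definitions (those live in image_metrics.py): the function names, the masking convention, and the data_range default are assumptions for illustration.

```python
import numpy as np

def masked_mse(gt, pred, mask):
    """Mean squared error restricted to the voxels where mask is nonzero."""
    m = np.asarray(mask, dtype=bool)
    diff = np.asarray(gt, dtype=np.float64)[m] - np.asarray(pred, dtype=np.float64)[m]
    return float(np.mean(diff ** 2))

def masked_psnr(gt, pred, mask, data_range=1.0):
    """Peak signal-to-noise ratio over the masked region, in dB.

    data_range is the assumed dynamic range of the images (e.g., an HU
    window); pick it to match your data.
    """
    mse = masked_mse(gt, pred, mask)
    return float(10.0 * np.log10(data_range ** 2 / mse))
```

MS-SSIM compares windowed statistics at several scales and is best taken from an existing implementation rather than re-derived.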

Functions Descriptions

In general, any function can be used in the following way.

a(input, output)

description:
Compute the metric a (e.g., mse, psnr, ssim) between input and output.

arguments:
input: the numpy array of the ground-truth image
output: the numpy array of the predicted image
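As a sketch of this calling convention, a hypothetical mse following the a(input, output) signature could look like this (illustrative only; the actual implementations live in image_metrics.py):

```python
import numpy as np

def mse(input_img, output_img):
    """Mean squared error between ground truth and prediction."""
    input_img = np.asarray(input_img, dtype=np.float64)
    output_img = np.asarray(output_img, dtype=np.float64)
    return float(np.mean((input_img - output_img) ** 2))

ground_truth = np.array([0.0, 1.0, 2.0])
prediction = np.array([0.0, 1.0, 4.0])
error = mse(ground_truth, prediction)  # mean of (0, 0, 4) = 4/3
```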

All metrics can be computed with the score_patient method, which loads the data and returns all metrics:

Image similarity

    metrics = ImageMetrics()
    ground_truth_path = "path/to/ground_truth.mha"
    predicted_path = "path/to/prediction.mha"
    mask_path = "path/to/mask.mha"
    print(metrics.score_patient(ground_truth_path, predicted_path, mask_path))

Geometry consistency

    metrics = SegmentationMetrics()
    ground_truth_path = "path/to/ground_truth.mha"
    predicted_path = "path/to/prediction.mha"
    mask_path = "path/to/mask.mha"
    print(metrics.score_patient(ground_truth_path, predicted_path, mask_path))
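For intuition, the Dice coefficient on binary masks can be sketched as below; this is illustrative only (in the challenge, the masks are derived with TotalSegmentator, and the empty-mask convention here is an assumption):

```python
import numpy as np

def dice(gt_mask, pred_mask):
    """Dice overlap between two binary segmentation masks."""
    gt = np.asarray(gt_mask, dtype=bool)
    pr = np.asarray(pred_mask, dtype=bool)
    denom = gt.sum() + pr.sum()
    if denom == 0:
        return 1.0  # both masks empty: treated as perfect agreement (a convention)
    return float(2.0 * np.logical_and(gt, pr).sum() / denom)
```

HD95 additionally needs surface-distance computations and the image spacing, so it is usually taken from a dedicated library.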

Dose evaluation

    dose_path = "path/to/treatment_plans"
    predicted_path = "path/to/prediction.mha"
    patient_id = "1BA000"
    
    metrics = DoseMetrics(dose_path)
    print(metrics.score_patient(patient_id, predicted_path))
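As a sketch of one possible DVH point, D95 (the dose received by at least 95% of a structure's voxels) can be computed from the dose array and a structure mask; this is illustrative and not necessarily the DVH metric used in the challenge:

```python
import numpy as np

def d95(dose, structure_mask):
    """D95: dose received by at least 95% of the voxels in the structure.

    Illustrative sketch; the challenge's DVH metric may differ.
    """
    vals = np.asarray(dose, dtype=np.float64)[np.asarray(structure_mask, dtype=bool)]
    return float(np.percentile(vals, 5.0))
```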

Roadmap

See the open issues for a list of proposed features (and known issues).

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the GNU General Public License v3.0. See LICENSE for more information.

Contact

Maarten Terpstra - M.L.Terpstra-5@umcutrecht.nl
Matteo Maspero - @matteomasperonl - m.maspero@umcutrecht.nl

Project Link: https://github.com/SynthRAD2025/metrics
