### General Information

The LoDoPaB-CT Challenge started as part of the CT Code Sprint 2020. Participants could submit their results for the first phase of the challenge until the end of August 2020. The results of this first phase were evaluated in the journal article "Quantitative Comparison of Deep Learning-Based Image Reconstruction Methods for Low-Dose and Sparse-Angle CT Applications" by Leuschner et al.

The challenge has now entered the second phase. It is permanently open for submissions!

The task of this challenge is to reconstruct CT images of the human lung from (simulated) low photon count measurements. For evaluation, the PSNR and SSIM values are computed w.r.t. the images that were used as ground truth.
Looking for sparse-angle CT data? Check out the Apples-CT Grand Challenge.

### Dataset

Training of learned methods can be performed using the LoDoPaB-CT dataset (documented in this Scientific Data article). Easy access to the dataset is provided by the dival Python package.

### Submission format

We recommend using the provided utilities to create a submission from your reconstructions. This is done by calling save_reconstruction() for each reconstruction. Afterwards, the written files can be zipped and uploaded to the challenge website.

Note: A submission must contain reconstructions for the whole challenge set.

A submission is a zip file containing several HDF5 files.

Each HDF5 file contains 128 reconstructions in the form of a dataset named 'data' with shape (128, 362, 362) and dtype='float32'. The HDF5 filenames must end with indices (before the extension dot); e.g. 'reco_000.hdf5' and '0.h5' are valid filenames. These indices must be consecutive, starting from 0 or 1, and determine the order of the files. The dataset in the last HDF5 file may either have a smaller first dimension, i.e. shape (94, 362, 362), or be filled up with arbitrary values for the remaining indices.
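The file layout above can be sketched as follows. This helper is purely illustrative (it is not part of dival or the challenge utilities); the challenge set size of 3678 samples is inferred from the sample indices 0, …, 3677 mentioned in the evaluation section.

```python
def submission_layout(n_samples, per_file=128):
    """Return (filename, slice_count) pairs for the HDF5 files of a submission.

    Files hold 128 reconstructions each and are named with consecutive
    zero-based indices; the last file may contain fewer reconstructions.
    """
    n_files = (n_samples + per_file - 1) // per_file  # ceil division
    layout = []
    for file_idx in range(n_files):
        start = file_idx * per_file
        count = min(per_file, n_samples - start)
        layout.append((f"reco_{file_idx:03d}.hdf5", count))
    return layout

layout = submission_layout(3678)
print(layout[0])   # ('reco_000.hdf5', 128)
print(layout[-1])  # ('reco_028.hdf5', 94)
```

With 3678 reconstructions this yields 29 files; the last one holds the remaining 94 slices, matching the (94, 362, 362) shape mentioned above.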

### Evaluation

The PSNR and SSIM values are computed w.r.t. the ground truth images. Both metrics require a data_range of possible pixel values to be defined. Values are reported for two different data range choices:

1. Use the difference between the maximum and minimum pixel value of each corresponding ground truth image. Unfortunately, this metric is image-dependent, but it avoids over-optimism (see 2.).
2. Use data_range=1.0. This metric is image-independent, but may be too optimistic, since natural tissue does not reach the attenuation of 3071 HU corresponding to a value of 1.0 (only materials like metal have such high attenuation, and they occur in few samples of the dataset).

Note: due to a bug in the evaluation script, incorrect data range values were used for the SSIM with image-dependent data range until Jan 04, 2021. Specifically, the data range for sample i (for i = 0, …, 3677) was determined from ground truth image floor(i / 128) * 128 instead of ground truth image i. This bug is now fixed and all submissions were re-evaluated.
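The faulty index mapping described in the note can be stated in two lines; the function names below are hypothetical and only illustrate the bug, not the actual evaluation script.

```python
def buggy_data_range_index(i):
    """Index used before the fix: first sample of the file containing i."""
    return (i // 128) * 128  # floor(i / 128) * 128

def fixed_data_range_index(i):
    """Index used after the fix: the sample itself."""
    return i

print(buggy_data_range_index(200))  # 128: wrong image for every i not a multiple of 128
print(fixed_data_range_index(200))  # 200
```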

### Publishing Results

We encourage participants of the LoDoPaB-CT Challenge to publish their results. The leaderboard can be included to compare the performance of a submission with other reconstruction methods. We ask authors to cite any method that they include from the leaderboard, provided the corresponding information was published together with the submission. In addition, a citation of the dataset paper (Scientific Data article), a link to the challenge website, and the date the information was accessed should be provided.

Authors can contact the organizing team (see information below) to request additional results from challenge submissions, e.g. a small selection of sample reconstructions. We are also happy to link corresponding publications on this website.

### Organizing Team and Contact

The challenge is organized by:

• Johannes Leuschner, Center for Industrial Mathematics, University of Bremen
• Maximilian Schmidt, Center for Industrial Mathematics, University of Bremen
• Alexander Denker, Center for Industrial Mathematics, University of Bremen

You can contact the organizing team via the following email address: codesprint2020[at]uni-bremen.de