


Suggested Deep Learning Parameters for TCIA Result Submission

Deep Learning hyperparameters are critical for researchers who want to reproduce Deep Learning experiments. However, the completeness and format of the hyperparameters reported in manuscripts are usually left to the authors' discretion. This document proposes a list of essential Deep Learning parameters to be included in the TCIA result submission process. The goal is to capture these parameters explicitly in a common format so that TCIA users can easily reproduce and compare analysis results derived from TCIA data. An example of one possible machine-readable record is sketched after the list below.

List of Deep Learning Parameters

  1. Deep Neural Network (DNN) Name - for example, VGG16, ResNet-101, UNet, etc., or a link to a GitHub repository or manuscript for customized DNNs, if applicable.

  2. Data Augmentation Methods - for example, color augmentation (HSV or RGB color space), geometric transformations, noise injection, GAN-based augmentation, patch generation, downsizing parameters, etc.

  3. Training, Validation, and Testing Set Configuration - for example, the number of samples in each set, the total number of samples, etc.

  4. Hyperparameters - for example, learning rate, early stopping, batch size, number of epochs, etc.

  5. Training Statistics - for example, wall time spent in training, accuracy metrics and whether the average or best score is reported, etc.

  6. Training Environment - for example, GPU type, Deep Learning framework used (e.g., TensorFlow, PyTorch), number of GPUs, number of nodes, etc.
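
Example of a Common Format

The sketch below shows one possible way to capture the six parameter categories above in a single machine-readable record submitted alongside the results. The field names, values, and JSON layout are illustrative assumptions only, not an official TCIA schema.

```python
# Minimal sketch of a submission metadata record covering the six
# parameter categories listed above. All field names and values are
# illustrative assumptions, not an official TCIA schema.
import json

submission_metadata = {
    "dnn_name": "ResNet-101",  # or a GitHub/manuscript link for a customized DNN
    "data_augmentation": [
        "HSV color augmentation",
        "random rotation and flip",
        "patch generation (256x256)",
    ],
    "dataset_configuration": {
        "training_samples": 8000,
        "validation_samples": 1000,
        "testing_samples": 1000,
        "total_samples": 10000,
    },
    "hyperparameters": {
        "learning_rate": 1e-4,
        "batch_size": 32,
        "epochs": 100,
        "early_stopping": "patience of 10 epochs on validation loss",
    },
    "training_statistics": {
        "wall_time_hours": 12.5,
        "accuracy_metric": "AUC",
        "score_reported": "best",  # or "average"
    },
    "training_environment": {
        "framework": "PyTorch 2.1",
        "gpu_type": "NVIDIA V100",
        "num_gpus": 4,
        "num_nodes": 1,
    },
}

# Write the record as a JSON file to include with the submitted results.
with open("dl_parameters.json", "w") as f:
    json.dump(submission_metadata, f, indent=2)
```

A flat, self-describing record like this keeps the parameters easy to compare across submissions and easy to parse programmatically, while still allowing free-text fields (e.g., the early stopping criterion) where a single number is not sufficient.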
