Automated Segmentation of Prostate Structures (ASPS): Magnetic resonance imaging (MRI) is the clinically accepted modality for staging extracapsular extension and is thus central to image-guided interventions of the prostate. The accuracy of such interventions may be improved by automated segmentation of the prostate capsule and clinically relevant internal structures. The overall goal of the prostate challenge is to promote the development of robust algorithms that automatically segment the neurovascular bundle and seminal vesicles from clinical images. [Due to a low number of registrants during the early registration period, the 2013 prostate challenge is canceled. It will be reconsidered in the future.]
Multiparametric Brain Tumor Segmentation (BRATS): Segmentation of brain tumors is a critical step in treatment planning and in evaluating response to therapy. It is also one of the most challenging tasks in medical image analysis due to the variable shape and heterogeneity of such tumors. MRI provides a rich and diverse data set for studying brain tumors. Multicenter data will be used to segment four tumor subregions, and inter-reader agreement among clinicians will serve as the benchmark against which algorithms are compared. Participants can access the challenge program, data, and instructions at: https://www.smir.ch/BRATS/Start2013.
Each challenge will be based on a collection of 60 de-identified clinical cases with expert annotations. Initially, contestants may access 40 training cases with annotations. Several weeks before the meeting, they may upload segmentations of the leaderboard cases (10) to receive computed scores as feedback. "Leaderboard" is a popular term (borrowed from golf) for a procedure that lets you see how well you are doing on a pre-test set, giving a sense of your relative competitive rank before submitting to the final test. Contestants will be granted access to the sequestered challenge test data (10 cases) at the beginning of the workshop. In the first half of the workshop, participants process the data and report their scores. Testing of open- and closed-source solutions will run in parallel. In the second half of the workshop, the scores are posted and presentations are given.
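The announcement does not state the scoring metric used for the leaderboard and test phases; a common choice for segmentation challenges of this kind is the Dice similarity coefficient, which measures voxel-wise overlap between a submitted mask and the expert annotation. The following is a minimal sketch of that metric, assuming binary masks per structure (the function name and toy data are illustrative, not part of the challenge specification):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks.

    Dice = 2 * |pred AND truth| / (|pred| + |truth|), ranging from 0
    (no overlap) to 1 (perfect agreement).
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treated as perfect agreement
    return 2.0 * intersection / total

# Toy 2D example: a 4x4 expert mask and an overlapping prediction.
truth = np.zeros((4, 4), dtype=int)
truth[1:3, 1:3] = 1   # 4 annotated voxels
pred = np.zeros((4, 4), dtype=int)
pred[1:3, 1:4] = 1    # 6 predicted voxels, 4 of which overlap
print(dice_coefficient(pred, truth))  # 2*4 / (6+4) = 0.8
```

In a multi-region challenge such as BRATS, a score like this would typically be computed per tumor subregion and then aggregated across cases.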
Email questions to: email@example.com.
Challenge Structure and Time Line
- May 24: Training data release.
- July 15: Challenge Abstracts Due (new extended deadline). Free-format, 2-4 page documents describing the approach, preliminary results, and conclusions based on the training data. Acceptance notifications will be sent by July 18. Abstracts should be submitted to firstname.lastname@example.org.
- July 22: MICCAI Early registration deadline.
- August 23: Leaderboard data release (scoring begins Aug 30).
- September 22: On-site NCI-MICCAI Challenges: BRATS & ASPS.