Illumination Estimation Challenge
The task is to estimate the color of the illumination in the given linear images. Each illumination estimate should be given as a three-dimensional vector representing the illumination color in the RGB color space; only the direction of the vector matters. For example, for the image below (tone mapped for display purposes), a possible and, in this particular case, very accurate solution would be (0.2021, 0.4705, 0.3274).
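Since only the direction matters, any positive scaling of an estimate is equivalent; for instance, it can be normalized to unit length (a small NumPy sketch, not required for submission):

```python
import numpy as np

# Hypothetical illumination estimate; only its direction matters,
# so any positive scalar multiple is an equally valid answer.
estimate = np.array([0.2021, 0.4705, 0.3274])

# Normalize to unit length for convenience.
unit_estimate = estimate / np.linalg.norm(estimate)
```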
In short, the median of the reproduction angular errors obtained for all images will be used for scoring.
For each image in the test set, the reproduction angular error is calculated. It is defined as the angle between the vector of the white color in the image color space corrected by the ground-truth illumination and the vector of the white color in the image color space corrected by the provided illumination estimation. More formally, as described in the original paper, this is calculated using the equation

err = arccos( ((ρE,W / ρEst) · U) / (||ρE,W / ρEst|| · ||U||) ),

where U is the vector of the ideally corrected white color, i.e. (1, 1, 1), ρE,W is the vector of the white color in the image RGB color space under the image scene illumination, ρEst is the illumination estimation, and all multiplication and division operations in the equation are performed element-wise. A smaller angle means higher estimation accuracy; in the ideal case, the angular error is 0.
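This definition can be computed directly; a minimal sketch in Python with NumPy (the function name and the choice of degrees for the output are illustrative, not prescribed by the challenge):

```python
import numpy as np

def reproduction_angular_error(rho_est, rho_gt):
    """Reproduction angular error between an estimate and the ground truth."""
    # White color under the scene illumination, corrected by the estimate;
    # the division is element-wise, as in the challenge definition.
    corrected_white = np.asarray(rho_gt, dtype=float) / np.asarray(rho_est, dtype=float)
    u = np.ones(3)  # the ideally corrected white color (1, 1, 1)
    cos_angle = corrected_white.dot(u) / (np.linalg.norm(corrected_white) * np.linalg.norm(u))
    # Clip to guard against tiny floating-point overshoots outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```

Note that scaling an estimate by a positive constant leaves the error unchanged, consistent with only the direction of the vector mattering.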
Next, as recommended by the relevant literature, the median of all obtained angles will be calculated and used as the summary metric for the results obtained on the whole dataset. In the case of a tie, the mean angular error will be used followed by the trimean angular error.
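The summary statistics can be sketched as follows (`summary_stats` is a hypothetical helper; the trimean is assumed to be Tukey's trimean, (Q1 + 2·Q2 + Q3) / 4):

```python
import numpy as np

def summary_stats(errors):
    """Median, mean, and Tukey's trimean of a list of angular errors."""
    errors = np.asarray(errors, dtype=float)
    q1, q2, q3 = np.percentile(errors, [25, 50, 75])
    trimean = (q1 + 2.0 * q2 + q3) / 4.0
    return {"median": q2, "mean": errors.mean(), "trimean": trimean}
```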
The instructions on how to download the images are available on the data page. Before using the images from the train and test datasets, the participants are expected to subtract the black level and to remove the oversaturated pixels as described in more detail in the instructions that are accessible through the data page.
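A rough preprocessing sketch, assuming NumPy and placeholder values for the black level and the saturation threshold (the actual values and procedure must be taken from the instructions on the data page):

```python
import numpy as np

BLACK_LEVEL = 2048        # placeholder; use the value from the data page
SATURATION_LEVEL = 15000  # placeholder; use the threshold from the data page

def preprocess(raw):
    """Subtract the black level and zero out oversaturated pixels."""
    img = np.clip(raw.astype(float) - BLACK_LEVEL, 0.0, None)
    # Remove pixels where any channel reaches the saturation threshold.
    saturated = np.any(raw >= SATURATION_LEVEL, axis=-1)
    img[saturated] = 0.0
    return img
```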
The test images will be available for download on the data page in an encrypted ZIP archive one week before the submission deadline expires. The password for the ZIP archive will be published there 24 hours before the submission deadline expires. In the given test images the SpyderCube calibration object will be masked out as described for the Cube dataset and it is recommended to do the same on the training images as well.
To submit the results, fill out the submission form available on the submission page. The form will be available only on the submission day. The results have to be stored in a text file where each line contains the red, green, and blue values of an illumination estimation vector, separated by whitespace. The n-th line has to contain the illumination estimate for the n-th image. The values are real numbers, and scientific notation is allowed.
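Writing the estimates in the required format might look like the following sketch (the file name and the six-digit scientific notation are arbitrary choices, not challenge requirements):

```python
def write_submission(estimates, path="submission.txt"):
    """Write one line per image: red, green, blue separated by whitespace,
    in the same order as the test images."""
    with open(path, "w") as f:
        for rgb in estimates:
            f.write("{:.6e} {:.6e} {:.6e}\n".format(*rgb))
```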
Important dates - DEADLINES EXTENDED!
Encrypted test dataset available:
June 1, 2019 (extended from May 18)
Submissions open / Test archive password:
June 8, 2019 - 00:00 CET (extended from May 25)
Submissions close:
June 9, 2019 - 23:59 CET (extended from May 25)
Results announcement:
June 11, 2019 (extended from May 27 and June 10)
Should you have any questions regarding the challenge, please contact:
Karlo Koščević, Faculty of Electrical Engineering and Computing, University of Zagreb, Croatia, E-mail: email@example.com