A Bio-inspired Synergistic Virtual Retina Model for Tone Mapping

Marco Benzi, Maria-Jose Escobar and Pierre Kornprobst

Abstract: Real-world radiance values span several orders of magnitude, which artificial systems must process in order to capture visual scenes with high visual sensitivity. Interestingly, similar processing has been found in biological systems, starting at the retina level. Our motivation in this paper is therefore to develop a new video tone mapping operator (TMO) based on a synergistic model of the retina. We start from the so-called Virtual Retina model, developed in computational neuroscience, and show how to enrich it with the new features needed to use it as a TMO: color management, luminance adaptation at the photoreceptor level, and readout from a heterogeneous population activity. Our method works on video but can also be applied to static images, treated as videos of a single repeated frame. It has been carefully evaluated on standard benchmarks in the static case, giving results comparable to the state of the art with default parameters, while offering user control for finer tuning. Results on HDR video are also promising, specifically with respect to temporal luminance coherency. The code is available as a Python notebook and a C++ implementation through GitHub, so that readers can test and experiment with the approach step by step. As a whole, this paper shows a promising way to address computational photography challenges by exploiting current neuroscience research on retinal processing.
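The luminance adaptation mentioned above can be illustrated with a generic photoreceptor-style compression. This is only a minimal sketch, not the paper's Virtual Retina model: it assumes a Naka-Rushton-type response `R = L^n / (L^n + sigma^n)`, with the semi-saturation `sigma` set from the log-mean luminance of the frame; the function name and parameters are illustrative choices, not from the paper.

```python
import numpy as np

def photoreceptor_adaptation(L, sigma_scale=1.0, n=0.74):
    """Naka-Rushton-style compression of HDR luminance into [0, 1].

    L           : array of positive luminance values (arbitrary units)
    sigma_scale : scales the adaptation level (hypothetical user parameter)
    n           : response exponent (0.7-1.0 is a common range)
    """
    # Adaptation level: log-mean ("key") luminance of the input.
    sigma = sigma_scale * np.exp(np.mean(np.log(L + 1e-6)))
    Ln = np.power(L, n)
    return Ln / (Ln + np.power(sigma, n))

# Toy luminance values spanning several orders of magnitude.
L = np.logspace(-2, 4, 7)
print(photoreceptor_adaptation(L))
```

The compressed output stays in (0, 1) and preserves the ordering of the input luminances, which is the basic property a tone curve at the photoreceptor stage must provide; the actual model in the paper additionally handles color and temporal adaptation.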

Sample results:

- Memorial image
- Waffle House image
- Exploratorium image
- Tunnel video
- Highway video

Memorial image is copyright of Paul Debevec; Waffle House and Exploratorium are copyright of Mark D. Fairchild; Tunnel and Highway videos are copyright of Grzegorz Krawczyk.

Code: The source code (in C++) is available upon request to the authors via a download form; after submitting the form, you will receive an email with the link to download the source code.

Citation: If your results were obtained using our software, please cite the following paper: (coming)
