TY - JOUR
T1 - Colorful image reconstruction from neuromorphic event cameras with biologically inspired deep color fusion neural networks
AU - Cohen-Duwek, Hadar
AU - Tsur, Elishai Ezra
N1 - © 2024 IOP Publishing Ltd.
PY - 2024/3/4
Y1 - 2024/3/4
N2 - Neuromorphic event-based cameras communicate transients in luminance instead of frames, providing visual information with a fine temporal resolution, high dynamic range, and high signal-to-noise ratio. Enriching event data with color information allows for the reconstruction of colorful frame-like intensity maps, supporting improved performance and visually appealing results in various computer vision tasks. In this work, we simulated a biologically inspired color fusion system featuring a three-stage convolutional neural network for reconstructing color intensity maps from event data and sparse color cues. While current approaches for color fusion use full RGB frames in high resolution, our design uses event data and low spatial- and tonal-resolution quantized color cues, providing a high-performing small model for efficient colorful image reconstruction. The proposed model outperforms existing coloring schemes in terms of SSIM, LPIPS, PSNR, and CIEDE2000 metrics. We demonstrate that auxiliary limited color information can be used in conjunction with event data to successfully reconstruct both color and intensity frames, paving the way for more efficient hardware designs.
KW - event-based vision
KW - image reconstruction
KW - neuromorphic camera
KW - Neural Networks, Computer
KW - Image Processing, Computer-Assisted
KW - Signal-To-Noise Ratio
KW - Benchmarking
KW - Equipment Design
UR - http://www.scopus.com/inward/record.url?scp=85186653558&partnerID=8YFLogxK
U2 - 10.1088/1748-3190/ad2a7c
DO - 10.1088/1748-3190/ad2a7c
M3 - Article
C2 - 38373337
AN - SCOPUS:85186653558
SN - 1748-3182
VL - 19
JO - Bioinspiration &amp; Biomimetics
JF - Bioinspiration &amp; Biomimetics
IS - 3
M1 - 036001
ER -