Poster

GLARE: Low Light Image Enhancement via Generative Latent Feature based Codebook Retrieval

Han Zhou · Wei Dong · Xiaohong Liu · Shuaicheng Liu · Xiongkuo Min · Guangtao Zhai · Jun Chen

Strong Double Blind: This paper was not made available on public preprint services during the review process.
Tue 1 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

The majority of existing Low-light Image Enhancement (LLIE) methods attempt to directly learn the mapping from Low-Light (LL) to Normal-Light (NL) images, or rely on semantic or illumination maps to guide this learning. However, the inherent ill-posed nature of LLIE, coupled with the additional difficulty of retrieving semantic information from significantly degraded inputs, compromises the quality of the enhanced outputs, especially in extremely low-light environments. To address this issue, we present a new LLIE network via Generative LAtent feature based codebook REtrieval (GLARE) to improve the visibility of LL images. The codebook prior is derived from undegraded NL images using a Vector Quantization (VQ) strategy. However, simply adopting the codebook prior does not necessarily ensure alignment between LL and NL features. Therefore, we develop an Invertible Latent Normalizing Flow (I-LNF) that generates features aligned with the NL latent representations, guaranteeing correct code matching in the codebook. In addition, a novel Adaptive Feature Transformation (AFT) module, which contains an Adaptive Mix-up Block (AMB) and a dual-decoder architecture, is devised to further improve fidelity while preserving the realistic details provided by the codebook prior. Extensive experiments verify the superior performance of GLARE. More importantly, its application to low-light object detection demonstrates the effectiveness of our method as a pre-processing tool for high-level vision tasks. Codes will be released upon publication.
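To make the two core ideas in the abstract concrete, the sketch below illustrates a generic vector-quantized codebook lookup and a learnable mix-up between quantized and encoder features in PyTorch. This is only a minimal, hedged illustration of the general mechanism, not the authors' implementation: the module names, feature shapes, codebook size, and the sigmoid-gated mixing rule are all assumptions made for this example.

```python
import torch
import torch.nn as nn


class CodebookRetrieval(nn.Module):
    """Generic VQ-style nearest-neighbor lookup in a codebook of normal-light features.
    (Illustrative only; GLARE's actual codebook and retrieval details may differ.)"""

    def __init__(self, num_codes=1024, dim=256):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)  # prior learned from undegraded NL images

    def forward(self, z):
        # z: (B, C, H, W) latent features, e.g. the flow-aligned LL features
        b, c, h, w = z.shape
        flat = z.permute(0, 2, 3, 1).reshape(-1, c)          # (B*H*W, C)
        dists = torch.cdist(flat, self.codebook.weight)      # distance to every code
        idx = dists.argmin(dim=1)                            # retrieve the closest code per position
        z_q = self.codebook(idx).view(b, h, w, c).permute(0, 3, 1, 2)
        return z_q


class AdaptiveMixup(nn.Module):
    """Blend quantized (realistic) features with encoder (faithful) features via a learned ratio.
    A stand-in for the role of the Adaptive Mix-up Block; the gating form here is assumed."""

    def __init__(self):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(1))            # learnable mixing coefficient

    def forward(self, z_q, z_enc):
        w = torch.sigmoid(self.alpha)                        # keep the mixing ratio in (0, 1)
        return w * z_q + (1 - w) * z_enc


# Toy usage with random tensors standing in for encoder output.
z = torch.randn(2, 256, 16, 16)
retrieval = CodebookRetrieval()
mixup = AdaptiveMixup()
out = mixup(retrieval(z), z)
print(out.shape)  # torch.Size([2, 256, 16, 16])
```

In the paper's pipeline, the features fed into the lookup are first aligned with the NL latent distribution by the I-LNF, which is what makes the nearest-code retrieval reliable; the mixed features are then consumed by the dual-decoder of the AFT module.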
