Many contemporary perceptual color difference (CD) metrics operate at the pixel level, computing per-pixel differences and then applying a global average to obtain the overall CD. However, these metrics fail to deliver accurate CD assessments for misaligned photographic image pairs, particularly when the images differ in layout or object position. In this paper, we leverage the Sliced Wasserstein Distance to formulate a novel perceptual CD metric that assesses images holistically and is specifically tailored to image pairs that are not perfectly aligned. To adapt to varying image resolutions and viewing conditions, such as display resolution and viewing distance, the proposed metric operates on multiple image scales. Our method is conceptually straightforward and requires no training. Quantitative and qualitative experiments demonstrate that our metric achieves state-of-the-art performance in assessing CDs for non-aligned image pairs, showing a high degree of agreement with human visual perception. We further test our metric on image colorization and video color transfer tasks. The experimental results indicate that our metric effectively emphasizes color information while accommodating variations in content.
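To illustrate the general idea of comparing color distributions holistically rather than pixel by pixel, the sketch below computes a multi-scale Sliced Wasserstein Distance between the pixel-color sets of two equal-sized images. This is only a minimal illustration under stated assumptions, not the authors' implementation: the function names, the number of random projections, the choice of color space, and the 2x box downsampling between scales are all illustrative choices.

```python
import numpy as np

def sliced_wasserstein(colors_a, colors_b, n_projections=64, rng=None):
    """Approximate the Sliced Wasserstein Distance between two sets of
    pixel colors, each an (N, 3) array (e.g. in a perceptual color space).
    Assumes both sets contain the same number of pixels."""
    rng = np.random.default_rng(rng)
    d = colors_a.shape[1]
    total = 0.0
    for _ in range(n_projections):
        # Draw a random unit direction in color space.
        v = rng.normal(size=d)
        v /= np.linalg.norm(v)
        # Project both pixel sets onto the direction and sort; for equal
        # sample counts, the 1-D Wasserstein-1 distance is the mean absolute
        # difference between the sorted projections.
        pa = np.sort(colors_a @ v)
        pb = np.sort(colors_b @ v)
        total += np.mean(np.abs(pa - pb))
    return total / n_projections

def multiscale_cd(img_a, img_b, n_scales=3):
    """Average the sliced distance over several image scales, using simple
    2x2 box downsampling between scales (an illustrative choice)."""
    score = 0.0
    for _ in range(n_scales):
        score += sliced_wasserstein(img_a.reshape(-1, 3), img_b.reshape(-1, 3))
        # Downsample both images by a factor of 2 for the next scale.
        h, w = img_a.shape[0] // 2 * 2, img_a.shape[1] // 2 * 2
        img_a = img_a[:h, :w].reshape(h // 2, 2, w // 2, 2, 3).mean(axis=(1, 3))
        img_b = img_b[:h, :w].reshape(h // 2, 2, w // 2, 2, 3).mean(axis=(1, 3))
    return score / n_scales
```

Because the comparison is made between sorted projections of the full pixel-color distributions, the score is insensitive to where a given color appears in the image, which is what makes such a formulation suitable for misaligned image pairs.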