This work proposes to perform colour transfer by minimising a divergence (the L2 distance) between two colour distributions. We first model each colour distribution as a compact Gaussian mixture designed specifically for colour transfer between images with different scene content. A non-rigid transformation is estimated by minimising the Euclidean (L2) distance between the two distributions, and the estimated transformation is used to transfer colour statistics from one image to another. Experimental results show that this is a very promising approach for transferring colour, and it performs very well against an alternative reference approach. In , we show that this approach extends easily to video content; because the same parametric transformation is applied to all video frames, no temporal inconsistencies are introduced. In , we show that our L2-based framework can also be extended to take colour correspondences between images with similar content into account, and that it compares well with state-of-the-art techniques in this area. Our recolouring technique is fast and can take efficient advantage of parallel processing architectures for recolouring images and videos. The parametric transfer function can also be stored in memory for later use and combined with other computed transfer functions to create interesting visual effects.
Figure: Original Image | Palette Image | Result
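The key computational ingredient above is that the squared L2 distance between two Gaussian mixtures has a closed form, since the integral of a product of two Gaussian densities is itself a Gaussian density evaluated at one mean: ∫ N(x; m1, S1) N(x; m2, S2) dx = N(m1; m2, S1 + S2). The sketch below illustrates this idea under simplifying assumptions: it fits a plain affine colour map x → Ax + b by generic numerical optimisation, rather than the non-rigid parametric transformation described above, and the function names and use of SciPy are our own choices, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal


def _cross_term(w_a, mu_a, cov_a, w_b, mu_b, cov_b):
    """Sum_ij w_a[i] w_b[j] * integral of the product of two Gaussians.
    Closed form: that integral equals N(m_i; m_j, S_i + S_j)."""
    total = 0.0
    for wi, mi, Si in zip(w_a, mu_a, cov_a):
        for wj, mj, Sj in zip(w_b, mu_b, cov_b):
            total += wi * wj * multivariate_normal.pdf(
                mi, mean=mj, cov=Si + Sj, allow_singular=True)
    return total


def l2_distance_sq(gmm_f, gmm_g):
    """Closed-form squared L2 distance between two Gaussian mixtures:
    integral of (f - g)^2 = <f,f> - 2<f,g> + <g,g>, each term exact."""
    return (_cross_term(*gmm_f, *gmm_f)
            - 2.0 * _cross_term(*gmm_f, *gmm_g)
            + _cross_term(*gmm_g, *gmm_g))


def fit_affine_transfer(gmm_src, gmm_tgt, dim):
    """Estimate an affine colour map x -> A @ x + b minimising the L2
    distance between the transformed source mixture and the target.
    Under x -> A x + b, a component N(mu, S) maps to N(A mu + b, A S A^T)."""
    w_s, mu_s, cov_s = gmm_src

    def objective(theta):
        A = theta[:dim * dim].reshape(dim, dim)
        b = theta[dim * dim:]
        mu_t = [A @ m + b for m in mu_s]
        cov_t = [A @ S @ A.T for S in cov_s]
        return l2_distance_sq((w_s, mu_t, cov_t), gmm_tgt)

    # Start from the identity map.
    theta0 = np.concatenate([np.eye(dim).ravel(), np.zeros(dim)])
    res = minimize(objective, theta0, method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-12})
    A = res.x[:dim * dim].reshape(dim, dim)
    b = res.x[dim * dim:]
    return A, b, res.fun
```

In practice the mixtures would be fitted to the pixel colours of the two images (e.g. in RGB or Lab space), and the estimated map would then be applied per pixel; because the transfer function is parametric, it can be stored and reused as described above.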