There are numerous approximate colour transforms reported in the literature that aim to reduce display power consumption by imperceptibly changing the colour content of displayed images. To be practical, these techniques must be content-aware when selecting transformation parameters, so that perceptual quality is preserved. This work presents a computationally efficient method for calculating a content-dependent lower bound on approximate colour transform parameters. We conduct a user study with 62 participants and 6,400 image pair comparisons to derive the proposed solution. Using the study results, we predict this lower bound reliably, with a 1.6% mean squared error, using simple image-colour-based heuristics. We show that these heuristics have Pearson and Spearman rank correlation coefficients greater than 0.7 (p < 0.01) and that our model generalizes beyond the user study data. The study results also show that the colour transform achieves up to 50% power savings, with most users reporting negligible visual impairment.
Samarakoon, Chatura, Gehan Amaratunga, and Phillip Stanley-Marbell. "Content-Aware Automated Parameter Tuning for Approximate Color Transforms." Proceedings of the 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), 2020. arXiv preprint arXiv:2007.00494.