How do I compute similarity between two colors in RGBA color space (where the background color is of course unknown)?
I need to remap an RGBA image to a palette of RGBA colors by finding the best palette entry for each pixel in the image*.
In RGB color space, the most similar color can be assumed to be the one with the smallest Euclidean distance. However, this approach doesn't work in RGBA. For example, the Euclidean distance from rgba(0,0,0,0) to rgba(0,0,0,50%) is smaller than its distance to rgba(100%,100%,100%,1%), but the latter is a much better visual match.
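To make the counterexample concrete, here's a quick sketch (my own illustration, with components normalized to 0..1 and a hypothetical function name) of what the naive straight-alpha Euclidean distance gives:

    #include <math.h>
    #include <stdio.h>

    /* Plain Euclidean distance in straight (non-premultiplied) RGBA,
       all components normalized to 0..1. */
    double rgba_euclidean(double r1, double g1, double b1, double a1,
                          double r2, double g2, double b2, double a2) {
        double dr = r1 - r2, dg = g1 - g2, db = b1 - b2, da = a1 - a2;
        return sqrt(dr*dr + dg*dg + db*db + da*da);
    }

    int main(void) {
        /* distances from fully transparent black, rgba(0,0,0,0) */
        printf("to rgba(0,0,0,50%%):           %f\n",
               rgba_euclidean(0,0,0,0,  0,0,0,0.5));   /* 0.50  */
        printf("to rgba(100%%,100%%,100%%,1%%): %f\n",
               rgba_euclidean(0,0,0,0,  1,1,1,0.01));  /* ~1.73 */
        return 0;
    }

The half-transparent black comes out "closer" (0.5 vs. ~1.73), even though the almost fully transparent white is visually the far better match.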
I'm using premultiplied RGBA color space:
r = r×a
g = g×a
b = b×a
and I've tried this formula (edit: see the answer below for a better formula):
Δr² + Δg² + Δb² + 3 × Δa²
but it doesn't look optimal: in images with semitransparent gradients it picks wrong colors, causing visible discontinuities (sharp edges). Trading off the color channels against alpha with a fixed linear weight seems fishy.
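For reference, that attempt looks roughly like this in code (a sketch; components normalized to 0..1, function name mine):

    /* Suboptimal: squared distance of premultiplied colors with alpha weighted 3x. */
    double dist_premultiplied_weighted(double r1, double g1, double b1, double a1,
                                       double r2, double g2, double b2, double a2) {
        /* premultiply the color channels by their alpha */
        r1 *= a1; g1 *= a1; b1 *= a1;
        r2 *= a2; g2 *= a2; b2 *= a2;
        double dr = r1 - r2, dg = g1 - g2, db = b1 - b2, da = a1 - a2;
        return dr*dr + dg*dg + db*db + 3.0*da*da;  /* Δr² + Δg² + Δb² + 3×Δa² */
    }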
What's the optimal formula?
*) For simplicity of this question I'm ignoring error diffusion, gamma, and psychovisual color spaces.
Slightly related: if you want to find the nearest color in this non-Euclidean RGBA space, vp-trees are the best.
Finally, I've found it! After thorough testing and experimentation, my conclusions are:
The correct way is to calculate the maximum possible difference between the two colors.
Formulas based on any kind of estimated average/typical difference left room for discontinuities.
I was unable to find a working formula that calculates the distance without blending RGBA colors with some backgrounds.
There is no need to take every possible background color into account. For each of the R/G/B channels the composited difference is linear in the background brightness, so the worst case always occurs against either an all-black or an all-white background; it's enough to blend with those two extremes and take the larger difference per channel.
Fortunately, blending with "white" and "black" is trivial when you use premultiplied alpha.
The complete formula for premultiplied alpha color space is:
rgb *= a // colors must be premultiplied
max((r₁-r₂)², (r₁-r₂ - a₁+a₂)²) +
max((g₁-g₂)², (g₁-g₂ - a₁+a₂)²) +
max((b₁-b₂)², (b₁-b₂ - a₁+a₂)²)
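As a minimal C sketch of the same formula (function name mine; inputs already premultiplied and normalized to 0..1):

    #include <math.h>

    /* Maximum possible squared difference between two premultiplied RGBA colors
       over any background: the plain Δ terms correspond to compositing both
       colors over black, the (Δ - Δa) terms to compositing over white,
       and fmax() picks the worse case for each channel. */
    double rgba_premultiplied_difference(double r1, double g1, double b1, double a1,
                                         double r2, double g2, double b2, double a2) {
        double da = a1 - a2;
        double dr = r1 - r2, dg = g1 - g2, db = b1 - b2;
        return fmax(dr*dr, (dr - da)*(dr - da))
             + fmax(dg*dg, (dg - da)*(dg - da))
             + fmax(db*db, (db - da)*(db - da));
    }

Since this is not a plain Euclidean distance, a spatial index such as the vp-tree mentioned above is a natural fit for the nearest-palette-entry search.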