[FIXED] Subtract Blending Mode

Issue

I have been trying to implement some of GIMP's (GEGL) layer blending modes in Python. Currently, I am stuck on the Subtract blending mode. As per the documentation, Subtract = max(Background - Foreground, 0). However, in a simple test in GIMP with Background = (205, 36, 50) and Foreground = (125, 38, 85), the resulting composite colour comes out as (170, 234, 0), which doesn't follow the math above.
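For reference, a naive per-channel implementation of that documented formula, applied directly to the gamma-encoded 0-255 values, gives (80, 0, 0) for the colours above, not (170, 234, 0):

```python
# Naive Subtract: max(Background - Foreground, 0), applied directly to
# the 0-255 channel values with no linear-light conversion.
def subtract_naive(bg, fg):
    return tuple(max(b - f, 0) for b, f in zip(bg, fg))

print(subtract_naive((205, 36, 50), (125, 38, 85)))  # → (80, 0, 0)
```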

As per my understanding, Subtract does not use alpha blending. So, could this be a compositing issue, or does Subtract follow different math? More details and background can be found in a separate SO question.

EDIT [14/10/2021]:
I tried with this image as my Source and performed the following steps on images normalised to the range [0, 1]:

  1. Applied a Colour Dodge (no prior sRGB -> linear RGB conversion was done) and obtained this from my implementation, which matches the GIMP result.
  2. sRGB -> linear RGB conversion on the Colour Dodge and Source images. [Reference]
  3. Applied Subtract blending with Background = Colour Dodge result and Foreground = Source image
  4. Reconverted linear RGB -> sRGB
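Steps 2-4 above can be sketched as follows on channels normalised to [0, 1]; note the sRGB <-> linear conversion here is approximated with a plain 2.2 gamma (an assumption for brevity; the real sRGB curve is piecewise):

```python
# Sketch of the pipeline: linearise both layers, subtract with a
# floor of 0, then re-encode. Plain 2.2 gamma is an approximation.
def srgb_to_linear(c, gamma=2.2):
    return c ** gamma

def linear_to_srgb(c, gamma=2.2):
    return c ** (1.0 / gamma)

def subtract_linear(dodge, src):
    # dodge: Colour Dodge result (Background); src: Source (Foreground)
    lin_bg = [srgb_to_linear(c) for c in dodge]
    lin_fg = [srgb_to_linear(c) for c in src]
    diff = [max(b - f, 0.0) for b, f in zip(lin_bg, lin_fg)]
    return [linear_to_srgb(c) for c in diff]
```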

From my POC I obtain this. Left RGB triplet: (69, 60, 34); right RGB triplet: (3, 0, 192). And the GIMP result: left RGB triplet: (69, 60, 35); right RGB triplet: (4, 255, 255).

Solution

If you are looking at channel values in the 0 ➞ 255 range, they are likely gamma-corrected. The operation is possibly done like this:

  • convert each layer to "linear light" in the 0.0 ➞ 1.0 range using something like
L = ((V/255) ** gamma) (*)
  • apply the "difference" formula
  • convert the result back to gamma-corrected:
V = (255 * (Diff ** (1/gamma)))

With gamma=2.2 you obtain 170 for the Red channel, but I don’t see why you get 234 on the Green channel.
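The steps above can be checked numerically; a minimal sketch with gamma = 2.2, reproducing the 170 on the Red channel (the Green channel clamps to 0 here, not 234):

```python
# Round-trip check of the gamma-2.2 explanation: linearise, subtract
# with a floor of 0, re-encode to gamma-corrected 0-255 values.
def subtract_gamma22(bg, fg, gamma=2.2):
    lin = lambda v: (v / 255.0) ** gamma
    return tuple(
        round(255.0 * max(lin(b) - lin(f), 0.0) ** (1.0 / gamma))
        for b, f in zip(bg, fg)
    )

print(subtract_gamma22((205, 36, 50), (125, 38, 85)))  # → (170, 0, 0)
```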

(*) The actual formula has a special case for the very low values IIRC.
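For completeness, the actual sRGB transfer function (per the sRGB specification) uses a linear segment for those very low values and an offset power law with exponent 2.4 above them:

```python
# Exact sRGB <-> linear conversion: linear below a small threshold,
# offset power law (exponent 2.4) above it.
def srgb_to_linear(v):
    c = v / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    c = 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1.0 / 2.4) - 0.055
    return round(255.0 * c)
```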

Answered By – xenoid

Answer Checked By – Marie Seifert (Easybugfix Admin)
