
Overlay blending mode.
Closed, Invalid (Public)

Description

System Information
Operating system and graphics card
Windows 7; Windows 8.1; GTX 560; HD 4000

Blender Version
Broken: 2.76 48f7dd6

Short description of error
The Overlay blending mode's math is designed for (gamma-corrected) RGB color space, but Blender applies it in linear space.

Exact steps for others to reproduce the error
Blend two textures and compare the results in Photoshop/GIMP/Krita. I've attached a PNG file and a .blend for comparison. Notice that GIMP uses some different math, so its result isn't the same as Krita's or Photoshop's. Even Blender after gamma correction looks a bit off because of the algorithm it uses for gamma correction. I used a gamma correction node with 1/2.2 for the RGB-to-linear conversion and ×2.2 for the linear-to-RGB conversion back.
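The gamma round trip described above can be sketched in Python. This is a minimal illustration using the simple power-2.2 approximation of sRGB, not Blender's actual node code; the function names are mine:

```python
def overlay(a, b):
    """Photoshop-style Overlay: Multiply below mid-grey, Screen above."""
    return 2.0 * a * b if a < 0.5 else 1.0 - 2.0 * (1.0 - a) * (1.0 - b)

def overlay_gamma_corrected(a_lin, b_lin):
    """Blend in gamma 2.2 space, then return to linear.

    Approximates the gamma-node workaround described above: raise the
    linear inputs to 1/2.2 before blending, and the result to 2.2 after.
    """
    a = a_lin ** (1.0 / 2.2)
    b = b_lin ** (1.0 / 2.2)
    return overlay(a, b) ** 2.2

# Same inputs, different answers depending on the space the blend runs in:
print(overlay(0.5, 0.5))                  # 0.5 (blend applied to linear values)
print(overlay_gamma_corrected(0.5, 0.5))  # ~0.706 (blend applied in gamma space)
```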


{F283981}

@It is broken in every place in Blender that uses Overlay: material creation, node editors, texture paint, etc.
@Other blending modes that use math in linear space are broken too, but Overlay is the most useful, so I'm reporting it first.
@The tangent-space normal maps used in the provided picture are just examples; you can use any color image to reproduce the bug.

Details

Type
Bug

Event Timeline

Paul (rice) raised the priority of this task from to Needs Triage by Developer.
Paul (rice) updated the task description. (Show Details)
Paul (rice) added a project: BF Blender.
Paul (rice) set Type to Bug.
Paul (rice) added a subscriber: Paul (rice).

It is actually Blender that works in linear color space, while other painting applications typically work in sRGB or gamma 2.2 space. For loading normal maps manually with a texture node like this (instead of through the texture stack), you must go into the image editor and set the color space to "Linear" instead of "sRGB". Otherwise Blender has no way of knowing whether it's a normal map or another kind of texture that actually contains colors instead of data. If you do this, then I think overlaying normal maps will work correctly.

Now, it could be argued that blend modes in Blender should be designed to match painting applications, that is, they should work as if they were performed in sRGB or gamma 2.2 space. I'm not really convinced by this, and in my experience other renderers and compositing software designed to work in linear space do not do that either.

T30912: Overlay Blending Mode broken in Texture stack is a different issue, about how the texture stack influences work.

Brecht Van Lommel (brecht) claimed this task.

I'm closing this as not a bug, I think this is working as designed.

More details about color management in Blender are here:
https://www.blender.org/manual/render/post_process/cm_and_exposure.html

The Overlay blending mode was designed to work in gamma color space, not in linear space. Why do you accept wrong results?
For the node editor it doesn't matter, because I can make a simple node group to work around it, but in texture paint mode, for example, Overlay set as the blending type, or used in paint mode, produces the wrong result.

Overlay combines the Multiply and Screen blend modes. The parts of the top layer where the base layer is light become lighter; the parts where the base layer is dark become darker. An overlay with the same picture looks like an S-curve.

Depending on the value a of the base layer, one gets a linear interpolation between black (a=0), the top layer (a=0.5), and white (a=1).

So by this description from Wikipedia, if you blend two 50% grey images, nothing should change. In Blender the result is darker. How can you accept this?
The result is different from any 2D app. In my opinion, for this kind of thing Blender should be adapted to other software.

https://en.wikipedia.org/wiki/Blend_modes
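The effect being argued about is easy to demonstrate numerically from the Wikipedia definition. Below is a minimal Python sketch; the piecewise sRGB transfer functions are the standard ones, used here for illustration only, not Blender's actual code:

```python
def overlay(a, b):
    """Photoshop-style Overlay: Multiply below mid-grey, Screen above."""
    return 2.0 * a * b if a < 0.5 else 1.0 - 2.0 * (1.0 - a) * (1.0 - b)

def srgb_to_linear(c):
    """Standard sRGB decoding (display value -> linear light)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Standard sRGB encoding (linear light -> display value)."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

grey = 0.5  # 50% grey as displayed (sRGB value)

# Blending in sRGB space, as 2D painting apps do: the image is unchanged.
in_srgb = overlay(grey, grey)  # 2D-app behavior: stays at 0.5

# Blending in linear space, as Blender does: the result comes out darker.
lin = srgb_to_linear(grey)                     # ~0.214 linear
in_linear = linear_to_srgb(overlay(lin, lin))  # ~0.335, visibly darker

print(in_srgb, in_linear)
```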

The Overlay blending mode was designed to work in gamma color space, not in linear space. Why do you accept wrong results?

I guess by "color space" you mean "sRGB color space", "gamma 2.2 color space", "display color space", or ..? There are many color spaces, and Blender, like other modern 3D and compositing software, works in linear color space most of the time; that is not wrong, just a different convention.

Note also that for the case of normal maps, if you had set up the normal map image to load correctly, such that it gives correct results when actually used for normal mapping, then you will only get the correct result doing the overlay operation in linear color space. Blending in gamma 2.2 color space would not have worked in that material node setup.

For node editor it doesnt matter cause i can make a simple node group to work but for example in texture paint mode overlay set to blending type or in paint mode produces wrong result

As far as I can tell, texture paint mode on 8-bit images does actually work the same as the 2D apps you mention, in sRGB color space. If you work on a linear float image, then the result is indeed different.

Overlay combines the Multiply and Screen blend modes. The parts of the top layer where the base layer is light become lighter; the parts where the base layer is dark become darker. An overlay with the same picture looks like an S-curve.
Depending on the value a of the base layer, one gets a linear interpolation between black (a=0), the top layer (a=0.5), and white (a=1).
So by this description from Wikipedia, if you blend two 50% grey images, nothing should change. In Blender the result is darker. How can you accept this?

How do you define 50% grey though? I guess you want 50% grey in the display color space, but that idea goes against the concept of doing rendering and compositing in linear color space, independent of any specific display color space.

50% grey is rather ill-defined for HDR images that can go from 0 to infinity. Further, when you think about diffuse or glossy maps in materials that are in the range 0..1, then 0.5 in linear color space seems more correct than 0.5 in sRGB color space, because it actually means 50% of reflected light. If we're talking about perceptual 50%, then it's ill-defined regardless of the color space you choose, since the multiplication with lighting can put the final value anywhere.
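To make the ambiguity concrete, here is a small sketch using the standard sRGB transfer functions (illustrative only, not Blender's OCIO configuration): the same phrase "50% grey" names two quite different values depending on which space you measure in.

```python
def srgb_to_linear(c):
    """Standard sRGB decoding (display value -> linear light)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Standard sRGB encoding (linear light -> display value)."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

# A pixel stored as 0.5 in an sRGB image reflects only ~21% of the light:
print(srgb_to_linear(0.5))   # ~0.214

# A surface reflecting 50% of the light displays as ~74% grey in sRGB:
print(linear_to_srgb(0.5))   # ~0.735
```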

Result is different than any 2d apps. In my oppinion blender for this kind of things should be adapted to other software

You're comparing to 2D painting apps. If we are talking about 2D compositing apps like Nuke, then Blender follows the convention.

First, I wrote in the last sentence of this bug report:
@The tangent-space normal maps used in the provided picture are just examples; you can use any color image to reproduce the bug.

Second,
texture paint mode does not work like 2D apps: if you create two 50% grey textures and choose the Overlay blending mode on one of them, you will get a darker image as the result. To me it's clearly a bug that should somehow be changed to match 2D apps like Photoshop.

"You're comparing to 2D painting apps. If we are talking about 2D compositing apps like Nuke, then Blender follows the convention."

So for things like compositing, make Blender "Nuke"-like; I'm fine with that. But for things like texture painting, make Blender "Photoshop"-like.

Sorry for my English if I am misunderstood.

First, I wrote in the last sentence of this bug report:
@The tangent-space normal maps used in the provided picture are just examples; you can use any color image to reproduce the bug.

Yes, and I explained that there are different types of images, and that in fact for some cases, like tangent-space normal maps, linear color space is correct, and blending in sRGB color space would not actually do what you want. That is not the case for all types of images, though.

Second,
texture paint mode does not work like 2D apps: if you create two 50% grey textures and choose the Overlay blending mode on one of them, you will get a darker image as the result. To me it's clearly a bug that should somehow be changed to match 2D apps like Photoshop.


"You're comparing to 2D painting apps. If we are talking about 2D compositing apps like Nuke, then Blender follows the convention."
So for things like composition make blender "nuke" like, im fine with that. But for things like texture painting make blender "photoshop" like.

I can see that, since they are presented as layers in the texture paint UI, you might expect those layers to behave consistently with 2D painting apps. But we're actually in 3D rendering territory here, and that works in linear color space. One could imagine a system with one behavior for this kind of layering and another behavior for other kinds of shading operations, but they are really the same system in Blender.

You're coming to this from a particular point of view and want some subset of Blender to behave consistently with 2D painting apps, but that necessarily leads to internal inconsistency, where e.g. textures would behave differently depending on whether you use them in the compositor, texture painting, or 3D rendering. There's no way to make everyone happy here: whatever solution is chosen will always have some inconsistency or complexity, and you need to understand the bigger picture to see why things work the way they do.

OK, so I've downloaded the 15-day trial version of Nuke to check how this works there, and I've made a screenshot to compare.

As you can see, there is a little "video colorspace" checkbox to get the "Photoshop" look. Can we have such a thing in Blender?

It's possible to add, but that's beyond the scope of this bug tracker; it would be a new feature.

I wonder if this could be good for the quick hacks. I've found information about this from the Nuke creators.
http://help.thefoundry.co.uk/nuke/8.0/content/user_guide/merging/layering_images.html

By default, Nuke assumes that images are in linear color space. However, if you want to convert colors to the default 8-bit color space defined in the LUT tab of your project settings (usually, sRGB), check Video colorspace. The conversion is done before the images are composited together, and the results are converted back to linear afterwards. Any other channels than the red, green, and blue are merged without conversion.
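Based on that description, the option can be sketched as a wrapper that round-trips only the blend through sRGB while the rest of the pipeline stays linear. The function names below are illustrative, not Nuke's or Blender's actual API:

```python
def overlay(a, b):
    """Photoshop-style Overlay: Multiply below mid-grey, Screen above."""
    return 2.0 * a * b if a < 0.5 else 1.0 - 2.0 * (1.0 - a) * (1.0 - b)

def srgb_to_linear(c):
    """Standard sRGB decoding (display value -> linear light)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Standard sRGB encoding (linear light -> display value)."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

def merge_video_colorspace(blend, a_lin, b_lin):
    """Nuke-style 'video colorspace' merge: convert the linear inputs to
    sRGB, apply the blend there, then convert the result back to linear."""
    result = blend(linear_to_srgb(a_lin), linear_to_srgb(b_lin))
    return srgb_to_linear(result)

# Mid-grey (0.5 in sRGB, ~0.214 linear) overlaid on itself:
grey_lin = srgb_to_linear(0.5)
print(overlay(grey_lin, grey_lin))                          # ~0.092 (darker)
print(merge_video_colorspace(overlay, grey_lin, grey_lin))  # ~0.214 (unchanged)
```

With the wrapper, overlaying mid-grey on itself leaves the image unchanged, matching the 2D-app behavior, while the data stays linear outside the blend.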

I'd rather not add this to the quick hacks; it needs discussion about the right way to implement it and how it fits into the overall color space design, and it's not something that can be done by a new developer in a short time. We have discussed such per-node color space options before, and developers are aware it's something that could be improved.