
Color Management Improvements
Status: Confirmed, Normal priority, Public, To Do

Assigned To
None
Authored By

Description

Original source: https://wiki.blender.org/wiki/Source/Render/ColorManagement

Blender integrates OpenColorIO, but still lacks some functionality needed to fully support configurations such as ACES.

Cycles and Eevee shader nodes need to support arbitrary working color space:

  • Blackbody
  • Wavelength
  • Sky Texture

  • Render file output and file saving needs to have a configurable color space to write to.

By default it would use the scene display and view transform. This could then be changed to a color space, or a display + view transform + look.

  • OpenEXR reading and writing should use chromaticity metadata to determine the color space of the image, and convert it to and from the working color space.
  • The compositor should get a compositor node to convert between two color spaces defined in the OpenColorIO configuration.
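As a concrete reference for the chromaticity-metadata bullet above, this is the standard derivation of an RGB to XYZ matrix from a file's primaries and white point. A minimal pure-Python sketch; an actual implementation would hand the result to OCIO rather than multiply pixels itself.

```python
# Derive an RGB -> XYZ matrix from chromaticity coordinates, the same
# computation an OpenEXR reader could perform on the file's chromaticity
# metadata before converting into the working color space.

def rgb_to_xyz_matrix(r, g, b, white):
    """r, g, b and white are (x, y) chromaticity pairs."""
    def xyz(x, y, Y=1.0):
        return [x * Y / y, Y, (1.0 - x - y) * Y / y]

    # Columns are the XYZ coordinates of each primary at unit luminance.
    cols = [xyz(*r), xyz(*g), xyz(*b)]
    m = [[cols[j][i] for j in range(3)] for i in range(3)]

    # Scale each primary so RGB (1, 1, 1) maps exactly to the white point.
    s = mat_vec(invert_3x3(m), xyz(*white))
    return [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def invert_3x3(m):
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [
        [(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
        [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
        [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det],
    ]

# Rec.709 / sRGB primaries with a D65 white point.
M = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                      (0.3127, 0.3290))
```

Converting between two RGB spaces is then just `inverse(M_dst) @ M_src`, which is exactly what a compositor color space conversion node would evaluate.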

The bundled configuration can be improved:

  • Add support for the latest ACES view transform, the current one is outdated.
  • To add support for ACEScg as working space out of the box, a second configuration would need to be added that can be chosen per project.

Such a configuration requires materials and lights to be set up with different values, and so is not compatible with most existing .blend files. Projects using such a configuration need to use it from the start so all assets can be created to work with it.

  • To support 10 bit wide gamut monitors, Blender drawing code needs to be improved to output 10 bit instead of 8 bit when displaying the 3D viewport and images/renders.

Related Objects

Mentioned Here
T71705: Improve Palette management in Paint Modes (Meshes and Grease Pencil)
T71357: Design: DrawManager Colormanagement
T71389: Lookdev HDRIs are interpreted incorrectly when using different colour spaces
D2547: Color Management: Allow selection of the OCIO configuration in the User Preferences
T69562: Manipulating saturation of colors via HSV or RGB curves shifts HUE
T58805: Isolate File Encoding Colour Management
T60221: TIFF 8bit always saved and loaded as unassociated alpha
T41287: Color Picker applies color management incorrectly in 2.71
T43025: linear color space issues (2.73 rc1)
T51231: OpenEXR DWA compression should not apply to non-color data passes
T54554: bpy.ops.paint.image_paint with a texture requires initialization with manual paint stroke otherwise color not quite right
T54697: Vertex color baking for Cycles
T55422: Motion JPEG 2000 with 16bit color channels lose color resolution.
T56703: The Proxy creation in the clip editor does not take color management into account
T62218: 2.8 | Empty Images have washed out colors in viewport render
T62375: Grease pencil viewport render does not use color management view transform
T64437: The result of Dynamic paint Image sequence is brighter than intended when using initial color.
T66645: Vertex color painting. Syringe is screen-color dependent.
T66682: Color management is broken in the Sequencer
T68239: Cryptomatte affected by Color Space change in OpenEXR multilayer.
D3638: Compositor: Film-like curve
T34684: DPX/Cineon : let OCIO handle color space conversions
T46860: Cycles Color space flexibility
T67832: Background images ignore View transform
T68242: Color space seems wrong while painting in image editor.

Event Timeline

Dalai Felinto (dfelinto) lowered the priority of this task from 90 to Normal.Aug 20 2019, 9:39 PM
Dalai Felinto (dfelinto) created this task.

Great to see that the wiki document made it into a task. I would like to suggest making this task a parent of all other open tasks regarding the subject. It's great that this task is finally here - it would be even better if we make it an overview of everything else - stuff like T46860, T34684, T68242, T67832, etc. Then at least we have a bird's eye view of the scope of what needs to be done.

If I can share some input: I've been doing some research over the past few months about the state of things in Blender and how other software generally tackles the subject. The following is a result of that, plus feedback from @Troy Sobotka (sobotka) and @Brecht Van Lommel (brecht).

Input / Output

OpenEXR reading and writing should use chromaticity metadata to determine the color space of the image, and convert it to and from the working color space.

I would suggest a different approach. Other software generally uses a set of input rules that the user can define, with default settings originating in the roles defined in the OCIO config. I've been taking a look with @Jeroen Bakker (jbakker) at how Blender reads those roles, and there's still a bit of hardcoding there.
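As a sketch of what such input rules could look like: filename patterns resolve to color spaces, with the OCIO config's roles supplying the defaults. All names, patterns and role keys below are illustrative assumptions, not Blender's actual API.

```python
# Hypothetical filename-based input rules, in the spirit of the approach
# described above. Role names here stand in for whatever the OCIO config
# actually defines.

import fnmatch

ROLES = {"data": "Non-Color", "texture_paint": "sRGB",
         "default_float": "Linear"}

INPUT_RULES = [
    ("*_normal.*", ROLES["data"]),           # normal maps are non-color data
    ("*.exr",      ROLES["default_float"]),  # float formats default to linear
    ("*.png",      ROLES["texture_paint"]),  # 8-bit formats default to sRGB
]

def input_color_space(filename, rules=INPUT_RULES,
                      fallback=ROLES["texture_paint"]):
    """First matching rule wins; otherwise fall back to a role default."""
    for pattern, space in rules:
        if fnmatch.fnmatch(filename.lower(), pattern):
            return space
    return fallback
```

The user edits the rule list; the roles only ever provide the starting point, so a config swap changes the defaults without touching the rules.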

Render file output and file saving needs to have a configurable color space to write to.

Maybe this is the right spot to finally get rid of the confusing View as Render flag, and expand the Save as Render checkbox to the more granular UI that users actually need.

This includes leaving the Display out of the file encoding. See T58805.

I've been thinking about an improved UI to expose all this to the artist. Generally, it's essential for the Color Management properties panel to be subdivided into Output and View panels. The following is heavily inspired by Maya and Nuke, and again, with defaults set by the Roles in the OCIO config.

Display

Add support for the latest ACES view transform, the current one is outdated. To add support for ACEScg as working space out of the box, a second configuration would need to be added that can be chosen per project. Such a configuration requires materials and lights to be set up with different values, and so is not compatible with most existing .blend files. Projects using such a configuration need to use it from the start so all assets can be created to work with it.

The ACES view transform was designed for the ACES ecosystem, and it shouldn't be used in a differently managed workflow. It's like throwing Filmic on an Adobe RGB image.
To be honest, ACES itself is outside the scope of Blender's responsibility. What the software is responsible for, is allowing any OCIO config, including ACES, to be used as intended.

Color management on UI controls

Troy posted an article on the old wiki about a problem with curves. The general issue is that every context requires a different transform on the UI to make sure the linear data we are manipulating is mapped correctly. For example sRGB for albedos (display linear energy), Filmic for renders (if that’s the transform we view them through), Z-depth (linear from some range to another) etc. It needs to be selectable by the user.

ASC-CDL, for example, can work on linear and log, but the UI responds totally differently depending on the encoding. Slope on scene linear data is exposure. Slope on log encoded data is a power function. Offset on log is a multiply, hence exposure.
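The point about encodings can be made concrete with the ASC-CDL formula itself, out = (in × slope + offset) ^ power. A small sketch (negative handling is simplified here; the CDL spec has its own clamping rules):

```python
import math

# ASC-CDL: out = (in * slope + offset) ** power. Negative intermediate
# values are mirrored through the power purely for illustration.
def cdl(x, slope=1.0, offset=0.0, power=1.0):
    v = x * slope + offset
    return math.copysign(abs(v) ** power, v)

lin = 0.18  # scene linear mid grey

# Slope on scene linear data is a straight multiply, i.e. exposure:
assert abs(cdl(lin, slope=2.0) - 0.36) < 1e-12

# The same slope applied in log2 space becomes a power function on the
# linear values: 2 ** (slope * log2(x)) == x ** slope.
assert abs(2.0 ** cdl(math.log2(lin), slope=2.0) - lin ** 2.0) < 1e-9

# Offset in log2 space is a multiply on linear values, hence exposure:
assert abs(2.0 ** cdl(math.log2(lin), offset=1.0) - lin * 2.0) < 1e-9
```

Identical sliders, completely different meaning per encoding, which is why the UI transform has to be selectable.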

I suppose this is the right spot to mention that in a colour picker when using the ACES config, the RGB sliders are the only reliable way to pick your colour. Both HSV and the colour wheel make the coordinates go bananas.

Compositor

What I'm aware of:

  1. Remove all Adobe PDF blending modes from the Mix node - or adapt them to scene referred data, similar to Nuke. This also goes for the MixRGB node in Cycles/Eevee shaders.
  2. Remove Lift/Gamma/Gain from the Color Balance node. It simply has no use in a linear workflow, which is what a compositing package should employ anyway.
  3. More Color Balance node fixes:
    1. Fix the order of the ASC-CDL controls (Slope, Offset, Power).
    2. Allow for negative values in the Offset control.
    3. Add an optional pivot point to permit say, contrast (power) around a particular value without moving it.
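The pivot point requested above is a small amount of math. A sketch of a pivoted power control, with scene linear mid grey 0.18 as an assumed default pivot:

```python
# Contrast (power) around a pivot: values at the pivot are untouched,
# values around it are pushed apart (power > 1) or together (power < 1).

def pivoted_power(x, power, pivot=0.18):
    return ((x / pivot) ** power) * pivot

assert pivoted_power(0.18, 2.0) == 0.18   # the pivot itself does not move
assert pivoted_power(0.36, 2.0) > 0.36    # above the pivot: pushed up
assert pivoted_power(0.09, 2.0) < 0.09    # below the pivot: pushed down
```

The same normalization trick applies to Slope/Offset/Power or Brightness/Contrast: divide by the pivot, apply the operation, multiply back.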

The compositor should get a compositor node to convert between two color spaces defined in the OpenColorIO configuration.

Yes. This.

OpenEXR reading and writing should use chromaticity metadata to determine the color space of the image, and convert it to and from the working color space.

Rarely present if ever. Would need to remain configurable per buffer.

The compositor should get a compositor node to convert between two color spaces defined in the OpenColorIO configuration.

Immensely useful immediately.

Add support for the latest ACES view transform, the current one is outdated.

Bad idea. ACES is ACES and should remain an external configuration. Doing so would be a mountain of work and confuse everyone into thinking “they are doing ACES” as we have seen in the past.

To add support for ACEScg as working space out of the box, a second configuration would need to be added that can be chosen per project.

See above. Woefully bad idea.

To support 10 bit wide gamut monitors, Blender drawing code needs to be improved to output 10 bit instead of 8 bit when displaying the 3D viewport and images/renders.

Beneficial to all Apple hardware as well as the climb towards HDR10 / Dolby Vision. Minimum should be 12 bit, downgraded as required. Dolby Vision is a 12 bit representation.
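For reference, the gain from a 10 bit framebuffer is simply code values per channel. A hypothetical sketch of packing one pixel into the common RGB10_A2 layout (the format behind e.g. GL_RGB10_A2):

```python
# Same 32 bits per pixel as RGBA8, but 1024 steps per color channel
# instead of 256, with alpha squeezed down to 2 bits.

def pack_rgb10_a2(r, g, b, a):
    """r, g, b in [0, 1] quantized to 10 bits each; a quantized to 2 bits."""
    def q(v, bits):
        top = (1 << bits) - 1
        return min(int(v * top + 0.5), top)
    return q(r, 10) | (q(g, 10) << 10) | (q(b, 10) << 20) | (q(a, 2) << 30)
```

An 8 bit pipeline quantizes the display-referred signal into 1/255 steps; at 10 bits the step is four times finer, which is what wide gamut and HDR displays need to avoid banding.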

Maybe this is the right spot to finally get rid of the confusing View as Render flag, and expand the Save as Render checkbox to the more granular UI that users actually need.

+?

Other than that, most of Sam’s comment is pretty spot on.

I would add that for the "exports" it is critical, and an "Advanced" section makes sense here so that, for example, you could render out entire shots subject to a particular look, and change the look for another shot. That is, think of a stackable series of transforms à la the Texture stack. Simple to implement, and powerful. With a sane default as per Sam's offering.

Regarding UI, that’s a huge one. Given software cannot know what a pixel pusher is trying to achieve, it is a fundamentally impossible task to properly display the arbitrary data. Is it an alpha? An albedo? A scene referred emission? A normal? A depth? Each of those would require potentially different transforms applied for interacting with the data, as well as unique display approaches via views etc.

Sam’s comment about fundamental ground truth assumptions for data is also prudent. That is, by default, the sane output of a path tracing engine such as Cycles should be assumed for all nodes. Display referred manipulations are fine, but they must not be the default. Same as with alpha encoding state.

Grease Pencil needs therapy. Also it would help if garbage unmanaged code from other sloppy crap software ceased being blindly grafted onto Blender. See D3638.

Found a few more related tasks that might be worth listing here.
Some of them are quite old, but the fact that they are still marked as open at least warrants a check for whether they should be, especially the ones containing lengthy discussions whose current validity I'm not sure of.

I subscribe to most of the points noted here, but I'll add some observations below.

In the Transform Preferences, Rendering Space should be labeled Working Space, since it also involves the Compositor and Sequencer.

The Image Editor needs a clearly visible Viewing Transform option, since going back and forth to the Color Management panel is counterproductive.
Lift/Gamma/Gain should stay, since many videos are composited, edited and graded in Rec.709 and many workflows involve this space; however, Lift should be fixed, as it currently behaves like Lift + Power.
Also, log-based controls should be included, like a Contrast with a Pivot option (add a Pivot to the Contrast node?).
Shadow/Midtone/Highlight control wheels should be included for log-based secondaries.
A LUT node to apply different LUTs, not only for artistic but also for technical transforms, tone mappers, etc.
(link)

In the Transform Preferences, Rendering Space should be labeled Working Space, since it also involves the Compositor and Sequencer.

Correct.

The Image Editor needs a clearly visible Viewing Transform option, since going back and forth to the Color Management panel is counterproductive.

Also correct. Just like the Save as Render flag, View as Render should make way for a more granular UI (Color Space is already there, but View Transform is not).
Note that the settings inside the Color Management panel don't even influence the view inside the Image Editor, as they only pertain to rendering and output.
View as Render is not only useless and badly named, it's also incorrect to use it. An image used as input should not have a transform used for viewing output slapped onto it. The checkbox should be replaced by a View Transform enum, also stored per image datablock.

Lift/Gamma/Gain should stay, since many videos are composited, edited and graded in Rec.709 and many workflows involve this space; however, Lift should be fixed, as it currently behaves like Lift + Power.

Just because your input material is Rec. 709 doesn't mean it is display referred. On the contrary: the Blender compositor operates linearly, which is where Lift/Gamma/Gain breaks. The algorithm works on shadows, midtones and highlights, which are impossible to separate when using scene referred data.
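To make the failure mode concrete, here is one common Lift/Gamma/Gain formulation (other variants exist; this is an illustration, not Blender's exact implementation) and what happens when it meets a scene-referred value above 1.0:

```python
# The display-referred assumption baked into Lift/Gamma/Gain, made
# explicit: lift is weighted by (1 - x), which only makes sense when the
# data lives in [0, 1].

def lift_gamma_gain(x, lift=0.0, gamma=1.0, gain=1.0):
    return gain * (x + lift * (1.0 - x)) ** (1.0 / gamma)

# In display range, a positive lift raises shadows and leaves white alone:
assert abs(lift_gamma_gain(0.0, lift=0.1) - 0.1) < 1e-12
assert abs(lift_gamma_gain(1.0, lift=0.1) - 1.0) < 1e-12

# On a scene-referred value above 1.0 the (1 - x) weight flips sign, so
# the same "raise the shadows" control now darkens the highlight instead:
assert lift_gamma_gain(4.0, lift=0.1) < 4.0
```

There is no value range on scene referred data where "shadows" end and "highlights" begin, so the weighting has nothing to anchor to.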

Also, log-based controls should be included, like a Contrast with a Pivot option (add a Pivot to the Contrast node?).
Shadow/Midtone/Highlight control wheels should be included for log-based secondaries.

Again, compositing is a linear affair. The log controls you mention (including a shadows/midtones/highlights separation) are geared towards a grading workflow, which should not happen on linear data unless the tools apply a log shaper internally while input and output remain linear, as in an ACES workflow. But I digress.
It's beyond the scope of this task, but @Troy Sobotka (sobotka) did suggest a dedicated Grading Editor with correct controls for a log-based workflow. This editor would for example allow the colourist to work on NodeTrees linked to VSE strips, which is similar to e.g. DaVinci Resolve.
That Pivot option, however, is something that could very well be added to colour manipulation nodes already present in Blender, like Brightness/Contrast, but also Slope/Offset/Power.

A LUT node to apply different LUTs, not only for artistic but also for technical transforms, tone mappers, etc.

LUT nodes, enums for every node... all ways to prep data more adequately for the nodes it's about to go through.

Just want to mention T69562 here as well

Fahad Hasan (cgvirus) added a comment.EditedOct 3 2019, 2:28 PM

In case someone is interested in the Natron/Nuke merge node math:
These are the implementation papers. Following this math, I was able to create some luminance-preserving mix nodes for production. Hope it helps.
edit: [Not Valid]

Troy Sobotka (sobotka) signed these changes with MFA.Oct 3 2019, 3:24 PM

In case someone is interested in the Natron/Nuke merge node math:
These are the implementation papers. Following this math, I was able to create some luminance-preserving mix nodes for production. Hope it helps.

Most of those are display referred and don't work, @Fahad Hasan (cgvirus). Those are the same broken formulas in the Mix node. Luminance based formulas are an entirely different matter.

Natron has a completely broken colour pipeline and is not exemplary in this regard.

Hi @Troy Sobotka (sobotka), thanks for the reply. Which app(s) could be taken as a reference point for a scene referred pipeline? I am interested in experimenting with it in depth. Thanks.

@Fahad Hasan (cgvirus) The Nuke Merge nodes by default assume scene referred data and will only change to the display referred versions with the horribly titled "Video Colorspace" check box toggled on. OpenImageIO has the same swapped-out formulas that Nuke uses. You will notice that things like Screen will work with scene referred emissions, but the formula is completely different from the A+B - (A*B) version, for example.
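A quick sketch of the difference being described: the classic display-referred Screen formula next to a Nuke-style scene-referred fallback (the exact threshold behaviour is an assumption here, for illustration):

```python
# A + B - A*B assumes both inputs sit in [0, 1]; fed emissions above 1.0
# it returns a result *below* either input, which is nonsense for light.

def screen_display(a, b):
    return a + b - a * b

assert screen_display(0.5, 0.5) == 0.75   # fine in display range
assert screen_display(4.0, 2.0) < 4.0     # scene referred: breaks

# Scene-referred fallback: outside [0, 1], degrade gracefully to max().
def screen_scene(a, b):
    if 0.0 <= a <= 1.0 and 0.0 <= b <= 1.0:
        return a + b - a * b
    return max(a, b)

assert screen_scene(4.0, 2.0) == 4.0
```

This is the pattern behind most of the Mix node fixes requested earlier: keep the display-referred formula where it is valid, and define sane behaviour for the open-domain case.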

@Troy Sobotka (sobotka) Thanks for the hints! Yes, I noticed a huge f-stop difference with some shots lately.
I think I am getting this scene referred conundrum now. Should we explore ACES as well? If you could kindly provide us with some papers to read, that would be great! Thanks.

ACES is really tangential; it’s just a simple series of transfer functions and a reference working space that is more or less BT.2020 for use with CGI / rendering / compositing.

If one focuses on the three classes of buffers being display referred emissions, scene referred emissions, and non-color data, the rest is vastly easier to understand.

Thanks for all the hints! Going to dig into it deeply. I will eventually pester you in the forums :) @Troy Sobotka (sobotka)

Also linking D2547 here. In my opinion not absolutely essential, but the discussion is lengthy and should not be disregarded.

T71357 was made yesterday by @Jeroen Bakker (jbakker) and would resolve a huge amount of the current problems.
T71389 is a small thing, but relevant to the issue.

Just a reminder for T71705 to also be CM aware.

Just a reminder for T71705 to also be CM aware.

Yep. That's an sRGB unmanaged hell dumpster from what I can see.