
Color Management Improvements
Open, Normal, Public

Description

Original source: https://wiki.blender.org/wiki/Source/Render/ColorManagement

Blender integrates OpenColorIO, but still lacks some functionality needed to fully support configurations such as ACES.

Cycles and Eevee shader nodes need to support an arbitrary working color space:

  • Blackbody
  • Wavelength
  • Sky Texture

  • Render file output and file saving need a configurable color space to write to.

By default it would use the scene display and view transform. This could then be changed to a color space, or a display + view transform + look.

  • OpenEXR reading and writing should use chromaticity metadata to determine the color space of the image, and convert it to and from the working color space.
  • The compositor should get a compositor node to convert between two color spaces defined in the OpenColorIO configuration.

The bundled configuration can be improved:

  • Add support for the latest ACES view transform; the current one is outdated.
  • To add support for ACEScg as working space out of the box, a second configuration would need to be added that can be chosen per project.

Such a configuration requires materials and lights to be set up with different values, and so is not compatible with most existing .blend files. Projects using such a configuration need to use it from the start so all assets can be created to work with it.

  • To support 10 bit wide gamut monitors, Blender drawing code needs to be improved to output 10 bit instead of 8 bit when displaying the 3D viewport and images/renders.

Details

Type
To Do

Event Timeline

Great to see that the wiki document made it into a task. I would suggest making this a parent task of all other open tasks on the subject. It's great that this task is finally here - it would be even better if it became an overview of everything else - tasks like T46860, T34684, T68242, T67832, etc. Then at least we have a bird's eye view of the scope of what needs to be done.

If I can share some input: I've been doing research over the past few months about the state of things in Blender and how other software generally tackles the subject. The following is a result of that, plus feedback from @Troy Sobotka (sobotka) and @Brecht Van Lommel (brecht).

Input / Output

OpenEXR reading and writing should use chromaticity metadata to determine the color space of the image, and convert it to and from the working color space.

I would suggest a different approach. Other software generally uses a set of input rules for the user to define, with default settings originating from the roles defined in the OCIO config. I've been taking a look with @Jeroen Bakker (jbakker) at how Blender reads those roles, and there's still a bit of hardcoding there.

Render file output and file saving need a configurable color space to write to.

Maybe this is the right spot to finally get rid of the confusing View as Render flag, and expand the Save as Render checkbox to the more granular UI that users actually need.

This includes leaving the Display out of the file encoding. See T58805.

I've been thinking about an improved UI to expose all this to the artist. Generally, it's essential for the Color Management properties panel to be subdivided into Output and View panels. The following is heavily inspired by Maya and Nuke, and again, with defaults set by the Roles in the OCIO config.

Display

Add support for the latest ACES view transform, the current one is outdated. To add support for ACEScg as working space out of the box, a second configuration would need to be added that can be chosen per project. Such a configuration requires materials and lights to be set up with different values, and so is not compatible with most existing .blend files. Projects using such a configuration need to use it from the start so all assets can be created to work with it.

The ACES view transform was designed for the ACES ecosystem, and it shouldn't be used in a differently managed workflow. It's like throwing Filmic on an Adobe RGB image.
To be honest, ACES itself is outside the scope of Blender's responsibility. What the software is responsible for is allowing any OCIO config, including ACES, to be used as intended.

Color management on UI controls

Troy posted an article on the old wiki about a problem with curves. The general issue is that every context requires a different transform on the UI to make sure the linear data we are manipulating is mapped correctly: for example sRGB for albedos (display linear energy), Filmic for renders (if that's the transform we view them through), Z-depth (linear from some range to another), etc. The transform needs to be selectable by the user.

ASC-CDL, for example, can work on linear and log data, but the UI responds totally differently depending on the encoding. Slope on scene linear data is exposure. Slope on log encoded data is a power function. Offset on log-encoded data is a multiply in linear, hence exposure.
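To make the linear-vs-log behaviour concrete, here is a minimal sketch in plain Python (values arbitrary) of the standard ASC-CDL transfer function, and of why an offset applied in log2 space acts as exposure:

```python
import math

def cdl(x, slope=1.0, offset=0.0, power=1.0):
    """ASC-CDL transfer function: out = (in * slope + offset) ** power.
    The clamp before the power is skipped when power == 1, so the
    function can also be applied to (negative) log-encoded values."""
    y = x * slope + offset
    if power == 1.0:
        return y
    return max(y, 0.0) ** power

# On scene-linear data, slope is a plain multiply, i.e. exposure:
lin = 0.18
assert abs(cdl(lin, slope=2.0) - 0.36) < 1e-12   # one stop up

# On log2-encoded data, *offset* becomes the exposure control instead:
# adding d in log space multiplies the decoded linear value by 2**d.
decoded = 2.0 ** cdl(math.log2(lin), offset=1.0)
assert abs(decoded - 2.0 * lin) < 1e-9
```

Production log encodings (ACEScct and friends) wrap a scale, offset and toe around the pure log curve, but the slope-vs-offset behaviour is analogous.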

I suppose this is the right spot to mention that in a colour picker when using the ACES config, the RGB sliders are the only reliable way to pick your colour. Both HSV and the colour wheel make the coordinates go bananas.

Compositor

What I'm aware of:

  1. Remove all Adobe PDF blending modes from the Mix node - or adapt them to scene referred data, similar to Nuke. This also goes for the MixRGB node in Cycles/Eevee shaders.
  2. Remove Lift/Gamma/Gain from the Color Balance node. It simply has no use in a linear workflow, which is what a compositing package should employ anyway.
  3. More Color Balance node fixes:
    1. Fix the order of the ASC-CDL controls (Slope, Offset, Power).
    2. Allow for negative values in the Offset control.
    3. Add an optional pivot point to permit say, contrast (power) around a particular value without moving it.
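As a sketch of what item 3.3 could mean in practice (plain Python; the 0.18 mid-grey pivot is an arbitrary example, not an existing Blender parameter): a power applied around a pivot leaves the pivot value itself untouched while pushing everything else away from or towards it.

```python
def pivoted_power(x, power, pivot=0.18):
    """Apply a power (contrast) curve around a pivot value: the pivot
    itself is a fixed point; values above and below it move apart
    (power > 1) or together (power < 1)."""
    return pivot * (x / pivot) ** power

mid = 0.18
assert abs(pivoted_power(mid, 2.0) - mid) < 1e-9   # pivot is unchanged
assert pivoted_power(0.36, 2.0) > 0.36             # above pivot: pushed up
assert pivoted_power(0.09, 2.0) < 0.09             # below pivot: pushed down
```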

The compositor should get a compositor node to convert between two color spaces defined in the OpenColorIO configuration.

Yes. This.

OpenEXR reading and writing should use chromaticity metadata to determine the color space of the image, and convert it to and from the working color space.

Chromaticity metadata is rarely present, if ever. It would need to remain configurable per buffer.
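For reference, the derivation such a chromaticity-driven conversion would build on is small enough to sketch in plain Python: build an RGB→XYZ matrix from the xy chromaticities of the primaries and the white point (an RGB→RGB conversion is then this matrix composed with the inverse of the working space's matrix). The sRGB/D65 numbers at the end are only a sanity check.

```python
def xy_to_XYZ(x, y, Y=1.0):
    """Convert an xy chromaticity (plus luminance Y) to XYZ tristimulus values."""
    return [Y * x / y, Y, Y * (1.0 - x - y) / y]

def solve3(A, b):
    """Solve the 3x3 linear system A @ s = b via Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    return [det([[b[r] if c == i else A[r][c] for c in range(3)]
                 for r in range(3)]) / d for i in range(3)]

def rgb_to_xyz_matrix(r, g, b, white):
    """Build an RGB->XYZ matrix from the xy chromaticities of the three
    primaries and the white point (the standard derivation)."""
    P = [xy_to_XYZ(*r), xy_to_XYZ(*g), xy_to_XYZ(*b)]
    # Transpose so the columns are the primaries' XYZ vectors.
    Pt = [[P[c][r] for c in range(3)] for r in range(3)]
    # Scale each primary so the weighted sum reproduces the white point.
    S = solve3(Pt, xy_to_XYZ(*white))
    return [[Pt[r][c] * S[c] for c in range(3)] for r in range(3)]

# Sanity check against the well-known sRGB / D65 matrix (~0.4124 etc.):
M = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                      (0.3127, 0.3290))
assert abs(M[0][0] - 0.4124) < 0.002
```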

The compositor should get a compositor node to convert between two color spaces defined in the OpenColorIO configuration.

Immensely useful immediately.

Add support for the latest ACES view transform, the current one is outdated.

Bad idea. ACES is ACES and should remain an external configuration. Doing so would be a mountain of work and confuse everyone into thinking “they are doing ACES” as we have seen in the past.

To add support for ACEScg as working space out of the box, a second configuration would need to be added that can be chosen per project.

See above. Woefully bad idea.

To support 10 bit wide gamut monitors, Blender drawing code needs to be improved to output 10 bit instead of 8 bit when displaying the 3D viewport and images/renders.

Beneficial to all Apple hardware, as well as the climb towards HDR10 / Dolby Vision. The minimum should be 12 bit, downgraded as required; Dolby Vision is a 12 bit representation.
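A toy quantization example in plain Python shows why the extra bits matter for smooth gradients: an 8 bit pipe collapses a ramp onto 256 codes, which is where visible banding on wide gamut / HDR displays comes from.

```python
def quantize(v, bits):
    """Quantize a normalized value to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(v * levels) / levels

# A smooth ramp survives with far fewer distinct codes at 8 bit than at 10/12:
ramp = [i / 4095 for i in range(4096)]
codes8  = len({quantize(v, 8)  for v in ramp})
codes10 = len({quantize(v, 10) for v in ramp})
codes12 = len({quantize(v, 12) for v in ramp})
assert (codes8, codes10, codes12) == (256, 1024, 4096)
```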

Maybe this is the right spot to finally get rid of the confusing View as Render flag, and expand the Save as Render checkbox to the more granular UI that users actually need.

+💯

Other than that, most of Sam’s comment is pretty spot on.

I would add that for the "exports" it is critical, and an "Advanced" section makes sense here: for example, you could render out entire shots subject to a particular look, and change the look for another shot. That is, think of a stackable series of transforms à la the Texture stack. Simple to implement, and powerful. With a sane default as per Sam's offering.

Regarding UI, that's a huge one. Given that software cannot know what a pixel pusher is trying to achieve, it is fundamentally impossible to properly display arbitrary data. Is it an alpha? An albedo? A scene referred emission? A normal? A depth? Each of those would potentially require different transforms for interacting with the data, as well as unique display approaches via views etc.

Sam’s comment about fundamental ground truth assumptions for data is also prudent. That is, by default, the sane output of a path tracing engine such as Cycles should be assumed for all nodes. Display referred manipulations are fine, but they must not be the default. Same as with alpha encoding state.

Grease Pencil needs therapy. Also it would help if garbage unmanaged code from other sloppy crap software ceased being blindly grafted onto Blender. See D3638.

Found a few more related tasks that might be worth listing here.
Some of them are quite old, but the fact that they are still marked as open at least warrants a check of whether they should be - especially the ones containing lengthy discussions that I'm not sure are still valid today.

I agree with most things noted here, but I have some observations below.

In the Transform Preferences, Rendering Space should be relabeled Working Space, since it also involves the Compositor and Sequencer.

The Image Editor needs a clearly visible View Transform option, since going back and forth to the Color Management panel is counterproductive.
Lift/Gamma/Gain should stay, since many videos are composited, edited and graded in Rec. 709 and many workflows involve this space - but Lift should be fixed, as it currently behaves as Lift+Power.
Log-based controls should also be included, like a Contrast with a Pivot option (add a Pivot to the Contrast node?).
Shadow/Midtone/Highlight control wheels should be included for log-based secondaries.
A LUT node to apply different LUTs, not only for artistic but also technical transforms, tonemappers, etc.

In the Transform Preferences, Rendering Space should be relabeled Working Space, since it also involves the Compositor and Sequencer.

Correct.

The Image Editor needs a clearly visible View Transform option, since going back and forth to the Color Management panel is counterproductive.

Also correct. Just like the Save as Render flag, View as Render should make way for a more granular UI (Color Space is already there, but View Transform is not).
Note that the settings inside the Color Management panel don't even influence the view inside the Image Editor, as they only pertain to rendering and output.
View as Render is not only useless and badly named, it's also incorrect to use it. An image used as input should not have a transform used for viewing output slapped onto it. The checkbox should be replaced by a View Transform enum, also stored per image datablock.

Lift/Gamma/Gain should stay, since many videos are composited, edited and graded in Rec. 709 and many workflows involve this space - but Lift should be fixed, as it currently behaves as Lift+Power.

Just because your input material is Rec. 709 doesn't mean that it is display referred. On the contrary: the Blender compositor operates linearly, which is where Lift/Gamma/Gain breaks. The algorithm works on shadows, midtones and highlights, which are impossible to separate when using scene referred data.
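A short Python sketch shows where the "Gamma" part breaks: the classic midtone power assumes data confined to [0, 1], and scene referred values above 1.0 bend the opposite way, so a "lift the midtones" move leaks into the highlights.

```python
def gamma_adjust(v, gamma):
    """Classic display-referred 'gamma' (midtone) control: v ** (1/gamma).
    Well-behaved only when v is confined to [0, 1]."""
    return v ** (1.0 / gamma)

g = 2.0
# On display-referred data in [0, 1], gamma > 1 lifts the midtones:
assert gamma_adjust(0.25, g) > 0.25
# On scene-referred data, values above 1.0 bend the *opposite* way --
# the same "lift the midtones" move now darkens the highlights:
assert gamma_adjust(4.0, g) < 4.0
```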

Log-based controls should also be included, like a Contrast with a Pivot option (add a Pivot to the Contrast node?).
Shadow/Midtone/Highlight control wheels should be included for log-based secondaries.

Again, compositing is a linear affair. The log controls you mention (including a shadows/midtones/highlights separation) are geared towards a grading workflow, which should not happen on linear data - unless the tools apply a log shaper internally while both input and output remain linear, as in an ACES workflow. But I digress.
It's beyond the scope of this task, but @Troy Sobotka (sobotka) did suggest a dedicated Grading Editor with correct controls for a log-based workflow. This editor would for example allow the colourist to work on NodeTrees linked to VSE strips, similar to e.g. DaVinci Resolve.
That Pivot option, however, is something that could very well be added to colour manipulation nodes already present in Blender, like Brightness/Contrast, but also Slope/Offset/Power.
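As a sketch of what such a Pivot could look like on a Brightness/Contrast-style control (plain Python; the 0.5 default pivot is an arbitrary illustration, not a Blender parameter): the pivot becomes a fixed point of the contrast operation instead of everything rotating around zero.

```python
def contrast_about_pivot(v, contrast, pivot=0.5):
    """Linear contrast around a pivot: the pivot is a fixed point;
    values are scaled away from it (contrast > 1) or toward it."""
    return (v - pivot) * contrast + pivot

assert contrast_about_pivot(0.5, 2.0) == 0.5    # pivot is unchanged
assert contrast_about_pivot(0.75, 2.0) == 1.0   # above pivot: pushed up
assert contrast_about_pivot(0.25, 2.0) == 0.0   # below pivot: pushed down
```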

A LUT node to apply different LUTs, not only for artistic but also technical transforms, tonemappers, etc.

LUT nodes, enums for every node… all ways to prep data more adequately for the nodes they're about to pass through.