- User Since: Jun 16 2017, 1:41 AM
Wed, Mar 3
Tue, Mar 2
@Brecht Van Lommel (brecht) could you take a look at the many suggestions that have been posted here, and consider updating the task description with them? They were made a long time ago.
While it's unlikely that major changes will happen soon, these well-intended suggestions are on the verge of being completely forgotten underneath a task description that still lacks important information while addressing non-essential targets.
I was pretty surprised to discover the flaws in displacement baking as it currently stands. I'm happy to see this task but I'm a bit astonished about its lack of information.
Since I can't find much that's up to date, I went exploring with my students today. I hope this helps as some sort of overview from an end user perspective.
Sep 17 2020
Thanks, great work @Jeroen Bakker (jbakker) !
Sep 11 2020
Sep 2 2020
Jul 22 2020
... like with most Python API changes? ;)
To me this just looks like an oversight. And I'd personally rather see addons updating their code according to an API change, than having to keep remapping these strings for all eternity because of something that wasn't intentional to begin with.
Jul 20 2020
Jun 18 2020
Mentioning T72420 here. It's a design task, so it's a solution rather than a problem!
Jun 16 2020
@Jeroen Bakker (jbakker) sounds great, thanks!
Jun 15 2020
@Jeroen Bakker (jbakker) Could you elaborate on why this design is invalid? I'm not super familiar with the internals of Blender, but describing the issue might help people understand.
Jun 12 2020
Just noticed that it only occurs when the image is at a zoom level of less than 100%.
@Brecht Van Lommel (brecht) I disagree. Feature requests are scoped to obviously missing things that generally would be nice to have - hence that page's reference to Right-Click Select etc.
The only feature request I see here is that mention of a dropdown - which is a mere suggestion on how to fix the error. It's the error I'm reporting - how exactly it gets solved is up to you.
The error at hand is that Blender only allows colour picking in one space. This results in output that's unexpectedly wrong in many cases. To the vast majority of users, that will look like nothing but a bug.
Jun 10 2020
I just reported T77736, totally forgot about this design task.
I would however like to stress that it's necessary for the colour picker to allow the user to choose between a display space and the scene linear space when colour picking. In any context. Shader nodes, texture painting, vertex colours, you name it.
Internally, always consider the scene linear values, but expose the alternative in the UI.
Good to know, but like many things in Blender this should be made clear to the user.
Even just a tooltip saying 'requires mirrored strands' would remove the confusion.
Jun 8 2020
Apr 26 2020
Apr 19 2020
My (slightly controversial) two cents are to completely remove any blend modes that are not compatible with scene-referred data, HDR or otherwise.
Adobe PDF blend modes (screen, color dodge, etc.) are *not* designed to work with scene-referred data, and yet Cycles, Eevee, and the Compositor (should) operate exclusively on it. It is true that most users are completely unaware of this, and continue to use these modes 'because they know them from Photoshop'.
This is correct, the only ones that actually work on scene-referred data are add, subtract, multiply and divide. Anything else is not designed for higher dynamic ranges.
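To make the failure concrete, here's a minimal Python sketch (not Blender code) of why 'screen' misbehaves on scene-referred values while 'multiply' stays well defined:

```python
def screen(a, b):
    # Screen assumes display-referred values in [0, 1]:
    # result = 1 - (1 - a) * (1 - b)
    return 1.0 - (1.0 - a) * (1.0 - b)

def multiply(a, b):
    return a * b

# Display-referred input behaves as expected:
print(screen(0.5, 0.5))    # 0.75, brighter than either input

# Scene-referred (HDR) input breaks the model:
print(screen(4.0, 4.0))    # -8.0: two bright pixels "screen" to a negative value
print(multiply(4.0, 4.0))  # 16.0: still physically meaningful
```

The `(1 - a)` terms go negative as soon as values exceed 1.0, which is exactly the range scene-referred renders live in.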
Apr 1 2020
Awesome! You sure work fast, @Philipp Oeser (lichtwerk)
Mar 31 2020
Mar 7 2020
Dec 19 2019
Dec 16 2019
Dec 15 2019
Great to see this being tackled! One small note:
Looking good as far as I can tell. A few things:
Dec 14 2019
Nov 24 2019
Nov 21 2019
I believe we're mixing terms here? I've used the concept of light groups in Arnold in the past, as (I think) a precursor of custom AOVs, where the lights' influence gets output to a separate render pass per group. In which case this seems better covered by D4837.
@Dalai Felinto (dfelinto) 's task description is closer to what @Gavin Scott (Zoot) writes:
Applications that support Light Linking generally let you associate either individual lights or groups of lights with one or more objects to illuminate.
Nov 20 2019
Nov 19 2019
Ironically, that makes me very happy :D
- When rendering and compositing, the linear scene-referred space (default Rec.709) must be used. This means that the overlay/mode engines will be drawing in Rec.709, and only as a last step will the view transform be applied.
- Note that this will also impact how Grease Pencil blends colors, and there will be rendering differences. These differences are currently also noticeable during final rendering of GP projects.
- Images that are rendered in the background are currently fixed to Rec.709 or sRGB; other color spaces are ignored. We should transform the image buffer from its color space to one of these (depending a bit on the use case).
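To illustrate the ordering point in the first bullet, a small sketch (using a plain 2.2 gamma as a stand-in for the real OCIO view transform) showing that blending before versus after the transform gives different results:

```python
def encode(x, gamma=2.2):
    # Stand-in view transform: a simple gamma encode. The real transform
    # is whatever OCIO view transform is configured (e.g. Filmic).
    return x ** (1.0 / gamma)

def mix(a, b, t=0.5):
    return a * (1.0 - t) + b * t

lin_a, lin_b = 0.0, 1.0

# Correct order: blend in scene linear, apply the view transform last.
blend_then_encode = encode(mix(lin_a, lin_b))          # encode(0.5) ≈ 0.73

# Wrong order: encode first, then blend in display space.
encode_then_blend = mix(encode(lin_a), encode(lin_b))  # 0.5

print(blend_then_encode, encode_then_blend)  # noticeably different results
```

This difference is precisely why Grease Pencil blending (done in display space today) won't match a pipeline that draws in scene linear and applies the view transform last.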
Nov 17 2019
Without having a deeper look (I haven't checked how the names actually end up when Blender writes them), I read this as: Sets.Glass.Combined would be a valid name for a layer/sub-layer? (Unless deeper nesting is not allowed; maybe layer.sublayer.sublayer.channel isn't permitted?)
Nov 15 2019
I don't get it. Why don't you simply NOT use . in your view layer names?
That's obviously my workaround now. I'm simply pointing out that Blender isn't following a common convention, and its own convention is to use periods in *all* datablock naming, including view layers. Which I happen to prefer over anything else for a number of reasons.
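For illustration, a hypothetical parser sketch showing why periods inside view layer names make 'layer.pass.channel'-style EXR channel names ambiguous; the function and naming convention here are assumed for the example, not Blender's actual code:

```python
def parse_exr_channel(name):
    # Hypothetical parser for "layer.pass.channel" EXR channel names.
    # Splitting from the right keeps any periods inside the layer name
    # attached to the layer, instead of being misread as nesting.
    layer_and_pass, channel = name.rsplit(".", 1)
    layer, pass_name = layer_and_pass.rsplit(".", 1)
    return layer, pass_name, channel

# Unambiguous with a plain layer name:
print(parse_exr_channel("Glass.Combined.R"))       # ('Glass', 'Combined', 'R')

# A period inside the layer name still parses, but only by convention;
# a naive left-to-right split would read "Sets" as the layer:
print(parse_exr_channel("Sets.Glass.Combined.R"))  # ('Sets.Glass', 'Combined', 'R')
```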
Nov 14 2019
Apparently it does work if the glass object is not inside the volume object. This defeats the purpose in many cases though.
Nov 9 2019
Nov 8 2019
Nov 7 2019
I think your expectations are incorrect. Your sRGB image is *always* linearised, the second you indicated that its file encoding is indeed sRGB. Behind the scenes of the compositor, this removes the sRGB OETF and turns the image into a display linear sRGB buffer.
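For reference, a minimal sketch of that linearisation step, using the piecewise sRGB decoding function from IEC 61966-2-1:

```python
def srgb_to_linear(v):
    # Piecewise sRGB decode (IEC 61966-2-1): undo the sRGB encoding
    # to recover a linear-light value.
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# The sRGB-encoded value 0.5 is considerably darker in linear light:
print(srgb_to_linear(0.5))  # ≈ 0.214
print(srgb_to_linear(1.0))  # 1.0
```

This is why pixel values read back from an sRGB-tagged image in the compositor don't match the numbers seen in an external image viewer.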
Many many kudos for the write-up. This would tackle a huge amount of the current issues.
When doing viewport rendering, the color management is done on the CPU, which is typically slow.
But doing it on the GPU would be less accurate, you once told me. How inaccurate are we talking? Is it worth the trade-off?
I think it's more superficial than that. Lookdev Mode to me seems like a temporary World that Blender creates and tells Eevee to use while the Viewport is in that mode. In that case this World uses an environment texture like any other World does, only without the amount of controls we normally would be presented with.
Nov 6 2019
when using a different OCIO config, the Rec.709 colour space is almost never listed as just 'Linear', so Blender picks a different space from the config
This is probably best avoided by looking up roles instead of literal colour space names.
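A toy sketch (not the real OCIO API; the config contents are invented) of why a role lookup is more robust than matching a literal colour space name:

```python
# Invented ACES-style config for illustration only.
config = {
    "roles": {"scene_linear": "Utility - Linear - Rec.709"},
    "colorspaces": ["Utility - Linear - Rec.709", "Output - sRGB", "ACEScg"],
}

def scene_linear_space(cfg):
    # Role lookup works regardless of how the config names the space.
    return cfg["roles"]["scene_linear"]

def fragile_lookup(cfg):
    # Literal name matching breaks with most third-party configs.
    return "Linear" if "Linear" in cfg["colorspaces"] else None

print(scene_linear_space(config))  # 'Utility - Linear - Rec.709'
print(fragile_lookup(config))      # None: no space is literally named 'Linear'
```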
I've attempted summarising the DevTalk post. Of course the original article and its reactions contain much more information - and a bit of sidetracking - but I've tried to condense the gist of it as best I could. With just a hint of personal opinion here and there.
Nov 1 2019
Oct 15 2019
Also linking D2547 here. In my opinion not absolutely essential, but the discussion is lengthy and should not be disregarded.
Sep 27 2019
The core issue is that the HSV code is legacy, and was based on the nonlinear assumptions behind sRGB.
Would this be the cause of the HSV colour picker returning crazy RGB ratios when using an alternative OCIO config?
Sep 22 2019
Any news on this?
Sep 16 2019
Sep 13 2019
Sep 12 2019
Simply create a new text object, the issue is already clearly visible in the placeholder text, between the letters 'T' and 'e'.
Seems a bit pointless to attach a sample file for issues that are clear as day with default values. But perhaps this screenshot shows what I mean.
Generally, graphics software provides the option to either enable or disable kerning. The default differs per package, but kerning should never be ignored outright.
Note that this simply constitutes taking into account a table of data that is already present in the font file (or not, depending on the quality of the typeface). That said, it's possible that Bfont might not have kerning data present in its file at all, but I have tested with typefaces that do, and the same issue shows.
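A toy sketch of what applying such a kern table amounts to; the glyph advances and kern values here are invented font units, purely for illustration:

```python
# Per-glyph horizontal advances and a pair-kerning table,
# as they might be read from a font file (values invented).
ADVANCE = {"T": 600, "e": 500}
KERN = {("T", "e"): -110}  # pull the 'e' under the arm of the 'T'

def line_width(text, apply_kerning=True):
    width = sum(ADVANCE[g] for g in text)
    if apply_kerning:
        # Add the pair adjustment for each adjacent glyph pair.
        width += sum(KERN.get(pair, 0) for pair in zip(text, text[1:]))
    return width

print(line_width("Te", apply_kerning=False))  # 1100
print(line_width("Te"))                       # 990
```

The point being: the adjustment data already exists in the font; the text engine only has to look it up per glyph pair.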
Sep 11 2019
All non-master commits have been "unpublished" for a while now.
@William Reynish (billreynish) Ah, 'non-master'. Didn't spot that before. I haven't said anything!
In the Transform Preferences, Rendering Space should be labeled Working Space, since it also involves the Compositor and Sequencer.
The Image Editor needs a clearly visible View Transform option, since going back and forth to the Color Management panel is counterproductive.
Also correct. Just like the Save as Render flag, View as Render should make way for a more granular UI (Color Space is already there, but View Transform is not).
Note that the settings inside the Color Management panel don't even influence the view inside the Image Editor, as they only pertain to rendering and output.
View as Render is not only useless and badly named, it's also incorrect to use it. An image used as input should not have a transform used for viewing output slapped onto it. The checkbox should be replaced by a View Transform enum, also stored per image datablock.
LiftGammaGain should stay, since many videos are composited, edited and graded in Rec.709, and many workflows involve this space. But Lift should be fixed: it behaves as Lift+Power.
Just because your input material is Rec.709 doesn't mean that it is display-referred. On the contrary: the Blender compositor operates linearly, which is where Lift/Gamma/Gain breaks. The algorithm works on shadows, midtones and highlights, which are impossible to separate when using scene-referred data.
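A small sketch, using one common display-referred Lift/Gamma/Gain formulation (assumed for illustration, not necessarily Blender's exact code), showing how Lift misbehaves once values exceed 1.0:

```python
def lift_gamma_gain(x, lift=0.0, gamma=1.0, gain=1.0):
    # One common display-referred formulation; it assumes x in [0, 1].
    y = gain * (x + lift * (1.0 - x))
    return max(y, 0.0) ** (1.0 / gamma)

# Display-referred input: lift raises the shadows, barely touches highlights.
print(lift_gamma_gain(0.1, lift=0.2))  # ≈ 0.28
print(lift_gamma_gain(0.9, lift=0.2))  # ≈ 0.92

# Scene-referred input: the (1 - x) term is negative above 1.0,
# so 'lifting the shadows' now *darkens* the highlights instead.
print(lift_gamma_gain(5.0, lift=0.2))  # ≈ 4.2
```

The shadow/highlight separation only exists relative to a fixed [0, 1] display range; scene-referred data has no such anchor.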
Also, log-based controls should be included, like a Contrast with a Pivot option (add Pivot to the Contrast node?)
Shadow/Midtone/Highlight control wheels should be included for log-based secondaries.
Again, compositing is a linear affair. The log controls you mention (including a shadows/midtones/highlights separation) are geared towards a grading workflow, which should not happen on linear data unless the tools apply a log shaper while input and output remain stored linearly, as in an ACES workflow. But I digress.
It's beyond the scope of this task, but @Troy Sobotka (sobotka) did suggest a dedicated Grading Editor with correct controls for a log-based workflow. This editor would for example allow the colourist to work on NodeTrees linked to VSE strips, which is similar to e.g. DaVinci Resolve.
That Pivot option, however, is something that could very well be added to colour manipulation nodes already present in Blender, like Brightness/Contrast, but also Slope/Offset/Power.
A LUT node to apply different LUTs, not only for artistic but also for technical transforms, tonemappers, etc.
LUT nodes, enums for every node: these are all ways to prep data more adequately for the nodes it's about to go through.
Is there a reason for ea94cade2991 to be an unpublished commit?
Sep 5 2019
Found a few more related tasks that might be worth listing here.
Some of them are quite old, but the fact that they are still marked as open at least warrants a check of whether they should be. Especially the ones containing lengthy discussions, which may or may not still be valid today.
Sep 4 2019
Sep 3 2019
Aug 28 2019
The previous status was probably the best way to ensure this never made it into a developer's field of attention.
Updating projects and priority so the devs can decide where this belongs.
Aug 21 2019
Great to see that the wiki document made it into a task. I would like to suggest this task to be made a parent task of all other open tasks regarding the subject. It's great that this task is finally here - it would be even better if we make it an overview of everything else - stuff like T46860, T34684, T68242, T67832, etc. Then at least we have a bird's eye view on the scope of what needs to be done.
Aug 19 2019
Aug 15 2019
Seemed to be related to X Mirror being enabled.
Jul 20 2019
Apr 18 2019
Apr 15 2019
Apr 10 2019
After performing a test just now at the studio, it seems that the Random distribution method is now render farm safe, which I hear it hasn't been before. Is this correct?
If that behaviour remains consistent, I frankly don't need the Jittered method anymore - since it was only used here to circumvent that problem.
Apr 2 2019
On Windows I'm still having the problem. Here's a blend file that shows what I mean. In my experience the problem manifests itself in different patterns, but the repetition is noticeable nonetheless. Additionally, the system generates multiple particles in the same spot. Changing the jittering does nothing but change the angle of the lines in the pattern, and the seed value doesn't resolve it either.
Apr 1 2019
You can make it totally random by choosing the Random distribution method, or semi-random by changing the jittering.