Mon, Sep 16
Fri, Sep 13
Thu, Sep 12
Simply create a new text object; the issue is already clearly visible in the placeholder text, between the letters 'T' and 'e'.
Seems a bit pointless to attach a sample file for issues that are clear as day with default values. But perhaps this screenshot shows what I mean.
Generally, graphics software provides the option to either enable or disable kerning. Which is the default differs per package, but kerning should never be ignored entirely.
Note that this simply means taking into account a table of data that is already present in the font file (or not, depending on the quality of the typeface). That said, it's possible that Bfont doesn't contain kerning data at all, but I have tested with typefaces that do, and the same issue shows.
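To illustrate what "taking into account a table of data" amounts to, here's a toy sketch of pair kerning during layout. All advance widths and the kern value are made up for the example, not taken from any real font:

```python
# Toy illustration of pair kerning: the font ships a table of
# per-pair adjustments, and layout only has to look them up while
# summing advance widths. All metrics below are invented.

ADVANCE = {"T": 600, "e": 480, "x": 470, "t": 340}   # nominal advances
KERN = {("T", "e"): -110}                            # pair adjustments from the font

def line_width(text):
    """Sum advances, applying the kerning adjustment for each pair."""
    width = 0
    for i, ch in enumerate(text):
        width += ADVANCE[ch]
        if i + 1 < len(text):
            width += KERN.get((ch, text[i + 1]), 0)
    return width

print(line_width("Text"))  # 600 + 480 + 470 + 340 - 110 = 1780
```

Ignoring kerning simply means skipping the `KERN` lookup, which is why the 'T'/'e' pair ends up visibly too far apart.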
Wed, Sep 11
All non-master commits have been "unpublished" for a while now.
@William Reynish (billreynish) Ah, 'non-master'. Didn't spot that before. I haven't said anything!
In the Transform Preferences, Rendering Space should be labeled Working Space, since it also involves the Compositor and Sequencer.
The Image Editor needs a clearly visible View Transform option, since going back and forth to the Color Management panel is counterproductive.
Also correct. Just like the Save as Render flag, View as Render should make way for a more granular UI (Color Space is already there, but View Transform is not).
Note that the settings inside the Color Management panel don't even influence the view inside the Image Editor, as they only pertain to rendering and output.
View as Render is not only useless and badly named, it's also incorrect to use it. An image used as input should not have a transform used for viewing output slapped onto it. The checkbox should be replaced by a View Transform enum, also stored per image datablock.
Lift/Gamma/Gain should stay, since many videos are composited, edited and graded in Rec.709, and many workflows involve this space. Lift should be fixed, though: it behaves as Lift+Power.
Just because your input material is Rec.709 doesn't mean it is display referred. On the contrary: the Blender compositor operates linearly, which is where Lift/Gamma/Gain breaks down. The algorithm works on shadows, midtones and highlights, which are impossible to separate when the data is scene referred.
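To make that concrete, here's a sketch using one common formulation of Lift. Formulations differ per package, so treat the formula as illustrative, not as what any particular tool implements:

```python
# One common formulation of Lift:
#   out = in + lift * (1 - in)
# The (1 - in) mask assumes display-referred data in [0, 1]:
# lift raises the shadows while white (in = 1) stays untouched.

def apply_lift(value, lift):
    return value + lift * (1.0 - value)

# Display referred: behaves as expected.
print(apply_lift(0.0, 0.1))   # shadows raised to 0.1
print(apply_lift(1.0, 0.1))   # white untouched: 1.0

# Scene referred: values above 1.0 flip the sign of the mask,
# so a positive lift *darkens* bright scene values instead.
print(apply_lift(4.0, 0.1))   # ≈ 3.7, i.e. pulled down, not lifted
```

With unbounded scene-referred data there is no meaningful "1.0 = white" anchor, which is exactly why the shadows/midtones/highlights separation falls apart.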
Also, log-based controls should be included, like a Contrast with a Pivot option (add Pivot to the Contrast node?).
Shadow/Midtone/Highlight control wheels should be included for log-based secondaries.
Again, compositing is a linear affair. The log controls you mention (including a shadows/midtones/highlights separation) are geared towards a grading workflow, which should not happen on linear data unless the tools apply a log shaper internally while both input and output stay linear, like in an ACES workflow. But I digress.
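A minimal sketch of what "apply a log shaper" means. The constants are arbitrary and don't correspond to any actual ACES transform; the point is only the encode, grade, decode structure:

```python
import math

# Sketch of a log shaper wrapped around a grading operation, assuming
# scene-referred linear input and output. MID_GREY and RANGE are
# illustrative constants, not any real camera/ACES log encoding.

MID_GREY = 0.18
RANGE = 10.0  # stops of latitude mapped around 0.5 in log space

def lin_to_log(x):
    return math.log2(x / MID_GREY) / RANGE + 0.5

def log_to_lin(y):
    return MID_GREY * 2.0 ** ((y - 0.5) * RANGE)

def grade(y, offset=0.0):
    # A trivial "grade": an offset in log space is an exposure shift.
    return y + offset

value = 0.5  # some scene-referred linear value
graded = log_to_lin(grade(lin_to_log(value), offset=0.0))
print(graded)  # with offset 0.0 the round trip returns ≈ 0.5
```

The grading controls then get the bounded, perceptually spaced data they were designed for, while the node tree keeps exchanging linear values with everything else.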
It's beyond the scope of this task, but @Troy Sobotka (sobotka) did suggest a dedicated Grading Editor with correct controls for a log-based workflow. This editor would for example allow the colourist to work on NodeTrees linked to VSE strips, which is similar to e.g. DaVinci Resolve.
That Pivot option, however, is something that could very well be added to colour manipulation nodes already present in Blender, like Brightness/Contrast, but also Slope/Offset/Power.
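For reference, a sketch of what contrast with a pivot typically looks like in grading tools; the exact formula a Blender node would use is of course up to the developers:

```python
# Common pivoted-contrast formulation (hypothetical for Blender):
#   out = pivot * (in / pivot) ** contrast
# Values at the pivot are unchanged; values above/below it are
# pushed away from or pulled towards the pivot.

def contrast_with_pivot(value, contrast, pivot=0.18):
    return pivot * (value / pivot) ** contrast

print(contrast_with_pivot(0.18, 2.0))  # the pivot itself stays put: 0.18
print(contrast_with_pivot(0.36, 2.0))  # above the pivot gets pushed up: 0.72
```

The appeal is that the anchor point becomes explicit, instead of being hard-coded around some assumed mid grey as in the current Brightness/Contrast node.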
A LUT node to apply different LUTs, not only for artistic but also technical transforms, tone mappers, etc.
LUT nodes, enums for every node… all ways to prep data more adequately for the nodes they're about to go through.
I noticed the file browser asks for confirmation upon clicking the New Folder button. Is this new? It's unexpected and therefore easily missed when clicking the button, moving the mouse, not noticing the confirmation, and therefore not seeing a new folder being made.
Thu, Sep 5
Found a few more related tasks that might be worth listing here.
Some of them are quite old, but the fact that they are still marked as open at least warrants checking whether they should be, especially the ones with lengthy discussions whose current validity I'm not sure about.
Wed, Sep 4
Tue, Sep 3
Wed, Aug 28
Previous status was probably the best way to never make it into a developer's field of attention.
Updating projects and priority so the devs can decide where this belongs.
Wed, Aug 21
Great to see that the wiki document made it into a task. I would suggest making this task the parent of all other open tasks on the subject. It's great that this task is finally here; it would be even better as an overview of everything else. Then at least we have a bird's eye view of the scope of what needs to be done.
Aug 19 2019
Aug 15 2019
Seemed to be related to X Mirror being enabled.
Jul 20 2019
Apr 18 2019
Apr 15 2019
Apr 10 2019
After performing a test just now at the studio, it seems that the Random distribution method is now render farm safe, which I hear it hasn't been before. Is this correct?
If that behaviour remains consistent, I frankly don't need the Jittered method anymore - since it was only used here to circumvent that problem.
Apr 2 2019
On Windows I'm still having the problem. Here's a blend file that shows what I mean. In my experience the problem manifests itself in different patterns, but the repetition is noticeable nonetheless. Additionally, the system generates multiple particles in the same spot. Changing the jittering does nothing but change the angle of the lines in the pattern, and the seed value doesn't resolve it either.
Apr 1 2019
You can make it totally random by choosing the Random distribution method, or semi-random by changing the jittering.
Mar 31 2019
Mar 25 2019
Feb 28 2019
I didn't specifically mean moving the cursor. That as a concept doesn't sit with me very well in all cases. Keystrokes are fine being tied to specific functionality in specific editors, and therefore require Blender to keep the focus in the intended area. But I don't like that to be tied to where the mouse *is*. While typing in the text editor or Python console, for example, this can have me pressing 237689 keystrokes in the viewport before I realise it.
Any news on this?
Feb 27 2019
Just discussed this with @Julien Kaspar (JulienKaspar). Seems like it's not a bug per se, only confusing behaviour emphasized by another bug.
The fact that add-ons with panels set to '.objectmode' disappear when changing modes is of course correct behaviour - the add-ons themselves are at fault for declaring they are related to Object Mode, while that's not necessarily true.
The confusing behaviour is caused by the fact that the top bar *looks* application-wide, but should have contents directly linked to the UI area and the tool that are currently active.
The actual bug is that the top bar doesn't update when it should, namely when focus shifts from one area to another. According to @Julien Kaspar (JulienKaspar), this has been discussed before, so I assume there's already a task for that?
Feb 25 2019
Feb 15 2019
Feb 14 2019
Feb 11 2019
Feb 7 2019
Feb 1 2019
I'd like to mention one that's been confusing me forever.
Randomize Transform has meters as the unit for location and scale. For location this makes sense, but for scale I feel it doesn't.
Also, if I understand it correctly, setting this number to 1.15m would set the boundaries for the randomisation to 85% - 115%, which is not very clear in this UI.
I would like to propose two fixes for this:
- Lose the meters unit for the random scale part of the tool, and display the input as the factor it actually is.
- Work with actual, defined boundaries in the UI - minimum and maximum. I feel this offers more control and predictability than using one number that the user has to imagine going in both directions of the axis.
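The min/max proposal, sketched in plain Python. The parameter names are hypothetical, not actual Randomize Transform properties:

```python
import random

# Sketch of the proposed min/max randomise-scale behaviour.
# 'scale_min'/'scale_max' are hypothetical names, not real Blender
# operator properties. If the current single-number UI is indeed
# symmetric around 1.0, a factor f would map to
# scale_min = 2 - f, scale_max = f (e.g. 1.15 -> 0.85..1.15).

def random_scale(scale_min=0.85, scale_max=1.15, seed=None):
    rng = random.Random(seed)
    # One independent factor per axis, with explicit bounds.
    return tuple(rng.uniform(scale_min, scale_max) for _ in range(3))

print(random_scale(seed=1))  # three factors, each within [0.85, 1.15]
```

Explicit bounds also cover asymmetric cases (say, 0.9 to 1.5) that a single symmetric number simply can't express.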
Dec 21 2018
Display the UVs very clearly at all times for all 3D Viewport Modes when in the UV Editor
I totally agree with @Julien Kaspar (JulienKaspar) here. I've always found it super annoying that I had to go into edit mode just to see the UVs on a mesh. In object mode, it could even just display them, and only make them editable in edit mode - this would be very intuitive across Blender, and similar to the way Maya works as well, for example.
Dec 18 2018
I am really dying to know what new tools and interface changes are planned to make the UV Editor better than it currently is, whether or not that's related to splitting the editors. We're just wondering.
@William Reynish (billreynish) To be honest, that's not really answering @Julien Kaspar (JulienKaspar)'s question. And I must admit I've asked the same before. What are the plans for the editors after they've been separated? Any new tools planned, UI revamp, ..?
Aaaaand my bad, little oversight that was completely my own fault.
In my case, both the background colour and the shading are pure black.
Attached is a screenshot of rendered mode with film transparency and the problem.
This is at the studio in Linux build 08e6948da50, so it does have the same problem as my 9149e894210 build on Windows at home.
It can't really be a driver thing afaik, since this is not using GPU rendering.
Fair enough, makes sense.
Is it normal though for undo steps to take up more memory when the complexity of said objects increases? Even if the datablocks are linked from another library?
A cube, for example, adds kilobytes. A forest of linked trees adds tens of megabytes each time.
@Brecht Van Lommel (brecht) What if I wanted to paint on the World texture? Good idea about separating mesh-related data and independent images. But I still don't see why UV editing and image painting have to happen in the same editor, especially if they're going to limit data access to what is selected in the viewport.
I would suggest this:
- UV Editor: data depends on the selected mesh, allows you to work in a dedicated editor
- Image Editor: independently selected image datablock (you want to be able to paint on anything really), optionally show the selected mesh's UVs. Image Viewer doesn't really need to be something separate. I mean, if you can look at an image, you might as well be able to paint on it without having to open up a different editor.
Dec 11 2018
@Brecht Van Lommel (brecht) I would expect the same from the RGB values, no? Only involve the view transform in displaying the colours, not the numbers? I just assumed all of the values were always linear, and wherever I'd see an actual colour displayed, that would have been transformed. Or am I seeing things wrong?
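For clarity, this is the distinction I mean, using the standard sRGB transfer function as a stand-in for whatever view transform is active:

```python
# The stored value is linear; only the *displayed* swatch should get
# the view transform. Standard sRGB OETF used here as an example.

def linear_to_srgb(x):
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

linear_value = 0.5                    # what the RGB number field would show
display_value = linear_to_srgb(0.5)   # what the on-screen swatch encodes
print(round(display_value, 4))        # ≈ 0.7354
```

So my expectation was: numbers always report the linear value, and the transform only ever happens on the way to the screen.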
Just wanted to report that this has been back for a few days. Current build with the problem is e4153946ad1.
In the Shader or Compositor Editor and the Dope Sheet or Graph Editor you at least get different content shown. In the UV or Image Editor you can display both UVs and images.
The fact that we used to have to call it the UV/Image Editor in itself demonstrates that it makes more sense to be split.
Dec 6 2018
Dec 4 2018
@Brecht Van Lommel (brecht) that might be the case, I can't reproduce it myself anymore either. Thanks!