- User since: Jun 16 2017, 1:41 AM (92 w, 1 d)
Thu, Feb 28
I didn't specifically mean moving the cursor; that concept doesn't sit well with me in all cases. Keystrokes are fine being tied to specific functionality in specific editors, which requires Blender to keep focus in the intended area. But typing in the Text Editor or Python Console without paying attention to where the mouse is can have you pressing 237689 keystrokes into the viewport before you realise it.
Any news on this?
Wed, Feb 27
Just discussed this with @Julien Kaspar (JulienKaspar). Seems like it's not a bug per se, only confusing behaviour emphasized by another bug.
The fact that add-ons with panels set to '.objectmode' disappear when changing modes is of course correct behaviour - the add-ons themselves are at fault for declaring they are related to Object Mode, while that's not necessarily true.
The confusing behaviour is caused by the fact that the top bar *looks* application-wide, but should have contents directly linked to the UI area and the tool that are currently active.
The actual bug is that the top bar doesn't update when it should, namely when focus shifts from one area to the other. According to @Julien Kaspar (JulienKaspar), this has been discussed before, so I assume there's already a task for that?
Feb 1 2019
I'd like to mention one that's been confusing me forever.
Randomize Transform has meters as the unit for location and scale. For location this makes sense, but for scale I feel it doesn't.
Also, if I understand it correctly, setting this number to 1.15 m sets the randomisation boundaries to 85% to 115% of the original scale, which is not at all clear from the UI.
I would like to propose two fixes for this:
- Lose the meters unit for the random scale part of the tool, and display the input as the factor it actually is.
- Work with actual, defined boundaries in the UI - minimum and maximum. I feel this offers more control and predictability than using one number that the user has to imagine going in both directions of the axis.
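To illustrate the second proposal, here is a minimal sketch of what min/max scale randomisation could look like. The function name and parameters are hypothetical, purely for illustration, and are not Blender's actual operator API:

```python
import random

def randomize_scale(scale, factor_min=0.85, factor_max=1.15, seed=None):
    """Multiply each axis by an independent random factor in [factor_min, factor_max].

    Hypothetical sketch of the proposed min/max UI; the names here are
    illustrative, not Blender's real Randomize Transform parameters.
    """
    rng = random.Random(seed)
    return tuple(s * rng.uniform(factor_min, factor_max) for s in scale)

# Blender's current single-value UI: entering "1.15" implicitly means
# factor_min=0.85, factor_max=1.15 -- the value swings both ways around 1.0.
new_scale = randomize_scale((1.0, 1.0, 1.0), seed=42)
```

Exposing `factor_min` and `factor_max` directly would make asymmetric ranges (e.g. 0.9 to 1.5) possible, which the current single-number UI cannot express at all.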
Dec 21 2018
Display the UVs very clearly at all times for all 3D Viewport Modes when in the UV Editor
I totally agree with @Julien Kaspar (JulienKaspar) here. I've always found it super annoying that I had to go into edit mode just to see the UVs on a mesh. In object mode, it could even just display them, and only make them editable in edit mode - this would be very intuitive across Blender, and similar to the way Maya works as well, for example.
Dec 18 2018
I'm really dying to know what new tools and interface changes are planned to make the UV Editor better than it currently is, whether or not that's related to splitting the editors. We're just wondering.
@William Reynish (billreynish) To be honest, that's not really answering @Julien Kaspar (JulienKaspar)'s question. And I must admit I've asked the same before. What are the plans for the editors after they've been separated? Any new tools planned, UI revamp, ..?
Aaaaand my bad, little oversight that was completely my own fault.
In my case, both the background colour and the shading are pure black.
Attached is a screenshot of rendered mode with film transparency and the problem.
This is at the studio in Linux build 08e6948da50, so it does have the same problem as my 9149e894210 build on Windows at home.
It can't really be a driver thing afaik, since this is not using GPU rendering.
Fair enough, makes sense.
Is it normal though for undo steps to take up more memory when the complexity of said objects increases? Even if the datablocks are linked from another library?
A cube, for example, adds kilobytes. A forest of linked trees adds tens of megabytes each time.
@Brecht Van Lommel (brecht) What if I wanted to paint on the World texture? Good idea about separating mesh-related data and independent images. But I still don't see why UV editing and image painting have to happen in the same editor, especially if they're going to limit data access to what is selected in the viewport.
I would suggest this:
- UV Editor: data depends on the selected mesh, allows you to work in a dedicated editor
- Image Editor: independently selected image datablock (you want to be able to paint on anything, really), optionally showing the selected mesh's UVs. The Image Viewer doesn't really need to be something separate; if you can look at an image, you might as well be able to paint on it without having to open a different editor.
Dec 11 2018
@Brecht Van Lommel (brecht) I would expect the same from the RGB values, no? Only involve the view transform in displaying the colours, not the numbers? I just assumed all of the values were always linear, and wherever I'd see an actual colour displayed, that would have been transformed. Or am I seeing things wrong?
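For reference, this is roughly what such a display transform looks like. The sketch below is the standard sRGB transfer function (IEC 61966-2-1), not necessarily the transform Blender applies (which is configurable via OCIO, e.g. Filmic); it just illustrates the idea that stored values stay linear while only the on-screen colour is transformed:

```python
def linear_to_srgb(c):
    """Standard sRGB transfer function (IEC 61966-2-1) for one channel.

    Illustrates the distinction discussed above: scene-referred values
    stay linear, and only the swatch drawn on screen goes through a
    transform like this one.
    """
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

# A linear value of 0.5 is displayed much brighter than 50%:
display = linear_to_srgb(0.5)  # ~0.735
```

This is why showing the transformed numbers in an RGB field would be confusing: editing "0.735" and getting a stored value of 0.5 breaks the user's mental model.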
Just wanted to report that this has been back for a few days. Current build with the problem is e4153946ad1.
In the Shader or Comp Editor and the Dope Sheet or Graph Editor you at least get different content shown. In the UV or Image Editor you can display both UVs and images.
The fact that we used to have to call it the UV/Image Editor in itself demonstrates that it makes more sense to be split.
Dec 4 2018
@Brecht Van Lommel (brecht) that might be the case, I can't reproduce it myself anymore either. Thanks!