- User Since
- Feb 6 2013, 7:05 PM (427 w, 4 d)
Tue, Apr 13
Just a thought about UI: I think it makes sense to have filter options both inside the popover and in a sidebar, akin to tool/brush options in the 3D view, which are accessible both through the header as popovers and in the sidebar (so they're always visible).
Fri, Apr 9
Thu, Apr 8
Wed, Apr 7
I thought I had a bug, so I checked this task, and if I understand correctly, overrides in material node trees are still incomplete, am I correct? It works in some files and not in others (same character though, which is unexpected). Should I report it? (I am using 2.92)
Wed, Mar 31
@Hans Goudey (HooglyBoogly) The radio button idea is consistent with other places in Blender (fcurve modifiers), and it helps that it's an explicit button instead of an "invisible-hitbox" activation like in 2.92. It does take up horizontal space in a place that's already running out of it, though (unless the new datablock selector goes in as well?)
Having shortcuts affect something other than what appears to be selected is indeed confusing. This is not how the rest of Blender works: viewport, sequencer... every editor operates on selection. The problem here is that this is not a selection, but it looks like one. I'm not sure what being active entails for a modifier, but I think the indicator should be proportional to its importance. If we're going to have shortcuts affect the hovered modifier, then there needs to be a hover indicator that's more prominent than the active indicator.
Thu, Mar 25
Tue, Mar 23
Mon, Mar 22
I don't know, doesn't this seem more complicated than the problem it tries to solve? Don't frames and reroutes already provide all the organizing functionality? Is scrolling actually an issue? Maybe a minimap could help in that regard?
Sun, Mar 21
Concerning the "limits" fmodifier: wouldn't it make sense to group the value sliders by dimension (first X, then Y) rather than by minimum and maximum? Using this layout, I feel there is a disconnect between the way they're grouped and the way I mentally picture myself setting the limits in question: first set the abscissa bounds, then set the ordinate bounds. I realize this is quite nitpicky.
Mar 6 2021
Mar 4 2021
Mar 3 2021
Any news ?
Feb 25 2021
I'm not sure why a new UI is necessary, since this seems to cover exactly what node groups do? Is the idea to communicate to the user that they are working in a special sandbox and only certain nodes are allowed? (As opposed to a regular node group, which has no such limitations.)
Apart from that you have my blessing. I've also started finding long strings of attribute nodes unnecessarily difficult to read.
Feb 9 2021
Jan 30 2021
Jan 29 2021
It may be good to expose individual counts as sockets, so as to drive them from the modifier interface.
In case the attribute doesn't exist yet, will replace create it? If so, how about calling it create or create/replace?
Jan 28 2021
Hope that passes review. That would be quite the qol improvement. Thanks for working on it anyway.
Jan 24 2021
Jan 21 2021
Jan 18 2021
If indeed the fake user is on its way out, then the user count can be overlaid on top of the datablock icon on the left, and the duplicate button can be moved to the dropdown menu, as it is arguably less essential than the unlink button.
Jan 15 2021
Jan 14 2021
Apologies for the noise, but I'll drink to your health tonight @Brecht Van Lommel (brecht). Big, big thanks !!
Jan 12 2021
That is so much better. I don't see why decorators should be hidden, though: these properties can either be connected to other nodes or animated/driven in their own right, as far as I know. I think this should be communicated as with all animatable properties.
Jan 5 2021
Dec 6 2020
As far as I know regular modifiers are not going anywhere so anyone can still use the array modifier in between geometry nodes.
Dec 2 2020
Nov 24 2020
Sorry for the noise, does this enable middle-click-drag to scroll popovers?
Nov 14 2020
Will they? I thought current modifiers would be kept? Is the plan to convert them all to "node tree presets"?
Nov 12 2020
Oct 31 2020
Oct 27 2020
Oct 10 2020
@Don Hopkins (SimHacker) I hadn't seen your answer. Thanks for your interest! I don't think mouse warping can work with devices that communicate an absolute pointer position (tablets)? My idea was more in line with spawning the menu partly offscreen, just as if the interface actually extended beyond the screen's edge. This works on paper because the minimum necessary interaction area for pie menus (or even regular ones) is much smaller than their entire displayed size, and one way to interact with them is to simply flick the pointer in any direction, with no need to actually reach the desired button. However, I think pointer warping can be desirable when the user uses a mouse and the invoked menus are small enough that they don't cover the entire editor.
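To make the flick idea concrete, here's a minimal sketch (plain Python; the clockwise-from-top layout convention is made up for illustration, not Blender's actual pie ordering) of how the flick direction alone is enough to pick a pie item, without the pointer ever reaching the button:

```python
import math

def pie_slice_from_flick(dx, dy, item_count):
    """Map a pointer flick direction to a pie menu item index.

    Items are assumed to be laid out clockwise starting at the top
    (hypothetical convention). dy is positive when flicking upward.
    """
    angle = math.atan2(dx, dy)  # 0 at the top, increasing clockwise
    if angle < 0:
        angle += 2 * math.pi
    slice_width = 2 * math.pi / item_count
    # Offset by half a slice so the top item is centered on "up".
    return int((angle + slice_width / 2) / slice_width) % item_count

# With 4 items: an upward flick picks item 0, a rightward flick item 1,
# a downward flick item 2, a leftward flick item 3.
```

Since only the direction matters, the menu can sit partly offscreen and the interaction still works.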
Sep 25 2020
Sep 24 2020
I'll add my thoughts because I feel something may be overlooked: besides the fact that this falloff visualization is completely standard in all the DCCs I've used (though that alone doesn't make it a necessity in Blender), it is useful because:
- adjusting the "proportional size" before a transform operation is impossible (it should be possible), so there is no way to know beforehand how big an area will be affected
- tablet users have no practical way of changing the proportional size during a transform (it's bound to the mouse wheel), so they are pretty much relegated to tweaking the proportional size in the last operator panel
- finally, they can't even do that, because Blender struggles too much when changing any value in the last operator panel if the mesh is bigger than about 50k verts (which is very, very low)
Sep 21 2020
That sounds very flexible, best of both worlds! I assume existing modifiers will be kept around for transition and compatibility?
Additionally, I suppose custom interfaces for "node modifiers" will require more thorough access to socket types, widget creation (checkboxes...), etc. than we currently have in Cycles node groups?
Sep 17 2020
Sep 13 2020
It doesn't sound like this has been discussed with Pablo? Fast navigate is a necessity to keep navigating smoothly around a heavy multires object; it is standard in sculpting programs. Then, @ronan ducluzeau (zeauro) is right about a multires cube not being a good candidate for "starter geometry". The workflow introduced by Pablo goes like this: 1. block out forms from a primitive, using voxel remesh; 2. retopologize and use multires on the new object. For a "starter geometry", it does make more sense to use a subdivided cube.
Aug 14 2020
Typical characters are symmetrical; props and other objects may or may not be, so I would say it makes sense in the context of rigging for symmetry to be a per-armature setting as well. However, it would probably have to be split off from pose mode's symmetry options, because animators don't want to pose their character symmetrically (except in exceptional circumstances); otherwise any symmetrical edit required in the character rig file would require the rigger to turn symmetry on, make the needed fixes, and turn it off again before saving. Or maybe this is something riggers could live with?
Aug 2 2020
Given the number of new tools added recently to sculpt mode, I reckon it'd make sense to only have a single "brush" tool, and have all brushes be accessible from there.
Jul 2 2020
Jul 1 2020
Jun 29 2020
Last I heard, pictures and videos of other software are prohibited here for legal reasons - I think it's safer to remove it and take it to devtalk.blender.org/ or https://rightclickselect.com/p/ where there can be a lengthy design discussion about this particular feature.
Indeed... still it would be nice to support snapping along a different axis than the transform axis. (Non-orthogonal snapping)
Jun 18 2020
Jun 11 2020
Jun 10 2020
Jun 9 2020
Vertex sliding is basically a translation with an axis constraint, so I guess it could behave exactly the same (along with planned improvements such as "snap to intersecting geometry").
Jun 6 2020
May 25 2020
May 6 2020
Apr 16 2020
Just chiming in to say that, as useful as it sounds, it would be much appreciated if this were kept optional - I like selecting what I can see without any side effects, so the sync between xray and selection occlusion suits me fine.
Apr 14 2020
Thanks for the merge. I wasn't sure the issue affected all modifiers.
Apr 12 2020
Apr 6 2020
To me it seems the gizmo just needs to follow the input transform values. @Julian Eisel (Severin) in your example, showing the gizmo while transforming and continually updating its position to match the selection's center of mass would probably look like it's lagging behind the mouse cursor. In my opinion as an artist, it seems okay to update it according to the input transform values while the operator is running, and then recalculate its position once the transformation is done (just like we do now).
Mar 31 2020
If we don't plan on supporting live creation of node trees from edit mesh operators, then a potential solution would be to create a "procedural object". This object would not have a predetermined data type (mesh, curve, volume...) but would be able to convert between types internally through conversion nodes and use any operators related to them, such as VDB boolean operations, curve lofting, or regular mesh operations. It could also have its own edit mode where selection, instead of activating a given vertex or edge, would activate the relevant node and display its gizmo(s). Just grist for the mill.
Mar 23 2020
Blender's Vertex Groups perhaps should be renamed to Weight Groups, since they don't really store selections, don't allow for edge or face data - instead they allow you to specify a weight value per vertex.
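A tiny model sketch of what such a "weight group" stores (plain Python with hypothetical names, not the actual bpy API): a sparse per-vertex weight map, with no edge/face entries and no plain "selected" flag.

```python
class WeightGroup:
    """Conceptual model of a vertex group: name + per-vertex weights."""

    def __init__(self, name):
        self.name = name
        self.weights = {}  # vertex index -> float weight in [0, 1]

    def assign(self, indices, weight):
        # Clamp to the valid weight range and store per vertex.
        for i in indices:
            self.weights[i] = max(0.0, min(1.0, weight))

    def weight(self, index):
        # An unassigned vertex has no entry at all, not weight 0 --
        # membership and weight are distinct pieces of information.
        if index not in self.weights:
            raise KeyError(f"vertex {index} not in group '{self.name}'")
        return self.weights[index]

group = WeightGroup("ArmDeform")
group.assign([0, 1, 2], 0.75)
```

Nothing here is a selection state, which is the point of the renaming argument.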
Mar 22 2020
As long as we're talking about simulation with meshes, and not FLIP particles or other kinds of data, I think it would make sense to present geometry nodes and higher-level simulation nodes in the same place: rigid bodies, soft bodies, and cloth all manipulate geometry, the main difference being that the latter have to be aware of their previous state to compute the current state, hence requiring a simulation step from frame 1 - but the way I see it, they're just a different tool to manipulate the same data.
To add to that, I believe there are advantages to having a mesh go freely back and forth between generative modeling and simulation: adding procedural cracking or wrinkling onto simulated cloth or skin, or instancing static flowers on top of a plant asset rocking in the wind, for instance. The use cases seem pretty much infinite.
Mar 21 2020
@Brecht Van Lommel (brecht) other programs separate object manipulation (working with transforms, matrices, object instancing...) from geometry manipulation; I'd say that's good UX. I expect that within the latter everything would happen in object space, with utilities to optionally convert to other spaces (matrix nodes are already present in Jacques' functions branch).
Mar 20 2020
@William Reynish (billreynish) I think the bracket-socket works very nicely for constants.
For the grouping of newly created geometry, it's important to do two things:
- give the choice of component type, so that a following node, such as a bevel, can act on either, say, the new faces or the new edges
- split into several groups when relevant, *eg* extrusion tip and sides
Mar 13 2020
Feb 24 2020
The only reasonable way I can see is to only show properties that are available on all selected items. Take the example in the task, where a Point and a Sun light are both selected. Point lights have a Radius property, but Sun lights have an Angle property. If we react to the selection instead of the active item, we could either show both of those properties or neither. Showing just one of them doesn't work - that only makes sense with the *active* option. Showing neither is much clearer, because then you know that everything you see can be multi-edited; besides, our layouts might break if we tried to merge them to show all properties.
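The rule is just a set intersection over the selection. A quick sketch in plain Python (the property dicts are made-up stand-ins for the real RNA layouts):

```python
def multi_edit_properties(selected_items):
    """Return only the property names common to every selected item.

    Each item is modeled as a dict of property name -> value; this is a
    hypothetical stand-in, not Blender's actual property system.
    """
    if not selected_items:
        return set()
    common = set(selected_items[0])
    for item in selected_items[1:]:
        common &= set(item)  # drop anything not shared by all items
    return common

# Point lights have Radius, Sun lights have Angle; only the shared
# properties survive the intersection.
point_light = {"color": ..., "power": ..., "radius": ...}
sun_light = {"color": ..., "power": ..., "angle": ...}
```

Everything left after the intersection is guaranteed to be multi-editable, which is exactly the clarity argument above.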
Feb 13 2020
If there's a single multiply node that can handle floats and vectors, what does its interface look like? We should keep the value field when nothing is connected to the socket, but how should it behave then? As a vector, or as a single value field?
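One possible answer, sketched in plain Python just to make the question concrete (this is a hypothetical dispatch, not how Blender actually resolves it): broadcast a lone float to a vector whenever the other input is a vector.

```python
def node_multiply(a, b):
    """Multiply accepting floats or 3-component vectors (sketch).

    A float paired with a vector is implicitly broadcast to
    (f, f, f) before the component-wise multiply.
    """
    a_vec = isinstance(a, (list, tuple))
    b_vec = isinstance(b, (list, tuple))
    if not a_vec and not b_vec:
        return a * b          # plain scalar multiply
    if not a_vec:
        a = (a, a, a)         # broadcast float -> vector
    if not b_vec:
        b = (b, b, b)
    return tuple(x * y for x, y in zip(a, b))
```

Under this scheme, the unconnected socket could keep showing a single value field and only widen to a vector field once the other input resolves to a vector.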
Jan 16 2020
I requested this some time ago on twitter, and I am very happy that you worked on it Pablo. I think this is a great improvement. However, previous behaviour is desirable in some cases, and should be kept of course.
Jan 14 2020
Dec 10 2019
Dec 3 2019
The technicalities may fly over my head, but the importance of being able to deeply modify a linked character while it's being animated in another file (add/remove/rename objects) is not lost on me - not being able to do this would be quite the limitation. However, I don't understand why there has to be something else on the UI side? Shouldn't this all be mostly transparent? How are these override groups relevant to the user?
Dec 1 2019
Nov 30 2019
Maybe this should be a global property? I'm not sure how much added value there is in having it per-brush. In any case, great qol addition.
Nov 23 2019
Haven't tested, but from the changelog it looks like a most welcome improvement, especially in the absence of a gizmo - any plans to have one, or does it become superfluous with these changes?
Nov 8 2019
Just saw that commit, and I think it's great having more flexibility in regular parenting. This last possibility you mention (inheritance of individual scale values) sounds good as an alternative to setting up copy-scale constraints on a long bone chain, and average sounds like it might just be better as a default, imho - but I'm hesitant on that, tbh. Like you say, any rigger should know that decomposition, even that fancy polar one, attempted on sheared bones won't really work in any predictable/desirable way, which is why (afaik) non-uniform scaling on a (simply parented) chain is usually either prohibited by the rigger or not attempted by the animator. All in all this sounds like a good, thought-through change to me.
Oct 2 2019
@Brecht Van Lommel (brecht) Why not simply write them as XML instead of inside a blend file?
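For what it's worth, a minimal sketch of what that could look like with just the standard library (the element and attribute names here are made up for illustration, not an actual Blender preset format):

```python
import xml.etree.ElementTree as ET

def preset_to_xml(name, settings):
    """Serialize a preset as a standalone XML string (sketch).

    settings is a flat dict of setting name -> value; hierarchy and
    typing are left out to keep the example small.
    """
    root = ET.Element("preset", name=name)
    for key, value in settings.items():
        ET.SubElement(root, "setting", id=key, value=str(value))
    return ET.tostring(root, encoding="unicode")

# A hypothetical brush preset, readable and diff-able as plain text.
xml_text = preset_to_xml("SoftClay", {"strength": 0.5, "radius": 50})
```

Plain-text presets like this would at least be versionable and hand-editable, which a blend file isn't.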
Sep 30 2019
Good point. I know of a way certain programs address that: they allow splitting the list in two, one pane at the top and another at the bottom. Moving an item a long way is then just a matter of scrolling to the right place in the first pane, so to speak, and then dragging the item from the second pane into the first one.
Sep 25 2019
I have to say that falloff preview looks rather distracting... not sure it's necessary. One look at the top (or right) is enough to remind the user which falloff they've selected, if that's the intended use (?). Apart from that I like the changes. The strength direction being reversed will take some getting used to, but makes more sense. These little tweaks really obviously come from a sculptor!
I'm not fond of the animations, especially in the 'UI list' case - I think drawing a dotted line where the item is going to be inserted is more efficient and straightforward (as most painting programs do in their 'layers' list). Seeing items shift around disorientates me when I'm trying to precisely move a specific item to a location I'm targeting - when suddenly that visual target slides around, I'm not sure where to drop the thing anymore. Maybe that's just my brain? Quite a few interfaces do this (off the top of my head, Discord does it with its channel list, & last I checked the Android UI framework kind of enforces it too) and I find it quite disturbing. Animations do make more sense in a grid view (2D), though, because the order of items is not necessarily as obvious as in a list (1D).
Other than that, I'm obviously in full agreement with the proposal.
Sep 20 2019
Worth noting that currently bones can be sorted neither alphabetically nor by hand. It would be super cool to have them benefit from all these enhancements, too!
Sep 4 2019
Sep 3 2019
@William Reynish (billreynish) Oh, so it's actually an additional mode? I thought it would replace Lookdev. OK, so I understand better now: Lookdev becomes a way to preview your assets using the scene engine (supposedly the one that'll be used for the final renders), and Material Preview is going to be what Lookdev was until now. Am I correct?
For what it's worth, here is my use case: I have a scene set to render with Cycles, of which I render out previews using lookdev mode, with 'scene lights' and 'scene world' enabled, because that's what I need in a preview (not just previewing assets, but entire animated shots with lighting and backdrop). Why not simply use Eevee as the scene renderer then? Good question! Simply because I do full renders once in a while to check on the final look and lighting. Ideally (and I have read all of the above and understood at least 40% of it), I could still do that after this change.