
Everything Nodes UX
Closed, ResolvedPublicDESIGN

Authored By
William Reynish (billreynish)
Jul 16 2019, 11:26 PM
"Love" token, awarded by Mylo."Burninate" token, awarded by zickkie."Love" token, awarded by MarcoDiVita."Burninate" token, awarded by AnityEx."Love" token, awarded by Dragon.Studio."Love" token, awarded by russ1642."Love" token, awarded by Ionarn."Like" token, awarded by IPv6."Love" token, awarded by df."Love" token, awarded by proulxpl."Like" token, awarded by bestelix."Love" token, awarded by vitorbalbio."Love" token, awarded by Muritaka."Love" token, awarded by HoloTheDrunk."Like" token, awarded by Hto-Ya."The World Burns" token, awarded by vfxfan."Like" token, awarded by Simto1."Love" token, awarded by silex."Love" token, awarded by johnsyed."Like" token, awarded by Kdaf."Love" token, awarded by Arkhangels."Love" token, awarded by baswein2."Love" token, awarded by Dir-Surya."Love" token, awarded by brezdo."Love" token, awarded by DiogoX2."Pirate Logo" token, awarded by shader."Love" token, awarded by Ryxx."Love" token, awarded by kenziemac130."Love" token, awarded by BSannholm."Love" token, awarded by Cedch."Like" token, awarded by Constantina32."Love" token, awarded by mistajuliax."Love" token, awarded by ofuscado."Love" token, awarded by MetinSeven."Love" token, awarded by jfmatheu."Love" token, awarded by lopoIsaac."Love" token, awarded by ugosantana."Love" token, awarded by iWaraxe."Love" token, awarded by Kronk."Love" token, awarded by roman13."Burninate" token, awarded by Shimoon."Love" token, awarded by bnzs."Love" token, awarded by momotron2000."Love" token, awarded by 0o00o0oo."The World Burns" token, awarded by DaPaulus."Love" token, awarded by CandleComet."Like" token, awarded by amonpaike."Love" token, awarded by samytichadou."Love" token, awarded by SecuoyaEx."Love" token, awarded by hadrien."Like" token, awarded by julperado."Love" token, awarded by lcs_cavalheiro."Love" token, awarded by dark999."Love" token, awarded by brilliant_ape."Orange Medal" token, awarded by Design-Circle."Orange Medal" token, awarded by Zino."Love" token, awarded by 
razgriz286."Love" token, awarded by xrg."Mountain of Wealth" token, awarded by duarteframos."Love" token, awarded by Stuntkoala.


The forthcoming ‘Everything Nodes’ project is bound to have large implications for how we use Blender to do many common tasks. The purpose of this document is to add more specificity to this area and to serve as a starting point for further design work.

We already have a series of technical design docs by @Jacques Lucke (JacquesLucke).

This design document is meant as a complement, focusing instead on the end-user experience and how the various parts fit together.


The overall aim of the Everything Nodes system is twofold: to add tremendous amounts of new low-level power and flexibility, and to add high-level simplicity and ease of use.

Nodes will allow Blender to become fully procedural, meaning that artists will be able to build scenes that are orders of magnitude more complex and advanced in far less time, with full non-linear control. The object-oriented nature of node systems means that users can use, re-use and share node systems and node groups without having to start from scratch.

We should aim to make this system a foundational, integrated one, which will be core to many Blender features and workflows, rather than something tacked on at the side with limited scope and integration.

We should also aim to fully replace the previous systems, such as the old modifiers and particles systems. Otherwise we end up with multiple competing systems which will be hard to maintain for developers and confusing for users.

Additionally, nodes, assets and properties should work together to provide an integrated, holistic system.

Node Systems

Currently, we use nodes for materials, compositing and textures (deprecated).

In addition to these, the Everything Nodes concept would add nodes for many more areas:

  • Modifiers & procedural modeling
  • Particles & hair
  • Physics & Simulations
  • Constraints & kinematics

Open Questions:

  • ? How do we define the borders between these systems?
  • ? Modifiers, particles, hair and materials can all be attached to objects, but by definition this is not the case for constraints. These need to operate on a higher level. How and where does this work exactly?
  • ? How can you communicate from one node tree type to another? You might want to drive a modifier and a material property with the same texture, for example.
  • ? Particles currently sort of integrate with the modifier stack. If these things are in separate node trees, how does this work exactly? Are particles a separate object type which then references emitters?


Node-based modifiers may seem like a curiosity, until you realize how powerful they can be. The old modifier stack works well enough for simple cases where a few modifiers are strung together, but as soon as you want to do more complex generative procedural modeling, the limitations of the modifier stack become apparent.

Node-based modifiers will allow:

  • Much more powerful use of textures to drive any parameter
  • Can do more flexible trees, rather than just a stack (this is needed for generative modeling)
  • Much more powerful procedural animations can be created (see the Animation Nodes addon for example)
  • etc

Open Questions:

  • ? Currently, the modifier stack allows users to toggle modifiers for the viewport and the render result separately. Nodes don't have this feature; it could be added, but how? Via separate node tree outputs, or bypass toggles on each node?
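To make the "bypass toggles on each node" option concrete, here is a minimal sketch of per-node viewport/render visibility, mirroring the old modifier stack's two checkboxes. All names here (Node, evaluate_chain, the toy integer "geometry") are invented for illustration and are not Blender API.

```python
# Hypothetical sketch: per-node visibility toggles on a node chain.
# A node disabled for the current output mode is simply bypassed.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Node:
    name: str
    apply: Callable[[int], int]   # toy "geometry" operation: int -> int
    show_viewport: bool = True
    show_render: bool = True

def evaluate_chain(nodes, geometry, mode="viewport"):
    """Run the chain, skipping nodes disabled for this output mode."""
    flag = "show_viewport" if mode == "viewport" else "show_render"
    for node in nodes:
        if getattr(node, flag):
            geometry = node.apply(geometry)
    return geometry

chain = [
    Node("Subdivide", lambda g: g * 4),
    # Decimate only in the viewport, full density at render time:
    Node("Decimate", lambda g: g // 2, show_render=False),
]

viewport_result = evaluate_chain(chain, 10, mode="viewport")  # 10*4//2 = 20
render_result = evaluate_chain(chain, 10, mode="render")      # 10*4 = 40
```

The alternative (separate node tree outputs for viewport and render) would move this choice from a per-node flag to the tree's topology instead.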

Parametric Modeling

Currently in Blender, as soon as you create a primitive, the settings are immediately baked and cannot later be changed. Node-based modifiers have the potential to finally address this issue. Here’s how:

  • When the user adds any primitive (e.g. UV Sphere, Cone, Cylinder), they see the usual operator controls for adjusting the settings and values
  • However, rather than baking those settings into a static mesh, the controls simply modify the settings inside the modifier node tree
  • These settings can be changed and modified at any time, and you can build modifiers on top of this node, so you can use these as inputs for boolean operations, for example, and still change the number of segments at any time.
  • If the user wishes to ‘freeze’ the node tree, they can do so by running a ‘Freeze Nodes’ operator, or by going to Edit Mode, which will automatically prompt the user to freeze the mesh.
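The live-until-frozen behavior described above can be sketched in a few lines. This is purely illustrative (the class name, the vertex-count formula, and the freeze method are all assumptions, not Blender's actual design): the primitive's settings stay editable and re-evaluate on demand until a freeze bakes the current result.

```python
# Hypothetical sketch of a parametric primitive that stays live
# until the user freezes it (e.g. on entering Edit Mode).
from dataclasses import dataclass

@dataclass
class UVSpherePrimitive:
    segments: int = 32
    rings: int = 16
    frozen: bool = False
    _baked_vert_count: int = 0

    def vertex_count(self):
        if self.frozen:
            return self._baked_vert_count
        # Re-derived on every evaluation while the node is live:
        # a grid of (rings - 1) loops plus the two poles.
        return self.segments * (self.rings - 1) + 2

    def freeze(self):
        """Bake current settings; later edits no longer re-evaluate."""
        self._baked_vert_count = self.vertex_count()
        self.frozen = True

sphere = UVSpherePrimitive(segments=8, rings=4)
sphere.segments = 16   # still parametric: the count updates
sphere.freeze()        # triggered by 'Freeze Nodes' or Edit Mode
sphere.segments = 64   # no effect after freezing
```

The key point is that nothing is baked until the freeze: any downstream node (a boolean, say) always sees the current evaluation of the primitive.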

Properties Editor & high level control

Nodes allow for far more complex control and power. But how can we package this power in a way that stays simple and easy to use?

For materials, we already mirror the node tree inside the Properties editor, but in my estimation this works poorly in anything other than the very simplest of cases. We can do better.

As it turns out, we have actually already solved this: we already include a beautifully simple and powerful method of packaging low-level complexity in a higher-level interface with Group nodes. This system allows users to hide away lots of complexity and only expose a few useful parameters. This general concept can be expanded upon by making it so entire node trees, not just Group nodes, can expose a small subset of parameters in the Properties Editor.
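The "expose a subset" idea can be sketched like this. The class, method names, and the parameter set are invented stand-ins (not Blender API): the tree holds its full low-level parameter set, and the Properties editor only renders the few inputs the tree author chose to publish.

```python
# Illustrative sketch: a node tree with many internal parameters
# publishes only a chosen few to the Properties editor.
class NodeTree:
    def __init__(self, parameters):
        self._parameters = dict(parameters)  # full low-level parameter set
        self._exposed = []                   # names shown in Properties

    def expose(self, name):
        """Mark one internal parameter as a high-level input."""
        if name in self._parameters:
            self._exposed.append(name)

    def properties_panel(self):
        """What a casual user sees: only the exposed inputs."""
        return {k: self._parameters[k] for k in self._exposed}

rusty_metal = NodeTree({
    "rustiness": 0.5, "cavity_dirt": 0.2, "metal_type": "steel",
    "noise_scale": 4.0, "bump_strength": 0.1,  # internal detail, hidden
})
rusty_metal.expose("rustiness")
rusty_metal.expose("metal_type")
```

A casual user tweaks only `rustiness` and `metal_type`; the noise and bump internals stay out of sight unless they open the node editor.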

The nice thing about this solution is that casual users don’t need to actually open up the node editors, but can just tweak the high-level exposed inputs.

The node tree defines a series of high-level inputs.

These will be exposed in the Properties Editor, like so:

Material nodes, with exposed parameters in the Properties Editor:

Modifier nodes, with exposed parameters in the Properties Editor:

Particle nodes, with exposed parameters in the Properties Editor:

This approach makes even more sense if we provide a series of starting points. This is where assets come in:

Open question: ? Would we then remove the old nodes-in-properties system from the Material Properties?
I think yes, as the above system is just cleaner and scales much better, although in theory we could keep both views inside Material Properties


Assets can play an important role in the Everything Nodes system. With node systems exposing a smaller subset of parameters in the Properties, it makes a lot more sense to supply users with many more starting points. The idea being that casual users won’t have to dive into the nodes - they can just add materials, particles, modifiers etc and adjust the exposed parameters. Users will only have to delve into the node systems if they wish to deviate from what the available assets allow for.

For more on assets, see T54642: Asset Project: User Interface

Workflow examples:
  1. The user opens the Asset Browser, and navigates to the screw asset. The user drags this asset into the scene. The screw itself is being generated with a node system, and has a small set of high-level user-facing controls exposed in the Properties (head type, length, etc)
  2. The user may want to have a bent screw, so they open up the nodes and add a Bend deformer node at the end of the node tree
  3. The user browses the Asset Browser and locates the Rusty Metal material. The user drags this over the screw to apply it. This material has a few parameters exposed (rustiness, cavity dirt, metal type, etc)

Particle assets:

Mesh/modifier assets:


While a node-based workflow allows for a fully procedural workflow, it’s also more technical and disconnected from directly manipulating items in the 3d view. We can address this by letting nodes spawn interactive gizmos in the viewport.

We can implement this by adding a series of built-in special 'gizmo nodes' which can be used as inputs for the node tree. Examples of gizmo nodes are the Location Gizmo, Direction Gizmo and others. A toggle on these nodes can show or hide these gizmos in the viewport.

Node Editor

With a higher reliance on nodes, it makes sense to make a few key improvements to the node editors themselves, such as the following:

Compact mode

Node trees can easily become quite messy, so a toggle to work in a clean and compact node layout can make things more tidy:

In this mode, we can also make re-ordering nodes much easier, by simply allowing for dragging the nodes, which will automatically re-order them and re-connect them up, like so:
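The drag-to-reorder behavior amounts to moving one element in a list and relinking the chain linearly. A minimal sketch, with invented names and plain strings standing in for nodes:

```python
# Illustrative sketch of compact-mode reordering: moving a node
# re-links the whole chain so connections follow the new order.
def reorder(nodes, from_index, to_index):
    """Move one node and return (new order, relinked connections)."""
    nodes = list(nodes)
    node = nodes.pop(from_index)
    nodes.insert(to_index, node)
    # In compact mode the tree is a simple chain, so links are
    # just consecutive pairs in the new order.
    links = [(nodes[i], nodes[i + 1]) for i in range(len(nodes) - 1)]
    return nodes, links

chain = ["Bevel", "Array", "Mirror"]
new_chain, new_links = reorder(chain, 0, 2)
# new_chain: ["Array", "Mirror", "Bevel"]
# new_links: [("Array", "Mirror"), ("Mirror", "Bevel")]
```

This only works because compact mode restricts the tree to a stack-like chain; a general tree would need a more involved relinking rule.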

Connect pop-up

Currently it can be quite a maze to figure out which nodes fit with each other, and you have to dive into long menus to find the relevant nodes. We can make this much simpler, by making it so dragging from an input spawns a searchable popup with only the relevant node-types. This way you don't have to guess and search for which types of nodes fit the current input socket - they will be right there:
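The filtering behind such a popup is straightforward: match the dragged socket's type against each node type's output sockets. The registry contents below are invented for illustration and do not reflect Blender's real node definitions.

```python
# Illustrative sketch: filter a node registry down to the node
# types whose outputs can plug into the dragged input socket.
NODE_REGISTRY = {
    "Image Texture": {"outputs": ["Color", "Alpha"]},
    "Noise Texture": {"outputs": ["Color", "Fac"]},
    "Math":          {"outputs": ["Value"]},
    "Combine XYZ":   {"outputs": ["Vector"]},
}

def compatible_nodes(input_socket_type):
    """Only the node types that could connect to this socket."""
    return sorted(
        name for name, spec in NODE_REGISTRY.items()
        if input_socket_type in spec["outputs"]
    )

# Dragging from a Color input would offer just the texture nodes:
compatible_nodes("Color")   # ["Image Texture", "Noise Texture"]
```

A real implementation would also account for implicit conversions (e.g. a Value output feeding a Color input), which would widen each match set.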


The nodes system in Blender can make Blender vastly more powerful and flexible, while also making Blender much easier to use, if we combine nodes with the assets system, high-level controls in the Properties, and viewport gizmos, and introduce a few key improvements to the node editors themselves.

  • We can add high level controls inside the Properties Editor, using a system similar to Group Nodes
  • This works best if we have a built-in assets system so users don't have to build all this from scratch
  • This in turn means that many users won't even need to mess around with nodes in simple cases, even though they are using nodes indirectly by simply adjusting the exposed values inside the Properties editor.
  • Gizmos can add visual interactive controls in the viewport, to more directly control nodes
  • A few key improvements to the node editors can go a long way to make using nodes easier

Event Timeline


I hope that the modeling of meshes themselves will not change much in the regular editor and will remain as it is now, but will simply be written to the node system in a separate editor?
Houdini, in my opinion, has one drawback: it is not convenient to work with UVs there. If working with UVs does not change radically, it will be very convenient when everything is on nodes.

William Reynish (billreynish) changed the status of subtask T74967: Geometry Nodes Design from Needs Triage to Confirmed.Mar 20 2020, 12:06 PM

@William Reynish (billreynish)

Just an addendum to the "Stack Based" node design:

Nodes could be better visualized in a custom UI "stack" by using a UI approach similar to Houdini's "Digital Assets" (HDAs).
This approach generates a "higher-level" UI, which is useful for artists when importing their procedural ("digital") assets into game engines to design levels. Artists generally only need a few values to tweak at a time, and giving them control over what values are displayed by default would be a better approach (imo) than "stacks" as we are currently seeing them.

Check it out:

The user interface can be composed of any UI element the programmer/artist wants (that Blender supports), leading to better usability, greater reusability of the node programming, and an easier-to-understand UI for the node-based "tool" / "modifier" overall. We only need to think of the modifier in terms of being a reusable "tool" instead of a "pre-programmed 'time-saver' feature that still fits in with the old paradigm", as we are currently looking at it now.

This approach can also fit in well with the Gizmos / Exposed Parameter setup we've already got in the design, by simply offering those "Gizmos" or "Parameters" as options to include, along with whichever UI element (i.e. button/slider/label/gizmo) would best suit them in the UI the artist desires.

To summarize:

This approach would keep Blender simple and straightforward to use for new users, while also improving the UI even for existing modifiers. Many controls on some modifiers are simply not relevant to certain artists or projects, so providing the option to hide them, potentially by default (thereby "simplifying" the whole interface), would be a step up for users who prefer the "old" modifier stack to the "new" node-based approach to modifiers. They could essentially have the equivalent of a "modifier preset" using a system like this.


Not sure if this is within scope or currently being considered, but one thing in Blender that isn't currently modifier-based but would very much benefit from a modifier-based workflow is bezier curve geometry generation.

These currently belong to object-data-level datablocks, but having them as separate non-destructive operations on top of the object data would be preferable.
Exposing the beveling, extruding, tapering and offsetting of bezier curves as nodes/modifiers would definitely be powerful, if not a must.

This curve/line-based modifier approach to procedural modeling is the thing people use in Houdini the most.

I would agree that it is a MUST in the Blender Node-based modeling workflow.

Regarding performance for Python nodes, could numba help with that? It can JIT-compile Python code and provides a great boost in speed similar to Cython, while being able to run on multiple cores. It can also compile ahead of time, which could be useful with nodes that have static inputs/outputs. It also supports GPU computation with CUDA, but not OpenCL. Some Python developers discussed the pros/cons of using Cython/Numba/C++ for speeding up low-level code, and it seems that Numba made a good case for its adoptability. I personally used it on small projects and the speed-up was great for little effort. However, this would introduce new dependencies (it requires numpy and relies on llvmlite for JIT compilation; total size is around 30MB). I am quite new here and I have no idea if there are strict guidelines/requirements in that regard. Anyway, it could be interesting to look into if we want to support Python nodes without a significant loss of performance.

I think some people are flocking here to give feedback regarding features instead of UX because the main task T68980 Everything Nodes was closed as a duplicate of T73324 Particles Nodes, even though Particles are only a fraction of the Everything Nodes project (?).

One thing I saw in the documentation is that the functions are being abstracted as much as possible, which is great for increased control over all aspects of a mesh/particle/simulation/material/etc. But I can see this clearly becoming too confusing and overly complex, just as Animation Nodes is currently, for a non-programmer or someone not acquainted with some of the specific terms created for Everything Nodes (such as primitives/functions/attributes).
My suggestion is to keep, just as we currently have inside Blender, a set of standard modifiers, which would in fact be "node groups" of the functions required to do the desired operation, that the user can select (for example a Move or Extrude modifier), instead of having to manually insert all of the required nodes for the operation (input/convert/group/operate/output). (Of course, the user could still access the modifier group to create more inputs/outputs, or "ungroup" it to expose all nodes for further rearrangement/usage.)

Also, I think it is imperative that every operation in Blender have a modifier, a.k.a. "group node", equivalent, including regular Move/Rotate/Scale operations, as well as edit-mode ones such as Unwrap, Join, Split, Extrude, Mark/Clear Seam/Sharp (maybe by angle or feature), Bevel, Shear, Slide, Push/Pull, Connect, Create/Delete Vertex Group (as well as Assign/Remove from vertex group), Create/Delete Vertex Color, Create/Delete UV Maps, etc.
I can also see the usefulness of having the "By Angle/Weight/Vertex Group" controls be a separate node, instead of sitting inside each operation node (e.g. Bevel), so that you could do any of the above operations on only a set of "primitives" (vertices, edges, tris or quads) depending on the angle/weight/vertex group, etc. For example: only UV-unwrap a set of vertex groups that were defined (using an Assign to Vertex Group node) by the angles to adjacent primitives (using a "By Feature" node).

In the case of an Unwrap modifier (a group containing a Create/Assign UV Map node and an Unwrap node), knowing that the operation can be taxing on very complex objects if done after every change to the node tree, maybe the Unwrap node could have a special option to be updated manually, or only when the directly preceding nodes in the chain have changed.
Also, the options Project from View and Project from View (Bounds) could instead be replaced with Project from Camera and Project from Camera (Bounds), with an option to select which camera. This way, not only can you create and modify an object parametrically, you could also auto-seam the object (maybe based on features like angle), UV-unwrap it and assign procedural/image materials to it, all inside the node window.
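The "update only when upstream changes" idea from the comment above is essentially a cached node with a dirty check. A minimal sketch, with all names invented for illustration: the expensive computation reruns only when the upstream input actually differs from the last one seen.

```python
# Illustrative sketch: an expensive node (e.g. Unwrap) caches its
# result and recomputes only when its upstream input has changed.
class CachedNode:
    def __init__(self, compute):
        self.compute = compute
        self._last_input = None
        self._cached = None
        self.evaluations = 0   # for demonstration: count real recomputes

    def evaluate(self, upstream_value):
        if upstream_value != self._last_input:
            self._cached = self.compute(upstream_value)
            self._last_input = upstream_value
            self.evaluations += 1
        return self._cached

unwrap = CachedNode(lambda mesh: f"uv({mesh})")
unwrap.evaluate("cube")    # computes
unwrap.evaluate("cube")    # cache hit: nothing recomputed
unwrap.evaluate("cube2")   # upstream changed: recomputes
```

A manual-update toggle would simply add one more condition to the dirty check, so tweaks elsewhere in the tree never trigger the expensive node.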

For now, the Python API for "building" compositing nodes (or node groups, more precisely) is somewhat missing a Python-scriptable node like the Blender Game Engine had: one node, one linked Python file in the Text editor, for example. This Python node should be able to take at least an image input socket, and output an image socket.
For me, this is somewhat blocking an easy integration of the G'MIC Python/C++ binding, which provides more than 500 2D filters.
The Blender Custom Nodes stalled Github project by @bitsawer had introduced within a probably easily splittable C/C++ patch:

  • a Python-scriptable compositing node (with images input and output)
  • a G'MIC compositing node
  • a GLSL compositing node

If the Python-scriptable compositing node could be extracted from that patch, adapted and implemented, I could then more easily implement a pure-Python G'MIC compositing node with the G'MIC Python binding, which I have been working on full time for 8 months. Also, I am sure many people would be interested in writing numpy-based compositing node scripts.
This was thought over on this Github Issue on the now sleeping blender-custom-nodes project. :-)

I have a suggestion and I am not sure it should go here.

My suggestion is to have portal nodes to portal strings from one node tree to another. You would start with an output node, which you would name like a data-block. You could then plug as many strings as you like into this node on the left and name each string. Then you would go to another node tree and add an input portal. You would select from a drop-down which portal data-block to use. The portalled strings would then be on the right of this input portal.

This would be useful because it will allow for more custom node tree sizes and dividing as well as more repeat node options than just node groups.

Note: There would also need to be an error message if a tree becomes a cycle due to the portals and cannot be executed. Precisely: a node's output cannot be determined because its output cycles back around to its own input. Preferably, the node(s) with this error would be bordered with a certain color.
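Detecting such portal cycles is a standard graph problem: a depth-first search that flags any node whose output eventually feeds back into its own input. A minimal sketch, with the adjacency-dict representation invented for illustration:

```python
# Illustrative sketch: find every node that lies on a cycle, so the
# UI could border those nodes with an error color.
def find_cycle_nodes(graph):
    """graph: {node: [downstream nodes]}. Returns the nodes on cycles."""
    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / on current path / done
    color = {n: WHITE for n in graph}
    on_cycle = set()

    def visit(node, stack):
        color[node] = GRAY
        stack.append(node)
        for nxt in graph.get(node, []):
            if color[nxt] == GRAY:
                # Back edge: everything from nxt to here is a cycle.
                on_cycle.update(stack[stack.index(nxt):])
            elif color[nxt] == WHITE:
                visit(nxt, stack)
        stack.pop()
        color[node] = BLACK

    for n in graph:
        if color[n] == WHITE:
            visit(n, [])
    return on_cycle

graph = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
find_cycle_nodes(graph)   # {"A", "B", "C"}  (D merely feeds the cycle)
```

Running this check whenever a portal link is created or renamed would let the editor refuse (or visually mark) the offending nodes before evaluation is attempted.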

If I did not post this in the right spot, feel free to add/change this suggestion to the right spot.

Hans Goudey (HooglyBoogly) claimed this task.

I'll close this task now, since much of this has already been implemented, designed, or planned. One thing we don't have a task for yet is parametric primitives that add a geometry nodes modifier and prompt for applying when entering edit mode. I think that can be considered separately.