A big part of the Everything Nodes project is to convert the modifier stack to a much more powerful Geometry Nodes system. This document explains, from a user point of view, how this will work.
Geometry nodes encompass what modifiers used to be, but where modifiers only allowed you to modify geometry, geometry nodes also allow you to create geometry.
Moving to a node-based system will take some time, so we will probably need a way to smooth the transition and keep old files working for a while. One way to solve this is to add a Use Nodes toggle for modifiers, just like in other areas:
If disabled, the modifier stack will continue to work, while developers continue to improve and build the geometry nodes system. This approach would allow us to merge the geometry nodes system sooner, without having to worry as much about backwards compatibility. However, note that it's possible no new modifiers will be added to the stack.
How are nodes different?
In order to convert the modifier stack to a node-based system, we can't just do a straight 1:1 port of each modifier to a node. Nodes are different because they have inputs and outputs. For this reason, modifiers as nodes would have to become more atomic and generally simpler, letting users plug in whatever they need to get the desired effect.
Whenever a modifier has a property for limiting the result to a vertex group, or options to perform an operation along some direction or vector, those should become generic inputs instead. That way you can plug in whatever you want and take full advantage of the power of nodes.
Take this example for the Displace modifier:
The Direction and Vertex Group properties should simply become inputs, and the Texture property becomes unnecessary, because with nodes you can use textures to drive anything.
As a node, Displace is a lot simpler:
An example where a UV Sphere is generated and then displaced by noise along its normals could look like this:
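The data flow of such a setup can be sketched in plain Python. This is purely illustrative: `displace`, `noise`, and the way the texture is sampled are assumptions for the sake of the sketch, not a real Blender API.

```python
import math

def displace(positions, normals, strength, texture):
    """Move each vertex along its normal, scaled by strength and a texture sample."""
    result = []
    for pos, nrm in zip(positions, normals):
        s = strength * texture(pos)  # the texture drives the per-vertex strength
        result.append(tuple(p + n * s for p, n in zip(pos, nrm)))
    return result

def noise(pos):
    """A cheap procedural stand-in for a noise texture: any scalar
    function of position in the range [0, 1] works here."""
    x, y, z = pos
    return 0.5 + 0.5 * math.sin(12.9898 * x + 78.233 * y + 37.719 * z)

# On a unit sphere the position doubles as the normal,
# so the displacement is purely radial.
sphere_points = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
displaced = displace(sphere_points, sphere_points, 0.2, noise)
```

Because the texture is just an input, swapping noise for any other scalar function requires no change to the Displace node itself.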
We can compare some more examples below:
For Boolean, currently you specify an object directly in the modifier:
As a node, this is not needed - you would simply have two inputs. You can then plug in anything you wish here:
Here is a little overview of a few example nodes:
Selections / Clusters
When modelling destructively with the Edit Mode tools, you always operate on the selected items (vertices, edges or faces). In non-destructive modelling, we also need this concept. We could call these Clusters. A Cluster is simply a set of vertices, edges or faces. How do you 'select' items? Various nodes can get, generate or manipulate clusters. Here are some examples:
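A minimal way to model a Cluster in Python (the names here are assumptions, not an actual API): a cluster is just a set of element indices of one type, so ordinary set algebra gives you combined, intersected, and inverted selections for free.

```python
# A Cluster is a set of element indices of a single type (here: faces).
top_faces = {0, 1, 2, 3}
selected = {2, 3, 4, 5}

union = top_faces | selected          # elements in either cluster
overlap = top_faces & selected        # elements in both clusters
inverted = set(range(8)) - selected   # everything else (8 faces total)
```

Nodes that "get, generate or manipulate" clusters would then just be producers and consumers of such sets.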
The concept of Clusters also allows us to make certain improvements to the modifiers. Often, the current modifiers have specific settings that apply some fixed operation to the newly generated geometry. See, for example, these options in the Bevel modifier:
These kinds of controls are quite inflexible and arbitrary, and different modifiers provide a different subset of fixed options. For nodes, it makes sense to generalize this, so that users can apply any operation to newly generated geometry. We can do this a number of ways, but one simple solution could be to automatically generate an attribute output for the newly generated geometry:
You could then use this output to chain together nodes in more powerful ways. Here only the result of the extruded geometry is bevelled:
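In rough pseudocode terms (a sketch of the idea, not real node internals), an Extrude node could return both the resulting mesh and a cluster of the faces it created, and a Bevel node could accept that cluster to limit its effect:

```python
def extrude(faces, selection, offset):
    """Return (all_faces, new_faces): the grown face list plus a cluster
    holding only the indices of the newly generated geometry."""
    all_faces = list(faces)
    new_faces = set()
    for idx in selection:
        new_faces.add(len(all_faces))
        all_faces.append(f"extruded({faces[idx]}, offset={offset})")
    return all_faces, new_faces

def bevel(faces, cluster, width):
    """Bevel only the faces listed in the cluster."""
    return [f"bevel({f}, width={width})" if i in cluster else f
            for i, f in enumerate(faces)]

faces = ["f0", "f1", "f2"]
faces, new = extrude(faces, {1}, offset=0.5)   # extrude face 1
faces = bevel(faces, new, width=0.1)           # bevel only the new geometry
```

The key point is that "newly generated geometry" becomes an ordinary cluster output, usable by any downstream node rather than by one hard-coded checkbox.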
Of course you can also use predefined user-created Clusters as inputs:
In a nodes system, you can use textures to drive any input, rather than relying on the arbitrary Texture fields that only a subset of modifiers in the current stack provides. This makes textures orders of magnitude more powerful. We can make textures work much like material textures. Here is a simple example of using a texture to drive the displace strength:
Vertex Groups & Attributes
Since we can get rid of all the fixed vertex group fields for the modifiers, we can instead make this part of the node tree, just like how we currently handle vertex colors for materials. Below is an example of using a vertex group to modulate the strength of a texture, which in turn controls the displacement Strength property:
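That chain, vertex group weight modulating a texture which drives the displacement Strength, is easy to sketch. The names below are illustrative assumptions, not a real API:

```python
def displace_strength(weight, texture_value, base_strength):
    """Vertex group weight modulates the texture sample,
    which in turn drives the final per-vertex strength."""
    return base_strength * texture_value * weight

# Per-vertex data: weights painted by the user, texture sampled per vertex.
weights = [0.0, 0.5, 1.0]
texture_samples = [0.8, 0.8, 0.8]

strengths = [displace_strength(w, t, 2.0)
             for w, t in zip(weights, texture_samples)]
# A weight of 0.0 masks the effect out entirely; 1.0 applies it in full.
```

This is exactly the pattern material node trees already use for vertex colors, applied to geometry.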
Many values for these nodes will be per element (vertex, edge, face), but some values are fixed for that node operation, and cannot vary per element. One such example is the Bevel Segments value. You can set it to any value, but it will be the same for all the affected elements during that operation.
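The distinction can be thought of as two kinds of sockets (hypothetical names for this sketch): per-element values vary across the geometry, while node-level values like Bevel's Segments are a single number evaluated once for the whole operation.

```python
def bevel(faces, widths, segments):
    """widths varies per face (a per-element socket); segments is one
    value for the entire operation (a node-level socket)."""
    assert isinstance(segments, int)   # one value, never a list
    assert len(widths) == len(faces)   # one value per element
    return [f"bevel({f}, w={w}, segs={segments})"
            for f, w in zip(faces, widths)]

out = bevel(["f0", "f1"], [0.1, 0.2], segments=3)
```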
It would be nice to still be able to drive these kinds of properties, and also to pipe them into group nodes for higher level control.
We can show these kinds of values differently, and because this will always be a single value, we can even tell the user what the value is:
The output is much like the Material Output for materials, but it outputs a mesh instead:
We can provide some extra features here, for making an output only for the viewport or the render. This could replace these toggles from the stack system:
We can also provide an output for visualizing certain textures or effects in the viewport.
It would also be useful to set any node to be the one that is previewed in the viewport:
Users will constantly want to preview certain sections of the node tree, and having to manually connect those things up to the output node will be a pain. An easier way to preview the output of any node would be extra useful for geometry nodes, but would also be a benefit for shader or compositing nodes.
Caching & Applying
Some node trees can become heavy or very large, and it's not always useful to recalculate the entire node tree every time a property is edited downstream. For this reason, we could provide a Cache node, which takes everything upstream and caches it so it doesn't need to be recalculated.
When cached, all nodes upstream will be greyed out:
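Functionally, a Cache node is just memoization of the upstream result. The sketch below captures that idea only; a real implementation would also track when upstream inputs change and invalidate accordingly.

```python
def make_cache_node(upstream):
    """Wrap an upstream evaluation function so it runs at most once;
    later evaluations reuse the stored result."""
    state = {"result": None, "evaluated": False, "calls": 0}

    def evaluate():
        if not state["evaluated"]:
            state["result"] = upstream()   # expensive work happens here once
            state["evaluated"] = True
        state["calls"] += 1
        return state["result"]

    return evaluate, state

# A stand-in for an expensive upstream subtree (e.g. heavy modifiers).
counter = {"runs": 0}
def heavy_subtree():
    counter["runs"] += 1
    return "mesh-data"

cached, state = make_cache_node(heavy_subtree)
cached(); cached(); cached()   # upstream runs only on the first call
```

Greying out the upstream nodes in the UI then simply reflects that editing them has no effect until the cache is cleared.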
When converting to nodes, we will most likely want to add many more nodes compared to the small list of modifiers today.
For example, we would want to add nodes to do all sorts of common modelling tasks. Here you see the Extrude and Bisect nodes, as an example:
Other examples are nodes for creating new geometry:
These kinds of nodes give a taste of how much more powerful and broader in scope the geometry nodes system is.
It's not always useful to have to edit node properties inside the node editor. We can re-use the same concept we already have for node groups, where we specify a series of node inputs which can be manipulated in the Properties Editor.
Here are a set of input nodes:
And here you see the controls reflected in the Properties Editor:
Adding new objects
An important example is one of the most basic tasks in Blender: adding new objects. Here you can see a newly added UV Sphere, which automatically gets a geometry node tree and a series of useful inputs:
These can be controlled by the user at any time. An Apply button can be added to easily freeze the geometry node tree, so that users can enter edit mode and perform destructive edits.
Open questions
- How do we handle the physics modifiers? These are different from the regular modifiers, since they need to be simulated. We could move those to the Simulation Nodes system instead of Geometry Nodes.
- Clusters can be a set of vertices, edges or faces. How do you define what type a Cluster is?
- How do we handle things other than meshes? We could allow the system to also output and use curves or NURBS objects.