Geometry Nodes: Point child distribution method #84816

Closed
opened 2021-01-18 12:08:10 +01:00 by Kenzie · 6 comments
Contributor

Intro

Scattering in nature has a reason behind it. There is always a reason something ended up in a certain spot and state, and in many cases that reason is the result of a chain reaction: one object ended up in a certain position or state because another object was in a certain position or state. In this sense natural scattering has a hierarchy to it.

For example: a bunch of leaves is likely scattered on the ground because a tree stands not too far away; around that tree you can expect to see exposed knots of roots; the tree has also likely provided shade to other plants that needed it; those plants in turn are good places to spawn butterflies and make good target points for their boid simulation; and so on.

To scatter each of these elements by simply rolling the dice on points across the ground would be to ignore the logical or artist-driven reasons behind their placement and create a very chaotic looking result. In the current system the user is expected to hand-paint areas in which the placement of these objects would likely make sense, taking time away from directing the scene at a higher level. A scattering method that can take this kind of cause and effect into account would therefore be extremely beneficial for creating believable natural environments quickly. Luckily, creating the key to such complex scattering in the geometry nodes system shouldn't be too difficult: at the core of all those reactions is the idea that, in this hierarchy, objects are likely found close to their parent. In this design document a point child scattering method is proposed.

Breakdown

The input could be any point(s) in 3D space, whether a hand-placed object, the vertices of a model, the result of a previous scatter, or data imported into the scene. In this example we will examine a single point.
image.png

For each input point we choose a number of points to place randomly around the provoking point as our output points.

We can easily do this by plotting a random spot on a disk of a defined radius in polar space:

 const float r = random_gen.get_float() * radius;
 const float theta = random_gen.get_float() * M_PI * 2.0f;
 const float h = 0.0f;

then convert it back to Euclidean space:

const float3 offset_position = float3(
   r * cosf(theta),
   r * sinf(theta),
   h
);

(Since we are working on a flat plane for now, we will skip orientation.) We then add the offset back to the provoking point's position to get a random spot around the original point:

return provoking_position + offset_position;
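Putting the steps above together, a self-contained sketch of the sampling routine might look like this. The `float3` struct and `std::mt19937` generator here are stand-ins for Blender's internal types, not the real API:

```cpp
#include <cassert>
#include <cmath>
#include <random>

// Minimal stand-in for Blender's float3 type.
struct float3 {
  float x, y, z;
};

static float3 operator+(const float3 &a, const float3 &b)
{
  return {a.x + b.x, a.y + b.y, a.z + b.z};
}

// Place one child point at a uniformly random angle and radius around
// the provoking point, on a flat (XY) plane.
static float3 scatter_child_point(const float3 &provoking_position,
                                  const float radius,
                                  std::mt19937 &rng)
{
  std::uniform_real_distribution<float> uniform(0.0f, 1.0f);
  const float r = uniform(rng) * radius;
  const float theta = uniform(rng) * 6.28318530f; /* 2 * pi */
  const float h = 0.0f;
  const float3 offset_position = {r * std::cos(theta), r * std::sin(theta), h};
  return provoking_position + offset_position;
}
```

Every generated point lands within `radius` of the provoking point, which is the property the scatter relies on.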

If we repeat this step for each point we want to create around every input point, we end up with something like this:
PointScatterFirst_500_0-5_0.png
(Note: this node only outputs the newly created points; the original input is displayed with a merge at the end of the network.)

But now we have encountered a problem: many of our points are too close to the original point, and the artist would likely not be satisfied with the results. Luckily the solution is easy: we just need to offset the possible radius an artist-defined amount from the center, which is a simple addition. Our random radius now becomes:

 const float r = random_gen.get_float() * outer_radius + inner_radius;

As a result we are able to define an inner radius for the scatter.
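The annulus version of the radius sampler can be sketched as a small function (hypothetical name, standard-library RNG as a stand-in). Note that this matches the simple addition above; it does not produce an area-uniform distribution over the annulus:

```cpp
#include <cassert>
#include <random>

// Sample a radius in the annulus [inner_radius, inner_radius + outer_radius],
// matching r = random * outer_radius + inner_radius from the text.
static float sample_annulus_radius(const float inner_radius,
                                   const float outer_radius,
                                   std::mt19937 &rng)
{
  std::uniform_real_distribution<float> uniform(0.0f, 1.0f);
  return uniform(rng) * outer_radius + inner_radius;
}
```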
PointScatterFirst_500_0-5_0-5.png
Note that the inner radius offset also extends the maximum distance at which points are scattered (the effective outer bound becomes inner + outer); whether this is intuitive could be a point of debate. You can look at the inner radius as defining the size of the object to scatter around; creating a torus often works in a similar manner. In order to see the next point more clearly, the settings of the scatter will be tweaked to something more reasonable.

PointScatterFirst_32_0-4_0-1.png

The scatter has now completed its task. While it doesn't seem like much, this is an extremely important key to creating natural hierarchical scatters. At this point the user is free to work with the attributes or delete points (such as ones that might intersect buildings and/or walkways); once simulation nodes come into play they could simulate these points (such as rocks falling or leaves blowing); or, as in our example, run the results into another point scatter to produce another level of scattering. This is where the nested, or hierarchical, scattering element comes into play. Notice the green dots are now scattered around our previously created blue dots, which are scattered around the red dot.

PointScatterSecondary_Distinct.png

Now let's look at how this can scale up to give the user control over a large number of input points. Here more than one input point is scattered using Poisson disk distribution. In this design the user can multiply the inner and outer radius using attributes (in this example I picked the radius attribute to control the outer radius and the scale attribute to control the inner, but ideally the user should be able to define their own). Here the inner and outer radius of the first scatter is being controlled.

PointScatter_Attribute_1_0.png
PointScatter_Attribute_0_1.png
The user can then drive the radii randomly, or by using their own math, painting, or textures to tune the results of the scatter.
PointScatter_Attribute_Random1.png
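The per-point attribute multiplication described above could be sketched as follows. The attribute names match the example (radius for outer, scale for inner); the types and function name are placeholders, not Blender API:

```cpp
#include <cassert>
#include <cmath>

// Effective per-point radii after attribute multiplication.
struct Radii {
  float inner, outer;
};

// The "scale" attribute multiplies the inner radius and the "radius"
// attribute multiplies the outer radius, as in the example above.
static Radii effective_radii(const float base_inner, const float base_outer,
                             const float scale_attr, const float radius_attr)
{
  return {base_inner * scale_attr, base_outer * radius_attr};
}
```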

Results

In this example a bunch of boulders is scattered randomly around a weight-painted beach. With the help of the point child distribution, around those boulders are rocks, and around those rocks are patches of gravel. Ignoring the placeholder art assets, this produces a rather believable looking scene.

PointScatter_Rocks.png

Compare this to the same scene without the point child approach. Here the rocks and gravel are distributed randomly using the same density attribute as the boulders. The resulting image feels more chaotic and no longer seems to have a natural order: the boulders no longer blend into the beach, and the eye gets lost in noise, struggling to pick a subject. Without point child this would require manual artist intervention to correct, and when the layout of the scene changes, so must their solution.

RandomScatter_Rocks.png

This forest example helps break down some of what is achievable with this kind of approach. Multiple assets can be populated based on a single point. The result can be taken further by layering more scattering systems and instances to create a fleshed-out forest environment.
PointScatter_Forest.png

Issues and workarounds

There are still some problems and subjects that would need to be addressed.

  1. So far in this design a fixed number of points is created for each input point. This helps performance, as the storage of the final result can easily be deduced and allocated up-front. However, it can create some unnatural looking results if assets are too easy to pick out. One approach might be to use the density attribute to drive the chance that each new point is created.
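As a sketch of that idea, the survival chance could be sampled per candidate point. The function name is hypothetical, and how the density value would be looked up per point is left open:

```cpp
#include <cassert>
#include <random>

// Decide whether a candidate child point is created, based on a per-point
// density value in [0, 1]: with density = 1 every candidate is kept,
// with density = 0 none are.
static bool keep_child_point(const float density, std::mt19937 &rng)
{
  std::uniform_real_distribution<float> uniform(0.0f, 1.0f);
  return uniform(rng) < density;
}
```

This trades the up-front allocation benefit for a less predictable point count, so the final array would need to be compacted after creation.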

  2. Unfortunately (whether due to an oversight or a misunderstanding of Blender's math) I have so far not been able to find a concrete solution for orienting the distribution disk along the surface. The disk must face the same direction as the point it is emitted from. PointScatterIssueSlopeAttachment.PNG
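One standard technique for this would be to build an orthonormal basis from the emitting point's normal and express the polar offset in that tangent plane instead of the XY plane. This is only a sketch of the general idea with stand-in types, not Blender's actual math API:

```cpp
#include <cassert>
#include <cmath>

struct float3 {
  float x, y, z;
};

static float3 cross(const float3 &a, const float3 &b)
{
  return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

static float3 normalize(const float3 &v)
{
  const float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
  return {v.x / len, v.y / len, v.z / len};
}

// Build two tangent vectors perpendicular to the normal, then place the
// polar-coordinate offset in the tangent plane so the disk faces the normal.
static float3 offset_on_disk(const float3 &normal, const float r, const float theta)
{
  // Pick a helper axis that is not parallel to the normal.
  const float3 helper = std::fabs(normal.z) < 0.9f ? float3{0.0f, 0.0f, 1.0f} :
                                                     float3{1.0f, 0.0f, 0.0f};
  const float3 tangent = normalize(cross(helper, normal));
  const float3 bitangent = cross(normal, tangent);
  const float u = r * std::cos(theta);
  const float v = r * std::sin(theta);
  return {u * tangent.x + v * bitangent.x,
          u * tangent.y + v * bitangent.y,
          u * tangent.z + v * bitangent.z};
}
```

For a normal of (0, 0, 1) this reduces to the flat-plane case above; the offset always stays perpendicular to the given normal.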

  3. Even if 2. is addressed, orienting the disk along the orientation of the incoming point isn't always enough (especially if this point was distributed far from the original on bumpy terrain); there has to be a way for the created points to be re-projected back onto the intended geometry. Considering that the original terrain geometry may not be accessible in the current point distribute node and may be found farther upstream in the node network, and that more work could be done (including culling) before re-projecting, it might be worth leaving this up to an external "shrink-wrap" node.

  4. An alternative solution/workaround might be to allow users to drive scatter density via a "point-density" attribute node; however, this would rely on the mesh tessellation of the geometry users wish to scatter on, which could definitely be an issue for some archvis, background scenery, or low-poly game asset use cases. This would likely be needed anyway and could complement this feature.

  5. The point distribute node is currently only designed to work on mesh components as inputs. It will likely have to accommodate things such as volumes and point clouds in the future anyway, so this shouldn't be too big a concern. However, the approach taken in the prototype patch is hacky and requires code duplication.

  6. Intersections aren't all fixed by the inner radius offset: the scatter is still not aware of neighboring points or points higher up the chain, which can lead to some obvious intersections. As with the terrain intersection issue, it seems a bit much to solve this inside the scatter node. One approach might be to provide a node that iteratively "relaxes" a point cloud until fewer (or none) of their radii intersect. A benefit of this approach is that it could be applied to other scattering approaches as well. PointScatterIssueSelfIntersection.PNG
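One relaxation pass of such a node could look roughly like the following 2D sketch (brute-force pair testing; names and types are placeholders). Repeating the pass a few times reduces, but does not guarantee removing, intersections:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Point2 {
  float x, y, radius;
};

// One relaxation pass: push apart every pair of points whose radii overlap,
// moving each point half of the overlap along the separating axis.
static void relax_once(std::vector<Point2> &points)
{
  for (size_t i = 0; i < points.size(); i++) {
    for (size_t j = i + 1; j < points.size(); j++) {
      const float dx = points[j].x - points[i].x;
      const float dy = points[j].y - points[i].y;
      const float dist = std::sqrt(dx * dx + dy * dy);
      const float min_dist = points[i].radius + points[j].radius;
      if (dist >= min_dist || dist == 0.0f) {
        continue;
      }
      const float push = 0.5f * (min_dist - dist) / dist;
      points[i].x -= dx * push;
      points[i].y -= dy * push;
      points[j].x += dx * push;
      points[j].y += dy * push;
    }
  }
}
```

A production version would want a spatial acceleration structure instead of the O(n²) pair loop.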

  7. (If there are any other things that might need addressing, please provide info.)

Prototype Patch

(This prototype was made against an earlier version and is in no shape to be merged in its current state.)
https://developer.blender.org/D10133

Author
Contributor

Added subscriber: @KenzieMac130


Added subscriber: @victorlouis


I think the use cases you describe are very important. There certainly should be a category of Point Distribute nodes that distribute new points around input points. Some thoughts about your current proposal:

  • I'm not sure whether the parent/child terminology should be used here. I feel like it is somewhat unnecessary because this parent-child relation doesn't persist. I would just view it as scattering points around other points.
  • You mention:

Considering the original terrain geometry could be not accessible in the current point distribute node and found farther upstream in the node network, with more work that could be done (including culling) before re-projecting it might be worth considering leaving it up to an external "shrink-wrap" node.

  • I think you raise a good point here that we need to think about in general. Scattered points "forget" which mesh they were scattered on, so scattering more points on the mesh around those points is not really possible. Scattering followed by projection could work in some cases.

Added subscriber: @DimitriBastos


I think this is a must have, especially for more natural environment set dressing.

Author
Contributor

Changed status from 'Needs Triage' to: 'Archived'
