Geometry Nodes: Point Distribute Node Methods #84330

Closed
opened 2021-01-02 18:09:51 +01:00 by Victor-Louis De Gusseme · 11 comments

The goal of this task is to summarize the design and implementation choices made for the Point Distribute Geometry Node. To clarify: I'm new to Blender development, so this post is not meant to be authoritative at all; it's just a place to collect ideas.

Use Cases


Two guiding use cases for the Geometry Nodes project can be seen on the [project page](https://developer.blender.org/project/view/121/).

  • Scattering Pebbles
  • Trees and Flowers Scattering

Both use cases highlight the importance of weight painting for artistic control over the distribution of points. For example, the weights in the image below govern the distribution of Large Pebbles.
![point_distribute_weight_paint.png](https://archive.blender.org/developer/F9547123/point_distribute_weight_paint.png)

Point Distribute Methods


This section covers the already implemented Point Distribute Methods. For the most recent refactor of the Point Distribute code see: https://developer.blender.org/D10104

Random Sampling

Description
This method does uniform random sampling for each triangle of the mesh. Uniform sampling has the effect of creating gaps and clusters, as can be seen in the image below. For some use cases this doesn't matter, but for others it might be undesirable. This is the motivation for the second method, Poisson Disk sampling.
![image.png](https://archive.blender.org/developer/F9547199/image.png)

Node inputs

  • Geometry: the mesh on which the points will be distributed
  • Density Max: a float that is multiplied with the Density Attribute
  • Density Attribute: refers to which "weight paint" map to use, these weights range from 0->1

Current Implementation

foreach (triangle in mesh) {
  // Expected point count: triangle area times the density, weighted by the
  // average vertex weight of the density attribute over the three corners.
  points_amount = round(triangle.area * density_max * sum(vertex_weights) / 3)
  for (int i = 0; i < points_amount; i++) {
    point = rng.get_random_point_on(triangle)
    points.add(point)  // collect the sample in the output point cloud
  }
}

What's nice about this algorithm is that it's fast. A small thing to note is that this is not uniform random sampling over the whole mesh. For example, if a triangle is large enough that triangle.area * density_max * sum(vertex_weights) / 3 > 1, it is guaranteed to contain at least one point. With true uniform random sampling over the entire mesh, there is always a chance that a triangle contains no points, no matter how large it is.
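To make the per-triangle loop concrete, here is a rough Python sketch of the same idea. It is an illustration only, not Blender's actual C++ code; the triangles and vertex_weights inputs are hypothetical stand-ins for Blender's mesh data, and the point-on-triangle sampling uses the standard barycentric reflection trick.

```python
import random

def random_point_on_triangle(rng, v0, v1, v2):
    # Uniform sampling via barycentric coordinates: reflect (r1, r2) back into
    # the lower triangle of the unit square so the point stays inside.
    r1, r2 = rng.random(), rng.random()
    if r1 + r2 > 1.0:
        r1, r2 = 1.0 - r1, 1.0 - r2
    return tuple(v0[k] + r1 * (v1[k] - v0[k]) + r2 * (v2[k] - v0[k]) for k in range(3))

def distribute_random(triangles, vertex_weights, density_max, seed=0):
    # triangles: list of (area, (i0, i1, i2), (v0, v1, v2)) tuples (assumed layout).
    rng = random.Random(seed)
    points = []
    for area, vert_indices, verts in triangles:
        weight = sum(vertex_weights[i] for i in vert_indices) / 3.0
        for _ in range(round(area * density_max * weight)):
            points.append(random_point_on_triangle(rng, *verts))
    return points
```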

Poisson Disk Sampling

Description

> A Poisson disk point set is a uniformly distributed set of points in which no two points are closer to each other than some minimum distance. ([Cline 2009](https://www.cg.tuwien.ac.at/research/publications/2009/cline-09-poisson/cline-09-poisson-paper.pdf))

Poisson Disk sampling is useful when you don't want points to be too close together, or you don't want large noticeable gaps in the sampling pattern. A Poisson Disk point set is often said to have "blue-noise characteristics". The image below shows the difference between uniform random sampling and Poisson disk sampling ([image source](http://www.cemyuksel.com/cyCodeBase/soln/poisson_disk_sampling.html)).
![random_vs_poisson.png](https://archive.blender.org/developer/F9547188/random_vs_poisson.png)

Node inputs

  • Geometry: the mesh on which the generated points are projected
  • Distance Min: the Poisson disk radius; no two points will be closer to each other than this distance
  • Density Max: a float that is multiplied with the Density Attribute
  • Density Attribute: refers to which "weight paint" map to use, these weights range from 0->1

Implementation
First, scatter points randomly on the mesh. Then loop over all points in a fixed order; for each point that has not already been eliminated, eliminate all of its neighbors within Distance Min. Neighbors are found with a range search in a KD-tree.
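As a rough illustration of this elimination pass (not the actual Blender implementation, which uses its own KD-tree in C++), a minimal Python version built on SciPy's cKDTree could look like this:

```python
import numpy as np
from scipy.spatial import cKDTree

def poisson_disk_eliminate(points, distance_min):
    # points: (N, 3) array of randomly scattered points, in a fixed order.
    points = np.asarray(points)
    tree = cKDTree(points)
    eliminated = np.zeros(len(points), dtype=bool)
    for i in range(len(points)):
        if eliminated[i]:
            continue  # already removed by an earlier surviving point
        # Keep point i and eliminate every other point within the Poisson radius.
        for j in tree.query_ball_point(points[i], distance_min):
            if j != i:
                eliminated[j] = True
    return points[~eliminated]
```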

Possible Future Distribute Methods


Feel free to leave suggestions or use cases.

Clustered Sampling

Description
Example:
![weighted_elimination_magenta.png](https://archive.blender.org/developer/F9583224/weighted_elimination_magenta.png)

Use cases

  • Patches of vegetation
  • Herds of animals

Node inputs

Implementation

Volume Sampling

Description
Use cases
Node inputs
Implementation

Variable Radius Poisson Disk Sampling

Description
Use cases

  • Lily pads on a lake

Node inputs
Implementation

Poisson Disk Sampling with Geodesic Distance

Description
Use cases
Node inputs
Implementation

Randomly scatter objects without intersecting other objects

Note: this doesn't really fit the Point Distribute node, but I've included it because it's related

Description
Use cases
Node inputs
Implementation

Author
Member

Added subscriber: @victorlouis


Added subscribers: @HooglyBoogly, @grosgood


@victorlouis
First off, thumbs up, plus one, love token, whatever, and running with this. Your enthusiasm is wonderful to behold.
Second off, don't lose sight that the distribution has to work in an animation without popping or temporal disruption. They may be 'point clouds' in a flat, one-frame sense, but they are really noodles of spaghetti running through time - and look like a model of laminar flow.
I built your [D9949](https://archive.blender.org/developer/D9949) patch last night (ummm... early this morning) and compared its behavior, side-by-side, with the random distribution method - both on stretching and contracting geometry, typical in animation.
[compare.mp4](https://archive.blender.org/developer/F9547672/compare.mp4)
The random distribution on the right is steadier, and benefits from stabilization work @HooglyBoogly did a little while ago ([D9832](https://archive.blender.org/developer/D9832)). Particles pop in and out as contraction and stretching take place - and that grates - but, on the whole, a particle stays put for the duration of its time on the surface. On the other hand, there is a lot of jitter in your implementation thus far - and yes, I know it is far from done. I wanted to wave the temporal flag early in your game so that it is in play. As Sherlock Holmes oft-times said to Dr. Watson: "Watson. This seems to be a three-pipe problem." So 'tis here. Have fun with it!

Contributor

Added subscriber: @KenzieMac130

Contributor

> In #84330#1085715, @grosgood wrote:
> @victorlouis
> First off, thumbs up, plus one, love token, whatever, and running with this. Your enthusiasm is wonderful to behold.
> Second off, don't lose sight that the distribution has to work in an animation without popping or temporal disruption. They may be 'point clouds' in a flat, one-frame sense, but they are really noodles of spaghetti running through time - and look like a model of laminar flow.
> I built your D9949 patch last night (ummm... early this morning) and compared its behavior, side-by-side, with the random distribution method - both on stretching and contracting geometry, typical in animation.
> compare.mp4
> The random distribution on the right is steadier, and benefits from stabilization work @HooglyBoogly did a little while ago (D9832). Particles pop in and out as contraction and stretching take place - and that grates - but, on the whole, a particle stays put for the duration of its time on the surface. On the other hand, there is a lot of jitter in your implementation thus far - and yes, I know it is far from done. I wanted to wave the temporal flag early in your game so that it is in play. As Sherlock Holmes oft-times said to Dr. Watson: "Watson. This seems to be a three-pipe problem." So 'tis here. Have fun with it!

I feel like animation deformation should work on top of the generated point cloud rather than having to make sure every algorithm adopted is coherent over temporally deforming geometry. In a good distribution, points will inevitably have to be repacked when the surface area changes. It makes much more sense to process it once and deform the results.

Imagine this is someone's arm and the points are hair: it doesn't make sense to deterministically regrow the hair from scratch every frame while the arm bends. It would be better to generate the hair in base pose, deform the arm and the roots of the hair, and apply any simulations on top of those changes.

Author
Member

@grosgood I've thought about stability, and for the Poisson Disk method it seems really hard to achieve because the points influence each other so much. For the Random method there's no dependence between points. Also, points are sampled per triangle independently, so deforming the mesh doesn't matter as much: you can just add or remove points in triangles whose area changes.
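As a small sketch of why that per-triangle independence helps (hypothetical code, not what the node currently does): if every triangle draws its samples from its own deterministic RNG stream, then changing a triangle's point count only appends or drops samples at the end of that triangle's sequence, and all previously generated points keep their barycentric coordinates.

```python
import random

def stable_samples_for_triangle(triangle_index, count, global_seed=0):
    # One RNG stream per triangle; the sequence is prefix-stable, so increasing
    # `count` adds new samples without moving the existing ones.
    rng = random.Random(global_seed * 2654435761 + triangle_index)
    samples = []
    for _ in range(count):
        r1, r2 = rng.random(), rng.random()
        if r1 + r2 > 1.0:
            r1, r2 = 1.0 - r1, 1.0 - r2
        samples.append((r1, r2))  # barycentric coords, independent of deformation
    return samples
```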

Yuksel's algorithm however does have an interesting and relevant property that allows for progressive sampling.

We achieve this by ordering the samples in the final set, such that when the samples are introduced in this order, each subset in the sequence exhibits blue noise characteristics.

![prog.png](https://archive.blender.org/developer/F9548012/prog.png)
I'm not entirely sure whether we could use this property to increase stability under mesh deformation though.


Added subscriber: @brecht


For stability under deformation, there are a few distinct use cases, and they have different solutions.

One is how to make the distribution more stable as the mesh coordinates and even topology change, for when the user is editing the mesh and doesn't want the results to completely change on every minor edit. Some distribution algorithm changes are possible to accommodate this, though in the end the quality of the distribution is more important and perfect stability is simply impossible.

The second is where you are deforming the source mesh in an animation and want a stable distribution, for hair or instances that are attached to the mesh. In this case you want the distribution to be identical, and ideally only generated once for performance. No distribution algorithm changes will solve this, it's a matter of always using the same undeformed mesh for distribution.

A third use case would be where you want to do some kind of level of detail, frustum culling, or other incremental generation based on the camera. It's possible to have a distribution algorithm that generates locally stable results, where mesh coordinates or topology beyond a certain distance have no influence.

A fourth is where you want to dynamically add more points. For example, to continuously emit particles, a random distribution is usually fine. A progressive Poisson distribution would make sense if you want to e.g. grow plants over time.

I think the use case being referred to by @grosgood is the second one. For that problem we really need a dedicated task, because it goes quite deep. The current particle system handles this using a data layer with the original undeformed coordinates, using it either to generate the distribution or to map an existing (manually edited) distribution from undeformed to deformed coordinates. A similar system is needed for geometry nodes, though the best UI for that is not obvious.
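A minimal sketch of that "distribute once, then follow the deformation" idea, assuming each point stores the triangle index and barycentric coordinates it was given on the undeformed mesh; this only illustrates the concept and is not the particle system's or geometry nodes' actual data layout.

```python
import numpy as np

def evaluate_sticky_points(deformed_verts, triangles, sticky):
    # `sticky` holds, for each point, the triangle index and barycentric
    # coordinates assigned on the *undeformed* mesh at distribution time.
    # Re-evaluating them against the deformed vertex positions keeps every
    # point attached to the same spot on the surface throughout the animation.
    positions = []
    for tri_index, (b0, b1, b2) in sticky:
        i0, i1, i2 = triangles[tri_index]
        positions.append(b0 * deformed_verts[i0]
                         + b1 * deformed_verts[i1]
                         + b2 * deformed_verts[i2])
    return np.array(positions)
```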

Victor-Louis De Gusseme changed title from Geometry Nodes: Point Distribute Node Design & Implementation to Geometry Nodes: Point Distribute Node Methods 2021-01-17 15:50:30 +01:00
Contributor

Distribute method design proposal for point child. https://developer.blender.org/T84816

Author
Member

Changed status from 'Needs Triage' to: 'Archived'

Author
Member

Archiving this because it is not actively being investigated.
