
Geometry Nodes: Point Distribute Node - Density Max value
Confirmed, Normal, Public

Description

In Random mode, the name of the Density Max socket suggests that it limits the maximum density.

However, because the Density Attribute is multiplied with it, the effective density can exceed Density Max. There seems to be an assumption that the Density Attribute is a scalar value from a weight map (i.e. in the 0-1 range), in which case the cap would hold.

Either this is a bug or it is by design.

If it is a bug, then here is a suggested fix in the sample_mesh_surface function in source/blender/nodes/geometry/nodes/node_geo_point_distribute.cc:

const float v0_density_factor = std::max(0.0f, (*density_factors)[v0_index]);

change to

const float v0_density_factor = std::max(0.0f, std::min(base_density, (*density_factors)[v0_index])); etc...
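Spelling out the "etc..." for the other two vertices (the same clamp, assuming the existing v1_index and v2_index variables in sample_mesh_surface, with base_density holding the Density Max socket value):

const float v1_density_factor = std::max(0.0f, std::min(base_density, (*density_factors)[v1_index]));
const float v2_density_factor = std::max(0.0f, std::min(base_density, (*density_factors)[v2_index]));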

Otherwise the socket should just be renamed to Density to avoid confusion.

Event Timeline

I've asked something similar before in the chat, and I believe this is the intended behavior.

The Density Attribute is intended to function as a kind of mask that says: "here I want more stuff and here I want less stuff". The Density Max was added to have a convenient slider on the node to tweak the density (by multiplying every density with this value).

I think the name Density Max originated when it was assumed that the Density Attribute would be between 0 and 1. In that case, it would indeed be the maximum density. But because the density attribute can be > 1, it's somewhat of a misnomer. Maybe Density Max should be renamed to Density Multiplier?

Right, "Density Factor" might work too. "Max" is definitely misleading.

File for testing, ignore messy nodes!

Please note how Density Max and Density Attribute affect the number of points in Random and Poisson Disk modes.

When the Density Attribute is in the range 0-1, the Density Max setting works for both methods in a predictable way. Once the Density Attribute goes above 1, the Random method multiplies the point count further while the Poisson Disk method does not.

From a user perspective, the relationship should be the same for both methods.

Yellow Torus = Random; Green Cones = Poisson;
Density Max = 4; Density Attribute = 4; Total = 16;

Density Max = 16; Density Attribute = 1; Total = 16;

Density Max = 32; Density Attribute = 1; Total = 32;

Density Max = 8; Density Attribute = 4; Total = 32;
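Reading these as the Random-mode counts, the totals follow Total = Density Max × Density Attribute (4 × 4 = 16, 16 × 1 = 16, 32 × 1 = 32, 8 × 4 = 32), i.e. the attribute multiplies the count instead of being limited by Density Max.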

I've looked at the code to see why this happens. The reason is quite subtle, and I agree the behavior is inconsistent.

The random method calls sample_mesh_surface and passes the density_factors. These two lines then determine the density:

looptri_density_factor = (v0_density_factor + v1_density_factor + v2_density_factor) / 3.0f;

and

const float points_amount_fl = area * base_density * looptri_density_factor;

Note here that looptri_density_factor can be greater than 1, depending on the density_factors. For the Poisson Disk method this is different: sample_mesh_surface is also called, but without the density_factors, and thus:

float looptri_density_factor = 1.0f;

The Poisson method uses the density_factors later in update_elimination_mask_based_on_density_factors, and there they are interpreted as probabilities, so they're effectively capped at 1:

const float probablity = v0_density_factor * bary_coord.x + v1_density_factor * bary_coord.y +
                         v2_density_factor * bary_coord.z;
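In sketch form, this amounts to a rejection test where the interpolated factor is used as a keep-probability (illustrative names below, not the exact Blender source; how the per-point random value is produced is left out here):

struct Float3 {
  float x, y, z;
};

/* Sketch of the Poisson-mode rejection: the interpolated density factor acts as
 * a keep-probability, so any value above 1.0 behaves exactly like 1.0. */
static bool should_eliminate_point(const float v0_density_factor,
                                   const float v1_density_factor,
                                   const float v2_density_factor,
                                   const Float3 &bary_coord,
                                   const float stable_random_value /* in [0, 1) */)
{
  const float probability = v0_density_factor * bary_coord.x +
                            v1_density_factor * bary_coord.y +
                            v2_density_factor * bary_coord.z;
  return stable_random_value > probability;
}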

The easiest change to make both methods consistent would be to calculate the looptri_density_factor like this:

looptri_density_factor = std::min(1.0f, (v0_density_factor + v1_density_factor + v2_density_factor) / 3.0f);

This way, the Density Attribute is always interpreted as a mask with value between 0 and 1, but I'm not sure this is the behavior we want.
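For reference, combined with the points_amount_fl line quoted earlier, the capped version would bound the Random path by area * base_density, matching Poisson. A minimal sketch of the resulting relationship (illustrative function signature, not a patch against the actual code):

#include <algorithm>

/* Sketch: with the cap in place, the expected point count per looptri can never
 * exceed area * base_density, because looptri_density_factor <= 1. */
static float expected_points_in_looptri(const float area,
                                        const float base_density,
                                        const float v0_density_factor,
                                        const float v1_density_factor,
                                        const float v2_density_factor)
{
  const float looptri_density_factor = std::min(
      1.0f, (v0_density_factor + v1_density_factor + v2_density_factor) / 3.0f);
  return area * base_density * looptri_density_factor;
}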

Alternatively, the density_factors could be passed to sample_mesh_surface when using the Poisson option and the update_elimination_mask_based_on_density_factors function could be removed. In my tests this seems to work quite well, but there may be a good reason to use different density_factors for each distribution type.

Change Poisson from:

sample_mesh_surface(
    mesh, max_density, nullptr, seed, r_positions, r_bary_coords, r_looptri_indices);
Array<bool> elimination_mask(r_positions.size(), false);
update_elimination_mask_for_close_points(r_positions, minimum_distance, elimination_mask);
update_elimination_mask_based_on_density_factors(
    mesh, density_factors, r_bary_coords, r_looptri_indices, elimination_mask);
eliminate_points_based_on_mask(elimination_mask, r_positions, r_bary_coords, r_looptri_indices);

To:

sample_mesh_surface(
    mesh, max_density, &density_factors, seed, r_positions, r_bary_coords, r_looptri_indices);
Array<bool> elimination_mask(r_positions.size(), false);
update_elimination_mask_for_close_points(r_positions, minimum_distance, elimination_mask);
eliminate_points_based_on_mask(elimination_mask, r_positions, r_bary_coords, r_looptri_indices);

I asked @Jacques Lucke (JacquesLucke) about that before. That ordering is there to provide stability in the id attribute that this node generates.

To test, add an Attribute Randomize node on the points' radius, then try painting the density, transforming the mesh, or extruding some of the mesh's faces. Odds are that changing the order makes one of those operations unstable.

Philipp Oeser (lichtwerk) changed the task status from Needs Triage to Confirmed. Wed, Feb 24, 2:53 PM

Shall we confirm this for now?