
New complex solidify algorithm handles the merging threshold of small geometry details poorly.
Confirmed · Normal · Public · Bug

Description

Using the same file from T74195:

  1. Open the file
  2. Set the curve precision to 15, then increment the value further.
  3. Notice how, with each increment, bigger and bigger chunks of the geometry get merged together.

I think there are two issues here:

  1. The merge threshold currently seems to be a constant, non-editable value. It should either be exposed to the user (as in many other places in Blender), and/or made "smart" by adjusting its value to the global size of the geometry (using e.g. the distance of the farthest vertex from the origin, or something like that); see the sketch after this list.
  2. The merge algorithm itself looks somewhat broken, as it seems to merge more and more geometry, leading to serious loss of detail when the vertex density gets high.
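
For illustration, a minimal sketch of what such a "smart" threshold could look like, scaling a relative tolerance by the farthest-vertex distance mentioned above (the function and parameter names are hypothetical, not existing Blender API):

    #include <math.h>

    /* Hypothetical sketch: scale a relative tolerance by the overall size of
     * the geometry, measured as the distance of the farthest vertex from the
     * origin, so the merge threshold adapts to the mesh. */
    static float adaptive_merge_threshold(const float (*vert_coords)[3],
                                          int verts_num,
                                          float relative_tolerance)
    {
      float max_dist_sq = 0.0f;
      for (int i = 0; i < verts_num; i++) {
        const float *co = vert_coords[i];
        const float dist_sq = co[0] * co[0] + co[1] * co[1] + co[2] * co[2];
        if (dist_sq > max_dist_sq) {
          max_dist_sq = dist_sq;
        }
      }
      /* e.g. relative_tolerance = 1e-4f yields 0.01% of the geometry size. */
      return sqrtf(max_dist_sq) * relative_tolerance;
    }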

Note: noticed while reviewing D7214.

Event Timeline

Bastien Montagne (mont29) changed the task status from Needs Triage to Confirmed. (Mon, Mar 23, 3:20 PM)
Bastien Montagne (mont29) created this task.
Bastien Montagne (mont29) changed the subtype of this task from "Report" to "Bug".

The chain merging is a technical limitation that I also noticed while testing your file.

I would like to get a second opinion on how to fix that, so I will explain here why this happens and why merging is so important.

My algorithm sorts faces around an edge by angle. To do so it needs a normalized direction vector of the edge.
If the vertices are really close, normalization might fail or come out with very poor precision. That's why merging is important.
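
To illustrate the precision problem (this is not the actual modifier code): normalizing a near-degenerate edge divides by a length close to zero, so the result is dominated by rounding noise and the caller is better off merging the endpoints:

    #include <math.h>
    #include <stdbool.h>

    /* Illustrative only: compute a normalized edge direction, reporting
     * failure when the edge is too short for the result to be reliable. */
    static bool edge_direction(const float v_a[3],
                               const float v_b[3],
                               float r_dir[3],
                               float epsilon)
    {
      const float d[3] = {v_b[0] - v_a[0], v_b[1] - v_a[1], v_b[2] - v_a[2]};
      const float len = sqrtf(d[0] * d[0] + d[1] * d[1] + d[2] * d[2]);
      if (len < epsilon) {
        /* Near-degenerate edge: dividing by `len` would amplify rounding
         * noise, so the two vertices should be merged instead. */
        return false;
      }
      r_dir[0] = d[0] / len;
      r_dir[1] = d[1] / len;
      r_dir[2] = d[2] / len;
      return true;
    }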
Merging is then done edge by edge: the algorithm starts with the first edge in element order, checks whether it should be merged,
then goes on to the next edge, and so on. The merge always prefers the vertex with the smaller index (I think there was a reason for that, but I forgot...).
The problem arises when the next edge contains the vertex that the previous edge was merged to, because then that edge
needs to be collapsed as well, and that can chain through infinitely many vertices.
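
A toy example of that cascade (again, not the actual modifier code): with a "prefer the smaller index" rule and edges processed in element order, a chain of short edges collapses entirely into the first vertex:

    /* Toy example: vertices 0..4 connected by short edges (0-1, 1-2, ...).
     * merge_target[v] is the vertex that v has been merged into. */
    static int resolve_merge(const int merge_target[], int v)
    {
      while (merge_target[v] != v) {
        v = merge_target[v];
      }
      return v;
    }

    static void chain_merge_demo(void)
    {
      int merge_target[5] = {0, 1, 2, 3, 4};
      /* Edge (0,1) is short, so vertex 1 is merged into vertex 0. Edge (1,2)
       * now effectively runs from vertex 0 to vertex 2 and is still short,
       * so vertex 2 collapses too, and so on down the whole chain. */
      for (int edge = 0; edge < 4; edge++) {
        merge_target[edge + 1] = resolve_merge(merge_target, edge);
      }
      /* resolve_merge(merge_target, 4) is now 0: all five vertices merged. */
    }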

The main ways to fix it that I can think of right now are the following:

  • randomize the order in which the edges are checked, in a deterministic way
    • pro: it would fix the issue without too much overhead
    • contra: it would be unpredictable for the user (but I don't think that's really bad)
  • merge vertices to the center of the edge
    • pro: it would be predictable and stable
    • contra: would require yet one more copy of the vertex data

After writing this, I see merging to the center as the way to go (sketched below), but feel free to add an idea to the list.
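
For concreteness, a minimal sketch of the merge-to-center option, assuming a separate copy of the coordinates is kept (which is exactly the "contra" point above; the function name is hypothetical):

    /* Sketch: collapse an edge by moving both endpoints to its midpoint.
     * Writing into a second coordinate array keeps the original positions
     * intact for the remaining edge-length checks; that extra copy is the
     * "contra" of this approach. */
    static void merge_to_edge_center(const float orig_co[][3],
                                     float merged_co[][3],
                                     int v_a,
                                     int v_b)
    {
      for (int axis = 0; axis < 3; axis++) {
        const float mid = 0.5f * (orig_co[v_a][axis] + orig_co[v_b][axis]);
        merged_co[v_a][axis] = mid;
        merged_co[v_b][axis] = mid;
      }
    }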

After all, keep in mind that the degenerate-geometry feature of solidify is not meant for dissolving, just for handling degenerate cases.

Also... how do I add a field to struct SolidifyModifierData in DNA_modifier_types.h?
I feel like just making it longer would mess with things. Last time I used the padding (the char _pad[4] is now a char _pad, and my mode data is stored there instead). This time there is definitely not enough space for an additional float value for the precision user input.

It should be fine to make DNA structs longer; the DNA system / readfile can handle that.

Regarding DNA: yes, just add the new value, and for 32-bit things (like floats or ints), group them in pairs to get 64 bits (that means that if you only need one float, you need to add another char _pad2[4] just after it).
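
A sketch of what that looks like (the field name merge_tolerance is illustrative, and the existing members are abbreviated):

    /* In DNA_modifier_types.h (abbreviated sketch). */
    typedef struct SolidifyModifierData {
      ModifierData modifier;
      /* ... existing members ... */

      /* New 32-bit value, paired with explicit padding so 32-bit members
       * stay grouped to 64 bits, as DNA requires. */
      float merge_tolerance;
      char _pad2[4];
    } SolidifyModifierData;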

Note that by default DNA will initialize the new value to zero when loading a file from a version where it did not exist, so if you need a different default value you'll have to edit the versioning file (just look at surrounding code for examples of how to do it exactly).
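
A hedged sketch of what such versioning code typically looks like (the version numbers, the merge_tolerance field name, and the default value are placeholders; follow the surrounding code in the actual versioning file):

    /* In the versioning file (placeholders for version, field and default). */
    if (!MAIN_VERSION_ATLEAST(bmain, 283, 12)) {
      LISTBASE_FOREACH (Object *, ob, &bmain->objects) {
        LISTBASE_FOREACH (ModifierData *, md, &ob->modifiers) {
          if (md->type == eModifierType_Solidify) {
            SolidifyModifierData *smd = (SolidifyModifierData *)md;
            /* Older files load the new field as 0.0f; overwrite it with the
             * intended non-zero default. */
            smd->merge_tolerance = 0.0001f;
          }
        }
      }
    }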

Regarding merging, I indeed think that merging at the center of the edge is the best way to go.