
Fix T70358: Poor performance with QuadriFlow on high density meshes
ClosedPublic

Authored by Pablo Dobarro (pablodp606) on Tue, Oct 1, 5:35 PM.

Diff Detail

Repository
rB Blender

Event Timeline

Brecht Van Lommel (brecht) requested changes to this revision. Tue, Oct 1, 5:42 PM

I don't think this is a good solution. The remesh operator should initialize a good default value for the current mesh, and then we might as well keep using ratio.

This revision now requires changes to proceed. Tue, Oct 1, 5:42 PM

The purpose of this operator is to create a base mesh to subdivide with multires, not to keep sculpting directly in the result of the remesher. In that case, why would we prefer to use ratio instead of the target number of faces of the base mesh?
My plan for this operator is to support a direct multires conversion of the sculpt in one step. Once the multires workflow is fixed, this operator should create the base mesh with QuadriFlow, subdivide the mesh a certain number of steps with multires, project the detail onto that level, and propagate the detail back to the lower ones. With that in place, using ratio would make sense, because the operator would be used as "convert this high-poly sculpt to multires while keeping the same number of polygons". But I don't think it makes that much sense now, with our current workflow that doesn't support subdivisions directly.
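
As a rough illustration of that plan (a minimal sketch, not the actual implementation: it assumes the quadriflow_remesh and multires_subdivide operator keywords as exposed in current builds, and leaves the detail-projection step as a stub):

    import bpy

    src = bpy.context.active_object            # the high-poly sculpt
    base = src.copy()                          # work on a copy of it
    base.data = src.data.copy()
    bpy.context.collection.objects.link(base)
    bpy.context.view_layer.objects.active = base

    # 1. Build the low-poly base mesh with QuadriFlow.
    bpy.ops.object.quadriflow_remesh(mode='FACES', target_faces=4000)

    # 2. Subdivide it a few steps with multires.
    multires = base.modifiers.new("Multires", 'MULTIRES')
    for _ in range(4):
        bpy.ops.object.multires_subdivide(modifier=multires.name)

    # 3. Projecting the sculpt detail onto the highest level (e.g. with a
    #    Shrinkwrap pass against `src`) and propagating it back down to the
    #    lower levels is the part the operator would still have to automate.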

Ratio is easy because you can look at the mesh and say, I want approximately 10% of those faces. And it still tells you the estimated number of faces anyway.
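
For instance, a minimal sketch of what ratio mode amounts to, assuming the Python operator keywords match the UI options:

    import bpy
    # Ask for roughly 10% of the current face count.
    bpy.ops.object.quadriflow_remesh(mode='RATIO', target_ratio=0.1)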

If it defaults to 4000 faces, you either have to go in and out of Edit Mode (in Object Mode you can only see the total number of faces in the scene, not in the selected mesh), or you have to switch to ratio to get some idea of how many faces the mesh has.
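
(For reference, a quick way to check a mesh's face count without leaving Object Mode is the Python console, using the standard bpy mesh data access:)

    import bpy
    obj = bpy.context.active_object
    print(obj.name, "faces:", len(obj.data.polygons))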

In general, looking at a complex mesh, it's not easy to estimate how many faces it needs as an absolute number. Maybe that's a skill modelers acquire after a while, but I don't think it makes sense to assume they have it, since it's still trivial to switch to faces.

You think in terms of "I want 10% of this face count" when you are decimating your mesh, but that is not what QuadriFlow is for. You should use QuadriFlow when you need to keep increasing the resolution of your mesh with multires. In that case, you only care that the output from QuadriFlow has a face count that can be subdivided multiple steps (4000 is a reasonable number for that). You can go higher or lower from there depending on the detail you want to preserve, but a fixed number of 4000 is always suitable for this purpose, no matter whether your sculpt has 100k faces or 20M faces.
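
To make the subdivision arithmetic concrete: each multires (Catmull-Clark) level splits every quad into four, so a 4000-face base mesh covers the whole range mentioned above within a handful of levels. A quick check, plain Python, just the arithmetic:

    base_faces = 4000
    for level in range(7):
        # each subdivision level quadruples the quad count
        print(f"level {level}: {base_faces * 4 ** level:,} faces")
    # level 4 -> 1,024,000; level 5 -> 4,096,000; level 6 -> 16,384,000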

Brecht Van Lommel (brecht) accepted this revision (edited). Tue, Oct 1, 6:35 PM

I don't think 4000 is always suitable; it's maybe biased towards sculpting characters, where that is a good first guess, even if that's not the only reason to use QuadriFlow. I guess it's fine to have that as the default value anyway.

This revision is now accepted and ready to land. Tue, Oct 1, 6:35 PM

Also, please make the commit message clearer than the title of the bug report when needed.

When bug report titles are vague like this one, reusing them verbatim is misleading: this commit really just changes a default value and does not actually improve the performance of QuadriFlow itself.

@Brecht Van Lommel (brecht) I did some tests and found that QuadriFlow runs a lot faster on decimated models than on undecimated ones. I tried it on my bird model, and once I got down to around 250k verts after running the Decimate modifier, I saw significant performance boosts for QuadriFlow. I was even able to do a QuadriFlow remesh at 100k faces and get these results:

So @Pablo Dobarro (pablodp606), I would suggest capping the QuadriFlow faces setting at 100k (this is how ZBrush's ZRemesher works) and working on getting QuadriFlow to automatically decimate high-poly models down to a vert count that QuadriFlow can run on at an acceptable speed; see the sketch below. I will keep testing to see if I can figure out where the sweet spot for decimation is before it starts losing too many details (the process still ran fairly slowly on a 250k decimated mesh, so a lower vert count than that would be more ideal).
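
Something along these lines (a minimal sketch of the suggestion, assuming the quadriflow_remesh operator keywords as in current builds; the 150k vert target and 100k face cap are just numbers from my tests, not tuned values):

    import bpy

    DECIMATE_TARGET_VERTS = 150_000   # rough sweet spot from the tests above
    FACE_CAP = 100_000                # cap on the faces setting, like ZRemesher
    desired_faces = 100_000           # the face count I used in my test

    obj = bpy.context.active_object
    vert_count = len(obj.data.vertices)

    # Pre-decimate very dense sculpts so QuadriFlow has less input to solve.
    if vert_count > DECIMATE_TARGET_VERTS:
        mod = obj.modifiers.new("PreDecimate", 'DECIMATE')
        mod.ratio = DECIMATE_TARGET_VERTS / vert_count
        bpy.ops.object.modifier_apply(modifier=mod.name)

    # Remesh with the face target clamped to the cap.
    bpy.ops.object.quadriflow_remesh(mode='FACES',
                                     target_faces=min(desired_faces, FACE_CAP))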

I did some further tests, and here are some of the results in a Blender file. Each mesh is named after the number of verts it had after decimation, using 100k faces with QuadriFlow:

The best-looking results were the 250k and 131k ones. Below that, the results only became worse, and the remesh quality started to deteriorate rapidly. Remeshing speed was better on the 131k one, so I think somewhere around 120-150k might be the sweet spot for decimation before QuadriFlow remeshing.