FBX import characters with bone weights exceeding 1.0
Needs Revision · Public

Authored by Paolo Acampora (pkrime) on Jun 15 2017, 6:57 PM.

Details

Summary

Hello,

FBX characters are imported with incorrect bone weighting when their vertex influences exceed the value of 1.0.

The fix proposed in the attached patch stores the value of the influences that exceed 1.0 in a dictionary and applies them after they have been normalized, preserving the correct deformation.
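
Roughly, the idea is the following (a simplified sketch for illustration, not the actual diff; the structure and helper names here are made up):

```
from collections import defaultdict

# vertex index -> list of (vertex_group, weight) influences read from the FBX deformers
influences = defaultdict(list)

def collect_influence(vertex_index, vgroup, weight):
    influences[vertex_index].append((vgroup, weight))

def apply_deform_weights():
    # Vertices carrying an influence above 1.0 are divided by the sum of
    # their own influences before assignment, so the proportions between
    # their bone weights survive the 0..1 clamp of vertex group weights.
    for vertex_index, pairs in influences.items():
        factor = sum(w for _, w in pairs) if any(w > 1.0 for _, w in pairs) else 1.0
        for vgroup, weight in pairs:
            vgroup.add((vertex_index,), weight / factor, 'REPLACE')
```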

I hope you are interested in merging; please let me know if it is worth a review.

Thanks,
Paolo

Diff Detail

Event Timeline

Bastien Montagne (mont29) requested changes to this revision. Jun 26 2017, 4:42 PM

This won’t work that way… The main issue with this patch is that it ignores weights below 1.0 (so they won't be scaled accordingly when bigger ones are). Furthermore, adding yet another storage here can end up being a problem with big meshes or lots of objects (cases that are already not-so-well handled). Furthermore, different 'maximum weights' would lead to different normalization factors, which would break the consistency between vgroup values.

I can think of two different approaches to solve that issue:

  1. The simple one: add yet another option to specify a global weight divider. This assumes the user knows (even roughly) the maximum weight value to expect in a given FBX file…
  2. A variation of the previous solution: do a pre-scan of all weight values in the file and automatically set the weight normalization factor from that.

But in any case, I really think we should use a single weight factor global to the whole file, to preserve the consistency between vgroup values.
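
Something along these lines, just to sketch option 2 (the pre-scan itself would be the importer's own pass over the deformer clusters, which is not shown here):

```
def global_weight_factor(all_weights):
    # One normalization factor for the whole file: the highest weight found
    # in the pre-scan, never below 1.0 so weights are never scaled up.
    return max(max(all_weights, default=0.0), 1.0)

def assign_weight(vgroup, vertex_index, weight, factor):
    # The same factor for every vertex keeps all vgroups uniformly scaled.
    vgroup.add((vertex_index,), weight / factor, 'REPLACE')
```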

This revision now requires changes to proceed. Jun 26 2017, 4:42 PM

Hello mont29, thanks for looking into this

This won’t work that way… The main issue with this patch is that it ignores weights below 1.0 (so they won't be scaled accordingly when bigger ones are).

All weights are scaled in the last block (line 2189). There is no need to store weights below 1.0 in the dictionary, as no information is lost when we assign them to the vertex group.

Furthermore, adding yet another storage here can end up being a problem with big meshes or lots of objects

If the weights need a rebase, there is no point in importing badly deformed characters.

  1. The simple one: add yet another option to specify a global weight divider. This assumes the user knows (even roughly) the maximum weight value to expect in a given FBX file…

Not all the vertices have the same maximum (yes, this format is unfair).

Hello mont29, thanks for looking into this

This won’t work that way… The main issue with this patch is that it ignores weights below 1.0 (so they won't be scaled accordingly when bigger ones are).

All weights are scaled in the last block (line 2189). There is no need to store weights below 1.0 in the dictionary, as no information is lost when we assign them to the vertex group.

No, they are not. Unless I am missing something in your code, a vertex none of whose weights are above 1.0 is not stored at all in excess_weights, which means its weights won't be scaled down at all. In fact, as far as I understand the code, you are doing a local normalization for each single vertex (i.e. it only takes into account the weights of a single vertex to do the normalization of that vertex's weights).

I guess that’s what you call an “unfair format” (meaning weights between different vertices are totally unrelated and free to change as they like; only weights of the same vertex should remain proportional)? But that sounds to me like a very specific case, and imho generally totally wrong behavior (weights should keep proportional values across vertices as well).

Unless I am missing something in your code, a vertex none of whose weights are above 1.0 is not stored at all in excess_weights, which means its weights won't be scaled down at all. In fact, as far as I understand the code, you are doing a local normalization for each single vertex (i.e. it only takes into account the weights of a single vertex to do the normalization of that vertex's weights).

The FBX importer, unless I am missing something (which is possible), imports the weights as they are and does not do any normalization, nor do I see why I should when tackling a different issue. Nothing bad happens as long as no weight exceeds 1.0: Blender normalizes per vertex under the hood and also provides tools to fix the weights, the character works well, and everybody is happy. Non-normalized weights are not the issue here.
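
(By "tools to fix the weights" I mean things like Normalize All in Weight Paint mode, which is also available from Python; a minimal example, assuming the imported character mesh is the active object:)

```
import bpy

# Normalize every vertex group of the active object so that each vertex's
# deform weights sum to 1.0 (assumes the imported mesh is active).
bpy.ops.object.vertex_group_normalize_all(lock_active=False)
```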

The reason somebody regularly reports something like "character downloaded from X has incorrect weights" (where X is a character library of choice) is that when a vertex has bone weights (0.2, 3.0, 6.0), the statement vgroup.weight = w sets them to (0.2, 1.0, 1.0), and the proportion between the bone weights is lost forever.
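
You can see the clamping for yourself in Blender's Python console (the object and group names below are just placeholders):

```
import bpy

ob = bpy.data.objects["Cube"]                # any mesh object
vg = ob.vertex_groups.new(name="DEF-spine")  # placeholder group name
vg.add((0,), 3.0, 'REPLACE')                 # try to store a weight of 3.0
print(vg.weight(0))                          # prints 1.0: the excess is clamped away
```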

I agree that such a situation is ugly, and I agree it shouldn't happen; it never happens with my own characters, but it happens a lot with character libraries. Should I report this to Mixamo? Probably not: their characters work everywhere, they are FBX compliant, they have no problem. But what to do when a character library does not work in Blender for this reason? What's the workflow for the customer who buys characters on Mixamo and wants to use Blender? Should he load the character in Maya and then export to Blender? Should he try the joys of hex editing and fix the encoded weights? Use the FBX SDK? I'm sure he will be all ears. At least my customers have this fix, which is now public. It's not really my concern what is made of it; it's just useless to deny the problem or change its boundaries (the issue is not that weights < 1.0 are not normalized, it's that weights > 1.0 are lost).

I guess that’s what you call an “unfair format” (meaning weights between different vertices are totally unrelated and free to change as they like; only weights of the same vertex should remain proportional)?

I'm even afraid that's the "fair" part

But that sounds to me like a very specific case, and imho generally totally wrong behavior (weights should keep proportional values across vertices as well).

Yes, it's wrong. But it's out there.

Cheers,
Paolo

I don’t say trying to fix those stupid weightings is bad or wrong; I say that doing so by only normalizing on a per-vertex basis is wrong. With your code, a set of weights for three vertices like ((0.1, 0.5, 0.2), (0.2, 1.6, 0.2), (3.0, 2.4, 0.6)) would be normalized to something like ((0.1, 0.5, 0.2), (0.1, 0.8, 0.1), (0.5, 0.4, 0.1)), if I followed your code correctly. One set of weights remains unchanged, while the two others are scaled by different factors; you end up with vgroups totally different from the input ones, not only uniformly scaled, which imho is not acceptable in general. That’s why I said 'scale them all by the highest weight found', and you are done. If people want to normalize their groups afterwards, they can do it, but it should not be done silently by default, and certainly not while leaving some unnormalized.
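
To make the difference concrete, here are those numbers run through both strategies (plain Python, only to illustrate the comparison):

```
weights = [(0.1, 0.5, 0.2), (0.2, 1.6, 0.2), (3.0, 2.4, 0.6)]

# Per-vertex normalization (what the patch does, as I read it): each vertex
# with a weight above 1.0 is divided by the sum of its own weights.
per_vertex = [tuple(w / sum(v) for w in v) if max(v) > 1.0 else v
              for v in weights]
# -> [(0.1, 0.5, 0.2), (0.1, 0.8, 0.1), (0.5, 0.4, 0.1)]

# Single global factor: divide everything by the highest weight found (3.0),
# so all vgroups stay uniformly scaled relative to each other.
factor = max(w for v in weights for w in v)
global_scaled = [tuple(w / factor for w in v) for v in weights]
# -> [(0.03, 0.17, 0.07), (0.07, 0.53, 0.07), (1.0, 0.8, 0.2)]
```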

Side note: I am thinking about adding a way to set out-of-range values for weights; this would help a lot from Python, to avoid storing them in a separate place, which is potentially really heavy and complicates the code…

Perhaps the best option is to write an external tool: a very simple standalone program which can normalize all the weights and save to a clean FBX file. This way we would avoid polluting Blender and we would not be limited to Python. We could even use the Autodesk SDK (need to check their license though).

It would end up being a much more specific tool, but with a wider scope, and it might feel much better to clean up the characters before we start handling them.