
Blender crashes PC. Modifier Remesh Voxel Size issue.
Confirmed, Normal, Public, Bug

Description

System Information
Operating system: Windows-10-10.0.17763-SP0 64 Bits
Graphics card: GeForce GTX 1660/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 432.00
Processor: AMD Ryzen 7 3800x 8-Core Processor 3.90 GHz

Blender Version
Broken: version: 2.90.0 Alpha, branch: master, commit date: 2020-06-12 17:01, hash: rBfd8d245e6a80
(Also with Blender 2.83 BETA)

Short description of error
Adding the "Remesh" modifier, selecting "Voxel" mode, and setting the voxel size below 0.01 m = PC crash. I needed to restart the PC.

Exact steps for others to reproduce the error

  1. Start Blender.
  2. Keep the default Cube and add the "Remesh" modifier.
  3. Set the "Voxel Size" below 0.01 m.
  4. Did it crash?

(I found this in my own Blender project, but then tested it in a default scene. The same thing happened.)

(Why did I set the Voxel Size below 0.01 m?
It happens accidentally: when the value is at something like 0.42 m and I hold the left mouse button and slide to the left, I can accidentally go below 0.01 m.)
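
For anyone triaging from a script, here is a minimal sketch of the same repro for Blender's Python console (0.009 is just an example of "below 0.01 m"; evaluating it can eat a lot of memory, so save your work first):

  import bpy

  # Assumes the default start-up scene, with the Cube as active object.
  cube = bpy.context.active_object
  mod = cube.modifiers.new(name="Remesh", type='REMESH')
  mod.mode = 'VOXEL'
  # On a 2 m cube, anything below 0.01 ramps memory use up very quickly.
  mod.voxel_size = 0.009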

Good Luck!

Event Timeline

Robert Guetzkow (rjg) changed the task status from Needs Triage to Needs Information from User. Jun 14 2020, 11:33 PM

If your operating system crashes, the issue seems to lie outside of Blender. Please update the graphics driver and check whether the crash still occurs. Be aware that remeshing is very memory intensive; you may be running out of memory (see the similar reports T77664 and T77556).

Ankit Meel (ankitm) renamed this task from "Blender crashes PC. Modifier Remesh Voxel Size issue." to "Blender crashes PC. Modifier Remesh Voxel Size issue.". Jun 15 2020, 8:20 AM
Ankit Meel (ankitm) changed the task status from Needs Information from User to Confirmed. Jun 15 2020, 8:38 AM


Memory usage goes wild when reducing the voxel size to just under 0.01 (use the decrease arrow beside the slider).

Ankit Meel (ankitm) added a comment. Edited. Jun 15 2020, 8:39 AM

An easy fix would be to put a soft limit of 0.01 on the slider UI itself, and let people manually type in whatever value they want.

Currently there is only a soft limit of 0.0001 for the voxel size, but you can also type in something as low as 0.00002. A value of 0.001 on the default cube still works on my computer, although it takes very long to compute; but if I scale the cube down by 0.01 in Edit Mode, then even a voxel size of 0.0001 works. I think the solution should not be a hard limit of 0.01 in the UI.

If there should be a limit, it should be calculated at runtime by the modifier. The modifier would check the bounding box size and make sure that memory usage stays under some reasonable amount. If the modifier did clamp the voxel size, it would also give the user an error message indicating that. Keep in mind this is about the modifier version of voxel remesh, and it makes no sense to have an extraordinarily slow modifier in the stack; for very high resolutions, the normal remesh feature in the mesh data properties would always be preferred.

IMO the optimal solution would be logarithmic/nonlinear sliders, working similarly to the brush radius slider in Krita. That way you would not jump into the dangerous region with one click, but rather descend smoothly.
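
Blender's number sliders have no such mode built in; purely as an illustration, a logarithmic mapping from slider position to value could look like this (all names hypothetical):

  def slider_to_value(t, lo=0.0001, hi=2.0):
      # Map slider position t in [0, 1] onto a logarithmic scale:
      # equal mouse movements multiply the value by a constant factor.
      return lo * (hi / lo) ** t

  # Halfway along the slider gives ~0.014 rather than the linear ~1.0,
  # so the dangerous low end is approached gradually, not in one click.
  print(slider_to_value(0.5))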

@Ankit Meel (ankitm) I can't reproduce a crash on my system with 0.001 on a subdivided Suzanne; it only consumes a large amount of memory. That in itself is not surprising: the smaller the voxel, the more memory is needed, and the memory consumption grows cubically. The issue is that the soft_min is 0.0001 (or 9.999999747378752e-05 to be precise), which means that sliding over the value can easily result in OOM. I wouldn't change the hard limit, since someone with more memory, or with a use case like the one @Henrik Dick (weasel) described, may want to use really small values and should be able to type them in manually. Edit: I would change the hard limit too, in order to avoid the arithmetic error reported in T77878.

This comment was removed by Robert Guetzkow (rjg).

I agree with Henrik. I'm just finishing sculpting quite a hefty model in Blender. A hard limit of 0.01 is not a solution, since you definitely use these small values when you scale a model down, when it has small details but you still need that geometric density to sculpt. It all depends on the type of model and its scale, and a hard limit is a hacky solution. To be honest, I would argue that your computer running out of memory because you created geometry that is too dense is not a bug; that is your fault.

Having no notification that the geometry might become too dense and will most likely cause a crash (as ZBrush gives once you subdivide a model too much) should be considered more of a UI paper cut, I guess. You can crash any sculpting software like that if you don't know what you're doing and which values are viable. You can easily crash ZBrush the same way if you overdo the Dynamesh values or over-subdivide your model.

@Robert Guetzkow (rjg) For me, bpy.context.object.modifiers["Remesh"].bl_rna.properties["voxel_size"].soft_min - 0.0001 is basically zero, so the soft_min matches. The small error is due to a float not being able to store that number exactly (see https://www.h-schmidt.net/FloatConverter/IEEE754.html).

@Henrik Dick (weasel) You're right, I misread it and thought there was an erroneous extra zero digit. The actual deviation indeed comes from floating-point precision.
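
The round trip through a 32-bit float is easy to show in Python; this is standard IEEE 754 behavior, not anything Blender-specific:

  import struct

  # Store 0.0001 in a 32-bit float and read it back as a double.
  f32 = struct.unpack('f', struct.pack('f', 0.0001))[0]
  print(f32)           # 9.999999747378752e-05
  print(f32 - 0.0001)  # ~-2.5e-12, the small deviation discussed above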

Ankit Meel (ankitm) added a comment. Edited. Jun 15 2020, 12:07 PM

Edited my comment above (T77868#953819). It surprised me that going from 0.03 > 0.02 > 0.01 > (the next step down) created such a huge increase. What I had was just the default cube, with nothing else.

> I can't reproduce a crash on my system with 0.001 on a subdivided Suzanne; it only consumes a large amount of memory.

Just to clarify: is anyone opposed to confirming the report?

@Ankit Meel (ankitm) No, it definitely crashes when decreasing from 0.01 with the left arrow, which goes to the soft min at 0.0001. But only on the default cube and objects of similar size or bigger.

T77878 is related to the decision about what the hard limit should be.

@Henrik Dick (weasel) What is a suitable soft_min when the required memory depends on the size of the object? Remeshing a 50 m cube with a voxel size of 0.01 will have the same effect, allocating about 27 GB. In a perfect world the modifier would base the limits on the available resources. The next best thing would be dynamic limits based on the object's bounding box. I'm not sure there's a good solution that fits well with Blender's current concept of limits on properties.

If I'm not mistaken, T77878 is a duplicate of T72747, fixed in rB6a54969cf15e861e13c654c230a35db8e6f377f9. Nevermind, that one is for the voxel remesher, but it's generally related. Making too many mistakes this morning.

@Robert Guetzkow (rjg)

> Currently there is only a soft limit of 0.0001 for the voxel size, but you can also type in something as low as 0.00002. A value of 0.001 on the default cube still works on my computer, although it takes very long to compute; but if I scale the cube down by 0.01 in Edit Mode, then even a voxel size of 0.0001 works. I think the solution should not be a hard limit of 0.01 in the UI.

> If there should be a limit, it should be calculated at runtime by the modifier. The modifier would check the bounding box size and make sure that memory usage stays under some reasonable amount. If the modifier did clamp the voxel size, it would also give the user an error message indicating that. Keep in mind this is about the modifier version of voxel remesh, and it makes no sense to have an extraordinarily slow modifier in the stack; for very high resolutions, the normal remesh feature in the mesh data properties would always be preferred.

> IMO the optimal solution would be logarithmic/nonlinear sliders, working similarly to the brush radius slider in Krita. That way you would not jump into the dangerous region with one click, but rather descend smoothly.

@Henrik Dick (weasel) I definitely shouldn't be allowed on the internet without caffeine. Sorry for missing so many obvious things today. Do you know of any modifier that currently implements dynamic limits? I don't think that's the case at the moment, but given how many things I've already been wrong about today ...

@Robert Guetzkow (rjg) I don't think there is a modifier with dynamic limits yet. The general assumption is that you know what values you can enter without crashing Blender. There are lots of modifiers that can crash Blender in a similar way (Array, Screw, Bevel, Subdivision Surface). All of those other modifiers, including the other modes of Remesh, are designed so that performance degrades slowly as you click or drag the slider, so you can intuitively see where the limits of your computer are. The voxel size is the only field that jumps to a crash very rapidly (literally one click). That's exactly what I was getting at in the last paragraph of my comment T77868#953889, where I was talking about nonlinear sliders.

What should also be considered is that, if your object is very big, adding the modifier will crash Blender right away. In that case an error message raised by a dynamic limit would be a lifesaver.

@Henrik Dick (weasel) The issue with dynamic limits based on available memory is that it is not easy to know how much is actually available, due to virtual memory, swap, and other programs (de)allocating alongside Blender. I don't think this is the right approach. As you've said, there are already plenty of ways to run out of memory; it just shouldn't be this easy to do so unintentionally.

The different scaling of the increments is a nice idea. What I meant in my comment was that, instead of an absolute voxel size, one could use a factor that determines the voxel size from the object's bounding box. This should solve the problem, including the case where the modifier is added to a large object.
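
A minimal sketch of that factor-based idea (the function name and formula are hypothetical, not an existing Blender API):

  def voxel_size_from_factor(bbox_dims, factor):
      # Derive an absolute voxel size from the object's bounding box,
      # so the voxel count stays roughly constant regardless of scale.
      return max(bbox_dims) * factor

  # A 2 m cube with factor 0.01 gives a 0.02 m voxel (~100 voxels per
  # axis); a 50 m cube with the same factor gives 0.5 m, the same cost.
  print(voxel_size_from_factor((2.0, 2.0, 2.0), 0.01))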

@Robert Guetzkow (rjg) No, please not that! The bounding box size changes all the time when using different modifier combinations or shape keys, and I don't like the results that the other remesh modes give me. The other remesh modes do exactly that: they have an octree depth and a scale to scale and subdivide the bounding box. The issue is that it would render this new voxel remesh as useless as the others for metaball-like animations. I want the grid to stay where I specified it, with a size that I can easily control. I don't think there is a reasonable way around nonlinear sliders.

Also, nonlinear sliders, once implemented, could be used everywhere in Blender (e.g. all merge thresholds, the voxel size for remesh in the mesh properties and in Sculpt Mode, most scale sliders, radius in Paint and Sculpt Mode, ...), so they would be a very useful addition for Blender in general.

@Henrik Dick (weasel) I see, that's a valid point. I like the non-linear slider idea. Perhaps there is an even better solution for this particular modifier, but I'm not familiar with the implementation of the remesher and OpenVDB.

@Pablo Dobarro (pablodp606) since you've added the voxel mode to the modifier, perhaps you could take a look at this?

Robert Guetzkow (rjg) (I can't tag you; whenever I try, it tags someone else.) Regarding your first comment: I updated my graphics driver and tried it again. It crashed and I needed to restart my PC. I have 16 GB of RAM, so I don't think that is the problem, because in the other direction I can enter values higher than 20000000000 m. Also, going from 0.1 down to 0.01 is smooth and works fine. But as soon as I go below 0.01, it just freezes and I have to force-restart my PC to get it working again. So, with an updated graphics driver and recommended hardware/memory specs, it still goes down.

@RedAssassin (RedAssassin44) As we've discussed in the comments above, going below 0.01 can very easily exceed the available memory. At 0.0001, which is one slider step after 0.01, it takes approx. 27 GB with the default cube. Depending on the available swap space, this may result in Blender either crashing or freezing for a long time. Your OS may react very slowly too, until Blender either finishes processing or is terminated for running out of memory. Note that the OS freezing/reacting slowly is not the same as a crash. A crash means that a program terminates unexpectedly; for the OS, that could mean displaying a blue screen or the computer simply turning off/restarting on its own.
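
A rough back-of-envelope for that 27 GB figure (the bytes-per-voxel value is an assumption for illustration; OpenVDB stores a sparse narrow band around the surface, so roughly only surface voxels are allocated):

  # Default cube: 2 m edge length, voxel size 0.0001 m.
  voxels_per_edge = 2.0 / 0.0001             # 20,000
  surface_voxels = 6 * voxels_per_edge ** 2  # ~2.4e9 narrow-band voxels
  bytes_per_voxel = 12                       # assumed order of magnitude
  print(surface_voxels * bytes_per_voxel / 2 ** 30)  # ~27 GiB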

This happened a couple of times for me too. It crashed with 64 GB of RAM.
Maybe it would be better to show some kind of warning before proceeding past a certain threshold. A similar example from Maya: if you try to subdivide a dense mesh, you receive a warning message, from which you can confirm or cancel.

In 3ds Max there is also the nice option of cancelling the currently executing task by pressing ESC.

@Robert Guetzkow (rjg) I just found out that the Array modifier has a limit at which it raises an error instead of crashing Blender: it limits the vertex count to 67 million. We could do the same for the voxel remesher, so that it aborts if it would generate more than, say, 2^32 voxels (computed from the bounding box volume). For a simple cube with 2000 voxels per dimension, that would be 24 million vertices (2000^2*6 = 24*10^6). That example still works on my 8 GiB computer. For more complex objects it would be more vertices, up to almost the voxel count (or some fraction of it? I am not entirely sure about the upper limit for vertices here). At that point it will crash on my computer due to lack of memory, but it may still work for others.
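
A sketch of that heuristic, erroring out before allocation the way the Array modifier does (the budget and all names here are illustrative, not actual Blender code):

  VOXEL_BUDGET = 2 ** 32  # illustrative cap, as suggested above

  def remesh_would_exceed_budget(bbox_dims, voxel_size):
      # Estimate the dense grid size from the bounding box and refuse
      # to remesh when it exceeds the budget, instead of crashing.
      nx, ny, nz = (max(1, round(d / voxel_size)) for d in bbox_dims)
      return nx * ny * nz > VOXEL_BUDGET

  # Default 2 m cube: 8e6 voxels at 0.01 (fine), 8e9 at 0.001 (over).
  print(remesh_would_exceed_budget((2.0, 2.0, 2.0), 0.01))   # False
  print(remesh_would_exceed_budget((2.0, 2.0, 2.0), 0.001))  # True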

@Henrik Dick (weasel) That sounds like a good heuristic. Since I'm not familiar with the algorithm used by the voxel remesher and am currently busy with my thesis, this should be looked at by @Pablo Dobarro (pablodp606) or other developers familiar with the memory requirements and runtime complexity.

Bastien Montagne (mont29) changed the subtype of this task from "Report" to "Bug". Jul 29 2020, 11:59 AM

It doesn't crash here, but the time it takes to compute is not convenient either.
Perhaps the modifier could raise warning messages, but it also seems like a good idea to restrict how low the value can be dragged with the mouse (see the patch below).
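
Note that RNA_def_property_ui_range sets the soft limits (its first two arguments are the soft minimum and maximum used while sliding), while RNA_def_property_range sets the hard limits for typed values. A quick way to inspect both from Blender's Python console:

  import bpy

  # soft_min/soft_max bound the slider; hard_min/hard_max bound typed input.
  prop = bpy.types.RemeshModifier.bl_rna.properties["voxel_size"]
  print(prop.soft_min, prop.hard_min)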

diff --git a/source/blender/makesrna/intern/rna_modifier.c b/source/blender/makesrna/intern/rna_modifier.c
index a891194550f..4c715bbb747 100644
--- a/source/blender/makesrna/intern/rna_modifier.c
+++ b/source/blender/makesrna/intern/rna_modifier.c
@@ -5557,7 +5557,7 @@ static void rna_def_modifier_remesh(BlenderRNA *brna)
   prop = RNA_def_property(srna, "voxel_size", PROP_FLOAT, PROP_DISTANCE);
   RNA_def_property_float_sdna(prop, NULL, "voxel_size");
   RNA_def_property_range(prop, 0.0001f, FLT_MAX);
-  RNA_def_property_ui_range(prop, 0.0001, 2, 0.1, 3);
+  RNA_def_property_ui_range(prop, 0.01, 2, 0.1, 3);
   RNA_def_property_ui_text(prop,
                            "Voxel Size",
                            "Size of the voxel in object space used for volume evaluation. Lower "
Henrik Dick (weasel) added a comment. Edited. Aug 6 2020, 3:26 PM

@Germano Cavalcante (mano-wii) The reason we more or less agreed that this is not the solution is that 0.0001 is quite reasonable when your bounding box is 0.02 instead of 2. There are lots of people (that I know) who don't follow the convention of using a unit system where objects are close to a size of 1. They would still want to use 0.0001.

If you want to implement a quick solution, doing it this way would be nice:

> @Robert Guetzkow (rjg) I just found out that the Array modifier has a limit at which it raises an error instead of crashing Blender: it limits the vertex count to 67 million. We could do the same for the voxel remesher, so that it aborts if it would generate more than, say, 2^32 voxels (computed from the bounding box volume). For a simple cube with 2000 voxels per dimension, that would be 24 million vertices (2000^2*6 = 24*10^6). That example still works on my 8 GiB computer. For more complex objects it would be more vertices, up to almost the voxel count (or some fraction of it? I am not entirely sure about the upper limit for vertices here). At that point it will crash on my computer due to lack of memory, but it may still work for others.

I am having the same issue on macOS Mojave with Blender 2.90.
It is not exactly the same, but out of nowhere Blender, and eventually the entire computer, hangs on simply adding the Remesh modifier, even with the object hidden. The standard voxel size of 0.1 m wasn't an issue yesterday, but now it doesn't work even after a reboot. It even crashes other programs when the Remesh modifier is added.

I have the same problem: Blender crashes while trying to apply a Remesh modifier to a fairly complex mesh. The problem first occurred after I cranked the regular Octree Depth up to around 9 and then switched to Voxel to see what it was. Now the modifier defaults to Voxel mode with a 0.1 m voxel size, which crashes the program every time.

I tried to uninstall Blender and install the latest 2.90.1, but some saved settings seem to have been left behind, because it was still stuck on the Voxel preset every time I applied the modifier to a new mesh.

I managed to work around it by applying the modifier in Edit Mode and then switching back to Sharp instead.

Remeshing a cube of 20,000 m x 20,000 m x 2,000 m with a voxel size of 0.1 crashes Blender after it devours all of my 32 GB of RAM, running the latest Blender release.