
Blender crashes PC. Modifier Remesh Voxel Size issue.
Confirmed, Normal, Public

Description

System Information
Operating system: Windows-10-10.0.17763-SP0 64 Bits
Graphics card: GeForce GTX 1660/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 432.00
Processor: AMD Ryzen 7 3800x 8-Core Processor 3.90 GHz

Blender Version
Broken: version: 2.90.0 Alpha, branch: master, commit date: 2020-06-12 17:01, hash: rBfd8d245e6a80
(Also with Blender 2.83 BETA)

Short description of error
Selecting the "Remesh" modifier, selected "Voxel", Voxel size below "0.01m" = PC Crash. Needed to restart the PC.

Exact steps for others to reproduce the error

  1. Start Blender.
  2. Keep the default Cube and add the "Remesh" modifier.
  3. Set the "Voxel Size" below 0.01 m.
  4. Observe whether it crashes.

(I found this in my own Blender project, but then tested it in a default scene. The same thing happened.)

(Why did I set the voxel size below 0.01 m?
It happens accidentally: the value might be at something like 0.42 m, but when I hold left click and slide to the left, I can accidentally go below 0.01 m.)

Good Luck!

Event Timeline

Robert Guetzkow (rjg) changed the task status from Needs Triage to Needs Information from User. Sun, Jun 14, 11:33 PM

If your operating system crashes, the issue seems to be outside of Blender. Please update the graphics driver and check whether the crash still occurs. Be aware that remeshing is very memory intensive; you may be running out of memory (see similar reports T77664 and T77556).

Ankit (ankitm) renamed this task from --{ Blender crashes PC. Modifier Remesh Voxel Size issue. }-- to Blender crashes PC. Modifier Remesh Voxel Size issue. Mon, Jun 15, 8:20 AM
Ankit (ankitm) changed the task status from Needs Information from User to Confirmed. Mon, Jun 15, 8:38 AM
Ankit (ankitm) added a subscriber: Ankit (ankitm).


Memory usage goes wild when reducing the voxel size to just under 0.01 (using the "reduce" button beside the slider).

Ankit (ankitm) added a comment. Edited Mon, Jun 15, 8:39 AM

An easy fix would be to put a soft limit of 0.01 in the slider UI itself and let people manually type in whatever they want.
Not a serious comment. Sorry.

Currently there is only a soft limit for the voxel size at 0.0001, but you can also type in something as low as 0.00002. A value of 0.001 on the default cube still works on my computer, although it takes very long to compute; but if I scale the cube down in edit mode by 0.01, then even a voxel size of 0.0001 works. I think the solution should not be a hard limit of 0.01 in the UI.

If there should be a limit, it should be calculated at the modifier's runtime. The modifier would check the bounding box size and make sure that memory usage stays under some reasonable amount. If the modifier did clamp the voxel size, it would give the user an error message indicating that as well. Keep in mind this is about the modifier version of voxel remesh, and it makes no sense to have an extraordinarily slow modifier in the stack. So for very high resolutions the normal remesh feature in the mesh data properties would always be preferred.

IMO the optimal solution would be logarithmic/nonlinear sliders that work similarly to the brush radius slider in Krita. This way you would not jump into the dangerous region with a single click, but rather descend smoothly.
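
As a sketch of what such a nonlinear mapping could look like, here is a minimal Python example (hypothetical; this is not Krita's or Blender's actual slider code):

  # Hypothetical logarithmic slider: the drag position t in [0, 1] maps
  # exponentially between the soft limits, so each step multiplies the value
  # by a constant factor instead of subtracting a constant amount.
  def log_slider(t, soft_min=0.0001, soft_max=2.0):
      return soft_min * (soft_max / soft_min) ** t

  for t in (0.0, 0.25, 0.5, 0.75, 1.0):
      print(f"t={t:.2f} -> {log_slider(t):.6g}")

With this mapping, dragging left still reaches 0.0001, but only at the far end of the slider, instead of one step below 0.01.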

@Ankit (ankitm) I can't reproduce a crash on my system with 0.001 on a subdivided Suzanne; it only consumes a large amount of memory. That in itself is not surprising, since the smaller the voxel, the more memory is needed, and the memory consumption grows cubically. The issue is that the soft_min is 0.0001 (or 9.999999747378752e-05 to be precise), which means that sliding over the value can easily result in OOM. I wouldn't change the hard limit, since someone with more memory, or with a use case like the one @Henrik Dick (weasel) described, may want to use really small values and should be able to type them in manually. Edit: I would change the hard limit too, in order to avoid the arithmetic error reported in T77878.
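
To put numbers on the cubic growth, a quick Python estimate (a dense-grid cell count for the default 2 m cube; OpenVDB actually stores a sparse narrow band, so this shows the trend rather than the exact footprint):

  # Cells in a dense voxel grid covering the default 2 m cube, per slider step.
  side = 2.0
  for voxel_size in (0.1, 0.01, 0.001, 0.0001):
      cells = (side / voxel_size) ** 3
      print(f"{voxel_size:g} m -> {cells:.3g} cells")

Each factor-of-ten step in voxel size multiplies the cell count by a thousand.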

This comment was removed by Robert Guetzkow (rjg).


I agree with Henrik. I'm just finishing sculpting quite a hefty model in Blender. A hard limit of 0.01 is not a solution, since you definitely use such small values when you scale a model down, or when it has small details but you still need that geometric density to sculpt; it all depends on the type of the model and its scale, and a hard limit is a hacky solution. To be honest, I would argue that your computer running out of memory because you created geometry that is too dense is not a bug; that is your own fault.

Having no notification that the geometry might get too dense and will most likely cause a crash (as ZBrush shows once you subdivide a model too much) should be considered more of a UI paper cut, I guess. You can crash any sculpting software like that if you don't know what you're doing and which values are viable. You can easily crash ZBrush the same way if you overdo the Dynamesh values or overly subdivide your model.

@Robert Guetzkow (rjg) For me, bpy.context.object.modifiers["Remesh"].bl_rna.properties["voxel_size"].soft_min - 0.0001 is basically giving zero, so the soft_min matches. The little error is due to a float not being able to store that number exactly (see https://www.h-schmidt.net/FloatConverter/IEEE754.html).

@Henrik Dick (weasel) You're right, I misread it and thought there was an erroneous extra zero digit. The actual deviation is obviously coming from floating-point precision.
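
For illustration, the rounding can be reproduced in plain Python by forcing 0.0001 through a 32-bit float, the precision used for the property (a standalone sketch, independent of Blender):

  import struct

  # Round-trip 0.0001 through a 32-bit float.
  f32 = struct.unpack('f', struct.pack('f', 0.0001))[0]
  print(f32)           # 9.999999747378752e-05, the reported soft_min
  print(f32 - 0.0001)  # ~ -2.5e-12, effectively zero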

Ankit (ankitm) added a comment. Edited Mon, Jun 15, 12:07 PM

Edited the comment up there (T77868#953819). It felt surprising to me that going from 0.03 > 0.02 > 0.01 > (the number here) created such a huge increase. What I had was the default cube with nothing else.

I can't reproduce a crash on my system with 0.001 on a subdivided Suzanne; it only consumes a large amount of memory.

Just to clarify: is anyone opposed to confirming the report?

@Ankit (ankitm) No, it definitely crashes when decreasing from 0.01 with the left arrow, which goes to the soft_min at 0.0001; but only on the default cube and objects of similar size or bigger.

T77878 is related to the decision of what the hard limit should be.

@Henrik Dick (weasel) What is a suitable soft_min when the required memory is relative to the size of the object? Remeshing a 50 m cube with a voxel size of 0.01 will have the same effect, allocating about 27 GB. In a perfect world the modifier would base the limits on the available resources. The next best option would be dynamic limits based on the object's bounding box. I'm not sure there's a good solution that fits well with Blender's current concept of limits on properties.
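
A minimal sketch of what such a bounding-box-based limit could look like (hypothetical: clamp_voxel_size, MAX_SURFACE_VOXELS, and the surface-scaling assumption are illustrative, not Blender's actual modifier code):

  import bpy

  # Hypothetical dynamic lower limit for the voxel size, derived from the
  # object's bounding box. Assumes memory scales with the number of
  # narrow-band surface voxels, roughly (longest_side / voxel_size)^2.
  MAX_SURFACE_VOXELS = 1e8  # assumed budget, not an actual Blender constant

  def clamp_voxel_size(obj, requested):
      longest = max(obj.dimensions)
      smallest_safe = longest / MAX_SURFACE_VOXELS ** 0.5
      if requested < smallest_safe:
          # A real modifier would report this via its error/warning message.
          print(f"voxel size clamped to {smallest_safe:.6g}")
          return smallest_safe
      return requested

  print(clamp_voxel_size(bpy.context.object, 0.0001))  # 0.0002 for the 2 m cube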

If I'm not mistaken, T77878 is a duplicate of T72747, fixed in rB6a54969cf15e861e13c654c230a35db8e6f377f9. Never mind, that one is for the voxel remesher, but it's generally related. I'm making too many mistakes this morning.

@Robert Guetzkow (rjg) (Re-quoting my comment above about the soft limit, runtime-calculated limits, and nonlinear sliders.)

@Henrik Dick (weasel) I definitely shouldn't be allowed on the internet without caffeine. Sorry for missing so many obvious things today. Do you know of any modifier that currently implements dynamic limits? I don't think that's currently the case, but given how many things I was already wrong about today ...

@Robert Guetzkow (rjg) I don't think there is a modifier with dynamic limits yet. The general assumption is that you know which values you can enter without crashing Blender. There are lots of modifiers that can crash Blender in a similar way (array, screw, bevel, subsurface). All of those other modifiers, including the other modes of remesh, are designed in a way that performance degrades slowly when clicking or dragging the slider, so you can intuitively see where the limits of your computer are. The voxel size is the only field that jumps to a crash very rapidly (literally one click). That's exactly what I was getting at in the last paragraph of my comment T77868#953889, when I was talking about nonlinear sliders.

What should also be considered is that, if your object is very big, adding the modifier will already crash Blender. In that case an error message raised by a dynamic limit would be a lifesaver.

@Henrik Dick (weasel) The issue with dynamic limits based on available memory is that it is not easy to know how much is actually available, due to virtual memory, swap, and other programs (de)allocating alongside Blender. I don't think this is the right approach. As you've said, there are already plenty of ways to run out of memory; it just shouldn't be this easy to do unintentionally.

The different scaling of the increments is a nice idea. What I meant in my comment was that, instead of an absolute voxel size, one could use a factor from which the voxel size is determined based on the object's bounding box. This would solve the problem, including the case where the modifier is added to a large object.
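
For illustration, such a relative voxel size might look like this (hypothetical; voxel_size_from_factor is not how the modifier currently works, and see the objection below):

  import bpy

  # Hypothetical relative voxel size: a unitless factor scaled by the
  # object's bounding box, so huge objects don't explode the voxel count.
  def voxel_size_from_factor(obj, factor=0.005):
      return factor * max(obj.dimensions)

  print(voxel_size_from_factor(bpy.context.object))  # 0.01 on the default 2 m cube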

@Robert Guetzkow (rjg) No, please not! The bounding box size changes all the time when using different modifier combinations or shape keys, and I don't like the effect the other remesh modes give me. The other remesh modes do exactly that: they have an octree depth and a scale to scale and subdivide the bounding box. The issue is that this would render the new voxel remesh as useless as the others for metaball-like animations. I want the grid to stay where I specified it, with a certain size that I can easily control. I don't think there is a reasonable way around nonlinear sliders.

Also, nonlinear sliders, once implemented, could be used everywhere in Blender (e.g. all merge thresholds, the voxel size for remesh in the mesh properties and in sculpt mode, most scale sliders, the radius in painting and sculpt mode, ...), so they would be a very useful addition to Blender in general.

@Henrik Dick (weasel) I see, that's a valid point. I like the nonlinear slider idea. Perhaps there is an even better solution for this particular modifier, but I'm not familiar with the implementation of the remesher and OpenVDB.

@Pablo Dobarro (pablodp606) since you've added the voxel mode to the modifier, perhaps you could take a look at this?

Robert Guetzkow (rjg) (I can't tag you; whenever I try, it tags someone else.) Regarding the first comment you sent: I updated my graphics driver and tried again. It crashed, and I needed to restart my PC. I have 16 GB of RAM, so I don't think that is the problem, because in the other direction I can set values higher than about 20000000000 m. Also, going from 0.1 down to 0.01 is smooth and works fine. But as soon as I go below 0.01, it just freezes and I have to force-restart my PC. So even with an updated graphics driver and recommended hardware/memory specs, it still goes down.

@RedAssassin (RedAssassin44) As we've discussed in the comments above, going below 0.01 can very easily exceed the available memory. At 0.0001, which is one step after 0.01, it takes approx. 27 GB with the default Cube. Depending on the available swap space, this may result in Blender either crashing or being frozen for a long time. Your OS may also react very slowly until Blender either finishes processing or is terminated for running out of memory. Note that the OS freezing or reacting slowly is not the same as a crash. A crash means that a program terminates unexpectedly; it could be displaying a blue screen, or the computer simply turns off or restarts on its own.
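
As a rough back-of-the-envelope check of that 27 GB figure (assuming a sparse narrow band that stores roughly the surface voxels, and an assumed per-voxel cost of 12 bytes; neither assumption is taken from Blender's or OpenVDB's actual internals):

  # Surface voxels of the default 2 m cube at a 0.0001 m voxel size.
  side = 2.0
  voxel = 0.0001
  faces = 6

  surface_voxels = faces * (side / voxel) ** 2     # 2.4e9 voxels
  bytes_per_voxel = 12                             # assumed narrow-band cost
  print(surface_voxels * bytes_per_voxel / 2**30)  # ~26.8 GiB

The order of magnitude matches the observation, which is why one slider step below 0.01 is enough to exhaust 16 GB of RAM.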