
Memory increase on duplicate and delete objects without limit
Closed, ArchivedPublic

Description

System Information
Operating system: Linux-5.3.0-24-generic-x86_64-with-debian-buster-sid 64 Bits
Graphics card: GeForce GT 630M/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 390.129

Blender Version
Broken: version: 2.81 (sub 16), branch: master, commit date: 2019-12-04 11:32, hash: rBf1aa4d18d49d
Worked: (optional)

Short description of error
I have a script that duplicates an object multiple times, which increases memory usage in a way I consider normal. The same script then deletes the duplicates from the scene, along with their orphaned meshes, but the memory is not released: every time I run the script again, memory usage keeps increasing without limit.

Exact steps for others to reproduce the error
Option 1:
Copy the script into the Python interpreter inside Blender and run it a few times, or

Option 2:
Use the add-on version of the script and press the button a few times.

With either of these the problem can be observed.

Script version:
https://pastebin.com/1f8QzY8z

Addon version:
https://pastebin.com/8d98PQwC

Event Timeline

This is the script, in case it's lost from pastebin :)

import bpy

ob = bpy.context.active_object
 
def duplicate_test_memory(repeat):
    for b in range(repeat):
        bpy.ops.object.duplicate()
    #
    bpy.ops.object.select_all(action='SELECT')
    ob.select_set(False)
    bpy.ops.object.delete(use_global=False)
    #
    for mesh in bpy.data.meshes:
        if mesh.users < 1:
            bpy.data.meshes.remove(mesh)
 
repeat = 100
 
if ob:
    for r in range(repeat):
        duplicate_test_memory(repeat)
else:
    bpy.ops.mesh.primitive_cube_add(size=2, enter_editmode=False, location=(0, 0, 0))
    ob = bpy.context.active_object
    for r in range(repeat):
        duplicate_test_memory(repeat)

I think this is rather important because we tend to use different scripts to scatter objects or to procedurally modify a scene, and it feels like a big memory leak.

I have run the script in Windows using the last master code and I didn't see any memory leak running in debug mode.

Here the VS2019 memory chart:


Doesn't your memory usage increase every time you run it?

No memory leaks; I ran it 4 times and the memory in VS is the same, but the Blender memory stat in the status bar increases.

I ran it in debug mode; it doesn't show any leak message and the VS memory chart stays flat.

Could there then be a problem with the stats rather than with the memory?

Not sure. I don't know how the memory stats work.

Stats in the status bar show the amount of memory that Blender requested from the system. The system monitor (and the one in VS) shows the memory that was allocated to Blender by the system.

Now, the system memory allocator might allocate more memory than is required at a specific moment, which makes subsequent allocations happen faster.

What you can do to help investigate the issue is to run Blender with the --debug-memory command line argument and watch the output of the "Memory Statistics" operator.
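For reference, a minimal way to do this from a terminal (assuming the `blender` binary is on your PATH) could look like:

```shell
# Start Blender with fully guarded memory allocation so per-block
# statistics are tracked; output goes to this terminal.
blender --debug-memory
```

Then, inside Blender, running the "Memory Statistics" operator (e.g. via the search menu) prints the statistics to the same terminal.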

@Sergey Sharybin (sergey) I have done what you said, and the only difference I see is this:

Test 1:

total memory len: 24.695 MB
peak memory len: 29.542 MB
slop memory len: 8.551 MB
 ITEMS TOTAL-MiB AVERAGE-KiB TYPE
    34 (   0.782    23.566) Chunk buffer

Test 2:

total memory len: 26.310 MB
peak memory len: 31.155 MB
slop memory len: 13.493 MB
 ITEMS TOTAL-MiB AVERAGE-KiB TYPE
    46 (   1.080    24.033) Chunk buffer

Test 3:

total memory len: 28.995 MB
peak memory len: 33.838 MB
slop memory len: 20.906 MB
 ITEMS TOTAL-MiB AVERAGE-KiB TYPE
    67 (   1.787    27.316) Chunk buffer

Looking in more detail, this memory is allocated by the undo system (writedata_do_write()), so maybe it is only the undo information.

Germano Cavalcante (mano-wii) changed the task status from Needs Triage to Needs Information from User.Jan 5 2020, 3:44 PM

Can you check whether the same happens if you reduce the number of undo steps?
Preferences -> System -> Undo Steps
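As a side note, that same preference can be toggled from Blender's Python console, which makes it easy to A/B test while the script runs (a sketch; `undo_steps` is exposed on the preferences API and this only takes effect inside a running Blender):

```python
import bpy  # only available inside Blender's embedded Python

# Temporarily disable the undo history to check whether the
# growing memory belongs to stored undo steps.
prefs = bpy.context.preferences
prefs.edit.undo_steps = 0  # default is 32; restore it afterwards
```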

@Germano Cavalcante (mano-wii) You are right. If you set the undo steps to 0, the memory increases at a very slow rate.

@zebus3d (ofuscado), thanks for the report.

This is not a bug; it is how memory is used by the undo system.
Your script calls bpy.ops within a loop, and for each call an undo step is added.