
blend file is huge, verts count only 7000, lag is huge
Closed, ResolvedPublic


System Information
windows 8 64 bit, gtx 750ti, intel hd graphics, core i3 540

Blender Version
Broken: 2.71
Worked: (optional)

Short description of error
I was modelling a front panel part for a CD player when, at some point, the blend file skyrocketed in size. It is very laggy and unresponsive. RAM usage fills everything the system has, and inside Blender it reports using even more; I have seen over 10 GB of memory usage. It is all caused by one mesh, which I appended into a single file. Please take a look.

The blend file is huge (almost 1 GB), yet there is nothing in it but a mesh with 7000 verts.

Exact steps for others to reproduce the error

Here is the blend file link:

Event Timeline

Denis (dns_di) added a project: BF Blender.
Denis (dns_di) set Type to Bug.
Denis (dns_di) added a subscriber: Denis (dns_di).
Denis (dns_di) created this task.
Denis (dns_di) raised the priority of this task from to Needs Triage by Developer.

Hi, tested this file on:

Opensuse 13.1/64
Intel i5 3770K
GTX 760 4 GB (Display)
GTX 560Ti 1.28 GB 448 Cores
Driver 343.19

Blender 2.72

There are 2300 UV maps in this file; see the Object Data tab.
There may be a way to delete them all in one go, but I don't know it.

Cheers, mib

Thanks for taking a look at the file, lol. Any idea why there are 2300 UVs in there? A bug, or ...?

You can manually assign a maximum of 8 UV maps to a mesh. This number can go higher if you join two meshes, each with its own set of uniquely named UV maps, but to get to 2300 of them it must be a bug.
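To see why the count can balloon, here is a minimal sketch (plain Python; the function name is my own, not Blender's API) of what "by name" layer merging does when two meshes are joined: the result keeps one layer per unique name, so disjointly named layers simply add up across repeated joins.

```python
def merge_layers_by_name(a, b):
    """Hypothetical model of 'by name' UV-layer merging on join:
    the joined mesh keeps one layer for each unique layer name."""
    merged = list(a)
    for name in b:
        if name not in merged:
            merged.append(name)
    return merged

# Disjoint names accumulate, so repeated joins can snowball the layer count.
print(merge_layers_by_name(["UVMap"], ["UVMap.001", "UVMap.002"]))
# → ['UVMap', 'UVMap.001', 'UVMap.002']
```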

Did you use any UV addons on the mesh?

Hi lamoot,

No, I did not use any add-ons apart from the layer management add-on and the "work spent on project" add-on. But as you can see, the mesh is quite complex; that is because I used the boolean modifier a lot! I joined meshes, made holes, etc. But how I managed to get 2300 uniquely named UVs in those meshes is beyond my understanding; I hardly used the boolean modifier more than 100 times.

BUT anyway, I found a way to remove all those UVs (thanks to mib2berlin for pointing it out): I used a Python script. It took a very long time, and Blender was totally unresponsive during the process; lol, I only knew it was deleting them because in the resource monitor I could see memory usage dropping slowly. The file size after that procedure dropped from 1 GB to 1 MB :). The link to the script, if somebody needs it, is below.
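For reference, a cleanup along those lines can be sketched as below. This is my own minimal version, not the linked script; the helper name and the `keep` parameter are assumptions. In Blender 2.7x you would run it inside Blender's Python console and pass `mesh.uv_textures`, whose `remove()` takes the layer object.

```python
def remove_extra_layers(layers, keep=0):
    """Remove entries from a collection that supports len(), indexing,
    and remove(item), keeping only the first `keep` entries."""
    while len(layers) > keep:
        layers.remove(layers[-1])

# Hypothetical usage inside Blender (not run here):
#   import bpy
#   me = bpy.context.object.data
#   remove_extra_layers(me.uv_textures)  # strips every UV map
```

Deleting from the end avoids re-indexing the whole collection on each pass, though with thousands of layers it can still take a while, as described above.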

Bastien Montagne (mont29) triaged this task as Normal priority.

Yeah, getting thousands of UVs on a single mesh should not be possible… Looks like we are missing some checks in merging operations here :/

Campbell, is our max number of 8 UV maps strictly enforced? If so, I'd suggest adding an option to the cdlayers merging funcs to specify a maximum number of layers in the result, and e.g. falling back to 'by position' merging when 'by name' would generate too many layers (with a nice warning to the user)?
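The proposal above could look roughly like this; a sketch under my own assumptions, where the function and constant names are hypothetical and not Blender's actual cdlayers API:

```python
MAX_UV_LAYERS = 8  # Blender's per-mesh UV map limit

def merge_with_cap(a, b, cap=MAX_UV_LAYERS):
    """Sketch of the suggested fallback: merge layer names 'by name',
    but if that would exceed the cap, fall back to 'by position'
    (pair layers up by index, so the result has max(len(a), len(b)))."""
    by_name = list(a) + [n for n in b if n not in a]
    if len(by_name) <= cap:
        return by_name
    # 'by position' fallback; a real implementation would also warn the user.
    return list(a) if len(a) >= len(b) else list(b)
```

With a cap like this, joining two meshes that each carry 6 uniquely named maps would yield 6 layers (merged by position) instead of 12, stopping the snowballing seen in this report.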

OK, will see how to handle those cases in the best way then.