
Crash after loading big image file in compositor
Closed, ResolvedPublic

Description

System Information
Debian AMD64, 2 GiB RAM, without OpenCL

Blender Version
Broken: current build from buildbot; I noticed this bug in the first half of December
Worked: 2.69 Release

Short description of error
Loading a big image file (e.g. 4000x4000) in the Image node in the compositor causes a segfault.

Exact steps for others to reproduce the error

Hmm, drag and drop doesn't seem to work, so I can't upload the blend file and backtrace.
Just find a huge image, open it in an Image node in the compositor, and expect a crash.
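
For anyone scripting the reproduction, here is a minimal sketch using Blender's Python API; the image path is a placeholder, and the node names assume the default compositing setup.

import bpy

# Placeholder path: point this at any sufficiently large image (e.g. 4000x4000).
IMAGE_PATH = "/tmp/test_image.png"

scene = bpy.context.scene
scene.use_nodes = True                      # enable the compositing node tree
tree = scene.node_tree

# Load the big image and feed it through an Image node.
image = bpy.data.images.load(IMAGE_PATH)
image_node = tree.nodes.new(type="CompositorNodeImage")
image_node.image = image

# Connect it to the Composite output so the result is actually displayed.
composite = tree.nodes.get("Composite") or tree.nodes.new("CompositorNodeComposite")
tree.links.new(image_node.outputs["Image"], composite.inputs["Image"])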

Event Timeline

Robert M (robertm) raised the priority of this task from to Needs Triage by Developer.
Robert M (robertm) updated the task description. (Show Details)
Robert M (robertm) set Type to Bug.

Don't see the crash here on Debian Wheezy.

You can upload files without drag-and-drop by changing Action from Comment to Upload File. Having a file which crashes for you will help a lot.

Still couldn't reproduce the issue here. Maybe others will have more luck.

Also, are you sure you're not running out of memory? Are there any error messages printed in the console before blender crashes?

I'm pretty sure I'm not running out of memory (my swap partition would cause a significant slowdown otherwise).
Instead, Blender crashes immediately, without touching the hard disk.
And 2.69 doesn't have any problems opening the test image.

Output from the console:
read blend: /home/amd5200/BUG/open_big_image/open_big_image.blend
Writing: /tmp/open_big_image.crash.txt
Writing: /tmp/open_big_image.crash.txt
Speicherzugriffsfehler (segmentation fault)

No crash on Windows 7 x64 nor Mac OS X 10.9, both with 8 GB RAM and the latest master.

I tested it again on another Linux computer with 8 GB RAM.
To achieve the same crash I needed to resize the image to 8000x8000.
Thomas, could you please test it again with test_image8000.png?

Tested again: it's not a problem with image loading, but with image viewing!
When I enable Backdrop or show the Viewer node in the UV/Image Editor, Blender consumes a very large amount of RAM.
Another thing I found out is that when I change the "Image Draw Method" to GLSL, RAM consumption improves dramatically, and I can handle 8K images on my 2 GB RAM system
(though the "2D Texture" draw method seems to be a bit faster on my system).
But the current version still seems to consume more RAM than 2.69.
So it's not a real bug, but maybe improvements to RAM usage can be made some time in the future.

On second thought, it might be a bug.
Blender 2.69 consumes 26 percent of my 2 GB with a 4000x4000 image.
So why should the current development version crash?

It seems that the "Image Draw Method" in File / User Preferences / System plays a significant role here.
GLSL is probably the most stable method (no crash so far), but it's slower than the other methods.
"DrawPixels" and "2D Texture" both cause this segfault.

I would actually expect it to use less memory than 2.69. I might have made a mistake somewhere; I'll double-check the memory usage.

Sergey Sharybin (sergey) lowered the priority of this task from Needs Triage by Developer to Normal.Jan 5 2014, 3:59 PM

Ran some tests and don't see any unexpected behavior with the latest builds. I was using test_image8000.png and compared the 2.69 release with the current git version.

Blender 2.69 release:

  • total memory len: 1475.159 MB
  • peak memory len: 2695.881 MB
  • slop memory len: 11.307 MB

Blender GIT version:

  • total memory len: 1475.800 MB
  • peak memory len: 1751.225 MB
  • slop memory len: 6.483 MB

As you can see, memory consumption has actually been reduced dramatically. So please download a fresh build from http://builder.blender.org/download/ and see whether you still have the issue.

You can type "Memory Statistics" in the spacebar search menu to get memory statistics printed to the terminal.
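
The same report can also be triggered from a script or the Python console:

import bpy

# Prints Blender's memory statistics to the terminal, same as the search-menu entry.
bpy.ops.wm.memory_statistics()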

Ah, hold on, I could reproduce the crash when starting Blender with --factory-startup. Something weird is going on here; looking into it...

Fixed the issue now. Please note that you might want to increase the cache limit for images to make more of them fit into memory. Without this, performance might not be as good.

P.S. We did indeed implement a cache size limit for loaded images, so the cache now holds as many images as possible within the cache limit set in the user preferences.
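
If it helps, a sketch for raising that limit from a script; this assumes the image cache honours the same "Memory Cache Limit" preference (in megabytes) as the sequencer/clip caches:

import bpy

# Raise the cache limit so more loaded images stay resident (value is in MB).
bpy.context.user_preferences.system.memory_cache_limit = 4096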

Thank you for the fix, Sergey!