Animation render crashes when planes with transparency map are present #32618

Closed
opened 2012-09-21 14:56:28 +02:00 by stinky wizzleteet · 7 comments

Hi,

blender release r50791
ubuntu 12.04.1
geforce n560gtx-ti (2x)

I noticed that the new tile render crashes after five to ten frames when there are transparency maps present.
This behavior occurs when the objects with these maps are behind each other.

Please have a look at the attached files; the images show a number of planes with people mapped onto them. When these people are not overlapping, the animation does not crash. When they are overlapping, this particular animation crashes after five frames on my machine with the 'experimental' GPU feature set and after nine frames with the 'supported' one. (YMMV)

Also:

  • the number of overlapping transparency planes does not factor in; I built a crowd scene with these examples and it still crashes only after 10 frames.
  • non-tile-render builds do not exhibit this behavior.

A recurring clue about what goes wrong here is:
CUDA error: Invalid value in cuMemFree(cuda_device_ptr(mem.device_pointer))
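
(For illustration, a minimal standalone C sketch, not Blender code: cuMemFree returns CUDA_ERROR_INVALID_VALUE when handed a device pointer that is no longer a live allocation, for example after a double free, which is one plausible way a render loop could trip this error.)

/* Minimal sketch (not Blender code): a double free of a device pointer
 * typically makes the second cuMemFree return CUDA_ERROR_INVALID_VALUE (1).
 * Build with: gcc demo.c -lcuda  ("demo.c" is just a placeholder name) */
#include <stdio.h>
#include <cuda.h>

int main(void)
{
    CUdevice dev;
    CUcontext ctx;
    CUdeviceptr mem;

    cuInit(0);
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);
    cuMemAlloc(&mem, 1024);

    CUresult r1 = cuMemFree(mem);  /* valid free: CUDA_SUCCESS */
    CUresult r2 = cuMemFree(mem);  /* stale pointer: CUDA_ERROR_INVALID_VALUE */
    printf("first free: %d, second free: %d\n", (int)r1, (int)r2);

    cuCtxDestroy(ctx);
    return 0;
}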

But... after re-rendering this scene repeatedly, the following showed up (myblender is a symlink to the daily-compiled blender-du-jour):

*** glibc detected *** myblender: free(): invalid pointer: 0x00007f34c8211738 ***
Backtrace:
/lib/x86_64-linux-gnu/libc.so.6(+0x7e506)[0x7f34ec806506]
/usr/lib/tls/libnvidia-tls.so.304.43(+0x1cc1)[0x7f34ebe66cc1]
Memory map:
00400000-01cc6000 r-xp 00000000 08:01 2388337 /usr/src/blender-svn/build/bin/blender
01ec5000-01f11000 r--p 018c5000 08:01 2388337 /usr/src/blender-svn/build/bin/blender
01f11000-026fe000 rw-p 01911000 08:01 2388337 /usr/src/blender-svn/build/bin/blender
026fe000-027d0000 rw-p 00000000 00:00 0
0350c000-07888000 rw-p 00000000 00:00 0 [heap]
40b13000-40b15000 r-xs 00000000 08:01 1055155 /tmp/glhzQ6BS (deleted)
4191d000-4199b000 rw-p 00000000 00:00 0
200000000-1000000000 ---p 00000000 00:00 0
7f3474000000-7f347410b000 rw-p 00000000 00:00 0
7f347410b000-7f3478000000 ---p 00000000 00:00 0
7f347c000000-7f347c0fc000 rw-p 00000000 00:00 0
7f347c0fc000-7f3480000000 ---p 00000000 00:00 0
7f3480000000-7f34800fc000 rw-p 00000000 00:00 0
7f34800fc000-7f3484000000 ---p 00000000 00:00 0
7f3488000000-7f34893da000 rw-p 00000000 00:00 0
7f34893da000-7f348c000000 ---p 00000000 00:00 0
7f348c000000-7f348c0fc000 rw-p 00000000 00:00 0
7f348c0fc000-7f3490000000 ---p 00000000 00:00 0
7f3490000000-7f34900fc000 rw-p 00000000 00:00 0
7f34900fc000-7f3494000000 ---p 00000000 00:00 0
7f3494000000-7f34940fc000 rw-p 00000000 00:00 0
7f34940fc000-7f3498000000 ---p 00000000 00:00 0
7f3498000000-7f34980fc000 rw-p 00000000 00:00 0
7f34980fc000-7f349c000000 ---p 00000000 00:00 0
7f349c000000-7f349c0fc000 rw-p 00000000 00:00 0
7f349c0fc000-7f34a0000000 ---p 00000000 00:00 0
7f34a2f01000-7f34a2f02000 rw-p 00000000 00:00 0
7f34a2f02000-7f34a2f03000 ---p 00000000 00:00 0
7f34a2f03000-7f34a3703000 rwxp 00000000 00:00 0
7f34a3703000-7f34a3803000 rw-s 396c58000 00:05 12522 /dev/nvidia0
7f34a3803000-7f34a3903000 rw-s 3a2ec9000 00:05 12522 /dev/nvidia0
7f34a3903000-7f34a3d05000 rw-s 3a1930000 00:05 12522 /dev/nvidia0
7f34a3d05000-7f34a3f00000 rw-s 00000000 00:04 1163521 /dev/zero (deleted)
7f34a3f00000-7f34a4000000 rw-s 3cba41000 00:05 12522 /dev/nvidia0
7f34a4000000-7f34a5117000 rw-p 00000000 00:00 0
7f34a5117000-7f34a8000000 ---p 00000000 00:00 0
7f34a8000000-7f34a9117000 rw-p 00000000 00:00 0
7f34a9117000-7f34ac000000 ---p 00000000 00:00 0
7f34ac000000-7f34ad117000 rw-p 00000000 00:00 0
7f34ad117000-7f34b0000000 ---p 00000000 00:00 0
7f34b0000000-7f34b1118000 rw-p 00000000 00:00 0
7f34b1118000-7f34b4000000 ---p 00000000 00:00 0
7f34b4000000-7f34b5117000 rw-p 00000000 00:00 0
7f34b5117000-7f34b8000000 ---p 00000000 00:00 0
7f34b8000000-7f34b910d000 rw-p 00000000 00:00 0
7f34b910d000-7f34bc000000 ---p 00000000 00:00 0
7f34bc000000-7f34bd118000 rw-p 00000000 00:00 0
7f34bd118000-7f34c0000000 ---p 00000000 00:00 0
7f34c00f4000-7f34c01f4000 rw-s 00000000 00:04 1164568 /dev/zero (deleted)
7f34c01f4000-7f34c02f4000 rw-s 369409000 00:05 12522 /dev/nvidia0
7f34c02f4000-7f34c06f6000 rw-s 3694d6000 00:05 12522 /dev/nvidia0
7f34c06f6000-7f34c06f7000 ---p 00000000 00:00 0
7f34c06f7000-7f34c0ef7000 rwxp 00000000 00:00 0
7f34c0ef7000-7f34c12f9000 rw-s 3b9cd6000 00:05 13763 /dev/nvidia1
7f34c12f9000-7f34c16fb000 rw-s 3a19e1000 00:05 13763 /dev/nvidia1
7f34c16fb000-7f34c16fc000 ---p 00000000 00:00 0
7f34c16fc000-7f34c1efc000 rwxp 00000000 00:00 0
7f34c1efc000-7f34c1ffc000 rw-s 00000000 00:04 1166623 /dev/zero (deleted)
7f34c1ffc000-7f34c1ffd000 ---p 00000000 00:00 0
7f34c1ffd000-7f34c27fd000 rwxp 00000000 00:00 0
7f34c27fd000-7f34c27fe000 ---p 00000000 00:00 0
7f34c27fe000-7f34c2ffe000 rwxp 00000000 00:00 0
7f34c2ffe000-7f34c2fff000 ---p 00000000 00:00 0
7f34c2fff000-7f34c37ff000 rwxp 00000000 00:00 0
7f34c37ff000-7f34c3800000 ---p 00000000 00:00 0
7f34c3800000-7f34c4000000 rwxp 00000000 00:00 0
7f34c4000000-7f34c5117000 rw-p 00000000 00:00 0
7f34c5117000-7f34c8000000 ---p 00000000 00:00 0
7f34c8000000-7f34c8392000 rw-p 00000000 00:00 0
7f34c8392000-7f34cc000000 ---p 00000000 00:00 0
7f34cc0b6000-7f34cc1b6000 rw-s 00000000 00:04 1164567 /dev/zero (deleted)
7f34cc1b6000-7f34cc2b6000 rw-s 00000000 00:04 1166629 /dev/zero (deleted)
7f34cc2b6000-7f34cc3b6000 rw-s 33d31d000 00:05 13763 /dev/nvidia1
7f34cc3b6000-7f34cc3b7000 ---p 00000000 00:00 0
7f34cc3b7000-7f34ccbb7000 rwxp 00000000 00:00 0
7f34ccbb7000-7f34ccbb8000 ---p 00000000 00:00 0
7f34ccbb8000-7f34cd3b8000 rwxp 00000000 00:00 0
7f34cd3b8000-7f34cd3b9000 ---p 00000000 00:00 0
7f34cd3b9000-7f34cdbb9000 rwxp 00000000 00:00 0
7f34cdbb9000-7f34cdbba000 ---p 00000000 00:00 0
7f34cdbba000-7f34ce3ba000 rwxp 00000000 00:00 0
7f34ce4b5000-7f34ce5b5000 rw-s 366edc000 00:05 13763 /dev/nvidia1
7f34ce5b5000-7f34ced9f000 rw-s 00000000 00:04 1166621 /dev/zero (deleted)
7f34ced9f000-7f34ceda0000 ---p 00000000 00:00 0
7f34ceda0000-7f34cf5a0000 rwxp 00000000 00:00 0
7f34cf5a0000-7f34cfce9000 r-xp 00000000 08:01 2234807 /usr/lib/libnvidia-opencl.so.304.43
7f34cfce9000-7f34cfee9000 ---p 00749000 08:01 2234807 /usr/lib/libnvidia-opencl.so.304.43
7f34cfee9000-7f34cffda000 rw-p 00749000 08:01 2234807 /usr/lib/libnvidia-opencl.so.304.43
7f34cffda000-7f34d0000000 rw-p 00000000 00:00 0
7f34d0000000-7f34d0021000 rw-p 00000000 00:00 0
7f34d0021000-7f34d4000000 ---p 00000000 00:00 0
7f34d4060000-7f34d4160000 rw-s 369435000 00:05 13763 /dev/nvidia1
7f34d4160000-7f34d435b000 rw-s 00000000 00:04 1166622 /dev/zero (deleted)
7f34d435b000-7f34d455b000 rw-s 367200000 00:05 12522 /dev/nvidia0
7f34d455b000-7f34d475c000 rw-p 00000000 00:00 0
7f34d475d000-7f34d485d000 rw-s 00000000 00:04 1166628 /dev/zero (deleted)
7f34d485d000-7f34d495d000 rw-s 35c261000 00:05 12522 /dev/nvidia0
7f34d495d000-7f34d4961000 r-xp 00000000 08:01 2234808 /usr/lib/libOpenCL.so.1.0.0
7f34d4961000-7f34d4b61000 ---p 00004000 08:01 2234808 /usr/lib/libOpenCL.so.1.0.0
7f34d4b61000-7f34d4b62000 rw-p 00004000 08:01 2234808 /usr/lib/libOpenCL.so.1.0.0
7f34d4b62000-7f34d540b000 r-xp 00000000 08:01 2234806 /usr/lib/libcuda.so.304.43
7f34d540b000-7f34d560a000 ---p 008a9000 08:01 2234806 /usr/lib/libcuda.so.304.43
Aborted (core dumped)

Changed status to: 'Open'

Hi,
the blend file you posted does not contain an animation? The planes do not overlap after opening.
I manually overlapped all 3 of them and rendered with the GPU (CUDA, nvidia 540M) and saw neither issues nor a crash (rendered 15 frames).

Please also test only with official builds from builder.blender.org.
Also, please test whether this happens only with 2 GPUs or also when you use just 1 for rendering.

Attached file crashes here with such a backtrace: http://www.pasteall.org/35521

Linux 64bit, NVidia 560ti, drivers 304.43 and CUDA toolkit 4.2.9.

Hi,

The file does not contain anything animated, but when rendering it as an animation it crashes. (By the way, the two overlapping people are on another layer.)

Also:
I too have NVIDIA 560ti's. At another location I tested with a Quadro 4000 card and saw no crashes.

I have re-tested the file with different blender builds and found that the crashing behavior is very erratic. Also, my first hunch, that it was linked to the tile render, proved false; non-tile-render builds fail as well.

Linux 64-bit, 2x nvidia 560ti, driver devdriver 295.41
Blender builds in order of testing:
r50372 (no tile render): no crash
r50483 (tile render): no crash
r50712 (tile render): crash at frame 42

  • so far you might think it has to do with tile rendering, but:
    r50372 (no tile render): crash at frame 11
    r50483 (tile render): crash at frame 8
    r50483 (tile render): no crash (reached frame 62 before stopping manually)

r50872 (today): crash at frame 5
r50872 (today): turn on Static BVH, Use Spatial Splits and Cache BVH: crash immediately
r50872 (today): turn on Static BVH, Use Spatial Splits and Cache BVH, turn on Save Buffers: crash at frame 12
r50872 (today): turn on Static BVH, Use Spatial Splits and Cache BVH, turn on Save Buffers, switch to "supported GPU": crash at frame 42

So, there, I now officially have No Clue about what's going on here.
Somewhere along the line CUDA gets borked when dealing with transparency rays.
Even though the render does not change a single pixel, the animation crashes at an arbitrary point, and some settings and some builds accelerate this behavior.
Seeing how the render itself does not change a single pixel, I would think the video card does not get reinitialized properly between renders.

Can I somehow test for this behavior?

thx wzzl

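(A crude way to probe the "card not reinitialized between renders" idea outside Blender, sketched under the assumption that the CUDA driver API and a single GPU are available: loop per-frame-sized device allocations and frees and watch whether the error codes change partway through, the way the renders above only start failing after N frames.)

/* Stress sketch (not Blender code): mimic per-frame device alloc/free
 * cycles; if driver or context state degraded across "frames", cuMemAlloc
 * or cuMemFree would start returning errors partway through the loop. */
#include <stdio.h>
#include <cuda.h>

int main(void)
{
    CUdevice dev;
    CUcontext ctx;

    cuInit(0);
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);

    for (int frame = 0; frame < 1000; frame++) {
        CUdeviceptr mem;
        CUresult r = cuMemAlloc(&mem, (size_t)64 << 20);  /* ~64 MB "frame" */
        if (r != CUDA_SUCCESS) {
            fprintf(stderr, "frame %d: cuMemAlloc failed: %d\n", frame, (int)r);
            break;
        }
        r = cuMemFree(mem);
        if (r != CUDA_SUCCESS) {
            fprintf(stderr, "frame %d: cuMemFree failed: %d\n", frame, (int)r);
            break;
        }
    }

    cuCtxDestroy(ctx);
    return 0;
}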

UPDATE:

I forgot to test with a single 560ti:
r50872: crash at frame 22
again, r50872:
Saved: /tmp/0021.png Time: 00:06.18
CUDA error: Invalid value in cuMemFree(cuda_device_ptr(mem.device_pointer))
Aborted (core dumped)

No, not noice, not noice at all.

Fix in svn, thanks for the report. It was a threading issue; they tend to be a bit random.
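
(The report does not show which code path raced, but as a general sketch of this class of bug: two render threads freeing through the same shared device pointer can intermittently double-free, which would match the sporadic cuMemFree failures above. A common remedy is to serialize the free and clear the pointer under a mutex; device_mem_free below is a hypothetical helper for illustration, not the actual svn fix.)

/* Hypothetical helper (not the actual fix): make freeing a shared device
 * pointer safe to call from multiple threads by serializing the free and
 * nulling the pointer, so any second caller becomes a harmless no-op. */
#include <pthread.h>
#include <cuda.h>

static pthread_mutex_t mem_lock = PTHREAD_MUTEX_INITIALIZER;

void device_mem_free(CUdeviceptr *mem)
{
    pthread_mutex_lock(&mem_lock);
    if (*mem) {
        cuMemFree(*mem);  /* only the first caller reaches this */
        *mem = 0;
    }
    pthread_mutex_unlock(&mem_lock);
}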

Changed status from 'Open' to: 'Resolved'
