Generated Grid image textures deadlock background Cycles renders. #85287

Closed
opened 2021-02-01 17:43:36 +01:00 by Alexander Gavrilov · 12 comments

System Information

Operating system: Fedora 32 x86_64
Graphics card: GeForce GTX 1060

Blender Version

Broken:

  • 2.92 Beta (a9b53daf23ba)
  • 2.92.0 Beta, branch: master, commit date: 2021-01-28 23:06, hash: 213f8294b5
  • 2.91.2, branch: master, commit date: 2021-01-19 16:15, hash: 5be9ef4177
  • 2.90.1, branch: master, commit date: 2020-09-23 06:43, hash: 3e85bb34d0

Worked:

  • 2.83.x

Short description of error

Background Cycles renders (to update the material preview, I think) randomly deadlock if Generated textures of the Grid type are present. This can happen when editing materials in the Shading workspace.

Exact steps for others to reproduce the error

Open the attached file and start randomly dragging any of the sliders. Blender should freeze after a short while; there is even a small chance it can happen immediately after opening the file.

test-image-deadlock.blend

Stack traces captured with a local debug build: one thread holds the image manager mutex and is stuck yield-looping in the TBB scheduler code waiting for its tasks to complete, while others are waiting for the mutex.

deadlock.txt

Author
Member

Added subscriber: @angavrilov

Author
Member

Added subscriber: @brecht


Changed status from 'Needs Triage' to: 'Confirmed'


Added subscriber: @rlneumiller


Also broken on Windows 10 1909 with Blender version: 2.90.0, branch: master, commit date: 2020-08-31 11:26, hash: 0330d1af29c0, type: Release

Bisect points to e50f1ddc65


I haven't been able to reproduce this myself on Linux. But from the backtrace and the bisect, I'm guessing it's the same issue as D7688.

Can anyone test if this works?

```diff
diff --git a/intern/cycles/util/util_task.cpp b/intern/cycles/util/util_task.cpp
index 949ba0a..1fdd052 100644
--- a/intern/cycles/util/util_task.cpp
+++ b/intern/cycles/util/util_task.cpp
@@ -35,7 +35,10 @@ TaskPool::~TaskPool()
 
 void TaskPool::push(TaskRunFunction &&task)
 {
-  tbb_group.run(std::move(task));
+  /* Isolate tasks to avoid deadlocks. */
+  tbb_group.run([task = std::move(task)] {
+    tbb::this_task_arena::isolate([task = std::move(task)] { task(); });
+  });
   num_tasks_pushed++;
 }
```
Author
Member

> In #85287#1109966, @brecht wrote:
> Can anyone test if this works?

Unfortunately this doesn't seem to help for me.

It seems to me this is quite a weird failure that isn't caused by obvious things like recursively locking non-recursive mutexes: the parallel execution that gets stuck in the wait simply manipulates data and, as far as I can tell, shouldn't be waiting on any mutexes at all. So I see no reason outside of TBB internals themselves for it to be stuck...

Since you can't reproduce it, I wonder if it could depend on the TBB version? I have 2020.2 from the Fedora 32 distribution. Or, if you are using a debug build, do you have kernels built so that it uses GPU rendering? More frequently completing renders increase the probability of the deadlock (I actually had trouble reproducing with a debug build before because of missing kernels).


No noticeable change here either (Windows 10 64-bit 1909).

Note that I've discovered a potentially related issue as follows:

1. Load the blend file (same as the original): test-image-deadlock.blend.
2. Change the Shader editor in the viewport to the Timeline editor.
3. Drag the sliders in the Material properties panel.
4. After some time (when the repro occurs), the preview image in the Material properties panel will cease to update.
5. Switching the Timeline editor back to the Shader editor will now deadlock Blender.

The image shows that the preview has not updated, but the Blender UI continues to respond to user input.
image.png

Author
Member

> In #85287#1110079, @rlneumiller wrote:
> Note that I've discovered a potentially related issue as follows:

It is the same issue: the preview image deadlocks first in the background, and then the shader editor locks the main UI thread on the same mutex.


This issue was referenced by 7149ccee57ceeb305f085719578e5848c58f92d0

Changed status from 'Confirmed' to: 'Resolved'

Brecht Van Lommel self-assigned this 2021-05-03 22:10:56 +02:00
Reference: blender/blender#85287