OptiX viewport denoising does not work when preferences are set to CUDA (works with final render however)
Closed, Resolved (Public)

Description

System Information
Operating system: Windows-10-10.0.18362-SP0 64 Bits
Graphics card: GeForce GTX 1070 with Max-Q Design/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 446.14

Blender Version
Broken: version: 2.90.0 Alpha, branch: master, commit date: 2020-06-05 22:39, hash: rBb74cc23dc478
Worked: n/a

Short description of error
OptiX denoising was recently enabled for all Maxwell+ cards (rB473aaa389cc).

However, viewport denoising does not seem to work when CUDA is set in preferences. It _does_ work if I select OptiX in preferences OR if I do a final render while in CUDA mode.

Basically I see the following:

| Setting | Viewport OptiX denoising works? | Final render OptiX denoising works? |
| CUDA    | No (this bug)                   | Yes                                 |
| OptiX   | Yes                             | Yes                                 |

Exact steps for others to reproduce the error

  • Open the attached .blend
  • Use a non-RTX NVIDIA graphics card (latest drivers as of May 27th)
  • Ensure CUDA is selected in preferences
  • Go to the viewport and observe that no denoising occurs (a scripted version of this setup is sketched below)
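
For anyone reproducing this from a script instead of the UI, here is a minimal bpy sketch of the setup above. Property names are taken from the 2.83/2.90-era Python API; in particular, `preview_denoising` may be exposed differently in other builds, so treat them as assumptions.

```
import bpy

# Select CUDA as the compute backend in Preferences > System.
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()  # refresh the device list

# Render on the GPU and enable OptiX denoising for the viewport.
scene = bpy.context.scene
scene.cycles.device = 'GPU'
scene.cycles.preview_denoising = 'OPTIX'  # enum name as in 2.83-era builds; may differ in later versions
```

With this configuration the viewport never denoises, while the same scene denoises fine after switching `compute_device_type` to 'OPTIX'.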

Viewport CUDA

Viewport OptiX

Render CUDA

Event Timeline

I have a GTX 1060 and I can confirm this. Viewport denoising is not working when CUDA is selected for rendering. But for me it is not only viewport denoising that fails; viewport rendering does not work either. I don't know how to describe it, so please check the attachment for the viewport rendering result.

Most likely one of your shaders uses the Ambient Occlusion or Bevel node, which are not yet supported by this feature. Notice the "Cancel" in the top-left corner. This seems like a separate issue from the one in the first post.

You are right. I just found out that OptiX doesn't support GPU + CPU hybrid rendering yet. After unchecking CPU under the CUDA settings, viewport rendering works fine, but there is still no denoising from OptiX.
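
For reference, unchecking the CPU under the CUDA device list can also be done from Python. A minimal sketch, assuming the 2.8x cycles add-on preferences API (`get_devices()` and the `devices` collection):

```
import bpy

prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()  # populate the device list

# Keep only GPU devices enabled; disable any CPU entries
# (equivalent to unchecking "CPU" in Preferences > System > CUDA).
for device in prefs.devices:
    device.use = (device.type != 'CPU')
    print(device.name, device.type, "enabled" if device.use else "disabled")
```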

YAFU (YAFU) added a subscriber: YAFU (YAFU). Edited Jun 8 2020, 2:50 PM

I cannot reproduce the problem here on Linux with the .blend file above, GTX 960 (440.82 driver). It works in all cases.

A clarification: when you select GPU + CPU under Preferences > System > CUDA, only the GPU works in the viewport. Also, the shared .blend file above has no components that are unsupported by OptiX, so the denoiser should work in all cases (regardless of whether you select CUDA GPU, GPU + CPU, or OptiX). So this is either a Windows build problem or a Windows NVIDIA driver problem.

Also, be aware of your monitor resolution and check that you are not having this problem:
https://developer.blender.org/T75289

Edit:
Yes, surely you have a monitor with a resolution higher than 1080p and you are experiencing the bug in the link above. I had not realized that this only happens when CUDA is selected; it works with OptiX. I'm going to update that report related to Pixel Size.

I'm also on Linux, with a 3840x2160 HiDPI monitor. OptiX denoising using CUDA works after setting the viewport Pixel Size from "Automatic" to "1x", but it is very slow on my GTX 1060 6GB compared to just using OptiX. Viewport denoising simply doesn't work with CUDA when the viewport Pixel Size is set to anything other than 1x.

YAFU (YAFU) added a comment. Edited Jun 8 2020, 3:35 PM

@Tyler (ghfujianbin), it is slower because with Pixel Size = 1x the viewport is working at the full resolution of your monitor. As I said before, even when you use GPU + CPU, only the GPU works in the viewport. So until the Pixel Size and CUDA problem is solved, whenever you need the viewport denoiser you can choose OptiX in Preferences under the System section. Also, when CUDA is selected there, the OptiX denoiser's limitations regarding supported features remain the same, so until the bug is resolved there is no major reason not to just use OptiX in the viewport for now (switch to CUDA if for some reason you prefer it for the final render).
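
If anyone wants to apply the 1x Pixel Size workaround (or the OptiX switch) from a script, a rough sketch follows. The `preview_pixel_size` property path is an assumption based on the 2.8x API and should be verified against your build; the preferences calls are the same as in the earlier sketches.

```
import bpy

scene = bpy.context.scene

# Workaround for the CUDA + HiDPI case (T75289): force the viewport
# pixel size to 1x instead of "Automatic".
# NOTE: property path assumed from the 2.8x API; check your build.
scene.render.preview_pixel_size = '1'

# Alternative: switch the compute backend to OptiX in Preferences,
# which sidesteps the bug entirely on supported cards.
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPTIX'
prefs.get_devices()
```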

Interesting, yes, it's actually the Pixel Size setting, as I'm on HiDPI here too. So it seems like this and T75289 really are the same issue; I wouldn't have guessed that :)

Patrick Mours (pmoursnv) closed this task as Resolved.Jun 10 2020, 2:14 PM
Patrick Mours (pmoursnv) claimed this task.