Cycles: Add support for denoising in the viewport
Needs Review · Public

Authored by Patrick Mours (pmoursnv) on Thu, Jan 9, 8:22 PM.
Tags
None

Details

Reviewers
Sergey Sharybin (sergey)
Group Reviewers
Cycles
Summary

The OptiX denoiser can be a great help when rendering in the viewport, since it is really fast and needs only few samples to produce convincing results. This patch therefore adds support for using any Cycles denoiser in the viewport as well (both the OptiX and NLM ones).

Unfortunately this sounds easier on paper than it actually is, because the current architecture did not handle this case at all: things fell apart with multiple GPUs, since the multi-GPU implementation made various assumptions about the tile buffers that are not met in viewport rendering.
This patch therefore extends the tile manager to properly handle tile neighbors even when those tiles all reside in one big buffer (as is the case in the viewport). It also changes that big buffer to be a 2D image, so that the multi-device copy code can be reused (it already supports sliced downloads via mem_copy_from). Finally, the session synchronization code is modified to synchronize denoising settings in viewport rendering too (which means keeping track of the view layer there as well), instead of doing so only for batch rendering.
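To make the tile-neighbor handling concrete, here is a minimal standalone sketch (the types `Tile` and `Rect` and the function `neighbor_rect` are hypothetical stand-ins, not the actual tile manager code): since all viewport tiles live in one big 2D buffer, a tile's 3x3 neighborhood reduces to a clamped rectangle into that buffer, which the multi-device code can then download as a slice (e.g. via the existing mem_copy_from path).

```cpp
#include <algorithm>

/* Hypothetical minimal types; the real Cycles code uses RenderTile and
 * the tile manager's own bookkeeping. */
struct Tile {
  int x, y, w, h; /* Position and size in buffer pixels. */
};

struct Rect {
  int x, y, w, h;
};

/* Compute the 3x3 neighborhood of `tile` as a rectangle, clamped to the
 * bounds of the single big viewport buffer. Because all tiles live in one
 * buffer, no per-neighbor copies are needed; the rectangle directly
 * describes the rows and columns to slice out of that buffer. */
static Rect neighbor_rect(const Tile &tile, int buffer_w, int buffer_h)
{
  const int x0 = std::max(tile.x - tile.w, 0);
  const int y0 = std::max(tile.y - tile.h, 0);
  const int x1 = std::min(tile.x + 2 * tile.w, buffer_w);
  const int y1 = std::min(tile.y + 2 * tile.h, buffer_h);
  return {x0, y0, x1 - x0, y1 - y0};
}

int main()
{
  const Tile tile = {256, 256, 128, 128};
  const Rect r = neighbor_rect(tile, 1920, 1080);
  /* In the real code, a sliced download of rows [r.y, r.y + r.h) could
   * then go through the existing mem_copy_from() path. */
  return (r.w > 0 && r.h > 0) ? 0 : 1;
}
```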

Diff Detail

Repository
rB Blender
Branch
cycles_optix_denoiser_viewport (branched from master)
Build Status
Buildable 6349
Build 6349: arc lint + arc unit

Event Timeline


Moved UI to enable viewport denoising into viewport shading options

Enabling denoising for the viewport is now part of the viewport shading options, to avoid having to sync render layers (which have no real meaning in the viewport).
Also disabled viewport denoising for non-OptiX devices, since the NLM denoiser in its current state is too slow to be usable in the viewport.

Are there plans to also support OIDN? There are still lots of non-OptiX systems out there and LuxCore is working nicely with OIDN viewport denoising.

> Are there plans to also support OIDN? There are still lots of non-OptiX systems out there and LuxCore is working nicely with OIDN viewport denoising.

I would guess that this might be different because it runs on the CPU?

If it's possible, though, I would second the desire for this. Plenty of people don't have RTX cards for whatever reason: they have an AMD card or an older NVIDIA card, and leaving them without an option isn't great.

I noticed that the OptiX denoiser can be enabled only when you use an OptiX GPU.

But the OptiX denoiser is compatible both with OptiX GPUs and with normal CUDA GPUs, as the Grant Wilk add-on shows.

Can we have it enabled for CUDA devices too, not just for OptiX devices?

I have a patch for viewport denoising that supports OIDN; unfortunately it works differently from this one. We should coordinate on a unified approach.

It might be useful to enable OptiX denoising for CPU rendering too: when you have scenes that exceed VRAM or rely on CPU-only features (decoupled volumes, OSL), you might still want to leverage GPU denoising.

> I noticed that the OptiX denoiser can be enabled only when you use an OptiX GPU.
> But the OptiX denoiser is compatible both with OptiX GPUs and with normal CUDA GPUs, as the Grant Wilk add-on shows.
> Can we have it enabled for CUDA devices too, not just for OptiX devices?

See the comments in rBd5ca72191c36 for a (tmp-hack) workaround.

> I have a patch for viewport denoising that supports OIDN; unfortunately it works differently from this one. We should coordinate on a unified approach.
> It might be useful to enable OptiX denoising for CPU rendering too: when you have scenes that exceed VRAM or rely on CPU-only features (decoupled volumes, OSL), you might still want to leverage GPU denoising.

Totally agree with this

@Stefan Werner (swerner) Yes. The question is what a good architecture for that would look like ...

Your implementation works because it's all CPU code, but adding OptiX into the mix is going to make it more difficult. Not to mention that I think it'd be cleaner to avoid including OptiX headers all over the codebase and instead concentrate them in a selected subset of projects (currently the device and BVH implementations).
It seems wasteful to ignore all the existing denoising code that is already in Cycles and write something from scratch just for the viewport. There is also the question of how a user would decide which devices to run denoising on if this is separate from rendering (with all the complications attached, like only being able to run OptiX denoising on GPUs and OIDN on CPUs).
On the other hand, denoising on multiple devices is rather slow because of all the copies involved (need the neighbors for each tile), so denoising the entire image in one go on a single device has its advantages too.

Maybe adding an additional DeviceTask to denoise an entire buffer instead of just a tile could help (e.g. similar to the FILM_CONVERT task). The MultiDevice implementation would then just need to select a single device and pass the task to that one (chosen based on whether the user wants OptiX denoising, OIDN or something else), as sketched below. I think it is best to re-use contexts and data from the device implementations, rather than setting up an entirely new environment just for denoising. OptiX denoising would then sit in the OptiX device (like it already does) and OIDN in the CPU device. This would still couple the available denoising options to which API is selected, though (so one would need to render with OptiX to get the option to use OptiX denoising, etc.). But that could improve once CPU + OptiX rendering is supported sometime in the future (whenever the BVH system is improved to support multiple implementations concurrently).
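As a rough illustration of that dispatch (a hypothetical, self-contained sketch; `DeviceKind`, `DenoiserType` and `pick_denoise_device` are made-up stand-ins rather than the actual Cycles DeviceTask/MultiDevice API):

```cpp
#include <cstdio>
#include <vector>

/* Hypothetical stand-ins for the real Cycles device/task types. */
enum class DeviceKind { CPU, CUDA, OPTIX };
enum class DenoiserType { NONE, OPTIX, OPENIMAGEDENOISE };

struct SubDevice {
  DeviceKind kind;
  const char *name;
};

/* Pick a single sub-device to run a whole-buffer denoise task on, based on
 * the requested denoiser: OptiX denoising needs an OptiX-capable GPU, OIDN
 * runs on the CPU. Returns nullptr if no device qualifies. */
static const SubDevice *pick_denoise_device(const std::vector<SubDevice> &devices,
                                            DenoiserType denoiser)
{
  for (const SubDevice &device : devices) {
    if (denoiser == DenoiserType::OPTIX && device.kind == DeviceKind::OPTIX)
      return &device;
    if (denoiser == DenoiserType::OPENIMAGEDENOISE && device.kind == DeviceKind::CPU)
      return &device;
  }
  return nullptr;
}

int main()
{
  const std::vector<SubDevice> devices = {{DeviceKind::OPTIX, "RTX GPU"},
                                          {DeviceKind::CPU, "CPU"}};
  /* The MultiDevice would hand the whole-buffer task (akin to FILM_CONVERT)
   * to this one device instead of splitting it into tiles. */
  if (const SubDevice *device = pick_denoise_device(devices, DenoiserType::OPTIX))
    std::printf("Denoising full buffer on: %s\n", device->name);
  return 0;
}
```

Denoising the whole buffer on one device this way also sidesteps the per-tile neighbor copies mentioned above, at the cost of that one device doing all the denoising work.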

Reworked the UI to theoretically decouple the denoiser from the render API in use. This would still need plumbing in the background, though. The UI to enable viewport denoising now also resembles that for EEVEE, to make it more consistent, and showing different render passes is fixed.
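One possible shape for that decoupled plumbing (again a hypothetical sketch under the constraints described above, not the committed code): the denoiser becomes its own setting, and a capability check decides whether the active device can actually service it, so the UI can grey out unsupported choices.

```cpp
#include <cstdio>

enum class DeviceKind { CPU, CUDA, OPTIX };
enum class DenoiserType { NONE, OPTIX, OPENIMAGEDENOISE };

/* Whether a given device could service a given denoiser. Until CPU + OptiX
 * rendering is supported concurrently, the selected render API still
 * constrains the available choices. */
static bool device_supports_denoiser(DeviceKind device, DenoiserType denoiser)
{
  switch (denoiser) {
    case DenoiserType::NONE:
      return true;
    case DenoiserType::OPTIX:
      return device == DeviceKind::OPTIX;
    case DenoiserType::OPENIMAGEDENOISE:
      return device == DeviceKind::CPU;
  }
  return false;
}

int main()
{
  /* A CUDA-only device cannot service the OptiX denoiser in this model. */
  std::printf("OptiX denoiser on CUDA device supported: %d\n",
              device_supports_denoiser(DeviceKind::CUDA, DenoiserType::OPTIX));
  return 0;
}
```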