
Blender does not save the alpha channel correctly for volumetric flames / fire with no smoke
Closed, Archived (Public)


System Information
Operating system: Windows-10-10.0.18362-SP0 64 Bits
Graphics card: GeForce GTX 1060/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 446.14

Blender Version
Broken: version: 2.83.1, branch: master, commit date: 2020-06-25 09:47, hash: rB8289fc688b3e
Worked: (newest version of Blender that worked as expected)

Short description of error
Blender does not save the alpha channel correctly for volumetric flames/fire with no smoke, if 'Density' is set to zero or less in the 'Principled Volume' shader node.

Exact steps for others to reproduce the error
Load the attached .blend and try to get a correct alpha channel saved. I've tried saving as EXR, Targa, DPX and TIFF (all 16-bit RGBA); all the files show the flames in the colour channels, but only the plane object shows at all in the alpha.
If you set 'Density' above zero in the 'Principled Volume' shader, that fixes the render-preview issue (T52680, sort of), in that you can now see the flames slightly in the render-preview window, but you still get zero/black in the alpha channel of any saved files.

This is what I'm seeing in the compositor-

Just to be clear: I'm not talking about the render-preview issue T52680, where the flames/fire don't show at all as you watch the render.

I'm saying I'm not getting an alpha channel for the flames, which should be generated from what I can actually see in the rendered camera view in the GUI viewport.
Surely if it shows correctly in the GUI viewport, Blender should be able to save a valid alpha for that image? If you can't trust that you can get what you create in the viewport out of the software, that's quite a fundamental problem, isn't it?


and here's the VDB frame in case it didn't pack-

Event Timeline

I've been told that this is 'technically correct' behaviour in T56280, and I do get the argument, but I'd still say it should be a checkbox: 'technically correct' alpha channel, or one that works, y/n.

This is the one that works. It really is.

Asking for something to occlude, in a manner that does not occlude in the path-tracing engine, is asking for a wonky path-tracing hack.

What would you expect a path tracing engine to generate for a pure emission? Can you describe it?

  1. Alpha represents degree of occlusion.
  2. Emissions can happen independent of occlusion.
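Those two points can be put in numbers. A minimal single-pixel sketch (hypothetical values, assuming premultiplied alpha and scene-referred linear light) of why a pure emission with alpha = 0 still visually covers the background under alpha-over, without attenuating it at all:

```python
# Hypothetical single-pixel values in scene-referred linear light.
background = [0.2, 0.2, 0.2]    # dim grey wall
flame_rgb = [4.0, 1.5, 0.2]     # bright emission (premultiplied RGB)
flame_alpha = 0.0               # pure emission: occludes nothing

# Premultiplied alpha-over: out = fg + (1 - fg_alpha) * bg
out = [f + (1.0 - flame_alpha) * b for f, b in zip(flame_rgb, background)]
# With alpha = 0 the background still contributes fully, yet the
# result is dominated by the flame's added energy.
```

The background is never blocked; the flame simply adds so much energy that it dominates the pixel, which is why it looks "opaque" without having any coverage.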

It is what makes things like the following possible:

@Troy Sobotka (sobotka) I do understand and totally get the argument, but for my two pennies' worth:

The whole point of Cycles / any GI renderer is physically accurate rendering, and it does this beautifully for the most part. But in the real world my eyes can see things occluded by other 'emissive' things, so to my humble human eyeball/brain combination these things are, to all intents and purposes, opaque. As in: I cannot see the specular pattern on the wall behind the flame in your image above.

Secondly, at a more fundamental level, Blender's viewport is showing me what I'd expect to see in the 'real' world (to my eye, anyway; the emissive objects may well not be totally opaque, but they look it in parts), while the render is not giving me the same result. So in this instance I can't 'trust' the viewport/render, which to my thinking is the bigger problem.

Additive light will “occlude” in compositing too. It works exactly the same way! This is a good little demo test file, created by someone else, that shows how the additive flames bury the background emissions. In this instance it's less “occlusion” than overpowering the background emission. The mechanics, however, are completely different to actual occlusion, which would block the light; that distinction is extremely relevant in a light-transport model. We do not want a reflection, for example, to block the light. Nor a subtle glow or flare. Nor fire.

Therein lies the crucial distinction; occlusion blocks light, while emissions do not. So a glowing ball of gas is fundamentally different from the edge of a starship.
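That distinction can be sketched numerically. A toy example (illustrative values, not Cycles internals): light passing through a pure emission gains energy and loses none, while light passing through an occluding volume is attenuated (Beer-Lambert):

```python
import math

light = 10.0                    # energy headed toward the camera

# A pure emission along the path adds energy but blocks none of it.
after_emission = light + 3.0    # all 10.0 still arrives, plus 3.0

# An occluding (absorbing) volume attenuates the light instead.
density, path_length = 2.0, 0.5
transmittance = math.exp(-density * path_length)
after_occluder = light * transmittance   # less than 10.0 arrives
```

The first case is what the flame does; the second is what an alpha channel claims it does, and the two are not interchangeable in a light-transport model.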

The viewer being totally garbage certainly isn’t helping anyone, and arguably having file encoding for all integer encodings completely screwed doesn’t help either, as folks are saving their reflections, flares, and fires incorrectly, as well as getting completely mangled colour transforms.

Clément Foucault (fclem) closed this task as Archived.Sep 13 2020, 7:34 PM
Clément Foucault (fclem) claimed this task.

I think @Troy Sobotka (sobotka) answered the question thoroughly.

The flame is not occluding the background through its opacity but through the light it adds. For that to work you need to composite it in a scene-referred linear colour space and then pass it through the film transform curve.
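A sketch of why the order matters, with a simple Reinhard curve standing in for Blender's actual film transform (an assumption for illustration only):

```python
def film(x):
    # Stand-in display transform: Reinhard, x / (1 + x).
    # Blender's real film transform (e.g. Filmic) differs, but is
    # likewise non-linear, which is all this sketch relies on.
    return x / (1.0 + x)

bg = 0.3      # scene-referred background value (one channel)
flame = 5.0   # scene-referred flame emission

# Correct: add in scene-referred linear, then apply the film curve once.
correct = film(bg + flame)

# Wrong: apply the curve first, then add display-referred values.
wrong = film(bg) + film(flame)   # exceeds 1.0 and would clip
```

Because the film curve is non-linear, compositing after the transform gives a different (and here out-of-range) result from compositing before it.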

Closing this ticket then.

I think your ideals are clouding your judgement on matters of practicality.

While the proper and correct way to do it is leave it like this and rely on the user compositing it correctly, the reality is that in a lot of situations where you would want to use a glowy alpha image, the software is simply not capable of compositing it in the proper and correct way. Web browsers come to mind.

@Piotr Adamowicz (madminstrel) this is an issue with the web browsers then (more likely the broken PNG format). But if you need a workaround, I would put the responsibility on the user. You just *cannot* correctly emulate emissive transparency with just an alpha hack. Since this hack is very situation-dependent, we will not implement it. Do you have any example of another 3D renderer that handles this case the way you want?

If you want an alpha channel for your fire, just render your fire with some absorption shader, maybe even inside a separate render pass.
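The alpha that an absorbing volume produces follows Beer-Lambert: alpha = 1 - exp(-density x path length). A rough sketch (illustrative numbers, not Cycles' exact sampling) of how density maps to the coverage you would get from that workaround:

```python
import math

def absorption_alpha(density, path_length):
    """Coverage produced by an absorbing volume (Beer-Lambert)."""
    return 1.0 - math.exp(-density * path_length)

# A thin, wispy region barely registers in the alpha;
# a denser region approaches fully opaque.
thin = absorption_alpha(0.5, 0.2)    # roughly 0.1
dense = absorption_alpha(8.0, 0.5)   # roughly 0.98
```

Tuning the absorption density therefore lets you dial in how "solid" the fire appears in the alpha, at the cost of it no longer being a physically pure emission.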

Hello All
I've woken up and smelled the coffee as they say: I'm doing the grown-up linear workflow and all is fine...

But, as I think I commented in T52680, I still think it's quite an issue that you get totally different images from the viewport rendered preview and the F12 render preview (before you go properly linear in your workflow and find all this out, that is!). It looks like the viewport is doing a proper linear comp of the flame/other emissive stuff over transparency, while the F12 window is not. You should be able to 'trust' all the previews in Blender, so they should be consistent: the F12 window really ought to show the emissive stuff over transparency like the viewport does, while still (as it already does) showing the lack of alpha-channel info for the emissive-over-transparent stuff if you look at the alpha.