
Eevee dark edges on 8bit sRGB textures
Closed, Resolved · Public

Description

System Information
Operating system: Windows-10-10.0.17134 64 Bits
Graphics card: GeForce GTX 1080/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 418.91

Blender Version
Broken: version: 2.80 (sub 64), branch: blender2.7, commit date: 2019-05-14 00:25, hash: rBa5b5bd2c24e0
Worked: 2.80, commit date: 2019-05-11 18:20, hash: ebc44aae9897

Short description of error
Current builds of Blender 2.80 seem to be discarding colour information in textures under specific conditions. Any texture used in a material node tree with an alpha channel and "Use Alpha" turned on displays incorrectly: any pixels that are black on the alpha channel are also black on the red, green, and blue channels. This did not happen with earlier builds and, weirdly, affected images display correctly when viewed in the image editor.

This is leading to the following side effects:

  • Opaque parts of alpha transparent textures now have soft black lines around their edges (see included screenshots) and appear more aliased and sharp than they should
  • Channel packing -- a technique popular in video game development where several greyscale masks are placed in the red, green, blue, and alpha channels of a texture to save memory, "packing" them into one image that can then be split up and used with a "Separate RGB" node -- now no longer works correctly on an image with an alpha channel since the masks in the red, green, and blue channels are "blacked out"
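To make the channel-packing breakage concrete, here is a minimal sketch (function names are illustrative, not Blender's) of four greyscale masks packed into one RGBA value and split back apart the way a "Separate RGB" node would, plus a model of what the bug does to such a pixel:

```python
def pack_masks(metallic, roughness, ao, height):
    """Pack four greyscale masks (0.0-1.0) into one RGBA value."""
    return (metallic, roughness, ao, height)

def unpack_masks(rgba):
    """Recover the individual masks, as a "Separate RGB" node would."""
    r, g, b, a = rgba
    return {"metallic": r, "roughness": g, "ao": b, "height": a}

def buggy_use_alpha(rgba):
    """Model of the reported bug: RGB zeroed wherever alpha is 0."""
    r, g, b, a = rgba
    return (0.0, 0.0, 0.0, a) if a == 0.0 else rgba

pixel = pack_masks(1.0, 0.5, 0.25, 0.0)  # the height mask happens to be 0 here
# The bug destroys the other three masks stored in this pixel:
# buggy_use_alpha(pixel) -> (0.0, 0.0, 0.0, 0.0)
```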

Exact steps for others to reproduce the error
Drag and drop any texture with an alpha channel (PNG or TGA) into a material node tree. Do not disable "Use Alpha" for the texture. Hook up the Color output to Base Colour. Observe that all pixels where the alpha channel is black are now black in the red, green, and blue channels as well. If necessary, to examine each channel individually, place and connect up a "Separate RGB" converter node. Note this issue persists regardless of whether you're using Eevee or the Cycles render engine.

To help reproduce the bug, I've attached a Targa file below which is a colour gradient with the words "This is a test" written on the alpha channel. This serves as an ideal example of an image that has colour data in the 100% transparent pixels that you can view in Blender's image editor by isolating the "Color" channel.

Also included are two screenshots of the "black lines around edges" issue: "Before.png" is how it looked prior to the bug, "After.png" is how it looks now. Note the considerably more jagged edges to the letters.

Event Timeline

Brecht Van Lommel (brecht) triaged this task as Needs Information from User priority.

Please attach a .blend file to reproduce the problem.

I didn't think one was necessary but here you go. The attached file is a scene with a single cube in it, a material assigned, and the "AlphaTest.tga" texture applied to it. "Use Alpha" is turned on. The "Color" output is plugged into the Base Color of a Principled BSDF. Without the alpha plugged in to anything, each side of the cube should just show a coloured gradient. You can see how the texture should look in the image editor. The text "This is A Test" should not be visible. However it IS visible, because the alpha channel is affecting the Color output in the material.

Brecht Van Lommel (brecht) raised the priority of this task from Needs Information from User to Needs Triage by Developer. May 14 2019, 10:47 PM
Brecht Van Lommel (brecht) triaged this task as Waiting for Developer to Reproduce priority.

I've re-tested this with the current build of 2.80 and it now doesn't happen if your texture color space is set to "Raw" or "Non-Color" but still happens if you set it to any other color space.

This serves as an ideal example of an image that has colour data in the 100% transparent pixels that you can view in Blender's image editor by isolating the "Color" channel.

If there is data in the RGBA where A is zero, it should be treated as an unoccluding emission, and a pure add.

What is the current behaviour? Unassociated alpha?

This is about unassociated alpha, pure emission is not going to happen here.

Probably the darkening issue is that hardware accelerated sRGB to linear conversion does not un-premultiply the colors for conversion. Although I did not see much darkening in the .blend file. The screenshots were not made with that exact .blend file, I guess.
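The darkening can be shown numerically (a sketch, not Blender code): because the sRGB transfer function is nonlinear, decoding a premultiplied value is not the same as decoding the straight value and then multiplying by alpha.

```python
def srgb_to_linear(c):
    # Standard sRGB electro-optical transfer function.
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

c, a = 0.5, 0.5  # a mid-grey texel at 50% alpha

# Correct order: decode the straight colour, then associate the alpha.
correct = srgb_to_linear(c) * a

# What the hardware path effectively does: premultiply first, then
# decode the already-premultiplied value.
wrong = srgb_to_linear(c * a)

assert wrong < correct  # the semi-transparent pixel comes out too dark
```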

For the channel packing case we have a separate bug: T53672.

Brecht Van Lommel (brecht) renamed this task from Texture nodes discarding colour information when "Use Alpha" is enabled to Eevee dark edges on 8bit sRGB textures. May 21 2019, 11:58 AM
Brecht Van Lommel (brecht) lowered the priority of this task from Waiting for Developer to Reproduce to Confirmed, Medium.
Brecht Van Lommel (brecht) raised the priority of this task from Confirmed, Medium to Confirmed, High. May 23 2019, 1:56 AM

The alpha is premultiplied with the color values in IMB_colormanagement_imbuf_to_srgb_texture, even though Alpha is set to "Straight". Is that correct?
There is nothing the gpu can do here, because the color information is lost in the premultiplication.

...because the color information is lost...

I am loathe to comment here, but stop saying this.

Alpha doesn’t work this way, and nothing is lost with associated alpha. It’s the way alpha works, but don’t take my word for it, have a read of several folks with Academy Science awards and a very experienced compositor.

Parroting bogus hot takes on alpha is only serving to make folks think that anything but associated alpha works. It doesn’t.

The problem is that hardware accelerated sRGB to linear conversion appears to assume unassociated alpha. It could have been implemented differently on GPUs, but it wasn't.

So we'll have to either store 8bit textures unassociated and convert to associated in the shader, or not use hardware accelerated sRGB to linear and convert sRGB to linear in the shader. It's unfortunate because the current code means we always get associated linear results automatically regardless of the type of image or shader, but seems the quality is not there and we'll have to make the shaders more complicated again.

The problem is that hardware accelerated sRGB to linear conversion appears to assume unassociated alpha.

The nature of nonlinear encodings, due to suckery, sadly means that they all must be dissociated prior to linearization, otherwise the math falls apart.

From a technical vantage, if someone tries to solve this on the GPU side, there are three cases, all of which are made a mess because of the way Blender currently handles alpha.

Assuming nonlinear RGB associated alpha imagery encoding, the three processes are:

  1. Associate. If the source buffer is unassociated alpha, this associates. Simple multiply: RGB = RGB * A.
  2. Disassociate. If the source buffer was already properly encoded as associated, respect the RGB emissions: if (A != 0) RGB = RGB / A, else RGB = RGB.
  3. Reassociate. If the source buffer was already properly encoded as associated alpha, and the buffer is currently disassociated, respect the RGB emissions: if (A != 0) RGB = RGB * A, else RGB = RGB.

Anyone familiar with the alpha format in Blender is well aware of the nightmare, and can see that Blender currently has no way to determine the alpha state distinction between 1. and 3., and that’s a problem.

TL;DR: Despite conventional wisdom, there are three cases, not two.

This has nothing to do with Blender design regarding alpha, but about GPU performance and compatibility with game engines. We can distinguish between the various cases just fine, the only question is how to handle 8bit textures in a way that:

  • Minimizes memory usage
  • Maximizes performance
  • Avoids artifacts
  • Is compatible with game engines that users export assets to
  • Keeps code as simple as possible

I believe UE BLEND_AlphaComposite performs the canonical Porter-Duff over and assumes associated alpha.

Seems like it needs a custom shader, as the GPU path can’t work anyways and will mangle everything up?

Ah, thanks for the merge. So the task already exists.
Please, feel free to use the screenshot and example file I provided in my report since they're a bit clearer.

Regarding compatibility with game engines (because it keeps getting mentioned here), wouldn't it be beneficial if the materials actually looked the same in Blender and most game engines? Not meaning to sound rude, but wasn't that the point of Eevee?
I rebuilt my example shader in Unity and Unreal Engine, and neither of them display dark outlines in the semi transparent areas of the image texture. (Neither does any other graphics software such as Photoshop or Krita.)

This is a confirmed bug report, which means we intend to fix it.

I rebuilt my example shader in Unity and Unreal Engine, and neither of them display dark outlines in the semi transparent areas of the image texture.

Does the image display correctly in Unreal with BLEND_AlphaComposite?

gobb_blend added a comment. Edited May 28 2019, 2:04 PM

I'm afraid I won't have access to UE4 for some time, but all that Alpha Composite mode does is basically boost the alpha strength:

res.rgb = src.rgb + (dst.rgb * (1.0 - src.a))
res.a   = src.a   + (dst.a   * (1.0 - src.a))
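That formula is the standard Porter-Duff "over" for associated (premultiplied) RGBA; as a runnable sketch it also shows the "unoccluding emission is a pure add" behaviour mentioned earlier in the thread:

```python
def over(src, dst):
    # Porter-Duff "over" for associated (premultiplied) RGBA tuples.
    sa = src[3]
    return tuple(s + d * (1.0 - sa) for s, d in zip(src, dst))

# An opaque source fully replaces the destination:
# over((1, 0, 0, 1), (0, 0, 1, 1)) -> (1.0, 0.0, 0.0, 1.0)
# A zero-alpha source with non-zero RGB is a pure add
# (an unoccluding emission such as a glow or flare):
# over((0.5, 0, 0, 0), (0, 0, 1, 1)) -> (0.5, 0.0, 1.0, 1.0)
```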

Be that as it may, here's Blender's issue in a nutshell. It's actually not that complicated:
(The texture used in this image is white and red dots on a transparent background.)


In Blender's viewport straight alpha images are acting as if they were using premultiplied mode. The Lookdev material in the viewport on the right should look exactly like the raw color channels in the image viewer on the left. Instead what we're getting in the viewport is rgb * alpha. That's what causes the dark outlines. No other graphics engine I've ever worked with treats textures like that.
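In numbers, the commenter's point looks like this (a sketch with hypothetical function names, not Blender internals): take a straight-alpha texel that should read as pure white at 50% coverage, and compare the expected Color socket value with what the bug produced.

```python
white_dot = (1.0, 1.0, 1.0, 0.5)  # straight alpha: pure white, 50% coverage

def color_socket_expected(rgba):
    # The Color output should carry the straight RGB through unchanged.
    return rgba[:3]

def color_socket_observed(rgba):
    # What the bug produced: RGB already multiplied by alpha.
    r, g, b, a = rgba
    return (r * a, g * a, b * a)

# Expected white (1, 1, 1); observed mid-grey (0.5, 0.5, 0.5) --
# exactly the dark fringe where the alpha falls off at an edge.
```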

Edit:
Here's regular alpha blend (left) and alpha composite (right) in Unity in case you're still interested.

Edit2:
Another example of what Blender is doing wrong. This time with shader set-ups in Blender and Unity for better comparison.


Once again we're getting those dark areas where the image is semi-transparent since Blender is multiplying the color with the alpha.
In order to achieve this effect in Unity, we actually need to do this:

This is the result we want in Blender's viewport:

The alpha format dictates the over operation here. Wrong over, wrong output.

In truth, only associated alpha actually works.

Try your white background image with the blobs. If you smudge, blur, or do any convolution on that image without it being associated, the output is completely corrupted. This is why associated alpha is the de facto standard and is the only thing that works for convolutions, as well as expressing fire, flares, glows, and blooms. It’s the only means to represent both occlusion and emission.
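The convolution argument can be reduced to a two-pixel example (one greyscale channel plus alpha, values as (value, coverage)): average an opaque white pixel with a fully transparent neighbour, then interpret the result under each alpha convention.

```python
def box_blur(p, q):
    """Average two pixels channel-by-channel (a two-tap blur)."""
    return tuple((a + b) / 2 for a, b in zip(p, q))

opaque_white = (1.0, 1.0)
transparent  = (0.0, 0.0)  # RGB is arbitrary where alpha is zero

# Straight alpha: colour and coverage are averaged independently, so
# the blurred pixel claims to be 50% GREY at 50% coverage -- a dark
# fringe appears around the white shape.
straight = box_blur(opaque_white, transparent)      # (0.5, 0.5)
straight_colour = straight[0]                       # 0.5, grey

# Associated alpha: the identical average is premultiplied data,
# which un-premultiplies to pure WHITE at 50% coverage -- correct.
associated = box_blur(opaque_white, transparent)    # (0.5, 0.5)
associated_colour = associated[0] / associated[1]   # 1.0, white
```

The arithmetic is the same either way; only the associated interpretation keeps the filtered result meaningful, which is the point being made about convolutions.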

So no, associated doesn’t just “boost alpha strength”; it’s the proper way to represent emission and occlusion.

You never “mix” like your second node example. That’s flat out wrong. Perform an over operation properly. In your Unity example, remove the multiply, for starters.

If somewhere, as Brecht has hinted, the associated alpha isn't being overed correctly via the proper formula, or the alpha is being encoded incorrectly, that is indeed an issue and will be sorted. It is more nuanced than your claim, however. Your two demonstrations are fundamentally mistaken about what a proper encoding looks like.

You never “mix” like your second node example. That’s flat out wrong.

Exactly. And that's what Blender appears to be doing judging by the look of the material in the viewport. At least that was the impression I got, being a mere artist and not a tech guy.
TBH I think I have done everything in my power to help with this bug fix. So good luck from here on out.

The bug was already identified in T64625#686381, there is no need to speculate on the cause.