Eevee dark edges on 8bit sRGB textures #64625
Reference: blender/blender#64625
System Information
Operating system: Windows-10-10.0.17134 64 Bits
Graphics card: GeForce GTX 1080/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 418.91
Blender Version
Broken: version: 2.80 (sub 64), branch: blender2.7, commit date: 2019-05-14 00:25, hash: a5b5bd2c24
Worked: 2.80, commit date: 2019-05-11 18:20, hash: ebc44aae9897
Short description of error
Current builds of Blender 2.80 seem to be discarding colour information in textures under specific conditions. Any texture used in a material node tree with an alpha channel and "Use Alpha" turned on displays incorrectly: any pixels that are black in the alpha channel are also black in the red, green, and blue channels. This did not happen with earlier builds and, weirdly, affected images display correctly when viewed in the image editor.
This is leading to the following side effects:
Exact steps for others to reproduce the error
Drag and drop any texture with an alpha channel (PNG or TGA) into a material node tree. Do not disable "Use Alpha" for the texture. Hook up the Color output to Base Color. Observe that all pixels where the alpha channel is black are now black in the red, green, and blue channels as well. If necessary, add a "Separate RGB" converter node to examine each channel individually. Note this issue persists regardless of whether you're using the Eevee or Cycles render engine.
To help reproduce the bug, I've attached a Targa file below which is a colour gradient with the words "This is a test" written on the alpha channel. This serves as an ideal example of an image that has colour data in the 100% transparent pixels that you can view in Blender's image editor by isolating the "Color" channel.
Also included are two screenshots of the "black lines around edges" issue: "Before.png" is how it looked prior to the bug, "After.png" is how it looks now. Note the considerably more jagged edges to the letters.
Added subscriber: @Interference
#65178 was marked as duplicate of this issue
Added subscriber: @brecht
Please attach a .blend file to reproduce the problem.
I didn't think one was necessary, but here you go. The attached file is a scene with a single cube in it, a material assigned, and the "AlphaTest.tga" texture applied to it. "Use Alpha" is turned on. The "Color" output is plugged into the Base Color of a Principled BSDF. With the alpha not plugged into anything, each side of the cube should just show a coloured gradient. You can see how the texture should look in the image editor. The text "This is A Test" should not be visible. However it IS visible, because the alpha channel is affecting the Color output in the material.
AlphaTest.blend
Added subscriber: @troy_s
I've re-tested this with the current build of 2.80: it no longer happens if the texture's color space is set to "Raw" or "Non-Color", but it still happens with any other color space.
If there is data in the RGBA where A is zero, it should be treated as an unoccluding emission, and a pure add.
What is the current behaviour? Unassociated alpha?
This is about unassociated alpha, pure emission is not going to happen here.
Probably the darkening issue is that hardware-accelerated sRGB to linear does not un-premultiply the colors for conversion. Although I did not see much darkening in the .blend file; the screenshots were not made with that exact .blend file, I guess.
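To illustrate the suspected mismatch numerically, here is a minimal Python sketch (not Blender code; the function name and sample values are purely illustrative) of decoding a premultiplied sRGB texel with and without un-premultiplying first:

```python
def srgb_to_linear(c):
    # Scalar sRGB EOTF (decode), per the sRGB specification.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# A mid-grey texel at 50% coverage, stored premultiplied (associated):
rgb, a = 0.5, 0.5
stored = rgb * a  # 0.25 is what the 8-bit texture effectively holds

# What a hardware sRGB sampler effectively does: decode the
# premultiplied value directly, with no un-premultiply step.
hw = srgb_to_linear(stored)

# The correct conversion: un-premultiply, decode, re-multiply.
correct = srgb_to_linear(stored / a) * a

# hw comes out noticeably darker than correct, which matches the
# darkening seen at semi-transparent edges.
```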
For the channel packing case we have a separate bug: #53672.
Changed title from "Texture nodes discarding colour information when "Use Alpha" is enabled" to "Eevee dark edges on 8bit sRGB textures"
Added subscriber: @JacquesLucke
The alpha is premultiplied with the color values in IMB_colormanagement_imbuf_to_srgb_texture, even though Alpha is set to "Straight". Is that correct? There is nothing the GPU can do here, because the color information is lost in the premultiplication.
I am loathe to comment here, but stop saying this.
Alpha doesn’t work this way, and nothing is lost with associated alpha. It’s the way alpha works, but don’t take my word for it, have a read of several folks with Academy Science awards and a very experienced compositor.
Parroting bogus hot takes on alpha is only serving to make folks think that anything but associated alpha works. It doesn’t.
The problem is that hardware accelerated sRGB to linear conversion appears to assume unassociated alpha. It could have been implemented differently on GPUs, but it wasn't.
So we'll have to either store 8-bit textures unassociated and convert to associated in the shader, or skip hardware-accelerated sRGB to linear and do the sRGB-to-linear conversion in the shader. It's unfortunate, because the current code means we always get associated linear results automatically regardless of the type of image or shader, but it seems the quality is not there and we'll have to make the shaders more complicated again.
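The two alternatives can be sketched as follows (Python standing in for shader logic; the function names are made up for illustration, and real 8-bit storage would add quantization error that this float sketch ignores):

```python
def srgb_to_linear(c):
    # Scalar sRGB EOTF (decode).
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def option_a_store_straight(straight_rgb, a):
    # Store the 8-bit texture with straight (unassociated) alpha,
    # decode sRGB on the straight values, then associate in the shader.
    return [srgb_to_linear(c) * a for c in straight_rgb]

def option_b_decode_in_shader(assoc_rgb, a):
    # Store associated, bypass the hardware sRGB sampler, and do the
    # full un-premultiply / decode / re-multiply in the shader.
    if a == 0.0:
        return list(assoc_rgb)
    return [srgb_to_linear(c / a) * a for c in assoc_rgb]

# In exact arithmetic both paths agree on the associated-linear result:
rgb, a = [0.8, 0.4, 0.2], 0.5
ra = option_a_store_straight(rgb, a)
rb = option_b_decode_in_shader([c * a for c in rgb], a)
```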
The nature of nonlinear encodings, due to suckery, sadly means that they all must be dissociated prior to linearization, otherwise the math falls apart.
From a technical vantage, if someone tries to solve this on the GPU side, there are three cases, all of which are made a mess because of the way Blender currently handles alpha.
Assuming nonlinear RGB associated alpha imagery encoding, the three processes are:
1. in * A
2. if (A != 0) RGB * A, else RGB = RGB
3. if (A != 0) RGB = RGB * A, else RGB = RGB
Anyone familiar with the alpha format in Blender is well aware of the nightmare, and can see that Blender currently has no way to determine the alpha state distinction between 1. and 3., and that’s a problem.
TL;DR: Despite conventional wisdom, there are three cases, not two.
This has nothing to do with Blender's design regarding alpha, but with GPU performance and compatibility with game engines. We can distinguish between the various cases just fine; the only question is how to handle 8-bit textures in a way that:
I believe UE BLEND_AlphaComposite performs the canonized Porter-Duff and assumes associated.
Seems like it needs a custom shader, as the GPU path can't work anyway and will mangle everything up?
Added subscriber: @gobb_blend
Ah, thanks for the merge. So the task already exists.
Please, feel free to use the screenshot and example file I provided in my report since they're a bit clearer.
Regarding compatibility with game engines (because it keeps getting mentioned here), wouldn't it be beneficial if the materials actually looked the same in Blender and most game engines? Not meaning to sound rude, but wasn't that the point of Eevee?
I rebuilt my example shader in Unity and Unreal Engine, and neither of them display dark outlines in the semi transparent areas of the image texture. (Neither does any other graphics software such as Photoshop or Krita.)
This is a confirmed bug report, which means we intend to fix it.
Does the image display correctly in Unreal with BLEND_AlphaComposite?
I'm afraid I won't have access to UE4 for some time, but all that Alpha Composite mode does is basically boost the alpha strength:
Be that as it may, here's Blender's issue in a nutshell. It's actually not that complicated:
(The texture used in this image is white and red dots on a transparent background.)
In Blender's viewport, straight-alpha images are acting as if they were using premultiplied mode. The Lookdev material in the viewport on the right should look exactly like the raw color channels in the image viewer on the left. Instead, what we're getting in the viewport is rgb * alpha. That's what causes the dark outlines. No other graphics engine I've ever worked with treats textures like that.
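The "alpha applied twice" failure being described can be written out with the two standard over operators (a Python sketch; the function names are illustrative):

```python
def over_straight(src_rgb, src_a, dst_rgb):
    # Porter-Duff over for straight (unassociated) alpha: the source
    # colour is multiplied by its alpha at composite time.
    return [s * src_a + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb)]

def over_associated(src_rgb, src_a, dst_rgb):
    # Over for associated (premultiplied) alpha: the source colour is
    # assumed to already carry its alpha.
    return [s + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb)]

# A pure red texel at 50% alpha composited over white:
src, a, dst = [1.0, 0.0, 0.0], 0.5, [1.0, 1.0, 1.0]
ok = over_straight(src, a, dst)          # consistent: [1.0, 0.5, 0.5]
premult = [c * a for c in src]           # the associated form of src
same = over_associated(premult, a, dst)  # identical result

# The bug pattern: data that is already premultiplied gets treated as
# straight, so alpha is applied a second time and edges darken.
bad = over_straight(premult, a, dst)     # red drops to 0.75
```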
Edit:
Here's regular alpha blend (left) and alpha composite (right) in Unity in case you're still interested.
Edit2:
Another example of what Blender is doing wrong. This time with shader set-ups in Blender and Unity for better comparison.
Once again we're getting those dark areas where the image is semi-transparent since Blender is multiplying the color with the alpha.
In order to achieve this effect in Unity, we actually need to do this:
This is the result we want in Blender's viewport:
The alpha format dictates the over operation here. Wrong over, wrong output.
In truth, only associated alpha actually works.
Try your white background image with the blobs. If you smudge, blur, or do any convolution on that image without it being associated, the output is completely corrupted. This is why associated alpha is the de facto standard and is the only thing that works for convolutions, as well as expressing fire, flares, glows, and blooms. It’s the only means to represent both occlusion and emission.
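The convolution argument can be demonstrated with a one-dimensional blur across a transparent edge (a Python sketch under the same assumptions as the comment: white visible pixels next to transparent pixels whose hidden straight-alpha colour happens to be black):

```python
def box_blur(values):
    # 3-tap box blur, clamping at the edges.
    n = len(values)
    return [(values[max(i - 1, 0)] + values[i] + values[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

# Two transparent pixels (hidden colour black) next to two white ones:
rgb = [0.0, 0.0, 1.0, 1.0]  # straight-alpha colour channel
a   = [0.0, 0.0, 1.0, 1.0]

# Naively blurring the straight colour drags the meaningless black of
# the transparent pixels into the edge: a dark fringe appears.
fringed = box_blur(rgb)               # edge pixel dims to ~0.667

# With associated alpha, blur colour and coverage together; dividing
# the blurred colour by blurred coverage recovers the true edge colour.
assoc = [c * cov for c, cov in zip(rgb, a)]
blurred_rgb = box_blur(assoc)
blurred_a = box_blur(a)
edge = blurred_rgb[2] / blurred_a[2]  # back to 1.0: white, fading only by coverage
```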
So no, associated doesn’t just “boost alpha strength”; it’s the proper way to represent emission and occlusion.
You never “mix” like your second node example. That’s flat out wrong. Perform an over operation properly. In your Unity example, remove the multiply, for starters.
If somewhere, as Brecht has hinted, the associated alpha isn't being overed correctly via the proper formula, or the alpha is being encoded incorrectly, that is indeed an issue and will be sorted. It is more nuanced than your claim, however, and your two demonstrations are fundamentally mistaken about what a proper encoding is.
Exactly. And that's what Blender appears to be doing judging by the look of the material in the viewport. At least that was the impression I got, being a mere artist and not a tech guy.
TBH I think I have done everything in my power to help with this bug fix. So good luck from here on out.
The bug was already identified in #64625#686381, there is no need to speculate on the cause.
This issue was referenced by fb03f50e06
Changed status from 'Open' to: 'Resolved'