
alpha value is never available in the color buffer
Closed, ResolvedPublic


Category: Rendering

Even if we change all glClearColor calls to use alpha 0.0 as the default, we don't get alpha stored in the RGBA buffer.

If you take a look at the first attached file you can see that alpha is always 1.0 in the BGE (1_alpha_picker.blend).

Once that's working, 2D filters can be tackled too (2_2D_filter_alpha_tests).

According to Brecht:

"Blender doesn't request the alpha buffer on all OSes"
"if you render to FBO it's a matter of making sure the texture has an alpha channel, if you render into back buffer it requires changes in GHOST"
"also not sure if all cards support it, probably they do, but I don't think it's a requirement for OpenGL"

Event Timeline

Attached are the Win32 fix (by me) and the X11 fix (by Campbell).

In the Win32 patch I'm simply increasing the weight of the hDC pixel format when an alpha channel is available.
* missing: OSX fix

(not to be committed - yet?)

Bug fix and patch committed in rev. 54745.
I honestly don't think there is any considerable memory or performance cost to having this on, and OSX has had alpha in the buffer for a long time, so it's reasonable to have the same behaviour across the other OSes.

Dalai Felinto (dfelinto) changed the task status from Unknown Status to Resolved.Feb 22 2013, 8:39 AM