EEVEE: Intel Shader Compiler Bug

When using an Intel GPU, EEVEE did not display anything. This was caused by an
internal shader compilation bug in the Intel drivers. We had already fixed
this for other vertex shaders; the same change has to be applied to the other
vertex shaders that want to limit the need for matrix multiplications.
This commit is contained in:
Jeroen Bakker 2019-05-16 13:35:02 +02:00
parent b019d8b2fe
commit 4cd191aa29
Notes: blender-bot 2023-02-14 08:35:51 +01:00
Referenced by issue #64466, Eevee not working on Intel UHD 630 / HD 620 / Iris
2 changed files with 14 additions and 0 deletions


@@ -13,6 +13,13 @@ in vec3 pos;
void main()
{
#ifdef GPU_INTEL
/* Due to a shader compiler bug, we need to access
 * gl_VertexID to make it work, even if it's
 * actually dead code. */
gl_Position.x = float(gl_VertexID);
#endif
#ifdef HAIR_SHADER
float time, thick_time, thickness;
vec3 worldPosition, tan, binor;


@@ -11,6 +11,13 @@ out vec3 viewNormal;
void main()
{
#ifdef GPU_INTEL
/* Due to a shader compiler bug, we need to access
 * gl_VertexID to make it work, even if it's
 * actually dead code. */
gl_Position.x = float(gl_VertexID);
#endif
vec3 world_pos = point_object_to_world(pos);
gl_Position = point_world_to_ndc(world_pos);
#ifdef MESH_SHADER
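
The workaround applied in the hunks above can be sketched as a minimal, self-contained vertex shader. This is an illustration, not code from the commit: `ModelViewProjectionMatrix` is an assumed uniform standing in for Blender's matrix helpers, and `GPU_INTEL` is assumed to be defined by the build system when compiling for Intel GPUs.

```glsl
#version 330

in vec3 pos;

/* Hypothetical uniform for illustration; the commit uses
 * Blender's point_object_to_world()/point_world_to_ndc(). */
uniform mat4 ModelViewProjectionMatrix;

void main()
{
#ifdef GPU_INTEL
  /* Dead-code read of gl_VertexID: the value written here is
   * overwritten below, but without this access some Intel
   * drivers miscompile the shader and render nothing. */
  gl_Position.x = float(gl_VertexID);
#endif
  gl_Position = ModelViewProjectionMatrix * vec4(pos, 1.0);
}
```

The key point is that the `gl_VertexID` access is semantically dead, yet its presence changes how the Intel shader compiler processes the program, avoiding the bug.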