This does seem to have been solved by either the latest Blender 2.8 build or the latest Mesa version: I can now go into Edit Mode on meshes with a lot of vertices without getting either the graphical corruption or any GPU crashes.
Jan 26 2019
Jan 2 2019
In the meantime I've performed a test which was recommended to me by another user. The crash seems to be unaffected by booting with either "amdgpu.dc=0" or "amdgpu.dc=1", and occurs identically in both cases.
Dec 29 2018
A screenshot of the graphical corruption which can be observed moments before the crash.
Jun 19 2018
Sorry, I wasn't aware that 2.8 reports are being closed. I was going to report something else for this version, but now I know to wait until reports for 2.8 are open again.
Jun 17 2018
Screenshots showing the result I get in the viewport versus the rendered result.
Blend file I used to replicate the issue.
Feb 5 2018
I'd still be curious why shadow rays can't be probabilistically terminated upon touching a glass shader, based on the transparency value of the glass at that pixel. Even if it's based on estimation and not fully realistic, couldn't at least an option for that be added in the meantime?
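Just to illustrate what I mean (this is only a rough Python sketch of the general idea, not how Cycles actually handles transparent shadows, and the names are made up):

```python
import random

def shadow_ray_survives(transparency, rng=random.random):
    """Russian-roulette style decision for a shadow ray hitting a glass surface.

    transparency: 0.0 = fully opaque, 1.0 = fully transparent.
    Returns True if the ray is allowed to continue towards the light,
    False if it is terminated and the point is treated as shadowed.
    """
    # The ray survives with a probability equal to the surface transparency,
    # so shadows darken on average in proportion to how opaque the glass is,
    # without tracing every transparent bounce.
    return rng() < transparency
```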
Jun 28 2017
I understand... thank you for clarifying. It's especially sad that Blender can't get better GPU compute support for Mesa & the radeon driver; with Blender being open-source, many of us use Linux over Windows, and most of those users will avoid all proprietary software, including proprietary video drivers (for ATI cards, fglrx). That's the same reason I have an AMD card instead of Nvidia... IIRC CUDA is proprietary and not available in Nouveau, whereas OpenCL is an open standard.
Jun 27 2017
As for support, I remember once reading that the official stance was "all GCN cards are supported, but we can't guarantee that cards not officially tested and listed will work". I actually bought this video card with Blender in mind, hoping that anything modern and GCN-enabled was a safe option. Blender has been very picky and "discriminatory" about which video cards are allowed to do GPU rendering, with seemingly just a few users privileged to have this feature (at least for AMD / OpenCL)... I therefore hope this can be investigated further, and not closed with a simple "we don't support this card at this time" note.
Thank you for those suggestions, and for pointing out that debug parameter. I gave it a try and found some differences with two debug options:
Apr 3 2016
Today I tried to render my scenes with fresh Blender settings (new ~/.config/blender/2.77 directory). To my surprise, the issue went away entirely! Blender also uses a lot less memory at startup and while rendering now.
Very well. This was actually easier than I thought: a simple Suzanne head with a Subsurf level of 6 is enough to visibly trigger the problem. These are the results I'm getting with it:
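(For anyone who wants to reproduce the setup from a script rather than manually, a rough bpy equivalent, assuming the 2.7x API and with illustrative names, would be:)

```python
import bpy

# Add a Suzanne (monkey) primitive at the 3D cursor.
bpy.ops.mesh.primitive_monkey_add()
obj = bpy.context.active_object

# Add a Subdivision Surface modifier and raise both the viewport
# and render subdivision levels to 6.
mod = obj.modifiers.new(name="Subsurf", type='SUBSURF')
mod.levels = 6
mod.render_levels = 6
```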
Jan 16 2016
Thanks for the clarification! Although I wanted to hope they weren't accurate, or that this might change, several people have stated that this patch does too little to improve performance in production scenes. If a whole new implementation is needed to get the best results, I can agree that's better, even if it takes a bit longer to happen. Otherwise, I'm familiar with being busy with multiple software projects and not being able to get as much done as you'd wish... it would be wrong to expect too much, though I do hope we might see something interesting during 2016 if all goes well. Thank you and good luck!
Somehow I predicted this was going to happen... which is why I kept poking the Blender team about getting this done sooner, before it got abandoned. If an alternative is in the works, however, not all hope is lost. Any links to the new project, please?
Is there any news on when we might expect this to enter Blender, even as an Experimental feature? As someone who struggles a lot with the noise vs. render time tradeoff (no GPU support here), I'm eagerly waiting for it to happen! Taking the forum thread into account, this has been around for over a year now.
Aug 8 2015
That should not be a huge problem, and it's probably the only workable solution to begin with: first of all, you'd get double stereoscopy if both the 3D surface and the envmap texture were stereoscopic, and envmaps are typically cube maps, so things like eye convergence probably aren't applicable anyway. I don't have Blender set up to compile here, so I'll try the fix once 2.75b or 2.76 is out. Thank you for solving this.
Aug 7 2015
The issue is so quick to reproduce that I figured one wouldn't be needed. But here it is: simply press F12 and Blender should crash.
May 13 2015
Feb 9 2015
I'm not completely certain whether it's GE-only or affects the viewport as well. In the example blend I posted, it only happens in the GE. But in other cases, smooth shading doesn't work on certain meshes in the viewport, and flickers when viewed from certain angles. The issues look quite similar, and I can't tell if they have the same root cause.
Feb 8 2015
A third person confirmed the issue in the #gameblender IRC channel. They didn't mention as much detail as @Sybren A. Stüvel (sybren) did... just that they remember noticing the issue in 2.68, which means it might be older than I initially thought.
Feb 2 2015
Sep 21 2014
Sep 20 2014
Jun 30 2014
Thank you, I'll keep this alternative in mind. Still, I prefer the noise texture for simulating static noise, since it seems to give the best effect and precisely what I need. As long as this texture continues to exist and remains usable, of course... I can understand if development changes might make it unusable in the future.
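(For context, the kind of setup I mean is just a Noise texture on a material; roughly, assuming the 2.7x Blender Internal API and with made-up names, something like this:)

```python
import bpy

# Create a Noise texture datablock ('NOISE' is the Blender Internal noise type).
tex = bpy.data.textures.new(name="StaticNoise", type='NOISE')

# Create a material and attach the texture through a texture slot, so it can
# be used as an animated "TV static" pattern.
mat = bpy.data.materials.new(name="StaticMaterial")
slot = mat.texture_slots.add()
slot.texture = tex

# Assign the material to the active object (assumes a mesh object is selected).
bpy.context.object.data.materials.append(mat)
```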
Jun 29 2014
I'm also on Linux, using x64 Blender 2.70, official download from the website. I updated to 2.71 and the problem persists.