
depth clipping visualization errors
Closed, Archived · Public

Description

It seems that depth-clipped objects present visualization errors when in vertex/face select mode. Edge select mode, by contrast, seems to work fine.

The images show the situation:
vertex select mode + depth clipping = visualization errors (I can see vertices that should be clipped)
face select mode + depth clipping = visualization errors (I can see faces that should be clipped)
edge select mode + depth clipping = correctly visualized.

Details

Type
OpenGL Error

Event Timeline

Logged In: NO
user_id=3475

Missing hardware info:
Linux Fedora Core 3
Vanilla Kernel 2.6.10
X.org 6.8.1
Ati Radeon Mobility U1 (IGP320M).

Under Windows XP, by contrast, it works fine.

Logged In: YES
user_id=103

Check the z-buffer depth resolution you have. It looks like a low-resolution buffer to me, 16 bits?
No real fix is possible, and it's not really a bug... the only thing we could provide is some OpenGL tweaking menu for this stuff. Will move to the OpenGL tracker.

Logged In: NO
user_id=3475

Hi Ton.
Personally, I don't know how to verify this under Linux with these drivers.
The only thing I know about these drivers is that they currently don't support more than 24 bits. If someone is kind enough to show me how to tweak this, I'd be happy.

Logged In: YES
user_id=103

I doubt you'll get help here; this is the bug tracker. :)
Try the forums, or search with Google...

You can also improve the z-buffer resolution by making the "clip start" as large as possible and the "clip end" as small as possible. For the 3D windows, these settings are in a panel, in the menu "View -> View Properties".
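To see why this helps: with a perspective projection the z-buffer is strongly non-linear, and the smallest depth difference it can resolve at a given eye distance grows with the clip range. A minimal sketch in plain Python (independent of Blender; the function name depth_step is just for illustration), using the standard window-space depth mapping d(z) = (far / (far - near)) * (1 - near / z) quantized to 2**bits - 1 steps:

def depth_step(z, near, far, bits):
    # Smallest eye-space depth difference resolvable at distance z:
    # one quantization step divided by the slope d'(z) of the depth mapping.
    steps = (2 ** bits) - 1
    return z * z * (far - near) / (far * near * steps)

# Compare a wide and a tight clip range at 16 and 24 depth bits.
for (near, far) in [(0.1, 1000.0), (1.0, 100.0)]:
    for bits in (16, 24):
        print "clip %g..%g, %d bits: step at z=10 = %g" % (near, far, bits, depth_step(10.0, near, far, bits))

With 16 bits and a 0.1..1000 clip range, surfaces at distance 10 need to be roughly 0.015 units apart to receive distinct depth values; tightening the range to 1..100 improves that roughly tenfold, which is the effect described above.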

Logged In: NO
user_id=3475

OK, Ton.
Thanks to alien-xmp and his Python script:

from Blender.BGL import *

# Print the driver identification strings.
print
print "OpenGL Driver"
print "---------------------------------------------"
print "Vendor : " + glGetString(GL_VENDOR)
print "Renderer : " + glGetString(GL_RENDERER)
print "Version : " + glGetString(GL_VERSION)
print

# Query the depth buffer resolution in bits.
depth_bits = Buffer(GL_INT, 1)
glGetIntegerv(GL_DEPTH_BITS, depth_bits)

print "Depth Bits : %d" % (depth_bits[0])


I receive the following output:
OpenGL Driver
---------------------------------------------
Vendor : Tungsten Graphics, Inc.
Renderer : Mesa DRI Radeon 20030328 AGP 4x x86/MMX+/3DNow!+/SSE NO-TCL
Version : 1.2 Mesa 6.1

Depth Bits : 24
Saved session recovery to /tmp/quit.blend

Blender quit

So I was not using a 16 bit depth buffer.

Logged In: YES
user_id=2923

Please confirm that this is still a problem with current CVS builds. Any unconfirmed bugs by the end of the week will be closed (they can be reopened, though).

You can get a current CVS build from

http://blender.org/forum/viewforum.php?f=18

Logged In: NO
user_id=3475

Confirmed. This problem still applies. I cannot verify this with the current X.org 7 at the moment, but with X.org 6.8.x it is still a valid problem.

This is a generic request to test your bug report and see if it is still an issue in 2.5alpha2. If so, please let me know by making a comment in this report, i.e. 'also in 2.5alpha2', and I will add it to the 2.5 bug list.

Matt Ebb (broken) closed this task as Archived. Mar 26 2010, 6:15 AM