
Cycles: Different result when a procedural noise map used as displacement versus a material blend map.
Closed, Archived · Public

Description

System Information
Operating system:
Windows 10 1809
Graphics card:
GTX 980 Ti

Blender Version
Broken:
2.80, 4ef09cf937f2, blender2.8, 2019-02-02

Short description of error

I'm trying to use a Cycles procedural noise to drive both the displacement and the blending between two shaders on an object, with the goal of having one shader on the displaced areas and another on the undisplaced areas. However, the displacement does not match up 1:1 with the mixed shaders, despite using the same noise to drive both.

You can see the shader blend is close to matching the displacement, but is off in several places.

The example file and render use a standard Subdivision Surface modifier on the sphere; the same mismatch is present when using adaptive displacement as well.

Edit:

I figured out what is happening. Despite being driven from the same noise, the noise seems to be generated/sampled twice: once on the undisplaced mesh to generate the displacement, then again after the geometry is displaced. Because the displaced geometry occupies a different location in space than the undisplaced geometry, the noise evaluates differently there.
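The double evaluation described above can be sketched with a toy model (the noise function, the sample point, and the displacement strength below are all made up for illustration; this is not Cycles' actual noise):

```python
import math

def noise3d(x, y, z):
    # Toy stand-in for a procedural 3D noise texture: a deterministic,
    # smooth function of position returning a value in [0, 1].
    return 0.5 + 0.5 * math.sin(12.9898 * x + 78.233 * y + 37.719 * z)

rest = (0.3, 0.7, 0.2)      # a point on the undisplaced mesh (hypothetical)
normal = (0.0, 0.0, 1.0)    # surface normal at that point
strength = 0.5              # displacement strength

# First evaluation: sampled on the undisplaced mesh, drives the displacement.
d = noise3d(*rest)

# The point moves along its normal by the displaced amount.
displaced = tuple(p + strength * d * n for p, n in zip(rest, normal))

# Second evaluation: sampled on the displaced geometry, drives the shader mix.
blend = noise3d(*displaced)

print(f"displacement value: {d:.4f}")
print(f"shader blend value: {blend:.4f}")  # differs: sampled elsewhere in space
```

Because `blend` is sampled at `displaced` rather than `rest`, the two values disagree, which is exactly the mismatch visible in the render.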

Increasing the amount of displacement increases the amount of mismatch.

So now I wonder if this is expected behavior, or whether it would be desirable to "freeze" the noise generated before displacement and use that result through the whole shader chain.

Exact steps for others to reproduce the error
Open the attached .blend file, and render.

Details

Type
Bug

Event Timeline

I'm not sure if this should be considered a bug or more a feature request.
Sometimes you might want to evaluate the noise twice.

In any case, could you please create an as-simple-as-possible .blend file to reproduce the issue?

@Brecht Van Lommel (brecht), is this a bug or feature request?

Brecht Van Lommel (brecht) claimed this task.

Not a bug, generated or uv texture coordinates should be used instead.

Yeah, this is due to the textures being 3D and the coordinates used. This doesn't happen with UV coordinates, which are 2D, or with Generated coordinates, which stick to the surface. It only happens when using Position or Object coordinates and the like: the newly elevated surface intersects the texture at a different point in space. I recommend using UV or Generated coordinates in the meantime.
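The difference between the coordinate types can be made concrete with a small sketch (toy noise function and made-up coordinates, not Blender's implementation): Object/Position coordinates are re-evaluated at the displaced location, while Generated coordinates are captured on the rest-position mesh and therefore "stick" (UVs, being 2D surface attributes, don't change under displacement at all):

```python
import math

def noise3d(x, y, z):
    # Toy stand-in for a procedural 3D noise texture.
    return 0.5 + 0.5 * math.sin(12.9898 * x + 78.233 * y + 37.719 * z)

rest = (0.3, 0.7, 0.2)          # rest-position point on the mesh (hypothetical)
generated = rest                # Generated coords: stored from the rest mesh
displaced = (0.3, 0.7, 0.45)    # same point after displacement along its normal

d = noise3d(*rest)              # the value that drove the displacement

# Object/Position-style coordinates follow the point through space,
# so the shader mix samples the texture somewhere else:
assert noise3d(*displaced) != d

# Generated-style coordinates stick to the surface, so the shader mix
# samples exactly the value that produced the displacement:
assert noise3d(*generated) == d

print("sticking coordinates match the displacement; world-space ones do not")
```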

Note that the current behavior is also useful in other cases, so I don't intend to make changes here. Generated and UV are specifically intended to stick to the surface, while Object and World are not.