
Architecture Texture Coordinates in Cycles
Open, Normal, Public


Hello dear Cycles team and developers :)

I've been working on an archviz project and stumbled upon a shortcoming of Cycles: we don't have a straightforward way of setting textures with world-based dimensions. This is a must for architecture, and I can give a few real-world use cases if requested.

As a Blender user I came up with a node system that can handle this. You can check my approach here: Basically, what this does is take the Object texture coordinate and transform it based on the face normal (so we have an unstretched mapping for X, Y and Z).
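A minimal sketch of that normal-based selection in plain Python (not the actual node group; all names are hypothetical). Each face picks its projection plane from the dominant axis of its normal, which is what a hard box/cube mapping does:

```python
def box_map_uv(position, normal):
    """Pick a UV pair from an object-space position based on the
    dominant axis of the face normal (hard box mapping)."""
    x, y, z = position
    nx, ny, nz = (abs(c) for c in normal)
    if nx >= ny and nx >= nz:
        return (y, z)   # face points along X: project onto the YZ plane
    elif ny >= nx and ny >= nz:
        return (x, z)   # face points along Y: project onto the XZ plane
    return (x, y)       # face points along Z: project onto the XY plane
```

Since the position is taken in object space, each axis-aligned face gets coordinates with no stretching, matching the behavior described above.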

As a Blender/Cycles developer I think we can do even better, and get this feature into master once we agree on the design and approach for the implementation.

I wonder why this never came up before as an issue. As a side note, if you read the Expose 3D Blender migration post on BlenderNation, you will find that they consider the "Lack of architectural UV mapping" the number one "flaw" of Cycles rendering.

My initial suggestion would be to have a Node where you could control (via socket input) the offset, scale and rotation of the texture. At first I would like to keep it simple and have the offset be a Vector(2), the Scale a Vector(2), and the Rotation a Float. In other words, I wouldn't bother with having per-axis nitty-gritty control. Also, the object size should NOT affect the texture size (unlike my original approach, or even the one from the "Sure UVW mapping" addon used by the Expose 3D team). I'm not sure if this last bit is possible though.
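A sketch of what such a node would compute, assuming the socket layout proposed above (offset Vector(2), scale Vector(2), rotation Float). The order of operations here — scale, then rotate, then offset — is my assumption, not a settled design:

```python
import math

def map_uv(uv, offset=(0.0, 0.0), scale=(1.0, 1.0), rotation=0.0):
    """Apply the proposed offset/scale/rotation controls to a 2D
    coordinate. Rotation is in radians, applied before the offset."""
    u, v = uv[0] * scale[0], uv[1] * scale[1]
    c, s = math.cos(rotation), math.sin(rotation)
    return (u * c - v * s + offset[0],
            u * s + v * c + offset[1])
```

A single rotation Float (rather than per-axis angles) keeps the node as simple as suggested, at the cost of only supporting in-plane rotation.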

Any thoughts?



Event Timeline

Dalai Felinto (dfelinto) updated the task description. (Show Details)
Dalai Felinto (dfelinto) raised the priority of this task from to Needs Triage by Developer.
Dalai Felinto (dfelinto) set Type to Design.
Dalai Felinto (dfelinto) updated the task description. (Show Details)

I am unsure how robust your approach is to the corner alignment problem, but even if it is 100% robust, I am still not sure we might not want to keep the generation for this on the Blender side of things.

For some kinds of textures, in combination with artistic preferences, it will likely remain necessary to be able to manually tweak things.

I think a way to force world scale on textures independent of what the object scale may be, or to unwrap to world scale / unwrap with these architectural properties kept intact, could also solve this issue.

For RhinoCycles we have implemented a set of texture coordinates that might be useful for this:

Env Spherical
Env EMap
Env Light probe
Env Box
Env Cubemap
Env Cubemap Vertical Cross
Env Cubemap Horizontal Cross
Env Hemispherical

This probably covers most of what you want.

I can prepare a patch for the Cycles side. The Blender side still needs adding, since we didn't need that.

nodes.cpp:

edit: almost forgot to mention that to successfully use this we also added a MatrixMathNode:

This would be a great addition not only for architecture; it would save a lot of time for many artists. Maybe a UV modifier would be an even better solution, to make it real-time, viewport-ready, and compatible with all rendering engines out of the box.
But the node solution from Nathan is already done and gives more flexibility for artists as mentioned by Martijn, so it looks like the best solution.

To specify lengths in world space and align to the corner of the bounding box, it should be possible to start from generated coordinates, which are normalized 0..1 inside the bounding box. The generated coordinates can be projected based on the object space normal, and then scaled to account for the normalized to object and object to world space transforms.

The object to world space matrix is available to shaders, and the ATTR_STD_GENERATED_TRANSFORM or generated_transform attribute gives you an object space to normalized space matrix. To me that seems enough information to implement this in a shader. For efficiency, scaling could be precomputed and stored as an attribute, but it's not strictly required to get this working.
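A rough sketch of that transform chain, assuming generated_transform is a pure scale-plus-translation matrix mapping object space to the normalized 0..1 space (matrix layout and names are hypothetical, not the Cycles API):

```python
def apply_mat4(m, p):
    """Apply a 4x4 row-major matrix to a 3D point (w assumed 1)."""
    return tuple(m[i][0] * p[0] + m[i][1] * p[1] + m[i][2] * p[2] + m[i][3]
                 for i in range(3))

def invert_scale_translate(m):
    """Invert a matrix made only of per-axis scale + translation,
    which is what a bounding-box normalization matrix looks like."""
    inv = [[0.0] * 4 for _ in range(4)]
    for i in range(3):
        inv[i][i] = 1.0 / m[i][i]
        inv[i][3] = -m[i][3] / m[i][i]
    inv[3][3] = 1.0
    return inv

def generated_to_world(generated, generated_transform, object_to_world):
    """Normalized 0..1 coordinates -> object space -> world space."""
    obj = apply_mat4(invert_scale_translate(generated_transform), generated)
    return apply_mat4(object_to_world, obj)
```

With both matrices available to the shader, the world-space length of one texture repeat falls out of this chain, which is the information needed to pin textures to real-world dimensions.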

A UV modifier would not be able to account for object to world space scaling. Having the world space length specified in the material nodes also seems easier if you want to build a collection of materials and reuse them easily: you could just apply the material and get the correct scale, without having to remember to add a modifier.

Some other ideas:

  • The Vector Transform node could support the normalized space at some point too, if someone wants to make a custom node group to do mappings like this.
  • Rounding the scale factor to an integer could be useful for some types of textures, if you want e.g. floor tiles to fit exactly and are ok with the world space length being slightly different than specified.
  • For the UI, perhaps this could fit with the mapping options built into each Cycles texture node, with some way to specify the texture coordinates without having to add more nodes. We should really make those more discoverable in the UI though. Currently they can only be edited in the properties editor texture tab, when a texture node is active. They are not visible in the node editor; maybe they should be in a popup or panel.
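The integer-rounding idea from the list above can be sketched like this (a hypothetical helper, not existing Cycles code): snap the repeat count over a given span to the nearest integer, accepting a slightly different world-space tile length so the tiles fit exactly.

```python
def fitted_tile_scale(span_length, desired_tile_length):
    """Round the number of repeats over `span_length` to an integer,
    so e.g. floor tiles fit exactly; the returned tile length may
    differ slightly from the one requested."""
    repeats = max(1, round(span_length / desired_tile_length))
    return span_length / repeats  # actual world-space tile length used
```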

Please forgive me if I am missing something trivial here, but I am an architect working daily in Blender and rendering in Cycles (hell I am doing it right this very moment ;) ) and for a very very very long time I ached for something like this in Blender.
It was my number one feature request for a while; I also thought I wanted that "UV Modifier" like 3DS Max has, and I even went through the Sure UVW mapping addon and the like. That is, until I learned how to use nodes properly.

Isn't all this solved simply by using "Object Coordinates" from the Input > Texture Coordinates node?
That is what it does: it gives you texture sizing in Blender scene units, effectively letting you define real-world sizes for images.
The only problem remaining is when objects are not uniformly scaled, which is quickly solved by applying a Vector Transform node (Local to World) to the output of the Object socket.
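Why the Local-to-World transform helps can be sketched like this: object-space coordinates don't include the object's scale, so applying that scale restores scene-unit lengths. (Rotation and translation are set aside here; this is a simplified illustration, not the actual node internals.)

```python
def object_to_world_units(object_coord, object_scale):
    """Object-space coords ignore the object's (possibly non-uniform)
    scale; applying it component-wise restores scene-unit texture
    sizing, so a 2 m texture repeat stays 2 m after scaling."""
    return tuple(c * s for c, s in zip(object_coord, object_scale))
```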

Ever since I found out about this I never looked back, and I use it daily to set the size of textures for my archviz projects. It is particularly useful for ceramic tiles, wood boards, etc., which usually have precise measurements like 2 m by 20 cm or the like. It even works perfectly for procedural textures like Cycles bricks, with no need for particularly complex node setups and workarounds.

Again please forgive me if I missed something but it seems we are talking about the same thing and it's a problem I haven't really had for a while now.

@Duarte Farrajota Ramos (duarteframos) it works great as long as you have angles near 90° everywhere in your object and your mesh is not rotated, only the object containing it. As soon as you import a model with faces not following the world axes, or have faces with complicated orientations, etc., the simple node setup you mention will give noticeable distortions. For cubic architecture it works, but if you model with NURBS/curves or anything not cubic, it just can't do it properly. Or you'll have to send me your node group :)

Ah I see, that's right. I did manage to make my own "Cube Mapping" custom node that masks the mesh faces through their normals and applies the correct projection to each face.
It can easily be modified to fade smoothly between normals, and with an even more complex node setup it could have more "normal steps" for more accurate projections on curved surfaces; but as you mentioned, it will create visible seams and is bound to fail sooner or later if the mesh is more complex.

One small complaint I do still have, which I think may fit in this discussion, is that the Bricks procedural texture is particularly hard to use on non-planar meshes, at least without unwrapping first, which partially defeats one of the appeals of procedural textures: not having to unwrap.

Since it is always projected from one axis, the same node setup can never be applied successfully on, say, a mesh comprised of walls (vertical surfaces) and floors (horizontal surfaces), as is often the case.
I mean, I can happily use it with my above-mentioned custom cube mapping node (it was the reason I created it in the first place), but a native solution not requiring workarounds would be desirable. For example, it would be nice if the bricks texture were "volumetric" by nature, like 3DS Max's "Tiles" texture.

If you like, I'd be happy to share my node setup.

@Dalai Felinto (dfelinto) Your node setup gives strong distortions on an icosphere for some faces. Some others are good. So it doesn't really take the face normals to project the texture, if I'm right?

made using brick texture from here:
@Duarte Farrajota Ramos (duarteframos) Yes, it would be good to compare with dfelinto's one.

Sorry for the delay in replying; here is my node group setup for Cycles cube mapping.

It basically "masks" the mesh faces by normal, so the correct mapping projection can be applied to each face according to the direction it is facing or it's angle.
It can be used along with the regular Vector Mapping node for further non-destructive adjustments outside the group, and has an additional control for the rotation of the side mapping. I often need it for using directly on bezier curve objects, depending on their orientation in the scene (screenshots 1 and 2).

It works fairly well for relatively orthogonal objects, but of course it creates visible seams in curvy or more organic shapes.
The three Greater Than nodes in the "Masking Normals" frame could easily be replaced by a color ramp for a smooth transition, instead of a hard break.
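The color-ramp variant amounts to blending the three projections with smooth weights derived from the normal, rather than a hard mask. A minimal sketch, where the sharpness exponent plays the role of the ramp width (names hypothetical, not the node group itself):

```python
def triplanar_weights(normal, sharpness=4.0):
    """Blend weights for the X/Y/Z projections, normalized to sum
    to 1. Higher `sharpness` approaches the hard Greater Than
    masking; lower values fade more smoothly across curved faces."""
    w = [abs(c) ** sharpness for c in normal]
    total = sum(w)
    return tuple(c / total for c in w)
```

Each face's color is then the weighted sum of the three projected texture lookups, which hides the seams at the cost of some blurring where projections overlap.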

Curiously, the other day a question came up at Blender Stack Exchange with the same problem I tried to solve with this node setup, but someone else got to it before I did.