
VR input integration and mapping
Confirmed, Normal · Public · To Do


The following lists a number of higher-level questions related to VR input that need to be figured out.

How to implement support for operators, tools and gizmos for VR interaction?
Currently, operators, tools and gizmos assume 2D-based interaction. Adding a third dimension is not trivial, and we need to think about how best to do this. Possible solutions:

  • Add an invoke_3d() and/or modal_3d() callback to operators. The downside is that C-defined operators need this callback to be implemented in C too; it can’t just be implemented in Python.
  • Add wrapper operators (possibly in Python) which only call the wrapped operators to apply the operation. This may not work well with modal operators, and Python wrappers are limited by the Python API’s capabilities.
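
The wrapper idea could be sketched roughly as follows. This is a hypothetical illustration in plain Python (no actual Blender API); the class and method names are made up, and the projection is deliberately trivial:

```python
# Hypothetical sketch of the wrapper-operator idea: a 3D wrapper maps the
# controller position to 2D view coordinates and forwards them to a wrapped
# 2D operator. Names are illustrative, not real Blender API.

class Translate2D:
    """Stand-in for an existing 2D operator that expects mouse coordinates."""
    def invoke(self, mouse_x, mouse_y):
        return ("moved", mouse_x, mouse_y)

class Translate3DWrapper:
    """Wrapper that maps a controller pose to 2D and calls the wrapped op."""
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def invoke_3d(self, controller_pos, viewport_size):
        # Trivial orthographic mapping: drop the depth axis and scale the
        # remaining normalized coordinates (-1..1) to pixels.
        x, y, _z = controller_pos
        w, h = viewport_size
        return self.wrapped.invoke(int((x + 1.0) * 0.5 * w),
                                   int((y + 1.0) * 0.5 * h))

wrapper = Translate3DWrapper(Translate2D())
print(wrapper.invoke_3d((0.0, 0.0, -1.5), (1920, 1080)))  # → ('moved', 960, 540)
```

A real wrapper would need the actual view projection and would still hit the modal-operator limitation mentioned above, since intermediate 3D motion has no 2D event to map to.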

How will picking with controllers work?
When users point into the VR space with a controller, we need to figure out what it is pointing at.
Note: We can probably fake some picking feedback while no operations are invoked, i.e. we can draw a little dot on surfaces the controller points at with the help of the depth buffer of the current frame.
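
For geometry-based picking (as opposed to the depth-buffer trick), the core operation is a ray cast from the controller pose. A minimal, self-contained sketch of this using ray/triangle intersection (Möller–Trumbore), in plain Python with no Blender API:

```python
# Sketch of controller picking: cast a ray from the controller pose against
# a triangle and return the hit point where a pointer dot could be drawn.

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def pick_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the hit point on the triangle, or None if the ray misses."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    t_vec = sub(origin, v0)
    u = dot(t_vec, p) * inv         # first barycentric coordinate
    if not 0.0 <= u <= 1.0:
        return None
    q = cross(t_vec, e1)
    v = dot(direction, q) * inv     # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv            # distance along the ray
    if t < eps:                     # hit behind the controller
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# Controller at the origin pointing down -Z at a triangle two units away:
hit = pick_triangle((0, 0, 0), (0, 0, -1),
                    (-1, -1, -2), (1, -1, -2), (0, 1, -2))
print(hit)  # → (0.0, 0.0, -2.0)
```

In Blender this would of course go through the existing ray-cast/BVH infrastructure rather than per-triangle Python loops; the sketch only shows the shape of the problem.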

How should the abstract VR actions (like “Teleport”) be implemented in our event system?
There’s a good chance we’ll offload VR rendering onto a separate thread with its own main loop. Should we query and handle actions there, separate from the usual event system?
Also, some information needs to be updated every frame, e.g. the controller position and rotation, so it is not event-driven like the rest of Blender. How can this information be dispatched to modal operators (e.g. so that the transform tool keeps updating the object position as the controller moves)?
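
One conceivable scheme (an assumption, not a settled design) is a thread-safe “latest value” slot: the VR thread publishes the newest controller pose at headset rate, and the main loop samples it once per frame and hands it to running modal handlers, instead of queuing every intermediate pose as an event. A minimal sketch:

```python
# Hypothetical per-frame dispatch: the VR thread overwrites a shared slot
# with the newest pose; the main loop samples it each frame for modal ops.

import threading

class LatestPose:
    """Keeps only the newest pose; intermediate updates are overwritten."""
    def __init__(self):
        self._lock = threading.Lock()
        self._pose = None

    def publish(self, pose):
        with self._lock:
            self._pose = pose

    def sample(self):
        with self._lock:
            return self._pose

class GrabModal:
    """Stand-in modal handler: keeps an object glued to the controller."""
    def __init__(self):
        self.object_pos = (0.0, 0.0, 0.0)

    def modal(self, pose):
        if pose is not None:
            self.object_pos = pose["position"]

slot = LatestPose()
op = GrabModal()

# The VR thread would call publish() at headset rate:
slot.publish({"position": (0.1, 0.0, -0.5), "rotation": (1, 0, 0, 0)})
slot.publish({"position": (0.2, 0.0, -0.5), "rotation": (1, 0, 0, 0)})

# The main loop samples once per frame; only the newest pose is seen:
op.modal(slot.sample())
print(op.object_pos)  # → (0.2, 0.0, -0.5)
```

The design choice here is overwrite-rather-than-queue: for continuously changing state like poses, stale intermediate values are useless, which sidesteps flooding the event queue. Discrete actions like “Teleport” would presumably still go through real events.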

How to implement 2D elements for 3D?
For VR GUIs, we’d want to display quite a few 2D elements in the VR space, for example pie menus, popups and all kinds of regular widgets (checkboxes, number buttons, etc.). Of course, they’d have to be interactive, too.
It may also be useful to allow displaying any Blender editor as a floating plane in the 3D environment and allowing interaction with it via the controllers.
Note that this should never be the predominant way to interact with the Blender UI in VR. It’s just something that can be implemented without too much trouble while being useful to work around limitations of the VR UI, without having to take off the HMD. The exception, not the rule.
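
The geometric part of interacting with such a floating plane is straightforward: intersect the controller ray with the plane, then express the hit point in the plane’s local axes so that regular 2D hit-testing can take over. A hypothetical sketch (all names and the plane parametrization are made up for illustration):

```python
# Map a controller ray hit on a floating UI plane to 2D coordinates.
# The plane is given by an origin point, two in-plane axes, and a normal.

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))

def plane_to_2d(origin, direction, plane_origin, x_axis, y_axis, normal):
    """Return (u, v) in the plane's local units, or None if parallel/behind."""
    denom = dot(direction, normal)
    if abs(denom) < 1e-8:
        return None                 # ray parallel to the plane
    t = dot(sub(plane_origin, origin), normal) / denom
    if t < 0.0:
        return None                 # plane is behind the controller
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    local = sub(hit, plane_origin)
    return (dot(local, x_axis), dot(local, y_axis))

# A panel facing the user at z = -2; the controller points straight at it:
print(plane_to_2d((0.25, 0.25, 0), (0, 0, -1),
                  (0, 0, -2), (1, 0, 0), (0, 1, 0), (0, 0, 1)))
# → (0.25, 0.25)
```

The resulting (u, v) could then be scaled to the embedded editor’s pixel size and fed to its ordinary 2D event handling, which is what keeps this approach relatively cheap to implement.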