VR: Python API for Controller Interaction
We want to provide Python developers with sufficient tools to build their own VR experiences. That means the Python API needs to give access to devices (including controllers), but in a way that is as device- and platform-independent as possible.
It's probably best to first think about what we want the Python API to look like, before thinking about the internal C/C++ code design. The former has a big impact on the latter.
About the OpenXR input and haptics design
- OpenXR tries to give much control to the applications using it, in a way that does not impose restrictions on device design.
- Therefore, it tries to operate at an abstract level. E.g. rather than addressing a controller button as "A", it is addressed via a custom, abstract action like "teleport".
- These abstract actions are entirely defined by the application and bound to higher-level controller events identified by a path in a semantic path tree, e.g. "/user/hand/right/input/trigger/click".
- These paths are defined in device-specific interaction profiles as part of the OpenXR specification. OpenXR runtimes (e.g. Monado, Oculus or WinMR) may however rebind them, e.g. depending on a custom user configuration.
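To illustrate the model described above, here is a small pure-Python sketch of the action concept: an application-defined abstract action with a suggested semantic binding path. The class and attribute names are made up for illustration; they are neither OpenXR nor Blender API.

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    # Abstract, application-defined action, e.g. "teleport".
    name: str
    # Semantic binding paths suggested by the application; the OpenXR
    # runtime may rebind them, e.g. based on user configuration.
    bindings: list = field(default_factory=list)

@dataclass
class ActionSet:
    # Actions are grouped into sets that can be activated together.
    name: str
    actions: list = field(default_factory=list)

    def add_action(self, name, binding):
        action = Action(name=name, bindings=[binding])
        self.actions.append(action)
        return action

action_set = ActionSet(name="gameplay")
teleport = action_set.add_action(
    name="teleport",
    binding="/user/hand/right/input/trigger/click",
)
```

Note how the application only ever deals with the abstract action ("teleport"); the physical button behind the binding path is the runtime's business.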
Assessment for Blender
The level of abstraction in OpenXR seems very appropriate for a Python API too. It provides sufficient flexibility and device independence for Add-ons. So rather than doing much work in C/C++ to hide the OpenXR design, its principal design could be forwarded to the Python API.
-> As more devices become available via OpenXR (e.g. a quite different kind of controller, like the Logitech VR Ink), no changes in C/C++ code are needed for an Add-on to use them.
OpenXR actions could be nicely integrated with the Blender design. Rather than specifying an action directly, an operator name could be specified. The C code could then automatically invoke an operator upon an action event.
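The dispatch described above could look roughly like the following pure-Python sketch, which simulates the C side invoking an operator when an action event arrives. The dispatch table, the stand-in for the operator call, and the operator name are all hypothetical.

```python
# Action name -> operator idname, as registered by Add-ons (hypothetical).
action_operators = {
    "teleport": "view3d.vr_addon_teleport",
}

invoked = []

def invoke_operator(idname):
    # Stand-in for the C-side operator call; here we just record it.
    invoked.append(idname)

def on_action_event(action_name):
    # Called by the (hypothetical) C code when the OpenXR runtime reports
    # that a bound input fired the action.
    idname = action_operators.get(action_name)
    if idname is not None:
        invoke_operator(idname)

on_action_event("teleport")
```

The key point is that the Add-on never polls buttons; it only names an operator, and the event-to-operator dispatch stays in C.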
Since OpenXR and the OpenXR runtime define the interaction profiles (mapping of physical buttons to higher level concepts like "trigger"), Blender should not do any additional mapping (e.g. no custom key-maps within Blender).
OpenXR actions have to be created and assigned to an action set at a specific point during VR session startup. So Add-ons need to be able to hook into that point to create their own actions. This could be done with an application handler, e.g. bpy.app.handlers.xr_session_start.
With that handler, creating actions could look like this:
```python
@persistent
def action_set_setup_handler():
    # Action set for this add-on.
    action_set = bpy.types.XrAction.create_action_set(name="Foo Add-on Actions")

    action = action_set.add_action(
        binding="/user/hand/right/input/trigger/click",
        operator="view3d.vr_addon_teleport",
    )
    action.some_enum_operator_option = 'ENUM_VALUE'
```
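To show where such a handler would fit in, here is a pure-Python sketch of the proposed mechanism: the handler list name (xr_session_start) follows the proposal above, everything else is an illustrative stand-in, not real bpy API.

```python
# Stand-in for the proposed bpy.app.handlers.xr_session_start list.
xr_session_start = []

created_action_sets = []

def action_set_setup_handler():
    # An Add-on's setup function; here it just records what it would create.
    created_action_sets.append("Foo Add-on Actions")

# The Add-on registers its handler, like with existing bpy.app.handlers.
xr_session_start.append(action_set_setup_handler)

def begin_vr_session():
    # The C side would run all registered handlers at this point: after the
    # OpenXR session is created, but before action sets are attached.
    for handler in xr_session_start:
        handler()

begin_vr_session()
```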
We'll probably need a way to drive modal operators through OpenXR actions too. This could be defined in a similar fashion; a proposal for that is a TODO.
Without extensions, the only haptic feedback supported by OpenXR is vibration. The OpenXR action for this can be created automatically, so Add-ons can trigger this on demand:
```python
# ... regular operator code
bpy.types.XrHaptics.vibrate(duration=300, intensity='LOW')
```