
Virtual Reality (XR/VR/AR/MR)


The description here and in the sub-tasks is based on a design document by @Julian Eisel (Severin): VR/XR Big Picture Kickoff. Please don't take it as an official document.

Goals and Overall Vision

XR enables an entirely new way to work with computer generated 3D worlds. Instead of trying to take the UIs we’ve already implemented for traditional monitor use and make them work in XR, we should establish an experience on entirely new grounds. Otherwise, what we’ll end up with is just a different, more difficult way to use Blender.
So to enter the new immersive world, we should think about which workflows could benefit the most from XR, and carefully craft a new, improved experience for them. We shouldn't try to enable huge amounts of features for XR, risking a bad outcome - or even total failure - but find out which features matter the most and focus on them first.

Long term, the experience can then be enriched further, with more features added as smaller, more focused projects. Eventually, XR would allow creating 3D content with unprecedented interactivity: the entirety of Blender's content creation capabilities available in an immersive 3D experience. However, this should be the long-term vision for XR, not the goal of this initial project.


All VR support in Blender will be based on OpenXR, the new specification for VR, AR, MR, etc. - or XR for short. It is created by the Khronos Group and has a huge number of supporters from all over the VR industry. It is likely to become the standard for VR/AR/MR/... (XR).

OpenXR splits the workload between the application (Blender) and the OpenXR runtime. The runtime handles all the device-specific details, provides many common features (e.g. time warping) and can add extensions to the OpenXR API. So the devices Blender will support are defined by the available OpenXR-compatible runtimes. Currently:

  • Windows Mixed Reality
  • Oculus (though not officially released)
  • Monado (FOSS Linux OpenXR runtime)
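
To illustrate this split: on the application side, the handshake with whichever runtime is installed is fairly small. Below is a minimal sketch of creating an OpenXR instance and querying for an HMD system using the standard OpenXR 1.0 C API (error handling mostly omitted; this requires the OpenXR SDK headers and loader plus an installed runtime to actually build and run, so treat it as illustrative rather than a complete program):

```c
#include <string.h>
#include <openxr/openxr.h>

int main(void)
{
  /* Describe the application to the runtime. */
  XrInstanceCreateInfo create_info = {.type = XR_TYPE_INSTANCE_CREATE_INFO};
  strncpy(create_info.applicationInfo.applicationName, "Blender",
          XR_MAX_APPLICATION_NAME_SIZE);
  create_info.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

  /* The installed runtime (WMR, Oculus, Monado, ...) services this
   * instance; the application never talks to devices directly. */
  XrInstance instance = XR_NULL_HANDLE;
  if (xrCreateInstance(&create_info, &instance) != XR_SUCCESS) {
    return 1;
  }

  /* Ask the runtime for a head-mounted display. Which physical device
   * answers is entirely up to the runtime, not the application. */
  XrSystemGetInfo system_info = {.type = XR_TYPE_SYSTEM_GET_INFO};
  system_info.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
  XrSystemId system_id = XR_NULL_SYSTEM_ID;
  xrGetSystem(instance, &system_info, &system_id);

  xrDestroyInstance(instance);
  return 0;
}
```

Everything device-specific (display timing, distortion correction, tracking) happens behind these calls inside the runtime, which is why Blender's supported hardware is determined by the runtimes rather than by Blender itself.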

For information on how to test our current OpenXR driven implementation, see

State of the Project

While there are multiple initiatives related to Blender and VR (e.g. MPX and Marui's Blender XR), the first important milestone for XR support in mainline Blender is being completed as part of an ongoing Google Summer of Code project. It aims to bring stable and well-performing OpenXR-based VR rendering support into the core of Blender. It is not supposed to enable any interaction; it is solely focused on the foundation of HMD support. For more info, refer to D5537.

The one important part that's missing is controller support, or more precisely, input and haptics support. Although we can do some experiments, it would be helpful to first get a better idea of how VR interaction should eventually work, before much effort is put into supporting this.

This is the foundation that we can build on. It should allow people with experience in the field to join efforts to bring rich immersive experiences to Blender.

Development Process

The proposal is to follow a use-case-driven development process. That means we define a number (say 10-20) of specific use cases. These should be chosen wisely, based on what we think has the most potential in XR.
First, just vaguely describe the requirements the system has for a use case; a bit later on, define which tools and options are needed to complete the use case (possibly by a specific persona).

On top of that, the process should be iterative. That means we define short-term goals for an iteration, work on them, and evaluate afterwards before moving on to the next iteration. Iterations may include implementation work. So we don't do a waterfall-like "design everything first", but there should be enough design work done before the first implementations are made.

Any implementation should be driven by a use-case (or multiple) that the team agrees is drafted out well enough.

The Team

Dalai Felinto is available for the role of coordination lead, and Julian Eisel can lead development on the Blender source code side. The MPX team cares a lot about usability for drawing-related workflows and is interested in joining as well. There are also people from MARUI who have shown much interest in helping with tools and general UI development.
We further have quite a few candidates for artist stakeholders. It's important that we have artists who care not only about usability, but also about the production-environment aspects of the project's outcome.

This seems to be a good mix of people with complementary roles. We should get together soon to define the overall project goals and the general vision for XR in Blender, and to get work started.


Old discussion task: T47899.


To Do

Event Timeline

Dalai and I agreed on merging T47899 into this. The old task can still be referred to, but development took new paths that are better discussed here and in the sub-tasks. We'd also like to discuss things on a broader level - the "big picture" for VR in Blender.
Will publish more info soon.