VR design / usability #68994

Open
opened 2019-08-21 16:19:48 +02:00 by Dalai Felinto · 13 comments

The following lists a number of higher-level, usability-related questions that have to be figured out.

General Workflow

How will navigation in the virtual world happen?
Early on, we should decide how navigation in the VR view will work. It should probably be usable with and without handheld controllers.
Other VR engines have already implemented their solutions, which may be useful as a reference.
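
For reference, the teleport-style navigation common in other VR engines reduces to intersecting the controller's aim ray with the floor plane and shifting the session origin to the hit point. A minimal sketch of that math (plain geometry; these helper names are ours, not an existing Blender API):

```python
# Hypothetical teleport math, for illustration only: intersect the
# controller's aim ray with the floor plane z = 0, then shift the
# session origin so the user lands at the hit point.
from mathutils import Vector

def teleport_target(aim_origin: Vector, aim_direction: Vector):
    """Return the point where the aim ray hits the floor, or None."""
    if aim_direction.z >= 0.0:
        return None  # Ray points away from (or parallel to) the floor.
    t = -aim_origin.z / aim_direction.z
    return aim_origin + aim_direction * t

def apply_teleport(session_origin: Vector, feet: Vector, target: Vector):
    """Offset the session origin so the user's feet end up at the target."""
    return session_origin + (target - feet)
```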

How can we create workflow-specific VR UIs?
It would be nice if VR UIs integrated well with the other workflow mechanics we have in Blender. For example, there could be a simplified UI for 101-like templates, or specialized sculpting UIs. A version focused on accessibility might also be interesting; VR offers new approaches for users with disabilities.
Basically, we shouldn’t set one VR UI in stone, but should allow people to come up with better or alternative solutions for specific use cases. We’d still deliver a polished default experience, obviously.

This has big technical implications, as it influences how tools, operators, gizmos and other graphical elements have to be implemented.
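
One way to keep those implications contained is to treat the VR UI as data that can be swapped out, similar to how key configurations or workspaces are pluggable. A toy sketch of such a registry (all names hypothetical, purely illustrative):

```python
# Toy registry for swappable VR UI definitions; none of these names exist
# in Blender, they only illustrate keeping the VR UI replaceable.
_vr_ui_registry = {}

def register_vr_ui(name, build_func):
    """Register a VR UI layout, e.g. 'default', 'sculpt', 'accessibility'."""
    _vr_ui_registry[name] = build_func

def build_vr_ui(name="default"):
    """Build the requested UI, falling back to the default layout."""
    return _vr_ui_registry.get(name, _vr_ui_registry["default"])()

register_vr_ui("default", lambda: {"panels": ["tools", "settings"]})
register_vr_ui("sculpt", lambda: {"panels": ["brushes", "symmetry"]})
```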

Operators, Tools & Gizmos

How will operators, tools and gizmos work together in VR?
Will we only provide the active-tool concept, or can we allow non-modal operator execution as well? How will tools and operators be selected/invoked?

How can users customize keymap bindings of controllers?
Note: The OpenXR specification “outsources” this part to the runtime. In Blender we would listen to abstract actions (e.g. “Teleport”) and the OpenXR runtime (Oculus, Windows Mixed Reality, etc.) binds these to a device button/gesture. This makes our job easier: we don’t have to care about device-specific keymap bindings.
So assuming we don’t need to listen to device-specific button events (which might still turn out to be necessary), OpenXR has already answered this question for us.
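
To make that division of labor concrete: Blender would declare named actions and react to their state, while the runtime decides which physical input fires them. A rough sketch of that idea (the handler table and event plumbing are hypothetical, not the OpenXR or Blender API):

```python
# Sketch of the division of labor under OpenXR: Blender reacts to named,
# device-agnostic actions; the runtime decides which physical button or
# gesture fires them.
action_handlers = {
    "teleport": lambda state: print("teleport to", state),
    "grab": lambda state: print("grab with state", state),
}

def on_action_event(action_name, state):
    """Receives action events already bound by the runtime."""
    handler = action_handlers.get(action_name)
    if handler is not None:
        handler(state)

# E.g. the runtime resolved some controller button press to "teleport":
on_action_event("teleport", (1.0, 2.0, 0.0))
```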

Setup

How is the VR reference space defined?
What’s needed is a point in space that acts as the origin of the VR experience, an up direction, and an initial direction to look at (always perpendicular to the up direction?).
Note that the up direction and the origin basically define the floor plane.
The OpenXR runtime calculates the head position/rotation based on that, including eye level above the ground. It can also define the room/movement boundaries.
Other engines probably already figured this out and we may be able to copy them.
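
Turning those three inputs (origin, up direction, look direction) into a well-defined reference space is a small orthonormalization problem: project the look direction onto the floor plane so it is perpendicular to up, then derive the third axis. A sketch with mathutils (the function is ours, not an existing Blender helper):

```python
# Hypothetical helper that builds the reference-space transform from the
# three inputs described above.
from mathutils import Matrix, Vector

def reference_space_matrix(origin: Vector, up: Vector, look: Vector) -> Matrix:
    up = up.normalized()
    # Project the look direction onto the floor plane so it is
    # perpendicular to up, as the question above suggests.
    forward = (look - up * look.dot(up)).normalized()
    right = forward.cross(up).normalized()
    rot = Matrix((right, forward, up)).transposed()  # Columns: X, Y, Z axes.
    mat = rot.to_4x4()
    mat.translation = origin
    return mat

# Floor at the world origin, Z up, looking roughly along +Y:
m = reference_space_matrix(Vector((0, 0, 0)), Vector((0, 0, 1)), Vector((0, 1, 0.2)))
```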

How can settings be accessed from within the VR session?
There are different settings to think about: viewport settings, VR-specific settings (e.g. toggling positional tracking), operator settings, tool settings, etc. We’d probably not expose the Properties editor or user preferences for now, although we could make it possible to show any editor in a 3D floating popup.

How can users change settings for the VR session from outside of it? Do we allow this at all?
For example, users may want to change viewport settings from the regular 2D UI because the VR session renders too slowly. This also matters for users who don’t have controllers.

How are session defaults and startup settings defined?
This is mainly about viewport settings (display mode, lighting, shading, etc.) and VR-specific settings. You may want to set these up even before the session starts, and you may want to define your own defaults, too.
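
A possible shape for such defaults, sketched as a plain settings snapshot that gets applied when the session starts (the container is hypothetical; the viewport attributes it writes to do exist on bpy.types.SpaceView3D):

```python
# Hypothetical container for VR session startup settings; it only
# illustrates the kind of data that would need a home in the startup
# file or preferences.
from dataclasses import dataclass

@dataclass
class VRSessionDefaults:
    shading_type: str = "SOLID"      # Display mode the session starts in.
    positional_tracking: bool = True
    clip_start: float = 0.1
    clip_end: float = 100.0

def apply_defaults(space_view3d, defaults: VRSessionDefaults):
    """Copy stored defaults onto a 3D viewport before the session starts."""
    space_view3d.shading.type = defaults.shading_type
    space_view3d.clip_start = defaults.clip_start
    space_view3d.clip_end = defaults.clip_end
```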


Added subscribers: @dfelinto, @JulianEisel


Added subscriber: @SimeonConzendorf


Added subscriber: @RainerTrummer


Added subscriber: @Somnolent


Added subscriber: @aronkvh


As an outsider noob, I'd suggest checking out [this SketchUp extension](https://vrsketch.eu/index.html) to see how they managed to achieve simplicity and efficiency (in SketchUp, which is otherwise a slow program; this extension is the only reason I still use it).

Added subscriber: @Sj

Added subscriber: @Sj

Added subscriber: @Skyfish

Added subscriber: @Skyfish

Adobe is plowing ahead in this field after they bought Oculus Medium. I used Medium quite a bit (Vive + Steam + Revive) and would say the way they solved their UI is intuitive and not overly complex. In the new Adobe Substance 3D Modeler, where they use tech from Medium, Adobe is now increasing complexity and adding more and more features that look intuitive and relatively minimalist while still retaining function; it's a nice case study for how UI in VR modeling can be done. Here is a video that shows basic operations: https://youtu.be/DR6U2tX_73A

There are a few other VR apps worth studying for how they try to solve these same challenges: Kodon (https://store.steampowered.com/app/479010/Kodon/), which tries to do both sculpting and poly modeling separately; Masterpiece Studio (https://masterpiecestudio.com/), which was best at sculpting (I haven't tried the Studio version); and Gravity Sketch (https://www.gravitysketch.com/), which has interesting takes on two-handed poly-modeling UI.

I am an artist without much tech knowledge, but if there is an arena where I can take part in discussion or test UI (with help), I have a Vive headset and Index controllers.

RSI (repetitive strain injury) is something to consider when designing UI for VR. I find that sculpting in VR can be tiring; holding the arms stretched out takes a toll fast.

The benefit of sculpting in VR is very high; I am five times faster in Medium than in Blender. (Not exaggerating.)


Added subscriber: @JacobMerrill-1


We have OpenXR included, and there is already an API we can use to create a context to call bpy operators / bmesh / etc. using VR controllers.

The VR add-on needs to be extended to make sculpting and modeling easier, for sure.
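
For the record, a minimal sketch of that existing surface (Blender 3.x with the VR Scene Inspection add-on enabled; attribute names are from the bpy docs as I recall them, so worth double-checking against your version):

```python
import bpy

def start_vr_and_report():
    # Toggles the VR session on (or off, if one is running).
    bpy.ops.wm.xr_session_toggle()
    # The session starts asynchronously, so the state may still be
    # None immediately after toggling.
    state = bpy.context.window_manager.xr_session_state
    if state is not None:
        print("viewer at", state.viewer_pose_location)
```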


Added subscriber: @CrivenS


I just joined solely because I really want to be able to sculpt directly in Blender in VR. Adobe Medium is nice and all, but Blender is where it's at. My background is in the arts and I'm attempting, from scratch, to figure out what is required to add this functionality; any assistance I can offer by way of testing, designing UI, etc. is freely given.

On a different but related note: Open Brush is also open source and ready-made; is there some way it could be rolled in? https://openbrush.app/

Philipp Oeser removed the Interest: EEVEE & Viewport label 2023-02-09 15:15:29 +01:00