Blender HMD Support #47899

Closed
opened 2016-03-23 16:31:49 +01:00 by Julian Eisel · 31 comments
Member

Blender HMD Support

For virtual reality support, it's crucial that Blender has tools to work with a variety of HMD (head-mounted display) devices. At the bare minimum, it should be able to handle input from them and render a viewport adjusted for HMDs.
If we want to work towards more feature-rich support, we need a solid base system, meaning good initial interface and code design decisions are needed.

HMD Drivers

While there are free drivers available for all common HMDs, almost none are cross-platform and compatible with Blender's GPL license.
Currently the only option that fits those needs appears to be the OpenHMD (http://openhmd.net/) driver/library. It's still a young project (features such as position tracking are still missing, though being worked on), but it's heading in a promising direction.
The point has been raised, though, that it should be possible to use other drivers as well and let the user decide which one to use.

UI/Workflow

The UI/workflow should work well together with multiview/stereo 3D, since they are greatly overlapping features. HMDs require some special features though, which also need a clean implementation:

  • Most HMDs work just as an additional screen, so Blender should allow opening a separate window and rendering the HMD-driven viewport into it.
  • The ability to open an extra window for an HMD viewport might be an interesting feature to adapt for multiview/stereo 3D too.
Author
Member

Changed status to: 'Open'

Author
Member

Added subscribers: @JulianEisel, @TheOnlyJoey, @dfelinto

Member

Added subscriber: @LazyDodo

Member

While there are free drivers available for all common HMDs, almost none are cross-platform and compatible with Blender's GPL license.

While I don't want to get into a licensing debate here, OpenVR supports Linux/Mac/Windows and ships under a BSD license. That said, it's not open as in open source; it ships with a header and binary libs, but then again, that has never bothered us for other libraries (CUDA). I'm not advocating focusing all our efforts on OpenVR and dropping OpenHMD completely though; a flexible backend that allows relatively easy support of new and upcoming HMDs would be preferable. Whether that support comes from OpenHMD or the Blender side of things: 'meh', as long as it's there.

"Most HMDs work just as an additional screen"

This is no longer true; both the Rift and Vive have switched to 'direct' mode to allow rendering to the front buffer and keep the OS's compositor from claiming the display (which caused an extra frame of latency and in some cases locked the HMD to the refresh rate of the main display on the system, among other problems), and 'extended mode' (where it appears as just another monitor) is no longer available.

I tried adding an OpenVR backend to OpenHMD over the weekend; most of the information OpenHMD wanted I couldn't give it (separate IPD, positional and rotational parameters; OpenVR, for instance, gives you a handy transformation matrix for the left and right eye). The rendering to the window did not fit this pipeline at all either, so in the end I just gave up on it.

Member

On some further googling, https://github.com/OSVR at first sight seems like a viable, more complete alternative to OpenHMD?

Member

I want to start by adding that I am not opposed per se to using other external libraries, but I do have my reservations and reasons to strongly recommend not using them. This post will mainly consist of reasons not to use X, and why.

In #47899#365690, @LazyDodo wrote:

While there are free drivers available for all common HMDs, almost none are cross-platform and compatible with Blender's GPL license.

While I don't want to get into a licensing debate here, OpenVR supports Linux/Mac/Windows and ships under a BSD license. That said, it's not open as in open source; it ships with a header and binary libs, but then again, that has never bothered us for other libraries (CUDA).

That is a non-issue with OpenHMD, since it does not require binaries at all.
Also, OpenVR is mainly implemented as a framework for the Vive, and implementations for other devices are poorly managed and a non-priority. Valve is also notorious for dropping these kinds of initiatives (like VOGL).

"Most HMDs work just as an additional screen"

This is no longer true; both the Rift and Vive have switched to 'direct' mode to allow rendering to the front buffer and keep the OS's compositor from claiming the display (which caused an extra frame of latency and in some cases locked the HMD to the refresh rate of the main display on the system, among other problems), and 'extended mode' (where it appears as just another monitor) is no longer available.

This is only partly true: direct mode is still iffy and currently only supported on Windows.
Also, extended mode has not been removed for these devices; Oculus removed it from their official SDK, and the Vive still has it as an option. Extended mode can still be enabled using OpenHMD (currently in a branch, will be mainlined soon) on the Oculus, and OpenHMD will continue to support this as a compatibility mode.

Also, your reply covers 'the Rift and Vive', which is not 'most of the HMDs' but a smaller percentage.
Currently we are looking at about 20 HMDs releasing this year or next; of course the Rift and Vive are the current big ones in the media, but they are not the entire industry. Most of these don't have access to direct mode yet and still use extended mode.

I tried adding an OpenVR backend to OpenHMD over the weekend; most of the information OpenHMD wanted I couldn't give it (separate IPD, positional and rotational parameters; OpenVR, for instance, gives you a handy transformation matrix for the left and right eye). The rendering to the window did not fit this pipeline at all either, so in the end I just gave up on it.

OpenHMD also has features like that (4x4 projection and model matrices, etc.), though we are not currently using them in the Blender implementation. We are fixing up the current version for mainlining and will then go through everything and reimplement certain parts to arrive at a nicer implementation. Keep in mind everything was built on a strict deadline and was hooked into the Multiview code, and needs re-evaluation after finishing the first version.
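
For illustration, a minimal sketch of how those per-eye matrices can be queried through OpenHMD's C API (based on the public openhmd.h; this is standalone example code, not the actual Blender integration):

```c
/* Minimal OpenHMD sketch (not the Blender code): probe the first device and
 * read the per-eye 4x4 GL matrices it exposes. Error handling trimmed. */
#include <openhmd.h>
#include <stdio.h>

int main(void)
{
	ohmd_context *ctx = ohmd_ctx_create();
	if (ohmd_ctx_probe(ctx) < 1) {
		fprintf(stderr, "no HMD found: %s\n", ohmd_ctx_get_error(ctx));
		return 1;
	}
	ohmd_device *hmd = ohmd_list_open_device(ctx, 0);

	/* near/far planes used to build the projection matrices */
	float znear = 0.1f, zfar = 100.0f;
	ohmd_device_setf(hmd, OHMD_PROJECTION_ZNEAR, &znear);
	ohmd_device_setf(hmd, OHMD_PROJECTION_ZFAR, &zfar);

	float proj_left[16], proj_right[16], view_left[16], view_right[16];
	ohmd_ctx_update(ctx); /* poll tracking before reading values */
	ohmd_device_getf(hmd, OHMD_LEFT_EYE_GL_PROJECTION_MATRIX, proj_left);
	ohmd_device_getf(hmd, OHMD_RIGHT_EYE_GL_PROJECTION_MATRIX, proj_right);
	ohmd_device_getf(hmd, OHMD_LEFT_EYE_GL_MODELVIEW_MATRIX, view_left);
	ohmd_device_getf(hmd, OHMD_RIGHT_EYE_GL_MODELVIEW_MATRIX, view_right);

	/* ...hand the matrices to the viewport drawing code... */
	ohmd_ctx_destroy(ctx);
	return 0;
}
```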

In #47899#365700, @LazyDodo wrote:
On some further googling, https://github.com/OSVR at first sight seems like a viable, more complete alternative to OpenHMD?

OSVR is something that sounds good in theory, but it has a big pile of issues, some of which OpenVR also has.

They don't really do anything:
Both frameworks are just wrappers with a lot of (bloated) boilerplate code to abstract existing libraries; they don't solve a lot of the problems you have with versioning of libraries, support of the devices, OS support and licensing issues (see the next point).
The maintenance cost of using a library like OSVR is also high: it breaks constantly with the release of new versions, and mostly focuses on their own OSVR HMD, which is their preferred device. Also, compiling with support for HMD 1 can break HMD 2, etc.

They are not drivers:
Why is this an issue? Mostly licensing and operating system support.
OS support is the simple one: a lot of HMD developers focus on Windows, and only Windows, while OpenHMD provides support for FreeBSD, Linux (it even works on ARM), Mac OS X, Windows, Android and soon iOS.
Licensing is a bit more tricky. Some HMDs don't allow their SDK to be used alongside other HMDs or similar peripherals; some don't allow sharing of systems such as lens distortion shaders (regardless of whether they are provided by their SDK or externally). Back in 2013, when we first started with VR development, we had an IT lawyer look into these issues; it's full of pitfalls you don't want to get involved in, and it was a major reason for us to push an open source driver alternative.
Last year at GDC Europe there was a session purely about VR support in games and how not to screw yourself over.

My reason for preferring OpenHMD is that it solves most issues, has a good and constantly active development base, does not cost a lot to maintain (other than occasionally updating the source, which I would gladly do myself), works on every platform, and provides support for a wide array of HMDs (a lot will be mainlined in the next 1-2 months).

Joey Ferwerda self-assigned this 2016-03-23 21:47:41 +01:00
Member

Your whole case is 'I want OpenHMD, the driver', which is fine; it's your code, you can make it do whatever your heart desires. You make a great case for why OpenHMD should exist (multi-platform, licensing, etc.), but none for why Blender should not support vendors' native SDKs or a wrapper lib like OSVR when available (and with a compatible license). I for one feel that the vendors of HMDs are better at making drivers for their hardware than you are, and would rather leverage them than rely on your reverse-engineering skills. Sure, they are not available on all platforms and not as open as you would like, but in the end they do give the best end-user experience.

My reason for preferring OpenHMD is that it solves most issues, has a good and constantly active development base

But does it? It has 109 commits since 2013 and lacks basic features like positional tracking for the DK2 (which shipped in July 2014). Sure, you might have some things going on behind the scenes (like support for 20 up-and-coming headsets), but it sure doesn't "look" active, well supported, or complete for that matter.

From where I'm sitting, I see a young, immature, incomplete library trying to push itself upon us in a rather aggressive way, with the only justification being 'it'll be awesome in the future, I promise!' while talking loud smack about everything else out there. We don't accept this kind of behavior from any other library, and while I'm willing to give you the benefit of the doubt (maybe it will be awesome, maybe it won't), I am extremely uncomfortable betting on it.

Once more, I'm not saying drop OpenHMD and stop working on it; I'm saying allow for other libraries to be used in Blender, whether that be OSVR, OpenVR or vendor XYZ's specific SDK. If we design the interfaces on the Blender side well enough, we should be 'future-proof' enough for whatever fantastic contraption comes out next week/month/year.
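
To make the "design the interfaces on the Blender side" point a bit more concrete, here is a rough, purely hypothetical sketch of what a backend-neutral HMD interface could look like (none of these names exist in Blender; it only illustrates the shape of the abstraction):

```c
/* Hypothetical sketch of a backend-neutral HMD interface on the Blender side.
 * Each backend (OpenHMD, OSVR, a vendor SDK, ...) fills in these callbacks;
 * the viewport/window code only ever talks to this struct. */
#include <stdbool.h>

typedef struct HMDBackend {
	const char *name; /* e.g. "OpenHMD", "OSVR" */

	bool (*init)(void **driver_data);  /* open the device, false on failure */
	void (*exit)(void *driver_data);
	void (*poll)(void *driver_data);   /* update tracking state once per frame */

	/* per-eye matrices for drawing the viewport (eye: 0 = left, 1 = right) */
	void (*projection_matrix)(void *driver_data, int eye, float r_mat[4][4]);
	void (*view_matrix)(void *driver_data, int eye, float r_mat[4][4]);

	/* hand the rendered frame back to the device (texture or window blit) */
	void (*submit_frame)(void *driver_data, unsigned int gl_texture);
} HMDBackend;

/* A user preference could then simply select which registered backend to use. */
```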

Member

In #47899#365792, @LazyDodo wrote:
Your whole case is 'I want OpenHMD, the driver', which is fine; it's your code, you can make it do whatever your heart desires. You make a great case for why OpenHMD should exist (multi-platform, licensing, etc.), but none for why Blender should not support vendors' native SDKs or a wrapper lib like OSVR when available (and with a compatible license). I for one feel that the vendors of HMDs are better at making drivers for their hardware than you are, and would rather leverage them than rely on your reverse-engineering skills. Sure, they are not available on all platforms and not as open as you would like, but in the end they do give the best end-user experience.

You clearly have not read my post then, especially the part regarding license issues.
That OSVR as a library has a good license does not mean it is free from license issues: they do support using conflicting backends (licensing-wise) in conjunction, which can cause problems for us in the end, hence my staying away from it (see my part about it not being a driver). Again, I suggest reading my post more carefully.

I am open to implementing the system in a way where other drivers or SDKs could hook in, but I would strongly recommend not shipping any with Blender in any shape or form, to avoid the previously mentioned issues.
I also recommend doing this at a later stage, when the current implementation is finished, to keep the focus for now on getting this done (since we are close to finishing the current implementation).

Edit:
A lot of development is done on a wider scale than just supporting the DK2 camera. Positional tracking has been worked on for a while now: a generic system with CV backends is being developed to support all kinds of cameras (including third-party ones), so that positional tracking can be implemented not only for the HMDs that provide a positional implementation, and so that higher-quality cameras than the first-party ones can be used.

Member

I am open to implementing the system in a way where other drivers or SDKs could hook in,

That's all I'm asking for.

but I would strongly recommend not shipping any with Blender in any shape or form, to avoid the previously mentioned issues.

It's a little too early to decide what we'll ship with; if a vendor has an acceptable license (the Vive ships with a BSD license on their SDK), I see no reason not to ship with support for it.

I also recommend doing this at a later stage, when the current implementation is finished, to keep the focus for now on getting this done (since we are close to finishing the current implementation).

I'd rather we integrate some third-party SDK sooner rather than later, just to rule out any 'oh, we didn't think of that' scenarios (like the HMD not behaving as a second monitor you can just drag a window to). Like I said, I had a stab at integrating OpenVR over the weekend and failed miserably due to the programming models not mapping well (or at all). (Then again, that could have been due to the OpenHMD version that's public right now being awfully outdated.)


Added subscriber: @VRguy


Hello all,

I work for Sensics and we co-founded the OSVR project, built the software platforms, and continue to do lots of work - in collaboration with many other contributors - on OSVR. As such, I thought I would chime in on the discussion. There is also a suggestion at the end of this post.

  1. License. OSVR software is licensed under Apache 2.0. It is free and open source.

  2. OSVR supports many, many devices. You can see the full list here: http://osvr.github.io/compatibility/. This includes multiple HMDs (Vuzix, Oculus, Sensics, OSVR HDK and many others), orientation and position trackers, eye trackers, cameras, and devices like the Leap Motion, the Nod Ring and the Myo, as well as many others.

  3. The idea behind OSVR is to perform device abstraction for several types of devices (display, tracker, eye tracker, gesture, skeleton, imager, etc.) so that the application can have a common API regardless of the underlying hardware. For instance, you can have a Unity executable that uses OSVR and then change the target HMD in runtime using a configuration utility.

  4. OSVR runs on Windows, Android, Linux and OS X.

  5. OSVR provides tools for high-performance VR experiences. For instance, it includes direct mode (OpenGL/DirectX) for both AMD and NVIDIA cards, time warping, distortion correction, predictive tracking and more.

  6. We use a client-server model. The server can be placed on the same machine as the client(s) or it can be separate. You can even have different operating systems running each. For instance, we had a user run an OSVR client on a Gear VR and then connect to an external server for a wide-area tracking system.

  7. We allow both open-source and closed-source plugins that are dynamically loaded. This allows vendors like Leap Motion to connect to OSVR without exposing their IP. In some cases - like Oculus - we can either connect to their runtime or use a direct (non-Oculus) driver to obtain most functionality. Not all the plugins are device plugins - for instance, we have a new plugin for the open-source ARToolKit project to provide augmented-reality target recognition.

  8. OSVR provides various language bindings, including a stable C API. We also have initial Python support. A group of students at UNC also created an initial OSVR plugin for Blender. It is, of course, open-sourced. You can find it and many other projects at http://osvr.github.io/

  9. Contrary to popular belief, OSVR software is not focused on supporting the OSVR HDK (HMD). We support many other HMDs, including HMD plugins that we cannot mention yet.

  10. OSVR has now been connected to multiple engines: Unity, Unreal (native support coming), MonoGame, CryEngine, WebVR, SteamVR and more. This ensures not only an active user base but also demonstrates that the integration is entirely doable.

  11. Besides community involvement, we have a full-time team that is dedicated to driving the platform and supporting the user base.

  12. OSVR has more than 300 partners today including Intel, Jaunt VR, Leap Motion, Vuzix and many others.

I could go on... but perhaps the best thing is to organize a Web meeting where we can answer any questions regarding OSVR towards helping the Blender Foundation and community make an informed decision?


Hi Yuval. I attended your presentation at IEEE VR last year, and chatted with you and your colleague about OSVR Python bindings (I was presenting a poster on BlenderVR, in case you recall it). I'm glad you are following this topic.

I think OSVR seems quite a match for Blender's requirements. I would like to hear from you on the following, though.

License: I know OSVR is Apache, but what about its plugins? Can the user install the plugins separately from the main code-base? Are they easy to install, or do they need to be compiled with the "server/OSVR runtime"?

Oculus: Are you using the Oculus SDK underneath? (I guess not, since the Oculus license is not Apache.) Or are you using it, but the plugin licenses are not Apache and they are loaded in a way that doesn't break Apache (or GPL in our case)?

Direct Mode: For the operating systems that do not support DM, is the fallback provided by OSVR itself? Or does the hosting application need to create a new window on the proper display and then call OSVR for the drawing?

Forward/Backward Compatibility: Since this was mentioned earlier as a shortcoming of OSVR, care to comment on it?


Added subscriber: @godbyk


Added subscriber: @rpavlik


I'm Ryan - senior software engineer at Sensics heavily involved in the design and implementation of OSVR - I can answer some of the technical questions.

Regarding the structure:

OSVR, as you'd likely use it, has two general parts: OSVR-Core, and OSVR-RenderManager. Core is the "main" part, handling input/interaction devices and configuration, providing for network transparency, etc. RenderManager originated primarily as the API for accessing direct-mode functionality, but has now been extended as the primary recommended method for interacting with displays in OSVR (getting projection information, presenting frames, etc), as it provides time warp, predistortion, etc. (independent of whether or not you are using vendor specific direct-mode) without each application having to, for instance, re-implement every predistortion technique.

OSVR-Core is cleanly split into two halves: a server that loads plugins that provide generic interfaces and metadata, and a client library that connects to (one or more) server using those generic interfaces. Your application code, as far as OSVR-Core is concerned, would only link to the OSVR-ClientKit library, an Apache-2 licensed library, and use device-independent interfaces and semantic paths to access resources/devices from the server. That library has a very intentionally-designed, stable C API (with optional C++ header-only wrappers, which you may or may not use, depending on what level you'd want to integrate at - as mentioned earlier, the C API is designed for easy FFI use including from Python): I don't believe we've actually broken that API or ABI at all in the history of the project yet, despite having a version in 0.x. (I, and others on the project, take API, ABI, and protocol stability very seriously. We're nearing 1.0, so there will be a few minor changes, probably minimal to none on the client API, but then even more intense care of the API, ABI, and protocol.)

As a client application, your concern about OSVR-Core stops there: you wouldn't have to worry about the server, device support, plugins, nor would you have to ship any of those. Those are for the user to obtain and install. The server, as part of OSVR-Core, is also licensed Apache 2.0, and many plugins are as well. This includes a few that link to non-open-source libraries - Apache 2.0 is not a reciprocal license. (There are plugins to use Oculus devices using their SDK, and you can also use Oculus DK1/DK2 devices using the built-in VRPN interoperability of OSVR and the reverse-engineered drivers for those devices in VRPN. There is also a plugin to use another high-profile consumer VR device that loads some of the vendor's provided driver at runtime.) Plugins have their own carefully-designed stable C API with C++ header-only wrappers (PluginKit) to build against, so while we do often re-build the out-of-tree plugins we maintain in our CI when we build a new Core version, that's not necessary. (Plugins can be, and are, maintained in the OSVR-Core source tree or outside of it.) A design goal when building this part of OSVR was that even if you only had a binary plugin from a vendor, your hardware/plugin shouldn't stop being useful if your vendor disappeared, if we could help it. Plugins are literally a single shared object (dll/so) file dropped into the correct directory. Some come with a sample config file; many autoconfigure. Some that relate to displays may come with an additional JSON data file containing display information. They're designed to be self-contained, and we're working on the Windows user experience for installing them right now. (They're suitable for separate packages on Linux systems and currently on Mac they'd be separate formulas on Homebrew.) The ecosystem is open to both open-source and binary plugins, and as a part of the VRPN ecosystem that has been the de-facto VR peripheral standard for some 15+ years (some commercial VR hardware comes with their native API being a built-in VRPN server) it's also open to multiple/distributed servers. I am not a lawyer, but this client/server arrangement with standardized generic interfaces used by the client keeps the client linking only to permissively licensed code, and the same arrangement has been used by open-source projects using VRPN for years.

I know that was a bit more prose than bullet-points, but hopefully it mostly answered the License, Oculus, and Compatibility questions - I'm not sure where forward/backward compatibility was mentioned specifically as a shortcoming of OSVR. I do see a rather vague reference above to some VR libraries as a group "constantly breaking"; I think I've addressed that point fairly well, but I'll summarize the history of breaking changes off the top of my head in the history of the OSVR-Core codebase: in the history of the project (publicly launched Jan 2015 at CES), I can't recall an API/ABI break for ClientKit (there may have been one very very early, before the source was public), there have been a handful of small API/ABI breaks for PluginKit done in conjunction with the hardware vendors and plugins that were using those interfaces, and there has been I believe 1 protocol break, at version 0.2 for the full introduction of the path tree throughout the stack (which, because of ABI compatibility, just meant you had to swap out dlls to make sure your server and clients were all on the same side of the 0.2 dividing line). An upcoming 1.0 will be our "last chance" for protocol breaks for a long time, and we'll hopefully get to fix a few plugin APIs while we're at it, but it should be pretty seamless on the client side. (And, we use VRPN internally and are compatible with devices supported by it, whose major version 07 highlights that it has broken its network protocol only 7 times over its extensive history...) Like I said, we're serious about this. I hate it when stuff breaks.

Regarding OSVR RenderManager: The current structure of this functionality, which builds on OSVR-Core (using a display as configured by an OSVR server, a tracked head, etc.) to provide a way to render to a VR device, automatically applying time-warp, any pre-distortion, and if applicable and required, direct mode, is as a single library. The main repository is Apache 2.0 licensed and provides for non-direct-mode rendering (but all the other features) in D3D and OpenGL on Windows, OpenGL on Linux and Mac, and GLES on Android. (So yes, the fallback is performed automatically by OSVR based on config files, etc. - if you start up OSVR ClientKit and RenderManager successfully, you'll get something to render into: it may or may not be direct mode.) The APIs needed to perform direct rendering on Windows are subject to NDA, and the (necessarily closed-source) code that uses them is in submodules in the RenderManager repository. RenderManager builds fine with or without those submodules initialized - it just affects what functionality it provides. Currently, the NDA direct-mode code is either built into the library or isn't: there isn't a client-server compositor structure yet paralleling that of OSVR-Core that would insulate GPL code from closed-source direct-mode code. Thus, my understanding is that at this time, you could build against the open source RenderManager library and codebase, but not distribute an application capable of direct mode. We distribute pre-compiled versions of the RenderManager DLL with the NDA features included, so it might be permissible for a user to replace the RenderManager DLL with the direct-mode capable one but not redistribute the combination, but naturally that's not ideal and we recognize that. (The RenderManager has a C API that, while not as stable as OSVR-Core's, is becoming more stable, and has only broken ABI once that I can remember since it started being used more widely - an unused field was removed from a struct, so the API technically broke but no applications were affected beyond a recompile. We will similarly be committing to stabilizing and maintaining a stable API/ABI for RenderManager.)

However, that is a short-term problem, as we plan to move to a "compositor" style architecture for RenderManager with a separate compositor/server (running in a separate process on desktop platforms) and a client library accessing generic interfaces, just as in OSVR Core. The C API has intentionally been made minimal, so hopefully we won't have to change much there to make that happen, either. The distribution scenario in that case would then match OSVR-Core: A user would obtain the OSVR server, plugins, and the RenderManager compositor independently (from a vendor, from us, from their Linux distro or Homebrew, built themselves). The compositor may or may not include closed-source components as contractually (NDA) required, but a fully-featured open-source version would always be available, and would expose a generic interface for compositor clients. (We'd basically be splitting RenderManager in two, and adding additional functionality with layers and shared contexts.)

One quick note regarding "OpenVR" since I saw mention of someone trying it when scrolling through this issue: While the files in the repository from Valve are MIT licensed, their client/application API only contains header files and pre-built binaries of a client library. There are no sources for the openvr_api.{dll/so} files (even though architecturally, they appear to be primarily IPC clients for the main logic of the SteamVR middleware). As such, (IANAL again, but) I don't think GPL applications can actually use so-called "OpenVR" without adding a special exception to their usage of the GPL license since the library they're linking to is effectively proprietary, just with a permissively-licensed header file. (I personally tend to refer to that software and API as SteamVR, to make it clear it's not functional without Steam, to avoid leading people on about the freedoms they actually have with it since it appears to be GPL-incompatible, and to avoid confusion with OSVR.)
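
To give a feel for how thin the client side described above is, here is a minimal sketch, assuming the standard ClientKit C headers and the conventional /me/head path (example code only, not a Blender integration), that polls the head pose from a running OSVR server:

```c
/* Minimal OSVR-ClientKit sketch: connect to a running osvr_server and poll
 * the head pose through the generic "/me/head" path. Example code only. */
#include <osvr/ClientKit/ContextC.h>
#include <osvr/ClientKit/InterfaceC.h>
#include <osvr/ClientKit/InterfaceStateC.h>
#include <stdio.h>

int main(void)
{
	OSVR_ClientContext ctx = osvrClientInit("org.example.hmdtest", 0);
	OSVR_ClientInterface head = NULL;
	osvrClientGetInterface(ctx, "/me/head", &head);

	for (int i = 0; i < 600; i++) {
		osvrClientUpdate(ctx); /* pump the client: receive reports from the server */

		OSVR_TimeValue timestamp;
		OSVR_PoseState pose;
		if (osvrGetPoseState(head, &timestamp, &pose) == OSVR_RETURN_SUCCESS) {
			printf("head position: %f %f %f\n",
			       pose.translation.data[0],
			       pose.translation.data[1],
			       pose.translation.data[2]);
		}
	}

	osvrClientShutdown(ctx);
	return 0;
}
```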


Added subscriber: @NicholasBenge


Added subscriber: @david.rysk


Added subscriber: @JoelGerlach


Added subscriber: @EjnerFergo


Added subscriber: @ppicazo


Added subscriber: @hoxolotl


Added subscriber: @jherico


I've approached this problem in my own work on an open source project (https://github.com/highfidelity/hifi) and my solution was to create an abstraction for output devices and add support for dynamically loading plugins that implement the abstraction for specific devices. Our core codebase is Apache licensed, not GPL, so there are likely constraints you would have that we don't, but it seems like it might be a viable path forward. If you made the plugin specification itself available under a more permissive license than GPL (say LGPL or Apache), people could build plugins regardless of the underlying HMD SDK license, while core plugins (for basic 2D display) could remain under GPL.

Because of this approach I tend to get the best parts of every SDK, and in some cases I'm able to simulate support for features an SDK doesn't provide (such as enabling timewarp for OpenVR based devices) and the ability to continue to support legacy platforms (such as supporting the Rift DK2 and DK1 on Mac and Linux).

Ultimately the interface between the application and the final output device can be very thin, since all you really need to hand the output device is a texture containing the rendered content (possibly a texture with a stereo rendering) and information about the head pose used to render that image (only applicable for HMD devices, of course).
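
A hypothetical sketch of that thin handoff (names invented purely for illustration): the application renders into a texture and passes it, together with the pose it was rendered from, to whichever display plugin is loaded:

```c
/* Hypothetical frame-submission handoff between an application and a display
 * plugin: one stereo texture plus the head pose the frame was rendered with. */
#include <stdint.h>

typedef struct HMDFramePose {
	float orientation[4]; /* quaternion (x, y, z, w) used when rendering */
	float position[3];    /* head position in meters */
} HMDFramePose;

typedef struct HMDFrame {
	uint32_t gl_texture;      /* e.g. side-by-side stereo color texture */
	uint32_t width, height;   /* texture size in pixels */
	HMDFramePose render_pose; /* pose the frame was rendered from */
} HMDFrame;

/* Implemented by the loaded display plugin (vendor SDK, OSVR, OpenHMD, ...). */
typedef void (*hmd_submit_frame_fn)(void *plugin_data, const HMDFrame *frame);
```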


Added subscriber: @Ozeki


@jherico - OSVR (including RenderManager) is also Apache 2.0 licensed. At least from your description, it sounds like you're re-treading some of the same ground we've already covered with OSVR: for instance, we can have timewarp on all devices, whether or not they have direct mode or what the plugin's underlying SDK is built on, we've got non-Oculus Runtime-based Rift support (raw USB HID) that's cross-platform, etc.

I think you perhaps more clearly and concisely summarized some of the benefits of using OSVR in Blender than I did - I got caught up answering questions. The client library that the application interacts with (primarily OSVR-ClientKit) is fairly thin and operates only on generic interfaces/abstractions. The server provides concrete implementations of those abstractions with dynamically-loadable plugins that (due to the network protocol separation and generic interface) may have any license that can link in Apache 2.0, keeping potentially proprietary HMD SDK licenses or binary blobs (in the case of some commercial consumer HMDs) properly air-gapped away from Blender. Adopting OSVR, then, not only gets you substantial device support right away, it also essentially saves you the work of having to write your own abstraction, since it was designed to be a universal VR abstraction layer, based on over a decade of work in VR I/O abstractions.

(You can certainly write a driver plugin for OSVR directly; that API is designed to make such development quick and easy. See for instance our video-based positional tracking plugin - the whole algorithm is built into an OSVR plugin that ships with the core system. However, you can also write a plugin that wraps another SDK to use as an "adapter" and we have these as well: there's an Oculus Runtime-based one, and there's one that loads the SteamVR Lighthouse driver directly (not through the SteamVR application API, though that would be another approach that would work) to operate the Vive.)


Added subscriber: @LeeP


Shouldn't there be in development a type of VR room, so that someone can import and view only their own work first?
I think that would bring in more people interested in the development of this project.


Added subscriber: @StirlingGoetz


Added subscriber: @Olm-e-9


Added subscriber: @esummers

Author
Member

Closed as duplicate of #68998
