
Viewport HMD integration using OpenHMD
Needs Revision · Public

Authored by Julian Eisel (Severin) on Aug 2 2016, 1:10 AM.

Details

Summary

Viewport HMD integration using OpenHMD

This adds basic support for controlling the viewport view using a head mounted display (HMD).

To use it, go to the Properties editor, Render Layers context, enable Views and select "HMD View". You can then open a new HMD window from there and start an HMD session.

TODOs:

  • Move HMD options to a better place; they are not Render Layer related (maybe the properties region?). This also involves separating HMD usage from RenderData.views_format.
  • Zoom and pan don't work in the HMD window while in camera view
  • View is stretched vertically in the HMD window while not in camera view
  • TODOs/XXXs marked in code (esp. WM_ calls in BKE_)
  • Apply D1350 for better mouse interaction while in HMD view

Patch by @Joey Ferwerda (TheOnlyJoey), @Koen Mertens (Galadus) and myself.

Diff Detail

Repository
rB Blender
Branch
HMD_viewport
Build Status
Buildable 73
Build 73: arc lint + arc unit

Event Timeline

Julian Eisel (Severin) retitled this revision to "Viewport HMD integration using OpenHMD". Aug 2 2016, 1:10 AM
Julian Eisel (Severin) updated this object.
Julian Eisel (Severin) updated this revision to Diff 7165.
LazyDodo (LazyDodo) added inline comments.
extern/openhmd/src/platform-win32.c (line 51)

This won't build with MSVC 2015; it needs to be:

DWORD __stdcall ohmd_thread_wrapper(void* t)
extern/openhmd/src/openhmd.c (line 200)

device->close above frees the device pointer; you can't dereference it here anymore.
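
For illustration, a minimal, self-contained C sketch of the pattern being flagged (not the actual openhmd.c code): anything still needed from the device has to be copied out before close() frees it.

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct Device {
        int id;
        void (*close)(struct Device *dev);
    } Device;

    static void device_close(Device *dev)
    {
        free(dev); /* after this, dev must not be dereferenced */
    }

    int main(void)
    {
        Device *dev = calloc(1, sizeof(*dev));
        dev->id = 42;
        dev->close = device_close;

        int id = dev->id;  /* copy whatever is still needed first */
        dev->close(dev);   /* frees dev */
        /* reading dev->id here would be a use-after-free; use the saved copy */
        printf("closed device %d\n", id);
        return 0;
    }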

extern/openhmd/src/drv_oculus_rift/packet.c (line 62)

READFLOAT is two statements; in an unbraced for loop only the first one is executed inside the loop, the second one executes after the loop. This needs braces.
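
A minimal C sketch of the pitfall (the real READFLOAT definition in OpenHMD may differ; a two-statement shape is assumed here): without braces only the first statement is part of the loop body, so the buffer pointer advances once instead of once per iteration. Braces at the call site, or a do { ... } while (0) wrapper in the macro, both fix it.

    #include <string.h>

    static float read_float(const unsigned char *p)
    {
        float f;
        memcpy(&f, p, sizeof(f));
        return f;
    }

    /* two statements: the second is NOT inside an unbraced loop body */
    #define READFLOAT_BAD(dst) (dst) = read_float(buf); buf += 4;

    /* single statement: safe in unbraced if/for bodies */
    #define READFLOAT_OK(dst) do { (dst) = read_float(buf); buf += 4; } while (0)

    void unpack_bad(const unsigned char *buf, float *out, int n)
    {
        for (int i = 0; i < n; i++)
            READFLOAT_BAD(out[i]);  /* bug: buf += 4 runs once, after the loop */
    }

    void unpack_ok(const unsigned char *buf, float *out, int n)
    {
        for (int i = 0; i < n; i++)
            READFLOAT_OK(out[i]);   /* ok: buf advances every iteration */
    }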

Some initial review. It's quite hard because the workflow is not really clear and, from a quick look, is achieved by some weird hacks and bad level calls.

extern/openhmd/src/omath.h (line 16)

Is that for MSVC? Did you try using #define _USE_MATH_DEFINES prior to including math.h?
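
For reference, a minimal sketch of that suggestion: on MSVC, M_PI and friends are only exposed by math.h when _USE_MATH_DEFINES is defined before the header is first included (the define has no effect if math.h was already pulled in earlier through another header).

    #define _USE_MATH_DEFINES  /* must come before the first #include <math.h> */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        printf("pi = %.17g\n", M_PI);
        return 0;
    }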

intern/cycles/blender/addon/ui.py (line 546)

Is there a way to keep Cycles unaware of HMD? It's a bit weird, because the way you visualize the scene in the viewport shouldn't really affect how you render it.

intern/ghost/intern/GHOST_C-api.cpp (line 1022)

If it's possible that someone calls this function with OpenHMD disabled, it's better to set the matrix to identity. Otherwise add an assert(!"Crap happened") so we catch possible issues early.
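
A hedged sketch of that fallback (the actual GHOST function name and signature at this line may differ, and WITH_OPENHMD is assumed to be the build flag): return the identity matrix when OpenHMD support is compiled out, plus a debug assert so misuse is caught early.

    #include <assert.h>
    #include <string.h>

    /* hypothetical name; mirrors the idea, not the real GHOST C API entry point */
    void GHOST_HMD_getModelviewMatrix_sketch(float r_mat[4][4])
    {
    #ifdef WITH_OPENHMD
        /* ... fill r_mat from the OpenHMD device ... */
        (void)r_mat;
    #else
        static const float identity[4][4] = {
            {1.0f, 0.0f, 0.0f, 0.0f},
            {0.0f, 1.0f, 0.0f, 0.0f},
            {0.0f, 0.0f, 1.0f, 0.0f},
            {0.0f, 0.0f, 0.0f, 1.0f},
        };
        memcpy(r_mat, identity, sizeof(identity));
        assert(!"GHOST HMD matrix requested with OpenHMD disabled");
    #endif
    }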

source/blender/blenkernel/CMakeLists.txt (line 43)

This is a bad level call and must be avoided.

source/blender/blenloader/intern/versioning_270.c (line 1256)

Why's that part of RenderData?

source/blender/makesdna/DNA_gpu_types.h (line 66)

So far flag names are all capitals, not camel case.

Notes after some testing in Linux with DK2:

  • No wireframe mode support (also, rendered shading mode should not be available).
  • Why let the user change "Lens Distortion"? For example, I have the DK2 selected, and yet I see this as an option.
  • Does the "HMD Window" need the shader running at all times? I would expect the window to be pitch black when the HMD session is not running.
  • The zoom is part of the perspective matrix, which is (should be) entirely calculated by the HMD. In other words, I think the perspective matrix is wrong (see fix P436).
  • We need documentation. It's not clear to me what is planned for the future, or what is simply missing. For example:
    • We need a mirror mode, to see the result of the HMD in the regular viewport
    • Timeline editor Playback menu to allow playback only on HMD view

Basically, VR is something big. We should communicate to users the extent of our support so they can adjust their expectations, as well as give a clear picture of where it fits in a workflow.

Notes for other testers

Set your display to be rotated 90 degrees counter-clockwise (left).

I had to run with sudo to get the rotations from DK2. In the console I had the warning: [EE] Could not open /dev/bus/usb/002/006.

Also, to test, check whether your HMD is detected in User Preferences > System:

Then go to the viewport and in the HMD View panel:

  1. Click on Open HMD Window
  2. Move the new window to your external monitor corresponding to the HMD
  3. Click on Start Session

A couple of the comments above are already addressed in a push I will be doing in a couple of days.
Thanks for the feedback btw!

Notes after some testing in Linux with DK2:

  • No wireframe mode support (also, rendered shading mode should not be available).

Wireframe works on the 2 systems I test on; I am running a couple of commits ahead though, please retest after the commit.
For rendered shading mode I am working on a realtime reprojection method for rendered frames; if I cannot get it working in time I will remove the mode.

  • Why let the user change "Lens Distortion"? For example, I have the DK2 selected, and yet I see this as an option.

So I have added a shader for the DK1 as well (next push), and the DK2 shader works pretty well as a 'generic shader' for a lot of lens types, which gives a 'good enough' result (tested with PSVR and CV1).
The option to disable the lens distortion is also important, since there are HMDs with already corrected lenses (like the Deepoon E2).

  • Does the "HMD Window" needs the shader running at all times? I would expect the window to be pitch black when HMD session is not running.

Interesting take; I would like @Julian Eisel (Severin)'s perspective on this as well. I think keeping things black is bad in general (since to me it implies something is not working/rendering). AFAIK the window does not render new frames until there is activity, so it should not take a lot of cycles/frames.

  • The zoom is part of the perspective matrix, which is (should be) entirely calculated by the HMD. In other words, I think the perspective matrix is wrong (see fix P436).

I made a couple of fixes to the projection matrix due to some regressions, which will be pushed in the next patch.
The zooming etc. was requested by a bunch of testers I have; they expect all Blender camera functionality to also work in HMD mode. I think this is good behaviour since it just adjusts some values instead of replacing the entire matrices.

  • We need documentation. It's not clear to me what is planned for the future, or what is simply missing. For example:

Been working on this for quite some time.
There is a development plan I am working on which I will be putting online soon. I will start doing code blogs again soon (restarting my personal blog), which can be used for kicking around ideas and logging what is up.

User documentation will come in February, when demo videos will be shot and all functionality and restrictions will be laid out.

  • We need a mirror mode, to see the result of the HMD in the regular viewport

Will put it on the list. It should be optional IMHO, since being able to work in other perspectives while putting the HMD on/off seems to be the most usable case at the moment.

  • Timeline editor Playback menu to allow playback only on HMD view

@Julian Eisel (Severin) can you give your opinion about this please?

Basically, VR is something big. We should communicate to users the extent of our support so they can adjust their expectations, as well as give a clear picture of where it fits in a workflow.

Notes for other testers

Set your display to be rotated 90 degrees counter-clockwise (left).

This is only the case for some HMDs like the DK2; please check this per device (most don't require it).

I had to run with sudo to get the rotations from DK2. In the console I had the warning: [EE] Could not open /dev/bus/usb/002/006.

This is due to Linux requiring udev rules to allow users to access certain USB devices.
The documentation will have a list of how to add these, and we are looking at methods to provide packages for easy installation.
We are also looking for smart methods to request udev rules at runtime; if anyone has an idea about this, please pitch in.
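
For what it's worth, a typical udev rule for this looks like the snippet below. Treat the values as an example only: the vendor ID has to match your headset (lsusb shows it; 2833 is the commonly listed Oculus USB vendor ID).

    # /etc/udev/rules.d/83-hmd.rules  (example only; adjust idVendor to your device)
    SUBSYSTEM=="usb", ATTR{idVendor}=="2833", MODE="0666", GROUP="plugdev"

After adding the file, reload the rules with "sudo udevadm control --reload-rules" and re-plug the device.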

This is no issue on Windows and Mac OSX.
Not tested with FreeBSD myself.

Hi Joey, thanks for the quick reply. Some general thoughts followed by specific replies:

  1. Why hold back on the fixes? Are those OpenHMD fixes? Blender fixes? This patch was sent for consideration for inclusion in master months ago, and yet it still lacks basic documentation as well as proper support for some fundamental behaviour like a correct projection matrix.
  2. I'm missing a definition of the audience of this feature. For instance, are we to support someone producing for Youtube + Cardboard? Someone making games to be experienced in VR? People that are not using VR for the final product, but want to "work in VR" to get a better sense of scale or whatnot? Hobbyists first? Professionals first? ...
  3. Are you rendering offscreen or simply applying a screen shader? The HMDs I worked with (Oculus, Vive) have a resolution they expect you to use for the offscreen rendering, so that the final warped buffer doesn't have plenty of stretched pixels and looks great.
  4. If the HMD Device is set to None, the HMD View panel should not be there (and "Lens Distortion" should be grayed out).
  5. Was there any investigation of ways to create the window on the external monitor (a la Direct Mode) automatically?

The option to disable the lens distortion is also important, since there are HMDs with already corrected lenses (like the Deepoon E2).

Can't OpenHMD handle that? To add this burden to the user seems counterproductive/bad design.

[re: hmd window when not running session] I think keeping things black is bad in general

Black, with text telling the user to drag it to the external monitor. I'm sure @Julian Eisel (Severin) can do that in no time. Anyway, my main concern here is performance: a non-running window should not affect the performance of the rest of Blender.

The zooming etc. was requested by a bunch of testers I have; they expect all Blender camera functionality to also work in HMD mode (...).

Do they? Could you provide a clearer use case?

Because, as far as I know, to get proper zoom in VR we also need to change the modelview matrix (for eye separation); otherwise it's just a weird effect. And on top of that, there is no relation between the lens zoom and VR zoom: VR zoom would scale the entire scene. That can be an option (in my addon I use scene units for the zoom, I think), and a helpful one. I just don't see it tied to the existing viewport/camera lens parameters.

[re: mirror] Will put it on the list. It should be optional IMHO, since being able to work in other perspectives while putting the HMD on/off seems to be the most usable case at the moment.

I meant optional, of course :)

Dalai Felinto (dfelinto) requested changes to this revision.
This revision now requires changes to proceed. Mar 6 2017, 10:22 AM

Bugs:

  • Camera Lens should not be used!!!
  • Multi-View is interfering with the HMD viewport
  • If I open a file while the HMD window is on, the new file opens in the "new window"

I had a chat about the integration with @Dalai Felinto (dfelinto), will outline the results in a separate comment. Just answering some previous points here.

  • No wireframe mode support (also, rendered shading mode should not be available).

You mean the shader is not used in wireframe mode? I'm aware of that, it's an easy fix and I'll for sure do it before pushing to master ;)

  • Why let the user change "Lens Distortion"? For example, I have the DK2 selected, and yet I see this as an option.

Since rB1d1ba540bba, the shader is generically generated based on the active device. The shader option was removed.

  • Does the "HMD Window" needs the shader running at all times? I would expect the window to be pitch black when HMD session is not running.

Didn't think this would be an issue, since I always assumed people would only open the HMD window; however, I realize this may actually not be the case. Disabled the shader until a session is started; we should also skip the viewport drawing until then, like you suggested. Will do that after merging to master.

  • The zoom is part of the perspective matrix, which is (should be) entirely calculated by the HMD. In other words, I think the perspective matrix is wrong (see fix P436).

We've discussed this a bit today, will outline in an extra comment.

  • We need documentation. It's not clear to me what is planned for the future, or what is simply missing.

Joey worked on coverage for the manual, I also asked him to work on some step by step guides for release notes.

  • We need a mirror mode, to see the result of the HMD in the regular viewport

Added that, rBb3c156927a33.

  • Timeline editor Playback menu to allow playback only on HMD view

Would indeed be handy, added to the todo.

  3. Are you rendering offscreen or simply applying a screen shader? The HMDs I worked with (Oculus, Vive) have a resolution they expect you to use for the offscreen rendering, so that the final warped buffer doesn't have plenty of stretched pixels and looks great.

We're not rendering offscreen, simply because we realized that's not needed. Triple buffer drawing already draws to a texture; I guess if OpenHMD eventually gets direct mode support we can send this texture for drawing in the HMD. The region we draw during the HMD session always takes up the entire screen and has the exact same dimensions as the screen/HMD. That means there shouldn't be any stretched pixels and everything should look great ;)

  4. If the HMD Device is set to None, the HMD View panel should not be there (and "Lens Distortion" should be grayed out).

I'm not so sure about removing the panel. IMHO it should still be possible to open the HMD view and start the session, so that you can just plug in a device and it's ready to go (we automatically activate a device if one gets plugged in and the user pref option hasn't been set before).

  5. Was there any investigation of ways to create the window on the external monitor (a la Direct Mode) automatically?

That would first have to be supported by OpenHMD, which is not the case right now.

Black, with text telling the user to drag it to the external monitor. I'm sure @Julian Eisel (Severin) can do that in no time. Anyway, my main concern here is performance: a non-running window should not affect the performance of the rest of Blender.

See my answer above re skipping shader and viewport drawing.

So here is an outline of what I've discussed with @Dalai Felinto (dfelinto). As for a possible merge into master: if the bugs mentioned in Dalai's last comment are fixed, a merge should be fine.

Incorrect camera zoom

The way we apply camera zooming while running an HMD session is wrong. We can't apply it on the projection matrix without messing things up. However, we do want to support zooming while in camera view, so we'll try to apply the zoom to the modelview matrix instead. (Zoom isn't the correct term when applying it to the modelview matrix; we actually change the position we're looking from.)
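
As a rough illustration of that idea (names and conventions here are mine, not Blender's actual API): keep the HMD-provided projection matrix untouched and express "zoom" as a dolly of the viewpoint along the view direction before the per-eye modelview matrices are built.

    typedef struct { float x, y, z; } vec3;

    /* move the eye along the (normalized) view direction; the usual per-eye
     * offset for stereo separation is applied afterwards as before */
    static vec3 hmd_zoom_eye_position(vec3 eye, vec3 view_dir, float zoom_delta)
    {
        vec3 r = {
            eye.x + view_dir.x * zoom_delta,
            eye.y + view_dir.y * zoom_delta,
            eye.z + view_dir.z * zoom_delta,
        };
        return r;
    }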

Improving editing from within HMD session

Users shouldn't have to keep the mouse cursor in the HMD window to be able to do edits there. It's quite annoying having to switch back and forth between the windows all the time.
What we could do is share an FBO between the HMD window and the 3D view it was created from. That means it would always be in mirrored mode, but almost without any slowdown, since we don't do an extra pass to draw the geometry for the mirrored display (the FBO can be re-rendered even with different dimensions).

Conflicts with proprietary HMD drivers (Windows only)

The proprietary HMD drivers block any way of requesting information from HMDs while they are running. OpenHMD works around this by shutting down the driver for as long as Blender runs.
@LazyDodo (LazyDodo) made the point that this is not acceptable since it would mean users can't keep Blender open while using their HMD for non-Blender stuff (assuming it's not through OpenHMD). While we agreed that this isn't acceptable, we also noticed OpenHMD can only do this if Blender is launched as administrator. This has to be done explicitly though and is nothing users usually do, so with this in mind it's an acceptable (but still ugly :/) thing. Luckily we can blame proprietary drivers here :P

Docs

We agreed that we need docs that contain a few things:

  • What or which target audience this integration is for - A: The integration is supposed to add support for previewing and editing of VR content from within Blender.
  • Reason why we push this implementation now even if there are known workflow quirks - A: We want to finally bring VR content editing to all platforms. Blender would likely be the first app to do this.
  • Supported devices and limitations
  • Step by step guides on how to make it work on various platforms.
  • [Optionally] Future plans
  • Docs on how to upgrade the OpenHMD version (we can't assume there always is a maintainer for it).

So as you can see we've decided to be honest and communicate the limitations of our integration well.

Conflicts with proprietary HMD drivers (Windows only)

The proprietary HMD drivers block any way of requesting information from HMDs while they are running. OpenHMD works around this by shutting down the driver for as long as Blender runs.
@LazyDodo (LazyDodo) made the point that this is not acceptable since it would mean users can't keep Blender open while using their HMD for non-Blender stuff (assuming it's not through OpenHMD). While we agreed that this isn't acceptable, we also noticed OpenHMD can only do this if Blender is launched as administrator. This has to be done explicitly though and is nothing users usually do, so with this in mind it's an acceptable (but still ugly :/) thing. Luckily we can blame proprietary drivers here :P

Just a recap, the current state is:

  1. Oculus support in Blender does not work unless we shut down the vendor-supplied 'driver' (service).
  2. When we do shut down this service, any other apps (UE4/Unity) using the Rift lose their HMD support, even though the HMD is not actively being used in Blender.
  3. This is ok, because none of our users will run as administrator and it's all the vendor's fault anyhow.

If you take all the right steps, run as admin, re-enable extended mode using some hackery (none of which we can expect our end users to do) and finally do get Blender using the HMD, there are the following issues:

  4. When you start the HMD session, the HMD will be set as the primary display, meaning the taskbar and all applications running on the main monitor (Blender!!) will move to the HMD, leaving you with a completely empty screen with just a wallpaper on the main monitor.
  5. There's something wrong with the projection; it seriously hurt my eyes after a few seconds.
  6. There is currently no positional tracking for the Oculus in OpenHMD.

I'm sorry, but given the current state, I see no compelling reason to ship Oculus support on the Windows platform. It's just not good enough. I'd be willing to give 6 a pass, but the other points, including the hackery to re-enable extended mode, are deal breakers.

I don't have any other HMDs to test with, so I have no idea what the experience will be for Vive users. It might not be the worst idea to post a build on Blender Artists and get some feedback from users before we merge this to master?

Julian Eisel (Severin) edited edge metadata. Edited Mar 7 2017, 5:23 PM
Julian Eisel (Severin) updated this revision to Diff 8390.
  • Return info in tooltip on why IPD button is disabled
  • HMD settings are completely independent from multiview, render and camera data now
  • Show info in tooltip on why HMD session can't be started
  • Fall back to return unit matrix in OpenHMD GHOST calls
  • Add libhidapi to install_deps.sh script.
  • All HMD options are in properties region (3D View) and User Preferences (System) now.
  • Allow zooming/panning while HMD view is in camera perspective
  • Use solid draw mode by default for HMD view
  • Mirror mode support (sync HMD viewpoint with regular 3D view)
  • Use view orientation data from current 3D view for creating HMD view
  • "Only Render" option for HMD view
  • Don't allow orthographic view in HMD view
  • Disable OpenHMD dummy device for release builds
  • PSVR, Oculus CV1 and Vive support.
  • Fixes for HMD support windows build (thanks LazyDodo)
  • Refactor split-view drawing to make popups readable
  • Don't draw HMD view lens shader if session is not running
  • Draw additional viewport info in HMD view again (3D-cursor, mini-axis, etc)

And of course a bunch of fixes.
Just updating the patch so people can have another look if they want; the patch here was ancient.

Just a recap, the current state is:

  1. Oculus support in Blender does not work unless we shut down the vendor-supplied 'driver' (service).
  2. When we do shut down this service, any other apps (UE4/Unity) using the Rift lose their HMD support, even though the HMD is not actively being used in Blender.
  3. This is ok, because none of our users will run as administrator and it's all the vendor's fault anyhow.

If you take all the right steps, run as admin, re-enable extended mode using some hackery (none of which we can expect our end users to do) and finally do get Blender using the HMD, there are the following issues:

  4. When you start the HMD session, the HMD will be set as the primary display, meaning the taskbar and all applications running on the main monitor (Blender!!) will move to the HMD, leaving you with a completely empty screen with just a wallpaper on the main monitor.
  5. There's something wrong with the projection; it seriously hurt my eyes after a few seconds.
  6. There is currently no positional tracking for the Oculus in OpenHMD.

I'm sorry, but given the current state, I see no compelling reason to ship Oculus support on the Windows platform. It's just not good enough. I'd be willing to give 6 a pass, but the other points, including the hackery to re-enable extended mode, are deal breakers.

In short: If we want a multi-platform HMD support in Blender that's all we can get for now. For Linux users that's a big step forward, for Windows users not so much. (Not sure how HMD support in OSX is doing in general.)
I'm not saying I'm really happy with this, and I'm still not against having additional OpenVR support for those who have proprietary drivers for their system. I made sure there's an abstraction between OpenHMD and GHOST/Blender, so if the OpenVR API isn't too terribly different (and from a quick glance it isn't), it shouldn't be a huge task to add support for it (volunteers, raise your hand!).
I'm not aware of what OpenHMD does that changes the primary display, and of course that should definitely not be the case. @Joey Ferwerda (TheOnlyJoey), could you describe what's going on there?

Regarding the projection issue you mentioned... it's of course hard to tell, but I'd guess this happened while in camera view? The projection matrix is still wrong in that case, one of the 4 things to fix before pushing. If this happens outside of camera view and if you don't get this in other apps, it's a bug that needs to be addressed.

In short: If we want a multi-platform HMD support in Blender that's all we can get for now. For Linux users that's a big step forward, for Windows users not so much. (Not sure how HMD support in OSX is doing in general.)

I do recognize this is a leap forward for some platforms, and I'm not flat out rejecting the differential; I'm just saying we shouldn't ship Oculus support on Windows (and should really investigate the Vive status) because support on this platform does not reach the quality bar. If it works properly on other platforms, there's no reason to hold those back.

Regarding the projection issue you mentioned... it's of course hard to tell, but I'd guess this happened while in camera view? The projection matrix is still wrong in that case, one of the 4 things to fix before pushing. If this happens outside of camera view and if you don't get this in other apps, it's a bug that needs to be addressed.

I just dragged in a monkey and clicked Start HMD Session. If that's not the right way, some other tweaks to the UI might be needed to prevent users from doing the same?

In short: If we want a multi-platform HMD support in Blender that's all we can get for now. For Linux users that's a big step forward, for Windows users not so much. (Not sure how HMD support in OSX is doing in general.)

I do recognize this is a leap forward for some platforms, and I'm not flat out rejecting the differential; I'm just saying we shouldn't ship Oculus support on Windows (and should really investigate the Vive status) because support on this platform does not reach the quality bar. If it works properly on other platforms, there's no reason to hold those back.

Regarding the projection issue you mentioned... it's of course hard to tell, but I'd guess this happened while in camera view? The projection matrix is still wrong in that case, one of the 4 things to fix before pushing. If this happens outside of camera view and if you don't get this in other apps, it's a bug that needs to be addressed.

I just dragged in a monkey and clicked Start HMD Session. If that's not the right way, some other tweaks to the UI might be needed to prevent users from doing the same?

So I just pushed the functionality for disabling the OVRService only when opening the device, while running in administrator mode.
This requires my hidapi patch (currently on https://github.com/TheOnlyJoey/hidapi), which will be mainlined in some form later on.

I also pushed fixes to the CV1, including new distortion parameters that are confirmed to be correct :)
If this is all fine, I will submit my documentation as a patch and would like to request a merge to master so we can get in before bcon3.

In short: If we want a multi-platform HMD support in Blender that's all we can get for now. For Linux users that's a big step forward, for Windows users not so much. (Not sure how HMD support in OSX is doing in general.)

I do recognize this is a leap forward for some platforms, and I'm not flat out rejecting the differential; I'm just saying we shouldn't ship Oculus support on Windows (and should really investigate the Vive status) because support on this platform does not reach the quality bar. If it works properly on other platforms, there's no reason to hold those back.

Regarding the projection issue you mentioned... it's of course hard to tell, but I'd guess this happened while in camera view? The projection matrix is still wrong in that case, one of the 4 things to fix before pushing. If this happens outside of camera view and if you don't get this in other apps, it's a bug that needs to be addressed.

I just dragged in a monkey and clicked Start HMD Session. If that's not the right way, some other tweaks to the UI might be needed to prevent users from doing the same?

So I just pushed the functionality for disabling the OVRService only when opening the device, while running in administrator mode.
This requires my hidapi patch (currently on https://github.com/TheOnlyJoey/hidapi), which will be mainlined in some form later on.
I also pushed fixes to the CV1, including new distortion parameters that are confirmed to be correct :)
If this is all fine, I will submit my documentation as a patch and would like to request a merge to master so we can get in before bcon3.

I just updated the libs in svn. We agreed to gather some user feedback with a test build before merging though; just because time is running out doesn't mean we can start cutting corners. I'll personally do some testing tonight though.

LazyDodo (LazyDodo) requested changes to this revision. Mar 25 2017, 10:12 PM
  1. The custom hidapi branch had bad, sloppy bugs, causing my CV1 not to be detected at all; I wasted most of Friday evening on this.
  2. https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/platform-win32.c#l120

    A couple of problems here:

    2a) Let's imagine the situation that the service is running and we want to start it. Result: let's stop the service! -> Should have used &&.

    2b) Which would perhaps have been forgivable if it was only called when the device is actually in use; however, ohmd_ctx_destroy calls destroy on all drivers, which calls ohmd_toggle_ovr_service(1). Result: every time I quit Blender, the OVRService gets stopped. (How does this not get noticed during testing?)
  3. After bypassing the bugs in 1+2, the HMD session starts but displays full screen on the main monitor, not the HMD. The only way to get any output on the HMD is to:
    A. Manually stop the service (because you can't do B with the service running)
    B. Manually switch the HMD into extended mode using DirectDisplayConfig.exe (which *NONE* of our users will know how to do, nor can we ask this of them)
    C. Click Open HMD Window -> Start Session (why is this 2 buttons anyhow!?)

      At this point we have output on the HMD; however, there are now the following issues to deal with:

      - This makes the Rift the main monitor, moving *all* applications over to it, leaving the main monitor with just a Windows background. Things stay this way until you manage to close Blender (which pretty much boils down to holding down Alt-F4 and hoping for the best). Forcing Windows to have the monitor be the main screen keeps everything on the monitor (including the HMD window), with a Windows wallpaper on the HMD.

      - Low frame rate: I'm not sure why our default cube scene, rendered as wireframe, would be running at 40fps? The movement is very jarring and unpleasant. I don't have the best GPU around (GTX 670), but I can run scenes with moderate complexity in UE4 at 90fps easily.

      - When I rotate my head, I seem to be rotating around a virtual point behind me: instant motion sickness! Not good! I feel this might have been caused by accidentally using my scroll wheel and changing the zoom. (Not sure why this is allowed; if anything, if we're going to allow this, it should move the camera position, not change the view matrix.)

      - No positional tracking on my Oculus CV1. I understand there's a technical reason for this, but for the end user it is still deeply disappointing and unexpected.

      - When you switch the HMD viewport to Rendered you lose all tracking.

      - There's a race condition, use-after-free or double-free problem somewhere; however, every time it happened Visual Studio was on the HMD, making it impossible to debug, and I haven't been able to repro it on my main monitor. It rarely happens and I can't seem to repro it easily, so I'm going to let this one slide (but the problem exists).
  4. Code
    4a. There are 4 packet.c files in the OpenHMD codebase; these all show up in the IDE without a path, so it's hard to tell which is which.
    4b. Funky code

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_oculus_rift/packet.c#l61
https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_deepoon/packet.c#l61

READFLOAT is a multi-statement macro; without braces only the first part will be executed inside the loop.

Mentioned this on IRC months ago, still not fixed; got told to send a pull request to OpenHMD. You're trying to merge this code into our codebase, not me: you fix the bugs, not me.

  5. Left-over debugging code (there's quite a bit of this, these are just some examples)

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_htc_vive/vive.c#l194
https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_htc_vive/vive.c#l197

  6. Code with unknown status: why is this here? Why is this disabled? Can it be removed? Should we do extra work to enable it?

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_htc_vive/vive.c#l321

  7. Calling CreateThread directly is not recommended

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/platform-win32.c#l66
see http://stackoverflow.com/questions/331536/windows-threading-beginthread-vs-beginthreadex-vs-createthread-c for details
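
For reference, a hedged sketch of the usual CRT-safe alternative (the wrapper name follows the earlier inline comment; the body is illustrative): _beginthreadex() from <process.h> sets up and tears down the C runtime's per-thread state, and its thread function returns unsigned rather than DWORD.

    #include <windows.h>
    #include <process.h>

    static unsigned __stdcall ohmd_thread_wrapper(void *arg)
    {
        /* ... run the device polling loop ... */
        (void)arg;
        return 0;
    }

    static HANDLE ohmd_create_thread(void *arg)
    {
        unsigned thread_id;
        return (HANDLE)_beginthreadex(NULL, 0, ohmd_thread_wrapper, arg, 0, &thread_id);
    }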

  8. Error checking

No validation if OpenSCManager succeeded and serviceDbHandle contains a valid value

No validation of serviceHandle

Doesn't validate the return value of QueryServiceStatusEx; if this call fails, status does not have valid information in it.

Mentioned above, but I'll mention it again: it should be &&, otherwise you'll *STOP* the service when Blender exits.

only needed if OpenSCManager succeeded

Only needed if OpenService succeeded
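
To make the requested checks concrete, here is a hedged sketch (not the branch's actual code) that validates every handle and return value, and only stops or starts the service when its current state actually calls for it, which is also where the && comes in:

    #include <windows.h>

    static BOOL ovr_service_set_running(BOOL want_running)
    {
        BOOL ok = FALSE;
        SC_HANDLE scm = OpenSCManager(NULL, NULL, SC_MANAGER_CONNECT);
        if (!scm)
            return FALSE;  /* no service manager access, nothing to do */

        SC_HANDLE svc = OpenService(scm, TEXT("OVRService"),
                                    SERVICE_QUERY_STATUS | SERVICE_START | SERVICE_STOP);
        if (svc) {
            SERVICE_STATUS_PROCESS status;
            DWORD needed;
            if (QueryServiceStatusEx(svc, SC_STATUS_PROCESS_INFO,
                                     (LPBYTE)&status, sizeof(status), &needed))
            {
                if (status.dwCurrentState == SERVICE_RUNNING && !want_running) {
                    SERVICE_STATUS dummy;
                    ok = ControlService(svc, SERVICE_CONTROL_STOP, &dummy);
                }
                else if (status.dwCurrentState == SERVICE_STOPPED && want_running) {
                    ok = StartService(svc, 0, NULL);
                }
            }
            CloseServiceHandle(svc);  /* only reached if OpenService succeeded */
        }
        CloseServiceHandle(scm);      /* only reached if OpenSCManager succeeded */
        return ok;
    }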

Final verdict:

It's virtually impossible to get any HMD output, and when it worked I couldn't do anything else. The performance is disappointing, features are lacking and it made me sick really quickly. The bugs I found were nothing short of sloppy and shouldn't have been there if this code had received *any* kind of testing (the same goes for the fixes I had to do in the custom hidapi fork).

Recommendations:

  1. I really can't recommend shipping with CV1 support on Windows for 2.79 (and would really like users of other HMDs to have a go at it before deciding to ship support there). Direct Mode will have to be implemented before I can give my blessing for shipping Oculus support on Windows.
  2. Get users involved! Throw out testing builds, gather feedback! This seems to be developed by technical people making technical solutions to problems, without ever seeing how this would affect an end user (the mucking about with the service and DirectDisplayConfig is a good example of this, the Open HMD Window / Start Session flow another). Most of our other branches have a thread on BA with people using them and providing feedback before finally getting considered for merging into master, so why not the HMD_viewport branch? I feel like this branch is getting ready for testing, not for merging into master.
  3. Stop working in isolation! These code drops every couple of months, with promises of how almost done it is, are clearly not the way to go.
  4. Carry over VR support to 2.80; I don't think it's quite mature enough for 2.79.
This revision now requires changes to proceed. Mar 25 2017, 10:12 PM

First of all I want to mention that all of LazyDodo's testing was done with the Oculus CV1 on Windows, one of the 6 devices currently supported by the OpenHMD version in Blender, and the most problematic case overall.

  1. The custom hidapi branch had bad, sloppy bugs, causing my CV1 not to be detected at all; I wasted most of Friday evening on this.

I found this odd; after your feedback I fixed the issue and it is working out of the box right now. I know the code is not perfect, but it is functional and hopefully a temporary fix.

  2. https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/platform-win32.c#l120 A couple of problems here: 2a) Let's imagine the situation that the service is running and we want to start it. Result: let's stop the service! -> Should have used &&. 2b) Which would perhaps have been forgivable if it was only called when the device is actually in use; however, ohmd_ctx_destroy calls destroy on all drivers, which calls ohmd_toggle_ovr_service(1). Result: every time I quit Blender, the OVRService gets stopped. (How does this not get noticed during testing?)

I have not reimplemented re-enabling the driver yet; if this is a problem, it is literally a couple-of-lines fix.
The service would only be stopped when starting the HMD session, and re-enabled when the session stops, if it was previously enabled.
The reason this functionality is needed is that there is no easy way of disabling the OVR service through official methods.
Drivers by other vendors manage this a lot more nicely.

  3. After bypassing the bugs in 1+2, the HMD session starts but displays full screen on the main monitor, not the HMD. The only way to get any output on the HMD is to:
    A. Manually stop the service (because you can't do B with the service running)
    B. Manually switch the HMD into extended mode using DirectDisplayConfig.exe (which *NONE* of our users will know how to do, nor can we ask this of them)
    C. Click Open HMD Window -> Start Session (why is this 2 buttons anyhow!?)

So the DirectDisplayConfig.exe stuff is a known situation and is part of the documentation; it is also only the case for the CV1 (on Windows), and only when the official Oculus driver is present.
Not all HMDs support Direct Mode, and some work better in extended mode, so there should always be an option to manually move your HMD window to the correct screen.
Eventually, when OpenHMD gets access to Direct Mode, we can implement this for most HMDs, but since it has drawbacks as well (it can claim input exclusivity, etc.), it should always be optional.
Having the window appear on the HMD screen automatically can be done with extended mode as well, but would require heavy modifications to the Blender GHOST window (a move to SDL2 for windowing has been suggested by me in the past, but got shut down).

At this point we have output on the HMD; however, there are now the following issues to deal with:
- This makes the Rift the main monitor, moving *all* applications over to it, leaving the main monitor with just a Windows background. Things stay this way until you manage to close Blender (which pretty much boils down to holding down Alt-F4 and hoping for the best). Forcing Windows to have the monitor be the main screen keeps everything on the monitor (including the HMD window), with a Windows wallpaper on the HMD.

Your CV1 can be managed like a normal screen, just like any other secondary display, while in extended mode; this is also part of the documentation.
These issues are unfortunately part of the problematic CV1 support on Windows by Oculus themselves for anything outside their closed ecosystem. Having access to direct mode (hurry up Nvidia/AMD please) will make this easier, but it's still usable without.

- Low frame rate: I'm not sure why our default cube scene, rendered as wireframe, would be running at 40fps? The movement is very jarring and unpleasant. I don't have the best GPU around (GTX 670), but I can run scenes with moderate complexity in UE4 at 90fps easily.

This is weird; I am testing on an Intel GPU (Iris Pro 580P) and an Nvidia GTX 1060, and both are giving me a constant 60 fps vsynced, even on heavier test scenes. If you can give me some profiled debug information I can look into it though.

- When I rotate my head, I seem to be rotating around a virtual point behind me: instant motion sickness! Not good! I feel this might have been caused by accidentally using my scroll wheel and changing the zoom. (Not sure why this is allowed; if anything, if we're going to allow this, it should move the camera position, not change the view matrix.)

This is one of @Julian Eisel (Severin)'s todos that he is working on right now; I mentioned this on IRC as well, it is an easy fix. I had hoped this would be pushed before your review.

- No positional tracking on my Oculus CV1. I understand there's a technical reason for this, but for the end user it is still deeply disappointing and unexpected.

Yeah, this has technical limitations and will have for a bit, mostly due to the complexity of a dependency on libusb, which has been shot down for inclusion in Blender in the past.
This will not be an issue for devices like the HTC Vive, since these don't require external USB trackers.

- When you switch the HMD viewport to Rendered you lose all tracking.

Rendered is not supported; it should be removed from the dropdown.

- There's a race condition, use-after-free or double-free problem somewhere; however, every time it happened Visual Studio was on the HMD, making it impossible to debug, and I haven't been able to repro it on my main monitor. It rarely happens and I can't seem to repro it easily, so I'm going to let this one slide (but the problem exists).

Yeah, I tried a couple of times but cannot reproduce this; let me know when you find it though!

  4. Code
    4a. There are 4 packet.c files in the OpenHMD codebase; these all show up in the IDE without a path, so it's hard to tell which is which.
    4b. Funky code

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_oculus_rift/packet.c#l61
https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_deepoon/packet.c#l61
READFLOAT is a multi-statement macro; without braces only the first part will be executed inside the loop.
Mentioned this on IRC months ago, still not fixed; got told to send a pull request to OpenHMD. You're trying to merge this code into our codebase, not me: you fix the bugs, not me.

  5. Left-over debugging code (there's quite a bit of this, these are just some examples)

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_htc_vive/vive.c#l194
https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_htc_vive/vive.c#l197

The Vive code is still in constant development; since this was a platform that needed fast merging for the 2.79 target, I merged it.
It is pretty functional and will get cleaned up in the next month.

  6. Code with unknown status: why is this here? Why is this disabled? Can it be removed? Should we do extra work to enable it?

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/drv_htc_vive/vive.c#l321

See comment above

  7. Calling CreateThread directly is not recommended

https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/refs/heads/HMD_viewport:/extern/openhmd/src/platform-win32.c#l66
see http://stackoverflow.com/questions/331536/windows-threading-beginthread-vs-beginthreadex-vs-createthread-c for details

I did not write this; I will ask the developer who did to look into it.

  8. Error checking

No validation if OpenSCManager succeeded and serviceDbHandle contains a valid value

No validation of serviceHandle

Doesn't validate the return value of QueryServiceStatusEx; if this call fails, status does not have valid information in it.

Mentioned above, but I'll mention it again: it should be &&, otherwise you'll *STOP* the service when Blender exits.

only needed if OpenSCManager succeeded

Only needed if OpenService succeeded

I will add the needed error handling in my next update.

Final verdict:
It's virtually impossible to get any HMD output, and when it worked I couldn't do anything else. The performance is disappointing, features are lacking and it made me sick really quickly. The bugs I found were nothing short of sloppy and shouldn't have been there if this code had received *any* kind of testing (the same goes for the fixes I had to do in the custom hidapi fork).

Recommendations:

  1. I really can't recommend shipping with CV1 support on Windows for 2.79 (and would really like users of other HMDs to have a go at it before deciding to ship support there). Direct Mode will have to be implemented before I can give my blessing for shipping Oculus support on Windows.

As mentioned before, Direct Mode is a difficult situation, with vendor lock-ins and difficult processes to get access to it. Extended mode works, but needs configuration (though you use the same steps as with an external display).
People using other HMDs don't have these issues, and the tens of thousands of users of Oculus DK1 and DK2 devices will not have this issue either (since there is no support through the Oculus driver anyway!).

  2. Get users involved! Throw out testing builds, gather feedback! This seems to be developed by technical people making technical solutions to problems, without ever seeing how this would affect an end user (the mucking about with the service and DirectDisplayConfig is a good example of this, the Open HMD Window / Start Session flow another). Most of our other branches have a thread on BA with people using them and providing feedback before finally getting considered for merging into master, so why not the HMD_viewport branch? I feel like this branch is getting ready for testing, not for merging into master.

People have been testing ever since the first commit to the branch; we quite often help people build Blender in the OpenHMD channel, and we have gathered and used feedback to improve functionality further and fix bugs.
The documentation is partly written using this feedback and examples from the testers.

  3. Stop working in isolation! These code drops every couple of months, with promises of how almost done it is, are clearly not the way to go.

Cool, can you get me paid to work on this? This is still a labor of love I am doing mostly in my spare time; if you want more frequent updates, please either help out with the development or get me funding.

  4. Carry over VR support to 2.80; I don't think it's quite mature enough for 2.79.

I disagree. If you feel strongly about the CV1 support, I would suggest leaving it out of the supported devices and focusing on the Vive, Oculus DK1, DK2, PSVR and Deepoon E2, but keeping the CV1 support enabled for the people who are OK with the additional steps.
I have mostly been testing on Linux (where none of these issues exist) and with the DK2, Vive and PSVR on Windows, where the PSVR and DK2 use extended mode by default, and the Vive can be set to extended mode really simply through its own interface and works as a regular external display as well.

All the issues mentioned (minus direct mode) can be squashed in about a week, and I think it would be unwise to hold this back while it is fully functional for multiple devices on all platforms, with still about 2 months left to stabilize and work on the last tiny things.

First of all I want to mention that all of LazyDodo's testing was done with the Oculus CV1 on Windows, one of the 6 devices currently supported by the OpenHMD version in Blender, and the most problematic case overall.

  1. The custom hidapi branch had bad, sloppy bugs, causing my CV1 not to be detected at all; I wasted most of Friday evening on this.

I found this odd; after your feedback I fixed the issue and it is working out of the box right now. I know the code is not perfect, but it is functional and hopefully a temporary fix.

I'm not sure what's odd about it? The patch I sent didn't just fall out of the sky; time and effort went into it. Also, it doesn't work out of the box quite yet: the svn libs still need updating, I will probably get that done this weekend.

- There's a race condition, use-after-free or double-free problem somewhere; however, every time it happened Visual Studio was on the HMD, making it impossible to debug, and I haven't been able to repro it on my main monitor. It rarely happens and I can't seem to repro it easily, so I'm going to let this one slide (but the problem exists).

Yeah, I tried a couple of times but cannot reproduce this; let me know when you find it though!

It's flaky but....
-have a debug build
-start blender
-open hmd window
-start hmd session
-open preferences
-close blender

This will work perfectly 90% of the time, but occasionally you get:

	blender.exe!sig_handle_crash(int signum) Line 135	C    <-- extra bonus crash here, cause our crash handler assumes G.main is not null, which it is at this point in time, will file a separate bug on this 
 	blender.exe!windows_exception_handler(_EXCEPTION_POINTERS * ExceptionInfo) Line 267	C
 	[External Code]	
 	blender.exe!GHOST_OpenHMDManager::closeDevice() Line 207	C++  <-- actual crash here  
 	blender.exe!GHOST_OpenHMDManager::~GHOST_OpenHMDManager() Line 47	C++
 	[External Code]	
 	blender.exe!GHOST_System::exit() Line 367	C++
 	blender.exe!GHOST_System::~GHOST_System() Line 66	C++
 	blender.exe!GHOST_SystemWin32::~GHOST_SystemWin32() Line 219	C++
 	[External Code]	
 	blender.exe!GHOST_ISystem::disposeSystem() Line 94	C++
 	blender.exe!GHOST_DisposeSystem(GHOST_SystemHandle__ * systemhandle) Line 59	C++
 	blender.exe!wm_ghost_exit() Line 1501	C
 	blender.exe!WM_exit_ext(bContext * C, const bool do_python) Line 581	C
 	blender.exe!WM_exit(bContext * C) Line 614	C
 	blender.exe!wm_window_close(bContext * C, wmWindowManager * wm, wmWindow * win) Line 326	C
 	blender.exe!ghost_event_proc(GHOST_EventHandle__ * evt, void * C_void_ptr) Line 1117	C
 	blender.exe!GHOST_CallbackEventConsumer::processEvent(GHOST_IEvent * event) Line 53	C++
 	blender.exe!GHOST_EventManager::dispatchEvent(GHOST_IEvent * event) Line 102	C++
 	blender.exe!GHOST_EventManager::dispatchEvent() Line 113	C++
 	blender.exe!GHOST_EventManager::dispatchEvents() Line 120	C++
 	blender.exe!GHOST_System::dispatchEvents() Line 247	C++
 	blender.exe!GHOST_DispatchEvents(GHOST_SystemHandle__ * systemhandle) Line 239	C++
 	blender.exe!wm_window_process_events(const bContext * C) Line 1446	C
 	blender.exe!WM_main(bContext * C) Line 505	C
 	blender.exe!main(int argc, const unsigned char * * UNUSED_argv_c) Line 529	C
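
The root cause isn't clear from the trace alone, but a purely illustrative mitigation (the real GHOST_OpenHMDManager internals may differ) is to make the close path idempotent at the OpenHMD C API level, so a second call during shutdown can't touch freed state:

    #include <stddef.h>
    #include "openhmd.h"  /* OpenHMD C API; the include path depends on the build setup */

    typedef struct {
        ohmd_context *ctx;
        ohmd_device *device;
    } hmd_state;

    static void hmd_close(hmd_state *s)
    {
        if (s == NULL || s->ctx == NULL) {
            return;  /* already closed (or never opened): nothing to do */
        }
        /* destroying the context also closes devices opened from it */
        ohmd_ctx_destroy(s->ctx);
        s->ctx = NULL;
        s->device = NULL;  /* avoid dangling pointers if close is called again */
    }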

Hey!

I was asked to post my results on using this branch.
I have been doing occasional testing for about half a year now, initially with my Oculus DK2 and, since the last version, with my Vive.
On Linux everything works how I would expect it; the only thing which is a little confusing is that there is no virtual mouse representation yet in the VR window.
Normally this is done by creating a virtual mouse pointer in each half and having them track at half of the horizontal resolution when the hardware mouse is present in the window.
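
A small C sketch of that mapping, just to make the idea concrete (names are mine): the hardware cursor at (mx, my) in a side-by-side window of width w is mirrored into both halves, each covering half the horizontal resolution.

    typedef struct { float x, y; } cursor_pos;

    static void hmd_virtual_cursors(float mx, float my, float w,
                                    cursor_pos *r_left, cursor_pos *r_right)
    {
        const float half = w * 0.5f;
        const float x_in_half = mx * 0.5f;  /* full-window motion squeezed into one half */

        r_left->x = x_in_half;
        r_left->y = my;
        r_right->x = x_in_half + half;      /* same spot, offset into the right-eye half */
        r_right->y = my;
    }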

On Windows I only tested twice; the first time (about 4 months ago) I had a bit of trouble getting it to work, but it functioned mostly like on Linux after I made some tweaks.
With the current functionality it works without any modification for me; both my Vive and Oculus worked fine (I had to rotate the DK2 screen, but that is normal).
I must add that I don't have the official Oculus drivers installed, so the OVRService issues do not apply to me; this is because both the DK1 and DK2 have no support from Oculus anyway.

When I use it my primary use case is mostly architectural; I am an interior designer and use the VR support to get a better sense of depth and the feel of a room.
It has already helped me speed up my process in 2 projects, and I would love for this to be widely available (so I am not the one having to explain to my colleagues what 'compiling' means...).

Edit: Forgot to mention, I am using Kubuntu 16.04 and Windows 10 64-bit.

intern/ghost/intern/GHOST_C-api.cpp (line 942)

This segfaults when you start Blender with the --background parameter.

I used a much earlier version of this to model one of my girlfriend's characters. She wore the HMD while I did the editing. We tried it again with all these changes and it is a big improvement. For example, mirror mode makes repositioning the HMD in the virtual world faster and much more pleasant for the HMD wearer.

extern/openhmd/src/openhmd.c (line 332)

Missing return

extern/openhmd/src/openhmd.c (line 338)

Missing return

Reviewed Blender, window-manager side. Made some minor fixes but otherwise think this is fine on the DNA/RNA/WindowManager side.


I have one minor concern: in some places the wm->hmd_view.hmd_win window pointer is only updated inside WITH_INPUT_HMD blocks.

This could be a problem if you:

  • load a file made with an HMD-enabled Blender
  • modify the windows in a Blender built without HMD support
  • then open the file again in an HMD-enabled Blender.

The wm->hmd_view.hmd_win pointer could be corrupt.

So I suggest keeping all window pointer management outside WITH_INPUT_HMD blocks.
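
A simplified sketch of that suggestion (not the actual wm_window code; the helper at the end is hypothetical): the pointer bookkeeping stays unconditional, and only the HMD-specific behaviour sits inside the guard.

    #include "DNA_windowmanager_types.h"  /* wmWindowManager, wmWindow (Blender) */

    /* hypothetical HMD-only helper, declared for the sketch */
    void wm_hmd_session_stop_if_running(wmWindowManager *wm, wmWindow *win);

    static void wm_window_unlink_hmd(wmWindowManager *wm, wmWindow *win)
    {
        /* unconditional: files then round-trip safely through non-HMD builds */
        if (wm->hmd_view.hmd_win == win) {
            wm->hmd_view.hmd_win = NULL;
        }

    #ifdef WITH_INPUT_HMD
        /* HMD-only behaviour (stopping the session, etc.) stays guarded */
        wm_hmd_session_stop_if_running(wm, win);
    #endif
    }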

Joey Ferwerda (TheOnlyJoey) updated this revision to Diff 8624.
  • Use projection matrix from OpenHMD even if "Rotation from HMD" is disabled
  • OpenHMD update, disables OVRService upon starting the device, only when run with administrator.
  • Fixes Oculus CV1 Lens Correction
  • msvc fix in packet.c in PSVR driver
  • Merge branch 'master' into HMD_viewport
  • Fix HMD panel not visible after recent changes in master
  • Disable zooming/panning of HMD view while in camera perspective
  • Updated Vive lens correction values, is a decent approximation, needs extra sample points when looking down
  • Numerous small fixes from upstream 0.3rc
  • Fixes for Windows compile, fixes crashes when trying to open locked devices
  • Merge branch 'master' into HMD_viewport
  • Fix for building without HMD
  • Use max_ff
  • Correct last commit, also quiet warnings
  • use 'use_' as an RNA prefix
  • Sync with master and minor formatting
  • Remove ifdef check for file versioning
  • Clear HMD window when not loaded
  • Merge branch 'master' into HMD_viewport
  • Cleanup: tabs
  • Special Blender-managed cursor drawing for HMD view
  • Add WM level utility check for active HMD view
  • Make interactions in HMD view work nicely
  • Fix Multi-view drawing interfering with HMD view drawing
  • Remove unnecessary check
  • Cleanup: Naming, use util function
  • Fix HMD view drawing black if device is set to 'None'
  • Fix crash when closing HMD window during running session
  • Re calibrated the vertical position for the Vive lens distortion shader
  • Fix projection matrices not taking focal length of HMD into account
  • Fix lens separation applied wrongly onto projection matrix
  • Force using HMD parameters instead of camera ones in camera perspective
  • Fix manipulator interaction not using correct projection matrix

Hey @Joey Ferwerda (TheOnlyJoey) , so I took a peek at the openhmd driver source to get familiar w/ what you integrated and have this branch running in the VS debugger - ready to roll man.

Where are you currently at with this and where could you use some help?

This revision now requires changes to proceed. Jun 9 2017, 2:00 AM

Joey,

First, support for the idea. I was looking for this a while back, I just found out this exists. I just bought a couple of junk HMD modelling tools, really pretty awful. Wish I had Blender in my Oculus. Granted, it's still early for a good quality experience, but IMO this sort of functionality is the real utility of HMDs.

Second, I feel your pain, thanks for the work. Unfortunately C isn't one of my strengths; C#, Web here. If I can help in any way, let me know. I see one reviewer is a little harsh in the phrasing. Please don't take it hard, some engineers just lack tact during review; on the bright side his comments look like good effort was expended and good info was collected as a result.

Lastly, I was sadly unable to get the 4-18 11:18 build functioning properly on Windows, NVidia. Here's my experience:

  • I tried starting without changes first. After "Start Session", the HMD window opens fullscreen on my main display, and (non-positional) head tracking is reflected there.
  • When I don the headset, the Oculus application manager starts in the HMD. I figured that can't be good, so I looked around. Hints were hard to find, but I eventually dug through the comments here, and so I tried stopping the OVR Service and running DirectDisplayConfig. I hear device enable/disable chimes in Windows, but after trying various sequences, I wasn't able to get the viewport directly into the HMD, or get the HMD into an "extended" state where I can drag windows to it.

I'd love help diagnosing this so I can try it out.

Joey,
First, support for the idea. I was looking for this a while back, I just found out this exists. I just bought a couple of junk HMD modelling tools, really pretty awful. Wish I had Blender in my Oculus. Granted, it's still early for a good quality experience, but IMO this sort of functionality is the real utility of HMDs.
Second, I feel your pain, thanks for the work. Unfortunately C isn't one of my strengths; C#, Web here. If I can help in any way, let me know. I see one reviewer is a little harsh in the phrasing. Please don't take it hard, some engineers just lack tact during review; on the bright side his comments look like good effort was expended and good info was collected as a result.
Lastly, I was sadly unable to get the 4-18 11:18 build functioning properly on Windows, NVidia. Here's my experience:

  • I tried starting without changes first. After "Start Session", the HMD window opens fullscreen on my main display, and (non-positional) head tracking is reflected there.
  • When I don the headset, the Oculus application manager starts in the HMD. I figured that can't be good, so I looked around. Hints were hard to find, but I eventually dug through the comments here, and so I tried stopping the OVR Service and running DirectDisplayConfig. I hear device enable/disable chimes in Windows, but after trying various sequences, I wasn't able to get the viewport directly into the HMD, or get the HMD into an "extended" state where I can drag windows to it.

I'd love help diagnosing this so I can try it out.

Hey, thanks for the report!

So currently on Windows you have to run Blender with administrator permissions so it can automatically disable OVRService (a Windows service that is always on). This will change soon with a patch that is currently in review, which will ask for elevation when disabling the service.
Extended mode is a bit tough: when the OVR driver is installed, you have to run 'DirectDisplayConfig.exe off' to disable direct mode.
If the OVR service is not installed, it works out of the box.
The screen will turn on in extended mode when the HMD window is created.

We are in the process of getting direct mode access on Windows through NVIDIA, but the NDA process is a bit tough, so that can take a while.
We do have extended mode support on Linux now, though that still has to be implemented on the Blender side.

So, though I have spoken to you on IRC, it's nice to give a reaction here as well.
Something that might be nice to investigate is oversampling.
Currently we are using the native resolution, but it is always nice to have the option to oversample the viewport to increase the clarity of the image.
Since you are quite new to Blender, this is a small and manageable feature that would be a good exercise; there's a bit of GL involved.

Awesome man - I'm out on a long vacation w/ mi familia until next week, but I'll start looking at it when I get back.

Thank you so much for the reply, Joey. I'm sorry I haven't gotten back to this right away to return the favor. I'm kind of dreading trying it again after my failure last time, since the process you describe sounds like what I already tried - or really it's just all the unfinished projects I have going on. But I do still plan to try this again in the next week or two.

@Joey Ferwerda (TheOnlyJoey) - Hey man, just wanted to drop a line so you don't think I forgot about this :D

I hadn't gotten around to any off-screen rendering in OpenGL prior to this, so I spent the last few weeks tinkering with source code from a couple of resources (the OpenGL Superbible, 7th Ed. and learnopengl.com - both really awesome, btw) to better understand how to do off-screen rendering with multiple FBOs in general, starting from the smaller code samples they provide. So I haven't touched the Blender code yet, but now that I have a better feel for how to move data between FBOs and the default framebuffer when their color buffers have different resolutions, I'm going to start digging into the Blender code for this next weekend.
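
To make that concrete, here's the bare-bones pattern I've been practicing with (plain OpenGL via GLEW, nothing Blender-specific - the names and the whole setup below are placeholders, not the GPUFX code):

/* Render into an offscreen FBO at a scaled resolution, then blit the result
 * to the window-sized default framebuffer. Assumes an OpenGL 3.x context and
 * an initialized loader such as GLEW; error handling is minimal. */
#include <GL/glew.h>

static GLuint create_offscreen_fbo(int w, int h, GLuint *r_color, GLuint *r_depth)
{
	GLuint fbo;
	glGenFramebuffers(1, &fbo);
	glBindFramebuffer(GL_FRAMEBUFFER, fbo);

	/* Color attachment at the over/undersampled resolution. */
	glGenTextures(1, r_color);
	glBindTexture(GL_TEXTURE_2D, *r_color);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, *r_color, 0);

	/* Depth renderbuffer so the scene can still be depth tested. */
	glGenRenderbuffers(1, r_depth);
	glBindRenderbuffer(GL_RENDERBUFFER, *r_depth);
	glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, w, h);
	glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, *r_depth);

	if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
		/* handle incomplete framebuffer */
	}
	glBindFramebuffer(GL_FRAMEBUFFER, 0);
	return fbo;
}

static void blit_to_window(GLuint fbo, int src_w, int src_h, int win_w, int win_h)
{
	/* Filtered blit from the offscreen buffer down (or up) to the default
	 * framebuffer at window resolution. */
	glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
	glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
	glBlitFramebuffer(0, 0, src_w, src_h, 0, 0, win_w, win_h,
	                  GL_COLOR_BUFFER_BIT, GL_LINEAR);
	glBindFramebuffer(GL_FRAMEBUFFER, 0);
}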

I'll hit you up in IRC as questions come up (and they certainly will haha).

Cheers,
Danrae

Hey @Joey Ferwerda (TheOnlyJoey), sorry for the slow progress here (and for blowing up your IRC this weekend, lol) - the end of August / September was crazy for me because we had that hurricane hit Texas, and then I had to move closer to work about a week later during all the aftermath... we were totally fine - we had planned the move long before the hurricane came, but ended up having to do it during all the craziness in Houston...

Anywho, this has been an awesome intro to the Blender window management / drawing code - I think I finally have a solution and wanted to run it by you, as well as get your thoughts on how to make this configurable in the UI for the end user...

I was thinking we could just define a float value in "GPULensDistSettings" (called "sampleFactor" or something) by which we scale both the GPUFX offscreen FBO color_buffer / color_buffer_sec widths & heights during their creation (so as not to screw up the aspect ratios)... we could expose this parameter through the OpenHMD UI panel, restricting it to some reasonable range. I think this means the lens distortion shader gets run on a higher-resolution texture (containing the drawn viewport scene) when drawing the result to the backbuffer - is this what you intended for this feature?

As far as using a float value for the end user to control the offscreen-rendering resolution, I thought it would be easier this way because they could downsample to half the display resolution by setting the "Sample Factor" to 0.5, or oversample to twice the display resolution by setting it to 2.0, etc. Maybe we could clamp it to some reasonable range like 0.25 - 4.0 (a minimum offscreen rendering resolution of 1/4 the display resolution, up to 4 times the display resolution)... what do you think?
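
Roughly, something like this - just a sketch of what I mean; apart from the proposed "sampleFactor" value, every name below is a placeholder rather than actual GPUFX code:

/* Sketch of the proposed sample factor: scale both dimensions uniformly so
 * the aspect ratio is preserved, and clamp the factor to a sane range. */
#define HMD_SAMPLE_FACTOR_MIN 0.25f
#define HMD_SAMPLE_FACTOR_MAX 4.0f

static void hmd_offscreen_size(float sample_factor,
                               int display_w, int display_h,
                               int *r_w, int *r_h)
{
	if (sample_factor < HMD_SAMPLE_FACTOR_MIN) sample_factor = HMD_SAMPLE_FACTOR_MIN;
	if (sample_factor > HMD_SAMPLE_FACTOR_MAX) sample_factor = HMD_SAMPLE_FACTOR_MAX;

	/* Both axes use the same factor, so the aspect ratio is untouched. */
	*r_w = (int)(display_w * sample_factor + 0.5f);
	*r_h = (int)(display_h * sample_factor + 0.5f);
}

/* e.g. sample_factor = 2.0f renders the viewport at twice the panel
 * resolution before the lens distortion shader samples it back down. */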

Hey @Joey Ferwerda (TheOnlyJoey), I just dropped a patch for the HMD viewport oversampling / undersampling, buuuut... drumroll (major #fail coming) I accidentally created a completely separate revision for it D:

I thought just mentioning this revision's number (D2133) in the title would automatically link it - but it didn't, so I tried putting D2133 in the body after to no avail lol...

Anywho - it's out there - https://developer.blender.org/D2902

Let me know if you think I need to change anything!

Cheers,
Danrae

Great!

I will look at it later this week and merge it in the branch, or give feedback if needed.
You are just in time as well, since I was merging in master and planning to move to 2.8 for future development, which would be the next step to keep the HMD branch up to date.

A small update from my side, with a short todo list:

  • The next patch will be VSYNC control in Blender: being able to toggle VSYNC on and off, and automatically VSYNC'ing to HMD screens (this resolves tearing issues etc.) - see the sketch below for the kind of swap-interval calls involved.

This will also allow you to force VSYNC on a particular 3D context.

  • Fix up the pointer to use the same convergence settings as the 2D widgets; this should be the last patch we need for basic 2D work
  • Prepare the multi-device interface for OpenHMD to allow multiple devices to be used in the viewport; we need this for controllers and for setups like the NOLO
  • Re-research the libusb dependency regarding positional tracking setups with cameras (such as the Oculus DK2/CV1, PSVR, Microsoft MR)

OpenHMD 0.3.0 will be released soon and will be the stable version targeted for the Blender HMD release; if additional drivers are added to master, we will merge them in manually on request.
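
For reference, the VSYNC toggle boils down to the platform swap-interval extensions. A rough, non-Blender sketch (WGL_EXT_swap_control on Windows, GLX_EXT_swap_control on X11; the actual patch would presumably go through GHOST):

/* Rough sketch of toggling VSYNC via the swap-control extensions. Extension
 * availability checks are omitted; a current GL context is assumed. */
#ifdef _WIN32
#  include <windows.h>

typedef BOOL (WINAPI *SwapIntervalWGLFn)(int interval);

static void set_vsync_win32(int interval)  /* 0 = off, 1 = sync to every refresh */
{
	SwapIntervalWGLFn wgl_swap_interval =
	        (SwapIntervalWGLFn)wglGetProcAddress("wglSwapIntervalEXT");
	if (wgl_swap_interval) {
		wgl_swap_interval(interval);
	}
}
#else
#  include <GL/glx.h>

typedef void (*SwapIntervalGLXFn)(Display *dpy, GLXDrawable drawable, int interval);

static void set_vsync_glx(Display *dpy, GLXDrawable drawable, int interval)
{
	SwapIntervalGLXFn glx_swap_interval =
	        (SwapIntervalGLXFn)glXGetProcAddress((const GLubyte *)"glXSwapIntervalEXT");
	if (glx_swap_interval) {
		glx_swap_interval(dpy, drawable, interval);
	}
}
#endif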

Awesome man! I'm moving on to 2.8 tasks now - but let me know if you need any more help... I'll be doing Blender work almost daily moving forward.

Also, I wanted to start capturing feature requirements in preliminary end-user docs for any features I work on in Blender moving forward, so here's a rough start:

https://docs.google.com/document/d/1R2jTDH-g9qBgxTqPHvigzxyPEqVFHtTW6GT4dk7UQmQ/edit?usp=sharing

I tried to follow the "look and feel" of the stuff at docs.blender.org, but it's a Google Doc, so, yeah lol... There's quite a bit of detail that could be added, and I didn't know what the "Mirror HMD View" or "Only Render" settings were for. If you want edit access to the doc, shoot me an email at blink.ornitier@gmail.com and I'll send you a private link with edit permissions.

Lastly, I only added the "Enable HMD View" checkbox because there is not actually any GPULensDistSettings struct allocated during normal operation. That said, from a usability perspective, I thought maybe this checkbox could replace the "Open HMD Window" button, but I didn't want to replace it without talking to you first.
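
To illustrate what I mean by that - purely hypothetical, apart from GPULensDistSettings and Blender's guarded allocator everything below is invented for illustration:

/* Hypothetical sketch of what the "Enable HMD View" checkbox does conceptually:
 * allocate the lens distortion settings on demand and free them when disabled.
 * Assumes GPULensDistSettings is declared in DNA_gpu_types.h as in this patch;
 * HMDViewData and hmd_view_set_enabled() are made-up names. */
#include <stdbool.h>

#include "MEM_guardedalloc.h"
#include "DNA_gpu_types.h"

typedef struct HMDViewData {
	GPULensDistSettings *lensdist;  /* NULL during normal operation */
} HMDViewData;

static void hmd_view_set_enabled(HMDViewData *data, bool enable)
{
	if (enable && data->lensdist == NULL) {
		data->lensdist = MEM_callocN(sizeof(*data->lensdist), "GPULensDistSettings");
	}
	else if (!enable && data->lensdist != NULL) {
		MEM_freeN(data->lensdist);
		data->lensdist = NULL;
	}
}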

btdubs, I got a Windows MR headset too if you need some dev done on this. My initial testing looks pretty damn promising - inside-out tracking is really impressive o/