
Fisheye / panorama in EEVEE
Confirmed, Normal · Public · To Do

Assigned To
None
Authored By
Dalai Felinto (dfelinto)
Aug 21 2019, 4:15 PM

Description

Same level of support as we provide for Cycles; no selection support, and no support outside Lookdev/Render modes.

  • Basic equirectangular support
  • Basic fisheye support
  • Overscan to better support screen-space shaders and compositing
  • Optimized fisheye support (restrict the number of rendered views to the bare minimum)
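For reference, the equirectangular projection the first item asks for boils down to a latitude/longitude lookup per pixel. A minimal sketch of the pixel-to-direction mapping (the axis and wrapping conventions here are assumptions for illustration, not Blender's exact code):

```python
import math

def equirect_to_direction(u, v):
    """Map equirectangular UV in [0, 1] to a unit view direction.

    Assumed convention: u wraps longitude from -pi to pi,
    v spans latitude from -pi/2 to pi/2, +Z is forward.
    """
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))
```

Evaluating this per output pixel and sampling the scene along the resulting direction is what a rasterizer like EEVEE has to emulate with multiple planar views.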

Event Timeline

Dalai Felinto (dfelinto) lowered the priority of this task from 90 to Normal. Aug 21 2019, 4:15 PM
Dalai Felinto (dfelinto) created this task.
Dalai Felinto (dfelinto) edited a custom field.

Instead of using the Cycles panoramic algorithms, wouldn’t EEVEE be better off by (1) rendering onto a skybox, then (2) applying DoF into another skybox, whose faces are only then (3) mapped onto an equirectangular, fisheye, or any other panorama? Such an implementation seems much simpler, yet still fairly accurate.

As a bonus, the pixels at the poles receive a fairer number of samples compared to the ones at the equator. As another bonus, the skybox could be output directly into a game-engine-friendly image format (before and/or after phase (2)). Alternatively, phases (2) and (3) could be implemented as a compositor node or a custom OSL script. Just an idea.
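Phase (3) of the suggestion above is essentially a cubemap lookup per output pixel: pick the face along the dominant axis of the view direction, then project onto that face. A rough sketch of the face selection (the face names and sign conventions are illustrative, not EEVEE's actual layout):

```python
def direction_to_cubemap(d):
    """Pick the cubemap face and in-face UV for a direction vector.

    Face labels ('+X', '-X', ...) and UV orientation are an assumed
    convention; a real implementation would follow the engine's own
    face ordering.
    """
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, sc, tc, ma = ('+X' if x > 0 else '-X'), (-z if x > 0 else z), -y, ax
    elif ay >= az:
        face, sc, tc, ma = ('+Y' if y > 0 else '-Y'), x, (z if y > 0 else -z), ay
    else:
        face, sc, tc, ma = ('+Z' if z > 0 else '-Z'), (x if z > 0 else -x), -y, az
    # Project onto the face plane and remap [-1, 1] to [0, 1].
    u = 0.5 * (sc / ma + 1.0)
    v = 0.5 * (tc / ma + 1.0)
    return face, u, v
```

Combined with a pixel-to-direction mapping for the chosen panorama type, this is enough to resample a rendered skybox into the final image.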

That's the kind of idea (warped cubemap), same as we once did in the BGE. There is no way of using Cycles' full ray-tracing method. It also facilitates overscanning.

Any plans for stereo 360°? My understanding is that it is not as simple as having two cameras; it involves translation to get correct results, which Cycles handles perfectly. But real-time engines like Unreal have to make a lot of vertical slices and move the camera on each rotation. Even if it's not real-time, it would be nice for EEVEE.
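For context, the per-slice camera movement mentioned above is how omnidirectional stereo (ODS) is usually rasterized: each vertical slice is rendered from an eye position offset tangentially along a circle whose diameter is the interocular distance. A hedged sketch of that per-slice offset (the default value and axis convention are assumptions, not Blender's):

```python
import math

def ods_camera_offset(longitude, interocular=0.065, eye=+1):
    """Per-slice camera position for omnidirectional stereo (ODS).

    For the vertical slice looking along `longitude` (radians, in the
    XZ plane), each eye sits on a circle of radius interocular / 2,
    offset along the tangent; eye is +1 for right, -1 for left.
    """
    r = eye * interocular / 2.0
    # View direction is (sin lon, 0, cos lon); the tangent to the
    # viewing circle at that longitude is (cos lon, 0, -sin lon).
    return (r * math.cos(longitude), 0.0, -r * math.sin(longitude))
```

A ray tracer like Cycles can evaluate this offset per pixel; a rasterizer has to approximate it with many narrow slices, which is why the slice count matters for stereo correctness.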

Would this capability to render 360° make it possible for EEVEE to offer screen-space ray-marched reflections that can be ray-marched in all directions? So, for example, behind the camera?

That's the kind of idea (warped cubemap), same as we once did in the BGE. There is no way of using Cycles' full ray-tracing method. It also facilitates overscanning.

How would screen-space effects work in this case? Does overscanning allow taking the entire spherical view into account?

The idea would be for overscanning to be done on a per-slice basis, not for the entire "sphere". Otherwise, effects like circular blur would appear extremely distorted when experienced "in VR".
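One way to picture per-slice overscanning: each slice's horizontal field of view is widened by a margin so that screen-space effects near the slice edge have neighboring pixels to sample, and the margin is cropped away before the slices are stitched. A sketch, with illustrative parameter names (the margin value is not from any Blender setting):

```python
import math

def slice_overscan_fov(num_slices, overscan=0.1):
    """Horizontal FOV (radians) for one vertical slice with overscan.

    360 degrees is split into `num_slices` slices; each is widened by
    an `overscan` fraction on both sides, giving screen-space effects
    extra context that is cropped before stitching the panorama.
    """
    base = 2.0 * math.pi / num_slices
    return base * (1.0 + 2.0 * overscan)
```

The trade-off is render cost: the overscanned area is rendered but discarded, so a larger margin means more wasted pixels per slice.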

Would this capability to render 360° make it possible for EEVEE to offer screen-space ray-marched reflections that can be ray-marched in all directions? So, for example, behind the camera?

I don't know if this is exactly related to the question quoted above, but considering that volumetric shadowing only works inside the view frustum, would this camera lens support consistent volumetric shadows across the entire field of view of the panorama?