
RenderViewport: Texture Format
Needs Review · Public

Authored by Jeroen Bakker (jbakker) on Tue, Nov 5, 5:00 PM.



When rendering the viewport, color management happens on the CPU. This adds overhead: a float texture has to be downloaded from the GPU and then color managed on the CPU.

Based on the bit depth of the scene's output file format, the result is now rendered either to a byte texture, where color management happens on the GPU, or to a float texture, where color management happens on the CPU.

This is only done during Viewport Render Animation; in all other cases a float texture is used.
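The selection logic described above can be sketched as a small decision function. This is an illustrative sketch only; the type and function names (`ViewportTexFormat`, `choose_viewport_format`) are hypothetical and do not match Blender's actual enums or API.

```c
#include <stdbool.h>

/* Hypothetical stand-ins for the two texture formats discussed;
 * Blender's real enum values and names differ. */
typedef enum { TEX_BYTE_RGBA, TEX_FLOAT_RGBA } ViewportTexFormat;

/* Byte textures keep color management on the GPU; float textures
 * fall back to the slower CPU path. Per the summary, the byte path
 * is only taken during Viewport Render Animation when the output
 * file format uses 8 bits (or fewer) per channel. */
static ViewportTexFormat choose_viewport_format(bool is_animation_render,
                                                int output_channel_depth)
{
  if (is_animation_render && output_channel_depth <= 8) {
    return TEX_BYTE_RGBA;
  }
  return TEX_FLOAT_RGBA;
}
```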

Baseline (HD render of wanderer.blend, Workbench engine, no samples): 15.688038 s
After changes: 9.412880 s

Diff Detail

rB Blender
T71364 (branched from master)
Build Status
Buildable 5577
Build 5577: arc lint + arc unit

Event Timeline

Maybe it's better to use BYTE and FLOAT for the Python API instead of RGBA8u and RGBA32f. It makes little sense to spell out RGBA when the precision is the only thing that varies. It could be OK if we plan to have more formats in the future, but I doubt we will.

I don't think this needs to be an operator property.

For animation renders it can check if scene->r.imageformat.depth <= R_IMF_CHAN_DEPTH_8, to automatically use the appropriate bit depth depending on the output format.

  • Check the render output format bit depth to select the ImBuf bit depth
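The suggested check can be sketched as follows. The struct and constants below are simplified stand-ins, not Blender's actual DNA definitions, and the function name `use_byte_imbuf` is hypothetical; the real code would compare the scene's image format channel depth against `R_IMF_CHAN_DEPTH_8` as the reviewer describes.

```c
#include <stdbool.h>

/* Illustrative values; Blender's actual R_IMF_CHAN_DEPTH_* constants
 * may be defined differently. */
enum {
  R_IMF_CHAN_DEPTH_8 = 8,
  R_IMF_CHAN_DEPTH_16 = 16,
  R_IMF_CHAN_DEPTH_32 = 32,
};

/* Simplified stand-in for the scene's output image format settings. */
typedef struct ImageFormatData {
  int depth; /* channel bit depth of the output file format */
} ImageFormatData;

/* An 8-bit (or lower) output format can render into a byte ImBuf,
 * keeping color management on the GPU; anything deeper needs the
 * float path. Applied only for animation renders. */
static bool use_byte_imbuf(const ImageFormatData *imf, bool is_animation)
{
  return is_animation && imf->depth <= R_IMF_CHAN_DEPTH_8;
}
```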
Jeroen Bakker (jbakker) edited the summary of this revision. Wed, Nov 6, 10:08 AM

Removed a debug statement left in the previous diff.

Corrected the logic for is_animation; it was reversed.