Wed, Sep 13
Alright. My bad. Wasn’t aware. I’ll see if I can come up with a patch or two to address the issues, though, because this is pretty important to me.
Thanks for the report, but this doesn't sound like a bug at all, more like a feature request or an enhancement request, which we do not accept in the bug tracker. There are lots of areas in Blender which could behave better, but which are behaving fully according to the design and hence are not considered bugs.
Mon, Sep 11
FYI: For now I’m preprocessing the input by converting to 24-bit RGB before importing into Blender, but that clearly isn’t enough most of the time. Even 8-bit (per channel) YUV can only be used with heavy dithering, and e.g. 10-bit input (which is getting increasingly common) loses way too much precision. Given FFmpeg’s rather awkward pixfmt support (owing to swscale legacy), it seems like using libzimg directly is the best option.
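To illustrate the precision loss: round-tripping a 10-bit sample through an 8-bit intermediate costs up to two 10-bit code values. A minimal sketch in plain Python (the helper names are hypothetical, not from any real API):

```python
# Quantize 10-bit samples (0..1023) to 8-bit (0..255) and back,
# then measure the worst-case round-trip error in 10-bit codes.
def to_8bit(v10):
    # round-to-nearest scaling from 10-bit to 8-bit
    return round(v10 * 255 / 1023)

def to_10bit(v8):
    # round-to-nearest scaling back from 8-bit to 10-bit
    return round(v8 * 1023 / 255)

max_err = max(abs(v - to_10bit(to_8bit(v))) for v in range(1024))
print(max_err)  # worst-case error: 2 ten-bit code values
```

Two codes out of 1024 may sound small, but it is exactly the kind of banding that then needs heavy dithering to hide.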
Sun, Sep 10
Just confirmed that FFmpeg defaults to Rec. 601. Bad. This needs a selection on import (defaulting to “Auto”, which should apply the same resolution-based heuristics as video players for untagged files, i.e. use 709 with HD resolutions) and export (should autoselect based on render resolution and also mark the colorspace appropriately in the resulting file—no idea what the API is for that on lavf’s side).
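The resolution-based heuristic mentioned above can be sketched like this (plain Python; the function name and the exact 720p cut-off are assumptions — players differ on the threshold):

```python
def guess_colorspace(width, height):
    """Guess the colour matrix for untagged video the way many
    players do: SD resolutions get Rec. 601, HD gets Rec. 709.
    The 1280x720 threshold is an assumption; implementations vary."""
    if width >= 1280 or height >= 720:
        return "bt709"
    return "bt601"

print(guess_colorspace(1920, 1080))  # bt709
print(guess_colorspace(720, 576))    # bt601 (PAL SD)
```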
Wed, Sep 6
I think we can tweak the mapping to avoid this, even if it means breaking backwards compatibility a bit.
Tue, Sep 5
Aug 3 2017
Before opening this task I talked to @Bastien Montagne (mont29) over IRC and we could not find a really good reason for this feature. So I created this task to get some deeper discussion.
I do not think that bf-funboard has ever really been a good place for users to actually get feedback on their ideas, so I would rather open the task here.
Design tasks are mainly for developers who want to present their design for a particular project and get feedback. They are not for functionality discussions; for those, please use the bf-funboard mailing list.
Jul 26 2017
Hi brecht. Even in 2017, others and I stumble over this behaviour, and I want to try another solution myself. I am an experienced software developer, though I have never worked in the Blender source code so far.
Jul 6 2017
Jul 4 2017
Well, then, thanks for the report, but this is not something we can work on directly; the issue could be reported to FFmpeg, I guess (though it is primarily a playback library, so I am not sure how much they care about HDR-type data; on the other hand, some HDR video formats are emerging these days, so maybe they will have to consider the issue for those as well?).
Jun 26 2017
Jun 23 2017
This is sadly due to the reliance on swscale in FFmpeg. Currently, all buffers loaded from codecs are converted to RGB at 8 bits per channel via the swscale library. This likely won't change any time in the near future unless the FFmpeg handling is overhauled to load raw YCbCr and let OCIO handle the colour transforms.
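To make the colour-transform part concrete, here is a minimal sketch of the standard limited-range BT.601 YCbCr→RGB conversion in plain Python — the textbook matrix, not Blender or swscale code, with coefficients rounded to three decimals:

```python
def ycbcr601_to_rgb(y, cb, cr):
    """Limited-range (Y 16-235) 8-bit BT.601 YCbCr to full-range
    8-bit RGB. Standard conversion, approximate coefficients."""
    y_ = 1.164 * (y - 16)
    r = y_ + 1.596 * (cr - 128)
    g = y_ - 0.392 * (cb - 128) - 0.813 * (cr - 128)
    b = y_ + 2.017 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

print(ycbcr601_to_rgb(235, 128, 128))  # reference white -> (255, 255, 255)
print(ycbcr601_to_rgb(16, 128, 128))   # reference black -> (0, 0, 0)
```

The point of handing this to OCIO instead is that the transform could then run in float on the original bit depth, instead of being baked into an 8-bit RGB buffer.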
Thanks for the suggestion, but this tracker is not for feature requests, see here for other options:
May 16 2017
From a quick look around, it seems we cannot solve this from our side; we just need to wait for the new OpenEXR to be released.
This seems to be a bug in OpenEXR itself, which was fixed by . Unfortunately, that commit is only in the development branch (it is not in the 2.2fixes branch).
May 15 2017
May 12 2017
Apr 18 2017
@Aaron Carlisle (Blendify) ffmpeg gets the length right because the duration (in frames) has doubled as well as the tbr. If you let Blender change the scene rate to whatever ffmpeg tells it, the length in elapsed time is correct.
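The arithmetic above can be sketched with hypothetical numbers: if both the frame count and the frame rate the demuxer reports are doubled, the elapsed time still comes out right:

```python
# Hypothetical clip: 10 seconds at 25 fps.
frames, fps = 250, 25.0
# Same clip with both the duration (in frames) and the rate doubled,
# as described above for the doubled tbr case.
frames2, fps2 = 500, 50.0

print(frames / fps, frames2 / fps2)  # 10.0 10.0 -- elapsed time unchanged
```

Which is why letting Blender adopt the rate FFmpeg reports keeps the length in elapsed time correct.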
Apr 17 2017
I've done some digging, and the loading of multilayer EXR files seems a bit indirect. load_image_single() loads the image with its metadata, then calls IMB_freeImBuf(ibuf); to immediately free the image buffer. A later call to BKE_image_acquire_ibuf() fetches the image buffer from some cache using image_get_cached_ibuf(), which then returns an ImBuf without metadata.
Apr 15 2017
If there were an issue, it would be a regression since 01fa4fb6, but I suspect the duplicated timebase is correct for PAFF-encoded video; see also tickets #3339 and #5378. In any case, there is most likely a bug in Blender, as FFmpeg always detects the correct length of the clip, no matter the timebase.
Apr 13 2017
Mar 26 2017
It seems that @ronan ducluzeau (zeauro) clarified the issue, and that there is no bug here after all. Closing.
Mar 21 2017
Mar 10 2017
I appreciate your in-depth response. I wasn't able to use an Image Sequence by baking paint, but I was able to bake something for an Animated Weight Group using the Weight. I never knew that was possible, and I'll keep experimenting and learning.
Mar 8 2017
Particle emission does not support animated weight groups, animated textures, or image sequence textures.
It is a known limitation of the current particle systems.
Mar 6 2017
Mar 5 2017
Mar 1 2017
Dec 20 2016
@Jens Verwiebe: would you mind describing the solution in a little more detail?
I'm trying to get DNG support within the Blender VSE, but Blender either shows a blank frame or crashes.