Graph Editor Normalization broken by Keyframe that uses Easing or Dynamic interpolation
Closed, Resolved · Public

Description

Version: 2.80 rc2 (Linux 64)

How to reproduce:

  1. Inspect the file (a cube with one animated value)
  2. Change the interpolation type of the first keyframe to one of the "easing" methods (Sine, Quadratic, Cubic, etc.)
  3. Observe what happens to the normalization of the curve!

This is not the expected behavior. Neither of the two keyframe values has been changed, so the apparent Y-scale of the curve should not change either. It seems that some interpolation types break whatever function is used to normalize the curve. The behavior is difficult to predict, though; for instance, the bug does not initially appear if the interpolation type of the last keyframe in the example is changed instead.
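For context, normalization here maps the curve into the -1.0...1.0 range based on its value range, so the factor should only depend on the curve's min/max. A minimal sketch of that idea (hypothetical helper name and formula, not Blender's actual function):

#include <float.h>
#include <math.h>

/* Hypothetical sketch: one plausible normalization factor derived purely
 * from the curve's value range. If neither keyframe value changes, the
 * min/max stay the same, and so should this factor. */
static float normalization_factor_from_range(float min_coord, float max_coord)
{
  const float range = fmaxf(fabsf(max_coord), fabsf(min_coord));
  return (range > FLT_EPSILON) ? 1.0f / range : 1.0f;
}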

Event Timeline

This is not expected behavior, but it looks like a bug that was present in 2.79 as well.

I believe it! Perhaps this has gone undetected because "easing" and "normalize" are both under-utilized in Blender. But it's definitely not helpful.

I don't know anything about the code for this feature, but I suspect that the bug may be caused by the whole thing being too complicated. For instance, this graph (not the example file) shows two curves that start at -10.0 and end at 10.0:


The actual value of both end keyframes is 10.0, but the blue curve is normalized by a greater factor, because Blender normalizes based on the extremes of the curve itself, not just the keyframe values. So there must be a function for calculating those extremes, and it gets broken by the newer interpolation types.

Maybe this could be fixed by ignoring interpolated points and taking only the keyframe values into account, but that might break certain workflows. Personally, though, I think it could be useful to see overshooting curves extend beyond the normalized range of -1.0...1.0.
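For illustration, a minimal sketch of that keyframe-only approach (hypothetical helper; it only assumes Blender's FCurve/BezTriple layout, where bezt->vec[1][1] is the Y value of the key itself, and min_ff/max_ff from BLI_math):

/* Hypothetical sketch: compute the value range from keyframe control points
 * only, ignoring whatever the interpolation does between them. */
static void keyframe_value_range(const FCurve *fcu, float *r_min, float *r_max)
{
  *r_min = FLT_MAX;
  *r_max = -FLT_MAX;
  for (unsigned int i = 0; i < fcu->totvert; i++) {
    const float y = fcu->bezt[i].vec[1][1]; /* Y value of the key itself. */
    *r_min = min_ff(*r_min, y);
    *r_max = max_ff(*r_max, y);
  }
}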

Sebastian Parborg (zeddb) claimed this task.
Sebastian Parborg (zeddb) triaged this task as Confirmed, Medium priority.

@Sergey Sharybin (sergey) The issue here is that the code doesn't handle the (newish) interpolation types. A quick and dirty fix would be to just treat it as a normal curve:

diff --git a/source/blender/editors/animation/anim_draw.c b/source/blender/editors/animation/anim_draw.c
index 61bf7f95340..6965c3418e2 100644
--- a/source/blender/editors/animation/anim_draw.c
+++ b/source/blender/editors/animation/anim_draw.c
@@ -438,7 +438,7 @@ static float normalization_factor_get(Scene *scene, FCurve *fcu, short flag, flo
           max_coord = max_ff(max_coord, prev_bezt->vec[1][1]);
           min_coord = min_ff(min_coord, prev_bezt->vec[1][1]);
         }
-        else if (prev_bezt->ipo == BEZT_IPO_BEZ) {
+        else if (prev_bezt->ipo >= BEZT_IPO_BEZ) {
           const int resol = fcu->driver ?
                                 32 :
                                 min_ii((int)(5.0f * len_v2v2(bezt->vec[1], prev_bezt->vec[1])),

However, this is not correct and will lead to other inconsistencies/problems.
So I'm guessing that we should create some general function for this?

I'm also guessing that one could use the functions in blender/blenkernel/intern/fcurve.c to do this. However, it seems like those only sample the curve, so if I use them, the scaling might jump around because of missed max/min points.
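To make the sampling concern concrete, a sketch of a sampling-based range estimate (the helper name and fixed sample count are assumptions; evaluate_fcurve() is the evaluation function from fcurve.c):

/* Hypothetical sketch: estimate the value range by evaluating the F-Curve at
 * fixed intervals (samples >= 2). A sharp extremum that falls between two
 * sample points is missed, which is the "jumping" risk mentioned above. */
static void sampled_value_range(FCurve *fcu, float start, float end,
                                int samples, float *r_min, float *r_max)
{
  const float step = (end - start) / (float)(samples - 1);
  *r_min = FLT_MAX;
  *r_max = -FLT_MAX;
  for (int i = 0; i < samples; i++) {
    const float y = evaluate_fcurve(fcu, start + step * (float)i);
    *r_min = min_ff(*r_min, y);
    *r_max = max_ff(*r_max, y);
  }
}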

@Sebastian Parborg (zeddb), ideally we should indeed share code, so that drawing and normalization are based on exactly the same data.
Sure, there could be some missed extremum that falls between sampling points, but it would then be missing from the drawing as well, so I wouldn't worry about that.