
List of Properties that should use the Units system
Open, Normal, Public

Description

This is a list of Properties that should make use of units, but currently do not.

It is not exhaustive, just what I could find by browsing through the tools and properties. I can update this list if anyone notices more examples.

Using consistent and correct units is a massive help for users, and makes it much easier to understand what values represent. We already have a nice units system in place - we just need to actually use it more.

Additional Units Needed

In Blender we support these unit types:

Unit           | Internal Name     | Common measurement
Distance       | PROP_DISTANCE     | m
Area           | PROP_UNIT_AREA    | m²
Volume         | PROP_UNIT_VOLUME  | m³
Mass           | PROP_UNIT_MASS    | kg
Angle          | PROP_ANGLE        | degrees
Time           | PROP_TIME         | Frames or Seconds
Field of View  | PROP_UNIT_CAMERA  | mm
Speed/Velocity | PROP_VELOCITY     | m/s
Acceleration   | PROP_ACCELERATION | m/s²
Power          | PROP_POWER        | W

However, we have properties that are measured in units which we currently do not support, such as:

Unit              | Example                 | Common measurement
Temperature       | Smoke Flame Temperature | Celsius or Fahrenheit
Light Temperature | Blackbody Light value   | Kelvin
Density           | Smoke Density           | kg/m³
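
For reference, the unit a property is displayed with comes from its subtype in RNA, so most items below boil down to passing the right subtype when the property is defined, or adding one to an existing definition. A minimal sketch of what that looks like inside a makesrna property-definition function (the property name here is made up, not one from the list; srna and prop come from the enclosing function):

  /* Hypothetical property, shown only to illustrate where the unit subtype goes. */
  prop = RNA_def_property(srna, "example_offset", PROP_FLOAT, PROP_DISTANCE);
  RNA_def_property_float_sdna(prop, NULL, "example_offset");
  RNA_def_property_range(prop, 0.0f, 100.0f);
  RNA_def_property_ui_text(prop, "Offset", "Offset distance in scene units");

  /* For a property that was declared with PROP_NONE, the fix is often a single added line: */
  RNA_def_property_subtype(prop, PROP_DISTANCE);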

Tools

  • Bevel Amount
    • Needs refactor: it is sometimes a Distance, sometimes a Percentage, depending on the Amount Type setting
  • Randomize Amount (Distance)
  • Randomize Uniform (Factor)
  • Randomize Normal (Factor)
  • OBJECT_OT_randomize_transform.scale (this is wrongly set to Distance; it should not have a unit, just like object scale)

Brush:

  • Brush.smooth_stroke_radius (pixels)

Properties

Eevee

  • eevee.bloom_radius (it's relative to the view, but not in actual pixels. Unsure what the unit is here)
  • eevee.bokeh_max_size (Pixels)
  • eevee.gi_cubemap_display_size (Distance)
  • eevee.gi_irradiance_display_size (Distance)
  • render.motion_blur_shutter (Factor)

Cycles

  • cycles.volume_step_size (Distance)
  • cycles_curves.minimum_width (Pixels)
  • cycles_curves.maximum_width (Pixels)
  • cycles.camera_cull_margin (Pixels)
  • cycles.distance_cull_margin (Pixels)
  • cycles.filter_width (Pixels)
  • cycles.preview_start_resolution (Pixels)
  • cycles.rolling_shutter_duration (Factor)

Render:

  • render.motion_blur_shutter (Factor)
    • Note: Change soft max to 1 (motion blur shutter speed >1 frame isn't a thing in the physical world)

Bake:

  • render.bake.cage_extrusion (Distance)

Output:

  • frame_start [Time]
  • frame_end [Time]
  • render.stamp_font_size (Pixels)

View Layer

  • denoising_radius (Pixels)

Cache

  • frame_start [Time]
  • frame_end [Time]

Curves ObData

  • path_duration (Time)
  • eval_time
    • Needs refactor: this option is weird - it is a frame number relative to the total frames set in path_duration. These options could be removed; they are superseded by the Follow Curve constraint anyway.

Bones ObData

  • tail_radius (Distance)
  • head_radius (Distance)
  • envelope_distance (Distance)

Text Obdata

  • offset_x (Distance)
  • offset_y (Distance)
  • text_boxes[ ].width (Distance)
  • text_boxes[ ].height (Distance)
  • text_boxes[ ].x (Distance)
  • text_boxes[ ].y (Distance)

Motion Paths

  • frame_start (Time)
  • frame_end (Time)
  • frame_before (Time) - the unit "frames" still feels more useful, because we want a discrete number of motion path samples
  • frame_after (Time)

Constraints:

  • constraints["Action"].frame_start (Time)
  • constraints["Action"].frame_end (Time)
  • constraints["Pivot"].offset (Distance)

Modifiers:

  • modifiers["Mesh Cache"].frame_start
  • modifiers["Mesh Cache"]. frame_end (Time)
  • modifiers["Build"].frame_start (Time)
  • modifiers["Build"].frame_duration (Time)
  • modifiers["Wireframe"].thickness (Distance)
  • modifiers["Shrinkwrap"].offset (Distance)
  • modifiers["Shrinkwrap"].project_limit (Distance)
  • modifiers["Ocean"]. frame_start (Time)
  • modifiers["Ocean"]. frame_end (Time)
  • modifiers["Wave"].time_offset (Time)
  • modifiers["Wave"].lifetime (Time)
  • modifiers["Wave"].speed (should be renamed to frequency)
  • modifiers["Bevel"]. width_pct (Percentage)
  • modifiers["Set Split Normals"].offset (Distance)
  • modifiers["Set Split Normals"].mix_factor (Factor)
  • modifiers["VertexWeightMix"].default_weight_a (Factor)
  • modifiers["VertexWeightMix"].default_weight_b (Factor)

Smoke:

  • modifiers["Smoke"].domain_settings.dissolve_speed (Time)
  • modifiers["Smoke"].domain_settings.burning_rate (Seconds / Time)
  • modifiers["Smoke"].domain_settings.flame_ignition (Temperature)
    • Needs refactor: New unit needed (Temperature)
  • modifiers["Smoke"].domain_settings.flame_max_temp (Temperature)
    • Needs refactor: New unit needed (Temperature)
  • modifiers["Smoke"].domain_settings.display_thickness
    • Needs refactor: New unit needed (density)
  • modifiers["Smoke"].flow_settings.texture_offset (Distance)
    • I have no idea what space this is in (and if this relates to unit scale)...

Dynamic Paint:

  • modifiers["Dynamic Paint"].canvas_settings.canvas_surfaces["Surface"].dissolve_speed (Time)
    • (should be renamed to Dissolve Time)
  • modifiers["Dynamic Paint"].canvas_settings.canvas_surfaces["Surface"].dry_speed (Time)
    • (should be renamed to Dry Time)

Fluid:

  • modifiers["Fluidsim"].settings.start_time (Time)
  • modifiers["Fluidsim"].settings.end_time (Time)

Softbody:

  • modifiers["Softbody"].settings.mass (Mass) - physics settings use the unit scale internally currently, but yeah, this can be updated
  • modifiers["Softbody"].settings.ball_size (Distance)

Force Fields:

  • field.distance_min (Distance)
  • field.distance_max (Distance)
  • field.radial_min (Angle)
    • Needs refactor: sometimes an angle, sometimes a distance, depending on the force field type
  • field.radial_max (Angle)
    • Needs refactor: sometimes an angle, sometimes a distance, depending on the force field type

Particles:

  • frame_start (Time)
  • frame_end (Time)
  • lifetime (Time)
  • normal_factor (Velocity)
  • tangent_factor (Velocity)
  • factor_random (Velocity)
  • angular_velocity_factor (Velocity - Radians per second)
  • root_radius (Distance)
  • tip_radius (Distance)
  • mass (Mass)
  • drag_factor (Factor)
  • damping (Percentage)
  • timestep (Time)
  • display_size (Distance)
  • child_parting_min (Angle)
    • Needs refactor: this is stored in degrees, but it must be stored in radians to use the unit system
  • child_parting_max (Angle)
    • Needs refactor: this is stored in degrees, but it must be stored in radians to use the unit system
  • Child Roughness Properties all seem to be lengths
  • child_radius (Distance)
  • kink_amplitude (Distance)

Cloth:

  • settings.mass (Mass)
  • collision_settings.distance_min (Distance)
  • collision_settings.self_distance_min (Distance)

Light Probes:

  • visibility_bleed_bias (Factor)
  • visibility_blur (Factor)

Materials / Nodes

  • Subsurface Radius in both Subsurface Scattering and Principled BSDF nodes (Distance) - how accurate is that? Example? Also such units don't work well in a node system.

Event Timeline


@Brecht Van Lommel (brecht)
Users need to understand [0, 1] factors anyway, so hiding that only partially is not that helpful in my opinion.

Well, currently it's not very clear. Some properties are float values, and others are values from none to full. These are conceptually percentages, and extremely useful for the user to know. A Weight of 0.500 is ambiguous - the user can't infer what that means. But a Weight value of 50% is immediately clear. So I think this is actually a big deal.

I get the issue you are referring to, where the F-curves will map between 0.000 - 1.000, whereas the percentage value goes from 0-100%. But actually I don't see that as a problem at all. That's normal for any percentage in maths, so I don't think users will be confused by that. IMO it's better to use the correct and clear unit of measure (%).

@Brecht Van Lommel (brecht), ok, I don't care too much if we use percentages or factors. @William Reynish (billreynish) previously said that he prefers percentages, so maybe you two should discuss that.

@William Reynish (billreynish), yes but 2s will always be two seconds long, 48 frames will not always be two seconds long. To me that is a fairly big difference.
How would you type frame 10 into a field that shows seconds?
And displaying 0.208333333s instead of Frame 5 is not useful. Also, what happens when you want to increase the value by 1 - does it increase by one second or by one frame?

@Jacques Lucke (JacquesLucke) First, by that logic, we could never use the time unit in Blender, because the values are all in frames under the hood.

Second, all of those issues exist for length-based units too.

If you change the Unit Scale, 1m is no longer 1m. In most animation projects, you don't change the fps all the time. You set it once and leave it for the entire project - just like the Unit Scale. It's the same thing.

Yes, seconds don't always map exactly to frames, but that is the same as Blender Units not always mapping cleanly to imperial units, for example: 1 BU = 3.28084 ft. But that's just how it is - not a problem as far as I can see.

So, frame 5 is about 0.2 seconds. But that may be exactly what the user wants to know. Especially for motion graphics, dealing with real-world time units may be much preferable.

What we could do for better sub-second precision is to do what video editing apps do: they typically use [Hours] [Minutes] [Seconds] [Frames].

In other words, assuming an FPS of 24, frame 100 = 4s, 4f.

That is the equivalent of the Separate Units toggle for lengths.

Separating time units in seconds+frames could be an even better solution indeed. I wonder how that works with fps values like 29.97.
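
For what it's worth, the arithmetic behind a separated seconds+frames display is straightforward even with a fractional rate like 29.97 - the seconds boundary just falls between whole frames. A standalone sketch (not Blender code):

  #include <math.h>
  #include <stdio.h>

  /* Split an absolute frame number into whole seconds plus leftover frames,
   * the way a [Seconds][Frames] display would show it. */
  static void frame_to_sec_frames(double frame, double fps, int *r_sec, double *r_frames)
  {
    double seconds = frame / fps;
    *r_sec = (int)floor(seconds);
    *r_frames = frame - (*r_sec) * fps;
  }

  int main(void)
  {
    int sec;
    double frames;

    frame_to_sec_frames(100.0, 24.0, &sec, &frames);
    printf("frame 100 at 24 fps    -> %ds %.0ff\n", sec, frames);  /* 4s 4f */

    frame_to_sec_frames(100.0, 29.97, &sec, &frames);
    printf("frame 100 at 29.97 fps -> %ds %.2ff\n", sec, frames);  /* 3s 10.09f */
    return 0;
  }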

A Weight of 0.500 is ambiguous - the user can't infer what that means.

We already solve that ambiguity by showing the button as a slider. I think that is immediately clear.

Using [0, 1] values is standard in computer graphics software, especially those that have node based systems and programmability. Percentages are just a different convention, they are not more correct.

For example, color values are typically presented as [0, 1]. And if those then feed into shader nodes that use percentages, you can no longer just copy/paste values, and you have to mentally convert between the two all the time. That mess can be avoided by just using the same convention throughout the application.

Currently, at least, the slider is not the same thing as a percentage: it seems like we use sliders to indicate a useful min-max range, rather than an actual percentage.

  • There are many percentages that don't use sliders (e.g. modifiers["Softbody"].settings.shear, but there are many others)
  • There are many sliders on properties which are most certainly not percentages (ParticleSettings.shape, for example)
  • There are examples where the max value of a percentage is not 100%. See the Eevee Overscan size percentage, for example. The max is 10%. There are also examples where the max percentage may be 1,000%. The sliders in those cases don't map to 0-100%.

Also, for me at least, the sliders are not sufficient. It's oftentimes very ambiguous even with them. Take the Simplify > Max Child Particles property: it is displayed as 1.000, and I often have to go and check whether 1.000 means 100%. In this case it does. It would be so much clearer if it simply read 100% - it would be unambiguous then. Users wouldn't have to worry about 0.5 sometimes meaning 0.5 and other times meaning 50%. This is the ambiguity that would be great to resolve.

Lastly, what is the reason we permit the use of real percentages in the few places where we do use them? Why is it ok for the Render Resolution to be defined as a percentage and not 0.000-1.000? Because it is an integer?

I've split the percentages off into a separate todo.

We could start by addressing all the length, angle, pixel, mass and speed values.

Then we can handle the time properties, and last we can think of what to do with percentages.

Some ideas:

  • I could go through and make all the remaining percentages display as sliders
  • I could remove sliders from properties that are not percentages
  • We could make it so the slider actually represents the difference between None and Full, instead of Min-Max.
  • We could also add a unit option to display percentages as percentages (0% to 100%), rather than as 0.000-1.000 float values. I really do think there's a marked difference in clarity here. If these values are marked as percentages in the RNA, then at least we have the flexibility to display them however we like, as % or as floats.

I'd like to use this task as a starting point for some new developers who want to contribute a little bit of code to Blender for the first time.
I will make a public call on Twitter a bit later today.
If I find no one, I'll make the changes we agreed on myself.

@William Reynish (billreynish), certainly there are some properties that are wrong now.

I wouldn't mind changing some "quality" type properties to percentages, as those are not animated/linked/textured. But for most of the list in the other task I still think we should use [0, 1]. We could have an option but it should be off by default.

@Brecht Van Lommel (brecht) @Jacques Lucke (JacquesLucke) Ok, so we agree to:

  • Make sure all the 0-1 properties are set to PROP_FACTOR at the very least (many are not - especially in the Physics panels), and then we can add an option to display those as 0-100%
  • Make all the frame start/end/duration properties display as real-world time units, and add a toggle for 'Separate Time Units', similar to Separate Length Units. The time unit default can just be Frames, to keep it close to how it was.

The nice thing about the above is that at least then everything is set correctly in RNA, and the UI can then interpret that in various useful ways.

ok?

This comment was removed by Joel Godin (FloridaJo).

@Joel Godin (FloridaJo) These are defined in what Blender calls RNA, in C, not in the Python UI. The RNA files are in source/blender/makesrna/intern.

@Joel Godin (FloridaJo), often it is a good idea to just do a full text search for the property name or its tooltip to find where it is defined (or for some other closely related property - they are often defined in roughly the same place).
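
As a rough illustration of the kind of definition such a search turns up, and the one-line difference most items on this list need - a sketch with a hypothetical 0-1 property, not actual Blender code:

  /* Before: a 0-1 value declared without a subtype, so it shows up as a bare float. */
  prop = RNA_def_property(srna, "example_strength", PROP_FLOAT, PROP_NONE);
  RNA_def_property_range(prop, 0.0f, 1.0f);
  RNA_def_property_ui_text(prop, "Strength", "Influence of the effect");

  /* After: declared as a factor, so the UI knows it is a none-to-full value
   * (and could later display it as a percentage, as discussed above). */
  prop = RNA_def_property(srna, "example_strength", PROP_FLOAT, PROP_FACTOR);
  RNA_def_property_range(prop, 0.0f, 1.0f);
  RNA_def_property_ui_text(prop, "Strength", "Influence of the effect");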

I have submitted a diff for the cloth properties here: D4260. It's the first time I contribute, so let me know if I'm doing something wrong.

Would you prefer a single revision covering multiple items from the list above, or multiple revisions separated by context (Particles, Modifiers, etc.)?

@Łukasz Kwoska (Szakulus), while I'm happy you want to help, I would like to not finish everything just now, so that others can give it a shot as well.
Let's at least wait until Monday. Then feel free to make a patch that contains fixes for multiple contexts.

In the meantime you might want to take a look at this list: T56950
Those tasks are a bit harder, but should still be achievable.

I am against the idea of giving every numerical setting normalized between 0 and 1 a percentage look.
Sometimes the pertinent range, where a significant change occurs, is a tenth. Sometimes it is a hundredth. Sometimes it is a thousandth.

When the needed tweak occurs at the fourth digit after the decimal point, presenting things as percentages does not help at all.

I also don't think that expressing key values as a portion of seconds is more meaningful for the user than a frame number.
An approximation in time could be shown in the tooltip, according to the frame rate.
An automatic conversion from typing 10s to frame 250 might help.
But nothing can be more accurate than the frame number.

Well, we would not be getting rid of frames, obviously. I think the default display should be in frames. But, currently, if you want to, say, make sure your shot is exactly 10 minutes long, you can't easily do that. You'll have to type in 10*60*[fps] (14,400 at 24 fps). This is very clumsy, and slow if you want to tweak or change it.

It could also be that you would like to make the life of particles last exactly 2 mins and 30 seconds. That's also a pain currently to input.

This is exactly the same use-case as for length units. Some people also argued that we should not include the ability to use real-world units for lengths. In the end, using real-world units makes things easier to understand and use, and much faster too, especially if you want to type in time lengths in minutes or hours. You don't have to do back-of-envelope calculations all the time.

As for percentages, we agreed above to display factor values by default as 0-1 still, but with an option in the Units panel to display them as percentages. However, your argument makes no sense. You can have percentages written out as 100.0% or 100.000% if you need. That's no issue.

With the option to display factors as percentages, if nothing else, I expect we will discover loads of places where prop_factor should be used but currently is not.

Shutter time is not a typical time unit.

@Brecht Van Lommel (brecht): What is shutter time measured in, then? I thought it was the time (in frames) during which the shutter is open?

Same for Rolling Shutter Duration. That is also a time unit. How is it not?

So I don't get that. In what way are these two properties not measured in time?

@Brecht Van Lommel (brecht): What is shutter time measured in, then? I thought it was the time (in frames) during which the shutter is open?
Same for Rolling Shutter Duration. That is also a time unit. How is it not?
So I don't get that. In what way are these two properties not measured in time?

It is and isn't a measurement of time. (Note that Brecht said: typical time unit)
"It's a measurement of time in relationship to the frame rate."
The proper measurement (and unit) is Degrees. This video explains it.

We could measure the shutter in degrees, and that might be nicer, but the actual, current property value *is* a measurement in time. All our time properties are currently measured in frames, and so they are *all* relative to the frame rate - so this is no different.

@Jacques Lucke (JacquesLucke) Absolutely. Right after sending this comment I thought exactly this. Thanks for T56950. That's what I wanted to ask for.

This comment was removed by Joel Godin (FloridaJo).

This page explains the process of creating a diff. It's the same for Linux and Windows.

In the initial list, in Properties -> Output, image size could be added:

It is currently labelled Resolution, the unit of measurement of which is dpi, and should be called Size, expressed in px.

I reported it on:
https://devtalk.blender.org/t/blender-ui-paper-cuts/2596/1315

and I have submitted a patch (my first):
https://developer.blender.org/D4253

The patch deals just with the UI part, not with the Python property (I suppose changing that may cause incompatibilities with some add-ons).

This comment was removed by Joel Godin (FloridaJo).
This comment was removed by Joel Godin (FloridaJo).

@Joel Godin (FloridaJo) That's just the hash of your local commit; Jacques won't be able to commit any code with that. It seems you're having some trouble with the process, so let me try to explain it to you. The reason this is done with git even for two lines of code is to have a consistent and predictable end result when you've got several people working on the project. So let me walk you through the steps (some of these may be obvious to you, but I'll just write them down anyway in case this is something you're unsure about).

  1. You need to install git
  2. Open a console/terminal and navigate to a folder where you want to put the Blender source code. The command for this is cd followed by the directory name.
  3. Clone the Blender repository using git clone http://git.blender.org/blender.git. This creates a local copy of Blender's code and allows git to track both your local changes and the ones contributed by other developers. If you want to update your local repository you can use git pull to pull in the latest changes. When you clone the repository you are on the master branch. You can think of the git repository like a tree: the master branch is the trunk, and when you want to develop features based off the master you can create a new branch. The branch uses the master as a starting point, but code changes on the new branch don't impact the master until you merge the changes back (that has to be done manually). So if you want to create a patch, it's a smart thing to create a branch for it, just to be safe that you're not creating any conflicts. BTW, you can't directly push changes to the remote Blender master branch anyway - that is for the core developers only. That is why you've got to upload a diff, but more about that later on.
  4. Create a branch and switch to it for your changes git checkout -b some-new-feature where some-new-feature can be any name you want. This command creates a branch and switches to it. You can view your local branches with git branch, the current branch is marked with an asterisk *. You can view all branches including the remote branches on Blender's server with git branch -av.
  5. Now that you are on your feature branch (make sure that you are!) you can edit a file.
  6. Compile Blender and make sure everything is running like it's supposed to. I won't get into detail here because there is a tutorial here.
  7. Once everything looks good you can add and commit (read as save permanently) your changes to the local branch. Git creates a history of all changes created with commits, which allows you to see every edit made and revert to a past state if necessary. Use git add /path/to/changes/file.c to stage files that you'd like to commit (replace the path with the actual path and filename that you have changed). Once you've done this for every file you can git commit -m "text explaining why and what you've changed". Once you've run this command you have stored your changes to your branch locally.
  8. Creating a diff basically means identifying the changes that exist between two branches, in this case between the master branch and your branch for the patch. git diff master..some-new-feature > some-new-feature.diff compares the code of the master branch against a branch called "some-new-feature" (replace it with the actual name of your branch). The differences, hence the name of the diff command, are written to a file using a redirect > of standard out (stdout) to a file called some-new-feature.diff. You can add a path in front of the filename, because you likely don't want the diff file in the git directory.
  9. Now you can upload the created diff file on this page. Upload your file under "Raw Diff From File". Choose the repository this diff is for (likely rB Blender in your case). You'll have to confirm your changes, and if you know who is maintaining the specific project you want to contribute to, you can add them as subscribers. Once this is done it may take a bit for other developers to verify your changes. If they aren't happy with your changes, you may have to update your patch and they'll review it again. When everything looks good, they'll commit it to the right branch in the Blender repository.
  10. In case you work on a larger project and need to keep your branch up-to-date with the changes of the master, you'll have to change to the master branch git checkout master, pull the latest changes, switch back to your feature branch using git checkout some-new-feature and then rebase using git rebase master.

I hope this helps.

@Riccardo Gagliarducci (rickyx) I think 'resolution' is actually the correct term. Changing these values doesn't change the size, and doesn't change *what* you see, but the number of pixels used to describe what you see. You get the same image, but with more or fewer pixels defining it. That is what resolution is.

A 27" TV which is 4K has higher *resolution* than a 27" TV which is HD.

Size has to do with how you are viewing your image. You can view a 320*200 image in an IMAX theatre. That is a huge size but low resolution.

Resolution is how many pixels are used to describe an image. An 8K image displayed on a postage stamp is tiny, but has very high resolution.

This comment was removed by Joel Godin (FloridaJo).
This comment was removed by Joel Godin (FloridaJo).

@Joel Godin (FloridaJo) Have you tried building it from the console using make? This will give you a detailed output on what went wrong.

This comment was removed by Joel Godin (FloridaJo).

@Joel Godin (FloridaJo), please ask for help on IRC or on devtalk.blender.org when you have build problems. This task is not the right place.

To clarify my thinking on the time and factors, here's what I think we could do:

  • Add two options to the Units panel: Separate Time Units and a 'Factors' setting.

  • When Factors is set to Percentages, all factors are displayed as percentages.

  • When Time is set to Seconds, time values are displayed in seconds, and Separate Time Units shows minutes, seconds and frames, similar to how Separate Length Units works now.

I would like to add my 2 cents to the debate. As Brecht said, 0-1 is the common convention in graphics; 0.566 makes more sense than 56.6%.

As for shutter speed, it is measured in fractions of a second (i.e. 1/500 shutter speed).
This guy says it well here on shutter speed for video:
"For example, when shooting at 25fps, your shutter speed should be 1/50 of a second.
If your camera can shoot at 50 or 60 fps, your shutter speed should be 1/100 or 1/125 of a second.
The reason for this 180-degree rule is because it helps us to record video that contains natural movement.
If the shutter speed is too slow, you’ll get blurred movement, if you shoot at a shutter speed that’s too high,
everyone in your scene will look like robots or as if they were recorded in stop motion.
Sticking to the 180-rule will give you the most natural movement." - Jon Devo

As for bloom radius: from what I've read it is some form of Gaussian blur, or multiple Gaussian blurs blended. The radius for Selective Gaussian Blur in GIMP has a tooltip that says: "Radius of a square pixel region, width and height will be radius*2 + 1." Maybe that helps.

Yes, absolutely agreed, it would be nice to change the shutter to be in degrees rather than in frames. This is how you set the shutter speed in film cameras.

A 180 deg shutter is more correct than a 0.5 shutter.
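
To connect the conventions mentioned in this thread: the current property is a fraction of a frame, a film camera expresses the same thing as a shutter angle, and a stills camera as an exposure time. A standalone sketch of the relationship (not Blender code):

  #include <stdio.h>

  int main(void)
  {
    double shutter_fraction = 0.5;  /* shutter as a fraction of one frame */
    double fps = 25.0;

    double shutter_angle = shutter_fraction * 360.0;  /* 0.5 -> 180 degrees */
    double exposure_time = shutter_fraction / fps;    /* 0.5 at 25 fps -> 1/50 s */

    printf("%.0f degrees, 1/%.0f of a second\n", shutter_angle, 1.0 / exposure_time);
    return 0;
  }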

I'd like to mention one that's been confusing me forever.
Randomize Transform has meters as the unit for location and scale. For location this makes sense, but for scale I feel it doesn't.
Also, if I understand it correctly, setting this number to 1.15m would set the boundaries for the randomisation to 85% - 115%, which is not very clear in this UI.
I would like to propose two fixes for this:

  • Lose the meters unit for the random scale part of the tool, and display the input as the factor it actually is.
  • Work with actual, defined boundaries in the UI - minimum and maximum. I feel this offers more control and predictability than using one number that the user has to imagine going in both directions of the axis.

@Sam Van Hulle (sam_vh) Yes, that's true. That seems like a mistake. I added it to the list.

As for your proposal on how Randomize Transform works, it's outside of the scope of this task. This is just about giving values the correct units.