There are a number of features that should be supported, but further design work is needed before they can be implemented properly. Everyone is welcome to join the discussion by proposing ideas or solutions, creating mockups, or simply giving input.
Scope of Widgets
When should a tool use a widget?
- Edit Mesh Spin Tool
- Edit Mesh Smooth Tool (number of steps, smooth factor)
- Edit Mesh Bevel/Inset
- Edit Mesh UV Rotate
- Edit Mesh Subdivide
The Spin tool is an example where widgets make a lot of sense: you rotate around an axis, so direct manipulation is useful.
The issue with Smooth and UV Rotate is that a 3D manipulator for these would simply be a way to access buttons in the viewport.
Whether bevel/inset has this issue depends on the kind of manipulator used. A single manipulator in the view doesn't directly relate to the edges, and again it's more a replacement for accessing buttons in the toolbar.
We suggest not using manipulators if they are simply an alternate way to access buttons.
As for bevel/inset: we could make the beveled edges themselves into manipulators, so you could click on a beveled edge and drag it.
Toggling Visibility of Widgets
Currently, only the widgets of the active object are drawn. While this works for most cases, being able to precisely control which widgets are visible and which are not can be useful.
For example, in the future a camera could have widgets for the focal length, sensor size, depth of field distance, and so on. If all of them are visible all the time, it's easy to manipulate one accidentally.
- Draw a panel for the active object in which its widgets can be enabled/disabled individually (similar to "Display" for cameras).
- Add a toggle button next to every button represented by a widget (similar to the transform panel in the properties region, where you can press the 'lock' icon to disable the widget for that axis).
- Where would the manipulator toggles be stored? (Object / Viewport / Tool-Settings)
- How to expose settings that aren't related directly to objects? (Group Center, View-Port Clipping Bounds)
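One way the per-widget toggles could be stored is as a bitmask on the owning data-block. The sketch below is purely illustrative (the flag and struct names are invented, not actual Blender API), and deliberately leaves open the question above of whether the mask lives on the object, the viewport, or the tool settings:

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical flags for the camera widgets mentioned above. */
typedef enum eCameraWidgetFlag {
  CAMERA_WIDGET_FOCAL_LEN   = (1 << 0),
  CAMERA_WIDGET_SENSOR_SIZE = (1 << 1),
  CAMERA_WIDGET_DOF_DIST    = (1 << 2),
} eCameraWidgetFlag;

/* Hypothetical per-object widget-visibility settings. */
typedef struct ObjectWidgetSettings {
  int visible_widgets; /* eCameraWidgetFlag bitmask */
} ObjectWidgetSettings;

static bool widget_is_visible(const ObjectWidgetSettings *settings,
                              eCameraWidgetFlag widget)
{
  return (settings->visible_widgets & widget) != 0;
}

static void widget_set_visible(ObjectWidgetSettings *settings,
                               eCameraWidgetFlag widget, bool visible)
{
  if (visible) {
    settings->visible_widgets |= widget;
  }
  else {
    settings->visible_widgets &= ~widget;
  }
}
```

A panel or per-button toggle would then just flip one bit, and the draw loop would skip widgets whose bit is unset.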
Precise Input
It should be possible for users to input precise values to manipulate a widget (and thus the value it represents). This way they can keep their locus of attention in the viewport, avoiding having to search for a button in the UI.
- Enable number input by holding a modifier key while clicking on the widget (e.g. Ctrl+LMB on a widget activates a modal mode to input a value).
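During such a modal mode, typed characters would be accumulated into a string and parsed into a value on each keystroke. The following is a minimal sketch of that accumulation, with invented names; it is not Blender's actual number-input code:

```c
#include <assert.h>
#include <ctype.h>
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical buffer for characters typed during the modal mode. */
typedef struct NumInput {
  char str[32];
  int len;
} NumInput;

static void num_input_init(NumInput *n)
{
  n->len = 0;
  n->str[0] = '\0';
}

/* Accept digits, a single '.', and a leading '-'.
 * Returns false if the character was rejected. */
static bool num_input_handle_char(NumInput *n, char c)
{
  if (n->len >= (int)sizeof(n->str) - 1) {
    return false;
  }
  if (isdigit((unsigned char)c) ||
      (c == '.' && strchr(n->str, '.') == NULL) ||
      (c == '-' && n->len == 0)) {
    n->str[n->len++] = c;
    n->str[n->len] = '\0';
    return true;
  }
  return false;
}

/* Current value of the typed string, 0.0 while empty. */
static double num_input_value(const NumInput *n)
{
  return (n->len > 0) ? strtod(n->str, NULL) : 0.0;
}
```

The widget would apply `num_input_value()` to its target property on every accepted keystroke, and confirm or cancel on Enter/Escape as usual for modal operators.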
Snapping
Users may want to snap manipulators to other elements in the scene, e.g.:
- Snapping the spin center-point to a vertex on a mesh.
- Snapping bisect angle to a vertex
- Snapping the camera's DOF to the surface of an object.
Note that holding Ctrl currently sets WM_MANIPULATOR_TWEAK_SNAP, but each manipulator needs to implement the snapping itself.
- Each manipulator has a snap flag for 'TRANSLATE | ROTATE | SCALE'; when set, holding Ctrl will snap for any of the transform types set.
- Snapping can also respect the scene settings in the 3D viewport, so users can snap as in mesh edit-mode.
Keyframing
When the cursor is over a manipulator, do we want to support the I-Key to keyframe that setting (e.g. DOF distance, lens)?
Theme Colors
Manipulators now use theme colors. There are basic preset colors to use: HIGHLIGHT, PRIMARY, SECONDARY, A and B, where A/B are for other controls. Currently these are hard-coded in the manipulators, but they can be set however we like.
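A preset lookup could be as simple as an enum indexing into a palette. In the sketch below the names follow the presets listed above, but the RGBA values are placeholders; in practice they would be read from the user's theme rather than hard-coded:

```c
#include <assert.h>

/* Hypothetical preset identifiers matching the list above. */
typedef enum eManipulatorColor {
  MANIPULATOR_COLOR_HIGHLIGHT = 0,
  MANIPULATOR_COLOR_PRIMARY,
  MANIPULATOR_COLOR_SECONDARY,
  MANIPULATOR_COLOR_A,
  MANIPULATOR_COLOR_B,
  MANIPULATOR_COLOR_TOT,
} eManipulatorColor;

/* Placeholder palette (RGBA); real values would come from the theme. */
static const float manipulator_colors[MANIPULATOR_COLOR_TOT][4] = {
  {1.0f, 1.0f, 0.0f, 1.0f}, /* HIGHLIGHT */
  {0.9f, 0.3f, 0.3f, 1.0f}, /* PRIMARY */
  {0.3f, 0.9f, 0.3f, 1.0f}, /* SECONDARY */
  {0.3f, 0.3f, 0.9f, 1.0f}, /* A: other controls */
  {0.9f, 0.9f, 0.9f, 1.0f}, /* B: other controls */
};

static const float *manipulator_color_get(eManipulatorColor preset)
{
  return manipulator_colors[preset];
}
```

Drawing code would then request a preset instead of hard-coding colors, so retheming affects all manipulators at once.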