Interface Vs. Buttons

The interface, modeling, 3d editing tools, import/export, feature requests, etc

Moderators: jesterKing, stiv

Posts: 133
Joined: Wed Oct 16, 2002 3:52 am
Location: Northampton, MA (US)

Interface Vs. Buttons

Post by slikdigit » Mon Feb 03, 2003 1:54 am

I've noticed a lot of the discussions on the Blender interface focus on the buttons: appearance, visual cues as to function, organization, completeness vs. hotkeys, etc. While these are important issues (esp. for those new to the program), I'd like to talk about all those "I wish I could ..." moments intermediate and experienced Blender users have while working.
For me these things would be:
1- being able to see vertex weight assignments in edit mode, for feedback on precise weight selection
2- being able to select by face in the UV window
3- the ability to edit armature-deformed meshes in the deformed state, for making mesh keys that fix specific joint deformation problems
4- being able to optionally select faces and vertices of a mesh in the 3D window by clicking on them in the UV window
5- more stuff

i.e. things that are hard to list as a 'feature' of the program, but that straddle the line between feature and interface and have an impact on workflow, as opposed to things like edge-loop editing or raytracing that are 'pure' feature, or things like button layout or hotkey assignment that are 'pure' interface.

I thought it would be interesting to hear other people's "I wish I could ..." moments with Blender. I bet some of them would be programmatically simple, while others might need deeper architectural changes (which could still serve to improve Blender).
Please do chime in.
