I don't use the mouse gestures, and I do think they're fairly pointless on a laptop or desktop machine. However, they are useful if you run Blender on a PDA, or similar device that lacks a keyboard. I have never done that, so I can't really comment on whether or not that is even a viable thing to do.
Pablosbrain wrote: I like the fact that Wings3D makes it easy to make the program match the functionality of whatever other 3D application you may be using.
I don't think Blender should try to emulate all the other 3d programs out there; it can't just be a clone, or a wannabe me-too type of program. Blender development should concentrate on doing things in the best way possible, not in the most common way. Features such as the 3d cursor illustrate the type of thought that needs to happen. I couldn't stand 3ds max because of the lack of a 3d cursor -- you can't rotate around an arbitrary point, center the view, or zoom based on a given center in a convenient or expedient fashion. I think the 3d cursor is an important enough function to keep mapped to the left mouse button.
Regarding the issue of the spacebar vs. right clicking to bring up a menu:
I think context menus are a severely overrated concept in Windows programming -- always mapped to the right mouse button, and usually only marginally useful. The mouse buttons could be doing much more useful things than popping up a list of simple options that are much faster to execute with a keystroke. Furthermore, "context menus" are the only type of menu in most applications that will appear directly under (or at least near) the mouse cursor position. Everything else pops up in the center of the screen (or some equally useless location), and you have to move the mouse over and press "OK" or "Cancel", or choose an option. Blender's method of placing all dialogs under the mouse cursor is much more convenient; features like this allow it to stand out as efficient and fast.
The current mapping of mouse buttons to functions emphasizes the difference between workflow in Blender and workflow in most applications such as 3ds max.
Most "industry standard" applications map selection, executing an action (such as moving) and accepting the action to the left mouse button. The mode is global; you work in "move mode" for a while, then in "rotate mode" for a while, executing actions in each mode. This is very inefficient since choosing a mode and executing an action can be done in one step. The right mouse button is reserved for "quads" or other glorified names for context menus.
Blender does compact mode selection and execution into one step; to move an object the user simply presses 'g'. Switching to rotation mode involves simply pressing 'r', etc. Pressing the left mouse button never executes any action (it accepts changes effected by an action, but it does not initiate the action). Therefore, its purpose must be made unique; moving around the 3d cursor is a perfect candidate. It modifies where an action will be performed (adding an object, centering the view, etc), or how that action will be performed (warping an object).
These differences in workflow demand a unique interface. Bottom line: selection and 3d cursor movement are not miscible. If clicking the RMB selects, then dragging the RMB should bandbox select or lasso. If things were changed and the RMB were mapped to moving the 3d cursor, then dragging it should not perform a bandbox select.
Sorry about this ramble; I just feel strongly about the purpose of the mouse.