Blender Interface Roadmap (improve Blender usability)

The interface, modeling, 3D editing tools, import/export, feature requests, etc.

Moderators: jesterKing, stiv

alt

Postby alt » Tue Feb 11, 2003 11:16 pm

To broken:
just that you're overlooking some of the positive aspects of the current hotkey layout

Being the youngest child who grew up in the shadows of older ones, I only bring up negative points :(

grab/rotate/scale ... are not the only operations that hotkeys are useful for

Sorry. I was pretty confused, it seems. I meant Ctrl+Alt+Shift for view translations, not SRG for object translations. Hotkeys are very important, and that's why I keep whining about them.

Configurable hotkeys could be a solution. Though too much configurability slows down learning (IMHO).


To Noorah:
amount of controls that are actually placed on the screen should be kept to a minimum

I agree. Controls should be placed in some common place. Maybe a panel that would update its contents for every tool in use, list modes, layers etc.

I also like XSI. But it wrestles in a different league than Blender. With XSI's feature set, using a hotkey per action would be overkill for anyone.

IMHO, programs need hotkeys only for the most-used actions/tools/features/whatever. These should stay the same, or at least fit the same mindset, throughout the program. If an action is used only once or twice per session, it doesn't need a hotkey, e.g. 'quit'. I do use it, but I don't need it.


To wavk:
I think the amount of buttons is the most daunting for new users

The button load may also make things harder for others. Most of these buttons are not used extensively; some are hardly touched more than once during typical work. Maybe settings could be layered, showing the most-used ones first, and then, if the user wants to, he/she could dive down to the next level.

Material editor seems very nice :)


I toyed with the idea of having a vertical panel instead of a horizontal one for the button screens and menus. It would show the active tool, possible tool modifiers (like proportional editing) and menus for the current mode (e.g. model, animate, render and realtime).
The user could then flip through pages of settings for each editable aspect (e.g. render), arranged from most-used to least-used. A vertical panel would also free up some screen space for 4-view modelling.

diz
Posts: 41
Joined: Mon Jan 06, 2003 2:50 am
Location: Switzerland
Contact:

Postby diz » Wed Feb 12, 2003 2:18 am

ABOUT GUI SPACE
wavk wrote:Also check out my material editor, I based the interface loosely on Blender, but I ended up coding my entire new interface.

I love the idea of working with boxes & links! But (there is always a but), I think that this kind of interface is also much more space-consuming than the series of small buttons we have now.

The aim of a good computer interface is to create a nearly infinite workspace thanks to interface multi-layering. The best example of this in Blender is that there can be several window types in each area (3D, buttons, IPO, ...), and the position/size of these areas can be completely customized. You can also note that this windowing system is very efficient because every window takes 100% of its allocated area. This may not be the case in a drag & drop system such as your material editor. And as a matter of fact, even if you pack your boxes tightly together, there is still a lot of unused space.

Thus, this kind of drag & drop box-and-link interface should only be used in complex cases where other representations are not adequate. A very good document that mostly deals with this kind of interface seems to be the 'Proposal for improved Logic Editing in Blender', the university graduation project by Jonathan van Wunnik (I haven't had time to really read it yet).

GUI FUNCTIONALITY

It is faster to type key sequences than key combinations. I mean it is faster to press W for "window" and then F5 than to hold SHIFT and press F5 at the same time. In this case it is even worse, because even my big hands are too small to type SHIFT+F5 with one hand, so I need my second hand. Thus I would prefer sequences over combinations (there are already some sequences, such as W...[number] for the special menu, and they are very fast to use). These sequences could be presented with a menu appearing under the mouse, listing the possibilities for the second element of the sequence. It would also permit a more modular organization that would be easier to learn (the more complex functions would all be grouped in menus such as the current W menu). For example, the bezier commands H, SHIFT+H and V could be grouped in one menu, such as H...A (or H...1) for automatic handle, H...L (or H...2) for aligned, H...F (or H...3) for free, and H...V (or H...4) for vector.
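To make the idea concrete, here is a minimal sketch in Python of how such a two-step sequence could be dispatched, with a popup listing the possible second keys. The key names and handler names are hypothetical illustrations, not Blender's actual event handling:

```python
# Minimal sketch of a two-step key-sequence dispatcher.
# Key names and handler names are hypothetical illustrations,
# not Blender's actual event handling.

SEQUENCES = {
    "H": {                 # first key; a popup under the mouse lists the second keys
        "A": "handle_automatic",
        "L": "handle_aligned",
        "F": "handle_free",
        "V": "handle_vector",
    },
}

def dispatch(first_key, ask_second_key):
    """Resolve a sequence like H...A into an action name."""
    menu = SEQUENCES.get(first_key)
    if menu is None:
        return None                    # not a sequence prefix; treat as a normal hotkey
    second = ask_second_key(menu)      # e.g. show a popup listing menu.keys()
    return menu.get(second)

# Example: the user presses H, then picks "L" (aligned) from the popup.
print(dispatch("H", lambda menu: "L"))   # -> handle_aligned
```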

As already mentioned, another disaster for speed modelling is the SHIFT+Click and then SHIFT+Backspace needed to precisely edit the value of a button. This could be solved like this: SHIFT+Click or MiddleClick = traditional edit; ALT+Click or RightClick = edit with the content already selected, so that a new value can be typed to overwrite the old one.

USER FRIENDLY

In the button interface, when certain buttons are not available, they should be either hidden or drawn in a neutral grey to show they are inactive. That could help newer users. It would also be useful to have a very strict colour code, to make it easier to see what is where. And there should be simple boxes drawn around groups of buttons to delimit logical groups.

Finally, every key shortcut should be available from a central menu, so that new users can find what they want and also see which key is associated with it. There should also be a small tooltip pop-up when hovering over a button, to explain what it does (it should also be possible to disable this feature).

Ok, that's a lot of writing, I hope that I was more or less clear.

Gabriel

dcuny
Posts: 68
Joined: Mon Jan 27, 2003 11:22 pm

Postby dcuny » Sat Feb 15, 2003 1:59 am

[image]
I'm not kidding.

I had this horribly long post explaining exactly why Poor Newbie (me) gets tripped up by Blender. Things like:
  • objects being automatically selected and placed in Grab mode when something is created (great, but confusing)
  • The menu hint "Add | Mesh >> Sphere" explaining that it "Adds Mesh Sphere" (which provides no real information)
  • Clicking an object does not select it, no matter how many times you continue to click it.
The executive summary:
  • Newbie tries to do things in Blender.
  • Blender behaves differently than standard Windows programs.
  • Newbie becomes frustrated and gives up.
Fortunately, Newbie generally only wants to do a small set of tasks. Make it possible for Newbie to perform these actions, and they'll be happy. Early success leads to a desire to learn more.

And please, integrate the help into the program.

Finally, make sure the "Hide Agent" really does get rid of that freaking paper clip, so the Power User never has to deal with the evil that is Clippy.

matt_e
Posts: 898
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Postby matt_e » Sat Feb 15, 2003 2:43 am

diz wrote:ABOUT GUI SPACE
But (there is always a but), I think that this kind of interface is also much more space-consuming than the series of small buttons we have now.

The aim of a good computer interface is to create a nearly infinite workspace thanks to interface multi-layering.

I have to disagree with this point. The aim of a good computer interface is to allow the user to interact with the computer in an easy and efficient way. While some tasks may be performed well with large areas of virtual workspace, it's not a 'magic bullet'. The fact that an interface may take up more space is irrelevant if that interface allows you to work more efficiently and easily with the computer. In fact, having an interface that crams as much in as possible to fill up all available screen space can work negatively, since the user must then cope with information overload.

One of the first things that any graphic design student is taught is not to be afraid of white space. Inexperienced designers tend to cram a page/website/whatever with as much stuff as they can and fill in all the bits of empty space, in order to be what they see as 'efficient'. What they don't realise is that negative space is extremely important in allowing graphic elements to be laid out in an organised manner. Negative space allows for grouping, alignment, visual hierarchy, etc., which is very hard to achieve if the page/screen/etc. is absolutely full of stuff.

So how does this relate to wavk's idea? Well, the idea behind it is to make something more usable by representing the connections between materials in a more visual and logical manner. Contrast this with Blender's current material/texture buttons approach, which requires a lot of human memory (What texture slot am I working in? What sort of textures are in the slots below the one I'm working on? How will the different textures interact with each other, e.g. blending modes?). A more schematic solution allows the user to visualise the texture network at a glance, in its native tree-like structure. The user can instantly see what's going on and what's causing which effects. By moving the different blocks around in the schematic diagram, the user can group and organise them, and therefore understand them with far greater clarity than in a buttons/windows situation. It is the fact that wavk's idea doesn't take up all the available space that facilitates this kind of organisation and mental mapping.

Of course, that's not to say that wavk's idea couldn't be improved if it were wasting space unnecessarily, etc. My point is that interfaces should be evaluated against their utility; against how easy and efficient they are to use, not against some arbitrary measure such as how much space they take up.

diz
Posts: 41
Joined: Mon Jan 06, 2003 2:50 am
Location: Switzerland
Contact:

Postby diz » Sat Feb 15, 2003 3:10 pm

broken wrote:Of course, that's not to say that wavk's idea couldn't be improved if it were wasting space unnecessarily, etc. My point is that interfaces should be evaluated against their utility; against how easy and efficient they are to use, not against some arbitrary measure such as how much space they take up.


Ok, you're right... I re-read our two posts, made some tests in Blender, and I must agree with you that I was wrong to insist on that. Sorry about that :oops:, and thank you for your reaction.

Another thing I thought about while re-reading the posts in this topic is that if we change or reorganize some keyboard hotkeys, then we must introduce hotkey configuration. I believe hotkeys are a very sensitive topic, and if we get too deep into discussions about them we will never agree and we will lose a lot of time talking about it. There are many ways hotkeys could be improved, but there are also very strong habits that existing Blender users have already formed, so a hotkey config would also be an acceptable solution for the users who do not agree with our hotkey changes.

But this may only be true for the elements that stay close to the current Blender interface. All parts that are newly designed and work with a different logic (such as wavk's interface) will need a new hotkey organization.

Gabriel

diz
Posts: 41
Joined: Mon Jan 06, 2003 2:50 am
Location: Switzerland
Contact:

Postby diz » Sat Feb 15, 2003 9:31 pm

alt wrote:But yes. 'S' stands for scale, 'R' for rotate etc. What does it stand for when i18n kicks in? When languages change, meaning vanishes. And when the program evolves there aren't enough keys to keep this logic.


I just thought that yes, languages change, but so do keyboard layouts!! I have a QWERTZ keyboard that is probably different from your AZERTY layout... So your argument must be pushed further. These are the possibilities I see:
- keep the hotkeys more or less the same as now (the hotkey letter corresponds to the English name of what it does)
- make groups of the hotkeys that belong together, but users with other keymaps would have to switch to the US keymap to use them, otherwise there is no logic left at all.
- make the same nice hotkey groups, but implement them at a low level, independent of what letter is assigned to each key. That way, everybody would, for example, have the current GSR assigned to the upper-left three keys of the keyboard (AZE or QWE depending on the keyboard). That would make it really hard to write tutorial instructions, because it would no longer be possible to say "press Y to do ...".

Thus, I would prefer the first possibility, and introduce a config file so that anyone can change the hotkeys the way he wants. It would also be possible to ship two different config files with the Blender distribution, so that you can choose the traditional way (with the tutorial hotkeys still working) or a next-generation layout built around the US keymap...
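To illustrate, a minimal sketch of what such a config file and its loader could look like; the action names and the file format are hypothetical, chosen only to show how two presets (traditional vs. remapped) could be shipped side by side:

```python
# Minimal sketch of a remappable hotkey configuration.
# The action names and file format are hypothetical illustrations,
# not an actual Blender config format.

DEFAULT_KEYMAP = {      # "traditional" preset: the letter matches the English name
    "G": "grab",
    "S": "scale",
    "R": "rotate",
}

def load_keymap(path):
    """Read 'KEY action' pairs from a plain-text config file, falling back to defaults."""
    keymap = dict(DEFAULT_KEYMAP)
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                key, action = line.split(None, 1)
                keymap[key.upper()] = action
    except FileNotFoundError:
        pass               # no user config: keep the shipped defaults
    return keymap

# Example: a second shipped preset could remap grab/scale/rotate to the
# upper-left keys of whatever physical keyboard layout is in use.
# keymap = load_keymap("hotkeys.cfg")
```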

Gabriel

matt_e
Posts: 898
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Postby matt_e » Sun Feb 16, 2003 2:11 am

diz wrote:I just thought that yes, languages change, but so do keyboard layouts!! I have a QWERTZ keyboard that is probably different from your AZERTY layout... So your argument must be pushed further. These are the possibilities I see:

That's very true! A few of my friends are travelling throughout Europe at the moment. They keep sending me back emails with strange letters in the wrong places because they can't get used to the different keyboard layouts they encounter.

As for tutorials, if we can also work to bring about a standard menu structure that has all the functionality of the hotkeys, then we can just tell the user to choose that function from the menu, and it will already display the hotkey beside it. In this way, it won't matter if the hotkey has been customised.

wavk
Posts: 255
Joined: Wed Oct 16, 2002 9:58 am
Location: The Netherlands
Contact:

Postby wavk » Sun Feb 16, 2003 5:37 pm

Hi diz,

I totally agree with putting everything in a central menu and with the strict colour code!

broken, what a marvelous post, you couldn't be more right, great points.

One of the things I might implement is the grouping of several blocks into one, which you can then name and enter to edit the contents. You would name the block, and the inputs and outputs would be created automatically. That could save a LOT of space. Take my new bricks block, for instance: I first created it with tens of blocks, then exported the shader code, copy/pasted it into C++ and made a bricks block, of course with a couple of modifications to the code. But if this could be done automatically, and you could then save the block...

Also, as in Blender's OOPS window, I'm thinking of hiding stuff; maybe clicking an input could hide all blocks connected to that input...

The graduation project document is very good, I think. It would make the game engine a lot more powerful. I think we should think about one interface for all these types of windows: the OOPS window, the game bricks window, the state window (proposed in the document), the material window and maybe even other stuff; Ton talked about a Maya-like construction history. This interface type would be perfect for such a window.

Here the grouping of blocks comes in again. Wouldn't it be great if you could make, for instance, a mouse-look function (in the form of blocks), drag a box around those blocks, call it 'mouse look', save it and exchange it with other users? And if you want total control, double-click the block and you can edit its contents.

It's very much like object-oriented programming: if you keep it structured, each block does its own task and you keep a clear overview of the structure.
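As a rough illustration of the grouping idea (a sketch with hypothetical names, not wavk's actual implementation): a group block could derive its exposed inputs automatically from the links that cross its boundary.

```python
# Rough sketch of grouping blocks into one named block whose exposed inputs
# are derived automatically from links crossing the group boundary.
# All class and field names are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class Block:
    name: str
    inputs: list = field(default_factory=list)   # (source_block, socket) pairs feeding this block

@dataclass
class Group:
    name: str
    members: list

    @property
    def exposed_inputs(self):
        """Links coming from outside the group become the group's inputs."""
        inside = {id(m) for m in self.members}
        return [(blk, sock) for m in self.members
                for (blk, sock) in m.inputs if id(blk) not in inside]

# Example: group a 'noise' and a 'mix' block; the texture coordinates feeding
# 'noise' from outside become the single exposed input of the group.
coords = Block("texture_coords")
noise = Block("noise", inputs=[(coords, "vector")])
mix = Block("mix", inputs=[(noise, "color")])
bricks = Group("bricks", [noise, mix])
print([blk.name for blk, _ in bricks.exposed_inputs])   # -> ['texture_coords']
```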

Sorry that this post is a bit chaotic, it's just ideas put down.

diz
Posts: 41
Joined: Mon Jan 06, 2003 2:50 am
Location: Switzerland
Contact:

Postby diz » Sun Feb 16, 2003 10:00 pm

wavk wrote:One of the things I might implement is the grouping of several blocks into one, which you can then name and enter to edit the contents. You would name the block, and the inputs and outputs would be created automatically. That could save a LOT of space.
[...]
It's very much like object-oriented programming: if you keep it structured, each block does its own task and you keep a clear overview of the structure.

I think it's a great idea to organize the user interface that way. You could create your modules, then instantiate them as many times as you wish and link them to objects or to further blocks.

wavk wrote:Also, as in Blender's OOPS window, I'm thinking of hiding stuff; maybe clicking an input could hide all blocks connected to that input...

I think that this may become a big problem in complex situations. Taking the example of the OOPS window, which also works with free-floating elements linked together, it quickly becomes very hard to see anything when there are many elements. And the existing hiding possibilities are often not very useful, so we must find better hiding options, such as the one you suggested of hiding connected blocks, or something else... IMHO, this is one of the biggest problems of an interface like wavk's, and if we decide to adopt it, we must find an elegant solution to avoid a visual overload of blocks and connections.
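To make the suggestion concrete, here is a minimal sketch of "click an input, hide everything feeding it" as a simple graph traversal; the graph representation is a hypothetical illustration, not the OOPS window's actual internals:

```python
# Minimal sketch of hiding all blocks connected (upstream) to a clicked input.
# The graph representation is a hypothetical illustration, not the actual
# OOPS window internals.

def upstream_blocks(graph, start):
    """Collect every block reachable by following input links back from 'start'.

    'graph' maps a block name to the list of block names feeding its inputs.
    """
    hidden, stack = set(), [start]
    while stack:
        block = stack.pop()
        for source in graph.get(block, []):
            if source not in hidden:
                hidden.add(source)
                stack.append(source)
    return hidden

# Example: clicking the input of 'material' hides its whole texture chain.
graph = {
    "material": ["mix"],
    "mix": ["noise", "bricks"],
    "noise": ["texture_coords"],
}
print(sorted(upstream_blocks(graph, "material")))
# -> ['bricks', 'mix', 'noise', 'texture_coords']
```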

While I'm speaking of the OOPS window, I also want to say that it could be greatly improved, because currently it has absolutely no functionality other than selecting something with CTRL+RMB. In a window such as the OOPS window, it would be great to be able to create/modify/delete all possible links. Furthermore, it could be used as the central element for all the windows that would be replaced by wavk's interface, which would make one huge modular control interface.

Gabriel

matt_e
Posts: 898
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Postby matt_e » Mon Feb 17, 2003 3:01 am

wavk wrote:One of the things I might implement is the grouping of several blocks into one, which you can then name and enter to edit the contents. You would name the block, and the inputs and outputs would be created automatically.


Yes! That's an excellent idea! It would help a lot to organise the workflow into a much more object-oriented structure. What would also be very cool is if you could save a group (like you mentioned) and exchange it with other people, or even save it as a built-in preset in Blender. If this were possible, instead of re-creating material networks from scratch every time, you could just add one that's already saved and it would appear as the collapsed group. This would be great for newbies too, because they could download preset material network groups from the web, and then just use them as a 'black box' within their own material networks. You wouldn't have to understand the network within the group in order to use it and build upon it.
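A minimal sketch of how a collapsed group could travel as a preset file; the JSON layout and all names are hypothetical, meant only to illustrate the "black box" exchange idea:

```python
# Minimal sketch of saving/loading a collapsed group as a shareable preset.
# The JSON layout and all names are hypothetical illustrations.

import json

def save_preset(path, group_name, blocks, links):
    """Write a group of blocks and their internal links to a preset file."""
    with open(path, "w") as f:
        json.dump({"name": group_name, "blocks": blocks, "links": links}, f, indent=2)

def load_preset(path):
    """Read a preset back; the caller can drop it into a network as one collapsed block."""
    with open(path) as f:
        return json.load(f)

# Example: exchange a 'bricks' material group without exposing its internals.
save_preset("bricks.preset", "bricks",
            blocks=["noise", "mix"],
            links=[["noise", "color", "mix", "color1"]])
print(load_preset("bricks.preset")["name"])   # -> bricks
```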

Like you mentioned, I think that if such a GUI system were developed, it would be great if it could be re-used for all sorts of things within Blender, such as materials, OOPS, game logic, and even sequence editing. You could have node-based compositing using sequence plugins, in a similar way to Shake, all within Blender! Not to mention that the UI would be a lot simpler to learn - right now the sequence editor, OOPS window, material/texture buttons, game logic etc. all use very different GUI controls and interface philosophies in general. It's a lot for a user to learn. If these interfaces could be made consistent (and I believe they can), it would make learning and using Blender much easier and more efficient.

What I also like about these schematic type approaches is that it exposes the internals of Blender to the user much more. The OOPS architecture with datablocks, users, links etc. in Blender has so much potential and could be made so much more powerful if only users could visualise, and modify the data easily. As it is right now we've got this great, logically structured architecture behind Blender, which is just begging to be taken advantage of.

diz wrote:And the existing hiding possibilities are often not very useful, so we must find better hiding options, such as the one you suggested of hiding connected blocks, or something else...

Just to throw an idea around, maybe Blender's OpenGL windowing can be taken advantage of in order to zoom and scale things. Personally, one thing I like about the Mac OS X dock is that it can provide a lot of information at a glance, but also allows you to easily zoom in on areas of interest, for more precision and clearer communication (viewing larger icons and reading the mouseover text).

Maybe a schematic diagram could do something similar - for example, if you click on a block in a chain, then the other blocks that aren't connected to it zoom down smaller, allowing you to concentrate on that chain? Maybe you could make other disconnected blocks become partially transparent? I don't know, but there are certainly a lot of avenues for visual communication which can be explored.

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am
Contact:

In order to support a better UI, Blender must support it...

Postby thorax » Mon Feb 24, 2003 9:06 am

Those of you who have coded in PHP, for instance, may have run into the problem of managing the UI separately from the code base. You can do it if your code supports it, but if your UI has a one-to-one correspondence with your code structure, modifying the screen will require moving code around. It's very much like PHP: every time you want to edit your webpage, you may have to interpret the code and modify the text and graphics code interspersed within the app's actual functionality.

To support a dynamic UI, you must dissociate the UI from the underlying code. This means you should be able to change the code and the interface functionality independently of each other. To do this, your interface must use an event queue: whenever you do anything in Blender, the events you generate go on a queue to be read off, or reacted to with callback routines.
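A minimal sketch of the decoupling being described, with a queue in the middle so the UI layer only posts events and the core only registers callbacks; the event names and handlers are hypothetical illustrations:

```python
# Minimal sketch of decoupling the UI from the core through an event queue.
# Event names and handler functions are hypothetical illustrations.

from collections import deque

class EventQueue:
    def __init__(self):
        self.queue = deque()
        self.handlers = {}              # event name -> list of callback routines

    def register(self, event, callback):
        self.handlers.setdefault(event, []).append(callback)

    def post(self, event, **data):
        """The UI layer only posts events; it never calls core code directly."""
        self.queue.append((event, data))

    def dispatch(self):
        """The core drains the queue and reacts through registered callbacks."""
        while self.queue:
            event, data = self.queue.popleft()
            for callback in self.handlers.get(event, []):
                callback(**data)

# Example: any button, hotkey, or skinned replacement posts the same event.
events = EventQueue()
events.register("mesh.add_sphere", lambda segments: print(f"adding sphere, {segments} segments"))
events.post("mesh.add_sphere", segments=32)
events.dispatch()    # -> adding sphere, 32 segments
```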

Now, that said, if Blender doesn't support this kind of event handling, you can't have skins. You may be able to reduce the interface, but modifications to it may be hard, even impossible, if, say, one UI element depends on another elsewhere.

First, specify the internal structure of Blender's UI and determine what needs to be done so that it can be changed easily. Also collect information about what kind of UI would be good. Then change Blender to support a generalised approach, then implement test interfaces on Blender, then skins, etc. You just can't put a facade on top of Blender and expect it to take.

matt_e
Posts: 898
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Postby matt_e » Mon Feb 24, 2003 9:35 am

It depends on what level you mean by 'interface'. I'm no hardcore coder (I only know PHP and VBScript!), but I've had a cursory look at the UI code, and at least the drawing of the buttons and the locations/arrangements/divisions of buttons, other UI controls and windows seem quite easy to modify. It's also relatively easy to change between UI controls (buttons <-> sliders <-> menus etc.).

greasyScott
Posts: 16
Joined: Fri Dec 20, 2002 6:43 am
Contact:

Creativity drives technology

Postby greasyScott » Mon Mar 03, 2003 9:38 pm

From Thorax:
Those of you who have coded in PHP, for instance, may have run into the problem of managing the UI separately from the code base.


This may be the case with Blender. Like broken, I'm no hard core programmer. I am first and foremost an artist. My concerns are ease of use, mastery, quality of the finished product, and efficiency. Not always in that order.

I have been on holiday for the past two weeks and haven't been able to monitor or post on a regular basis. It's exciting to see how the discussion has progressed so far.

Soon I will get my project submission finalized and up on the board. It has already been approved. I just need to fill in the blanks and set up the structure.

My work process goes from the general to the specific, and that is my intent with this project. Pointing out specific deficiencies is tremendously helpful; they point to larger issues. Some may be grouped or associated in different ways. I specifically do not want to know more about what can and can't be done with Blender, code-wise, at this stage.

Without the shackles of code limitations or "it's been done this way because..." new and exciting possibilities open up. Who knows, the brain trust we have working here may be an inspiration to all the other big-time 3D programs. The beauty is, this is a community and whoever wants to run with an idea can.

Let's see how far we can go.
Good judgement
tends to precede
good fortune

