Posted: Fri Sep 14, 2007 12:30 am
by ace1
I recently asked a question about the allqueue() function.
Thanks BeBraw for the answer! But a consistent way to update GUI buttons in all cases is to update all drawing during the video's vertical refresh. Most real-time video games update that way, and I've written code that does so. Couldn't Blender's UI do the same, redrawing the whole screen at once whenever its appearance changes, instead of using the allqueue() function?
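To illustrate the idea, here is a minimal sketch (not Blender's actual code; all names are hypothetical) of a game-style loop: any state change only sets a dirty flag, and one central call, run once per vertical refresh, redraws everything at once:

```c
#include <stdbool.h>

/* Hypothetical sketch: redraw-everything-per-frame instead of
 * queueing per-window redraw events the way allqueue() does. */

typedef struct {
    bool dirty;        /* set by any input that changes the UI state */
    int  frames_drawn; /* counts actual redraws, for illustration */
} Screen;

/* Any state change just marks the screen dirty... */
void screen_touch(Screen *s) { s->dirty = true; }

/* ...and one central call, run every vblank, redraws it all at once. */
void screen_frame(Screen *s)
{
    if (s->dirty) {
        /* redraw all windows and buttons here */
        s->frames_drawn++;
        s->dirty = false;
    }
}
```

The point is that callers never queue individual redraw events; they only touch the flag, and the frame loop does the rest.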

Also, there's a simpler way to trap button presses in the button UI code. Is a rewrite of the UI code planned? For a simple button press, the button is either up or down. It seems to me that the current UI code takes you all around the world to do a simple job, instead of just tracking the mouse state, harmonizing it with the button drawing code, and exposing a simple read flag.
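A minimal sketch of that scheme (illustrative names only, not Blender's API): the button is just up or down, driven directly by mouse state, with a pressed flag the caller reads once:

```c
#include <stdbool.h>

/* Hypothetical button: up/down state plus a one-shot read flag. */
typedef struct {
    bool down;    /* current visual state: up or down */
    bool pressed; /* set on release inside the button; cleared on read */
} Button;

/* Feed the raw mouse state in; the button tracks itself. */
void button_mouse(Button *b, bool mouse_down, bool mouse_over)
{
    if (mouse_down && mouse_over) {
        b->down = true;
    } else {
        if (b->down && mouse_over)
            b->pressed = true; /* released over the button: a click */
        b->down = false;
    }
}

/* The "simple read flag": returns true once per click. */
bool button_was_pressed(Button *b)
{
    bool p = b->pressed;
    b->pressed = false;
    return p;
}
```

The drawing code only needs to look at `down`, so the visual state and the logical state can never disagree.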

I found the existing code somewhat complicated and deeply nested, and I was tempted to write my own UI handler functions, but since I'm developing for Blender I wanted to use the native code.

These are just some suggestions. :D
My philosophy: write simple, brief, powerful, and reusable code.

Posted: Fri Sep 14, 2007 7:05 am
by BeBraw
There is currently a project going on that aims to refactor Blender's event system. You can read more about it at .

Posted: Fri Sep 14, 2007 2:09 pm
by ace1
I looked at the docs and it seems promising. But I think the UI functions need to be more modular and simpler, which would improve both performance and readability. It would also speed up Blender's development, because new developers could more easily grasp how to use the UI code in their own improvements. Let menu buttons be menu buttons only, sliders be sliders only, toggle buttons be toggle buttons only, and so on, each with its own codebase and a common event handler. Then you would have power and efficiency. I wish I could help with the design.
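A rough sketch of what "each widget type has its own codebase with a common event handler" could look like (all types and names are made up for illustration, not taken from Blender): one dispatcher, and per-type handler functions it calls through a function pointer:

```c
#include <stdbool.h>

/* Hypothetical event types. */
typedef enum { EVT_CLICK, EVT_DRAG } EventType;

typedef struct Widget Widget;
struct Widget {
    /* each widget type supplies only its own handler */
    bool (*handle)(Widget *self, EventType evt);
    int value;
};

/* Toggle buttons only toggle... */
static bool toggle_handle(Widget *self, EventType evt)
{
    if (evt != EVT_CLICK) return false;
    self->value = !self->value;
    return true;
}

/* ...sliders only slide (a drag bumps the value here, for brevity). */
static bool slider_handle(Widget *self, EventType evt)
{
    if (evt != EVT_DRAG) return false;
    self->value += 1;
    return true;
}

/* The common event handler never needs to know widget internals. */
bool ui_handle_event(Widget *w, EventType evt)
{
    return w->handle(w, evt);
}
```

Adding a new widget type then means writing one handler function, with no changes to the shared dispatcher.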

Again, just some suggestions :D