Could someone please consider optimizing armature speeds.

Animation tools, character animation, non-linear animation

Moderators: jesterKing, stiv

matt_e
Posts: 898
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Postby matt_e » Wed Jun 25, 2003 6:14 am

Money_YaY! wrote:Could you create the nla concert thing in the mean time ?


http://download.blender.org/documentati ... x2434.html ?

Money_YaY!
Posts: 876
Joined: Wed Oct 23, 2002 2:47 pm

Postby Money_YaY! » Wed Jun 25, 2003 3:18 pm

Yeah, I know. That is what I meant: the armature is the only thing
that can have an action strip, which makes no sense.
All items should be able to have some sort of strip method,
maybe in a different color. Besides, the NLA is just as buggy as the armatures.

But as I recall, before NaN went poof they were working on several
more features for the NLA.

oh well
^v^

Zarf
Posts: 88
Joined: Mon Oct 14, 2002 3:54 am

Postby Zarf » Wed Jun 25, 2003 3:33 pm

Hos wrote:Quite often the slowness exhibited is caused
by circular parenting, which may be tricky for an
armature novice to detect (see earlier post by
kaktuswasser). If you guys want to send me a
link to your files I can look them over...

... but that doesn't mean that things couldn't
be sped up a bit. Certainly one source of slowness
is this bit of code in drawobject.c:

Code: Select all

#if 1
#ifdef __NLA
         /* Force a refresh of the display list if the parent is an armature */
         if (ob->parent && ob->parent->type==OB_ARMATURE && ob->partype==PARSKEL){
#if 0         /* Turn this on if there are problems with deformation lag */
            where_is_armature (ob->parent);
#endif
            if (ob != G.obedit)
               makeDispList (ob);
         }
#endif
#endif


which isn't constraint related at all. When you comment it
out, things go much faster... the only problem is that
some things don't wind up where you expect them.
A smarter way to test whether the display lists need
rebuilding or not is needed!

Regards,
Chris



Something that has been bothering me: I have wondered whether vertex arrays would be a better option than display lists. There are of course limitations to this (display lists are far more generalized and can encapsulate a variety of GL commands), and I have heard vertex arrays can lead to crashes on some video cards (never had it happen myself), but the speed tradeoff should be worth it: vertex arrays are dynamic, so they don't need to be completely rebuilt each time a change is made to a 3D dataset.

They used to be an extension but are now part of gl 1.2(?) according to the red book.

If you're going to be changing the 3D dataset encapsulated by the display lists for EVERY frame (as is the case for objects deformed by armatures), I see the sense in recalculating the display list, calling it, then throwing it out, advancing a frame, recalculating the display list, etc. I hope the animation preview (Alt-A) doesn't do this, since it seems rather wasteful.

It is possible to have a display list composed of other display lists and then update only the portions you need, but I really doubt this compares performance-wise to vertex arrays.
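
One hedged way to get the "smarter rebuild test" asked for earlier in the thread would be to version the armature pose and rebuild an object's display list only when that version actually changes. This is only an illustrative sketch: the `Armature`/`Object` structs, the `pose_version` field, and `maybe_rebuild` are invented for the example and are not the real Blender types or functions.

Code: Select all

```c
#include <assert.h>

/* Illustrative dirty-flag scheme: each object remembers which pose
 * version its display list was built against, so redraws that don't
 * move any bones skip the expensive rebuild entirely. */

typedef struct {
    unsigned pose_version;   /* bumped whenever a bone moves */
} Armature;

typedef struct {
    Armature *parent;
    unsigned built_against;  /* pose version of the last rebuild */
    int rebuild_count;       /* stands in for the cost of makeDispList() */
} Object;

static void maybe_rebuild(Object *ob)
{
    if (!ob->parent)
        return;
    if (ob->built_against != ob->parent->pose_version) {
        ob->rebuild_count++;                     /* the expensive part */
        ob->built_against = ob->parent->pose_version;
    }
}
```

With this, rotating the viewport (which redraws but does not repose) would trigger zero rebuilds, while Alt-A playback would still rebuild once per frame.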

Are bone matrices cached after constraints are resolved, or is the whole system recalculated for each screen redraw? This could be a major problem for just rotating in the viewport. I had one scene where a character with vertex-weighted skin in local mode (with skeleton) could be rotated and posed with quite a bit of interactivity; when viewed in global scope next to a rather simple character with only dummy geometry attached (no vertex weights), everything slowed down considerably (on an Athlon 1.2 GHz, 650 MB of RAM, NVIDIA Quadro2 with 64 MB of RAM).
Both characters had a similar set of constraints (no constraint loops).

Just some thoughts.

Zarf

LoZaR
Posts: 4
Joined: Fri Oct 18, 2002 11:17 am

Postby LoZaR » Wed Jun 25, 2003 5:02 pm

Hello

I can't possibly comment on the speed of the armature system in general, just the IK solver part, which I had the pleasure of implementing at NaN.

The two main speed problems are:
1) History-independent IK solves
The IK solver as used by the armature system always starts from the same bone positions. This often means it is a LONG way from the end-effector, and consequently a numerical system takes more iterations. It's essential to maintain history independence, otherwise you can get unexpected results when you add more keyframes etc. This has a significant impact on speed.

2) Robustness. It's easy to write a simple IK solver that works, for example, only if the end-effector is reachable. Making code that works in all cases is much harder. The current solution (an inverse Jacobian solver) has to do quite a lot of matrix mangling to work out what to do when the Jacobian is singular. I admit that this code could be further optimized by first quickly identifying the determinant (the old determinant value can also be used as a guide) and then using a quicker inversion method.
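
The optimization Laurence describes, checking the determinant first and only falling back to a more expensive path near a singularity, can be sketched for a tiny 2x2 case. This is purely illustrative and not the actual iksolver code; the function names are invented, and a real solver would damp J^T J (damped least squares) rather than nudging the matrix diagonal as done here.

Code: Select all

```c
#include <assert.h>
#include <math.h>

/* Illustrative only: invert a 2x2 "Jacobian" directly when its
 * determinant is safely non-zero, otherwise report failure so the
 * caller can switch to a damped fallback. */

/* Invert m = [a b; c d] into out; returns 0 if m is (near-)singular. */
static int invert2x2(const double m[4], double out[4], double eps)
{
    double det = m[0] * m[3] - m[1] * m[2];
    if (fabs(det) < eps)
        return 0;                       /* caller should damp instead */
    out[0] =  m[3] / det;  out[1] = -m[1] / det;
    out[2] = -m[2] / det;  out[3] =  m[0] / det;
    return 1;
}

/* Crude damping for the sketch: push the diagonal away from the
 * singularity before inverting. */
static int damped_invert2x2(const double m[4], double lambda, double out[4])
{
    double d[4] = { m[0] + lambda, m[1], m[2], m[3] + lambda };
    return invert2x2(d, out, 1e-12);
}
```

The cheap determinant test is the point: the common well-conditioned case pays almost nothing, and only near-singular configurations take the slower damped path.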

In its defense:

The current code is really quite flexible; it should not be too hard to add joint constraints and different joint types.

The code is well documented. Take a look in blender/intern/iksolver; you should see there a link to the excellent thesis it was based on (hang on a minute):

http://ligwww.epfl.ch/~baerloch/papers/thesis.pdf

Good luck,
Laurence

soletread
Posts: 83
Joined: Fri Jan 10, 2003 7:11 pm

Postby soletread » Wed Jun 25, 2003 8:40 pm

Zarf wrote:
Are bone matrices cached after constraints are resolved, or is the whole system recalculated for each screen redraw? This could be a major problem for just rotating in the viewport.


I have a feeling that this is the case.

Rotating the viewport DOES slow down drastically with a fully configured armature. Does this mean that the calculations are being done for every screen redraw?

Is there a certain constraint that is more computation intensive than another? Would rotational constraints allow faster solving than knee pointers, for instance?

I had no idea that armature coding was so complex. After what I have read, I believe that the existing system, although slow, is accurate. I would rather have accurate and slow.

I have also noticed a marked speed improvement when rotating the viewport with the rig in the rest position. Not as fast as without the rig altogether, but significantly faster. Currently, if I am using more than one character in a scene, I can only have ONE out of the rest position.

----

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am
Contact:

Postby thorax » Wed Jun 25, 2003 10:59 pm

LoZaR wrote:Hello

I can't possibly comment on the speed of the armature system in general, just the IK Solver part which I had the pleasure of implementing at NaN.

The two main speed problems are:
1) History-independent IK solves
The IK solver as used by the armature system always starts from the same bone positions. This often means it is a LONG way from the end-effector, and consequently a numerical system takes more iterations. It's essential to maintain history independence, otherwise you can get unexpected results when you add more keyframes etc. This has a significant impact on speed.


In the IKA solver object (is it an object?) I saw the history-independent
IK. I was wondering why history independence is used: why
not allow the solver to use previous frames to determine
the solutions for the joints? I was also wondering whether it's
possible to get a quicker solution with more constraints, because
it would seem that having more constraints could reduce the
number of solutions.

The armature spec in the document Ton posted seems to
imply a history-dependent IK; see:
"While interactively positioning an effector the solution is based on the previous frame's positioning".

Also, could cyclic relationships be determined using a
recursive breadth-first graph search?
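
For the plain parent chains mentioned earlier in the thread (each object has at most one parent), a full graph search isn't even needed: Floyd's two-pointer walk detects circular parenting in linear time with no extra memory. A minimal sketch, with an invented `Obj` type standing in for Blender's real `Object` struct:

Code: Select all

```c
#include <assert.h>
#include <stddef.h>

/* Illustrative only: detect a cycle in a singly-linked parent chain
 * using the classic tortoise-and-hare walk. */

typedef struct Obj { struct Obj *parent; } Obj;

static int has_parent_cycle(const Obj *ob)
{
    const Obj *slow = ob, *fast = ob;
    while (fast && fast->parent) {
        slow = slow->parent;             /* one step */
        fast = fast->parent->parent;     /* two steps */
        if (slow == fast)
            return 1;                    /* the walk closed on itself */
    }
    return 0;                            /* fell off the end: no cycle */
}
```

A check like this could run when a parent relationship is created, which would catch the circular-parenting slowdowns Hos mentions before they ever reach the solver.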


2) Robustness. It's easy to write a simple IK solver that works, for example, only if the end-effector is reachable. Making code that works in all cases is much harder. The current solution (an inverse Jacobian solver) has to do quite a lot of matrix mangling to work out what to do when the Jacobian is singular. I admit that this code could be further optimized by first quickly identifying the determinant (the old determinant value can also be used as a guide) and then using a quicker inversion method.




According to the net, the inverse Jacobian is the iterative
version of IK that everyone uses, but Ton said it was analytical.
He claimed it would handle hundreds of bones. I found a paper,
which I passed to him, that covers these topics:

http://www.cis.upenn.edu/~badler/gmod/0528a.pdf

I think I know where the original Blender IK solver came from;
look at the diagrams, they even look like the old IKA:

http://www-inst.eecs.berkeley.edu/~cs29 ... matics.pdf


In its defense:

The current code is really quite flexible it should not be too hard to add joint constraints and different joint types.



If it's like the proposal in the document Ton gave, I can attest that
it is. The way the interface was specified was very well thought out:
the solver has access to all constraints, which can be submitted to
it, and it is simple enough that it doesn't have to know about
the bone structure it's used with.

Ton said that it wouldn't be possible to port Blender to C++
in the future. Is this really the case, or is he pressed by
some kind of future deadline? I really can't imagine how that would
be so, since C is very nearly a subset of C++; even structs and classes
are the same thing, with only a different default for public/private access.

I'm also not suggesting a rewrite of the existing code, but a parallel
project to make Blender more easily maintainable.

It's open source; I don't see why it couldn't be split into two source
trees, with one being the C++ rewrite of the other. I'm only talking about an
incremental rewrite: starting from the C code, gradually
shifting it so it compiles as C++ while remaining C, then gradually moving
the data structures to object-style data structures. Eventually
the whole codebase could become object based.

For instance, we could use an object to contain all the globals
and have everything access that. Then we could associate data types with
the methods that modify them, create objects that have those methods,
and replace all the code that works on the data directly. There is more than one way to get to a C++ codebase, and even just compiling the C
code as C++ and allowing C++ objects to exist in the structure
is a start.


The code is well documented take a look in blender/intern/iksolver you should see there a link to the excellent thesis it was based on (hang on a minute):

http://ligwww.epfl.ch/~baerloch/papers/thesis.pdf

Good luck,
Laurence

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am
Contact:

Postby thorax » Wed Jun 25, 2003 11:16 pm

How hard would it be to have a multi-chain IKA
skeleton, LoZaR? This is what I would use in Kinemation.
Basically, there would be one handle chain over all the
joints from the shoulder to the wrist, then for the elbow
joint another handle chain going from the shoulder.
I could move the handle chain for the wrist to a particular
point and constrain the positional orientation of the handle
for the wrist, then draw the elbow handle target out and
drag it to the side to make it bend, while the wrist is
still trying to reach the location it was constrained to.
This allows easy posing of armatures without adjusting the joints,
and is easier to control and more intuitive. I don't believe there
will be a need for an IK solver that can completely guess at the proper
rotation of the joints: if you can have multiple overlapping IK
chains that start at a particular joint and end at
another one, no more than three bones each, it should make for easy
posing and animation. Also, the IK handles need to be rotatable.
Does the current solver framework allow for twisting of the IK handle?
Because then you could twist the wrist and have the shoulder rotate,
causing the elbow to swing about. This made for very easy
posing of skeletons.

green
Posts: 81
Joined: Sun Oct 13, 2002 8:04 pm

Postby green » Wed Jun 25, 2003 11:20 pm

blah blah blah blah blah blah

harkyman
Posts: 278
Joined: Fri Oct 18, 2002 2:47 pm
Location: Pennsylvania, USA
Contact:

Postby harkyman » Thu Jun 26, 2003 3:31 am

Thorax -

I read through your other posts about the armature system and skinning, and I must admit to getting lost in the terminology. This, however, seems simple and straightforward. The ability to overlap or nest IK chains would go a long way toward solving many of the controllability problems that I encounter regularly. This is a great and simple idea.

Money_YaY!
Posts: 876
Joined: Wed Oct 23, 2002 2:47 pm

Postby Money_YaY! » Thu Jun 26, 2003 4:53 am


Code: Select all

#if 1
#ifdef __NLA
         /* Force a refresh of the display list if the parent is an armature */
         if (ob->parent && ob->parent->type==OB_ARMATURE && ob->partype==PARSKEL){
#if 0         /* Turn this on if there are problems with deformation lag */
            where_is_armature (ob->parent);
#endif
            if (ob != G.obedit)
               makeDispList (ob);
         }
#endif
#endif



Zarf


What part do I comment out? I tried /* */ and #if 0,
but neither built.
Or maybe: "how" do I comment out stuff?

Also, how do I go about making small builds?
Each time I build from Tuhopuu it takes 50 minutes.

Zarf
Posts: 88
Joined: Mon Oct 14, 2002 3:54 am

Postby Zarf » Thu Jun 26, 2003 8:30 am

thorax wrote:In the IKA solver object (is it an object?) I saw the history-independent
IK. I was wondering why history independence is used: why
not allow the solver to use previous frames to determine
the solutions for the joints? I was also wondering whether it's


I think this has to do with creating repeatable motion on various platforms (rounding issues etc.).


thorax wrote:According to the net, the inverse Jacobian is the iterative
version of IK that everyone uses, but Ton said it was analytical.
He claimed it would handle hundreds of bones. I found a paper,
which I passed to him, that covers these topics.


That depends on what you mean by 'everyone'. There are countless methods (analytical methods, cyclic coordinate descent, physics-based methods, etc.); inverse Jacobian methods just seem to be popular in CG. Robotics textbooks would be an interesting source of information on different techniques, from what I hear.

The current solver is not analytical. It becomes extremely difficult, to nigh impossible, to find analytical solutions for linkages with an arbitrary number of elements and excess DOF.

Cheers,
Zarf

Zarf
Posts: 88
Joined: Mon Oct 14, 2002 3:54 am

Postby Zarf » Thu Jun 26, 2003 8:35 am

thorax wrote:How hard would it be to have a multi-chain IKA
skeleton, LoZaR? This is what I would use in Kinemation.
Basically, there would be one handle chain over all the
joints from the shoulder to the wrist, then for the elbow
joint another handle chain going from the shoulder.
I could move the handle chain for the wrist to a particular
point and constrain the positional orientation of the handle
for the wrist...

<snip>

I think you should check out the Animanium demo movies at
http://www.animanium.com

It claims to be the 'future of IK'.

This doesn't really seem to be as new as it claims to be; I have heard of other apps that reputedly have similar animation/posing tools (MotionBuilder, for one).

Makes everything else kind of pale in comparison, eh?

Cheers,
Zarf

ton
Site Admin
Posts: 525
Joined: Wed Oct 16, 2002 12:13 am
Contact:

Postby ton » Thu Jun 26, 2003 12:29 pm

I don't have a scientific math background, so forgive me for not being accurate enough!

When I talked about IK in the first instance, I mentioned the old method that I coded for "Ika", and the new one in Blender 2.2x. This is a good reference:

http://www.martinb.com/physics/kinematics/joints/

A comprehensible explanation of IK. Our old system simply iterated positions and angles to get the end-effector to the right location; that's the "iterative approach".
The new method in Blender expresses chains & constraints in matrices, and uses a Jacobian to solve the matrix. This is heavy math, and already beyond what I really (want to!) grasp. I only know Jacobians are an accepted and proven method for computer animation.

http://www.cis.upenn.edu/~badler/gmod/0528a.pdf

Here the authors go one step further. They call Jacobian solving "numerical" and present a real "analytical" method to solve the IK problem; or rather, they present a concept that mixes both methods for better results.

I can't judge if this paper would improve our current IK solver. But what I do know, is that such IK solvers only form one component in the whole Armature system. The original topic - speed problems - most likely is not in the IK solver, but in the way all Armature components work together.

This is not a theoretical or mathematical problem, but simply an implementation problem. Solve it by closely reading the original Reevan doc and finding out where the bottleneck in the current code is.

I'm getting quite annoyed by the number of ignorant assumptions being theorized by thorax, which don't help anyone understand Blender better, or help improve tools for the artists here.
There's no need for people who keep pointing to general concepts that MIGHT work but are just far from the down-to-earth daily practice of the Blender source. Unfortunately, Blender was not designed with knowledge of general software engineering methods. That's a problem, and it requires a different mindset for tackling problems in Blender.

Only people who take the time to understand how it currently works can actually help get the problems solved, and propose or implement methods that make coding easier. Changes & improvements come from within, not from blindly attacking it with existentially alien concepts.

It's like turning Iraq into a democracy. Apart from the *question* of whether you would want that, it is still definitely something that requires Iraqis to make it happen.

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am
Contact:

Postby thorax » Fri Jun 27, 2003 3:24 am

We do need people to look at the sources and determine what
they're doing, then develop a diagram of the code from that and determine
both how to make it better and how to make it more like what we want;
if we don't, it will be just a layer of unguided hacks.

I liked the reference to the "Cscope" program that uses ctags;
that will help, and I think more programs like this would help too.
If there were something that could show function/struct relationships,
that would be helpful.

The multi-chain handles would help the IK. I only want what
is easiest to use, and there is no need to get into solvers that predict the future,
but I was wondering what options there were, and whether history-dependent
solving would offer some better solutions. I only recall the
kind of IK system I liked, Kinemation, and a lot of my thoughts are
fuzzy attempts to think my way back to that setup, because
I have not used anything like it in a long time.

If you continue to use C, consider the following.

In general, for the sources I would recommend
localizing complexity (the more you have to describe to
someone before they can do something with the source, the more you are failing to
localize complexity; this is a fundamental rule of
abstraction). All good object-oriented relationships
have high localization of complexity.

Localizing complexity simply means taking all the code and
data that have anything to do with each other and
placing them as close together as possible, physically and semantically, so that all you need to know
is right there, not somewhere else. C++ provides a
framework for this as part of its mechanism of working.
In C there is no such mechanism: C doesn't enforce any relationship
other than compiling individual pieces of code as modules and
linking them together, which only localizes the scope of the
variables (enforcing scope helps in abstraction).
As a result, the coders have to be very smart about how to
get the most out of C to keep things together and easy to maintain,
but most often it isn't.

I don't think anyone writes complex applications in C anymore,
unless they are required to..

The basic recognizable difference is like this:

C:

Code: Select all

typedef struct stuff {
   int data;
   char data2;
} stuff;

int myfunction(stuff *in) {
   /* do something useful with structures of type "stuff" */
   return in->data;
}

/* at a call site: */
stuff aa;
myfunction(&aa);



C++:

Code: Select all

class stuff {
   int data;
   char data2;
public:
   int myfunction() {
      // do something useful with objects of this class type
      return data;
   }
};

// at a call site:
stuff aa;
aa.myfunction();



In C you must match the data with the code that manages it.
In C++, the language ties the code and data together: you just
refer to the functions through the data structure.

This allows for a unified way of interfacing with code,
because you could have ten different kinds of data
all with the same method "myfunction", whereas in C
you would need a function named after every piece of
data it modifies, most likely named something different
each time. This also leads to a coding process
recognizable by an increase of copy/paste code
from one function to another, for functions that are similar.

C++ can also deal with this problem through
overloading and inheritance, so that functions and data that are
similar are related and shared.

The smart way to handle this in C would be to
make a set of functions that work on all data.
The data would have to be passed with void pointers,
and the functions would internally check a type tag to
see what kind of data it is and then determine what
to execute on it. But the functions have to be written
very carefully, and every time new features are added you
must know about all the data and code that need to
be modified to make it work. Doing it right requires a
lot of creativity with C, as there is very little to
be creative with.
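
The tagged-dispatch idea described above can be made a little safer with a function-pointer table keyed by the type tag, which keeps the type check in one place instead of scattered through every function. A hedged sketch, with all names invented for the example (this is a generic C pattern, not code from the Blender sources):

Code: Select all

```c
#include <assert.h>

/* Illustrative tagged dispatch: one table entry per type is about the
 * closest C gets to a C++ virtual method. */

typedef enum { SHAPE_SQUARE, SHAPE_RECT, SHAPE_COUNT } ShapeType;

typedef struct {
    ShapeType type;
    double w, h;            /* h is unused by squares */
} Shape;

static double square_area(const Shape *s) { return s->w * s->w; }
static double rect_area(const Shape *s)   { return s->w * s->h; }

/* Dispatch table indexed by the type tag. */
static double (*const area_fn[SHAPE_COUNT])(const Shape *) = {
    square_area,            /* SHAPE_SQUARE */
    rect_area,              /* SHAPE_RECT   */
};

static double shape_area(const Shape *s)
{
    return area_fn[s->type](s);   /* the single type check */
}
```

Adding a new type then means one enum value, one function, and one table entry, rather than touching every function that switches on the tag.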

There is one other way in C, which is similar to creating a
language in C: using functions that recursively call each other
when performing complex operations on a database,
each function handling the data it is responsible for.
This is like writing a language in that you
recursively traverse a data structure that describes how
to process the data.

Something that is really bad in both C and C++: void pointers.
They are a necessity in C, but in C++ their use leads to
treating objects as raw data. It's okay to use pointers,
but if you use void pointers there is no way to guarantee
you will not have problems unless you think carefully about casting
in and out of them. This usually leads to placing type flags in
the data structures, which is okay but open to errors
from code that modifies the type field somewhere down the
line.

You can write C++ objects to maintain the integrity of the data
structure so that data type modification can be traced and enforced,
especially if you make all data members private and
use methods to modify the data. A function call probably
takes about ten operations to perform, and is actually
smarter than accessing a heavily pointer-based data type directly.

3D graphics code is highly pointer based and heavily computation
intensive, so it will be tough to maintain the integrity of the data
if the functions that modify it are not
localized to the data, and in C this is impossible without
passing data to functions with pointers. Pointers are a
leading cause of bugs in programs, and coding in C also leads
to replication of code, which leads to bugs as well. If you
can eliminate these two, the code becomes easier to maintain and
adjust, but harder to hack (code without any guidelines or
consideration of the future).

That's why I will continue to suggest a migration to C++, despite
the controversy. The design suggestions could be implemented in the future,
and it's okay to add them now as hacks, but there will need to be a migration eventually.

Michel
Posts: 208
Joined: Wed Oct 16, 2002 7:27 pm
Location: Somewhere below the rivers in Holland (but not Limburg)

Postby Michel » Fri Jun 27, 2003 10:24 am

Hi Thorax,

thorax wrote:We do need people to look at the sources and determine what they're doing...


From the above line, I get the impression that you haven't looked at the code at all. Also, I, as one of the current developers on Blender, am getting pretty insulted by the way you're 'giving lectures' on the differences between C and C++. :evil:

Instead of writing these long posts on the forums, you should spend your time looking through the sources yourself; then, I hope, you will get a feeling for what's possible and what's not.

I have not read all of your posts, because they are way too long and usually I have no idea what you're trying to say. But what I do know from your posts is that you're suggesting things that are not possible in the current architecture, or at least really hard to do.

I, as a developer, try to listen to users. You, however, are the #1 on my ignore list. Currently the list contains 1 name.

With regards,
Michel

