Could someone please consider optimizing armature speeds?

Animation tools, character animation, non-linear animation

Moderators: jesterKing, stiv

soletread
Posts: 83
Joined: Fri Jan 10, 2003 7:11 pm

Could someone please consider optimizing armature speeds?

Postby soletread » Wed Jun 18, 2003 8:06 pm

Blender is GREAT! Yes. I cannot believe the dedication that has gone into such an awesome project.

I have bravely invested a lot of time in learning this app. I am not just an experimenter; I intend to use it for REAL broadcast work. Till now I have believed it was possible to do this. I still do, actually. But there are a few things that I find frustrating.

Armatures with constraints really slow down the workspace. Take any rig with a pair of hands with fingers. Hand rigging can be quite complex, and even with a minimum of constraints, the rig slows to the point where I really battle to pose effectively. Not only does the whole posing process become terribly slow, so does the rest of the process, modelling etc.

Now add another character in your scene, or maybe even 3.

Why does this happen? Is it because constraints have to be calculated for every screen redraw, even when out of pose mode (for instance, when zooming in and out)?

Moving the rig to another layer does not help either.

Surely there must be a much more effective way of doing this. How about only doing the calculations when there is a change in the positioning of a bone in the rig itself, like an "onchange" event or something?

Or maybe the code just needs optimizing somewhere......

Please could someone consider looking into this? I feel this is Blender's biggest problem right now; forget ray tracing, etc. There is just no way we can use this product like this, even on fairly fast machines (800 MHz PIII, 512 MB RAM). When it comes to character animation, Blender really battles with more than just basic scenes. This is a shame, as we REALLY want to use it.

Please don't suggest using a faster machine. Rigs in Lightwave, Poser, and Animation Master are all fine on boxes of this spec. When you have over 15 machines running these specs, upgrading gets costly.

:roll:

---

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am

Postby thorax » Sun Jun 22, 2003 1:20 am

I remember talking to Ton about this long ago, and I think he said that the IK is computed in passes, each pass approximating the proper joint rotations for the skeleton. So it settles in at some solution over, say, 10 passes over the structure; if you have lots of bones, this can take even longer.. However, I've read in SIGGRAPH papers that there are iterative solutions that take a minimal number of passes.. But working through this would require a math major, which I am not.. I just know that if every bone requires multiple passes to solve an orientation, it's probably due to some lack of understanding about IK. Also, I think the runtime would be more efficient with rotational constraints.. Maybe the first bone tries to achieve the end effector, then the next bone tries to achieve the end effector within its constraints, and so on until the last bone. If each bone is constrained by length and axis-specific rotational limits (not many models have arbitrary-axis rotational limits, aside from "plastic man"), what optimizations can be made in computing the joint rotations necessary to achieve the end effector, or the best possible solution?
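
That bone-by-bone scheme is more or less what the SIGGRAPH papers call CCD (cyclic coordinate descent). A rough 2D sketch of the idea (my own made-up names, not Blender's solver):

Code: Select all

/* Minimal 2D CCD (cyclic coordinate descent) sketch: each joint in
 * turn rotates so the chain tip moves toward the goal.  A few passes
 * usually settle the chain.  Illustrative only. */
#include <math.h>

typedef struct { float angle, length; } Bone;  /* angle relative to parent */

/* Forward kinematics: world position of joint i (i = n gives the tip). */
static void joint_pos(const Bone *b, int i, float *x, float *y)
{
    float a = 0.0f;
    *x = *y = 0.0f;
    for (int k = 0; k < i; k++) {
        a += b[k].angle;
        *x += b[k].length * cosf(a);
        *y += b[k].length * sinf(a);
    }
}

/* One CCD pass over an n-bone chain, pulling the tip toward (gx,gy). */
static void ccd_pass(Bone *b, int n, float gx, float gy)
{
    for (int i = n - 1; i >= 0; i--) {
        float jx, jy, tx, ty;
        joint_pos(b, i, &jx, &jy);   /* base of bone i    */
        joint_pos(b, n, &tx, &ty);   /* current chain tip */
        b[i].angle += atan2f(gy - jy, gx - jx)
                    - atan2f(ty - jy, tx - jx);
        /* rotational limits would clamp b[i].angle right here */
    }
}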

There is no excuse for the runtime of the IK skeletons. Wavefront TAV's Kinemation handled skeletons with multiple bones interactively on an SGI Indigo Extreme (a Pentium-400-speed machine) about a decade ago, with vertices in the tens to hundreds of thousands, without the kind of slow-down experienced in Blender. I guess the solver that Kinemation used was a single-chain solver, and what Blender seems to be using is a relative (??) chain solver.. I only know the names in reference to what Maya uses, and the distinction between an IkRC and IkSC solver. The major difference in Maya is that RC solvers have a twist mechanism and the SC solver doesn't; in an SC solver, rotation of the end effector affects the orientation of the armature, given the rotational constraints (which Blender doesn't use).. Blender's armatures try to get away with a lot with very little (no support for rotational constraints, solving multi-segment chains with an RC-style solver).. You can tell it is an RC-style solver because it has a discrete axis about which it flips..

Most animators would prefer the RC chain, but I prefer the SC chain solvers because they are orientation independent. RC chains, because by definition they work with two bones, don't need rotational constraints defined, but then they make animating cats and horses and praying mantises impossible.

Note that if the skin deformation is taking too long, it's a problem with the code that implements it; this could also be related to problems in the data structures. In Maya, skin deformation is handled with clusters, which act like relative vertex keying, where each cluster is like a pose in a relative vertex keying system. In fact, Maya implemented both vertex keying and skin deformation with clusters. A cluster is an object that contains pointers to vertices of any kind; the cluster could conceivably point to mesh and surface vertices. The cluster's per-vertex magnets have a pull multiplier which determines how much influence the cluster has on each point in the mesh/surface. When applying skin deformation by way of multiple clusters, all clusters must be evaluated, or some optimization can be made to only compute the clusters that change orientation. There is some distinction in Maya between skin attached to bones and skin attracted by clusters; I think the structures for attaching the skin to the bones are the same as the clusters, and the distinction is made to minimize confusion.
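
To make the cluster idea concrete, a rough sketch of what such a structure might look like (invented names, not Maya's or Blender's actual types):

Code: Select all

/* A cluster: pointers to vertices of any kind, each with a
 * per-vertex pull multiplier.  Purely illustrative. */
typedef struct ClusterPoint {
    float *co;       /* points at the vertex coordinates (x,y,z) */
    float  weight;   /* pull multiplier: 0.0 = none, 1.0 = full  */
} ClusterPoint;

typedef struct Cluster {
    ClusterPoint *points;
    int           totpoint;
    float         offset[3];  /* current displacement of the cluster */
} Cluster;

/* Evaluate: each point is pulled by the offset scaled by its weight. */
static void cluster_apply(const Cluster *cl)
{
    for (int i = 0; i < cl->totpoint; i++)
        for (int j = 0; j < 3; j++)
            cl->points[i].co[j] += cl->offset[j] * cl->points[i].weight;
}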

Flexors, implemented in Kinemation (and possibly Maya as well), could be done with lattices.. Flexors are how you accurately control the rounding of joints and the bulging of muscles. I would suggest, though, that we could reuse relative vertex keying to define poses of muscles, and drive those poses with the rotation of the joints. In Maya this is handled by rounding the flexor, and the flexor deforms the vertices. So when binding a skin to an IK skeleton, you determine which points are assigned to the bones, then you create clusters with weight 1.0 to assign the points directly to the bones.. Parent the clusters to the bones. Then allow overlapping clusters for posing. Assign a flexor to the joints and bones; flexors serve as special kinds of clusters, a flexor being a lattice-controlled cluster that is rounded by the joint when parented to a joint.

So we need clusters, which are a form of pose as used in a relative vertex keying system; flexors, which are lattice-controlled clusters driven by joints; and multiple clusters combined to act like multiple poses of muscles on an armature (read: relative vertex keying applied to skin groups owned by bones).

If you implement all of the above, it will be really easy to do character animation in Blender. But this may require a massive rewrite of the IK system and skinning..

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am

UML and design of a new skin control system..

Postby thorax » Sun Jun 22, 2003 6:35 am

(When I mention "vertex keying" below, read "relative vertex keying"; it's what I meant to say.)

A bigger image can be found here:
http://www.bl3nder.com/UML/DesiredRelat ... lender.png


The original MDL (a CORBA-standard UML description file format produced by Rational Rose; I think it's in an XML subformat, but that doesn't really matter unless you have Rose or some MDL-compatible CASE tool; I'll try to use ArgoUML next time..):
http://www.bl3nder.com/UML/SkeletonBonesSkin.mdl


My desired relationship of Bones to Skin in Blender..

I didn't include relationships with respect to handles/end effectors, as those are kind of implicit in understanding skeletons. But I'll describe this graph: it represents a class relationship, though it should be read as a working relationship between classes of stuff.. This is not a formal definition or anything of that sort, just the ideas presented as a class diagram.. I did this in Rational Rose (ArgoUML wouldn't have allowed me this much freedom).

Okay, the diagram:
A skeleton can have bones and joints; we understand this from Blender. The skeleton is ordered in a hierarchy, but this class diagram just specifies what relationships can exist: each bone has exactly two joints, and each joint can have one or two bones (the multiplicity just requires at least one bone for every joint and vice versa, but I'm describing my alteration here). Functionally, the joint actually owns and transforms the bone, and the bone owns one joint.
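
Roughly, in struct form (again invented, not Blender's types):

Code: Select all

/* The diagram's multiplicities as C structs.  Illustrative only. */
struct Joint;

typedef struct Bone {
    struct Joint *head, *tail;  /* each bone has exactly two joints */
    struct Joint *owned;        /* ...and owns exactly one of them  */
    float length;
} Bone;

typedef struct Joint {
    Bone *bones[2];  /* a joint connects one or two bones (one at a tip) */
    float rot[3];    /* the joint transforms the bone it owns            */
} Joint;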

A cluster modifies the model display. The model is virtually parented to the bones and joints, but functionally the model is bound with clusters: one cluster (per bone) with 100% influence is assigned to modify the display of the model. The assignment of the cluster to the model is made at bind time, by some closest-point algorithm.

Additional clusters influence the display of the model in other ways, deforming and perturbing the surface however the user desires. A flexor is a kind of lattice that bends a cluster with respect to a joint (I made the flexor a kind of lattice and a kind of cluster since it combines the advantages of both, but it could be seen as a lattice that has a cluster that it deforms). So, the cluster controls points in the model, and is bent by the flexor. The advantage of having the cluster bend the model instead of the lattice bend the model is that the cluster is independent of the model it modifies: it could modify points in a surface, polygon, curve and so on (basically anything that has vertices or points, regardless of how the points are implemented).
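
The bind-time closest-point assignment could look something like this (a sketch, invented names):

Code: Select all

#include <float.h>

/* Squared distance from point p to the bone segment (a,b). */
static float dist_sq_point_segment(const float p[3], const float a[3],
                                   const float b[3])
{
    float ab[3], ap[3], t = 0.0f, len_sq = 0.0f, d = 0.0f;
    for (int i = 0; i < 3; i++) {
        ab[i] = b[i] - a[i];
        ap[i] = p[i] - a[i];
        t += ab[i] * ap[i];
        len_sq += ab[i] * ab[i];
    }
    t = len_sq > 0.0f ? t / len_sq : 0.0f;
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    for (int i = 0; i < 3; i++) {
        float diff = ap[i] - t * ab[i];
        d += diff * diff;
    }
    return d;
}

/* Bind time: each model point joins the cluster of its nearest bone. */
static int closest_bone(const float p[3], const float (*head)[3],
                        const float (*tail)[3], int nbones)
{
    int best = 0;
    float best_d = FLT_MAX;
    for (int i = 0; i < nbones; i++) {
        float d = dist_sq_point_segment(p, head[i], tail[i]);
        if (d < best_d) { best_d = d; best = i; }
    }
    return best;
}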

Flexors, like lattices, have sides. If you position a lattice left to right, it works just as it does in Blender on meshes and surfaces. But in this case the lattice isn't deforming meshes, it is deforming clusters, which indirectly control the form of the geometry. However, the flexors are aligned with the joint they're parented to, with the bottom half of the parent bone and the lower half of the child bone. Consider the parent bone to be the upper arm, and the child bone to be the lower arm. The flexor remains aligned with these bones even when the elbow is bent.. The flexor bends with the elbow joint. But it's the flexor that bends the skin, not the joint, just as bending a lattice in Blender would bend anything parented to it. The only difference is that here we are defining the flexor as a lattice that affects clusters, and the clusters bend the geometry. There is a reason for this which I will talk about later..

Again..
The model is assigned to the bone clusters and joint flexors according to a closest-point tolerance initially, and can be adjusted by the user by assigning points to the flexors or clusters afterward. This is the same as the way Blender assigns skin groups to bones, but more formal and straightforward.. The clusters can be observed in relation to the bones and tweaked; the tools necessary to do this can easily be created to view the contribution of one or multiple clusters on a geometry. And since the bones and joints own these clusters, it is easy to access the clusters.

There is also a controller, a kind of mathematical expression that controls one parameter by accessing multiple other parameters as input. The controller may also have a slider with it that is keyed to the time, or to anything else that has a single value. The controller in this case accesses the joint angle every time it changes, to influence (or interpolate between) multiple cluster poses modifying one or more models (or model displays).

This is like creating relative vertex keys for multiple muscles in an arm and interpolating the influence of the vertex keys with respect to the orientation of the shoulder joint, or of the elbow with respect to the upper arm, for example.
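
A sketch of that driven-key idea, with an elbow angle blending two muscle poses (the mapping and all names are made up):

Code: Select all

/* The "controller": a joint angle drives the blend weight of two
 * muscle cluster poses.  Illustrative mapping, invented names. */
static float bicep_weight(float elbow_angle)  /* radians, 0..pi */
{
    float w = elbow_angle / 3.14159265f;
    return w < 0.0f ? 0.0f : (w > 1.0f ? 1.0f : w);
}

/* Morph interpolator: blend the per-vertex offsets of the two poses
 * by the driven weight and add them onto the skin. */
static void apply_muscle_pose(float (*skin)[3],
                              const float (*relaxed)[3],
                              const float (*bulged)[3],
                              int nverts, float elbow_angle)
{
    float w = bicep_weight(elbow_angle);
    for (int i = 0; i < nverts; i++)
        for (int j = 0; j < 3; j++)
            skin[i][j] += (1.0f - w) * relaxed[i][j] + w * bulged[i][j];
}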

It may seem like a lot to do in realtime, but for the purposes of interaction the influence over the clusters can be turned off..

Also, clusters can be used like "puffers" (a term from animatronics) that puff up or deform a surface with respect to an IPO curve, for instance, or with respect to some expression influenced by time, or with respect to any number of inputs, given through a special expression called a mixer (which combines the inputs of several expressions to produce one output). In implementation the expressions may end up all being alike, so the distinction of "controller" versus "mixer" is purely descriptive of the style of relationship.

A bit more on the description of multiplicity between objects:

Joint --- Flexor
A joint can have zero or more flexors. (I didn't think about this when I made the UML graph above; this should read 0..1 for both relationships, as a joint doesn't necessarily require a flexor to bend it. This is similar to binding a surface statically to a skeleton in Maya. To have a flexor means the joint will flex the flexor at its middle, which overlaps visually with the joint over the surface of the model.)

Flexor ---> Model
Zero or more flexors can modify the model; flexors are detachable, which explains the zero, but there can be multiple flexors modifying the model at any time, and the model does not have access to the flexor that modifies it (the arrow defines the accessibility, or navigability). Any flexor, just as a cluster, can modify multiple models (that means the upper end of the arm could be a polygon model and the lower end could be a surface, and the flexor modifies both; this is either not true of Maya or I haven't tested it. But if Blender were designed to treat all vertices the same, it would be possible to have this kind of relationship.)

Cluster ---> Model
Like the Flexor, the cluster can modify one or more models.. In general a model may have zero or more clusters, but in the case of a skeleton the model has at least one cluster, unless the model is not close to any bones (we could assume that the model is always assigned to the parent bone unless otherwise assigned by way of the closest-point algorithm, just as a way of alerting the user to the problem).

Morph ---> Cluster
(This is one-to-one the same as what the relative vertex keying system does with meshes and surfaces, only applied to clusters, which pull on the points in the model; if you've messed with Maya you know what I'm talking about.) Read as "a Morph Interpolator of a Cluster": it can affect the influence of one or more clusters on a model, while the cluster can have zero or more (this should read zero or one) morph interpolators.

It wouldn't make much sense to have multiple morph interpolators fighting to control the same cluster, unless somehow the cluster could distinguish between the morph interpolators and combine them, but this would nullify the need for the interpolator.


Controller ---> Morph
The controller controls the morph (interpolator) of the clusters.

Controller ---> Joint
The controller accesses the joint parameters to decide how to behave. The result of this behaviour, in relation to a Morph Interpolator, is to control the influence of particular clusters by popping values into the morph interpolator.

I should leave this relationship a little open here; there could be a controller that produces multiple outputs (a program, such as a Python script), or the controller could be a single expression that outputs a value, in which case there may be multiple outputs to the morph interpolator.


INHERITANCE RELATIONSHIPS

Controller -----|> Expression
Controller is a kind of expression (this is an inheritance relationship; it means that the controller takes on the attributes and methods of the expression class/object).

Flexor ----|> Cluster, Lattice
Blender already has an object called a lattice that influences the placement of points for models that are parented to it.. The Flexor is like a lattice that controls clusters, so it inherits from a cluster, or is a hybrid of the lattice and cluster. I haven't determined the best relationship here; it's purely interpretive, or descriptive of how flexors differ from clusters. The main difference between clusters and flexors, aside from the use of a lattice to deform the cluster, is that the flexor is directly influenced by a joint and aligned with the upper and lower bones. If it's not influenced by a joint, it's just a lattice being applied to a cluster (unless there should be a need to call it something else).

CONSIDERATIONS:

Another thing to mention: where there is a parenting relationship, there is a transform, or a transform matrix.. Flexors are parented to joints, so whatever the orientation of a joint, a flexor is at least transformed by the transformation matrix of the joint.

Also, the clusters that are parented to the bone are transformed by the bone, so they move and rotate with the bone, because matrix transformations are applied to them, as the flexors are by the joints.

The flexors, however, are aligned with the bones that the joints are related to..

The Cluster will not be affected by the joints except through a morph interpolator driven by a Controller, or by way of being part of a flexor.

Since the controller knows about the morph interpolator, and the morph interpolator knows the Cluster, the cluster doesn't need to know of the influence other things have on it. Nor does the model.. This keeps the relationship of the bones and joints to the cluster types simple, without overcomplicating the relationship by considering other data types that can be transformed and deformed by the bones and joints.. It also allows the rigging and the configuration of skin groups to be separated in the middle of the process, so that bones can be adjusted apart from skin groups..

The model shouldn't have to be transformed by the bones or joints directly; this is the responsibility of the clusters and flexors. Also, the model doesn't even have to know that it's being deformed at all.. The skeleton doesn't have to know it's modifying a mesh/surface/curve/... . The flexors/clusters only have to know about affecting models, and the bones/joints only have to know how to deform cluster types (including the flexor).

What is the order of application?

First you determine the joint rotations via your IK solver. After the joint rotations are computed, the bone and joint transforms are applied to clusters and flexors (note the model's display has not been modified yet).

Then the cluster types that are parented to the bones/joints are evaluated (they modify the model display database).

Then secondary clusters are applied (pose morphs; these also modify the model display database).

Only in the event that a skeleton is applied to the deformation of the model will the model display be replaced with the actual model. This assumes that the model display and model share the same kind of vertex type, but not the same vertex database.
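
As a sketch, the whole evaluation order in one function, with stubs standing in for the real work (every name invented):

Code: Select all

typedef struct Rig Rig;  /* skeleton + clusters + flexors + model */

static void solve_ik(Rig *r)            { (void)r; }  /* 1. joint rotations          */
static void transform_deformers(Rig *r) { (void)r; }  /* 2. matrices -> clusters,    */
                                                      /*    flexors (model untouched) */
static void apply_bind_clusters(Rig *r) { (void)r; }  /* 3. bone clusters write the  */
                                                      /*    display database         */
static void apply_pose_morphs(Rig *r)   { (void)r; }  /* 4. secondary clusters       */

/* One evaluation pass; the model display is only touched in 3 and 4. */
static void evaluate_rig(Rig *rig)
{
    solve_ik(rig);
    transform_deformers(rig);
    apply_bind_clusters(rig);
    apply_pose_morphs(rig);
}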

Also, if there are any additional transformers and deformers outside of the skeleton structure, those should be applied to the skeleton if the skeleton is being deformed, or after the skeleton deforms the mesh if the mesh is being deformed.. Ultimately there should be a precedence that can be determined by the user. The scoping of this operation could be limited by a concept like a group (maybe it could be called a scope or a focus, I don't know) that limits the order of precedence of a group of objects, transformers and deformers. Precedence is a programmer's term for an order of operation. Precedence is often defined ahead of time and used to simplify the representation of code by assuming an order of operation.. Like in C: a = 3 + 4 * 2; by precedence rules, multiplies come before adds, so 4 is multiplied by 2 before being added to 3, and last the value is assigned to a. But it should be possible in Blender for the user to rearrange precedence to achieve different results.. If multiply were a "scale" transformation and add were a "translation" (or move) operation, we could make our object four times its size and move it 3 spaces to the right, or we could redefine precedence and make it so that objects are moved 3 spaces to the right and then scaled 4 times (with respect to the origin). If you use Maya, this is a little like defining the order of application of rotation, scale, translation.
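
The same point in one dimension, as a toy C program:

Code: Select all

#include <stdio.h>

/* "Scale then translate" is not "translate then scale". */
int main(void)
{
    float x = 1.0f;
    float scaled_then_moved = x * 4.0f + 3.0f;    /* = 7  */
    float moved_then_scaled = (x + 3.0f) * 4.0f;  /* = 16 */
    printf("%g vs %g\n", scaled_then_moved, moved_then_scaled);
    return 0;
}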


See Ton, ain't it beautiful what you can do with some UML and a little thought?


PS - How this flexor/cluster, bone, skinning relationship is different from Maya or Wavefront products: Wavefront's Kinemation did not support the deformation of surfaces/polygons by the same flexor/cluster. The concept of the cluster is really an Alias concept.. Wavefront called them "flexors" in Kinemation, but Wavefront had no actual deformers outside of what could be had through Dynamation with particle forces. Alias invented features like relative keying using an average of clusters as poses, controlled by sliders that could be keyed. I represent the sliders here as controlling expressions, and the flexors as a lattice (which Blender already supports) combined with a cluster, a concept I stole from Alias PowerAnimator 8, which I have used.. It's also in Maya. Also, Kinemation had the ability to key flexor positions according to keyframes or joint keyframes (like keyframes according to time, only time = joint rotation). And Alias PowerAnimator used a similar idea with clusters driven by joints, only the amount of interpolation could be controlled with a kind of IPO per relative key, and the time-line for the morphing between clusters was controlled by the joint rotation.. In both PowerAnimator and Kinemation, you could control a cluster with a joint anywhere in the skeleton. With Maya you can possibly control any relative keying anywhere in a skeleton, or elsewhere, with a joint or a slider.. Sliders in PowerAnimator were really defined as two objects that changed a value in relation to each other; the slider part had constrained values on two axes, so it could only move up and down, and could be constrained on the remaining axis, and parented to the slider-base so the transformation was local.. This is only if you wish to implement sliders that have multiple dimensions, like a dial, button, switch, etc., in the 3D space.. The expressions allow a sort of freedom to define strange interfaces; this should be left up to the user to define.. But there could be input from a Panel in Blender, should the user need something more solid and dependable.

I'm going to work on the graphical representation of this structure in relation to Blender types..

soletread
Posts: 83
Joined: Fri Jan 10, 2003 7:11 pm

Postby soletread » Sun Jun 22, 2003 2:18 pm

Wooooooah, Thorax !!!

What a piece. Brilliant. I am going to have to take some time to go over this. Needs printing out :lol:

This HAS to be the longest post ever.

I must say I am really impressed with the thought you have put into this.

Before I read this more carefully, just a few quick thoughts.

I don't think Blender is slow with the actual mesh deformations. It IS the IK solving, without a doubt.

Assuming one has done the rig properly (an improperly constrained rig can definitely have an impact on speed), the response gets slower and slower for every constrained or parented bone.

It is not, however, a problem with the rendering, as that is negligible in comparison to the time required to do AA, subsurfs etc. So the calculation of the IK could possibly be bypassed on the pre-render side.

What I mean by bypassed is to have some sort of algorithm that looks at the armature and sets all the relevant bone positions into a memory buffer. The memory buffer only needs to be updated when a bone or a set of bones moves, and only the relevant affected bones get updates, much like an "onchange()" event of sorts. When updating a bone, only the affected rig area is considered: for instance, if I move a toe bone, there is no real need to recalculate the entire rig (unless the toe bone moves the foot, which moves the leg), which seems to be the case in Blender.
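
Something like this dirty-flag sketch is what I have in mind (I'm making the names up; this isn't Blender's code):

Code: Select all

#define MAXBONES 256

/* Cache of solved pose matrices plus an "onchange" dirty flag. */
typedef struct PoseCache {
    float matrix[MAXBONES][4][4];  /* last solved pose matrices */
    unsigned char dirty[MAXBONES]; /* set when a bone is moved  */
    int totbone;
} PoseCache;

/* The "onchange" event: flag the moved bone and its descendants. */
static void pose_tag_dirty(PoseCache *pc, const int *parent, int bone)
{
    pc->dirty[bone] = 1;
    for (int c = 0; c < pc->totbone; c++)
        if (parent[c] == bone)
            pose_tag_dirty(pc, parent, c);
}

/* Redraw path: re-solve only dirty bones, read the rest from cache. */
static void pose_update(PoseCache *pc,
                        void (*solve_bone)(int bone, float out[4][4]))
{
    for (int i = 0; i < pc->totbone; i++) {
        if (pc->dirty[i]) {
            solve_bone(i, pc->matrix[i]);
            pc->dirty[i] = 0;
        }
        /* else: panning/zooming just reuses pc->matrix[i] */
    }
}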

I may be incorrect, but this would explain why the rig seems to be recalculated every time the screen needs redrawing, even if the rig is left alone, as is the case when panning or zooming.

So then, for redraws like this and for posing, read the memory buffer.

On an animated render, calculate the conventional way.

Just a thought. Probably not do-able, or I may be wrong. This however would be a fix for the symptom rather than the cause, which I know is what you are talking about. So I will go now and read yours more carefully. :D

-----

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am

Postby thorax » Mon Jun 23, 2003 5:22 am

Well, your idea sounds like burning Inverse Kinematics (solving joint rotations based on rigid bone lengths and a desired goal) into Forward Kinematics (animation by specifying joint rotations).. I've not worked on this much, but it would allow for some reuse of animation computed with IK.. But it leads to animating with joints that have goals.. There could be a weighted goal-vs-FK system, where the solution of the joint rotations via IK suggests one set of rotations and the FK suggests another, and then the two are mixed/averaged into some go-between.. Then you could do the walk cycles with FK and mess with the motion using the IK, to do things like hold a hand up and wave, turn the head, etc.. But the caching of rotations is like recalling joint rotations from a memory and playing them back on certain joints or all joints.. Also, it should be possible to animate the influence IKs have on the bones at any particular time, so that you can turn off all influence except for the toe bone and animate that with IK, then at another keyframe turn off the toe bone, and between the keyframe that was IK'd and the one that is pure FK, the bone interpolates its rotation with respect to the foot..
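
The per-joint mix could be as simple as this sketch (Euler lerp for brevity; real code would want quaternions, and the names are mine):

Code: Select all

/* Blend an IK-solved rotation with the FK keyframed rotation.
 * influence: 0 = pure FK, 1 = pure IK.  Illustrative only. */
static void blend_ik_fk(const float ik_rot[3], const float fk_rot[3],
                        float influence, float out[3])
{
    for (int i = 0; i < 3; i++)
        out[i] = (1.0f - influence) * fk_rot[i] + influence * ik_rot[i];
}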

Anyhow..

(Take the rest of this with a grain of salt; I look back on the old days with much yearning, but now that I look at those old animations, there is nothing all that great about Kinemation.. It may be I need to learn the new way to do this stuff.. But you don't find very many people animating things with multiple legs.. For instance, in Finding Nemo Pixar really struggled with no limbs (* joke *); it probably was a relief.. I wonder if animators ever do any full-body animation down to the fingers anymore.. It's a painful experience, I know.. I once did an animation of a Mexican in a sombrero in Blender a ways back, and the hands were impossible to animate, so I just let them freely wave side to side.. If I can find the animation I will post it. Some people animate hands using relative vertex keys and the arms and legs with IK; I don't know if this is possible in Blender, but I remember seeing a tutorial for Lightwave that did animation this way.. I would use fully overlapping multiple single-chain IK if trying to animate something complex like a cat.. But it would be tough in Blender's IK to animate the part where the cat twists its body to land swiftly on the ground after falling upside-down; Blender's IK would only work well if all characters remain right-side up. Mid-air flips and acrobatics are probably out of the question.)

What I would love to have in Blender's IK is to be able to actually see the IK chains, like you can in Maya.. In Blender it's made into a kind of engineering exercise.. You have to name your bones, then determine which bones are affected by which solver.. This is a waste; you should be able to instantly hack together a skeleton of any form, draw your IK handles from joint to joint, then add joint constraints (in Kinemation you could see the joint axis, change its orientation, then set joint constraints; this was true a decade ago. I think Maya still has some of this, but a lot was lost. I don't think Wavefront made Kinemation; it was a package that they contracted someone else to do.. I remember my animation instructor introducing me to one of the engineers when I went to SIGGRAPH in 1996 in New Orleans).. Chances are the Wavefront programmers hadn't time to really understand Kinemation, and when Alias merged with Wavefront, the functionality was lost in the shuffle.. I think I got an award from A/W due to some inability of Alias to imagine how anyone could use Wavefront's packages to do anything useful..


Well, one feature I liked in Kinemation was that the end effectors could be given collision detection features.. Like if you had a foot, you could put a handle in the foot, and adjust the collision detection of the end node (the leaf node of a limb, which would be a joint if it had a bone attached to it), and when you pulled the foot down, it would stop at the polygon surface because it detects collision; the handle then becomes temporarily sticky.. Sticky means it strives to stay in one position (if it's hard-sticky, it tries to stay in the position and the precise orientation it was in; collision could be made to affect either soft or hard stickiness).

The stickiness of the foot with the surface would make it so you can't move it anymore (if it's hard-stickied), or so you can't move it past the surface (with soft sticky), until you break collision with the surface. It would act like trying to pull a hand through a brick wall to reach a goal.. You could create an ice-skating effect by placing end effectors on the opposite side of an ice barrier; the feet try to reach the goal but collide with the ice before reaching it.. There were a lot of tricks I developed in Kinemation like this..

It also had ways of switching out geometry for other geometry. I could design a skeleton for one geometry, then assign similar skin groups to another geometry and write the location of a different geometry into the skeleton's bodyfile, and it would magically bind to the new geometry without having to do a closest-point search to find close geometry.. So I could have a low-resolution skin and a high-resolution skin and trade one out for the other.. I could also create geometries that focused only on arms, legs, etc., so I could see the effects of the deformation of thousands of polygons.. This was before anyone knew what NURBS or subdivision surfaces were..

It was fun to put a skeleton into a pose, set the handles for soft-sticky collision, and ram the body into a wall and watch all the limbs assume a pose like a guy hitting a brick wall after running into it at 20 mph.. All the animation in Thorax was done with Kinemation, and IK'd every step; there was no reuse of walk cycles. You will notice, if you look at the animation, that in the beginning it looks very robotic; this is because I didn't understand how IK end effectors could be interpolated along curves. Once I understood that, the animation became more natural, where the bug comes out of the hole and walks up to the sign.. The flip of the bug was just three sets of keyframes on the limbs and a rotation of the root.. But I had one keyframe at either end of the flip on the root to cause the spring effect.. The surface collision of the handles allowed me to drop the root down until the legs recoiled from the feet having collided with the ground.

See http://www.bl3nder.com/movies/Thorax.mpeg
or http://www.bl3nder.com/movies/NewThorax.mpg

The last one was requested by Alias/Wavefront when I got the award.. So I did a new version with particle collision for the flame and fixed up some render problems.. One bad thing about Wavefront TAV back then was that Dynamation (the particle engine) only allowed rendering via OpenGL.. And Composer (the compositor program) required tricks to make particles look real, by blurring particles and layering them to get glowing effects.. Some of the first effects used in Star Trek TNG were done with Composer and Dynamation.. I do not wish for either.. Blender's sequence editor effectively replaces Composer for me, and Blender's particles are a step better than Dynamation, but Dynamation had particle forces (which Ton wants in Blender now).


Another cool thing with Kinemation..

I could turn off the keyframing system so that the IK wasn't recomputed while I moved the skeleton, so I could get a discrete pose; then, if I wanted certain limbs to remain posed, I could sticky them positionally or rotationally..

(I described this earlier and am making a second pass here, but I think it's relevant to describe the sticky concept again.)

Positional sticky would keep the limb wanting to reach a certain position, but wouldn't require a particular rotation of, say, the wrist.. Rotational+positional sticky would require the wrist to be at a particular location with a particular rotation (a locked orientation for a particular bone). Anyhow, I could soft-sticky the handle in space, then turn the keying system back on, and all the bones that were not stickied would assume the logical positions for that frame in the animation; then I could key the stickied handle..

The way Blender's IK is, the solver uses the pose to determine the desired orientation, but for joints that are typically controlled with constrained ball joints, like the neck, shoulder and spine, it can get complicated.. Also, if you raise an end effector past the flip axis, the limb will flip in the opposite direction.. I never dealt with this until I switched from Kinemation to Alias PowerAnimator (which sucked at animation).. Everything looks like Alias's PowerAnimator concept because few ever mastered Kinemation.. I should have produced tutorials and made it impossible for them to influence public opinion through marketing rhetoric.. You can see some of the IK problems in the Bruce Lee animation that was produced.. It was nice, but the shots tried as much as possible to hide the problems the animators were initially having with animating bones that had bevel surfaces attaching arms to shoulders (thank the lord we don't have to animate with NURBS surfaces anymore). Kinemation was no better at this..

But hopefully it's gotten better since the days I did it; I'm just now getting back into all this stuff. I have played with the bones and skin from a tutorial Wierdhat had, and things have gotten better and more is possible, but it is still not the best way to do this..

Anyhow..

If you stickied a handle, you could move the root of the skeleton and the handles would try to attain the goals that were stickied.. This would allow things like positioning a guy climbing a ladder by stickying one limb at a time: adjust the root, keyframe, sticky, adjust the root, keyframe, and so on.. Blender doesn't have the sticky concept; instead it borrowed from Alias's early attempts at this, and the way a lot of others do it, by keyframing the end effector to make it stick: if you want to move something, you move it, then keyframe it again.. It produces a lot of unnecessary keyframes..

I haven't used the "pose" sheet in Blender, but I would imagine it's just reusable poses that can be interpolated between.. But some animation is just combinations of different limbs posed at alternating intervals, not interpolation of static poses. That works fine for FK but not IK. If you understand curve fitting, you can create natural motion with IK..

Kinemation had a short lifespan of, say, 2 years, and about all I can recognize of it in Maya is the single-handle chains, flexors and soft-sticky. Maya doesn't seem to do collision of sticky to surface, or the workflow of: turn off the keyframes to pose with sticky, turn on, keyframe, turn off, pose, sticky, keyframe, turn on to check the animation, turn off keyframes, go to a frame, assume a pose, sticky some handles, turn on keyframes so the end effectors assume the current step interpolation, keyframe the desirable limbs, unsticky, re-evaluate.. Sounds complicated, but it offered a lot of flexibility..

Also, wherever the handle chain began, that was the space (or matrix transform) in which the joints down from the handle chain were computed.. So if you were keying finger positions with respect to the wrist, the swinging motion of the arms wouldn't mess with the keys applied to the fingers.. But I'm sure I could sticky a handle in absolute space so that the fingers keep striving for some set of goals in space, and then I could keyframe the position of the fingers over a set of wrist or arm movements..

This is the kind of functionality I would eventually like to see in the IK for Blender.. You could also apply the concepts of cluster/relative-vertex keying to end effectors, to have multiple possible goals with varying influence.. I could start out with a basic walk cycle, add some up-and-down randomness, then add a force of gravity, and create some points of interest that would cause the head and torso to turn when near a point of interest. It would make for more natural-looking animation and ease of adding secondary motion..

Note that these days the single-chain IK is probably still used, but it's combined with motion suits and probably requires thousands of dollars to animate.. Makes me wonder if anyone has tried to animate a cat lately, or a spider crawling up a hill..


What sucked about Kinemation was that if you didn't take time to adjust the joint constraints, it would be impossible to animate an object without the limbs taking on weird orientations.. The constraints actually make animating easier.. Another thing: in Thorax I had not set joint constraints.. The legs were animated by animating handles in the knees of the spider and in the tips of the thorny part of the leg.. It was like animating 8 handles and the root.. So every now and then you will see something like at the end, where a leg bends sideways because I didn't set a keyframe for the position of a knee.. Toward the end of the animation I was getting like 12 bug-lengths of walk motion keyframed in a day, which was hard.. But probably easy by today's terms, with the reuse of walk cycles and all.. Seriously, I thought by this time there would be so much reuse of motion that nobody would be messing with IK anymore.. I remember wishing every day for a 50K motion-capture suit or something to make things easier.. but moving a leg was like: navigate left, move handle, navigate right, move handle, sticky handle, work on the next handle; then, when the pose was done, keyframe the limbs.. Move to the next frame.. In Alias PowerAnimator it was like a nervous reaction, keyframing every time the handles changed position: move a handle, keyframe, move another, keyframe, and so on.. I miss Kinemation that way.. It's pretty sad..

ton
Site Admin
Posts: 525
Joined: Wed Oct 16, 2002 12:13 am

Postby ton » Mon Jun 23, 2003 1:54 pm

thorax wrote:
I remember talking to Ton about this long ago, and I think he said that the IK is computed in passes, each pass approximating the proper joint rotations for the skeleton. So it settles in at some solution over, say, 10 passes over the structure.


The old "Ika" system used an itteration process based at error-correction. This is slow, and doesn't give good results with long chains.

It was replaced (in the 2.2x series) with a new system, which uses analytical matrix solving for entire chains. This is all heavy math stuff, which I can't really grasp. I've seen the coder demonstrating chains with hundreds of joints without problems.

I don't know what the current problem is... But most likely it is not the IK solver itself, but the way Armatures use it, linking multiple chains and adding constraints to them.

There's an excellent proposal/doc for the Armature system:
http://www.blender.org/modules/document ... esign.html

The current team is working on completing it. I can't judge whether Thorax's text is of help for that... we need someone who understands the Blender implementation to say something relevant about it.

soletread
Posts: 83
Joined: Fri Jan 10, 2003 7:11 pm

Postby soletread » Mon Jun 23, 2003 11:20 pm

Thorax:

I find the concept of stickies and single-chain IKs fascinating. Would posing fingers, for instance, be a good time to use single-chain IK?

After reading what you have said, I think you have helped me with the current Blender problem. A mixture would probably be best for a biped rig.

I can only hope that one of these days some of these points you raise are considered for future Blender development. You are obviously a very experienced animator; what about a programmer? :lol:

I wish I could comment more on what you have said, I just don't have enough experience for that, but I had never considered using single-chain IKs for character animation until now. If I can get it to work (and of course, use it) I will be very grateful for the info you have given.

ton wrote:
There's an excellent proposal/doc for the Armature system:
http://www.blender.org/modules/document ... esign.html

The current team is working on completing it. I can't judge whether Thorax's text is of help for that... we need someone who understands the Blender implementation to say something relevant about it.


This is an excellent proposal, and if the current team does manage to complete it, we are in for a FANTASTIC time indeed. Is this a pre-2.x doc? I seem to see some familiar, already-working concepts in there.

Ok now to tackle those fingers....

-----

Money_YaY!
Posts: 876
Joined: Wed Oct 23, 2002 2:47 pm

Postby Money_YaY! » Tue Jun 24, 2003 3:52 am

See, back when IKA ruled the land, it had the option of turning into FK or IK with just the press of a button. And that meant that when the IKA chain was first created, it started as an IK chain already. It was buggy as madness, but you had the option anyway.

So if we could get that IK/FK thing going again, that would help.
^v^

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am

Postby thorax » Tue Jun 24, 2003 8:46 am

Well, you can weight things, IK or skin; you've played around with relative vertex keys, so you know what weighting is like.. You could combine IK and FK by computing the armature with IK, then mixing in rotations from the same armature but without the IK computation; you could mix the two together in varying amounts, and that would get you an IK/FK mix.. Also, you could add things like a computation of force, by analyzing the speed at which the handle changes from one position to another (like if you were boxing), and then apply gravitation, wind, etc..
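
The force-from-motion part is just finite differences; a sketch with invented names:

Code: Select all

/* Estimate handle velocity and acceleration between frames and
 * turn them into a force (F = m a) you can mix with gravity/wind.
 * Illustrative only. */
typedef struct { float pos[3], prev[3], vel[3]; } Handle;

static void handle_force(Handle *h, float dt, float mass, float force[3])
{
    for (int i = 0; i < 3; i++) {
        float v = (h->pos[i] - h->prev[i]) / dt;  /* finite difference */
        force[i] = mass * (v - h->vel[i]) / dt;   /* m * dv/dt         */
        h->vel[i] = v;
        h->prev[i] = h->pos[i];
    }
}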

Adding forces is not complex; it's like weighting stuff.. You could simulate it with relative keys by adding a "saggy face" pose that kicks in when a certain amount of gravity is applied, say.. Then you could do the normal lip-sync and apply the saggy face once, say, your character jumps off a bus: their skin will temporarily bounce, which is the shock of the force of the ground applying to their skeleton, but the skin will continue to go down, so you temporarily apply the saggy face..

It doesn't solve everything, but it allows you to add motion..
I think Ton included this in his spec..

matt_e
Posts: 898
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia

Postby matt_e » Tue Jun 24, 2003 9:59 am

Money_YaY! wrote:
See, back when IKA ruled the land, it had the option of turning into FK or IK with just the press of a button. And that meant that when the IKA chain was first created, it started as an IK chain already. It was buggy as madness, but you had the option anyway.

So if we could get that IK/FK thing going again, that would help.
^v^


You may be able to do something similar by adjusting (and keyframing, I guess) the Influence slider of the IK Solver constraint.

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am

Postby thorax » Tue Jun 24, 2003 10:45 am

soletread wrote:
Thorax:

I find the concept of stickies and single-chain IKs fascinating. Would posing fingers, for instance, be a good time to use single-chain IK?


Well, posing fingers is hard whatever you use, but a single chain with rotational constraints makes it easier.. Then you can prevent the fingers from going into each other and into the palm.. Single chain is what I'm used to..

In Kinemation, the way you set up a rig is you would adjust the rotation of each joint and pick upper and lower bounds for the X, Y, Z rotation of every joint.. It's fairly easy to do.. And it kept the limbs and fingers from making unrealistic poses.. Then you would set up a few handles from the base knuckle to the tips of the fingers.. And maybe a few handles from the wrist to the thumb and index finger, so that you could control rotations of the wrist either by using the wrist's IK handle or by adjusting the positions of the thumb and index finger.. But it's up to whatever you want to do.. There can be multiple handles overlapping, each offering control over different joints. It would be quite easy to do a lot of karate moves with it, because you could place a handle in the elbow, pull the elbow up, and that would pull on the shoulder; you could then pull the other shoulder back, pull the wrist back, form the fingers, and so on.. Then pick up the leg and do a roundhouse, all without anything flipping out of position.. One problem with Kinemation was not finding any ready-rigged humanoid figures.. The one that came with it was called "Kineman", which could easily be used to make something like Spiderman..

The biggest problem with Kinemation was not enough non-rigid animation tools... Like, there was no limb-along-a-curve (which is hardly IK).. You could animate the length of a joint, its size, and so on.. But it wasn't possible to have one geometry shared by two skeletons.. With the Thorax bug, at the end of the animation I started experimenting with constraints, and I think I had added a set of constraints on the back legs that limited how far the legs could move inward.. As I moved the root of the bug, the back legs were twitching at the knees, because the feet were stuck to the surface and the constraints were causing solutions that were in conflict.. The behaviour was like holding someone back while they are striving to get past you: any force that was not constraining the legs would assume the positions that allowed the best positional goal, but any joint that was not constrained would cause strange rotations at the thighs, and the legs would jump a little (a minuscule amount, but if you look at the back leg of the Thorax, you will see it jittering a little bit).

Also, when I brought the Thorax down for the recoil before the flip, I noticed from the animation that I rotated the root a little bit forward. I think this was me trying to get the bug down as far as I could, but the constraints in the knees and the stickied handles at the feet were preventing me from getting it down any further.. So constraints applied both ways, to the root of the skeleton the same as at the feet.. But it never did anything unpredictable like have a limb fly backwards through the bug, like what happens in Blender..

soletread wrote:
After reading what you have said, I think you have helped me with the current Blender problem. A mixture would probably be best for a biped rig.


Well, I don't know what can be done about mixing FK and IK.. But that would allow for some stuff.. Really, there needs to be some kind of overlapping IK handles..

Also, I have an idea for doing limb constraints that is better than Cartesian constraints (limits on X, Y, Z, which tend to produce square-like limitations): offer dot-product-based constraints, where the limb can never reach a certain angle if it gets too close to a vector (we could call it the never-here vector).. Also, constraints limited by a curve on a surface.. Paint on your constraint: this would vary the amount of force on the limb. Like, say you want a rickety old man: you could make the force spotty about the joint, and that would simulate arthritis..
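
The never-here test itself is one dot product; a sketch with an invented helper:

Code: Select all

#include <math.h>

/* Is the unit bone direction outside the forbidden cone around the
 * unit "never-here" vector?  The dot product of two unit vectors is
 * the cosine of the angle between them.  Illustrative only. */
static int limb_dir_allowed(const float dir[3], const float never_here[3],
                            float cone_half_angle_rad)
{
    float d = dir[0] * never_here[0] + dir[1] * never_here[1]
            + dir[2] * never_here[2];      /* = cos(angle between)     */
    return d < cosf(cone_half_angle_rad);  /* larger d = inside cone   */
}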

soletread wrote:
I can only hope that one of these days some of these points you raise are considered for future Blender development. You are obviously a very experienced animator; what about a programmer? :lol:


Well, I just think a lot.. I've got ideas in a book on animation and wireless technologies; I've even got a plan for distributing healthcare records in a patient-controllable way, without the patient having to know all the clinics they were at.. Ways to integrate business/wireless that are better than the ways it's done now with wireless.. Use of Bluetooth to improve networking events and seminars, so that you find people you are more compatible with without doing a keyword search of their profile.. I just take what I want, use it to constrain what I know, and develop the structure of the idea.. But sometimes the ideas are not feasible; like the other day I was thinking, "I wonder why I've never seen a lenticular billboard".. I thought this would be great for 3D ads on billboards.. Problem is, lenticular 3D cards work because your eyes are far enough apart to see two different images; it wouldn't work on a sign.. Nor would holograms.. But about 40% of my ideas are feasible, it's just that I haven't the money/company to pursue them.. You know there are lenticular monitors now, which is a bit like a hologram but is really more of a cheat.. An example of a lenticular 3D card is the recent TV Guide with the Hulk on the front.. I almost collected all the issues, because I love 3D stuff..

soletread wrote:
I wish I could comment more on what you have said, I just don't have enough experience for that, but I had never considered using single-chain IKs for character animation until now. If I can get it to work (and of course, use it) I will be very grateful for the info you have given.


Well, thank the guy who made Kinemation; I'm just regurgitating everything I remember of the program.. I worked with it for several years.. I remember asking the Alias/Wavefront sales people for a copy, but they said it had been pulled from the market and that a new license can't be purchased.. Even if I find someone with the original package, the license is not transferable.. So I would have to purchase their computer and get them to purchase the license every year, not disclosing to A/W that I have the computer.. A/W made very sure it would never come back to haunt their tech support or steal from their market share.. This is the kind of thing that makes me think, "how would they like it if all their stuff came back to haunt them in open source, where it can't be controlled?"

Another word of advice: never buy Casio products.. A friend of mine bought one of their "Exilim" cameras; he said the camera records sound, but there is no jack that lets you hear the sound.. It's typical.. I've purchased Casio keyboards in the past, and Casio will take a perfectly good product and disable one feature from it; then they will make 4 other products just like it, disable different features, call each product something different, and release them.. You end up buying all four keyboards to get each feature that you are lacking.. I think it's a kind of sales strategy that keeps each product from being devalued by a newer product, by making each unique and non-replaceable.. It also makes each product useless for experimentation..

Yamaha, on the other hand, will produce a keyboard and then give you all the technical manuals and programming information necessary to access every feature in the keyboard.. However, Yamaha isn't as big as Casio these days, and I figure it may be because of this unfair sales tactic that Casio uses..

A common business term when doing negotiations is the concept of "value add", which is "why would anyone want to buy this" or "what will maximize the value of this thing so we can make back what we invested".

Note that Eskil expresses ideas that use a similar thought process, by saying things that sound illogical, like that packages should not clone features; this is why I tend to scoff at him.. I can see the business gears turning.. I'm sure he's considering "what's the value add in this"..

I've already suggested ideas that enforce open source by mechanism of the software (use of virtual machines); he will not talk about it, because he knows it's commercially evil.

But commercially evil stuff is good for consumers. I tend to be bent away from commercial "business sense" in terms of software; I tend to be more in favor of Richard Stallman than Bill Gates.. Mainly because Bill Gates has never felt the pain of being controlled by his tools.. Richard Stallman went through a whole ordeal where commercially driven software developers made their employees sign NDAs that forced them not to talk to Richard at all, because they knew he would develop better technologies..

I have a theory about why Bill Gates hires the best minds in the world: so nobody else can have them.. It's pretty simple: if you can eliminate the ideas, you can control the marketplace..

Anyhow.. I guess I'm crazy that way.. But I also see good business ideas.. I just don't like unfair leveraging of consumer trust.. Business has become about building consumer trust, then screwing the customer, building trust, screwing the customer, and so on.. until the customer gets a clue..

A good tactic is to allow artists to provide a service for money over the Internet.. Business processes that make consumers money, so they can purchase goods, possibly from other consumers.. Then the business guys can make premiums off the service.. This is worthwhile and transparent.. It doesn't get in the consumer's way..

Verse will allow for this, and this is what I agree with Eskil on.. But I don't know if he fully understands the potential here.. It would allow artists to freelance anywhere without having to be where the software is.. Like, I could work for Pixar without having to live in San Francisco, by logging into their Verse server through the Blender client; then I could put in my punch card, start working on rigs, talk via voice chat to the other artists, then punch out.. Get up, eat lunch, go work out, come back, punch in, work, punch out.. I would not mind this so much. Eskil could set up a PayPal-style money transaction site to collect pay from Pixar, take a premium of 0.01%, then give the worker the rest.. It would beat having to pay 60% to rent and city/state taxes..

Well, it may be the case, though, that there would be dozens of payment systems, but having Verse would make the artists visible to investors; the business people could see their work in development and allow the in-house software to remain in-house... You get access to it by login and password.. But you get paid to make stuff.. If you lose access, it's not like paying for the tools and then having to make money to support them, only to have A/W turn them off next year and force you to pay a few thousand to keep using them, while trying to make enough money to pay for the license..

ton wrote:
There's an excellent proposal/doc for the Armature system:
http://www.blender.org/modules/document ... esign.html

The current team is working on completing it. I can't judge whether Thorax's text is of help for that... we need someone who understands the Blender implementation to say something relevant about it.

soletread wrote:
This is an excellent proposal, and if the current team does manage to complete it, we are in for a FANTASTIC time indeed. Is this a pre-2.x doc? I seem to see some familiar, already-working concepts in there.

Ok now to tackle those fingers....


Yeah, I still need to read it.. But I'm sure it's good.. I will probably reduce it to a UML diagram, just for simplicity's sake..

Money_YaY!
Posts: 876
Joined: Wed Oct 23, 2002 2:47 pm

Postby Money_YaY! » Tue Jun 24, 2003 3:00 pm

snor.... Blender is all about the fun, not about using the latest method.

If Blender were more powerful, you would know it. It will be. But it needs A LOT more programmers with a clue about how Blender's code works, and organization.

whatever ^v^

soletread
Posts: 83
Joined: Fri Jan 10, 2003 7:11 pm

Postby soletread » Tue Jun 24, 2003 8:28 pm

Thorax:

Thanks once again. And for your advice on Casio purchases :)

I think, after going through all the other forums, it's almost time for rotational constraints. Well, should I say it IS time for rotational constraints.

And I guess Money_YaY! sums it all up:

If Blender were more powerful, you would know it. It will be. But it needs A LOT more programmers with a clue about how Blender's code works, and organization.


So I guess the best is to make do with what we've got NOW. I think I will find a way to optimise my rig for speed.

Sure, Blender is about fun. I am going to try and have fun now and put it to some REAL use. :!:

---

Hos
Posts: 215
Joined: Wed Oct 16, 2002 12:06 am

Postby Hos » Tue Jun 24, 2003 9:38 pm

Quite often the slowness exhibited is caused by circular parenting, which may be tricky for an armature novice to detect (see the earlier post by kaktuswasser). If you guys want to send me a link to your files, I can look them over...

... but that doesn't mean that things couldn't be sped up a bit. Certainly one source of slowness is this bit of code in drawobject.c:

Code: Select all

#if 1
#ifdef __NLA
         /* Force a refresh of the display list if the parent is an armature */
         if (ob->parent && ob->parent->type==OB_ARMATURE && ob->partype==PARSKEL){
#if 0         /* Turn this on if there are problems with deformation lag */
            where_is_armature (ob->parent);
#endif
            if (ob != G.obedit)
               makeDispList (ob);
         }
#endif
#endif


which isn't constraint-related at all. When you comment it out, things go much faster... the only problem is that some things don't wind up where you expect them. A smarter way to test whether the displists need rebuilding is needed!
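
One smarter test, as a sketch: version-stamp the armature's pose and compare (field names invented, not actual Blender structs):

Code: Select all

/* Bump a counter whenever the armature's pose actually changes, and
 * remember which counter value each deformed object last built its
 * displist against.  Illustrative only. */
typedef struct { unsigned int pose_version; } ArmatureState;
typedef struct { unsigned int built_against; } DeformedObState;

static int displist_needs_rebuild(const ArmatureState *arm,
                                  const DeformedObState *ob)
{
    /* Rebuild only when the pose really changed, not on every redraw. */
    return ob->built_against != arm->pose_version;
}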

Regards,
Chris

Money_YaY!
Posts: 876
Joined: Wed Oct 23, 2002 2:47 pm

Postby Money_YaY! » Wed Jun 25, 2003 2:28 am

www.aprilcolo.com/bunny/a.zip

It is just a sample thing I made for testing. Even without the texture it runs slow.

I found that if you use the object IK instead of the armature IK, it runs faster. But you lose the convert-NLA-to-a-single-bar thing.

Could you create the NLA convert thing in the meantime?

