I'm a programmer for a game development company (console games). My position for the last 2-3 years has been as the primary animations programmer for our team/company. We use 3DS Max for our animating (and modelling, for that matter, but that is unimportant). As animations programmer, I am very familiar with how Max handles animatable objects, from character bones to morph targets to animatable channels of arbitrary objects, and I've written code to export this kind of information from a Max scene.
I was reading the Blender 2.3 Guide, obviously with an eye to animation, and I've come across some... concerns about how Blender handles animated objects.
I'm not really interested in workarounds for these problems. My questions about them are more centered on what the animators that use Blender think about these issues, and even what the animation programmers who have to extract animated data from Blender think about these issues. Do the Blender developers care about these problems, or are they content with leaving them as they are? With that in mind, here goes.
#1: My principal concern is over the IPO animation system itself. Now, as a programmer, I can definitely appreciate having one simple system to handle the animation of any kind of floating-point value. It makes developing a UI for interacting with animated values much easier, since the animated values are all of the same kind; there is no concept of a non-float animation type. My concern is this: for the kinds of tracks where a floating-point breakdown is a reasonable means to represent the data, this is fine, but what about everything else?
Rotations, for example. Representing a rotation by a sequence of axis-aligned rotation angles is a really bad method for representing rotations. The 2.3 Guide even says so, and even describes a mechanism to work around it. However, just because there is a workaround (and, some might say, because there is something that needs working around) doesn't mean that there isn't still a problem.
Max doesn't care. It doesn't force the user to care. Max simply does the right thing. In Max, you can have two keys, where the first is the original orientation and the second is a 3600 (yes, 3600) degree rotation, or 10 full rotations around an arbitrary axis. If you do this in Max, it will do exactly what you expect and want. Trying something similar in Blender will cause all kinds of problems, thanks to the IPO system. Not even a quaternion can handle 10 full rotations, so Max's internal representation must be something richer than that.
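To make that representational point concrete, here is a minimal Python sketch (my own illustration, not Blender or Max code) of why a keyed angle channel can carry ten full turns while quaternion keys cannot:

```python
import math

def quat_from_axis_angle(axis, angle_deg):
    """Unit quaternion (w, x, y, z) for a rotation about a unit axis."""
    half = math.radians(angle_deg) / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

# Keying the raw angle preserves the winding: halfway between the
# 0-degree key and the 3600-degree key is 1800 degrees (5 full turns).
key0, key1 = 0.0, 3600.0
halfway_angle = key0 + 0.5 * (key1 - key0)   # 1800.0

# Keying quaternions loses it: 3600 degrees collapses to the identity
# quaternion, so the two keys are indistinguishable and interpolating
# between them produces no motion at all.
q0 = quat_from_axis_angle((0.0, 0.0, 1.0), key0)
q1 = quat_from_axis_angle((0.0, 0.0, 1.0), key1)
```

Both `q0` and `q1` come out as (numerically) the identity quaternion, which is exactly the information loss at issue.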
The workaround discussed in the 2.3 Guide for solving this problem creates its own problems. By putting different transformations on different objects, I can't imagine how this isn't at least slightly annoying for the user. Plus, it makes my job as animations programmer (whose domain includes getting data out of the tools and into the game) harder: I have to convert these two objects into one game object. We can't afford to have multiple objects with different rotations on them in-game, because each in-game object carries a certain overhead (plus, it wastes performance computing up the hierarchy). In order to hide this workaround from the game code, my exporter must now recognize when it has been employed and internally combine the rotations itself. Doable? Certainly. Annoying? Yes. It would be better if Blender could simply animate rotations in a more sensible fashion normally.
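The folding step itself is just rotation composition. A hypothetical exporter fragment (the names are mine, not from any real exporter) might combine the two objects' rotations like this:

```python
import math

def quat_from_axis_angle(axis, angle_deg):
    """Unit quaternion (w, x, y, z) for a rotation about a unit axis."""
    half = math.radians(angle_deg) / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    """Hamilton product: rotation b followed by rotation a."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

# The parent object carries one part of the rotation, the child the
# other; the exporter folds them into a single rotation so the game
# only sees one object.
parent_rot = quat_from_axis_angle((0.0, 0.0, 1.0), 90.0)
child_rot  = quat_from_axis_angle((1.0, 0.0, 0.0), 45.0)
combined   = quat_mul(parent_rot, child_rot)
```

The annoyance is not the math, which is trivial, but that the exporter has to detect when the workaround was used at all.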
#2 IK. I'm not an animator, so I'm not entirely aware of all the tools that Max has for animating. And this is much more of an animator consideration than anything else. However, after reading through the 2.3 Blender Guide, I have to say that Blender's IK support is... lacking.
IK seems to fail on "shoulder" joints. This forces the user to use a workaround to create 2 shoulder joints, one for each kind of motion. This is very similar to the IPO workaround, and may even be related in some fashion.
More importantly, you must understand a fundamental limitation of IK: it finds a solution, but most 3D IK problems have more than one. With an arm, for example, there is a "spin" factor: the elbow can be rotated without moving the hand or the shoulder from their fixed points. Because this degree of freedom can reasonably change depending on the circumstances, it clearly should be something that the user can adjust.
I'm not entirely sure about Blender's IK solver, but, from the Guide and a few hints I've seen in the past on the net, it does not appear that this degree of freedom is available to the user.
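That underconstrained degree of freedom is easy to see in code. Below is a rough 2-bone IK sketch in Python (my own illustration, not Blender's solver): every value of the swivel angle yields a valid elbow position for the same hand target, so some control really does need to expose it:

```python
import math

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def two_bone_ik_elbow(target, l1, l2, swivel_deg):
    """One elbow position for a 2-bone chain rooted at the origin.

    The reachable elbows form a circle around the shoulder-to-target
    axis; 'swivel_deg' selects a point on that circle.
    """
    d = min(math.sqrt(sum(c * c for c in target)), l1 + l2)  # clamp reach
    a = (l1*l1 - l2*l2 + d*d) / (2.0 * d)    # law of cosines
    r = math.sqrt(max(l1*l1 - a*a, 0.0))     # elbow-circle radius
    axis = _normalize(target)
    ref = (0.0, 1.0, 0.0) if abs(axis[1]) < 0.99 else (1.0, 0.0, 0.0)
    u = _normalize(_cross(ref, axis))
    v = _cross(axis, u)
    t = math.radians(swivel_deg)
    return tuple(a * axis[i] + r * (math.cos(t) * u[i] + math.sin(t) * v[i])
                 for i in range(3))
```

With both bones of length 1 and a target at (0, 0, 1.5), swivel angles of 0 and 90 degrees both satisfy the bone lengths exactly but put the elbow in different places; the solver alone cannot tell you which one the animator wanted.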
Lastly, there's the ability to turn on and off IK at will. As I understand it, IK in Max is something external; it is a tool to generate animation data and nothing more. It is not part of a hierarchy of objects, nor is it something that exists in the scene.
Blender definitely does not deal with IK in this fashion. IK is either on or off, and it is always on or off for that armature. It would be very useful, from the animator's perspective, to be able to turn IK on and off effortlessly, and even to have it bind itself to any object dynamically.
#3 The Armature. The entire armature itself is an object; the fact that it has bones and separate pieces is irrelevant. It is treated as a single object. Indeed, the ability to parent objects to particular bones is hacked on through a different parenting user interface.
Max doesn't do this. Indeed, Max doesn't care what you use to build a weighted, skinned mesh with. You can use any object in the scene, even if that object is a mesh or something. This makes writing animation exporters very easy; once you have identified the base of the "armature" hierarchy, just go to each bone and do what you would have done for a single object.
This is better from the animation programmer perspective, as it makes exporting data that much easier. This is better from the user perspective, because they can use the same tools that they used for moving regular objects around for animating the "armature". In Blender, you have to go through an entirely different UI just to get at animating armatures.
I don't intend this post as a big "Blender sux, Max is great" flame-bait kind of thing. I, personally, hate Max (crashes a lot, and behaves oddly a lot). But, I appreciate what Max does that is good, and I would like to see some of their better ideas make it over to Blender. What I'm interested in is whether or not the community thinks these are problems. The developers certainly won't "fix" them if the community doesn't want them to.
I, for one, think that these are significant problems. Because of them, I know that I could never go to my technical director, producer, and artistic director and ask them to consider switching from Max to Blender. There's just too much that Max does for us that our people, both artists and programmers, rely upon each day to get their work done.
Blender is a nice tool. I would definitely love to see it be able to go up against Max and the other top-end 3D packages. However, from an animation perspective, Blender is lacking in several key areas. And I want to know if the community even considers these areas problems at all.
umm, I don't have much time right now, but it appears you missed a few techniques for animating with Blender [by reading only the Blender 2.3 guide?]
it appears you are looking at the IK flag for bones as whether the bone can do IK or not; the purpose of this is more to allow the user to control how far an IK chain goes [so, for example, they will probably animate the shoulder separate from the arm, but they could be directly related bones]
however, I hadn't considered [I don't know if there is an interface for this or not] what things would be like if the user could control how much IK affected that particular bone, or how it would behave
I couldn't really tell if you were noting that the IK solving doesn't give the user much control about which way for example an elbow points
[many users have requested being able to define the plane IK is done on]
however, at least in the rigs I have built and seen, the method for controlling that I like the best is to have a bone child of the root bone [or back bone....] for each arm [and knee] which is tracked to by the upper arm, iirc with a locked track constraint
while the control isn't absolute, I can still achieve what I need to
one last thing:
you can have rotations of 10 times around any axis in the ipo window using eulers pretty easily, but staying out of the ipo window I think it may be difficult to keyframe it
for something you would have rotating that far you'd probably want to mess with it in the ipo window anyway
[btw: I am by no means a competent animator, I have no experience with max, and am somewhat a coder]
#1 (rotation input)
I think you miss a couple of points here:
- When working with objects, rotations are wrapped to the cycle -360 to +360 degrees when you rotate objects interactively. This convention works fine in most cases.
- If you want fixed numbers (like 720 degrees) you can just type it in (press NKEY for the object info panel) and insert it as a key.
- Inside of the IpoEditor you can easily assign values by grabbing points or typing too (NKEY again)
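As a tiny illustration of that wrapping convention (my own sketch, not Blender's actual code), interactive rotation values can be folded into the -360 to +360 range with a signed modulo:

```python
import math

def wrap_rotation(angle_deg):
    """Fold an angle into (-360, +360), keeping its sign."""
    return math.fmod(angle_deg, 360.0)
```

So an interactive 730-degree spin lands on 10 degrees, -730 lands on -10, and anything already inside the range is left alone; typing the raw number in the panel bypasses the wrap.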
I don't know how 3DS allows inputting of 720 degrees. But I'm sure 3DS users are required to learn that somehow too... so your remark that you were "not interested in" how to do this in Blender confuses me.
Further: rotations for Bones (in Armatures) are expressed in Quaternions, not Eulers. This works with Ipos as well. The limitations you notice (from the 2.3 manual) probably have to do with Euler limitations, which is evidently something more 3D packages have to cope with.
A full switch to Quaternions for Objects and animation was already considered in 2000, but is still pending unsolved issues (especially in compatibility and animation usability).
However, internally Blender calculates with quaternions wherever possible.
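That internal conversion also shows exactly where the multi-turn information is lost. A standard XYZ-Euler-to-quaternion formula (a generic textbook conversion, not Blender's source) maps 0 and 360 degrees to the same rotation:

```python
import math

def euler_xyz_to_quat(rx, ry, rz):
    """Quaternion (w, x, y, z) for R = Rz * Ry * Rx, angles in degrees."""
    hx, hy, hz = (math.radians(a) / 2.0 for a in (rx, ry, rz))
    cx, sx = math.cos(hx), math.sin(hx)
    cy, sy = math.cos(hy), math.sin(hy)
    cz, sz = math.cos(hz), math.sin(hz)
    return (cx*cy*cz + sx*sy*sz,
            sx*cy*cz - cx*sy*sz,
            cx*sy*cz + sx*cy*sz,
            cx*cy*sz - sx*sy*cz)

# 0 and 360 degrees about X come out as the same orientation (q and -q
# represent the identical rotation), so the turn count is gone once the
# value lives only in quaternion form.
q_zero = euler_xyz_to_quat(0.0, 0.0, 0.0)
q_turn = euler_xyz_to_quat(360.0, 0.0, 0.0)
```

This is why keeping the Euler (or angle) channel around, and only converting to quaternions for evaluation, preserves behaviour the pure quaternion form cannot.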
(BTW: I'm the coder responsible for that part :)
The way this is integrated in Blender lacks a lot; it is unfinished and still pending work for a team we don't have yet...
I think every Blender animator would love to see work here. The problem isn't in the IK library though, but in the method we give access to this.
Here too we should improve access to the Armature. The design decision to stick it to a single 'object' is a limitation but a benefit too; this just shouldn't frustrate users... yep.
In general discussions we had in the past with developers who currently maintain this code we already concluded this system actually could use a rewrite from scratch... including reviewing design decisions.
As an animation programmer yourself you can imagine this not to be easy at all for volunteers in an open source project...
For the upcoming release (october hopefully) I've decided to focus mostly on GUI issues, but after that (xmas or so) the animation system will get top priority for me to do work on.
I'm not a programmer, but an animator, and I think the rotation system for objects causes a lot of trouble. When rotating some object in a desirable way, it happens much too often that, once it's keyed, the result is some weird rotation; often, instead of rotating by a small amount around one axis, you get a weird rotation around all axes.
Instead of the armature object, there could be some kind of grouping object. This would be very useful, e.g. in the NLA, because right now you cannot make animation strips for non-armature objects, and working with keys is not very clear if you have a lot of control objects. There could be, at least for the NLA, some kind of "create a group" or "actor" command, which collapses all selected objects to one line (which would be expandable again...)
Whether a bone is IK or FK in a chain does not constrain it to FK or IK animation, merely the possibility of it; to actually do IK animation you need a constraint (IK Solver) to a goal. The constraint itself is animatable with IPOs, so you can indeed switch between IK and FK easily. This is a must.
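Conceptually, that animated constraint influence is just a per-frame blend between the FK key and the IK result. A Python sketch of the idea (hypothetical poses, my own code, not Blender's):

```python
import math

def slerp(q0, q1, t):
    """Spherical interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                           # take the short way around
        q1, dot = tuple(-c for c in q1), -dot
    theta = math.acos(min(dot, 1.0))
    if theta < 1e-6:
        return q0
    s = math.sin(theta)
    w0 = math.sin((1.0 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return tuple(w0 * a + w1 * b for a, b in zip(q0, q1))

# influence = 0.0 -> pure FK key, 1.0 -> pure IK solution; animating
# the influence curve over time is the FK/IK switch.
fk_pose = (1.0, 0.0, 0.0, 0.0)                                   # identity
ik_pose = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))  # 90 deg about z
blended = slerp(fk_pose, ik_pose, 0.5)      # influence = 0.5
```

At influence 0.5 the bone sits halfway (45 degrees about z here); ramping the curve from 1 to 0 over a few frames gives a smooth IK-to-FK handoff rather than a pop.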
The interface from a user perspective could be better; not for the IK/FK switch specifically, but animating constraints is a little 'quirky', interface-wise, in Blender.